In this long-form post, we take a look at the questions raised by AI in healthcare, and at how an empathetic and agile design practice can help address them.
We examine the rise of self-management, automation and AI, followed by our take on how user-centred design can help build a coherent approach to healthcare transformation.
State of Play
The emergence of artificial intelligence has brought about a collective state of high anxiety – with some hailing a bright future of automation and efficiency, and others seeing a threat to our autonomy, our jobs and fundamental human connection. Pick your way amongst the polemic narratives, though, and you get a different – more measured – picture.
This technology is already beginning to permeate our everyday lives, in very real and practical ways – whether that's automated checkouts, social media algorithms or artificial intelligence lawyers. Incrementally, ‘robots’ – by which we mean machines with digital automation or artificial intelligence technology – are starting to take over or contribute towards tasks that were previously handled by humans. However, the grand visions of automated cities and JARVIS-like personal assistants are still a long way from being commercially available. The utility of the technology needs to be consolidated – in terms of efficiency, technological maturity and social acceptability – before it becomes the new paradigm.
Healthcare, as an industry, has become a focus for this type of innovation. And it’s easy to understand why – even a casual glance at the headlines will tell you that our system is under great strain. One in five Britons wait a week or more to see a GP. We are living longer, which means an increased demand for treatment of conditions that come with old age. We are seeing a shift from communicable to chronic diseases; in the UK long-term conditions now account for 70 per cent of health spending.
While there is concern about whole industries being replaced by machines – and healthcare is no exception – this is one sector where machines are needed to augment, not substitute for, the human workforce. In this space, there is no substitute for human empathy – only the potential for technology to improve the opportunities clinicians have to connect with and treat their patients. Indeed, the World Bank estimates that the number of global healthcare workers needs to double by 2030. We don’t need robot doctors to replace humans – we need them to scrub in and help out.
There are two areas we will focus on where we think technology can be particularly impactful. First, there is automation – using digital technology to completely take over tasks that were once done by a human, or to simplify those tasks by processing or collating data. Automation can also allow us to provide services that would previously have been impossible to handle at scale, such as allowing patients and people in general to manage their own health, or improving the way patients and families interact with the healthcare system. Second, there is artificial intelligence – and in particular, machine learning and deep learning. Though the terms are often used interchangeably, we are treating machine learning as a subset of AI. Beyond simplifying and streamlining processes, AI also promises to make diagnoses and treatments more accurate. With access to more and more data, these machines will enable us to practise preventive medicine more effectively and – unlike human doctors – will be able to learn from their mistakes, so as never to make them again.
These two approaches will make a difference in healthcare. We have already started seeing this first-hand across the digital health projects we’ve worked on. But we believe design is vitally important in introducing these practices in a sustainable way that considers the needs of patients and clinicians. Here we will discuss the opportunities in more detail before sharing the best practices that can help make change in a constructive way.
Healthcare and Self-Management
One reason why automation is beginning to succeed in healthcare today is that people are more willing than ever to manage their own health and wellbeing. In an April 2017 report, PwC said that 55 per cent of those surveyed across Europe, the Middle East and Africa would be willing to use AI and robotics as part of their care, particularly for heart monitoring, customised fitness advice and taking and testing blood samples.
Self-management is easier than ever thanks to the ubiquity of smartphones and the increasing availability of tablets and wearables. These devices have set an expectation that services and experiences will be personalised and easy to use, and healthcare is not exempt from that expectation. Meanwhile, the availability of advanced analytics makes it possible to give personalised insight into a growing range of health topics. We will be able to diagnose earlier, too, and identify hard-to-spot complications such as acute kidney injury (AKI) – a condition linked to 40,000 deaths in the UK every year, a quarter of which NHS England estimates are preventable. DeepMind’s Streams programme, which we are the design partner for, integrates different types of data and test results from a range of existing IT systems to help detect AKI.
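At their core, detection systems like this run rule-based checks over incoming pathology results. As a rough illustration – not Streams’ actual logic, which follows the full NHS England AKI algorithm – here is a minimal KDIGO-style staging sketch; the function name is our own and the thresholds are a simplification of the published serum creatinine criteria:

```python
def aki_stage(current_umol_l: float, baseline_umol_l: float) -> int:
    """Return a simplified AKI alert stage (0 = no alert, 1-3 = severity).

    Compares the latest serum creatinine result against the patient's
    baseline value, following KDIGO-style ratio thresholds.
    """
    ratio = current_umol_l / baseline_umol_l
    if ratio >= 3.0 or current_umol_l >= 354:  # severe rise or very high absolute value
        return 3
    if ratio >= 2.0:   # creatinine has at least doubled
        return 2
    if ratio >= 1.5:   # 1.5x baseline triggers the lowest-stage alert
        return 1
    return 0
```

A real system layers much more on top of a check like this – baseline selection, repeat testing within time windows, and routing of alerts to the right clinician – but the value is the same: a result that might be missed in a pile of printouts becomes an immediate, actionable flag.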
As these technologies become widespread, and with the potential addition of AI, you get some powerful predictive tools that will learn about their users. As this happens, more healthcare can be conducted remotely – which holds particular promise for those with busy lives, mobility problems or social anxiety, for example. Moreover, if you are treated at home you are less prone to infection than in hospital. The Economist has described the hospital of the future as being more like “an air-traffic control tower” – a base from which clinicians could monitor patients anywhere in the world. A recent telehealth trial by US hospital group Banner Health concluded that admissions could be reduced by almost half and costs cut by a third.
A 2016 Frost & Sullivan report drew distinctions between the Medical Products era of the past, the Medical Platforms era of the present and the imminent Medical Solutions era. The first era saw companies profit from selling parts and hardware, followed by upgrades, consumables and repairs. In the current era we see more managed services – platforms such as wearables, mHealth, big data and analytics being leveraged to provide care. The future will add intelligence, the report said, with AI powering insight, automation and robotics-as-a-service. Service design is important now, but it will become crucial to the future of healthcare. Businesses are likely to see more of their profits come from keeping the population healthy and less from treating the sick.
We’ve all experienced it: you wake up one morning and realise that you need to see a doctor today. The sooner the better. If you are lucky enough to have a GP that still offers same-day appointments then you phone as soon as the surgery opens and, if you were quick enough, there might still be an appointment available.
Once you get into the doctor’s office, they spend as much time updating their computer with your information as they do talking to you. You need a prescription – perhaps all you came for is a repeat prescription – so they print one off and you carry your slip of paper to the nearest pharmacy.
Leaving the surgery, you pull out your phone and skim through your notifications: an urgent work email, a photo from a friend, a delivery reminder and a series of messages from a family WhatsApp group. And you wonder: why isn’t a visit to the doctor’s surgery as simple as this?
A growing number of startups are experimenting with automating medical services for the smartphone era. Echo, for example, aims to simplify the repeat prescription process. Scan the barcode on your medication, add the details of your NHS GP and they take care of the rest. Your repeat prescriptions are automated when you need them and the medication is posted to you, free of charge. The app even reminds you when it’s time to take the medicine. It saves the patient time but has crucial benefits for the NHS too: patients are more likely to correctly complete their course of medication, ensuring that they get better without further intervention, and medication won’t be ordered until it’s needed, reducing costly wasted prescriptions.
Apps like PushDoctor, meanwhile, are experimenting with telemedicine – offering GP consultations via secure video chat. It promises appointments within just a few minutes and prescriptions can be sent directly to a nearby pharmacy. It’s a paid-for service and, of course, a doctor can’t physically examine a patient via video, but it demonstrates the potential for simplifying and speeding-up some consultations and taking pressure off surgeries.
Moodnotes, which we developed in partnership with Thriveport, is a journaling app that helps people with mood disorders, such as anxiety and depression, to manage their condition using the principles of cognitive behaviour therapy. It is not a substitute for therapeutic intervention for those who need it, but it does offer an easy way for people to improve their thinking habits and emotional health.
Products like these have quickly become an established part of self-care and are increasingly being validated and supported by clinicians – who are even making moves towards formally prescribing them. Keith McNeil, chief clinical information officer of the NHS, recently told the FT: “In five years’ time, smartphones – or whatever device we use to access information – will take the burden away from the limited number of human specialists we have.”
As artificial intelligence progresses, these automated services will become even more powerful, Mr McNeil said: “People will get really intelligent triage that’s personalised to them from their phones, or be empowered to look after their own chronic conditions, like diabetes, via home monitoring.”
With an average length of 10 minutes – which according to the BBC is the shortest in the developed world – the GP consultation has never been under more strain. For clinicians, each appointment adds more paperwork to the pile, with any delays only making the waiting room busier. In the context of this under-resourcing, patients often experience a detached, less emotionally attentive consultation.
Clinicians spend up to 70 per cent of their time on paperwork, often putting the same information in multiple places. There are good reasons to keep this information – so that other clinicians can understand the steps taken so far, for example, or to create a record for legal reasons – but much of the data entry is repetitive and doctors often must use poorly-designed paper-based systems that do not or cannot integrate with the various other systems they use.
A lot of our work with partners like DeepMind and Fresenius has been about developing tools that collate all the information doctors need into one app, freeing them from the need to fill in multiple forms and print out piles of paper before they make their rounds. Though most doctors use mobile devices in their personal lives, those we have shadowed in clinical environments have to work on paper. An app that brings together patient history and test results is a simple way to make significant change – allowing them to spend more time with the person they are treating. Add in the ability to send alerts, for example when a test result indicates that a patient is at risk of developing a serious condition, or the facility to coordinate a team, and you have a very powerful tool indeed.
There are numerous obstacles to overcome before applications like these are in widespread use by patients and clinicians, but none of them is insurmountable. There are regulatory and legal concerns about the storing and transfer of sensitive personal data and, as these applications start to offer advice, there will be insurance concerns to navigate too.
There are also questions about how comfortable patients and doctors are with using this kind of technology but, as we’ll see, these can often be dealt with during the design process. Finally, there is the difficulty of changing longstanding processes within a sector as vast as healthcare. Often, these have been developed for good reason – mistakes in healthcare can be fatal – and so changing them has to be approached sensitively and with the input of all stakeholders.
“I think it is going to be 10 times more precise than a doctor. No human brain is ever going to be capable of doing anything of the sort.” That’s how Ali Parsa, the founder of health app Babylon, describes the capabilities of his product. However, he has also pointed out that his service will assist, rather than replace, doctors: "No machine can put its hand on your shoulder and say trust me I'm going to take care of you."
According to the FT, the average human doctor does 7,000 consultations per year. Babylon’s AI-powered app draws on many times more, and billions of data points, to offer diagnoses to patients. It’s one of several AI applications that are transforming medical diagnoses. In a demonstration of the level of interest in such services, Babylon announced in April 2017 that it had raised a further $60m, taking its valuation to $200m.
The University Hospital Marburg, Germany, is using IBM Watson to improve diagnosis and treatment of rare diseases; Deep Medic, at Imperial College, London, is assessing scans for signs of brain trauma in patients with head injuries; and the Oncology Expert Advisor, at the University of Texas MD Anderson Cancer Centre, is extending the hospital’s reach by acting as a virtual advisor to patients outside its catchment area.
DeepMind is using its AI at the Crick Institute to help develop new drugs and to diagnose cancer at University College Hospital, in London.
Sebastian Thrun, the computer scientist who started Google X – the search company’s research base for advanced technology – developed a deep learning AI with Stanford University to diagnose skin cancer. In tests, it was as accurate as experienced dermatologists. A dermatologist in full-time practice, writes Siddhartha Mukherjee in the New Yorker, will see around 200,000 cases in a career. Stanford’s machine had analysed 130,000 cases in three months.
There is now so much new medical research published each year – 11,000 articles in the field of dermatology alone, for example – that human beings cannot possibly keep up. These learning machines can. Each case they ‘study’ will increase their knowledge and, unlike a human, they never forget what they have seen. When they do make a mistake, they can immediately, and permanently, factor it into future decisions.
This worries a lot of doctors. Will it make their services redundant? Well, such is the stress on healthcare systems around the world that AIs will simply take some of the pressure off human doctors by "compensating for human deficiencies and amplifying our strengths". Geoffrey Hinton, a computer scientist at the University of Toronto, told the New Yorker: “The role of radiologists will evolve from doing perceptual things that could probably be done by a highly-trained pigeon to doing far more cognitive things.” We will be able to prioritise the empathetic care of patients, to focus on keeping well people well, and to have more opportunities to innovate and to discover new systematic efficiencies. Moreover, with the time we could save clinicians, this sort of preventive, attentive care could be performed at scale.
In time, we will see AI used to enhance everyday health monitoring on personal devices, such as wearables, and that in turn will lead to a rise in early detection of a whole range of conditions, enabling patients to be treated before a disease has advanced and saving the healthcare system money. It will help to speed up research, aid the training of human doctors and one day perhaps even provide the brains for robot surgeons.
Alder Hey Children’s Hospital, in Liverpool, is working on a ‘cognitive hospital’ – using AI to answer patients’ questions about the hospital, its facilities and the treatment they can expect. The hospital is working with IBM Watson, the Science and Technology Facilities Council’s Hartree Centre and with ustwo on the design of a new digital service that will help prepare children for a hospital stay. It will explain what the hospital is like, who they will meet and other information, all done in a way that will engage children of all ages.
“There is lots of evidence to show that preparing children for a hospital visit is very important,” said Iain Hennessey, consultant paediatric and neonatal surgeon and clinical director of innovation at Alder Hey. “Less well prepared children need more anaesthetic and they heal less quickly.”
Studies also show that distraction with a tablet computer can be as effective as some forms of sedative for certain procedures, so the digital service might one day offer video or games. The hospital is also considering a rewards system that will incentivise children to follow their treatment plan and encourage them through difficult procedures. AI will be used to answer questions, with natural language processing handling the various forms of words that children might use to ask questions. Questions that can’t be answered will be passed on to someone who can add the answer so that Watson knows it in future.
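The fallback loop described above – answer what you can, escalate the rest to a human, then remember the human’s answer for next time – can be sketched in a few lines. This is our own illustrative code, not the Watson implementation: a real system uses natural language processing to map many phrasings of a question onto one intent, where this sketch only normalises case, whitespace and punctuation.

```python
def answer(question, known_answers, ask_human):
    """Answer a question from a knowledge base, escalating unknowns.

    `known_answers` maps normalised question text to answers.
    `ask_human` is a callable that returns a human-written answer;
    its answer is stored so the same question is handled automatically
    in future.
    """
    # Naive normalisation: lowercase, strip punctuation, collapse spaces.
    key = " ".join(question.lower().strip("?!. ").split())
    if key in known_answers:
        return known_answers[key]
    human_answer = ask_human(question)   # escalate to a person
    known_answers[key] = human_answer    # remembered for next time
    return human_answer
```

For example, “Where do I sleep?” and “where do i sleep” resolve to the same stored answer, while a brand-new question is routed to a person once and then answered automatically thereafter.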
However, Dr Hennessey says that projects like these can be a challenge for the NHS. “This project will absolutely have clinical benefits but whenever the NHS spends money on something like this, people will ask whether the money wouldn’t be better spent on medicine or medical equipment. In this case, this project is being funded by charity money. Shop Direct and Liverpool John Lennon Airport – both of which understand the importance of customer experience – have helped to fund this.”
The anxieties in this space extend to patients as well – people are nervous about using AI for diagnosis. For example, PwC’s research found that 54 per cent of Britons would be unwilling “to use an ‘intelligent healthcare assistant’ via a smartphone, tablet or personal computer” to diagnose a loved one such as a child or parent. Residents of Germany, Sweden, Belgium and the Netherlands were marginally more willing, but in Africa and the Middle East there was widespread approval of the idea. Ninety-one per cent of Nigerians would be willing to use such a service and positive results were also found in Turkey (85%), South Africa (79%), Saudi Arabia (70%) and Qatar (64%).
It’s possible that healthy people will be happy to have what healthcare interactions they need handled by an automated service or AI. Likewise, people who have a chronic condition may be happy to manage that themselves, with technological assistance. However, there are points where everyone will want some human interaction, particularly those who are already socially isolated or people who are in the process of being diagnosed with a serious health problem.
This technology will not become dominant overnight, just as we won’t wake up one day to find our cities awash with self-driving cars. As AI health technologies are tested, introduced, iteratively improved and scaled – our comfort levels with them will rise, particularly once we begin to experience the benefits. Allowing an AI to monitor you might benefit even those who are in good health, for example by granting you cheaper health insurance.
Similarly for clinicians, the utility of AI needs to be made evident in order to encourage adoption. In our work, many of the doctors we have worked with have been happy to accept AI assistance when it helped them to do something they previously couldn’t. In the case of our work with one health client, we have seen an AI being used to duplicate a task that doctors already do. Some of the doctors suspect that the AI does not serve the patient’s interests in the same way they do, and that it cannot possibly match their accumulated experience. The evidence is that not only does it match them, it surpasses them in terms of diagnostic accuracy.
As with any transition as impactful as the digital transformation of healthcare, we cannot be expected to take it at face value, nor should we expect the rate of innovation to overwhelm us. The reality will be much more human-centred. The difference is that the successful applications of AI in health will address and alleviate real human need – they will be efficacious, designed to make doctors’ work easier and patients’ treatment better.
That’s not to say there won’t be obstacles. We will face similar issues as with automation but they are potentially more serious. Who is responsible if an AI misdiagnoses you? Then there is the ‘black box’ nature of AIs, meaning we might be unable to know how the system reached a decision – something that could complicate both accountability and ongoing medical research.
What steps can those building these self-learning solutions take to ensure both the efficiency and acceptability of these AIs? We believe getting the underpinning design process right is key.
User-centred Design and Healthcare
At ustwo, collaboration is core to our approach with all our client projects and own IP experiments. However, healthcare has particular challenges that we have to take into account alongside this process.
It is our working partnerships that have made it possible to do the work we do. Shared expertise and different perspectives are crucial to every step of the process; making software any other way is much harder. Using ‘traditional’ approaches in healthcare doesn’t work. Gathering requirements, then having technologists retreat for an extended period before returning to reveal the finished application, is how you end up with the kind of healthcare apps that slow clinicians down, leave them frustrated or don’t meaningfully help the treatment of patients.
Instead, we work in collaboration with clinicians – testing our ideas with them every step of the way. In almost every other sector we have worked in, we have become expert enough to make some decisions about a problem. Healthcare is much too complicated for that: it makes access to, and working relationships with, experts not just important but vital.
It’s important to build a genuinely single team, where doctors, designers and technologists can work together and share expertise. Bringing together skill sets and ways of working from two very different industries is both a challenge and an opportunity. Sometimes, there are conversations about managing timeframes or technological limitations – making a certain change a week before launch might just not be possible, for example. But these conversations are an integral part of creating the shared understanding needed to launch a product that really works.
What isn’t important is that they have the same technological knowledge. We’ve worked with some doctors who also code but, while that adds something to the process, it isn’t essential. In fact, sometimes a little technology knowledge can harm the process because our partners will be less likely to ask for things that they think might be difficult. We don’t want them to be concerned about that. If something needs to be on the screen then we want them to ask for it and we’ll work out how to get it there.
The agile approach involves gathering requirements, developing functionality, deploying, testing and then reviewing before beginning the cycle again. There has been some debate about whether it is appropriate for the healthcare space. We would say that it is actually more important for this space than for most others. It is essential to have that tight feedback loop with clinicians because it can take time to get exactly the right outcome.
One screen of the app that we worked on with one of our health clients, for example, went through around 200 iterations before we were all happy with it. There is a process of trial and error, experimentation and exploration that has to be done together.
Another advantage of this approach is that it allows us to deliver early and often. We can show the clinicians a prototype very quickly and this helps to develop understanding and buy-in. As we tweak the product and return to show them new versions, the level of interest and enthusiasm tends to remain high.
The process of designing for patients is not so different from designing for users of other kinds of service. We have found a receptive audience for healthcare applications. Many individuals – whether that’s clinical patients or well people engaged in preventive self-care – are already likely to be comfortable using the kinds of devices that will enable them to more actively manage their own health. Smartphone penetration in the UK is high, tablet penetration is still increasing and many people are using wearables and fitness trackers. All of these devices can play a role in self-monitoring, whether it is of general wellbeing or of a specific, ongoing condition.
Access to technology has also made people more willing, and able, to manage their own health. Once a task becomes simple and accessible enough, people will take to it if they can see a benefit. Maintaining your health with minimal effort is a tangible enough benefit for most. The real task is making these user-focused applications as medically rigorous and meaningful as possible.
Designing with clinicians is slightly different. These people use their personal devices, of course, and are customers and patients outside of work. However, there are peculiarities to the way doctors work and think that mean it takes some time to learn how best to work with them. As one clinician put it early in one of our projects: if you have three doctors in a room then you’ll get five different opinions.
Nik Barnes, consultant radiologist and chief clinical information officer at Alder Hey Children’s Hospital, in Liverpool, said people often underestimate how complicated hospital systems are. He said that working closely with ustwo had enabled him to help convey those issues.
“A hospital isn’t like a factory,” he said, “with the identical materials going in at one end and identical products coming out at the other. Patient needs vary enormously and as a result hospitals have built up complicated processes and legacy technology systems going back years. Anyone working with us has to be able to understand that. ustwo have done that with enthusiasm and brought interactivity to the whole process.”
In our work with one health client, we have worked closely with two groups of doctors – one group of subject matter experts and another who are day-to-day practitioners. This gives us a good range of perspectives on how to present the information they need, though in this case the two groups didn’t always agree. However, there is of course a respect for experience and frontline knowledge – whether that comes through more operations, more diagnoses, more research or more intimate experience with patients. That means, as we learned quickly, that coming to a consensus often means seeking the opinion of whoever is the most experienced.
“One of the biggest obstacles to change is not the technology but requiring people to learn whole new processes,” said Mr Barnes. “When you force people to make major changes to the processes they follow – especially in an environment like a hospital, where staff are stressed and budgets are tight – then they are likely to not bother. Clinicians don’t have much time to learn new systems.”
Sometimes, one purpose of our projects is to bring about organisational change. A client will ask us to develop something that will become the seed of a new way of working for that organisation. With some of our health clients, broader organisational change happens as a result of the way we work together.
Often, when we’ve worked with doctors, they become the strongest advocates for the software once we present it back. They are often the ones insisting that they carry on working this way and not revert to old ways because they have seen the results. Seeing that kind of reaction is incredibly rewarding and encouraging for future projects in this space.
As we have seen, many of the obstacles that stand in the way of increased automation and use of AI in healthcare are systemic, social or cultural. These are complex barriers – but, we believe, product design is one very practical way to start overcoming them. Working with clinicians can uncover some of these obstacles and we can find ways to clear them. And placing the user at the centre of the design process and practising iterative design allows us to test and refine our ideas until we have a solution that works for patients and clinicians alike.
There is an enormous amount of opportunity in this space and the potential work that can be done here is too important to be ignored.
We have a great opportunity to use technology to transform healthcare. The circumstances are not just right for change – they make it imperative. Our healthcare system is under stress because we are living longer, our health problems require more long-term treatment and there is a squeeze on how much money is going into the system.
Automation and AI can help the system to cope better but they will also allow us to do things that could not be done before, such as making earlier diagnoses. Some of the solutions will be simple, such as using a smartphone app to track and order repeat prescriptions. Others will be vastly more complex, such as smart steering wheels that monitor our grip for the very specific tremors that can be an early indicator of Parkinson’s disease.
Diagnosing cancer will be vastly more efficient when carried out by a learning machine that is able to make an accurate diagnosis using its knowledge of hundreds of thousands of cases – far more than a human clinician could see in a lifetime, let alone remember. And the availability of telehealth services will turn hospitals into medical hubs that remotely monitor their patients, calling them in only when procedures are required that cannot be carried out by video chat and sensors.
The worry that this will replace clinicians is misplaced. Such are the demands on the healthcare system, and the irreplaceable need for human empathy in healthcare, that the value lies in this as assistive technology. In some areas, such as diagnostics, we will see more machines and fewer humans. That will free up those humans to do something more useful that machines cannot yet do. And doctors will remain the primary decision makers in providing care, with new technology acting as an assistant.
Mr Hennessey believes that people will learn to trust AI in healthcare – just as pilots trust the autopilot – but there will also be a need for the doctor-patient relationship. He said: “Patients have to feel safe and cared for. That’s an important part of being a doctor. A huge part of my job is making people feel less anxious and explaining things to them.”
Likewise, patients will be able to automate lots of routine medical procedures, such as ordering and collecting repeat prescriptions, where they might currently have to see a clinician but don’t necessarily want, or need, to. But there are some tasks where patients very definitely want to see a person, perhaps to ask specific questions, and those interactions will remain. Indeed, automating some of the routine tasks that don’t require face-to-face time with a clinician should make more time available for the interactions that do.
Sometimes the technology itself will facilitate more rewarding interactions between doctors and patients. In some of our health work we have found that doctors will visit a patient on a ward and then, because of a lack of mobile technology, have to return to their office to issue a prescription, update notes or perform other tasks that they consider to be ‘admin’. Mobile apps that allow the doctor to perform all of these tasks at the bedside mean more time spent with patients and, we think, will change how these tasks are perceived.
When doctors have the information they need in an easily digestible form, they have the option to show this to the patient and explain their diagnosis. Again, this is technology serving to enrich the relationship between patient and doctor.
But this only works with well-designed services, built in close collaboration with doctors and patients. It seems obvious but this hasn’t always been the case with healthcare technology. We have seen too many examples of systems that seem to have been built to meet the needs of managers or scientists or accountants – anyone but actual clinicians, it seems.
We face a huge problem with how we deliver healthcare today and how we deliver it in future. We also have some immensely powerful technological solutions. The technology will probably arrive whether we like it or not – if we want it to be effective then technologists and clinicians need to work together.
To discuss how ustwo can partner with you on healthcare solutions, get in touch – firstname.lastname@example.org.