According to Go Live training, a doctor's daily tasks can be compared to three types of fires:
- The fire in the garden. For example, an inpatient who is deteriorating rapidly, or a patient who urgently needs a scan while the waiting list is simply too long.
- The fire down the road. These are broader challenges in delivering care, such as outpatient clinics running with unacceptable delays, or the struggle to secure enough staff for a minimally viable service.
- The fire far away, covering more distant priorities. This is the category where most service improvements happen.
Traditionally, adopting new technology such as artificial intelligence (AI) sits firmly in the last category, understandably taking second or third place to direct patient care. Adam Kay's phenomenal “This Is Going to Hurt” paints a detailed picture of the realities of doctors' lives. For a better understanding of working within these systems, Henry Marsh's “Do No Harm” clearly articulates the challenges consultants face. Both books show how driven doctors are to provide patient care, and how much they depend on the systems around them to do so.
Patients must be prioritised, of course. However, workforce limitations mean that progress on larger-scale digital transformation, and on the way we work with data, often stalls. Only a handful of doctors have joined the conversation, and we cannot advance patient care if we carry on this way.
While the press tends to sensationalise AI, a recent survey revealed that healthcare organisations and executives now have a more realistic understanding of where and how it can help. Operational areas, which are less likely to cause anxiety among clinicians and patients, are leading adoption ahead of life-critical clinical functions. This is a wise approach, and it may shorten the phase of disappointment that so often follows when new technologies are overhyped and oversold.
In addition, clinical responsibility cannot be fully assigned to machines. In technical terms this is not yet possible, and in ethical terms the implications still need to be properly thought through and controlled. Clinicians are trained and regulated by their professional bodies; there is no equivalent for machines. We will not be handing complete clinical decisions over to algorithms any time soon.
Humans are known to make errors, yet there is an expectation, rightly or wrongly, that machines should not. At this stage, clinical AI needs time to learn from real humans and to improve, while minimising any risk to patients. In effect, we are talking about augmented intelligence rather than artificial intelligence.
The conundrum resembles that of first-time job seekers who need experience to get hired, yet cannot gain experience because no one will hire them. How can health systems introduce AI when doctors are already stretched thin? AI requires trials, and it needs clinical input for the machine to learn.
We should not expect doctors to prioritise AI development over patient care, but we should start facilitating the inclusion and engagement of ideas from many more doctors rather than a select few. Most doctors already have clinics overflowing with patients battling disease. It is neither practical nor realistic to suggest that an integrated, hardworking care team will be replaced by AI robots.
Instead, we should focus on improving these clinics. Could certain appointments be conducted through telemedicine? If so, which ones would qualify? What digital tools can patients access? How capable is the workforce of using them? We need analytics to gain insight into the experiences of both patients and the healthcare workforce.
We should be using data and insights to tackle the more immediate fires in health services. That will free up valuable time for the healthcare workforce to spend where it belongs: with their patients.