
The Devastating Impact of Artificial Intelligence (AI) on the Traditional Ways of Practising Imaging and Cardiology: Are ‘imagers’ an endangered species, and should trainees rethink their careers?



Introduction

Artificial Intelligence (AI) - software used by computers to mimic human intelligence[1] - has been in the news a lot recently, due to the advent of ChatGPT (OpenAI, CA, USA).[2] In our view, the publicised performance of this software[3] is, at most, a high-level gimmick, a sort of slick party trick for people who have time on their hands and who don’t mind paying yet another digital subscription. As is increasingly common, the noise tends to drown the message, and it is rather difficult to reach a personal and balanced opinion, what with luminaries musing about an ‘existential threat to humanity’,[4] Hollywood actors panicking about their likeness being embodied digitally and exploited with no benefit accruing to them[5] and pundits wringing their hands about the risk of fake news that looks more plausible than reality itself.[6]

The authors of this piece are cardiologists interested in coronary intervention as well as cardiac imaging, and are also, by nature, sceptical and pessimistic. It is from this vantage point that we would like to paint a (purely personal) picture of how we see cardiac imaging evolving over the next 5–10 years, with AI now firmly on the scene, at least in the West. We think ChatGPT has hit the headlines for the wrong reasons, and, with actors generally being a more vocal group than doctors, it is not surprising that alarming developments in Hollywood are more visible than concerns over what we think will be the effective obliteration of large swathes of the medical profession. The point needs to be made, however, and, even more importantly, we, the medics, should prepare actively for the new world ushered in by AI.

How do AI algorithms work?

This was a question we asked ChatGPT, which generated an impressive and succinct response which both illustrates its abilities and demonstrates to the reader how the technology works.

The authors’ query to ChatGPT: “Write a single paragraph on how artificial intelligence algorithms work”

ChatGPT response (July 2023): “Artificial intelligence algorithms work by processing vast amounts of data through predefined mathematical models and rules to identify patterns, correlations, and features within the data. These algorithms can be tailored to specific tasks, such as image recognition, natural language processing, or decision-making. Through a process called training, the algorithms adjust their parameters iteratively, reducing the error between predicted outputs and actual outcomes. As the algorithms learn from data, they become increasingly proficient at making predictions or performing tasks, achieving a level of performance that may rival or exceed human capabilities in certain domains. The success of AI algorithms heavily relies on the quality and quantity of the data they are exposed to, as well as the suitability of the chosen mathematical models for the given task.”

Of interest to the authors is the self-proclaimed superiority of AI, which describes its own level of performance as one that may ‘rival or exceed human capabilities’. When it comes to image interpretation and pattern recognition, this superiority may well be real, given AI’s rapid access to historical imaging data (the ‘training’ datasets) and its ability to analyse images in detail by relying on raw, individual pixel data, an approach that has been tested on histological,[7] electrocardiographic[8] and indeed radiographic images.[9,10] All of these studies have concluded that AI simulates and supports human intuition and decision making as it relates to health care.[11]
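The training process that ChatGPT describes, iteratively adjusting parameters to reduce the error between predicted outputs and actual outcomes, can be made concrete with a minimal sketch. The example below is purely illustrative (the data, the tiny linear model and the variable names are our own assumptions, not taken from any clinical system): a toy model fitted by gradient descent.

```python
# Minimal, illustrative training loop (gradient descent on a linear model).
# All data and names here are synthetic and hypothetical.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                       # 200 synthetic examples, 3 features
true_w = np.array([1.5, -2.0, 0.7])                 # the "ground truth" the model should learn
y = X @ true_w + rng.normal(scale=0.1, size=200)    # observed outcomes (with noise)

w = np.zeros(3)        # model parameters, initially uninformative
lr = 0.05              # learning rate
for step in range(500):
    pred = X @ w                    # predicted outputs
    error = pred - y                # error vs. actual outcomes
    grad = X.T @ error / len(y)     # gradient of the mean squared error
    w -= lr * grad                  # iterative parameter adjustment

print("learned parameters:", np.round(w, 2))        # converges towards [1.5, -2.0, 0.7]
```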

Impact of AI on cardiac imaging
Cardiac imaging is a rapidly growing area

Cardiac imaging is a fast-expanding field, with demand for certain imaging tests, such as CT coronary angiography (CTCA), growing at astronomical speed.[12] This focuses investors’ attention, and pressures to maximise profits are greater than in other fields of health care, such as, say, cardiac rehabilitation or palliative care in end-stage heart failure. Having said that, the impact of such ‘non-sexy’ interventions on quality of life is far higher than that of, for instance, prophylactic coronary calcium scoring coupled with other non-invasive cardiovascular screening tests.[13]

Scalability

Diagnostic tests are poorly ‘scalable’:[14] there is a limit to the number of scans that can be performed and interpreted per unit of time, determined by the physical laws of the Universe and by the availability of scanners and of trained radiologists (whereas, for instance, a 3-minute song can become a hit, be downloaded or streamed billions of times and generate almost limitless revenue; it is highly scalable). It follows that, in order to maximise profits, increasing the availability of imaging is a less effective strategy than cutting the costs associated with it. Combined with rising potential profits at a time when demand for imaging is growing, this poor scalability creates a perfect storm for the ‘rise of the machines’ and the elimination of human involvement in imaging.[15]

How does AI compare to ‘imagers’?

Reviews on the topic of AI in cardiac imaging are generally upbeat and emphasize the ‘bright side’ of AI,[16] which is deemed to make cardiac imaging ‘thrive’[17] and which is credited with benefits, for instance during the COVID-19 pandemic.[18]

Image reporting by radiologists or cardiac ‘imagers’ is pattern recognition backed up by experience and judgement. AI can do that much faster, more accurately and more cheaply than humans. While a human needs decades of training in order to deliver a productive professional lifespan of two or three decades, AI trains itself in days or weeks using the whole accumulated experience of humanity in any given field. Throughout a doctor’s career, relatively high wages,[19] sickness benefits and pension plans need to be paid by the employer, who remains financially liable (through having to pay a pension) potentially for decades after the doctor has retired and stopped generating any value for the health care system. Humans make mistakes, and their performance may vary unpredictably due to external factors over which they have no control, which drives increasing levels of litigation, with added financial costs to employers.[20]

In comparison, AI is almost implausibly cheap,[21] and, as a consequence of Moore’s law (although we accept its continued validity is debated),[22] it is likely to become progressively cheaper over time. Moreover, AI image interpretation is consistent, fast and of at least similar accuracy to human reporting, even in studies performed from an ‘AI-hostile’ point of view,[23] leading to the opinion that ‘we should stop training radiologists now’.[24] AI also makes radiomics possible, i.e. the extraction of quantitative data from an imaging dataset;[25] early and already widely used applications include longitudinal strain in echocardiography and T1 mapping in cardiac magnetic resonance (CMR), but the potential of this type of ‘deep dive’ into data mining is much broader and not yet fully realised.
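To illustrate what radiomics involves at its simplest, the sketch below computes a handful of first-order features (intensity statistics and a crude entropy measure) from the raw pixel values of an image. It is a toy example built on our own assumptions: the image is a synthetic array standing in for a single slice, and the feature set is far smaller than a real radiomics pipeline would extract.

```python
# Toy sketch of first-order radiomic feature extraction from raw pixel data.
# The "image" is synthetic; a real pipeline would load DICOM/NIfTI data and
# compute hundreds of shape, intensity and texture features.
import numpy as np

rng = np.random.default_rng(1)
image = rng.integers(0, 256, size=(128, 128)).astype(float)  # stand-in for one image slice

pixels = image.ravel()
counts, _ = np.histogram(pixels, bins=64)
p = counts / counts.sum()          # probability of each intensity bin
p = p[p > 0]

features = {
    "mean_intensity": float(pixels.mean()),
    "std_intensity": float(pixels.std()),
    "min_intensity": float(pixels.min()),
    "max_intensity": float(pixels.max()),
    "entropy": float(-(p * np.log2(p)).sum()),  # rough measure of heterogeneity
}
print(features)
```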

Finally, AI can identify patterns and structure where the human eye does not discern them and, as such, can detect new cardiac phenotypes, for example in aortic stenosis,[26] or predict the risk of adverse outcomes from seemingly normal ECG tracings.[27]

Impact of AI on patient care pathways

There is no research looking at how AI could directly transform the way health care is delivered throughout the hospital, during ward rounds and at the bedside, with immediate impact on workforce numbers, qualifications and job descriptions. Yet the building blocks for a system that almost completely dispenses with doctors, radiologists and sonographers are already here, and it is only a matter of time before managers, financiers and government health departments join the dots and implement it, in order to reduce costs and (in settings where applicable, such as the USA or, increasingly, Germany)[28] increase profits.

The disparate constituents are already available

One of the most common indications for performing a transthoracic echocardiogram in hospital is the assessment of left ventricular ejection fraction.[29] At present, at least in the UK, the process involves submitting a digital or paper-based request to the local echo department, taking the patient to the scanner when a slot becomes available, having a highly specialised echo physiologist perform the scan over approximately 45 minutes, and then generating a report which is sent to the requester on paper or digitally (depending on local set-up).

The advent of ‘pocket’ and smartphone echo scanners[30] demonstrated that useful, diagnostic-quality images can be acquired by almost anyone after minimal training.[31] Even better, AI can now guide (in real time!) a novice to acquire high-quality transthoracic echocardiograms, as demonstrated in some spectacular YouTube videos.[32]

AI is already deployed on high-end, commercially available scanners found in most cardiology departments in the West, and is able to measure autonomously (without any human intervention) ventricular and atrial volumes and ejection fractions, as well as global longitudinal myocardial strain, a process that takes about half a minute and requires only a single 3D, 4-chamber acquisition from the apex.[33]
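The arithmetic such a scanner performs once the ventricle has been segmented is itself trivial; the sketch below shows the ejection fraction calculation from end-diastolic and end-systolic volumes. The function name and the example volumes are ours, purely for illustration, and do not reflect any particular vendor’s implementation.

```python
# Minimal sketch: ejection fraction from AI-derived end-diastolic (EDV)
# and end-systolic (ESV) volumes. The values below are hypothetical.

def ejection_fraction(edv_ml: float, esv_ml: float) -> float:
    """Return ejection fraction (%) given EDV and ESV in millilitres."""
    if edv_ml <= 0 or not 0 <= esv_ml <= edv_ml:
        raise ValueError("require 0 <= ESV <= EDV and EDV > 0")
    return (edv_ml - esv_ml) / edv_ml * 100.0

print(ejection_fraction(120.0, 48.0))  # -> 60.0, a normal left ventricular EF
```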

A bleak future

Here is how we see the hospital echocardiography service 10 years from now. There will be at most two or three people specialised in echo (with skills equivalent to those of any of the 8–10 highly specialised echocardiography physiologists in a contemporary UK department) in any average-sized cardiac department. They will be there primarily for quality control (QC) and for the occasional complex patient. Just as today’s nurses do ward rounds scanning bladders in oliguric patients, minimally trained health-care assistants will do echo ward rounds, acquiring apical 4-chamber 3D datasets potentially in all patients, using dedicated small scanners with on-board (or Bluetooth/Wi-Fi-accessible) AI to guide image acquisition. In poorer settings they will acquire just 2D images under AI guidance, perhaps using pocket or smartphone-based scanners. Once a study is finalised, the datasets will be uploaded to a central server where AI will produce bi-atrial and bi-ventricular volumes and ejection fractions and (if deemed useful) some myocardial strain-based parameters.

The research to prove the next point is yet to be performed, but we think a single 3D colour flow mapping acquisition from the apex could resolve the assessment of the severity of valvular regurgitant lesions in most patients with regurgitation of the atrioventricular valves, and a parasternal acquisition could do the same in those with aortic valve lesions. Patients with difficult images will be reviewed by the few remaining sonographers, with a view to considering either echo contrast studies or (more likely) immediate CMR (which will also be entirely automated, with mostly autonomous reporting - current reporting software already uses AI to measure volumes and ejection fraction, and it is increasingly accurate).

We emphasize that the technology to implement such a model is already available; all that is needed is for the penny to drop and for someone to join the dots - the cost savings would be astounding. Through a combination of inertia, possible active resistance from various professional medical bodies (totally understandable, because this would ruin the huge industry of echo reporting in fee-for-procedure healthcare systems) and simple, blissful ignorance, we continue to practise in the old ways and to train new generations to acquire skills that will make them redundant within years.

What can we do to survive?

A simple answer seems to be: choose a job where you do ‘stuff’ to the patient with your hands. Even with extensive automation, we think the jobs of basic carers will be protected for the foreseeable future (it is much simpler to train a robot to interpret complex imaging datasets than to change soiled bed sheets; that may eventually come, but it would require major breakthroughs in computer science, materials science and engineering).

At the high end, cardiac surgery is unlikely ever to be taken over, but its scope is now significantly restricted by the massive expansion of PCI and of structural intervention. Aortic surgery, however, is unlikely to disappear or to be taken over by AI, and the same is true of surgery for complex congenital heart disease.

If, in spite of all this, you want to do imaging, perhaps imaging complex congenital heart disease will require the involvement of human brains for longer. However, the paediatric patient population usually generates superb echo images, owing to thin chest walls and small body size, meaning that training AI bots to a high standard should probably be feasible even for complex anatomies.

Emergency-focused echo is unlikely to be automated because of its time-critical nature, where slick, highly trained operators will maintain an edge over AI for a long time to come. For the same reasons, we think transplant cardiology and LVAD/Impella implantation and management will also remain the preserve of cardiologists.

A shrinking island

We thought, until recently, that an obvious imaging niche where humans seemed protected was intra-procedural echo (mainly TOE, but also ICE). However, we are now less sanguine than we were just a few weeks ago. At 14:35 on 14 June 2023, a post appeared on the Twitter feed (@GHCardiology) of Glenfield Hospital (a major teaching and NHS institution in Leicester, England) describing how a nurse practitioner (identified only as ‘John’, and depicted with a mask covering the lower half of his face) had just completed a TAVR procedure as first operator. The story was presented with a very positive spin, as a major breakthrough and ‘a true transformation addressing NHS needs’. Within minutes, a huge backlash developed, to the point where the British Cardiovascular Intervention Society (BCIS) was forced to release an official statement to address the outcry. Trainees as well as senior, established consultants joined in an indignant chorus of opprobrium: such a development was perceived as putting patients at risk, flying in the face of the training needs of structural fellows, and demeaning a complex procedure by devolving it to a non-physician grade. Clearly, if an NP can perform TAVR, there is no reason why they (or echo physiologists) cannot do intra-procedural TOE to guide the small number of structural interventions still requiring it; in fact, anecdotal evidence suggests this is already happening in some centres.

Implications for training

We think the stethoscope should be abandoned and ubiquitously replaced with ‘point-of-care’ pocket ultrasound scanners (POCUS). Trainees should be taught to image all scannable parts of the body as part of the physical examination, starting with their first clinical attachment, and POCUS should become the new stethoscope. In this way it may be possible to maintain the relevance of the doctor in the rapidly shifting, unfamiliar landscape of contemporary medicine. Otherwise, our prediction is that we will be decimated and annihilated by the rise of AI, coupled with dwindling resources and increased demand from an ageing, sicker population.

Conclusion

There is little doubt that AI will shape the future, and no denying that it will replace many of the skills currently exercised by doctors. In the field of cardiology, the skills most at risk of being replaced are routine imaging and other non-procedural skills, which are in any case increasingly being devolved to non-physician grades. We therefore write this pessimistic article to provoke deeper thought and to prompt the cohort of cardiology trainees to reassess their priorities and chosen plans, so as to ensure they have a long, AI-proof career.
