
There are now hundreds of image-specific AI algorithms across the fields of radiology and cardiology. The FDA approved 520 AI-enabled applications between 2019 and January 2023, and close to 400 of these were for radiology. This is all very exciting.

But what about direct clinical applications? According to a recent study on clinical use of AI in osteoporosis published in the journal Nature, "Applying the AI algorithms in a clinical setting could help primary care providers classify patients with osteoporosis and improve treatment by recommending appropriate exercise programs."

Unfortunately, there is a huge caveat. Problems arise when these algorithms are extended to clinical practice without set standards and without the massive amounts of data needed to train them. And when AI fails in a diagnosis, the health care facility or hospital, and potentially the physician, are vulnerable to lawsuits.

This problem is made worse by unregulated chatbot models that are available to the general public. According to a study just published in the Annals of Biomedical Engineering, "ChatGPT, a language model developed by OpenAI, with its ability to generate human-like text based on large amounts of data, has the potential to support individuals and communities in making informed decisions about their health."


A new study from the Netherlands showed that ChatGPT could correctly answer basic cardiology-related questions 90 percent of the time. But that success rate dropped to 50 percent when the questions became more complex. And then there's the concern that AI can "hallucinate," giving a completely inappropriate answer to a question just when you feel you can rely on it.

This is why a patient can't simply skip the hour in the waiting room and the ten-minute visit with a real doctor and turn to ChatGPT instead. And it's not only a question of accuracy. We doctors also bring clinical judgment, along with empathy and nuance, to a personalized doctor's visit. Risk-benefit analyses of tests, treatments, and vaccines are complex and too personalized for even the most advanced AI.



Which is not to say that AI can't contribute. As the Nature study on osteoporosis noted, these algorithms could help primary care providers classify patients and improve treatment by recommending appropriate exercise programs.


Health insurance companies are starting to use AI to mass-review health insurance claims without actually reading through them. There is little doubt that this streamlining can save time and money, but I am concerned that it will also interfere with personalized health solutions. A 2022 McKinsey review found that AI can automate up to two-thirds of the manual work involved in prior authorizations. Blue Cross trials in Massachusetts have shown this to be a fair estimate, but the larger issue remains: quality of care can be jeopardized in the process.

There is no doubt that we doctors need help – we are buried in computerized paperwork and bureaucratic interfaces that interfere with patient care. But the rush to a computerized solution must not further dehumanize health care.
