21 Comments
Ernest N. Curtis

I'll go out on a limb and predict that AI will accelerate the deterioration in the quality of medical care.

MSB

You mention fostering a holistic sense of well-being, hampering healthcare workers’ roles or work in the interest of the patient by not prioritising personalized care, reducing healthcare workers to passive executors of recommendations. That already happened during covid in the US and UK via the employment of covid protocols. AI wasn’t even in the picture.

Steve Cheung

I’m not sure this piece adds much to the generic concerns about AI in any domain, and not just in medicine.

Hallucination. Over-fitting. Explainability. Plus over-arching concerns about pt consent and legal liability. This is not news (or at least shouldn’t be).

Carllo

There are no neutral algorithms. These AI bots would undoubtedly prescribe mRNA jabs to the world without an ethical blink of a shudder's "eye."

toolate

No mention of privacy concerns???

De-identification may soon be a thing of the past

Or already is

Travis S Allen, MD

This is a really poorly written piece that is not up to the standards of Sensible Medicine. The author describes concerns about AI as though he has no experience with it whatsoever. Recommendations provided by AI can be tailored based on the guidelines and prompts you provide. Typically, they are more (not less) thorough. Try typing in "recommendations for orthostatic hypotension management inpatient" and let me know if most hospitalists are doing all of those things.

And then there’s this little toss-in: “Additionally, AI-based models may promote inequalities by not adequately considering the needs of minorities.” Being a minority ought to have no bearing on the quality of care you receive, obviously. Know your audience. It’s called sensible medicine, not woke medicine.

Brad

"This introduces the potential for asymmetry between AI and human expertise, running the risk of reducing healthcare workers to passive executors of AI recommendations."

Too late, EPIC EMR is already doing that...

Paul Railsback

Why do physicians refer to themselves as "health care workers"?

Eliza Holland, MD

We don’t; “health care provider” is the name given to us by the system. Often, there is not a “physician” box when you are asked your role. We are now lumped in with “licensed independent providers,” which includes nurse practitioners and physician assistants. It is absolutely a generic term meant to diminish the differences between people with different levels of experience and education.

Ernest N. Curtis

Good question. My feeling about the loose terminology hasn't changed in the ten years since I wrote a short article for JunkScience.com titled "The Delusion of Health Care". I'll quote the opening paragraph: "I don't like the term "health care". Health doesn't require care---illness does. I don't like the terms "health care provider" and "health care facility". They lead people into unrealistic thought patterns about what doctors and hospitals are all about and, more importantly, what they can and cannot do. I really dislike terms like "health care maintenance organization" and "health care system". The first is an outright lie and the second at least a gross misrepresentation. We don't have a health care system; we have an illness care system or what I would call a medical care system. Doctors and hospitals can't provide health or even maintain it."

Edward Brown

Do you prefer a different name such as “sick care workers”?

Zade

Maybe "health care workers" sounds a little too nondescript and generic, almost Marxist in whitewashing legitimate differences between doctors and nurses. Kind of like "pregnant people".

JohnS

I think concerns about AI are misplaced. AI is the latest form of automation. People have always been afraid of automation because they think it will cause mass unemployment. But since the invention of the wheel, automation has instead made us more productive. This is how the economy grows.

AI can be a great tool for patients and doctors. For patients it will be very convenient to get answers to common questions, thus avoiding unnecessary trips to the doctor. A recent study comparing AI to real doctors on Reddit has shown that AI can compete favorably at answering common questions. For healthcare workers, it will make them more productive. They can offload grunt work to the AI and then spend more time on deeper, more complex health-related issues.

My concern about AI is that the establishment will make it woke. For AI to live up to its potential, it has to be trained only with high quality data. It also has to be trained to understand the difference between low quality data and high quality data. It must be allowed to tell the truth. The problem is that an honest AI will often draw conclusions that disagree with establishment and political interests. When this happens there will be pressure to retrain the AI with lower quality data to give the establishment the answer they want. I think we should be very leery of letting all the stakeholders participate.

Zade

There is legitimate concern that AI, by feeding off of information on the Internet that's already been mined and presented by AI, will spiral down a rabbit hole. It would be concerning if AI were to mine data from papers published by the big pharma factory that keeps its data private, cooks the conclusions, and pays for "top experts" to be primary author. And overworked doctors would be expected to trust the guidelines developed that way because this AI black box cranked them out. I find this truly depressing.

Marci Kessen

Yeah, like us healthcare workers had soooooo much input into the electronic record programs. The constant refrain with those was, “Did anyone that made this thing ACTUALLY work in a clinic, ever??!!??”

How on earth do we make that happen?

Are the programmers actually going to work in each medical environment to see what it really needs? I do not see that happening. They will have this idealized utopia in their minds of how it SHOULD be and design it that way. History has many examples of how utopia chasing works, and it is never good. If they actually followed real docs around and helped them with the things that do drive us nuts (prior auths, for example), then good. Somehow I think the providers and patients will not be the primary focus; it will be “efficiency,” guidelines, cost savings, and all those easy-to-measure metrics over individual patient care. I hope I am wrong, but the EMR set a bad precedent.

Jim Healthy

And why are we to believe that medical corporations, now that they have seized full control of modern healthcare, will suddenly change their spots and dedicate this new technology to anything other than further cutting costs, reducing labor, and maximizing their profits to feed their greed?

Pul-ease!

Crixcyon

Well here we go... AI is the perfect master for the one-size-fits-all modern stone-age medical mafia. Doctors, if they haven't already, will become slaves to AI and lose even more of their thinking ability than they already have. With all this presumed "technical" advancement, why is the country sicker than ever? No thanks, I will continue to avoid allopathic medicine like the plague of death it has become over the last 70 years.

Patrick Dziedzic

Balance is required in the relationship so that the healthcare provider is using AI as the tool to provide care, versus the AI using the provider as the tool to solve data-input-driven algorithms that may have no value for the mentioned stakeholders.

Another reason for healthcare providers to get engaged in the development of medical care AI is to establish AI programs that prioritize the stakeholders before the corporate sector of healthcare prioritizes the monetary or financial aspect.

Carrie C

My reaction to this piece: Since when do healthcare workers ‘have a stake’ in anything except being a cog in the wheel that costs the business money?

“Healthcare workers and patients must champion a human-centered approach to AI innovation.” That is not going to happen; they are already too busy completing the last layer of nonsense added to their day, thought up by well-meaning people.

Sorry to sound so burned out but this healthcare worker is tired of all the wonderful ideas that are endlessly layered over the top of proper patient care and never ever go away.

AI may be useful at some point, yes, but I’ll wait and see if it is a practical help or just more work.

Susanna

One possible good outcome is that AI could integrate info from different specialists regarding a shared patient, predicting potential side effects of meds prescribed by different docs and improving on that silo situation that so many patients experience.
