10 Comments
Jim Ryser

I look at AI in medicine the same way I look at AI in music. If you hear AI-produced music, you can tell. And I suspect over the years it will get more polished. But it will never have the soul and spirit that a human can imbue a song with. And AI music is as obvious to me as a rash on my face. I would imagine physicians would feel the same way. Now granted, I'm old. So I guess this is just a warning to those who think AI is going to be a panacea. I see it the same way we used to see UpToDate when it first came out. Used as a companion it can be beneficial. Used as primary, not so much.

Kathy

AI is dependent upon a thorough history and accurate, honest clinical studies. I am not a physician but an occupational therapist. In my experience, it often took multiple visits and establishing a connection with my patient to uncover the many factors contributing to the condition. People often do not remember (an auto accident 10 years ago with a cervical injury that could contribute to current carpal tunnel syndrome), lack the sophistication to report something that may be relevant (a patient with multiple admissions for low sodium who turned out to be drinking bottles and bottles of water a day to be healthy), or do not want to reveal the truth. In addition, we are now learning that research has been falsified. In clinical practice, I had the ability to look at studies and question whether those studies might have a conflict of interest. Or a study may just not have made sense to me, and thus I would hold it as questionable. I don't know if AI will have the capacity to do that. In a perfect world, I think AI would be a fantastic tool. In our flawed world, AI will be another tool that has the potential to be used for both good and evil.

Rudy

Although computer interpretation of ECGs may not be the best example of AI, it should be one of the easiest to develop (recognition of a limited number of patterns). In my opinion, other than for the completely normal ECG, computer interpretations of ECGs are often wrong. The computer often misses important diagnoses, and often assigns potentially serious diagnoses to either artifact or normal variants.

Mary Braun Bates, MD

AI has a couple of advantages over human physicians. AI doesn't care what its patient satisfaction scores are and will not suffer if it is sued for malpractice.

Jim Ryser

🤣

Randy

I’m guessing that AI is subject to the same “leash” as human doctors. That is, it is programmed to suggest the accepted “standard of care,” even if it detects that a better treatment is available. For example, if it had been around in late 2021, AI would not have been “allowed” to recommend ivermectin to a patient with a COVID-positive PCR test, even though its safety and efficacy (both superior to Remdesivir) were well-established by then.

Gene

AI can’t pray and it can’t give compassion. My love for others will never be replaced. ❤️+

Jerome Kassirer

Some will find it odd that the accompanying editorial, which covers the same ground, is not cited. JPK

The Layperson's Layperson

The adjudication process was not outcome-based. We don't know how these patients actually fared, just what (biased?) adjudicators thought should happen.

The adjudication process did not involve costs.

I like what it says about unjustified empirical treatment. If AI can stop empirical treatment it should win the Nobel Prize in Medicine. I would like to know what *justified* empirical treatment is.

Saul

It will be interesting to see the pace of improvement of these “agents,” especially regarding patients who present with more complex histories. How will AI resolve potentially contradictory data?
