AI may someday work medical miracles. For now, it helps do paperwork.
Dr. Matthew Hitchcock, a family physician in Chattanooga, Tennessee, has an AI helper.
It records patient visits on his smartphone and summarizes them for treatment plans and billing. He does some light editing of what the AI produces, and is done with his daily patient visit documentation in 20 minutes or so.
Hitchcock used to spend up to two hours typing up these medical notes after his four children went to bed. “That’s a thing of the past,” he said. “It’s quite awesome.”
ChatGPT-style artificial intelligence is coming to health care, and the grand vision of what it could bring is inspiring. Every doctor, enthusiasts predict, will have a superintelligent sidekick, dispensing suggestions to improve care.
But first will come more mundane applications of artificial intelligence. A prime target will be to ease the crushing burden of digital paperwork that physicians must produce, typing lengthy notes into electronic medical records required for treatment, billing and administrative purposes.
For now, the new AI in health care is going to be less a genius partner than a tireless scribe.
From leaders at major medical centers to family physicians, there is optimism that health care will benefit from the latest advances in generative AI — technology that can produce everything from poetry to computer programs, often with human-level fluency.
But medicine, doctors emphasize, is not a wide-open terrain of experimentation. AI’s tendency to occasionally create fabrications, or so-called hallucinations, can be amusing in other settings, but not in the high-stakes realm of health care.
That makes generative AI, they say, very different from the AI algorithms already approved by the Food and Drug Administration for specific applications, like scanning medical images for cell clusters or subtle patterns that suggest the presence of lung or breast cancer. Doctors are also using chatbots to communicate more effectively with some patients.
Physicians and medical researchers say regulatory uncertainty, and concerns about patient safety and litigation, will slow the acceptance of generative AI in health care, especially its use in diagnosis and treatment plans.
Those physicians who have tried out the new technology say its performance has improved markedly in the past year. And the medical note software is designed so that doctors can check the AI-generated summaries against the words spoken during a patient’s visit, making the output verifiable and fostering trust.
“At this stage, we have to pick our use cases carefully,” said Dr. John Halamka, president of Mayo Clinic Platform, who oversees the health system’s adoption of artificial intelligence. “Reducing the documentation burden would be a huge win on its own.”
Recent studies show that doctors and nurses report high levels of burnout, prompting many to leave the profession. High on the list of complaints, especially for primary care physicians, is the time spent on documentation for electronic health records. That work often spills over into the evenings, after-office-hours toil that doctors refer to as “pajama time.”
Generative AI, experts say, looks like a promising weapon to combat the physician workload crisis.
“This technology is rapidly improving at a time health care needs help,” said Dr. Adam Landman, chief information officer of Mass General Brigham, which includes Massachusetts General Hospital and Brigham and Women’s Hospital in Boston.
For years, doctors have used various kinds of documentation assistance, including speech recognition software and human transcribers. But the latest AI is doing far more: summarizing, organizing and tagging the conversation between a doctor and a patient.
Companies developing this kind of technology include Abridge; Ambience Healthcare; Augmedix; Nuance, which is part of Microsoft; and Suki.
Ten physicians at the University of Kansas Medical Center have been using generative AI software for the past two months, said Dr. Gregory Ator, an ear, nose and throat specialist and the center’s chief medical informatics officer. The medical center plans to eventually make the software available to its 2,200 physicians.
But the Kansas health system is steering clear of using generative AI in diagnosis, concerned that its recommendations may be unreliable and that its reasoning is not transparent. “In medicine, we can’t tolerate hallucinations,” Ator said. “And we don’t like black boxes.”
The University of Pittsburgh Medical Center has been a test bed for Abridge, a startup led and co-founded by Dr. Shivdev Rao, a practicing cardiologist who was also an executive at the medical center’s venture arm.
Abridge was founded in 2018, as large language models, the technology engine behind generative AI, were emerging. The technology, Rao said, opened the door to an automated solution to the clerical overload he saw all around him in health care, including for his own father.
“My dad retired early,” Rao said. “He just couldn’t type fast enough.”
Today, the Abridge software is used by more than 1,000 physicians in the University of Pittsburgh medical system.
Dr. Michelle Thompson, a family physician in Hermitage, Pennsylvania, who specializes in lifestyle and integrative care, said the software had freed up nearly two hours in her day. Now, she has time to do a yoga class, or to linger over a sit-down family dinner.
Another benefit has been to improve the experience of the patient visit, Thompson said. Typing, note-taking and other distractions are gone. She simply asks patients for permission to record their conversation on her phone.
“AI has allowed me, as a physician, to be 100% present for my patients,” she said.
The AI tool, Thompson added, has also helped patients become more engaged in their own care. Immediately after a visit, the patient receives a summary, accessible through the University of Pittsburgh medical system’s online portal.
The software translates any medical terminology into plain English at about a fourth-grade reading level. It also provides a recording of the visit with “medical moments” color-coded for medications, procedures and diagnoses. The patient can click on a colored tag and listen to a portion of the conversation.
Studies show that patients forget up to 80% of what physicians and nurses say during visits. The recorded and AI-generated summary of the visit, Thompson said, is a resource her patients can return to for reminders to take medications, exercise or schedule follow-up visits.
After the appointment, physicians receive a clinical note summary to review. There are links back to the transcript of the doctor-patient conversation, so the AI’s work can be checked and verified. “That has really helped me build trust in the AI,” Thompson said.
In Tennessee, Hitchcock, who also uses Abridge software, has read the reports of ChatGPT scoring high marks on standard medical tests and heard the predictions that digital doctors will improve care and solve staffing shortages.
Hitchcock has tried ChatGPT and is impressed. But he would never think of loading a patient record into the chatbot and asking for a diagnosis, for legal, regulatory and practical reasons. For now, he is grateful to have his evenings free, no longer mired in the tedious digital documentation required by the American health care industry.
And he sees no technology cure for the health care staffing shortfall. “AI isn’t going to fix that anytime soon,” said Hitchcock, who is looking to hire another doctor for his four-physician practice.
This article originally appeared in The New York Times.