Artificial intelligence is rapidly becoming part of everyday life in American hospitals.
From patient scheduling to medical documentation, health systems across the country are racing to adopt new AI-powered software that promises to save time and reduce burnout.
However, many doctors and patient advocates are warning that the technology is moving faster than the safeguards meant to protect patients.
For psychotherapist Paul Boyer of Kaiser Permanente in Oakland, California, the AI revolution has not exactly lived up to the hype.
Kaiser recently introduced a note-taking system developed by healthcare AI company Abridge.
The software is designed to summarize patient visits quickly and reduce the paperwork burden many doctors face each day. Although the technology has been praised by some clinicians, Boyer says the system often creates more work instead of less.
The AI scribe is “not super useful,” Boyer explained, because he and his colleagues frequently have to correct the notes generated by the system.
Boyer said Abridge is “not good at picking up on clinical nuance, at picking up on the emotional tone” that can be especially important in mental health treatment. For patients experiencing mania, he explained, the tone and delivery of speech can matter more than the actual words spoken. That is where the software struggles.
AI note-taking systems are no longer experimental technology. Hospitals throughout the nation are already deploying them at a rapid pace.
Some research suggests the tools may provide real benefits. A study published in April in the Journal of the American Medical Association found that doctors who heavily used AI scribes saved more than thirty minutes of work each day after the systems were installed.
At the same time, questions remain about reliability and safety.
Researchers worry that doctors may eventually become too dependent on AI-generated notes and fail to catch mistakes. If incorrect information enters a patient's medical record, future doctors could end up relying on flawed documentation when making treatment decisions.
Abridge says it closely monitors the quality of its software throughout deployment. The company’s director of applied science, Davis Liang, stated, “Following deployment of a model, we monitor clinician edits, star ratings, and free text feedback from clinician users about note quality.”
Still, critics argue that federal oversight has not kept pace with the explosive growth of AI tools in healthcare.
“There is currently no safeguard in place” to vet scribe software at the federal level, said Raj Ratwani, a researcher at MedStar Health who studies how humans interact with technology in healthcare settings.
Ratwani is particularly concerned about proposed federal rule changes that could weaken requirements for medical record transparency and usability. According to critics, those changes could make electronic health records more confusing for doctors and nurses, thereby increasing the risk of medical errors.
Beginning during the Obama administration, federal regulators encouraged software companies to conduct "user-centered design" testing. In practice, that meant developers had to test products with doctors and nurses to ensure the systems were understandable and practical in real-world situations.
However, proposed changes from the Department of Health and Human Services under Secretary Robert F. Kennedy Jr. would eliminate several of those requirements.
Supporters of the changes argue that excessive regulation has slowed innovation and limited competition in the healthcare technology market.
Abridge general counsel Tim Hwang said the company “broadly supports” the new proposals because they represent a “necessary modernization” that “accommodates the speed at which AI is evolving.”
Ryan Howells of Leavitt Partners also defended the changes, saying the old rules “put way too much burden” on electronic health record systems. He argued that federal regulations are “the single biggest inhibitor to true clinical innovation.”
There is no question that the healthcare records market has become highly concentrated. A 2022 study found that Epic and Oracle Health controlled more than seventy percent of the hospital market. Therefore, some policymakers believe reducing regulatory barriers could encourage new companies and products to enter the space.
Yet many doctors, hospitals, and patient safety advocates remain deeply uneasy.
Jennifer Holloman of the American Hospital Association warned that hospitals have long struggled with the “black box nature of certain AI tools and how the algorithms are developed.” She added that transparency becomes even more important as AI systems spread throughout healthcare.
The concern is not theoretical. Even seemingly simple electronic health record tasks can create dangerous confusion. Ratwani gave the example of ordering Tylenol through some hospital systems.
“The physician is trying to order Tylenol, and the medication list can be so confusing that there's 30 different versions of Tylenol all at a different dose and for different purposes,” he explained.
A recent Veterans Health Administration study added more fuel to the debate. Researchers evaluated eleven AI scribes and found that the software performed worse than humans across multiple test scenarios. The study concluded, "Although ambient AI scribes can generate complete notes, the overall quality remains broadly below that of human-authored documentation."
Researchers were especially concerned about missing information, because omissions in patient records could directly affect follow-up care and treatment decisions.
For Boyer, the biggest fear is not just inaccurate software. He worries hospital management could eventually use AI-generated productivity gains as justification to schedule more patients, even while doctors spend additional time correcting computer-generated errors.
Kaiser spokesperson Vincent Staupe noted that clinicians are not required to use AI systems.
Still, Boyer says the technology has not reduced his workload at all.
“When I am correcting that note, I feel like this is too much work,” Boyer said. “This is definitely making this worse, and this is taking up time that I need to not be spending on correcting an AI tool.”