Taking a page from the private insurance industry's playbook, the Trump administration will launch a program next year to find out how much money an artificial intelligence algorithm could save the federal government by denying care to Medicare patients.
The pilot program, designed to weed out wasteful, "low-value" services, amounts to a federal expansion of an unpopular process called prior authorization, which requires patients or someone on their medical team to seek insurance approval before proceeding with certain procedures, tests, and prescriptions. It will affect Medicare patients, and the doctors and hospitals who care for them, in Arizona, Ohio, Oklahoma, New Jersey, Texas, and Washington, starting Jan. 1 and running through 2031.
The move has raised eyebrows among politicians and policy experts. The traditional version of Medicare, which covers adults 65 and older and some people with disabilities, has mostly eschewed prior authorization. Still, it is widely used by private insurers, especially in the Medicare Advantage market.
And the timing was surprising: The pilot was announced just days after the Trump administration unveiled a voluntary effort by private health insurers to revamp and reduce their own use of prior authorization, which causes care to be "significantly delayed," said Mehmet Oz, administrator of the Centers for Medicare & Medicaid Services.
"It erodes public trust in the health care system," Oz told the media. "It's something that we can't tolerate in this administration."
But some critics, like Vinay Rathi, an Ohio State University doctor and policy researcher, have accused the Trump administration of sending mixed messages.
On one hand, the federal government wants to borrow cost-cutting measures used by private insurance, he said. "On the other, it slaps them on the wrist."
Administration officials are "talking out of both sides of their mouth," said Rep. Suzan DelBene, a Washington Democrat. "It's hugely concerning."
Patients, doctors, and other lawmakers have also been critical of what they see as delay-or-deny tactics, which can slow down or block access to care, causing irreparable harm and even death.
"Insurance companies have put it in their mantra that they will take patients' money and then do their damnedest to deny giving it to the people who deliver care," said Rep. Greg Murphy, a North Carolina Republican and a urologist. "That goes on in every insurance company boardroom."
Insurers have long argued that prior authorization reduces fraud and wasteful spending, as well as prevents potential harm. Public displeasure with insurance denials dominated the news in December, when the shooting death of UnitedHealthcare's CEO led many to anoint his alleged killer as a folk hero.
And the public broadly dislikes the practice: Nearly three-quarters of respondents thought prior authorization was a "major" problem in a poll conducted by KFF, a health information nonprofit that includes KFF Health News.
Indeed, Oz said during his June press conference that "violence in the streets" prompted the Trump administration to take on the issue of prior authorization reform in the private insurance industry.
Still, the administration is expanding the use of prior authorization in Medicare. CMS spokesperson Alexx Pons said both initiatives "serve the same goal of protecting patients and Medicare dollars."
Unanswered Questions
The pilot, WISeR, short for "Wasteful and Inappropriate Service Reduction," will test the use of an AI algorithm in making prior authorization decisions for some Medicare services, including skin and tissue substitutes, electrical nerve stimulator implants, and knee arthroscopy.
The federal government says such procedures are particularly vulnerable to "fraud, waste, and abuse" and could be held in check by prior authorization.
Other procedures may be added to the list. But services that are inpatient-only, emergency, or "would pose a substantial risk to patients if significantly delayed" would not be subject to the AI model's assessment, according to the federal announcement.
While the use of AI in health insurance isn't new, Medicare has been slow to adopt the private sector's tools. Medicare has historically used prior authorization in a limited way, with contractors who aren't incentivized to deny services. But experts who have studied the plan believe the federal pilot could change that.
Pons told KFF Health News that no Medicare request will be denied before being reviewed by a "qualified human clinician," and that vendors "are prohibited from compensation arrangements tied to denial rates." While the government says vendors will be rewarded for savings, Pons said multiple safeguards will "remove any incentive to deny medically appropriate care."
"Shared savings arrangements mean that vendors financially benefit when less care is delivered," a structure that can create a powerful incentive for companies to deny medically necessary care, said Jennifer Brackeen, senior director of government affairs for the Washington State Hospital Association.
And doctors and policy experts say that's only one concern.
Rathi said the plan "is not fully fleshed out" and relies on "messy and subjective" measures. The model, he said, ultimately depends on contractors to assess their own results, a choice that makes the results potentially suspect.
"I'm not sure they know, even, how they're going to figure out whether this is helping or hurting patients," he said.
Pons said the use of AI in the Medicare pilot will be "subject to strict oversight to ensure transparency, accountability, and alignment with Medicare rules and patient protection."
"CMS remains committed to ensuring that automated tools support, not replace, clinically sound decision-making," he said.
Experts agree that AI is theoretically capable of expediting what has been a cumbersome process marked by delays and denials that can harm patients' health. Health insurers have argued that AI eliminates human error and bias and will save the health care system money. These companies have also insisted that humans, not computers, are ultimately reviewing coverage decisions.
But some scholars are doubtful that's routinely happening.
"I think that there's also probably a little bit of ambiguity over what constitutes 'meaningful human review,'" said Amy Killelea, an assistant research professor at the Center on Health Insurance Reforms at Georgetown University.
A 2023 ProPublica investigation found that, over a two-month period, doctors at Cigna who reviewed requests for payment spent an average of only 1.2 seconds on each case.
Cigna spokesperson Justine Sessions told KFF Health News that the company does not use AI to deny care or claims. The ProPublica investigation referenced a "simple software-driven process that helped accelerate payments to clinicians for common, relatively low-cost tests and treatments, and it is not powered by AI," Sessions said. "It was not used for prior authorizations."
And yet class-action lawsuits filed against major health insurers have alleged that flawed AI models undermine doctor recommendations and fail to take patients' unique needs into account, forcing some people to shoulder the financial burden of their care.
Meanwhile, a survey of physicians released by the American Medical Association in February found that 61% think AI is "increasing prior authorization denials, exacerbating avoidable patient harms and escalating unnecessary waste now and into the future."
Chris Bond, a spokesperson for the insurers' trade group AHIP, told KFF Health News that the organization is "zeroed in" on implementing the commitments made to the government. Those include reducing the scope of prior authorization and making sure that communications with patients about denials and appeals are easy to understand.
"This Is a Pilot"
The Medicare pilot program underscores ongoing concerns about prior authorization and raises new ones.
While private health insurers have been opaque about how they use AI and the extent to which they use prior authorization, policy researchers believe these algorithms are often programmed to automatically deny high-cost care.
"The more expensive it is, the more likely it is to be denied," said Jennifer Oliva, a professor at the Maurer School of Law at Indiana University-Bloomington, whose work focuses on AI regulation and health coverage.
Oliva explained in recent work that when a patient is expected to die within a few years, health insurers are "motivated to rely on the algorithm." As time passes and the patient or their provider is forced to appeal a denial, the chance of the patient dying during that process increases. The longer an appeal drags on, the less likely the health insurer is to pay the claim, Oliva said.
"The No. 1 thing to do is make it very, very difficult for people to get high-cost services," she said.
As the use of AI by health insurers is poised to grow, insurance company algorithms amount to a "regulatory blind spot" and demand more scrutiny, said Carmel Shachar, a faculty director at Harvard Law School's Center for Health Law and Policy Innovation.
The WISeR pilot is "an interesting step" toward using AI to ensure that Medicare dollars are purchasing high-quality health care, she said. But the lack of details makes it difficult to determine whether it will work.
Politicians are grappling with some of the same questions.
"How is this being tested in the first place? How are you going to make sure that it is working and not denying care or producing higher rates of care denial?" asked DelBene, who wrote to Oz with other Democrats demanding answers about the AI program. But Democrats aren't the only ones worried.
Murphy, who co-chairs the House GOP Doctors Caucus, acknowledged that many physicians are concerned the WISeR pilot could overreach into their practice of medicine if the AI algorithm denies doctor-recommended care.
Meanwhile, House members of both parties recently supported an amendment, offered by a Florida Democrat, to block funding for the pilot in the fiscal 2026 budget of the Department of Health and Human Services.
AI in health care is here to stay, Murphy said, but it remains to be seen whether the WISeR pilot will save Medicare money or contribute to the problems already posed by prior authorization.
"This is a pilot, and I'm open to see what's going to happen with this," Murphy said, "but I will always, always err on the side that doctors know what's best for their patients."