Utah Allows AI Chatbot to Prescribe Psychiatric Drug Refills, Sparking Safety Debate

First-of-its-kind US pilot lets AI renew mental health prescriptions under strict safeguards.
An illustration of a therapist replaced by an AI laptop, with a client seated across from it. (AI-generated image)

USA: Utah has launched a first-of-its-kind pilot program in the United States allowing an artificial intelligence chatbot to renew prescriptions for psychiatric medications without direct physician approval, a move that could reshape how routine mental healthcare is delivered.

The one-year pilot, announced in April 2026, authorizes a chatbot developed by San Francisco-based startup Legion Health to handle prescription refills for a limited set of mental health drugs under state-approved clinical protocols. The system is designed to function as a physician extender rather than a replacement, operating within strict eligibility and safety boundaries.

State officials say the initiative aims to address a growing mental health crisis, with an estimated 500,000 Utah residents lacking adequate access to care, while also reducing costs and easing pressure on clinicians.

How the AI prescription system works

The chatbot allows patients to request medication refills through a structured digital screening process. To qualify, patients must already have an existing prescription and a stable diagnosis.

Patients are required to:

  • Verify their identity and prescription details, for example by uploading a pill bottle label

  • Answer questions about symptoms, medication effectiveness, and side effects

  • Report any warning signs such as suicidal thoughts or severe adverse reactions

The system approves refills only for patients who meet low-risk criteria and demonstrate clinical stability, meaning no recent medication changes and no psychiatric hospitalization within the past year.

If the chatbot detects anything outside predefined safe parameters, it automatically escalates the case to a human clinician for review.
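The gating logic described above can be sketched in code. The following is a hypothetical illustration only: the field names, the three-month stability window, and the triage function are assumptions for the sake of example, not Legion Health's actual implementation. Only the "no hospitalization within the past year" rule and the escalation-on-warning-signs behavior come from the article itself.

```python
from dataclasses import dataclass

@dataclass
class RefillRequest:
    """Illustrative screening answers a patient might submit."""
    has_existing_prescription: bool
    months_since_last_med_change: int
    months_since_last_hospitalization: float  # float("inf") if never hospitalized
    reports_suicidal_thoughts: bool
    reports_severe_side_effects: bool

def triage(request: RefillRequest) -> str:
    """Approve only when every low-risk gate passes; otherwise
    escalate the case to a human clinician for review."""
    if not request.has_existing_prescription:
        return "escalate"
    # Warning signs always route to a clinician, never to auto-approval.
    if request.reports_suicidal_thoughts or request.reports_severe_side_effects:
        return "escalate"
    # Assumed stability window: no medication changes in recent months.
    if request.months_since_last_med_change < 3:
        return "escalate"
    # Per the pilot's stated rule: no psychiatric hospitalization within a year.
    if request.months_since_last_hospitalization < 12:
        return "escalate"
    return "approve"
```

The key design property is that approval is the fall-through case: any single failed gate ends the check early and hands the case to a human, which matches the conservative, escalate-by-default posture the pilot describes.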

The AI is limited to renewing prescriptions for 15 low-risk maintenance medications, including:

  • Fluoxetine

  • Sertraline

  • Bupropion

  • Mirtazapine

  • Hydroxyzine

The system cannot issue new prescriptions and excludes higher-risk drugs such as controlled substances, benzodiazepines, antipsychotics, lithium, and medications that require close monitoring.

Patients must also check in with a healthcare provider after every 10 refills or at least once every six months to maintain eligibility.

White pills spilling from an orange container. (Towfiqu barbhuiya/Pexels)

Cost, rollout, and oversight

Legion Health is offering the service through a $19 per month subscription, with a waitlist open as the rollout begins.

The pilot includes multiple layers of oversight to ensure safety:

  • The first 1,250 cases are reviewed by human physicians

  • Between 5 percent and 10 percent of ongoing cases are audited

  • Monthly reports are submitted to regulators

  • Pharmacists and patients can request human review at any stage

Before the AI can operate with reduced supervision, it must meet a 98 percent approval accuracy benchmark, a safeguard designed to ensure reliability in early use.

Officials defend move as solution to access crisis

State officials and company leaders argue that automating routine prescription renewals will improve access to care and allow clinicians to focus on more complex cases.

"By safely automating the renewal process for maintenance medications, we are allowing patients to get the care they need much more quickly and affordably," officials said.

Legion Health CEO Yash Patel told The Verge that the initiative is a major step forward, calling it "the beginning of something much bigger than refills."

According to The Verge, the company's president, Arthur MacWaters, said the system includes strong safeguards, noting that risks exist in all care models and that the platform uses "conservative eligibility gates" along with escalation pathways to clinicians when needed.

Psychiatrists raise concerns over safety and clinical judgment

Mental health experts have expressed concern about the risks of relying on AI for prescribing decisions, even in limited scenarios.

In an interview with The Verge, Brent Kious, a psychiatrist at the University of Utah School of Medicine, said the benefits may be overstated and called for stronger validation.

"It would be better if there were greater transparency, more science, and more rigorous testing before people are asked to use this."

Kious also warned that increased automation could contribute to an "epidemic of over-treatment" in psychiatry.

John Torous, director of digital psychiatry at Beth Israel Deaconess Medical Center, questioned whether AI systems can handle the complexity of psychiatric medications. "They require more active management, changes, and careful consideration," he said.

He added that prescribing involves more than symptom checklists and drug interactions, raising concerns about whether AI can replicate clinical nuance.

"I would personally avoid it for now."

(Rh/ARC)


Medbound Times
www.medboundtimes.com