Clinical AI Governance: What Clinicians Must Know in 2026
With Dr. Amar Rewari and Dr. Lindsey Cotton

As artificial intelligence becomes embedded across clinical workflows, healthcare leaders face a defining challenge: how to govern AI safely, responsibly, and at scale. In this full interview, Dr. Amar Rewari and Dr. Lindsey Cotton unpack what clinicians must understand about AI governance as healthcare enters 2026.

The conversation explores how FDA oversight, CLIA regulation, and real-world evidence intersect with modern AI tools, from decision support systems to emerging autonomous workflows. Drawing on experience in oncology, diagnostics, and health policy, the guests explain why FDA approval alone is insufficient, why institutional accountability matters, and how AI model drift creates silent clinical risk. They also examine implementation science, clinician trust, workforce burnout, and why governance frameworks must evolve alongside reimbursement incentives.

For clinicians and health system leaders, this discussion offers a pragmatic roadmap for evaluating AI tools, balancing innovation with safety, and preparing organizations for the next phase of AI adoption in patient care.

About the Guests

Dr. Amar Rewari is a nationally recognized physician-executive, Chief of Radiation Oncology at Luminis Health, and adjunct assistant professor at the Johns Hopkins School of Medicine. He holds an MD and MBA from Yale University and previously worked in investment banking at Credit Suisse. https://www.linkedin.com/in/amar-rewari-md-mba/

Dr. Lindsey Cotton is a clinician, scientist, and healthcare leader with over 20 years of experience, specializing in early cancer detection through liquid biopsy biomarkers. She has held leadership roles at Grail, Guardant Health, and Delfi Diagnostics. https://www.linkedin.com/in/drlindseycotton/

Notable Quote

“FDA approval is only the starting point. The real responsibility begins with how AI is implemented and monitored in the clinic.”

Key Takeaways

  • FDA approval does not eliminate institutional responsibility for AI safety and monitoring.

  • AI model drift poses silent clinical risk and requires ongoing governance and validation.

  • Successful AI adoption depends more on implementation and incentives than technology alone.

Transcript Summary

How has the FDA historically regulated AI and machine learning in healthcare?

Lindsey Cotton: The FDA has evaluated machine learning for years, long before “AI” became a buzzword. Many tools fall under CLIA-regulated laboratory developed tests, which are not unregulated but governed differently to support innovation and access.

Does lack of FDA approval mean an AI tool is unsafe?

Amar Rewari: Not necessarily. When tools are not FDA-approved, responsibility shifts to institutions. Validation, monitoring, and clinical oversight become critical to safe implementation.

How should clinicians think about different levels of AI risk?

Amar Rewari: AI falls into tiers: assistive tools, decision-influencing tools, and autonomous systems. Each tier demands a higher regulatory bar and stronger post-implementation monitoring.

Why is AI model drift such a concern in clinical care?

Lindsey Cotton: The greatest danger is AI that works well initially, gains trust, and then quietly degrades. Small performance drops can lead to significant missed diagnoses at scale.

What role does real-world evidence play in AI governance?

All Guests: Real-world evidence lowers barriers to innovation but requires investment in monitoring, auditing, and data governance to prevent bias and unintended harm.

What will matter most for AI in healthcare by 2026?

Consensus: Governance frameworks, clinician trust, workforce impact, and continuous validation will matter more than novel algorithms alone.

About the Series

Leading oncology AI thought leaders Drs. Sanjay Juneja, Debra Patt, and Doug Flora bring you conversations at the intersection of medicine, data, and innovation. Each episode explores both the big picture and the breaking news in artificial intelligence and healthcare—examining how today’s technology is reshaping the practice and business of oncology.

From industry disruptors to clinical pioneers, guests share insights that bridge the gap between algorithms and the art of patient care.

To learn more about the mission and upcoming initiatives, visit tensorblack.ai and subscribe to stay ahead of the curve.
