Medical Education in 2025: AI's Double-Edged Sword with Dr. Ben Rosner
Dr. Ben Rosner, a practicing clinician and digital health thought leader, discusses the evolving role of generative AI in medical education. As the faculty lead for AI innovations at UCSF School of Medicine, he outlines real-world pilot programs that use AI to automate administrative tasks, enhance diagnostic training, and even generate personalized tutor bots. He also raises critical concerns about "de-skilling," a phenomenon in which overreliance on AI erodes core clinical competencies. Drawing parallels from aviation and colonoscopy procedures, Dr. Rosner explains why educators must tread carefully. This insightful clip explores the challenges of integrating AI into medical education and raises pressing questions about how AI can enhance learning without undermining human expertise.
Episode Contents:
About the Guest
Dr. Ben Rosner is a practicing clinician and digital health researcher. He serves as the faculty lead for AI innovations in medical education at UCSF. Connect on LinkedIn: https://www.linkedin.com/in/benrosnermdphd/
Key Takeaways
- AI tools can automate low-stakes admin tasks in medical training
- De-skilling is a real risk as AI becomes embedded in clinical learning
- Institutions must define when and how trainees use AI tools
Transcript Summary
Can AI be trusted in medical education?
Dr. Rosner explains that while AI is already being adopted by students, educators must set boundaries to prevent overreliance. He discusses low-risk applications like case logging and high-risk concerns like clinical reasoning analysis.
What is de-skilling, and how does it relate to AI in medicine?
De-skilling occurs when clinicians lose core skills through overreliance on AI tools. He cites a colonoscopy study in which physicians who had grown accustomed to AI assistance became less accurate once it was removed, highlighting the risk this poses in training environments.
Can AI support adherence to clinical best practices?
Yes. Dr. Rosner discusses how AI could help clinicians keep up with the growing volume of medical guidelines—especially in smaller health systems lacking support staff.
Is there a cultural barrier to AI-based feedback in healthcare?
Yes. He explains that medicine often lacks the review-and-learn culture found in sports and aviation. Changing this mindset is key to achieving diagnostic excellence.
More Topics
- AI in the Healthcare Industry
- AI and Medical Innovation
- Healthcare Ethics and Policy
- AI in Patient Care
About the Series
AI and Healthcare—with Mika Newton and Dr. Sanjay Juneja is an engaging interview series featuring world-renowned leaders shaping the intersection of artificial intelligence and medicine.
Dr. Sanjay Juneja, a hematologist and medical oncologist widely recognized as “TheOncDoc,” is a trailblazer in healthcare innovation and a rising authority on the transformative role of AI in medicine.
Mika Newton is an expert in healthcare data management, with a focus on data completeness and universality. Mika is on the editorial board of AI in Precision Oncology and is no stranger to bringing transformative technologies to market and fostering innovation.