Artificial intelligence (AI) is increasingly becoming part of healthcare, moving beyond simple support to a more active role in clinical work. The Lancet describes a new category of “agentic AI teammates” capable of reasoning, planning, and interacting with healthcare systems, rather than just performing automated tasks.[1] At the same time, a JAMA article warns that while AI can match human performance in specific diagnostic tasks, over-reliance might weaken essential medical skills.[2]
In India, AI is being used to assist with diagnoses, manage administrative tasks, and support remote patient monitoring — yet it is not replacing doctors.[3]
Historically, AI in medicine focused on narrow tasks like image analysis, triage, or report summarization. Today, agentic AI can carry out multi-step reasoning, retrieve data across platforms, and collaborate with healthcare professionals.[1]
However, AI’s performance still depends on well-labeled training data. For example, it excels at detecting specific pathological features, such as glomeruli in kidney biopsies, yet excessive use may reduce clinicians’ engagement with core diagnostic skills.[2]
In India, AI-assisted breast cancer screening improved detection rates: radiologists using AI identified 6.7 cancers per 1,000 screenings, compared to 5.7 without AI.[3]
AI is currently applied in several areas:
Diagnostics: AI analyzes imaging, lab results, and patient histories to spot patterns that may go unnoticed by humans.
Treatment support: AI combines patient data with guidelines to suggest diagnostic and treatment options.
Administration: Scheduling, documentation, and billing can be automated, allowing doctors to spend more time with patients.
Access to care: In regions with physician shortages, AI-based tools, such as mobile apps and remote monitoring, extend healthcare reach.
Despite progress, AI cannot replicate several essential aspects of healthcare:
Human interaction and empathy: AI cannot form the doctor–patient connection critical for care.
Complex decision-making: When data are incomplete or unusual, AI’s guidance may be limited.
Ethical reasoning and accountability: AI lacks moral judgment and awareness of social context.
System integration: Challenges such as algorithmic bias, data privacy, and interoperability remain.
For medical students and doctors, AI brings both opportunities and responsibilities:
Evolving roles: Clinicians may shift focus from repetitive tasks to interpreting AI outputs, ethical judgment, and patient communication.
Curriculum updates: Understanding AI, including how it works, its limitations, and its outputs, will become important in MBBS training.
Skill maintenance: JAMA emphasizes that excessive reliance on AI may reduce proficiency in basic clinical skills.[2]
Practical application: Even the most advanced AI tools require human oversight and alignment with real-world healthcare operations.
Threat: AI could reduce the need for humans in some data-intensive areas, like radiology or pathology. There is also concern that clinicians may become overly dependent on AI.
Tool: Evidence supports AI as a supportive tool that improves speed, accuracy, and access to care, particularly in low-resource settings.
Collaboration: The Lancet suggests that the future lies in partnership, where AI works alongside clinicians as “teammates” rather than replacements.[1]
Healthcare professionals on MedBound Hub have shared that AI can speed up diagnostics but cannot replace clinical judgment. Students and doctors alike are interested in learning how to use AI safely alongside traditional medical skills. These discussions emphasize that AI is a supportive tool, not a substitute for human care.
AI is indeed transforming the face of healthcare. From diagnosing diseases to personalizing treatment regimens to robotics-assisted surgeries, AI is revolutionizing healthcare. But we cannot blindly trust AI with our health, and it is never a replacement for doctors. However, it can definitely be used as an assistive tool for healthcare professionals to improve patient care and diagnosis. – Fathima, B Pharm graduate
I think AI in healthcare is definitely powerful, but it cannot fully replace doctors. Human touch, empathy, and clinical judgment are things AI can never provide. Instead, AI works best as a supportive tool – speeding up diagnosis, reducing errors, and helping in areas with fewer doctors. In the end, it’s not ‘AI vs doctors’ but ‘AI with doctors’ that will shape the future of medicine. – Likhitha Reddy, B Pharm
While AI is undoubtedly transforming healthcare, it can’t replace doctors. Medicine needs empathy, judgment, and human connection, which cannot be replaced by any algorithm. It might tell which drug to give for which disease, but it can never replicate the psychology of a clinician’s touch. Therefore, AI should be seen as an assistant that enhances accuracy and efficiency, not a threat to medical careers. – Dr. Anshul Thakur, MBBS
In the short term, AI will continue supporting diagnostics and workflow, always under human supervision. Over time, AI may handle preliminary diagnosis or triage more independently, but accountability will remain with clinicians. The human role will increasingly focus on empathy, ethical reasoning, and holistic patient care — skills that AI cannot replicate.
AI in medicine is transforming practice rather than eliminating the need for doctors. While it can process data faster and support decision-making, the art of medicine — listening, reasoning, and empathizing — remains human. MBBS students and clinicians who integrate AI knowledge with strong clinical skills will remain central to patient care.
Zou J, Topol EJ. The rise of agentic AI teammates in medicine. Lancet. 2025 Feb 8;405(10477):457. doi: 10.1016/S0140-6736(25)00202-8. PMID: 39922663.
Fogo AB, Kronbichler A, Bajema IM. AI’s Threat to the Medical Profession. JAMA. 2024;331(6):471–472.
IndiaMed Today. “Rise of AI Doctors: How Close Are We to Replacing Doctors?” IndiaMed Today, 2025.
Edited by M Subha Maheswari