AI in Healthcare: From Hype to Helper – A Perspective for Doctors and Dentists

16/07/2025

As doctors and dentists, we’ve all heard the buzz around artificial intelligence (AI) in healthcare. From promises of revolutionizing diagnostics to claims of machines outsmarting human clinicians, the narrative around AI has often been exaggerated. However, recent research, like Apple’s “The Illusion of Thinking” study, is shifting our understanding of AI from an all-powerful entity to a practical tool that simplifies our work while reinforcing the unmatched value of human intelligence. Let’s unpack this shift in perspective and explore why our clinical expertise remains irreplaceable.

AI: Not a Mastermind, but a Helpful Assistant

The Apple study, published in June 2025, tested advanced AI models, known as Large Reasoning Models (LRMs), on puzzles like the Tower of Hanoi. These models, designed to “think” step-by-step, performed well on simpler tasks but struggled or gave up entirely when faced with complex problems. This suggests that AI’s “reasoning” is more about pattern-matching than true understanding. For healthcare professionals, this is a crucial reminder: AI is a tool, not a replacement for our critical thinking.
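To get a feel for why "complexity" trips these models up, consider the Tower of Hanoi itself: the recursive rule never changes, yet the number of moves required doubles with every added disc (2^n - 1 moves for n discs). The short Python sketch below is only an illustration of that growth, not code from the Apple study.

```python
def hanoi(n, source, target, spare, moves):
    """Classic recursive Tower of Hanoi: move n discs from source to target."""
    if n == 0:
        return
    hanoi(n - 1, source, spare, target, moves)  # park the n-1 smaller discs on the spare peg
    moves.append((source, target))              # move the largest remaining disc
    hanoi(n - 1, spare, target, source, moves)  # stack the smaller discs back on top

for n in (3, 7, 10, 15):
    moves = []
    hanoi(n, "A", "C", "B", moves)
    print(f"{n} discs -> {len(moves)} moves (2**{n} - 1 = {2**n - 1})")
```

A puzzle that takes 7 moves with 3 discs takes 32,767 moves with 15; it is this kind of exponential growth, rather than any change in the underlying rule, that the study found overwhelming the models.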

Think of AI as a highly efficient assistant. It can process vast amounts of data quickly, spot patterns, and handle repetitive tasks, but it lacks the nuanced judgement we bring to the clinic. For example, in dentistry, AI tools like those used for radiographic analysis can flag potential caries or periodontal issues by comparing images to thousands of others. But only a dentist can interpret the patient’s history, clinical presentation, and lifestyle factors to decide whether that shadow on an X-ray is a cavity or an artefact. Similarly, in medicine, AI might analyse ECGs to detect arrhythmias, but a cardiologist’s expertise is needed to weigh the patient’s symptoms, risk factors, and context to recommend treatment.
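For readers curious what such a flagging tool looks like under the bonnet, here is a deliberately simplified Python sketch. Everything in it is hypothetical: the caries_model function is a stand-in for a trained neural network, and the threshold is invented for illustration; it does not represent the actual software of any vendor. The point is structural: the tool returns a probability and a flag, and nothing more.

```python
import numpy as np

# Hypothetical stand-in for a trained caries-detection model. A real tool would
# load a neural network trained on thousands of labelled radiographs; here a
# dummy function returns a made-up score purely so the sketch runs end to end.
def caries_model(image: np.ndarray) -> float:
    return float(np.clip(image.mean() / 255.0, 0.0, 1.0))  # placeholder score

def flag_radiograph(image: np.ndarray, threshold: float = 0.6) -> dict:
    """Return the model's score and a flag; the diagnosis stays with the dentist."""
    score = caries_model(image)
    return {
        "caries_probability": round(score, 2),
        "flag_for_review": score >= threshold,  # a prompt to look closer, not a diagnosis
    }

# A fake 64x64 grayscale radiograph patch, purely for demonstration.
patch = np.random.default_rng(0).integers(0, 256, size=(64, 64))
print(flag_radiograph(patch))
```

Interpreting that number against the patient's history, symptoms, and the image itself remains a clinical act.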

Why Human Intelligence Reigns Supreme

The Apple study highlights three regimes of AI performance: low-complexity tasks, where standard models without explicit step-by-step "reasoning" often match or even outperform the reasoning models; medium-complexity tasks, where the reasoning models show a real advantage; and high-complexity tasks, where both collapse entirely. In healthcare, we deal with high-complexity scenarios daily—situations where data is incomplete, patients are unique, and ethical considerations are paramount. This is where human intelligence shines.

  1. Contextual Understanding: AI can process data, but it doesn’t “get” the patient. For instance, a surgeon preparing for a complex procedure, like a cholecystectomy, relies on years of training to interpret subtle tactile feedback during surgery, something AI cannot replicate. A dentist adjusting a treatment plan for a patient with dental anxiety needs empathy and communication skills to build trust—qualities no algorithm can mimic.

  2. Ethical Decision-Making: The study showed AI struggles with tasks requiring precise, algorithmic execution, like solving complex puzzles. In healthcare, decisions often involve ethical dilemmas—like balancing a patient’s quality of life against aggressive treatment. A doctor might decide against a risky procedure for an elderly patient with comorbidities, a choice rooted in human values that AI cannot weigh.

  3. Adaptability to Novelty: The research found AI fails when problems deviate from its training data. In contrast, human clinicians excel at adapting to new situations. For example, during the COVID-19 pandemic, doctors and dentists rapidly adjusted protocols based on emerging evidence, something AI would struggle to do without extensive retraining.

Real-World Examples in Healthcare

Let’s look at how AI is proving to be a helpful tool rather than a genius overlord:

  • Dentistry: AI-powered imaging tools, like those from Pearl or Denti.AI, can analyse dental X-rays to highlight potential issues, saving time during screenings. However, a dentist’s expertise is critical to confirm diagnoses and plan treatments, especially for complex cases like impacted third molars or TMJ disorders. I recall a case where a patient’s unusual radiographic findings required a multidisciplinary discussion to avoid unnecessary intervention—something AI couldn’t facilitate.

  • Medicine: AI systems like IBM Watson have been used to suggest oncology treatment plans by analyzing patient data against vast medical literature. Yet, oncologists often override AI recommendations because they consider patient preferences, family dynamics, and practical constraints like access to care. A colleague in general surgery once shared how AI flagged a potential malignancy in a scan, but their clinical intuition, honed through years of practice, led to a correct diagnosis of a benign lesion after further investigation.

  • Surgical Planning: Tools like Surgical Theater use AI to create 3D models for preoperative planning, helping surgeons visualize complex anatomy. But the surgeon’s hands-on experience and intraoperative decision-making—say, during a delicate neurosurgery—determine the outcome, not the AI model.

Why This Matters to Us

We’ve been trained to blend scientific rigour with compassionate care. The Apple study reminds us that while AI can streamline workflows—think automated charting or predictive analytics for patient triage—it cannot replicate the holistic approach we bring to patient care. AI’s limitations in handling complexity mirror its role in healthcare: it’s a force multiplier, not a substitute.

For instance, in a busy GP practice, AI can flag patients at risk of diabetes based on EHR data, allowing doctors to prioritize follow-ups. In dentistry, AI-driven scheduling tools can optimize clinic flow, giving us more time to focus on patient interaction. These tools make our lives easier, but they don’t replace the diagnostic acumen or surgical precision we’ve developed through years of training and practice.
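To make the "force multiplier" idea concrete, below is a minimal, purely illustrative Python sketch of the diabetes-flagging step. The feature names, weights, and threshold are invented for demonstration and do not form a validated clinical risk model; real EHR tools use coefficients derived from large, externally validated cohorts and sit within proper clinical governance.

```python
import math

# Purely illustrative weights -- NOT a validated clinical risk model.
WEIGHTS = {
    "age_over_45": 0.8,
    "bmi_over_30": 0.9,
    "family_history": 0.7,
    "hba1c_borderline": 1.6,
}
BIAS = -3.0

def diabetes_risk_score(patient: dict) -> float:
    """Logistic score in [0, 1] built from a few hypothetical EHR-derived flags."""
    z = BIAS + sum(WEIGHTS[k] * float(patient.get(k, 0)) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def prioritize_follow_ups(patients, threshold=0.3):
    """Return patients at or above the threshold, highest score first, for review."""
    scored = [{**p, "risk": round(diabetes_risk_score(p), 2)} for p in patients]
    return sorted(
        (p for p in scored if p["risk"] >= threshold),
        key=lambda p: p["risk"],
        reverse=True,
    )

panel = [
    {"id": "P1", "age_over_45": 1, "bmi_over_30": 1, "hba1c_borderline": 1},
    {"id": "P2", "family_history": 1},
    {"id": "P3", "age_over_45": 1, "bmi_over_30": 1, "family_history": 1, "hba1c_borderline": 1},
]
print(prioritize_follow_ups(panel))
```

The output is simply a prioritised worklist; deciding who needs an HbA1c recheck, a conversation about lifestyle, or reassurance still rests with the doctor.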

The Road Ahead

The hype around AI as a “thinking” entity is fading, thanks to studies like Apple’s. As healthcare professionals, we should embrace AI as a tool to enhance efficiency while staying grounded in our unique ability to reason, empathize, and adapt. The future of healthcare lies in hybrid intelligence—where AI handles data-heavy tasks, and we focus on the human elements of care.

So, dear colleagues, let’s leverage AI to reduce administrative burdens, improve diagnostic accuracy, and free up time for what we do best: caring for our patients. We know that true healing comes from the heart and mind, not just an algorithm.

Reference: https://machinelearning.apple.com/research/illusion-of-thinking

Author: Dr. Syed Nabeel, BDS, D.Orth, MFD RCS (Ireland), MFDS RCPS (Glasgow) is a clinician-scholar whose professional trajectory spans over a quarter century at the intersection of orthodontics, neuromuscular dentistry, and digitally driven diagnostics. As the Clinical Director of Smile Maker Clinics Pvt Ltd, he has articulated a refined philosophy of care that integrates anatomical exactitude with contemporary digital modalities, particularly in the nuanced management of temporomandibular disorders, esthetic smile reconstruction, and algorithm-guided orthodontic therapy. Grounded in the principles of occlusal neurophysiology, his approach is further distinguished by an enduring commitment to AI-enhanced clinical workflows and predictive modeling in complex craniofacial therapeutics. In 2004, Dr. Nabeel established DentistryUnited.com, a visionary digital platform designed to transcend clinical silos and foster transnational dialogue within the dental fraternity. This academic impetus culminated in the founding of Dental Follicle – The E-Journal of Dentistry (ISSN 2230-9489), a peer-reviewed initiative dedicated to the dissemination of original scholarship and interdisciplinary engagement. A lifelong learner, educator, and mentor, he remains deeply invested in cultivating critical thought among emerging clinicians, with particular emphasis on orthodontic biomechanics and the integrative neurofunctional paradigms that underpin both form and function.