
A woman from Perth recently turned to social media for opinions after an unusual experience at her doctor's appointment. She was taken aback to discover that her physician was relying on ChatGPT, the well-known AI chatbot created by OpenAI, during the consultation. Posting on Reddit, she wrote that she had gone to the clinic to pick up test results and was appalled to see the doctor copying and pasting her data into the AI software.
"I never felt like I was sitting in front of an incompetent physician until now," she wrote in her post. "It feels like pure laziness, ignorance, and completely unreliable medical advice."
According to the woman, the doctor entered her blood test results and age into ChatGPT to receive recommendations on further steps. While she acknowledged that doctors occasionally refer to Google for medical information, she felt this situation had gone too far.
Her concerns sparked a broader debate—should AI play a role in medical consultations? The woman has since filed a formal complaint with the Australian Health Practitioner Regulation Agency (AHPRA) and intends to bring the matter to the Health and Disability Services Complaints Office as well.
The Future of Medicine with AI:
The experience raised concerns about the use of AI tools in medical settings. Most commenters on her Reddit post shared her concern, doubting the validity of an AI-assisted diagnosis.
Australian general practitioner Dr. Sam Hay joined the conversation, saying that AI does have a role in medicine, but only when used properly.
"Does AI belong in medicine? I believe it does—when used properly!" he told Kidspot.
"Doctors frequently use web-based search tools to quickly access medical data and confirm facts. I do this daily, often right in front of patients. However, the key is knowing how to evaluate and trust the information provided."
Dr. Hay stressed the value of trusting one's doctor. "For me, it's essential to rely on credible sources. That means weeding out untrustworthy, ad-laden websites. Not all doctors do this, which is why patients must research and develop relationships with healthcare providers they trust."
When it comes to tools such as ChatGPT, Dr. Hay feels that they can help doctors but must not replace clinical judgment. "These AI tools can help ensure we don't miss anything, but they should not be an independent diagnostic system. A doctor's skill is in interpreting the information given by AI in the context of the patient's condition."
He added that medical professionals should verify that any AI-generated information is backed by up-to-date research, credible medical sources, and established industry guidelines. "Nothing can replace a thorough clinical assessment. The real skill of a doctor is in understanding and applying the information AI provides in a meaningful way for each individual patient."
References:
1. New York Post. 2025. "My Doctor Used ChatGPT in Front of Me." New York Post, April 1, 2025. https://nypost.com/2025/04/01/world-news/my-doctor-used-chatgpt-in-front-of-me/.