Dr. Munawar Aziz MCPS*
The term “AI hallucination” refers to incorrect, false, and misleading information generated by modern search engines, which are very popular among the general public, especially in Pakistan and indeed throughout the world. A number of young adults come to me after “diagnosing” themselves with colon cancer, which often turns out to be a simple case of constipation. Hard stool causing minor bleeding alarms them, and the “one touch away” diagnosis obtained by typing their complaints into a search engine frightens them even more.
Others purchase expensive online supplements, often containing steroids, to build muscle mass, instead of obtaining natural protein from eggs, meat, beef, fish, lentils, and other everyday foods.
A recent article in The Economic Times (epaper, dated August 18, 2025) reported the case of a 60-year-old man who wanted to replace sodium chloride (common table salt) in his diet because of a health concern. Instead of consulting a healthcare professional, he turned to a search engine, and the AI’s suggestion was to replace it with sodium bromide. After taking sodium bromide daily for six months, he began developing symptoms of paranoid psychosis, accusing his neighbor of poisoning him, along with episodes of dizziness, hallucinations, suspicion, and depression.

He eventually landed in the emergency department of a nearby hospital, where he remained undiagnosed for quite some time. Only after a detailed history was taken was he recognized as a case of bromide poisoning, a very rare condition today. Bromide salts were prescribed in the nineteenth and early twentieth centuries for headaches and anxiety, but they were later phased out in ingestible form because of poisoning cases that caused dermatitis and psychiatric disorders.
“AI hallucinations” occur when search results or AI-generated answers sound very authentic and reliable but are actually false. This happens because large language models (LLMs) do not “know” facts in the way humans do; they generate text by predicting the most likely sequence of words based on patterns in their training data. If a question requires information that is unavailable or uncertain, the model may “fill in the gaps” with invented details: for example, claiming that a scientific study exists when it does not, or producing references to books or articles that sound real but are fabricated.
There is no harm in experimenting with the latest technologies, but for health issues it is always advisable, and rewarding, to consult a qualified healthcare professional. There is no substitute for health. Qualified doctors, who spend their lives among all sorts of patients, cannot be replaced by any AI model. In the end, human touch and interaction remain the most important part of the doctor-patient relationship.
- Dr. Munawar Aziz
Abbottabad, Pakistan.
aziz.munawar@gmail.com
YouTube has made everyone a specialist.
All information provided by AI should be verified.
The dangers of AI and the internet are always present, along with the limited information and knowledge generated by artificial mechanisms. Evidently, at this point of technological development, these tools are unable to truly ‘know’, analyse, and interpret sophisticated information the way a trained medical doctor or other scientist can. AI is like the “neem hakeem, khatra-e-jaan” (a half-trained healer is a danger to life) mentioned by Shaikh Saadi. The public should beware.
Old habits die hard. We in India and Pakistan are addicted to suggesting remedies for any disease on earth, even if a qualified doctor is present in the gathering. Social media has made matters worse, as many quacks posing as doctors advise different herbs and foods for ailments on Facebook and YouTube; I call them Facebook doctors. Making matters worse still, search engines have now started diagnosing and suggesting medicines for all types of diseases, ranging from stomach pain to injuries and even cancer.
Forewarned is forearmed. You have done your bit. Now let us see whether this has any impact on our old habit of acting as specialists on health issues.
This article is an early warning to members of the public who approach the internet without using it judiciously. They end up with bigger problems than the original one, which needed the attention of a qualified health provider.
Yes! These modern methods of communication and advice may sound quite convincing, but they can certainly be misleading and misguiding! On issues other than health, the advice may not tally with one’s cultural and religious beliefs!
For medical advice, nothing can equal direct, face-to-face communication and management.
Very apt. Only the wisdom, experience, and clinical skills of a doctor, with Allah’s help, can help people with symptoms. Doctors who use AI search engines with this clinical background are more helpful; otherwise, people are only further misguided.