How to Diagnose Yourself with ChatGPT and Why You Shouldn’t

Shortly after becoming attendings, my colleagues started to post memes and merch with the headline, “Please Don’t Confuse Your Google Search With My Medical Degree.” Google is an invaluable resource for information, but it was also perceived by many medical providers as a threat. That was about five years ago, so you can only imagine how frustrated some medical professionals must be with ChatGPT. ChatGPT is an artificial intelligence (AI) chatbot developed by OpenAI that uses large language models to communicate with users. Essentially, if you ask ChatGPT a question, it will give you an answer in a manner that mimics human interaction. You can now get direct answers to your questions without the fluff of search results.

The dawn of authentic communication with AI is upon us. This sounds exciting, but it’s also concerning to many because they fear being replaced. As this technology advances, is it possible for AI to be a replacement for someone’s medical degree? 

In this blog, I’ll show how ChatGPT stacks up when it comes to making a diagnosis. Many people turn to whatever resource they can find to figure out what’s going on with them, and ChatGPT is perfectly suited to helping people self-diagnose. However, I would argue that making a diagnosis is still best left to health professionals. Below, I will use depression as an example of how ChatGPT approaches giving a user a possible diagnosis. Along the way, I will also highlight the shortfalls of using AI in its current state to self-diagnose.

I first asked ChatGPT, “Do I have depression?” It did not have any prior information. This is good to note because as you give the AI more information, it will be able to give you better and more precise answers. Nevertheless, at face value, ChatGPT initially defers making diagnoses to health professionals. This remained consistent throughout the experiment, but you’ll see that it is possible to get ChatGPT to give its opinion.

I asked ChatGPT, “Do I have depression?” again. This time, it was more specific. It also mentioned that a diagnosis from a healthcare professional would likely be accurate and help with starting treatment.

Interestingly, ChatGPT had given me more information the day before. It encouraged me to see a professional, but it also listed some symptoms characteristic of depression.

I wasn’t satisfied. I wanted ChatGPT to tell me what it thought was happening. ChatGPT can change its responses based on information the user has previously provided. To exploit this, I told ChatGPT that I had every symptom it had listed for depression. Each time, it gave me responses like the ones above. Once it knew I had symptoms, it acknowledged that I may have depression. It also doubled down on the need to seek professional help, listing some benefits of seeing a health professional and giving me ways to find help.

At this point, ChatGPT knew that I wanted to know if I had depression. It also knew that I had all the symptoms. With this information, it was finally able to say that I may have depression.

Key Points/Summary of the Experiment

Like a normal conversation, ChatGPT changes its responses based on the information it receives. It is cautious about giving a diagnosis and defers to professionals. As it gets more information, it becomes more proactive about telling the user how important it is to get help, providing encouragement and resources. Although it never explicitly tells the user that it has a diagnosis, I think ChatGPT’s educated guess beats any web search in efficiency and, in some cases, accuracy. This experiment was done with depression, but the approach can be extrapolated to any health condition.

Why you shouldn’t rely on ChatGPT to make your diagnosis

Above, I showed how ChatGPT could arrive at a diagnosis, but I didn’t say it would be the correct one. Below are reasons why you shouldn’t rely on AI to find your diagnosis.

ChatGPT is limited by the information it receives from the user

ChatGPT relies on self-report. When it comes to finding a diagnosis, the symptoms you don’t have may be just as important as the ones you do, and past symptoms may matter as well. Absent symptoms and past symptoms are less likely to be self-reported. We are also less likely to report symptoms we lack insight into; most people can’t identify that they are delusional. Many people also have trouble acknowledging certain diagnoses due to cultural, social, or personal stigma and constraints. Providers are trained to dig deeper and gather collateral (outside information) to fill in what’s missing. Moreover, a provider can see the patient and notice when the reported symptoms don’t match what they are observing.

ChatGPT doesn’t go into the differential diagnoses

Forming a differential diagnosis is the process of distinguishing between two or more conditions that share similar signs or symptoms. Some illnesses have nearly identical symptoms, with only one or two differences. Trained providers know this and will dig deeper to find the right diagnosis: they will hear your symptoms and consider all the diagnoses that could explain them.

ChatGPT is not HIPAA compliant

This is important to remember if you are a provider using the platform for consults. It’s not entirely clear how OpenAI sources its information, but it’s likely that our interactions with the platform are used to train the model. You wouldn’t want your information, or your patient’s information, in that dataset. So if you are entering symptoms, leave out identifying information like names, numbers, and birthdates.

ChatGPT does not give treatment

At the end of the day, we want a diagnosis in order to get treatment. ChatGPT cannot provide therapy or send prescriptions. Therefore, even if ChatGPT gave you the correct diagnosis, its ability to treat that diagnosis is limited. That’s not to say the platform won’t inform you of treatment options. I believe it’s far better to find a trained professional who can give you the right diagnosis and treat whatever is wrong.


Although I don’t recommend using ChatGPT to find your diagnosis, I think having more information is invaluable. Therefore, ChatGPT might be a helpful first step for gathering information to share with your provider. If you don’t want to sit around asking ChatGPT a bunch of questions to learn more about your mental health, I have created a course on Udemy that can help.

The “All Things Mental Health” course is designed to help learners know when and how to get mental health help for themselves and others. One in five adults in the United States will experience some form of mental illness. Moreover, more than half of adults with a mental illness didn’t receive mental health services in the previous year. This course aims to increase awareness about mental health conditions and provide students with the tools to lessen the burden of these common illnesses. The first half of the course outlines when to get help, with overviews of the most common mental health conditions: depression, anxiety, bipolar disorder, schizophrenia, substance use disorders, eating disorders, and personality disorders. The first half also covers suicide prevention and safety planning; a safety plan is an important tool to have handy in a time of crisis. The second half of the course outlines where to find help and what to expect in various treatment settings, including standard appointments, behavioral health areas of emergency departments, and psychiatric inpatient units. There is also information about when to start therapy or medications and what to do after those treatments have started. The ultimate goal of this course is for learners to be able to advocate for themselves and others when it comes to finding and receiving mental health treatment.

Please leave a comment about whether you would use AI or the internet to self-diagnose, along with any thoughts you had after reading this blog. As always, please like and subscribe so you don’t miss any content.
