ChatGPT Health makes its debut: Is it wise to upload medical records and get a diagnosis from AI?

  • Jan 16, 2026 02:00

For millions of people, it has become normal to consult a chatbot before calling the doctor. ChatGPT Health stems directly from this everyday practice: OpenAI has created a dedicated health and wellness environment, kept separate from the general chatbot. It is designed to help users read medical test results, understand medical data and prepare for a doctor's appointment. That is no accident: every week, hundreds of millions of users ask questions about symptoms, blood tests, common aches and pains or a healthy lifestyle, which is why AI has increasingly become the first point of contact for health questions.

What exactly ChatGPT Health is and what it can do

On January 7, 2026, ChatGPT Health was officially presented as a digital hub where users can upload medical documents and link electronic health records. Apps such as Apple Health or MyFitnessPal, as well as smartwatches and fitness devices, can also be connected to it. The stated goal is not to diagnose or prescribe treatments, but to provide context: explaining how a particular value has progressed over time, helping formulate targeted questions for a specialist, and suggesting exercise regimens or nutrition plans that fit the available data. According to OpenAI, 260 physicians from 60 countries worked on it for more than two years, and the system has been tested with clinical benchmarks that assess not only accuracy, but also clarity, appropriateness and the ability to flag urgent situations.

What about privacy and regulations?

One of the most sensitive issues is the protection of medical data. OpenAI speaks of a shielded environment, with dedicated encryption and a promise that health information will not be used to train the underlying models. Users can disconnect linked sources at any time and delete chats and memory. Still, ChatGPT Health remains a consumer product, not a medical device: the data is not automatically covered by regulations such as HIPAA, the U.S. federal law protecting sensitive health information, and could, in principle, be the subject of legal requests. Moreover, for now the most advanced integrations are only available in the United States; in Europe and the United Kingdom, access to health data is more strictly regulated.

Don't confuse it with a real doctor

The arrival of ChatGPT Health is an important step, but it calls for caution. In the past, the same chatbot has repeatedly been inaccurate on medical topics, with answers that proved ambiguous, incomplete or plainly wrong. Even in this more developed version, AI cannot replace a professional medical assessment: there is no physical examination, no human and social context, no clinical experience. The risk is that users on their own will place too much trust in the system. Those who use ChatGPT Health wisely may find it a useful tool to orient themselves and arrive at an appointment better informed. Those who use it as a substitute for a doctor are left with a technological sham that can have real consequences for their health.

Source: OpenAI
