The Possibility Of Using GPT-4 In Healthcare Systems

Microsoft And Epic Systems Are Planning To Use GPT-4 AI To Help Draft Healthcare System Staff Responses To Patients As Well As Analyze Medical Records.

Epic Systems is one of the largest healthcare software companies in the United States. Its electronic health record (EHR) software is reportedly used in more than 29 percent of US hospitals, and more than 305 million patients worldwide have electronic records in Epic’s software. In the past, however, Epic has been criticized several times for deploying flawed prediction algorithms in clinical settings.

In Monday’s announcement, Microsoft said Epic will use its Azure OpenAI Service, which provides API access to OpenAI’s large language models (LLMs) such as GPT-3 and GPT-4. In other words, companies can access OpenAI’s models through Microsoft’s Azure cloud platform.

The first use of GPT-4 lets physicians and healthcare system staff automatically draft response messages to patients. “Incorporating artificial intelligence into some of our daily workflows will increase the productivity of many of our physicians and allow them to focus more on their clinical duties, which is what patients need,” the company’s press release said.
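For illustration only, here is a minimal sketch of what such a draft-reply call might look like through the Azure OpenAI Service, using the openai Python package. The endpoint, deployment name, key, and prompts are placeholder assumptions, not details from the announcement, and any real workflow would keep a clinician reviewing every draft before it is sent.

```python
import openai

# Point the openai client at an Azure OpenAI resource instead of the
# public OpenAI endpoint. All values below are placeholders.
openai.api_type = "azure"
openai.api_base = "https://YOUR-RESOURCE.openai.azure.com/"
openai.api_version = "2023-05-15"
openai.api_key = "YOUR-AZURE-OPENAI-KEY"

# Ask the model to draft a reply for staff to review; "gpt-4" here is
# the name of a hypothetical Azure model deployment.
response = openai.ChatCompletion.create(
    engine="gpt-4",
    messages=[
        {
            "role": "system",
            "content": "Draft a polite, plain-language reply for clinic "
                       "staff to review. Do not give medical advice.",
        },
        {
            "role": "user",
            "content": "Patient message: When will my lab results be ready?",
        },
    ],
)

# The draft goes to a human reviewer, not directly to the patient.
print(response["choices"][0]["message"]["content"])
```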

However, the partnership between Microsoft and Epic has raised concerns among AI researchers, partly because of GPT-4’s tendency to fabricate information that is not present in its training data.

“Language models are not trained to produce facts, but to produce things that look like reality,” says artificial intelligence researcher Dr. Margaret Mitchell. “In my opinion, relying on AI for critical matters is a mistake. If you want to use AI to write creative stories or to learn a language, it can help.”

Mitchell explains that the use of artificial intelligence in healthcare has been studied for years. One of the most important findings of these studies is that AI-generated information can be misleading, partly because it is produced from learned patterns and prompts rather than verified facts.

In addition, some are not convinced that a language model is an appropriate tool for use in clinical systems at all.

On the other hand, some artificial intelligence experts believe the technology could shine in this field, and they wonder why large companies remain reluctant to use it or consider it not worth the investment.

Another concern about using artificial intelligence in clinical systems is GPT-4’s strong biases, which may discriminate against particular groups of patients based on gender, race, age, or other factors. Critics point out that GPT-4 has many of the same limitations as previous models; for example, it can produce biased and unreliable content.

Even if Epic uses a fine-tuned version of GPT-4 trained explicitly for use in the medical field, the AI may still carry subtle biases.

With these limitations in mind, OpenAI’s usage policies clearly state that its models must not be used to give guidance on treating diseases. According to a company statement, its models are not tuned to provide medical information, and they should never be used to provide diagnostic or treatment services for serious illnesses.

With luck, Epic and Microsoft will use GPT-4 in ways that play to its strengths, rather than relying on it, without a medical professional involved, to diagnose conditions or analyze patients’ medical records.