For many physicians, AI tools may feel unnerving and even threatening as more and more patients begin to use them to get answers about their health

What Doctors Need to Know About ChatGPT and Other AI Tools

For many physicians, the rise of artificial intelligence (AI) tools, such as ChatGPT, may feel unnerving and even threatening, particularly as patients begin to use these tools to get answers about their health.

Although every tool has its dangers, and AI is in many ways still in the early days of its capabilities, AI tools are here to stay, according to experts. Physicians who educate themselves and learn to use these tools are likely to stay ahead of the curve and even enhance their practices.

The key thing to remember is that AI is a very powerful tool; like a hammer, it can be used to hit people or to build a house.

The fact that people can run around smacking others upside the head with a hammer is not a reason to fear the hammer itself, only specific uses of it.

The same is true of AI: if users are well educated about what they are using, it is not dangerous.

The more important question physicians should be asking is, “Am I getting value out of this?”

What is ChatGPT, and should physicians use it?

Currently, the AI known as ChatGPT, a large language model developed by OpenAI, is generating all the buzz, since a free version of it recently became available for layperson use.

In simple terms, ChatGPT is “an extremely fancy autocomplete.” In essence, the AI is a predictive language model that mimics language patterns based on what it has seen.

In ChatGPT’s case, it has seen almost every piece of written information that exists on the internet, including books. It can therefore respond to user queries with answers that sound humanlike, as though it is thinking, but it is not actually applying any sort of logic, nor checking itself.

What physicians need to be careful about is that, at this point, ChatGPT tends to make up information.

There have been cases where it was asked for information and provided a link and a paper title that turned out not to be real.

ChatGPT is trained to predict associations between word and sentence patterns, but it has no way to ensure accuracy, nor can it deliver quantitative results on its own, because it is not programmed to do math or make non-language-based associations.

But it could be paired with other kinds of software, such as Mathematica, a computational tool used by scientists in academia, and likely will be down the road for more complex uses.

However, ChatGPT can be useful to physicians in a couple of ways even now in its early days.

It is already useful broadly as a writing tool.

Physicians must spend a lot of time typing correspondence to patients or insurance companies, so they could ask ChatGPT to aggregate the basic information into a draft and then make a pass themselves for accuracy and specificity, saving time.
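As an illustration only, a drafting workflow along these lines could look something like the short Python sketch below. The model name, the prompt wording, and the sample facts are assumptions for the example rather than a recommendation, and any draft would still need the physician's review before it is sent.

```python
# A minimal sketch of drafting routine correspondence with the OpenAI API.
# Assumptions (not from the article): the "gpt-4o-mini" model name, the prompt
# wording, and the openai Python package (pip install openai) with an
# OPENAI_API_KEY environment variable set. Do not paste real patient
# identifiers into such a tool without an appropriate data-protection agreement.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical, de-identified facts the physician wants turned into a letter.
visit_facts = """
- 54-year-old patient, type 2 diabetes, HbA1c 8.2%
- Requesting prior authorization for a continuous glucose monitor
- Fingerstick-only monitoring has been inadequate over the last 6 months
"""

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system",
         "content": "You draft concise, professional letters to insurers on behalf of a physician."},
        {"role": "user",
         "content": f"Draft a prior-authorization letter from these notes:\n{visit_facts}"},
    ],
)

draft = response.choices[0].message.content
print(draft)  # the physician then edits this draft for accuracy and specificity
```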

Additionally, it can help translate technical medical jargon for patients.

We all complain that doctors give us explanations or write in our medical records in medical speak that’s very technical.

So, we could ask ChatGPT to translate this into a more human, understandable form so that patients know what the doctor is saying.

Furthermore, it could act like a medical assistant or scribe, recording conversations between physician and patient and then summarizing them into a report, or be used as a kind of virtual intern.
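A rough sketch of that scribe idea appears below; the transcript, prompt, and model name are purely illustrative (same assumptions as the earlier sketch), real patient conversations would require consent and compliant data handling, and the physician remains responsible for verifying whatever the tool produces.

```python
# A minimal sketch of the "scribe" idea: summarizing a visit transcript into a
# short draft note. Hypothetical transcript and prompt; assumes the openai
# package and an OPENAI_API_KEY environment variable, as in the earlier sketch.
from openai import OpenAI

client = OpenAI()

transcript = """
Doctor: What brings you in today?
Patient: I've had a dry cough and a low-grade fever for three days.
Doctor: Any shortness of breath or chest pain?
Patient: No, just tired.
"""

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system",
         "content": "Summarize the visit transcript into a brief note with sections: "
                    "Chief complaint, History, Points for the physician to confirm."},
        {"role": "user", "content": transcript},
    ],
)

print(response.choices[0].message.content)  # draft note for the physician to review
```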

If you need some inspiration, additional research, or a brainstorm to think through potential solutions, cases, or diagnoses, ChatGPT might be a good option for that.

What’s important to note is that patients will be using it to research health information and even self-diagnose, which is where physicians need to be vigilant, because ChatGPT gets so many things wrong.

Patients are already using it, and they aren’t trained to distinguish between correct and incorrect information, so that could create some tension between the patient and the doctor.

Other AI tools and their uses

AI tools are also useful for anything operations-related (inventory management, patient scheduling, nurse scheduling, vendor management, contract management, etc.).

There are already some uses of AI in diagnostics, such as in laboratories to detect cancer in patient tissue samples; this area is still a work in progress, but a promising one.

For obvious reasons, these AI diagnostic tools need to go through FDA approval and even clinical trials.

AI tools will also be able to help with insurance claims.

It is going to take over much of that administrative work of recording, making, and adjudicating claims, and it will be able to spot mistakes.

AI will not replace doctors

Although AI tools will keep getting more sophisticated and transform some health care practices, they will not replace doctors for many reasons.

The obvious reason is that “the warmth of the doctor-patient relationship can’t be replaced by AI.”

“Health care is so humane that AI is going to be an aid, but it’s not going to be a replacement for what doctors do.”

Furthermore, most AI needs a well-trained babysitter: an expert who can vouch for the quality of what is coming out of the tool, because there is a high likelihood that it will get something wrong, and that person needs to be trained to catch it.

Physicians who are unsure about these tools can start simply by becoming familiar with them.

The AI revolution is underway, and AI is here to stay. Although many of these tools are not yet very mature or accurate, they are going to get better and better.

A note of caution: these tools need clinical validation and regulatory approval to build trust among both physicians and patients, because there is potential for legal concern.

Health care providers need to investigate the outputs of these models, particularly in the case of adverse incidents or complaints.

Experts recommend that physicians be discerning about the tools they end up using and seek solutions to their problems rather than just the hot new AI tool. Shop for value, not for AI.

Most of the value that is going to be generated by AI is going to be delivered via a services layer of experts who know how to use it.

However, physicians who engage with AI tools and start taking advantage of them will be better prepared for the new economy and the new ways things will be done.

