Commentary

ChatGPT in Dermatology Clinical Practice: Potential Uses and Pitfalls

Practice Points

  • ChatGPT potentially can play a beneficial role in dermatologic practice by quickly accessing and synthesizing information, drafting generic medical documents, interpreting during patient visits, advancing medical education, and more.
  • Dermatologists using ChatGPT should be extremely cautious, as it can produce false or biased information, perpetuate harmful stereotypes, and present information that is not up-to-date.


Artificial intelligence (AI) technology has increasingly been incorporated into medicine. In dermatology, AI has been used to detect and diagnose skin lesions, including skin cancer.1 ChatGPT (OpenAI) is a novel and highly popular development in generative AI technology. A large language model released in 2022, ChatGPT is a chatbot designed to mimic human conversation and generate specific, detailed information when prompted. Free and publicly available, it has been used by millions of people. ChatGPT’s application in the medical field currently is being evaluated across several specialties, including plastic surgery, radiology, and urology.2-4 ChatGPT has the potential to assist health care professionals, including dermatologists, though its use raises important ethical considerations. Herein, we focus on the potential benefits as well as the pitfalls of using ChatGPT in dermatology clinical practice.

Potential Uses of ChatGPT in Practice

A major benefit of ChatGPT is its ability to improve clinical efficiency. First, ChatGPT can provide quick access to general medical information, similar to a search engine but with more natural language processing and contextual understanding to synthesize information.5 This function is useful for obtaining rapid, concise answers to specific and directed questions. ChatGPT also can interact with its user through follow-up questions to produce more precise and relevant responses; this feature may help dermatologists form more accurate differential diagnoses. ChatGPT also can increase efficiency in clinical practice by drafting generic medical documents,2 including templates for after-visit summaries, postprocedure instructions, referrals, prior authorization appeal letters, and educational handouts. Importantly, increased efficiency can reduce provider burnout and lead to improved patient care.

Another useful feature of ChatGPT is its ability to output information that models human conversation. Because of this feature, ChatGPT could be employed in clinical practice as an interpreter for patients during clinic visits. Currently, the use of virtual translators can be cumbersome and subject to technical constraints. ChatGPT can provide accurate and conversational translations for patients and dermatologists, improving the patient-provider relationship.

ChatGPT also can contribute to major advancements in the field of dermatology beyond the clinical setting. Because it can draw on the extensive data on which it was trained, ChatGPT has several uses in a research context: finding resources for research and reviews, formulating hypotheses, drafting study protocols, and collecting large amounts of data within seconds.6

ChatGPT also has potential in advancing medical education. It could be used by medical schools to model interactive patient encounters to help students practice taking a patient’s history and creating differential diagnoses.6 This application of ChatGPT may help medical students hone their clinical skills in a low-stress environment without the restrictions that can come with hiring and training standardized patients, especially when mimicking dermatologic clinical encounters.

Other possibilities for ChatGPT in dermatologic practice include survey administration, clinical trial recruitment, and even automated high-risk medication monitoring. Despite these many potential applications, each scenario raises questions about the quality, accuracy, and safety of what ChatGPT produces.

Potential Pitfalls of ChatGPT in Practice and Possible Mitigation Strategies

A main concern in using ChatGPT in clinical practice is its potential to produce inaccurate or biased information. When prompted to create research abstracts based on previously published studies, ChatGPT drafted abstracts that were clear and digestible but contained incorrect data.7 A group of medical researchers who reviewed these ChatGPT-generated abstracts mistook 32% of them for abstracts written by human researchers. The implications of this finding are worrisome. If inaccurate or false information generated by ChatGPT is included in documents sent to insurance companies or patients, the patient’s safety as well as the dermatologist’s license and credibility are at stake. Thus, dermatologists looking to use ChatGPT to draft generic medical documents should actively review the output to ensure that the information is accurate. Importantly, ChatGPT currently is trained only on information available up to 2021, limiting its access to recently published research articles and updated International Classification of Diseases, Tenth Revision codes.5 Continued development and regular updates by OpenAI may resolve this shortcoming in the future. Further, AI models may encode and perpetuate harmful stereotypes and social biases present in their training data.8

When considering its potential in clinical practice, ChatGPT itself states that it can aid in clinical decision-making by processing patient information, including history, current symptoms, and biopsy and test results. This is uncharted territory, and providers who use ChatGPT at this stage to aid in decision-making should regard its output much as they would information retrieved from a search engine: it can be used to support but not definitively confirm a diagnosis or dictate management. The dermatologist’s clinical suspicion should always trump ChatGPT output. If physicians recommend ChatGPT output over their own advice, they assume liability, as the technology is not regulated in any way. Patients also should be cautious when seeking and following medical advice from ChatGPT, as it could be misleading or incorrect and could undermine the patient-physician relationship.6
