Publication:
Application of ChatGPT as a content generation tool in continuing medical education: acne as a test topic.

cris.virtual.author-orcid: 0000-0001-8161-6138
cris.virtualsource.author-orcid: f7d9d367-6080-4fe7-9ee6-d72fcc445aed
cris.virtualsource.author-orcid: f008391c-fcfc-43fe-9155-a480a9df287e
datacite.rights: open.access
dc.contributor.author: Naldi, Luigi
dc.contributor.author: Bettoli, Vincenzo
dc.contributor.author: Santoro, Eugenio
dc.contributor.author: Valetto, Maria Rosa
dc.contributor.author: Bolzon, Anna
dc.contributor.author: Cassalia, Fortunato
dc.contributor.author: Cazzaniga, Simone
dc.contributor.author: Cima, Sergio
dc.contributor.author: Danese, Andrea
dc.contributor.author: Emendi, Silvia
dc.contributor.author: Ponzano, Monica
dc.contributor.author: Scarpa, Nicoletta
dc.contributor.author: Dri, Pietro
dc.date.accessioned: 2025-03-06T11:46:52Z
dc.date.available: 2025-03-06T11:46:52Z
dc.date.issued: 2024-11-28
dc.description.abstract: The large language model (LLM) ChatGPT can answer open-ended and complex questions, but its accuracy in providing reliable medical information requires careful assessment. As part of the AICHECK (Artificial Intelligence for CME Health E-learning Contents and Knowledge) Study, aimed at evaluating the potential of ChatGPT in continuing medical education (CME), we compared ChatGPT-generated educational content with the recommendations of the National Institute for Health and Care Excellence (NICE) guidelines on acne vulgaris. ChatGPT version 4 was given a 23-item questionnaire developed by an experienced dermatologist. A panel of five dermatologists rated the answers positively for "quality" (87.8%), "readability" (94.8%), "accuracy" (75.7%), "thoroughness" (85.2%), and "consistency" with guidelines (76.8%). The references provided by ChatGPT received positive ratings for "pertinence" (94.6%), "relevance" (91.2%), and "update" (62.3%). Internal reproducibility was adequate both for answers (93.5%) and for references (67.4%). Answers related to issues of uncertainty and/or controversy in the scientific community scored lowest. This study underscores the need to develop rigorous evaluation criteria for AI-generated medical content and for expert oversight to ensure accuracy and guideline adherence.
dc.description.sponsorship: Clinic of Dermatology
dc.identifier.doi: 10.48620/85797
dc.identifier.pmid: 39969058
dc.identifier.publisherDOI: 10.4081/dr.2024.10138
dc.identifier.uri: https://boris-portal.unibe.ch/handle/20.500.12422/205833
dc.language.iso: en
dc.publisher: PAGEpress
dc.relation.ispartof: Dermatology Reports
dc.relation.issn: 2036-7392
dc.subject.ddc: 600 - Technology::610 - Medicine & health
dc.title: Application of ChatGPT as a content generation tool in continuing medical education: acne as a test topic.
dc.type: article
dspace.entity.type: Publication
dspace.file.type: text
oairecerif.author.affiliation: Clinic of Dermatology
unibe.contributor.role: author
unibe.contributor.role: author
unibe.contributor.role: author
unibe.contributor.role: author
unibe.contributor.role: author
unibe.contributor.role: author
unibe.contributor.role: author
unibe.contributor.role: author
unibe.contributor.role: author
unibe.contributor.role: author
unibe.contributor.role: author
unibe.contributor.role: author
unibe.contributor.role: author
unibe.description.ispublished: inpress
unibe.refereed: true
unibe.subtype.article: journal

Files

Original bundle
Name: DERMA+10138.pdf
Size: 2.46 MB
Format: Adobe Portable Document Format
File Type: text
License: https://creativecommons.org/licenses/by-nc/4.0
Content: published