Exploring the use of ChatGPT in Reporting a Case of HIV, Syphilis, and Lyme Co-infection Presenting with Multiple Cranial Neuropathies
Nimmi Wickramasuriya1, Eleftheria Vyras1, Tse Chiang Chen1, Jorie Singer1, Martha Robinson1
1Tulane University Medical Center
Objective:
This case report explores the use of the Microsoft Bing Chat Generative Pre-Trained Transformer (ChatGPT) to report a unique case of co-infection with HIV, syphilis, and Lyme disease presenting with multiple cranial neuropathies and gait abnormalities.
Background:
Microsoft Bing ChatGPT is one of the latest AI-powered search engines built on the GPT language model developed by OpenAI, and it uses a testing version of the GPT-4 model. The chat feature offers a broad range of capabilities, including answering questions, writing articles, and sifting through scientific literature to provide references and resources.
Design/Methods:
The ChatGPT-4 model was used to write the case report and to generate background content from the input prompt alone; this output was then compared with background content generated by ChatGPT from information collected through our own research.
Results:
The case report itself revealed that both neurosyphilis and Lyme disease can produce a similar presentation with multiple cranial neuropathies. When concomitant HIV infection is present, clinical findings are exacerbated. In the setting of co-infection, it is difficult to determine which pathogen is primarily responsible. ChatGPT allowed us to generate text faster, whether that text was purely AI-generated or based on our own research. However, text generated from our own research was more accurate and expressed points more clearly. Purely AI-generated text did not provide enough citations, and the citations it did provide were not always correct.
Conclusions:
Overall, ChatGPT is a valuable tool for reducing the time and improving the efficiency of producing text, thereby promoting rapid turnover of literature. It can be used to generate text based on one's own research, but it should be used cautiously when relied on as the sole source of material. If that approach is employed, the output must be fact-checked.
10.1212/WNL.0000000000205349