Handheld Non-mydriatic Fundus Camera for Bedside Inpatient Ophthalmology and Neurology Consultations
Kevin Yan1, Daniel Adamkiewicz2, Spencer Hutto3, Nancy Newman2, Valerie Biousse2
1Icahn School of Medicine at Mount Sinai, 2Emory University School of Medicine, 3Emory University: Neurology Residency Program
Objective:
We assessed the feasibility, usefulness, and generalizability of the Eyer2 and its integrated AI tool for ophthalmology/neuro-ophthalmology/neurology inpatient and emergency department (ED) consultations.
Background:
Bedside ocular funduscopic examination is challenging, both for trainees and for supervising attendings who must confirm findings remotely. The Eyer2 is a handheld, non-mydriatic, true-color 45/55-degree-field ocular fundus camera with electronic-health-record compatibility, automated upload of images to the Eyer-Cloud, and artificial intelligence (AI) capabilities. Its reliability when used by non-eye-care providers at the bedside is unclear.
Design/Methods:
We deployed the Eyer2 with trained users (medical students, and medicine/neurology/ICU residents, fellows, and attendings) in varied clinical settings, including outpatient clinics, EDs, and inpatient units. Images were acquired in both eyes without pupillary dilation and automatically uploaded to the Eyer-Cloud for remote interpretation by ophthalmologists. Image quality was graded as good/adequate/poor. Images were assessed for pathology and for AI interpretation accuracy.
Results:
Over 20 weeks, 482 photographs were obtained on 242 patients in the ED (19%), hospital floors (13%), ICU (19%), clinic (43%), and elsewhere (5%). 230 images were graded as good (48%), 181 as adequate (38%), and 71 as poor (15%). The most common reason for poor image quality was miosis (68%). 183 images were abnormal (38%), including papilledema (n=38) and diabetic/hypertensive retinopathy (n=35). The AI tool was activated for 66% of images (the tool does not activate on poor-quality images), with 76% sensitivity and 90% specificity for detecting ocular fundus abnormalities.
Conclusions:
Most Eyer2 fundus photographs provided useful data for remote interpretation, allowing accurate and rapid identification of optic nerve/macula pathology, immediate backup for trainees, and documentation of findings. Although image quality varied across user type and location, the camera generated useful images across multiple operators and clinical settings. Challenges included a steep learning curve for camera use and pupillary miosis, especially in ICUs. The AI interpretation was helpful mostly for good-quality photographs, alerting non-ophthalmology-trained providers to potential optic nerve or retinal pathology.
Disclaimer: Abstracts were not reviewed by Neurology® and do not reflect the views of Neurology® editors or staff.