To assess inter-examiner agreement for virtual concussion telemedicine examinations, and to determine agreement between virtual and in-person concussion examinations performed by the same physician.
We developed a virtual concussion telemedicine examination by adapting several in-office examination methods. The virtual exam was compared with the in-person concussion exam to assess reliability and to validate the use of telemedicine via audio-video conferencing technology.
We computed Cohen's kappa to assess agreement on dichotomous examination ratings for each of 29 exam elements across 21 participants. Kappa values for inter-examiner agreement ranged from 0.31 to 1.0, with a median kappa of 0.76; 45% of exam elements showed excellent inter-examiner agreement between the two telemedicine examiners (kappa > 0.75), and 75% showed at least intermediate agreement (kappa > 0.40). Within the same examiner, comparing telemedicine vs. in-person examinations, Cohen's kappa values were higher overall, ranging from 0.48 to 1.0.
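The agreement statistic above can be illustrated with a minimal sketch. This is not the authors' analysis code, and the ratings below are hypothetical, not study data; it simply shows how Cohen's kappa is computed for dichotomous (0/1) ratings of one exam element by two examiners:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two paired lists of dichotomous (0/1) ratings."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of participants rated identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal frequencies.
    p_e = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in (0, 1)
    )
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings for 10 participants on a single exam element.
examiner_1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
examiner_2 = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]
print(round(cohens_kappa(examiner_1, examiner_2), 2))  # → 0.52
```

Kappa corrects raw percent agreement for the agreement expected by chance, which is why it is preferred over simple percent agreement for this kind of paired-rater design.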
We found that the treating physician's virtual and in-person concussion examination findings largely agreed. There was less agreement between virtual concussion examinations performed by two different physicians, suggesting that the expertise and experience of the examining physicians contribute more to variability than does the modality or area of the examination. This study sets the stage for further investigations of reliability between in-person and teleneurology examinations.