Radiologist-AI interaction is a novel research area of potentially great impact. It has been observed in the literature that radiologists' performance deteriorates toward the end of a shift and that their gaze patterns change visibly. However, the quantitative features in these patterns that would be predictive of fatigue have not yet been identified. A radiologist was recruited to read chest X-rays while his eye movements were recorded. His fatigue was measured using a target concentration test and the Stroop test, with the number of analyzed X-rays serving as the reference fatigue metric. A framework with two convolutional neural networks, based on the UNet and ResNeXt50 architectures, was developed for the segmentation of lung fields. This segmentation was used to analyze the radiologist's gaze patterns. With a correlation coefficient of 0.82, the eye-gaze features extracted from the lung segmentation exhibited the strongest fatigue-predictive power among the alternative features considered.
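The abstract does not specify which gaze features were computed over the lung segmentation. A minimal sketch of one plausible such feature, assuming fixations are given as pixel coordinates and the lung fields as a binary mask (the feature definition here is an illustrative assumption, not the authors' exact method):

```python
import numpy as np

def fraction_of_fixations_in_lungs(fixations, lung_mask):
    """Fraction of fixation points falling inside the segmented lung fields.

    fixations: (N, 2) integer array of (row, col) pixel coordinates.
    lung_mask: 2-D binary array produced by the lung segmentation network.
    """
    rows, cols = fixations[:, 0], fixations[:, 1]
    inside = lung_mask[rows, cols].astype(bool)
    return inside.mean()

# Toy example: a 4x4 image whose left half is marked as lung.
mask = np.zeros((4, 4), dtype=np.uint8)
mask[:, :2] = 1
fix = np.array([[0, 0], [1, 1], [2, 3], [3, 2]])
print(fraction_of_fixations_in_lungs(fix, mask))  # 2 of 4 fixations inside -> 0.5
```

A feature like this, computed per X-ray, could then be correlated against the reference fatigue metric (the running count of analyzed X-rays) to obtain a coefficient such as the reported 0.82.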
Detection of lung diseases from chest X-rays has attracted great interest from the research community over the last decade. Despite the existence of large annotated public databases, computer-aided diagnostic solutions still fail on challenging rare abnormality cases. In this study, we investigated the paradigm of combining the analysis of chest X-rays with the physician's gaze patterns during the reading of those X-rays to improve computerized diagnostic accuracy. A Tobii Eye Tracker 4C was mounted on a physician's workstation, and the physician's eye movements were recorded during the analysis of 400 chest X-rays over two days of work. The X-rays were sampled from the CheXpert, RSNA, and SIIM-ACR public databases and labeled with 14 different pathology types. The task was formulated as a binary classification problem. A ResNet34-based neural network was trained to map the input chest X-ray to both the physician's gaze map and the binary pathology label. The proposed network improved the diagnostic accuracy to 0.714 area under the receiver operating characteristic curve (AUC), compared with 0.681 AUC for the same ResNet34 trained to produce binary pathology labels alone. This study demonstrates the potential benefit of using gaze information in computerized diagnostic solutions.