Eye tracking combined with artificial intelligence is a developing area of research with a wide range of applications, as evidenced by the increasing number of studies in this field. Such studies show promising results for prognosis and diagnosis, as they provide insight into how doctors interpret images and the factors that influence their decision-making. In this study, we investigated whether potential diagnostic errors made by physicians can be recognized from eye movements using artificial intelligence. To this end, we engaged four radiologists with varying levels of diagnostic experience to analyze 400 chest X-ray images with a wide range of anomalies while concurrently capturing their eye movements with an eye tracker. For each of the resulting 1546 readings, we computed numerical features from the radiologists’ saccade data. We then applied three machine learning algorithms (random forest, support vector machine, and k-nearest neighbor classifiers), as well as a neural network, to map reading gaze features to radiological errors, achieving an error prediction accuracy of 0.7. Our experiments demonstrate a connection between diagnostic errors and gaze, indicating that eye-tracking data can serve as a valuable source of information for human error analysis.
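To illustrate the kind of classification setup described above, the following is a minimal sketch (not the authors' code) of mapping per-reading gaze features to error labels with the named model families; the file names, feature matrix, and hyperparameters are hypothetical placeholders.

```python
# Minimal sketch, assuming per-reading saccade features and binary error labels;
# not the authors' implementation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# X: one row per reading (e.g., 1546 readings), columns are numerical features
#    derived from saccade data (counts, amplitudes, durations, ...).
# y: 1 if the reading contained a diagnostic error, 0 otherwise.
X = np.load("gaze_features.npy")   # hypothetical file
y = np.load("error_labels.npy")    # hypothetical file

models = {
    "random forest": RandomForestClassifier(n_estimators=300, random_state=0),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    "kNN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=7)),
    "neural network": make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=1000, random_state=0),
    ),
}

# Estimate error-prediction accuracy for each model with 5-fold cross-validation.
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: mean CV accuracy = {acc:.2f}")
```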