AI gauges the mental health of cancer patients through eye movements

Jon Fingas
·Associate Editor

Good mental health is important, and early research suggests AI might help monitor it for people under particularly heavy strain. Scientists have developed a combination of deep learning algorithms that uses eye tracking to gauge the mental health of cancer patients after surgery. Ideally, this would help spot patients likely to be suffering from anxiety or depression when a human can’t perform an initial psychological assessment.

The system uses a mix of a convolutional neural network and long short-term memory algorithms to study the eye movements of patients wearing tracking glasses (in this case, Tobii Pro Glasses 2) while they contemplate artwork. The AI used the gaze and pupil position data from those glasses to estimate how likely someone was to raise concerns on established hope, anxiety and mental wellbeing questionnaires they would fill out later.
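To make the pipeline concrete, here is a minimal, untrained sketch of that kind of model in plain NumPy: a 1D convolution extracts local patterns from a gaze-and-pupil time series, an LSTM summarizes the sequence, and a sigmoid output gives a risk score. The input features, layer sizes, and weights are all illustrative assumptions, not the researchers' actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, w, b):
    # x: (T, C_in), w: (K, C_in, C_out), b: (C_out,) -> (T-K+1, C_out)
    K = w.shape[0]
    out = np.stack([np.tensordot(x[t:t + K], w, axes=([0, 1], [0, 1])) + b
                    for t in range(x.shape[0] - K + 1)])
    return np.maximum(out, 0.0)  # ReLU

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_last_hidden(x, Wx, Wh, b):
    # Standard LSTM cell, gates ordered [input, forget, cell, output];
    # returns the final hidden state as a fixed-size sequence summary.
    H = Wh.shape[0]
    h, c = np.zeros(H), np.zeros(H)
    for t in range(x.shape[0]):
        z = x[t] @ Wx + h @ Wh + b
        i, f, g, o = np.split(z, 4)
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
        h = sigmoid(o) * np.tanh(c)
    return h

# Toy "recording": 120 timesteps of (gaze_x, gaze_y, pupil_diameter) --
# stand-ins for what eye-tracking glasses would log.
seq = rng.standard_normal((120, 3))

# Random (untrained) weights, purely to show the shape of the data flow.
H = 16
w_conv = rng.standard_normal((5, 3, 8)) * 0.1
b_conv = np.zeros(8)
Wx = rng.standard_normal((8, 4 * H)) * 0.1
Wh = rng.standard_normal((H, 4 * H)) * 0.1
b_lstm = np.zeros(4 * H)
w_out = rng.standard_normal(H) * 0.1

features = conv1d(seq, w_conv, b_conv)            # (116, 8) local motion features
h_final = lstm_last_hidden(features, Wx, Wh, b_lstm)
risk = sigmoid(h_final @ w_out)                   # scalar in (0, 1)
```

In a real system these weights would be trained against the patients' later questionnaire scores, so the score could flag who might need follow-up.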

The initial results were promising, with accuracy between 93.8 and 95 percent depending on the test. As Reddit users noted, though, the study only looked at 25 subjects — that’s not a huge sample to draw from. More work would be necessary to ensure the AI works reliably in flagging at-risk patients, and the team acknowledges there should be “further validation.” This also assumes patients would be comfortable knowing machines were studying their eye movements — those in the study were, but others might bristle at the thought.

If the accuracy translates well to larger studies, though, the AI could prove extremely helpful for the healthcare industry. Patients could recover at home while sharing mental health data, and psychotherapists would only have to focus on those patients showing warning signs. In other words, the technology could improve the availability and quality of help for those who need it the most.