
[Image courtesy of the University of Houston]
A University of Houston researcher has developed an artificial intelligence system that can predict where a radiologist will look next on a chest X-ray before the radiologist's eyes move there.
The tool, called MedGaze, is designed to improve radiology training, hospital efficiency and AI-powered diagnostic accuracy by mimicking expert gaze patterns.
“We’re not just trying to guess what a radiologist will do next; we’re helping teach machines and future radiologists how to think more like experts by seeing the world as they do,” said Hien Van Nguyen, associate professor of electrical and computer engineering at UH and lead author of a new study published in Scientific Reports, a Nature Portfolio journal.
MedGaze functions as a “digital gaze twin,” trained on thousands of eye-tracking sessions that recorded how experienced radiologists examined chest X-rays. The software models where radiologists look, how long they fixate and in what order, producing a detailed real-time picture of expert visual interpretation.
Nguyen said MedGaze’s ability to anticipate long fixation sequences distinguishes it from previous AI efforts in computer vision.
“Unlike previous computer vision efforts that focus on predicting scan paths based on specific objects or categories, our approach addresses a broader context of modeling scan path sequences for searching multiple abnormalities in chest x-ray images,” Nguyen said. “Specifically, the key technical innovation of MedGaze is its capability to model fixation sequences that are an order of magnitude longer than those handled by the current state-of-the-art methods.”
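The article does not describe MedGaze's internals, so the sketch below is purely illustrative: a toy first-order Markov model that learns region-to-region transition statistics from recorded fixation sequences and predicts the most likely next region. The grid size, coordinates, and durations are all hypothetical, and the real system, per the quote above, handles fixation sequences far longer than such a simple model could capture.

```python
# Illustrative sketch only: MedGaze's actual architecture is not described
# in the article. This toy example shows the general idea of learning
# scan-path statistics from recorded fixations, using a first-order
# Markov model over a coarse grid of image regions (all values hypothetical).
from collections import defaultdict

GRID = 4  # discretize the X-ray into a 4x4 grid of regions (assumption)

def to_region(x, y):
    """Map a normalized (0-1) fixation coordinate to a grid-cell index."""
    col = min(int(x * GRID), GRID - 1)
    row = min(int(y * GRID), GRID - 1)
    return row * GRID + col

def train(scan_paths):
    """Count region-to-region transitions across recorded scan paths."""
    counts = defaultdict(lambda: defaultdict(int))
    for path in scan_paths:
        regions = [to_region(x, y) for x, y, _dur in path]
        for a, b in zip(regions, regions[1:]):
            counts[a][b] += 1
    return counts

def predict_next(counts, x, y):
    """Predict the most frequently observed next region from (x, y)."""
    here = to_region(x, y)
    nxt = counts.get(here)
    if not nxt:
        return None
    return max(nxt, key=nxt.get)

# Hypothetical training data: each fixation is (x, y, duration_ms).
paths = [
    [(0.1, 0.1, 220), (0.6, 0.2, 340), (0.6, 0.7, 180)],
    [(0.1, 0.1, 200), (0.6, 0.2, 300), (0.2, 0.8, 250)],
]
model = train(paths)
print(predict_next(model, 0.1, 0.1))  # most likely next region after the upper-left cell
```

A model of this kind only looks one fixation back; the innovation Nguyen describes is precisely the ability to condition on much longer fixation histories across multiple abnormalities.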
According to UH, the system could help hospitals allocate reading time more effectively, understand diagnostic workload and improve training by offering new insights into how experts process complex cases. It may also enhance existing diagnostic AI systems by prioritizing the same regions of medical images that human experts would examine first.
While the tool is currently tailored to chest X-rays, Nguyen and his team plan to expand its capabilities to include other imaging modalities, such as MRI and CT scans.
“This opens the door to a unified, AI-driven approach for understanding and replicating clinical expertise across the full spectrum of medical imaging,” Nguyen said.
The research team also included UH graduate students Akash Awasthi and Mai-Anh Vu, as well as Carol Wu and Rishi Agrawal, professors at MD Anderson Cancer Center.