
Radiologists at Center of AI Systems for Better Cancer Diagnoses

It’s no surprise that the recent surge of interest in artificial intelligence (AI) has sparked conversations about the future, especially regarding its potential in medicine. However, despite rapid advances in the technology, several studies show that AI systems do not improve radiologists’ diagnostic performance.

An NCI-funded project hopes to change that. The University of Houston’s Dr. Hien Van Nguyen is working to fundamentally change the design of medical AI systems so they work with human radiologists to produce more accurate diagnoses.

“Current AI systems focus on improving stand-alone performances while neglecting team interaction with radiologists. This is a critical limitation,” says Dr. Van Nguyen.

His project, “AI-Doctor Collaborative Medical Diagnosis,” began receiving NCI funds in 2022. It combines novel AI algorithms, gaze monitoring software, and design principles to help doctors minimize diagnostic errors due to cognitive and perceptual biases. If the project is successful, it could increase diagnostic accuracy, save lives, reduce missed cancer diagnoses, and improve public health.

“The goal is to establish a comprehensive computational framework that will enable AI to collaborate with human radiologists in the domain of medical diagnosis. We want to put radiologists at the center of AI systems, creating an environment in which human expertise and machine learning algorithms work hand-in-hand to achieve superior diagnostic accuracy and better patient outcomes,” he adds.

The project’s three aims develop the fundamental theory and evaluate the proposed approaches in targeted clinical applications:

  1. Develop computational principles for optimal AI-radiologist interaction. These principles will guide how radiologists and AI exchange information, helping achieve the best possible diagnostic performance while reducing interpretation time.
  2. Design a user-friendly, minimally interfering interface that lets radiologists interact with AI models efficiently. The proposed system combines a “multimodal thinking with audio and gaze” methodology with user-centered design, resulting in a novel radiologist-AI collaborative interface that maximizes time efficiency while minimizing distraction.
  3. Evaluate the approaches from Aims 1 and 2 by testing them on two clinical tasks: lung nodule detection and pulmonary embolism detection. Studying how radiologists collaborate with AI to reduce diagnostic errors may lead to significant clinical impact.

“Our approaches are creative and original because they represent a substantive departure from the existing algorithms. Instead of continuously providing AI predictions, our system uses a gaze-assisted reinforcement learning agent to determine the optimal time and type of information to present to radiologists.”
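For readers curious about what such an agent might look like in code, below is a minimal, purely illustrative sketch in Python. It is not the project’s actual system: the state features, action set, class names, and reward are assumptions made for this example, and a real agent would be trained on recorded gaze data and diagnostic outcomes rather than the toy update shown here.

    # Minimal sketch (illustrative only): an agent that watches simplified gaze
    # signals and decides, at each step, whether to surface an AI finding to the
    # radiologist or stay silent. All names below are hypothetical.

    from dataclasses import dataclass
    import random


    @dataclass
    class GazeState:
        """Simplified observation of the radiologist at one moment in time."""
        seconds_on_region: float      # dwell time on the image region flagged by the AI
        region_already_viewed: bool   # whether the radiologist has fixated that region
        ai_confidence: float          # model confidence for the suspected finding (0-1)


    class GazeAssistedAgent:
        """Toy epsilon-greedy policy over two actions: wait or show the AI finding.

        A real reinforcement learning agent would learn values from rewards such as
        diagnostic accuracy and reading time; here a one-step Q-learning update on a
        small table is used only to illustrate the idea.
        """

        ACTIONS = ("wait", "show_finding")

        def __init__(self, epsilon: float = 0.1, lr: float = 0.1):
            self.epsilon = epsilon
            self.lr = lr
            self.q = {}  # (discretized state, action) -> estimated value

        def _key(self, s: GazeState, action: str):
            # Discretize continuous features so the toy Q-table stays small.
            return (round(s.seconds_on_region), s.region_already_viewed,
                    round(s.ai_confidence, 1), action)

        def act(self, s: GazeState) -> str:
            # Explore occasionally; otherwise pick the higher-valued action.
            if random.random() < self.epsilon:
                return random.choice(self.ACTIONS)
            return max(self.ACTIONS, key=lambda a: self.q.get(self._key(s, a), 0.0))

        def update(self, s: GazeState, action: str, reward: float):
            # Nudge the stored estimate toward the observed reward.
            k = self._key(s, action)
            self.q[k] = self.q.get(k, 0.0) + self.lr * (reward - self.q.get(k, 0.0))


    if __name__ == "__main__":
        agent = GazeAssistedAgent()
        state = GazeState(seconds_on_region=2.0, region_already_viewed=False,
                          ai_confidence=0.85)
        action = agent.act(state)
        # In practice the reward would come from downstream outcomes
        # (e.g., a correct detection or time saved), not a hard-coded value.
        agent.update(state, action, reward=1.0 if action == "show_finding" else 0.0)
        print(action)

The key design point the sketch tries to convey is the one in the quote above: rather than displaying every AI prediction continuously, the agent treats “when and what to show” as a decision informed by where the radiologist is looking.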

The project is set to wrap in 2026.
