Researchers Use AI to Improve Mammogram Interpretation

By HospiMedica International staff writers
Posted on 04 Jul 2018
A team of researchers at the Department of Energy’s Oak Ridge National Laboratory (Oak Ridge, TN, USA) successfully used artificial intelligence to improve understanding of the cognitive processes involved in image interpretation.
Their work, which was published in the Journal of Medical Imaging, will help reduce errors in the analyses of diagnostic images by health professionals and has the potential to improve health outcomes for women affected by breast cancer.

Early detection of breast cancer is critical for effective treatment, which requires accurate interpretation of a patient’s mammogram. The ORNL-led team found that radiologists’ analyses of mammograms were significantly influenced by context bias, i.e., carryover from the radiologist’s previous diagnostic experiences. New radiology trainees were most susceptible to the phenomenon, although even more experienced radiologists fell victim to it to some degree, according to the researchers.

The researchers designed an experiment that followed the eye movements of radiologists at various skill levels to better understand the context bias involved in their individual interpretations of the images. The experiment tracked the eye movements of three board-certified radiologists and seven radiology residents as they analyzed 100 mammographic studies from the University of South Florida’s Digital Database for Screening Mammography. The 400 images across these studies, representing a mix of cancer, no cancer, and cases that mimicked cancer but were benign, were specifically selected to cover a range of cases similar to that found in a clinical setting.

The participants, who were grouped by levels of experience and had no prior knowledge of what was contained in the individual X-rays, were outfitted with a head-mounted eye-tracking device designed to record their “raw gaze data,” which characterized their overall visual behavior. The study also recorded the participants’ diagnostic decisions via the location of suspicious findings along with their characteristics according to the BI-RADS lexicon, the radiologists’ reporting scheme for mammograms. By computing a measure known as a fractal dimension on the individual participants’ scan path (map of eye movements) and performing a series of statistical calculations, the researchers were able to discern how the eye movements of the participants differed from mammogram to mammogram. They also calculated the deviation in the context of the different image categories, such as images that show cancer and those that may be easier or more difficult to decipher.
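The article does not spell out how the fractal dimension of a scan path was computed; a standard way to estimate it for a 2-D point set is box counting: cover the gaze points with grids of shrinking cell size and fit the slope of log(occupied cells) against log(1/cell size). A minimal sketch, assuming gaze coordinates normalized to [0, 1] (function and parameter names are illustrative, not from the study):

```python
import numpy as np

def box_counting_dimension(points, box_sizes=(1/2, 1/4, 1/8, 1/16, 1/32)):
    """Estimate the fractal dimension of a 2-D scan path by box counting.

    points: (N, 2) array of gaze coordinates normalized to [0, 1].
    Returns the slope of log(occupied cells) vs. log(1 / box size).
    """
    points = np.asarray(points, dtype=float)
    counts = []
    for eps in box_sizes:
        # Assign each gaze point to a grid cell of side `eps`
        # and count how many distinct cells are occupied.
        cells = np.floor(points / eps).astype(int)
        counts.append(len({tuple(c) for c in cells}))
    # Fit log N(eps) = D * log(1/eps) + b; the slope D is the estimate.
    log_inv_eps = np.log(1.0 / np.array(box_sizes))
    log_counts = np.log(counts)
    D, _ = np.polyfit(log_inv_eps, log_counts, 1)
    return D
```

A near-straight scan path yields a dimension close to 1, while gaze that wanders over the whole image approaches 2, which is what makes the measure useful for comparing visual behavior across mammograms.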

In order to effectively track the participants’ eye movements, the researchers had to employ real-time sensor data, which logs nearly every movement of the participants’ eyes. With 10 observers interpreting 100 cases, however, the volume of data quickly grew too large to manage manually, leading the researchers to turn to artificial intelligence to make sense of the results efficiently and effectively. Using ORNL’s Titan supercomputer, the researchers were able to rapidly train the deep learning models required to make sense of the large datasets. While similar studies in the past have used aggregation methods to summarize such data, the ORNL team processed the full data sequence, a critical choice, as over time this sequence revealed differences in the eye paths of the participants as they analyzed the various mammograms.

In a related paper published in the Journal of Human Performance in Extreme Environments, the researchers demonstrated how convolutional neural networks, a type of artificial intelligence commonly applied to the analysis of images, significantly outperformed other methods, such as deep neural networks and deep belief networks, in parsing the eye tracking data and, by extension, validating the experiment as a means to measure context bias. Furthermore, while the experiment focused on radiology, the resulting data drove home the need for “intelligent interfaces and decision support systems” to assist human performance across a range of complex tasks including air-traffic control and battlefield management.
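The architectures themselves are not detailed in the article, but the intuition for why convolutional networks suit sequential gaze data can be shown with a minimal 1-D convolution in NumPy (a hand-rolled sketch, not the team’s model; all shapes and names are assumptions):

```python
import numpy as np

def conv1d(sequence, kernels, bias=None):
    """Minimal 'valid' 1-D convolution: the core operation a CNN
    applies when scanning a gaze sequence for local patterns.

    sequence: (T, C) array - T time steps, C features per gaze sample
              (e.g. x position, y position, fixation duration).
    kernels:  (K, W, C) array - K filters, each spanning W time steps.
    Returns:  (T - W + 1, K) feature map after a ReLU nonlinearity.
    """
    T, C = sequence.shape
    K, W, _ = kernels.shape
    if bias is None:
        bias = np.zeros(K)
    out = np.empty((T - W + 1, K))
    for t in range(T - W + 1):
        window = sequence[t : t + W]  # local slice of the scan path
        # Each filter responds to one local gaze pattern in this window.
        out[t] = np.tensordot(kernels, window, axes=([1, 2], [0, 1])) + bias
    return np.maximum(out, 0.0)
```

Because the same small filters slide across every window of the sequence, local gaze patterns are detected wherever they occur in time, whereas aggregation methods discard that ordering.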

While machines are unlikely to replace radiologists (or other humans involved in rapid, high-impact decision-making) any time soon, they hold enormous potential to assist health professionals and other decision makers in reducing errors due to phenomena such as context bias, according to Gina Tourassi, team lead and director of ORNL’s Health Data Science Institute. “These findings will be critical in the future training of medical professionals to reduce errors in the interpretations of diagnostic imaging. These studies will inform human/computer interactions going forward, as we use artificial intelligence to augment and improve human performance,” said Tourassi.

Related Links:
Oak Ridge National Laboratory










Copyright © 2000-2025 Globetech Media. All rights reserved.