Breast imaging is awash in data – Artificial Intelligence can help

Breast imaging plays a vital role in women’s health, encompassing a range of imaging procedures that improve breast health through early detection, diagnosis, and treatment. Breast radiologists harness technologies including mammography, tomosynthesis, MRI, and ultrasound for screening and early detection, as well as for guiding biopsies and surgical planning. Given the importance of breast imaging across the care continuum, it’s no surprise that breast imaging is in high demand – demand that places a heavy burden on breast radiologists. A recent survey of members of the Society of Breast Imaging (SBI) found a high prevalence of burnout (77%) among practicing breast radiologists in the U.S.¹ Physician burnout, a state of chronic physical and emotional exhaustion resulting from prolonged stress and heavy workloads, has been increasingly recognized over the past decade as an epidemic within the United States.²

Demanding Exam and Data Volume = Cognitive Overload

Studies have shown that specialist radiologists – such as breast radiologists – detect more cancers, detect more early-stage cancers, and have lower recall rates than general radiologists. However, breast imaging has unique stressors compared with other areas of radiology that can contribute to radiologist fatigue and error: the job requires prolonged concentration, comprehensive image review, high image and data volumes, pressure to read quickly, and reporting based on 3D visualization of complex anatomy. Together, these factors contribute to physical and cognitive fatigue.³ The two most common stressors contributing to burnout, as highlighted in the SBI survey, were having to practice faster than ideal because of a high volume of work (71.2%) and trying to balance the demands of work with the time needed for personal and family life (66.9%).⁴ Excessive workloads, including high work intensity, have been associated with physician burnout.
Within radiology, studies have demonstrated that reading high volumes of studies may increase radiologist fatigue and interpretive errors. The American Medical Association reports that more than 87% of individual medical errors can be attributed to cognitive overload.⁵

Breast radiologists now process more information per study and per hour

Cognitive load theory suggests that completing a task relies on the interplay between sensory input, working memory, and long-term memory. Cognitive overload occurs when the working memory’s limited capacity is exceeded by excessive information processing. In breast imaging, radiologists face increasing cognitive load as the volume of images, cases, and data continues to rise, raising the risk of burnout. According to MQSA data, more than 43 million mammograms were performed in the United States in 2024, up from 40 million in 2022.⁶ While nearly two-thirds of radiologists read some mammograms, only 10%, or roughly 2,800, are breast radiologists.⁷ As breast imaging grows, the increased volume falls most heavily on specialists. This is further exacerbated by the growing volume of radiological data generated by 3D imaging modalities, such as digital breast tomosynthesis, and by the increasing availability of, and reimbursement for, supplemental imaging such as MRI and ultrasound.

The problem of anatomic overlap inherent to 2D mammograms led to the development of tomosynthesis. While studies have shown that tomosynthesis is more accurate than mammography, its 3D images dramatically increase the cognitive load of review. With tomosynthesis exams, the radiologist must review up to hundreds of slices, rather than four 2D images, to detect any abnormalities, while mentally correlating those slices into a 3D representation of the breast.
With the tremendous number of slices that must be reviewed visually, the volume of data coursing through imaging networks and radiologist neurons has also increased. Based on studies read per hour, for example, a radiologist processes 2 GB of data per hour with breast MRI, compared with more than 28 GB per hour for tomosynthesis.

Medical imaging has become a critical component of most diagnostic and therapeutic procedures. However, medical images are complex and require expert interpretation to detect, diagnose, and stage cancer and other diseases. Interpretation involves two fundamental processes: visually searching the image (perception) and rendering an interpretation (cognition).⁸ A typical digital mammography (2D) image contains 20-40 million pixels, approximately the same size as a breast CT or breast MR image. A tomosynthesis (3D) study contains 2-5 billion pixels. Compare that to the size of a small lesion, at approximately 9 pixels, and you can see the difficulty of finding and evaluating potential breast lesions under time and volume constraints.

Adding to both the perceptual and cognitive challenges, only a small fraction of mammograms typically have abnormal findings.⁹ Reading a mammogram is like looking for the proverbial needle in a haystack, except you are looking through hundreds of haystacks without knowing which one has a needle – or whether there is a needle to find at all. The current method of human interpretation of mammograms does not provide an adequate way to manage large amounts of clinical data in order to control cognitive burden – for example, by reducing redundant activities, reducing the volume of data reviewed, and prioritizing areas of focus. Breast AI, however, can help.

Using Breast AI to Address the Data Tsunami

Artificial Intelligence (AI) is ubiquitous – it is all around us. AI is used to help track your food delivery, recommend movies to stream, and manage performance in electric cars.
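Before turning to what AI can do, the scale of the search problem described earlier is worth making concrete with a little arithmetic. The sketch below uses rough midpoints of the pixel figures cited above; the midpoint values are assumptions, and actual counts vary by detector and protocol:

```python
# Back-of-envelope search-space arithmetic using the article's figures.
# Midpoint values are assumptions; real counts vary by detector and protocol.
PIXELS_2D = 30_000_000       # 2D mammogram: ~20-40 million pixels
PIXELS_3D = 3_500_000_000    # tomosynthesis study: ~2-5 billion pixels
LESION_PIXELS = 9            # small lesion: ~9 pixels

print(f"2D: one lesion pixel per {PIXELS_2D // LESION_PIXELS:,} image pixels")
print(f"3D: one lesion pixel per {PIXELS_3D // LESION_PIXELS:,} image pixels")
print(f"3D multiplies the search space roughly {PIXELS_3D // PIXELS_2D}x over 2D")
```

Even at these rough midpoints, a 9-pixel lesion occupies well under one millionth of a single study, and tomosynthesis expands the area to be searched by two orders of magnitude.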
AI is well suited to managing repetitive processes and identifying patterns in large amounts of data. Applying AI to image-interpretation tasks such as cancer detection and risk assessment can standardize the tools and approaches used for breast imaging tasks, particularly reading mammograms. This can decrease the time needed to read exams, improve accuracy, and decrease radiologists’ workload.

The history of AI in breast imaging dates back to the 1950s and ’60s; the first paper on computer-aided detection, examining the feasibility of automated reading, was published in 1967. There is a growing body of research demonstrating the ability of AI to meet or exceed the performance of human experts in several key aspects of medical-image analysis. Introduced in the 1990s, computer-aided detection (CAD) software for mammography showed early promise. However, that first generation of software was built on a foundation that could not achieve high sensitivity without giving up specificity: early CAD highlighted too many findings and failed to improve the performance of readers in real-world settings. Modern machine learning (including deep learning) now allows for far greater precision in algorithmic performance.

Transpara is the leading Breast AI, with a singular purpose: providing accurate and consistent breast cancer screening to all women. Transpara serves as an extra pair of eyes for radiologists, helping detect cancers earlier and reduce recall rates. Developed, tested, and trained on millions of images across diverse patient populations, Transpara Detection is designed to identify distinct features in mammograms that are likely to represent signs of cancer. It has been proven to improve the reading performance of radiologists regardless of their experience level,¹⁰ delivering accurate and consistent results for all women.
Improving Accuracy and Reducing Risk of Burnout

Transpara Detection is the most clinically validated Breast AI on the market: its performance has been evaluated in nearly 40 peer-reviewed studies and dozens of conference presentations. No other AI has as much published and readily available performance data, including new and notable results:
  • Researchers from UCLA found that Transpara Detection flagged 76% of mammograms originally read as normal but later linked to an interval breast cancer, and flagged 90% of reading-error cases in which the cancer had been visible on the mammogram but was missed or misinterpreted by the radiologist. (Journal of the National Cancer Institute)
 
  • In the first and only randomized controlled trial of Breast AI (the MASAI trial), Transpara helped radiologists detect more cancers while safely reducing reading workload by 44%, with no change in recall rate. (The Lancet Digital Health)
Transpara Detection analyzes either 2D or 3D studies and assesses the risk that breast cancer is present, directly within radiologists’ workflow. Exams are assigned to one of three risk categories, along with a lesion score and markings for specific regions of concern within the study. The higher the score, the higher the risk that cancer is present in the breast at the time of the mammogram. In screening settings, Transpara typically classifies 70% or more of exams as Low Risk, meaning they are not flagged and can be reviewed more efficiently. In a peer-reviewed study, the Low Risk group showed a 99.97% negative predictive value.¹¹ In essence, Transpara uses AI to pinpoint the haystack and show where the needle may be.

In both single-reading and double-reading environments, Transpara has delivered time savings primarily through unmarked Low Risk cases. In single-reading environments, it provides added confidence by acting as a second opinion on Low Risk cases. In double-reading environments, it has been used to replace the second reader in constrained settings – while also serving as a third reader for the most suspicious cases, offering an extra pair of eyes where it matters most.

In your practice, stay one step ahead with Transpara. Book a demo with our team today.
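As a thought experiment, the three-tier triage workflow described above can be sketched in a few lines of code. The 1-10 score scale and the cutoffs below are invented purely for illustration; they are not Transpara’s actual scoring, categories, or thresholds:

```python
# Illustrative three-tier risk triage; NOT Transpara's actual algorithm.
# The 1-10 score scale and cutoffs here are hypothetical.
def triage(score: int) -> str:
    """Map a hypothetical exam-level risk score to a category."""
    if score <= 7:
        return "Low Risk"          # typically unmarked; read efficiently
    if score <= 9:
        return "Intermediate Risk"
    return "Elevated Risk"         # flagged; warrants the closest review

# A worklist could then be sorted so the most suspicious exams are read first.
worklist = {"exam_a": 3, "exam_b": 10, "exam_c": 8}
for exam, score in sorted(worklist.items(), key=lambda kv: kv[1], reverse=True):
    print(exam, triage(score))
```

The point of a scheme like this is workload shaping: the large unflagged tier is read quickly, while attention concentrates on the small flagged tier.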
  1. Jay R Parikh, Jia Sun, Martha B Mainiero, Prevalence of Burnout in Breast Imaging Radiologists, Journal of Breast Imaging, Volume 2, Issue 2, March/April 2020, Pages 112–118, https://doi.org/10.1093/jbi/wbz091
  2.  Shanafelt, Tait D., et al. “Changes in burnout and satisfaction with work-life integration in physicians and the general US working population between 2011 and 2017.” Mayo Clinic Proceedings. Vol. 94. No. 9. Elsevier, 2019.
  3.  Waite S, Kolla S, Jeudy J, et al. Tired in the reading room: the influence of fatigue in radiology. J Am Coll Radiol 2017; 14:191–197
  4. Parikh JR, Sun J, Mainiero MB. What Causes the Most Stress in Breast Radiology Practice? A Survey of Members of the Society of Breast Imaging. J Breast Imaging. 2021 Apr 19;3(3):332-342. doi: 10.1093/jbi/wbab012. PMID: 34056593; PMCID: PMC8139609.
  5. How bad “cognitive ergonomics” can drain doctors’ brainpower, American Medical Association, https://www.ama-assn.org/practice-management/physician-health/how-bad-cognitive-ergonomics-can-drain-doctors-brainpower
  6. MQSA National Statistics as of March 1, 2025.
  7. Lewis RS, Sunshine JH, Bhargavan M. A portrait of breast imaging specialists and of the interpretation of mammography in the United States. AJR Am J Roentgenol. 2006;187:W456-W468.
  8. National Institutes of Health, National Cancer Institute. Breast Cancer Screening (PDQ®)–Health Professional Version: https://www.cancer.gov/types/breast/hp/breast-screening-pdq#_13_toc
  9. Krupinski EA. Current perspectives in medical image perception. Atten Percept Psychophys. 2010 Jul;72(5):1205-17. doi: 10.3758/APP.72.5.1205. PMID: 20601701; PMCID: PMC3881280.
  10. A. Rodriguez-Ruiz et al., Journal of the National Cancer Institute, 2019. https://doi.org/10.1093/jnci/djy222
  11. Raya-Povedano et al., 2021, Radiology. https://pubs.rsna.org/doi/full/10.1148/radiol.2021203555