A new deep learning model could reduce the need for surgery when diagnosing whether cancer cells are spreading beyond the primary tumor, including to nearby lymph nodes, a process known as metastasis. Developed by researchers from the University of Texas Southwestern Medical Center, the AI tool analyzes time-series MRIs and clinical data to identify metastasis, providing crucial, noninvasive support for doctors in treatment planning. The advance could lead to more timely and accurate cancer assessments, helping many patients avoid unnecessary surgery and improving outcomes.
Metastatic breast cancer is responsible for the majority of breast cancer-related deaths. About one in three women in the US diagnosed with early-stage breast cancer develops metastatic cancer. However, early detection and treatment can slow disease progression, help doctors and patients manage symptoms, and maximize the effectiveness of treatments.
Doctors often rely on sentinel lymph node biopsy (SLNB) when checking whether cancer has spread to the lymph nodes. The procedure involves injecting dye and a radioactive solution near the cancer site to identify the sentinel nodes, the first nodes to receive lymphatic drainage from the tumor area. These nodes are then surgically removed and biopsied. If cancer cells are found in the sentinel nodes, it indicates that the cancer is spreading into the lymphatic system and could spread further. This information helps doctors determine the most appropriate treatment for the patient.
While SLNB is a proven method, it’s invasive and comes with risks related to anesthesia, radiation exposure, swelling, pain, and limited movement near the incision.
To create a noninvasive and reliable alternative to SLNB, the researchers developed a custom four-dimensional convolutional neural network (4D CNN). They trained the model on dynamic contrast-enhanced MRI (DCE-MRI) scans and clinical data from 350 women recently diagnosed with breast cancer that had spread to the lymph nodes.
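The study's exact data format isn't described here, but a paired setup along these lines is one way to bundle each case's DCE-MRI time series, clinical variables, and lymph node label for training. The file layout, clinical fields, and tensor shapes in this PyTorch sketch are illustrative assumptions, not the researchers' actual pipeline.

```python
# Minimal sketch of how paired DCE-MRI series, clinical features, and node labels
# might be organized for training. Field names and shapes are assumptions.
import numpy as np
import torch
from torch.utils.data import Dataset

class NodeStatusDataset(Dataset):
    def __init__(self, records):
        # records: list of dicts, e.g.
        # {"mri_path": "case_001.npy",          # array of shape (T, D, H, W)
        #  "clinical": [52.0, 2.0, 1.0, 0.0],   # age, grade, markers (hypothetical)
        #  "node_positive": 1}
        self.records = records

    def __len__(self):
        return len(self.records)

    def __getitem__(self, idx):
        rec = self.records[idx]
        series = np.load(rec["mri_path"]).astype(np.float32)        # (T, D, H, W)
        series = (series - series.mean()) / (series.std() + 1e-6)   # per-case normalization
        clinical = torch.tensor(rec["clinical"], dtype=torch.float32)
        label = torch.tensor(rec["node_positive"], dtype=torch.float32)
        return torch.from_numpy(series), clinical, label
```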
The researchers used the Nucleus Compute Cluster, part of the University of Texas Southwestern Medical Center’s high-performance computing infrastructure, to build and train the complex 4D deep learning model on NVIDIA A100 Tensor Core and NVIDIA V100 Tensor Core GPUs.
“The deep learning model we built was a complex 4D model and GPUs were essential for us to achieve high training throughput as well as for our data preprocessing pipeline for image enhancement and noise reduction,” said NVIDIA Senior HPC Engineer Paniz Karbasi, a study coauthor and former Computational Scientist at the University of Texas Southwestern Medical Center.
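The study's preprocessing steps aren't detailed beyond image enhancement and noise reduction, but the sketch below shows one common GPU-side approach in PyTorch: light Gaussian denoising followed by intensity normalization of a DCE-MRI series. The kernel size, sigma, and the (T, D, H, W) shape convention are assumptions, and a CUDA device is assumed to be available.

```python
# A minimal sketch of GPU-side preprocessing (noise reduction plus intensity
# normalization). This is not the study's actual pipeline.
import torch
import torch.nn.functional as F

def gaussian_kernel3d(size=5, sigma=1.0, device="cuda"):
    # Separable 1D Gaussian expanded into a 3D smoothing kernel
    coords = torch.arange(size, dtype=torch.float32, device=device) - size // 2
    g = torch.exp(-coords**2 / (2 * sigma**2))
    g = g / g.sum()
    kernel = g[:, None, None] * g[None, :, None] * g[None, None, :]
    return kernel.view(1, 1, size, size, size)

def preprocess(series, device="cuda"):
    # series: (T, D, H, W) float tensor of DCE-MRI phases
    x = series.to(device).unsqueeze(1)            # (T, 1, D, H, W)
    x = F.conv3d(x, gaussian_kernel3d(device=device), padding=2)  # light denoising
    x = (x - x.mean()) / (x.std() + 1e-6)         # intensity normalization
    return x.squeeze(1)                           # back to (T, D, H, W)
```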
The AI model processes data in four dimensions, examining 3D MRI volumes while accounting for how they change over time. It learns features of tumors and nearby lymph nodes by analyzing multiple images acquired over time and integrating clinical data such as age, tumor grade, and breast cancer markers. By doing so, it can accurately identify patterns associated with cancer-free or cancer-affected lymph nodes.
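The study's 4D CNN architecture isn't reproduced here, but a minimal PyTorch sketch of the general idea might look like the following: a shared 3D convolutional encoder applied to each DCE-MRI time point, features pooled across time, and clinical variables concatenated before the final classification layer. The class name, layer sizes, and the choice of mean pooling over time are assumptions for illustration.

```python
# Hedged sketch of combining 3D volumes over time with clinical features.
# Not the authors' architecture; layer choices are illustrative.
import torch
import torch.nn as nn

class Spatiotemporal4DNet(nn.Module):
    def __init__(self, n_clinical=4):
        super().__init__()
        self.encoder = nn.Sequential(                 # shared 3D encoder per time point
            nn.Conv3d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.Conv3d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool3d(1),
        )
        self.head = nn.Sequential(                    # imaging + clinical fusion
            nn.Linear(32 + n_clinical, 64), nn.ReLU(),
            nn.Linear(64, 1),                         # logit: node-positive vs node-negative
        )

    def forward(self, series, clinical):
        # series: (B, T, D, H, W); clinical: (B, n_clinical)
        b, t = series.shape[:2]
        x = series.reshape(b * t, 1, *series.shape[2:])   # fold time into the batch
        feats = self.encoder(x).flatten(1).reshape(b, t, -1)
        pooled = feats.mean(dim=1)                        # fuse across time points
        return self.head(torch.cat([pooled, clinical], dim=1))
```

With this shape convention, a batch of two cases with six contrast phases, a series tensor of shape (2, 6, 32, 64, 64) and clinical features of shape (2, 4), yields a (2, 1) tensor of logits.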
“The most important aspect of our study is that for imaging data we solely focus on data related to the primary tumor, without any additional axillary imaging,” said study lead author Dogan Polat, an Interventional Radiology Resident at Mount Sinai Health Systems. Dr. Polat led the study while at the University of Texas Southwestern Medical Center. “We aim to decrease the need for additional imaging and reduce the number of invasive procedures for patients,” said Dr. Polat.
The model identifies lymph node metastasis with 89% accuracy, outperforming radiologists and other imaging-based models. It also has the potential to spare breast cancer patients unnecessary sentinel lymph node biopsies and axillary lymph node dissection (ALND), reducing the risks, complications, and resources associated with these procedures.
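The 89% figure comes from the study itself; for context, the snippet below is only a generic accuracy computation over a held-out test set. The 0.5 decision threshold and the loader interface (matching the dataset and model sketches above) are assumptions, not the researchers' evaluation code.

```python
# Generic accuracy computation for a binary node-status classifier.
import torch

@torch.no_grad()
def evaluate(model, loader, device="cuda"):
    model.eval()
    correct, total = 0, 0
    for series, clinical, labels in loader:
        logits = model(series.to(device), clinical.to(device)).squeeze(1)
        preds = (torch.sigmoid(logits) >= 0.5).float().cpu()  # threshold at 0.5
        correct += (preds == labels).sum().item()
        total += labels.numel()
    return correct / total   # fraction of node statuses predicted correctly
```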
According to Polat, the next steps for the researchers include deploying the model to gather real-world data, which will help validate its effectiveness and identify areas for further refinement and broader application.