Meteorology, Aerospace Engineering, and Cancer Research—The Future of Predictive Modeling
What do winter storms, airplanes, and cancer research have in common? More than you might think. This was the underlying theme of a recent virtual workshop, “Accelerating Precision Radiation Oncology Through Advanced Computing and Artificial Intelligence,” hosted by NCI’s Frederick National Laboratory for Cancer Research (FNLCR) and the U.S. Department of Energy (DOE). The workshop was the first of several exploring how radiation treatment might be improved through mechanism-based, computationally enabled modeling and advanced computing, with the goal of making predictions and treatment decisions that are dynamic, multiscale, data-informed, and clinically actionable. The workshops are an outgrowth of the NCI-DOE Collaboration and the Computational Cancer Community.
The kickoff event brought together experts in meteorology, aerospace engineering, and radiation oncology to explore what we can learn from these very different fields to advance how we target and apply radiation to treat cancerous tumors.
The workshop was moderated by Dr. Jeff Buchsbaum, medical officer and program director, NCI Radiation Research Program; and Dr. David Jaffray, professor and chief technology and digital officer, The University of Texas MD Anderson Cancer Center.
Three speakers were featured: Dr. Kelvin Droegemeier, former director of the White House Office of Science and Technology Policy and director emeritus of the Center for Analysis and Prediction of Storms (CAPS) at the University of Oklahoma; Dr. Karen Willcox, professor and director, Oden Institute for Computational Engineering and Sciences at The University of Texas at Austin; and Dr. Caroline Chung, associate professor and director of Imaging Technology and Innovation for the Division of Radiation Oncology at The University of Texas MD Anderson Cancer Center.
As noted by Dr. Jaffray, the goal was to imagine how computing can impact radiation oncology and to look for ways to leverage the remarkable resources available today to move the field forward. In short, said Dr. Jaffray, we want to start a conversation to see “how computing can change the world of oncology.”
The Farmer’s Almanac, Doppler Radar, and Predictive Modeling
To illustrate the power of computing and just how far the meteorology field has advanced, Dr. Droegemeier described the genesis of weather prediction. He noted that the Farmer’s Almanac has now been joined by numerical models and high-performance computers to predict weather events from a global perspective down to a specific neighborhood.
It all starts with data. Dr. Droegemeier described how data from balloons, radar, commercial aircraft sensors, surface sensors, and satellites all converge, feeding into models that show what’s likely to happen. Those predictions give us insight into events days, or even hours, before a storm occurs. Said Dr. Droegemeier, “We then identified and applied laws, such as Newton’s laws of motion and the theory of thermodynamics, to develop mathematical formulas and models.” In tracking how this technology evolved, he said the most significant advance in weather prediction occurred with the advent of 3D modeling to examine individual thunderstorms.
Could the behavior of tumor cells be tracked in the same way as storm cells? Dr. Droegemeier said yes, but noted that, in weather prediction, four major advances had to occur first: 3D modeling, which gives insight into a thunderstorm’s interior structure (as shown by Doppler radar); the ability to differentiate between what is “noise” and what is an important storm event; an increase in computing power; and the development of fine-scale models using observational data.
Developing such fine-scale models was no easy feat, primarily because there is never “enough” data. Instead, said Dr. Droegemeier, “We had to infer what we could not observe using data assimilation.” He called this the “adjoint method.” With this method, a first-guess model state serves as the initial condition, and that state is adjusted based on observations collected over time. This essentially forces the model to conform to reality, minimizing the differences between what is observed and what the model predicts. With weather, the resulting models allowed meteorologists to accurately predict the future direction and intensity of storms down to a specific street in a specific neighborhood.
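To make the idea concrete, here is a minimal sketch of adjoint-based data assimilation in Python. It is not drawn from CAPS or any operational forecasting system: a toy one-variable model stands in for the atmospheric physics, and gradient descent on the initial condition forces the modeled trajectory toward noisy observations. All dynamics, constants, and data are invented for illustration.

```python
# A minimal sketch of adjoint-based data assimilation on a toy model.
# The "atmosphere" here is a single scalar obeying x[k+1] = a * x[k];
# real systems use full 3D physics, but the adjoint logic is the same.
import numpy as np

a = 0.95          # known model dynamics (toy stand-in for the physics)
n_steps = 20      # number of observation times
true_x0 = 4.0     # the initial condition we pretend not to know

# Synthetic "observations": the true trajectory plus measurement noise.
rng = np.random.default_rng(0)
truth = true_x0 * a ** np.arange(n_steps + 1)
obs = truth + rng.normal(0.0, 0.05, size=truth.shape)

def forward(x0):
    """Run the model forward from a candidate initial condition."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        x[k + 1] = a * x[k]
    return x

def cost_and_gradient(x0):
    """Misfit J(x0) = 0.5 * sum((x_k - y_k)^2) and dJ/dx0 via the adjoint."""
    x = forward(x0)
    residual = x - obs
    J = 0.5 * np.sum(residual ** 2)
    # Backward (adjoint) sweep: propagate sensitivity to the initial state.
    lam = residual[-1]
    for k in range(n_steps - 1, -1, -1):
        lam = residual[k] + a * lam
    return J, lam  # lam is now dJ/dx0

# Gradient descent: force the model toward the observations.
x0 = 1.0  # first guess
for _ in range(200):
    J, grad = cost_and_gradient(x0)
    x0 -= 0.01 * grad

print(f"recovered initial condition: {x0:.3f} (truth: {true_x0})")
```

The same pattern scales up: operational systems replace the scalar recursion with full 3D physics and dedicated adjoint models, but the objective, minimizing model-minus-observation misfit over a time window, is identical.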
According to Dr. Droegemeier, this modeling approach directly relates to cancer research, as the fundamental mathematical underpinnings are translatable. He stressed the need for high-quality, comparable data, along with ways of archiving those data and ensuring privacy, security, and governance. Much of this work is already underway, including CBIIT’s efforts in developing NCI’s Cancer Research Data Commons, a cloud-based infrastructure that provides access to analytical tools and resources for managing a broad range of cancer research data. The next steps will be to develop models that test the limits of predictability and provide insight into which focus areas are most likely to produce results.
Apollo 13 and Digital Twins
Dr. Willcox next related her experience with predictive modeling in aerospace systems. Aerospace often is regarded as a gold standard for computational simulation, but, according to Dr. Willcox, the field still faces many challenges, as not every situation can be predicted at the “level of fidelity needed to drive good decisions.” Using rocket propulsion as an example, Dr. Willcox said one simply can’t simulate such an event at a scale that allows true prediction of what’s going on, and a single simulation can take months to produce a result.
Cancer research faces similar challenges. Like aerospace, cancer research involves complex physical phenomena: the models must blend cyber-physical interactions (i.e., software, hardware, sensors, and automation) and the data are highly diverse and can change over time.
But, Dr. Willcox noted, computational science is finding new ways to meet these challenges. Advances in sensor technology are yielding abundant data, and algorithms are allowing us to process those data at a scale never seen before. Coupled with today’s computing power, these advances give us the tools to move research forward. She explained that airplanes now perform computations in real time, without disrupting systems or operations, something that was never possible before. This isn’t just true for aerospace. She added, “This is a revolution that’s touching every aspect of science, engineering, and society.”
The concept of a digital twin is revolutionizing how we approach predictive modeling. A digital twin is tailored to a specific individual or unique physical asset. In aerospace, a digital twin typically is linked to a specific airplane; in the future, a digital twin could reflect a particular cancer patient. Most importantly, it’s a “living model” that evolves as data from the actual airplane or patient change.
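As a rough illustration of what “living model” means in code (a conceptual sketch, not how aerospace or clinical twins are actually implemented), the snippet below represents a twin as a state estimate plus an uncertainty, revised with a simple one-dimensional Kalman-style rule each time a new measurement arrives. All names and values are hypothetical.

```python
# A conceptual sketch of a "living model": a digital twin whose state
# estimate (and its uncertainty) is revised every time new data arrive.
# The update rule is a one-dimensional Kalman-style filter; a real twin
# for an aircraft or a patient would track many coupled quantities.
from dataclasses import dataclass

@dataclass
class DigitalTwin:
    estimate: float       # current belief about the quantity of interest
    variance: float       # uncertainty in that belief
    process_noise: float  # how much the real system drifts between updates

    def assimilate(self, measurement: float, measurement_variance: float) -> None:
        """Blend a new observation into the model, weighted by confidence."""
        prior_variance = self.variance + self.process_noise  # belief degrades over time
        gain = prior_variance / (prior_variance + measurement_variance)
        self.estimate += gain * (measurement - self.estimate)
        self.variance = (1.0 - gain) * prior_variance

# Example: a twin tracking some measured quantity (values invented).
twin = DigitalTwin(estimate=10.0, variance=4.0, process_noise=0.5)
for reading in [9.2, 8.7, 8.9, 8.1]:  # successive sensor or imaging readings
    twin.assimilate(reading, measurement_variance=1.0)
    print(f"estimate: {twin.estimate:.2f} +/- {twin.variance ** 0.5:.2f}")
```

The design choice that makes this a twin rather than a one-off model is statefulness: the object persists alongside its physical counterpart and its beliefs sharpen or degrade as data flow in.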
Dr. Willcox cited the Apollo 13 mission as an example. Simulators on the ground were crucial to bringing the crippled spacecraft back to Earth. Mission controllers fed real-time data into the simulators, adapting them to reflect what was happening in space. They could then play out (or project) the “what ifs” needed to develop a plan for a safe return.
Remarkably, this Apollo digital twin was from 1970, a time when we didn’t have particularly powerful computers or computational tools. As Dr. Willcox noted, “Our models and data don’t have to be perfect, and we don’t have to resolve everything at every scale to have models that can help in prediction.”
She expanded on this, saying there never will be enough data. The solution then is to use the data we have to create computational models, even if they are not perfect, to boost our knowledge of the uncertainties and to better inform predictability.
Lessons Learned in Radiation Oncology
Drawing on her experience in radiation oncology, Dr. Chung offered her perspective on modeling and simulations. She began by describing how imaging has been used for many years to understand the patient’s condition and design radiation treatment. Using images to guide and account for uncertainty helps physicians better target diseased tissue while sparing the healthy tissue surrounding it. However, noted Dr. Chung, we still need to “shrink the uncertainty of each measurement to make the higher spatial and temporal resolution meaningful.” This, in turn, will give clinicians greater confidence in the definition of the target for radiation.
As for the data, like the other speakers, Dr. Chung said it’s not just a question of having enough data. Instead, she believes it is critical to improve the quality and consistency of today’s data by harmonizing and aggregating data, checking and improving quality, and applying similar ontologies.
Using clear and consistent definitions to describe data can aid in communication and collaboration. As Dr. Chung noted, variations occur across cells, tumors, patients, practitioners, and institutions. Consistent ontology and more precise measurements will allow us to use this variability to our advantage.
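As a toy illustration of what applying a consistent ontology can look like in practice (a sketch with hypothetical terms and mappings, not an actual clinical vocabulary), the snippet below maps free-text labels from different sources onto one controlled term before records are pooled, flagging anything the vocabulary does not cover.

```python
# A minimal sketch of ontology-based harmonization: free-text labels from
# different institutions are mapped to one controlled vocabulary before
# the records are aggregated. The terms and mappings below are hypothetical.
LOCAL_TO_STANDARD = {
    "gbm": "Glioblastoma",
    "glioblastoma multiforme": "Glioblastoma",
    "glioblastoma, nos": "Glioblastoma",
    "brain stem glioma": "Brainstem Glioma",
}

def harmonize(label: str) -> str:
    """Return the standard term, flagging anything the vocabulary lacks."""
    return LOCAL_TO_STANDARD.get(label.strip().lower(), f"UNMAPPED:{label}")

records = ["GBM", "Glioblastoma Multiforme", "brain stem glioma", "DIPG"]
print([harmonize(r) for r in records])
# ['Glioblastoma', 'Glioblastoma', 'Brainstem Glioma', 'UNMAPPED:DIPG']
```

The unmapped flag matters as much as the mapping itself: surfacing terms the vocabulary cannot place is how quality problems get caught before they contaminate a pooled dataset.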
One key area for improvement is how outcomes are defined and associated with treatments. For example, radiation dose often is related to toxicity. However, Dr. Chung cautioned that the dose actually received by specific structures and organs over the course of treatment isn’t necessarily what was intended and could be much higher (or lower) than planned. Better data on toxicity measurements at these sites would help determine more accurate dosing.
Deciding how to quantify and measure treatment response is another critical area that needs improvement. For tumor response, instead of asking whether the tumor is shrinking, it is likely more useful to ask whether it has stopped growing or is growing at a slower rate. By defining outcomes and tumor measurements with greater granularity, coupled with artificial intelligence and machine learning, we likely can reduce the “noise” and obtain more precise, less subjective tumor response measurements.
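One simple way to quantify “growing at a slower rate” is the specific growth rate between successive scans, which assumes exponential growth over each interval. The sketch below uses invented volumes and dates purely for illustration.

```python
# A minimal worked example of a more granular response measure: instead of
# asking only "did the tumor shrink?", compute the specific growth rate
# between scans and ask whether growth is decelerating.
import math

# (day of scan, tumor volume in cm^3) -- hypothetical measurements
scans = [(0, 10.0), (30, 11.5), (60, 12.1), (90, 12.3)]

for (t0, v0), (t1, v1) in zip(scans, scans[1:]):
    # Specific growth rate assumes exponential growth between scans:
    # v1 = v0 * exp(sgr * (t1 - t0))  =>  sgr = ln(v1 / v0) / (t1 - t0)
    sgr = math.log(v1 / v0) / (t1 - t0)
    print(f"days {t0:>2}-{t1:>2}: {sgr * 100:.2f}% per day")

# The volume never shrinks, yet the growth rate falls scan over scan --
# a signal of treatment response that a shrink/no-shrink criterion misses.
```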
Harmonization and standardization are clearly important. To be most useful, however, these standards must be adopted by clinicians, and evidence that they lead to improved cancer outcomes would motivate that adoption. Using high-quality, consistently defined data to inform computational models, and updating those models as new data become available, could ultimately make treatment decisions more predictive and less reactive.
In summarizing, Dr. Chung reminded the group that, like the weather, predictions in medicine don’t have to be perfect to be useful. “Clinicians deal with uncertainty all the time, and any improvements in prediction can help advance personalized treatment.” She added, “Putting our wealth of data and computing power to work for our patients is the first step toward improving these predictions over time.”
Next Steps
In looking at the future of radiation oncology, all three speakers agreed we have sufficient amounts of data to get started, and we have computers that are powerful enough to take on the challenges that lie ahead. Some general themes emerged, including the need for the following:
- collaboration and convergence among and across disciplines, such as cancer researchers, mathematicians, computational scientists, computer scientists, data scientists, and specialists in related fields;
- better ways of harmonizing existing data, using clear definitions and data elements, with full transparency;
- recognition of the biases that exist in current data, along with models that take these uncertainties into account;
- translation and quick transition of research findings into clinical use; and
- new infrastructure to store, archive, and work with these data.
For more information
NCI and DOE have been collaborating since 2016. The mission of this strategic interagency collaboration is to simultaneously accelerate advances in predictive oncology and computing. Advancing radiation oncology is one emerging focus area.