News
NCI-Funded Researchers Develop Transformer Model to Decipher Whole Slide Images
Could technology be the answer to better cancer biopsies? See how researchers are honing deep learning models that help clinicians not only diagnose cancer more efficiently, but also predict how it might progress.
This latest model, called SEQUOIA (short for "Slide-based Expression Quantification using Linearized Attention"), borrows from a common deep learning technology: the transformer model. Transformer models are neural networks that specialize in modeling sequences, as in interpreting text or answering questions. Using a linearized version of this technology (which approximates standard attention at a fraction of the computational cost, letting the model take in many more image regions at once), the researchers created digital gene expression profiles from whole slide images.
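For readers curious about the underlying math, the efficiency gain of linearized attention comes from reordering a matrix product. A minimal sketch follows; the ReLU-based feature map, dimensions, and variable names here are illustrative assumptions for a generic linear attention layer, not SEQUOIA's actual implementation.

```python
import numpy as np

def linear_attention(Q, K, V):
    """Generic linearized attention over n items (e.g., slide tiles).

    Standard attention computes softmax(Q @ K.T) @ V, which costs O(n^2)
    in the number of items n. Applying a non-negative feature map phi and
    reordering as phi(Q) @ (phi(K).T @ V) reduces the cost to O(n * d^2).
    """
    # Illustrative feature map: shifted ReLU keeps weights non-negative.
    phi = lambda x: np.maximum(x, 0.0) + 1e-6
    Qf, Kf = phi(Q), phi(K)
    KV = Kf.T @ V                   # (d, d_v): aggregate keys/values first
    Z = Qf @ Kf.sum(axis=0)         # (n,): per-query normalizer
    return (Qf @ KV) / Z[:, None]   # (n, d_v): normalized outputs

# Toy example: 8 tile embeddings of dimension 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(8, 4))
K = rng.normal(size=(8, 4))
V = rng.normal(size=(8, 4))
out = linear_attention(Q, K, V)
print(out.shape)  # (8, 4)
```

Because keys and values are aggregated before touching the queries, compute grows linearly with the number of tiles, which is what makes attending over an entire whole slide image tractable.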
With SEQUOIA, the researchers were able to identify genes involved in key cancer processes (such as inflammation, cell cycles, and metabolism) and decipher both local and regional gene expression levels.
Dr. Olivier Gevaert, senior author on the study, said, "We focused on extracting information from whole slide images and moved away from 'tile' aggregation (which offers only a narrow snapshot). This gave us more image data, both within the tumor and in the surrounding areas. By capturing greater heterogeneity, we could identify key gene pathways relevant to cancer progression."
He added, “Our model identified clinically relevant features that you can use to predict a patient’s risk for cancer, opening new avenues for more personalized care.”
The researchers trained SEQUOIA on 7,584 tumor samples, using matched whole slide images and bulk RNA-Seq data from 16 cancer types available in The Cancer Genome Atlas.
NCI’s Division of Cancer Treatment and Diagnosis funded this work.