In precision medicine, the diagnosis and prognosis of colorectal cancer (CRC) have long been a clinical focus, and subjective evaluation of histological slides by highly trained pathologists remains the gold standard in this field. With the advance of deep learning in image analysis, convolutional neural networks (CNNs) can extract quantitative information from HE pathological images, and we found that this information correlates with colorectal cancer progression. In this study, we first used a CNN model to classify HE pathological images. The CNN model was trained on patches from 86 colorectal HE pathological images (from the NCT biobank and the UMM pathology archive) and validated on patches from 25 images (from the DACHS study in the NCT biobank). With this tool, we performed automated tissue decomposition of representative multi-tissue HE images from The Cancer Genome Atlas (TCGA) cohort. Based on the output neuron activations of the CNN, we calculated the tumor-stroma ratio (TSR). This score was an independent prognostic factor for overall survival (OS) in a multivariable Cox proportional hazards model. Finally, we validated these findings in an independent CRC dataset, where the score was again an independent prognostic factor for OS. This study shows that deep learning methods can decompose complex pathological tissue images and extract prognostic factors from HE pathological images.
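As a minimal sketch of the idea behind the score, the tumor-stroma ratio can be derived from per-patch CNN class predictions by counting patches assigned to the tumor and stroma classes. This is an illustrative example, not the study's actual implementation: the class labels, the `tumor_stroma_ratio` helper, and the example label list are all assumptions.

```python
# Hypothetical sketch: deriving a tumor-stroma ratio (TSR) from
# per-patch CNN classifications of a whole-slide image.
# Class names ("tumor", "stroma", ...) are illustrative only.
from collections import Counter

def tumor_stroma_ratio(patch_labels):
    """Fraction of tumor patches among all tumor and stroma patches."""
    counts = Counter(patch_labels)
    tumor = counts.get("tumor", 0)
    stroma = counts.get("stroma", 0)
    if tumor + stroma == 0:
        return None  # no tumor or stroma tissue detected
    return tumor / (tumor + stroma)

# Example: predicted classes for six patches of one slide
labels = ["tumor", "tumor", "stroma", "tumor", "mucosa", "stroma"]
print(tumor_stroma_ratio(labels))  # 3 tumor / (3 tumor + 2 stroma) -> 0.6
```

In practice the per-patch labels would come from the argmax over the CNN's output neuron activations for each image patch; non-tumor, non-stroma classes (e.g. mucosa, background) are simply excluded from the ratio.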