Supporting data for "Image segmentation of treated and untreated tumor spheroids by Fully Convolutional Networks" (Dataset)

Open Access

  • true

Peer Reviewed

  • false

Abstract

  • Multicellular tumor spheroids (MCTS) are advanced cell culture systems for assessing the impact of combinatorial radio(chemo)therapy, as they exhibit therapeutically relevant in-vivo-like characteristics, from 3D cell-cell and cell-matrix interactions to radial pathophysiological gradients. State-of-the-art assays quantify long-term curative endpoints based on collected brightfield image time series from large treated spheroid populations per irradiation dose and treatment arm. These analyses require laborious spheroid segmentation of up to 100,000 images per treatment arm to extract relevant structural information from the images, e.g., diameter, area, volume, and circularity. While several image analysis algorithms are available for spheroid segmentation, they all focus on compact MCTS with a clearly distinguishable outer rim throughout growth. However, they often fail for the common case of treated MCTS, which may be partly detached and destroyed and are usually obscured by dead cell debris. To address these issues, we successfully train two Fully Convolutional Networks, UNet and HRNet, and optimize their hyperparameters to develop an automatic segmentation for both untreated and treated MCTS. We extensively test the automatic segmentation on larger, independent datasets and observe high accuracy for most images, with Jaccard indices around 90%. For cases with lower accuracy, we demonstrate that the deviation is comparable to the inter-observer variability. We also test against previously published datasets and spheroid segmentations. The developed automatic segmentation can not only be used directly but also be integrated into existing spheroid analysis pipelines and tools. This facilitates the analysis of 3D spheroid assay experiments and contributes to the reproducibility and standardization of this preclinical in vitro model.
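  • The abstract mentions scoring the automatic segmentation with the Jaccard index and extracting structural readouts such as diameter, area, and circularity from the resulting masks. The following is a minimal sketch of these two computations, assuming NumPy and scikit-image; it is not the authors' released code, and the function names and the pixel-size parameter are illustrative only.

```python
# Illustrative sketch only (not the authors' pipeline): score a predicted
# spheroid mask against a reference annotation with the Jaccard index and
# read basic structural features out of a binary mask.
# Function names and pixel_size_um are hypothetical.
import numpy as np
from skimage.measure import label, regionprops

def jaccard_index(pred: np.ndarray, ref: np.ndarray) -> float:
    """Intersection over union of two binary masks (1.0 = perfect overlap)."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    union = np.logical_or(pred, ref).sum()
    return np.logical_and(pred, ref).sum() / union if union else 1.0

def spheroid_features(mask: np.ndarray, pixel_size_um: float = 1.0) -> dict:
    """Area, equivalent diameter, and circularity of the largest mask region."""
    regions = regionprops(label(mask.astype(np.uint8)))
    largest = max(regions, key=lambda r: r.area)  # keep the spheroid, ignore debris fragments
    area_um2 = largest.area * pixel_size_um ** 2
    diameter_um = 2.0 * np.sqrt(area_um2 / np.pi)  # diameter of a circle with the same area
    circularity = 4.0 * np.pi * largest.area / largest.perimeter ** 2
    return {"area_um2": area_um2, "diameter_um": diameter_um, "circularity": circularity}
```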

Publication Date

  • January 1, 2025