Rofo 2023; 195(02): 105-114
DOI: 10.1055/a-1909-7013
Review

Artificial Intelligence in Oncological Hybrid Imaging

Künstliche Intelligenz in der onkologischen Hybridbildgebung
Benedikt Feuerecker* (1, 2), Maurice M. Heimer* (1), Thomas Geyer (1), Matthias P. Fabritius (1), Sijing Gu (1), Balthasar Schachtner (1), Leonie Beyer (3), Jens Ricke (1), Sergios Gatidis (4, 5), Michael Ingrisch (1), Clemens C. Cyran (1)

1   Department of Radiology, University Hospital, LMU Munich, Munich, Germany
2   German Cancer Research Center (DKFZ), Partner site Munich, DKTK German Cancer Consortium, Munich, Germany
3   Department of Nuclear Medicine, University Hospital, LMU Munich, Munich, Germany
4   Department of Radiology, University Hospital Tübingen, Tübingen, Germany
5   Max Planck Institute for Intelligent Systems (MPI), Tübingen, Germany
 

Abstract

Background Artificial intelligence (AI) applications have become increasingly relevant across a broad spectrum of settings in medical imaging. Due to the large amount of imaging data generated in oncological hybrid imaging, AI applications are desirable for lesion detection and characterization in primary staging, therapy monitoring, and recurrence detection. Given the rapid developments in machine learning (ML) and deep learning (DL) methods, AI will have a significant impact on the imaging workflow and will eventually improve clinical decision making and outcomes.

Methods and Results The first part of this narrative review discusses current research with an introduction to artificial intelligence in oncological hybrid imaging and key concepts in data science. The second part reviews relevant examples with a focus on applications in oncology, followed by a discussion of challenges and current limitations.

Conclusion AI applications have the potential to leverage the diagnostic data stream with high efficiency and depth to facilitate automated lesion detection, characterization, and therapy monitoring to ultimately improve quality and efficiency throughout the medical imaging workflow. The goal is to generate reproducible, structured, quantitative diagnostic data for evidence-based therapy guidance in oncology. However, significant challenges remain regarding application development, benchmarking, and clinical implementation.

Key Points:

  • Hybrid imaging generates a large amount of multimodality medical imaging data with high complexity and depth.

  • Advanced tools are required to enable fast and cost-efficient processing along the whole radiology value chain.

  • AI applications promise to facilitate the assessment of oncological disease in hybrid imaging with high quality and efficiency for lesion detection, characterization, and response assessment. The goal is to generate reproducible, structured, quantitative diagnostic data for evidence-based oncological therapy guidance.

  • Selected applications in three oncological entities (lung, prostate, and neuroendocrine tumors) demonstrate how AI algorithms may impact imaging-based tasks in hybrid imaging and potentially guide clinical decision making.

Citation Format

  • Feuerecker B, Heimer M, Geyer T et al. Artificial Intelligence in Oncological Hybrid Imaging. Fortschr Röntgenstr 2023; 195: 105 – 114



Zusammenfassung

Background The relevance of artificial intelligence (AI) in medical imaging has increased considerably in recent years. Owing to the enormous data volumes and structurable tasks in the diagnostic workflow, AI has particularly promising applications in oncological hybrid imaging for lesion detection, lesion characterization, and therapy assessment. Given the rapid developments in machine learning (ML) and deep learning (DL), AI is expected to gain further importance in oncological hybrid imaging, with the potential to improve clinical therapy guidance and patient-relevant outcomes.

Methods and Results This narrative review summarizes the evidence on task-based AI applications in image analysis in oncological hybrid imaging. After an introduction to AI, selected examples are explored and discussed against the background of current challenges and with regard to their clinical relevance for therapy guidance.

Conclusion AI offers promising applications for detection, characterization, and longitudinal therapy assessment in oncological hybrid imaging. Key challenges lie in algorithm development, validation, and clinical implementation.

Key Points:

  • Oncological hybrid imaging generates large data volumes from two imaging modalities, whose structured analysis is complex.

  • New methods are required for data analysis to enable fast and cost-efficient assessment in all aspects of the diagnostic value chain.

  • AI promises to simplify the diagnostic evaluation of oncological hybrid imaging and to enable substantial improvements in quality and efficiency in the detection, characterization, and longitudinal monitoring of oncological disease. The goal is to generate reproducible, structured, quantitative diagnostic data for evidence-based oncological therapy guidance.

  • Selected application examples in three tumor entities (lung cancer, prostate cancer, neuroendocrine tumors) show how AI-based applications could make a substantial contribution to automated image analysis and enable further individualization of therapy.



Introduction

Artificial intelligence (AI) applications are believed to provide promising tools for the analysis of evolving multi-omics data in diagnostic medicine [1]. With significant methodological advances in AI, applications continue to improve and may support experts in task-specific applications [2]. However, to date, AI applications cannot replace physicians in complex tasks that require human-guided decisions and interactions. While moving through the hype cycle with respect to expectations towards AI, the initial euphoria is currently being dampened by studies with a clear focus on limitations and weaknesses [3]. Due to the level of digitalization and the natural accrual of big data, AI applications might add value to patient care by assisting physicians in simple tasks [4] [5]. With the disproportionate growth of imaging data in a single examination, measures to increase productivity and to leverage data are desired to assist physicians and technicians with pre-screened data, optimized raw images, post-processing tools, and quantitative features along the whole imaging workflow [6].
Hybrid imaging combines morphological (CT/MRI) and functional imaging using a variety of radiotracers such as 18F-FDG, 68Ga/18F-PSMA, or 18F-SiFAlin-TATE, and provides complementary information regarding, e. g., tumor characteristics and metabolism. This method generates very large datasets, of which currently only small portions, such as standardized uptake values (SUV) or tumor size, are used. Hybrid imaging might further benefit from the application of deep learning in automated raw image pre-processing to refine image quality, with promising applications in ultra-low-dose imaging, attenuation correction, and de-noising [7] [8]. Further areas of implementation include image reconstruction, image processing, and automated image analysis by machine learning approaches [9].
In a diagnostic context, the main clinical applications of radiomics include particularly time-consuming image-based tasks such as lesion detection, characterization, and monitoring [9] ([Fig. 1]), which become challenging with increasing numbers of tumor manifestations, examination time points, and heterogeneity of tumor burden [9]. The assessment of whole-tumor-burden heterogeneity in imaging studies will likely evolve into an interesting field of study, since biopsies are prone to sampling error that is hardly addressed in current practice [10]. Beyond diagnostics, accurate delineation and segmentation play a crucial role in radiotherapy treatment planning [11].

Fig. 1 Applications of AI in oncological imaging along the radiology workflow: a Detection of lesions in schematic drawings of patients with lung cancer, prostate cancer, and neuroendocrine tumor, b Characterization of solitary lesions in axial PET/CT reconstructions of lung cancer and prostate cancer, additional circles drawn to highlight the areas of interest. c Longitudinal monitoring of single lesions with regard to aforementioned characteristics allowing response assessment (axial PET-reconstruction) in lung cancer with changing characteristics over time.

Hybrid imaging including PET/CT and PET/MRI provides complementary imaging data allowing a multifaceted anatomical, functional, and molecular characterization of tumor manifestations. With different radiotracers, hybrid imaging is applied in most malignant tumors including lung cancer, prostate cancer, and neuroendocrine tumors, with significant effects on patient management compared to conventional imaging algorithms, as reflected by German and international guidelines [12] [13] [14]. The limitations of single-modality imaging are overcome by the strengths of the complementary modality, e. g., superior lesion detection in PET and superior anatomical resolution in CT or MRI, resulting in significantly increased diagnostic accuracy [15].

While the benefits of automated tumor delineation, tumor characterization, or tools for longitudinal tumor volume monitoring are intuitive, AI-enhanced analysis methods, in particular deep learning approaches, also facilitate the extraction of more subtle information from imaging data that is mostly undetectable for human readers. As of now, the majority of AI studies in hybrid imaging are retrospective, with a lack of clinical translation due to unresolved limitations in algorithm development, validation, and clinical implementation [16]. Additionally, prospective trials on applications of artificial intelligence are scarce and therefore do not allow general translation of results to larger cohorts and further indications. Multicenter, randomized controlled, prospective trials will be necessary to demonstrate the usefulness of radiomics and to scrutinize retrospective results of radiomics and AI imaging studies [17].

Prospectively, oncological applications of AI beyond imaging-based tasks will focus on a holistic integration of multi-source diagnostic data including radiomics, genomics, and metabolomics to personalize diagnostics at the molecular, cellular, and organism level [4] [18] [19]. In this narrative review, we provide an overview of AI applications in oncological hybrid imaging. The first part provides an introduction to the principles of AI for the analysis of medical imaging data, with a review of the most recent literature on applications in oncological hybrid imaging. The second part discusses AI applications in hybrid imaging of lung cancer, prostate cancer, and neuroendocrine tumors.



Technical realization, basics of data acquisition, and analysis

Radiomics

The combination of automated quantitative image analysis with supervised machine learning (ML) is often referred to as radiomics ([Fig. 2]) [20] [21]. Quantitative analysis describes image-based features with regard to tumor shape, distribution of intralesional signal intensities (often referred to as histogram statistics), and texture, i. e., spatial relationships of voxels and their respective grayscale patterns, resulting in a large, high-dimensional feature space. Since many of these image features may show strong correlations, a feature selection step may be helpful to reduce the number of intercorrelated image features.
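The two steps described above, extracting first-order histogram features from a segmented lesion and pruning strongly intercorrelated ones, can be illustrated with a minimal sketch. This is a toy example in plain NumPy with synthetic, SUV-like voxel values and hypothetical feature names, not a full radiomics pipeline:

```python
import numpy as np

def histogram_features(intensities):
    """First-order (histogram) statistics of intensities inside a tumor mask."""
    return {
        "mean": float(np.mean(intensities)),
        "std": float(np.std(intensities)),
        "skewness": float(((intensities - intensities.mean()) ** 3).mean()
                          / (intensities.std() ** 3 + 1e-12)),
        "p90": float(np.percentile(intensities, 90)),
    }

def drop_correlated(X, names, threshold=0.95):
    """Greedy feature selection: drop features strongly correlated with a kept one."""
    corr = np.abs(np.corrcoef(X, rowvar=False))
    keep = []
    for j in range(X.shape[1]):
        if all(corr[j, k] < threshold for k in keep):
            keep.append(j)
    return X[:, keep], [names[j] for j in keep]

# toy cohort: 20 "lesions"; the 90th percentile strongly tracks the mean,
# so it should be removed by the correlation filter
rng = np.random.default_rng(0)
rows, names = [], None
for _ in range(20):
    voxels = rng.normal(loc=rng.uniform(1, 5), scale=0.5, size=500)  # SUV-like values
    feats = histogram_features(voxels)
    names = list(feats)
    rows.append(list(feats.values()))
X = np.array(rows)
X_sel, kept = drop_correlated(X, names)
print(kept)
```

In a real radiomics study these first-order features would be complemented by shape and texture features, and feature definitions would follow a standardized scheme.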

Fig. 2 Applications of AI in radiomic classification and deep learning algorithms. The first row shows the radiomics approach, comprising tumor segmentation, extraction of handcrafted features, and training of an ML model. The bottom row illustrates an automated approach of a deep learning algorithm with convolutional neural networks.

This image-derived feature space can then be linked to clinical outcomes, such as diagnosis, prognosis, or treatment response, by fitting or training statistical or machine learning models to the data. Ultimately, trained models may then be used to predict clinical outcomes from imaging features. Popular models for these applications are generalized linear models, support vector machines, and random forests [9].
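As a minimal sketch of this linking step, the following toy example (plain NumPy, fully synthetic features and outcomes) fits a logistic regression, i. e., one of the generalized linear models mentioned above, by gradient descent and checks it on independent synthetic data:

```python
import numpy as np

rng = np.random.default_rng(42)

# synthetic cohort: 200 lesions, 2 standardized radiomics features; the binary
# "outcome" depends on a linear combination of the features plus mild noise
n = 200
X = rng.normal(size=(n, 2))
y = (1.5 * X[:, 0] - 1.0 * X[:, 1] + 0.3 * rng.normal(size=n) > 0).astype(float)

# logistic regression (a generalized linear model) via gradient descent on log-loss
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted probability
    w -= 0.5 * (X.T @ (p - y) / n)           # gradient step for weights
    b -= 0.5 * np.mean(p - y)                # gradient step for intercept

# evaluate on freshly generated data, standing in for an independent validation set
Xv = rng.normal(size=(n, 2))
yv = (1.5 * Xv[:, 0] - 1.0 * Xv[:, 1] + 0.3 * rng.normal(size=n) > 0).astype(float)
acc = np.mean(((1.0 / (1.0 + np.exp(-(Xv @ w + b)))) > 0.5) == yv)
print(round(acc, 2))
```

Support vector machines or random forests would replace the model-fitting block, while the overall scheme, training on labeled feature vectors and evaluating on held-out data, stays the same.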



Deep Learning Algorithms

Deep learning (DL) methods have gained considerable interest within medical imaging research. In DL, the algorithm learns a composition of features that reflects a hierarchy of structures in the data [22]. These systems leverage the compositional nature of images through an end-to-end approach integrating image-based features [22]. In contrast to traditional ML approaches, DL models based on convolutional neural networks (CNN) do not require predefined image features but are able to learn relevant features directly from imaging data [6] [9] [23]. Beyond the prediction of patient outcome, DL is particularly useful for object detection, e. g., localization of lung nodules, or image segmentation for the assessment of tumors or organs. The quality and quantity of imaging data for training and validation play a pivotal role in the clinical application of DL models. Due to the high number of free parameters that need to be determined during model training, DL models are particularly data-hungry and require large amounts of curated and, ideally, expert-annotated imaging data. Since DL models can easily be overfitted to the training data, the reliability of trained models and the quality of predictions need to be assessed and validated carefully on independent data sets that are not used during training. A high level of evidence is reached by validation on entirely independent data [16].
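Why independent validation is indispensable can be shown with a toy experiment. Here a 1-nearest-neighbour classifier stands in for an over-parameterized model (the data and setup are entirely synthetic): it fits purely random labels perfectly on the training set, yet performs only at chance level on independent data:

```python
import numpy as np

rng = np.random.default_rng(0)

# 100 "patients" with 50 imaging features each and purely RANDOM labels:
# there is nothing real to learn
X_train = rng.normal(size=(100, 50))
y_train = rng.integers(0, 2, size=100)
X_test = rng.normal(size=(100, 50))
y_test = rng.integers(0, 2, size=100)

def one_nn_predict(X_ref, y_ref, X_query):
    """1-nearest-neighbour classifier: effectively memorizes the training set."""
    d = ((X_query[:, None, :] - X_ref[None, :, :]) ** 2).sum(axis=2)
    return y_ref[np.argmin(d, axis=1)]

train_acc = np.mean(one_nn_predict(X_train, y_train, X_train) == y_train)
test_acc = np.mean(one_nn_predict(X_train, y_train, X_test) == y_test)
print(train_acc)          # 1.0: the model "fits" the random labels perfectly
print(round(test_acc, 2)) # close to 0.5: chance level on independent data
```

Apparent performance on the training data is therefore meaningless on its own; only the held-out evaluation reveals that no real signal was learned.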



Hybrid imaging and machine learning

From a technical perspective, the fusion and integration of imaging modalities such as PET and CT/MRI in hybrid imaging is straightforward when images from both modalities are sufficiently aligned, either through simultaneous acquisition or through image registration. In radiomics approaches, quantitative image features can then be calculated from each image contrast separately, extending the feature space, i. e., the number of features derived from each tumor or metastasis. Likewise, in a CNN approach, image contrasts can be combined as channels, much like the red, green, and blue channels of standard photographic images. In both approaches, the increased information content of the enlarged feature space needs to be accounted for in the statistical modelling approach. An enlarged feature space requires models that can store more information, resulting in greater susceptibility to overfitting or memorization of the training data. Therefore, rigorous model validation plays a crucial role in hybrid imaging.
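The channel idea can be sketched in a few lines. Random arrays stand in for co-registered PET and CT volumes, assuming both modalities have already been resampled to a common voxel grid:

```python
import numpy as np

# toy aligned volumes from a hypothetical PET/CT study, resampled to a
# common 64 x 64 x 64 voxel grid
pet = np.random.rand(64, 64, 64).astype(np.float32)  # e.g. SUV values
ct = np.random.rand(64, 64, 64).astype(np.float32)   # e.g. normalized HU

# per-modality intensity normalization so neither channel dominates training
pet_n = (pet - pet.mean()) / (pet.std() + 1e-8)
ct_n = (ct - ct.mean()) / (ct.std() + 1e-8)

# stack modalities as input channels, analogous to the RGB channels of a
# photograph; a CNN would consume this as one multi-channel input tensor
x = np.stack([pet_n, ct_n], axis=0)  # shape: (channels, depth, height, width)
print(x.shape)
```

Per-modality normalization is needed because PET and CT intensities live on entirely different scales; without it, one channel would dominate the gradients during training.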

To fully exploit the potential of complementary hybrid imaging information, several tumor co-segmentation methods have been proposed in hybrid imaging [15]. Potential clinical applications include the identification of dedifferentiation patterns, as observed in neuroendocrine tumors, or of malignant transformation of lymphoma, both of which are associated with a worse prognosis [24] [25]. Another highly relevant application will be lesion characterization, for instance of lymph nodes with central necrosis, which is associated with a worse prognosis in a variety of malignancies including sarcoma [26].



Applicability of AI for imaging data optimization

AI applications have recently been successfully evaluated for attenuation correction, pre- and post-processing, co-registration of data, and PET- or MRI/CT-based motion correction. A detailed review of potential applications can be found in [6].



Limitations, challenges, and perspectives

To meet the high standards in medical imaging, a variety of challenges will have to be addressed to realize and accelerate the clinical translation of relevant AI applications in medical imaging. Unambiguous nomenclature and clear definition of intended use of models are prerequisites for broad implementation to differentiate mere data mining approaches from (semi-) automated task-based applications and to fully integrate tools into the medical imaging value chain [17].

From a radiomics perspective, neither small cohorts nor unstructured big data will yield sufficiently robust algorithms, particularly in view of unresolved obstacles to standardization such as a lack of protocol harmonization and data heterogeneity [27]. While small data sets eventually lead to overfitting of algorithms, unstructured big data sets are insufficient for training purposes and generate inaccurate algorithms. Therefore, the population used for training and validation must be sufficiently powered, well balanced, and organized with regard to complexity and application-relevant features.

With significant preanalytical heterogeneity, harmonization of PET imaging remains challenging. Divergences may, for instance, originate from a broad spectrum of applied doses, distinctive physiological uptake patterns, different attenuation correction methods, and scanner-related differences in image acquisition and reconstruction algorithms [16]. In our experience, this is comparable to morphological imaging protocols in both CT and MRI, with a relevant spectrum of applied contrast agents, timing between contrast agent application and acquisition, and modality parameters. Technical factors and reconstruction algorithms have a substantial impact on the quality of the extracted radiomics features and need to be considered [28]. Systematic methodological flaws need to be identified through independent external validation providing meaningful performance metrics [17] [29]. For this purpose, checklists for the development and evaluation of artificial intelligence tools in medical imaging have been designed to improve the quality of studies [30] [31].

At this point, integrated AI applications for the structured analysis of hybrid imaging data remain scarce, particularly due to data privacy and the lack of publicly available, expert-annotated hybrid imaging data sets [6]. Structured data repositories, such as The Cancer Imaging Archive (TCIA), and large-scale, privacy-preserving initiatives, such as the Radiology Cooperative Network (RACOON), promise to increase sample sizes for the development of AI models, possibly with federated learning approaches [27].

From an ethical and medicolegal perspective, AI applications in medical imaging will require detailed explanation regarding development and codebase, with proof of validation and safety studies for approval, before broad clinical implementation. In Europe, medical devices including AI-based algorithms are not approved by a centralized agency but are regulated depending on their risk potential. While higher-risk devices (classes IIa, IIb, and III) are handled and certified by accredited notified bodies, low-risk devices are released at the sole responsibility of the manufacturer and ultimately also the user [32]. In contrast, in the US, medical devices including AI-based algorithms are cleared via three pathways: the premarket approval pathway (for risk-associated devices), the de novo premarket review (for low- and moderate-risk devices), and the 510(k) pathway [32]. With regard to the spectrum of approved AI applications, Luchini et al. reported that a total of 71 oncology-related AI applications had been approved by the FDA as of May 31, 2021, with 39 (55 %) applications in cancer radiology [33]. Of these, the majority are intended as integrative applications, potentially representing the decisive step in the diagnostic workflow of cancer patients, with only one application for de-noising of PET images in hybrid imaging [33]. Finally, from a clinical workflow perspective, AI products will require seamless integration into the diagnostic workflow with transparent and explainable results to support decision making.



Applications of AI in oncological hybrid imaging

The following chapter reviews relevant clinical applications of AI in more detail for lung cancer, prostate cancer, and neuroendocrine tumors, entities in which hybrid imaging has a significant impact on therapy guidance and clinical decision making at a tertiary medical center. Further successful fields of AI application in PET/CT and PET/MRI include lymphoma [34] [35], breast cancer [36], brain cancer [37], and cervical cancer [38], for which a multitude of studies report relevant clinical findings, e. g., a correlation between SUVmax of the primary breast tumor and significantly more frequent local recurrence during surveillance [36].

Lung cancer

According to the German S3 guideline, hybrid imaging with 18F-fluorodeoxyglucose (18F-FDG) PET/CT is an established standard in the diagnostic algorithm of both small cell lung cancer (SCLC) and non-small cell lung cancer (NSCLC) [39]. Compared to conventional CT, 18F-FDG PET/CT provides significantly improved delineation of the primary tumor and accurate assessment of metastases, detecting unexpected lesions with significant effects on therapy management in 20–25 % of cases [40] [41]. Currently, well-established single-modality AI applications exist in CT for pulmonary nodule detection, with the impact of lesion size and quantity on T- and eventually M-stage classification and therapy planning in lung cancer [42]. Pulmonary nodules have a variety of CT attenuation patterns, with well-defined margins between solid components and healthy lung parenchyma, unclear margins in subsolid and ground-glass components, and even more heterogeneous borders in cases of associated local infiltration, atelectasis, and pneumonic consolidation. Using an automated image analysis approach, intrapulmonary lesions can be assessed with regard to these different characteristics [43] [44]. The Dutch-Belgian lung cancer screening trial (NELSON), for example, was the first screening trial to apply semi-automated computer-aided volumetry (CAV) instead of handcrafted measurements, thereby achieving high negative predictive values and presumably fewer false-positive results compared to other lung cancer screening trials [45] [46]. PET-based single-modality approaches have also been studied and have shown promising results in the segmentation of both pulmonary nodules and thoracic lymph nodes to predict outcome [47].

However, a few well-documented examples of true multi-modality AI applications in lung cancer do exist. Wallis et al., for example, developed a deep learning method to detect pathological mediastinal lymph nodes from whole-body 18F-FDG PET/CT. Model performance was comparable to that of an expert reader on data from the same type of scanner, and transfer learning allowed translation to other scanners [48]. Zhao et al. proposed a fully convolutional neural network on a cohort of 84 patients with lung cancer who underwent 18F-FDG PET/CT, showing that co-segmentation networks can effectively combine the advantages of both modalities and outperform single-modality applications [15]. From a clinical perspective, this approach appears valuable in assessing the primary pulmonary malignancy to guide T-stage classification. Beyond segmentation, multi-modality applications have been shown to impact lesion characterization and prognostication. In a retrospective multi-institutional study, Mu et al. showed that a radio-genomic deep learning approach can predict EGFR status, with a weak but significant inverse correlation with PD-L1 status, for noninvasive decision support in NSCLC [49]. The algorithm yielded an area under the receiver operating characteristic curve of 0.81 with an accuracy of 78.5 % in an external test cohort of 65 patients, with higher performance when integrating anatomical and metabolic information compared to single-modality approaches [49]. Yet, given this limited discriminatory performance, physicians may not fully omit biopsy as a tool to guide treatment on a patient level, notably when deciding for or against a certain treatment. Further studies are required to improve the performance of these algorithms to safely guide treatment selection.



Prostate cancer

Hybrid imaging with prostate-specific membrane antigen (PSMA) ligands has gained broad application in prostate cancer, including biochemical recurrence, primary staging in high-risk disease (Gleason score > 7, PSA > 20 ng/mL, clinical stage T2c–3a), and response assessment, with significant impact on clinical decision making, particularly in the detection of metastatic lymph nodes and bone metastases [50] [51] [52] [53]. In this context, most AI applications focus on single-modality approaches for lesion detection. Kostyszyn et al. developed a CNN approach based on 68Ga-PSMA PET to assess intraprostatic gross tumor volume in a multi-center study of 152 patients with retrospective histopathologic correlation [54]. Results demonstrated fast and robust auto-segmentation of the intraprostatic tumor volume not only in 68Ga- but also in 18F-PSMA PET/CT compared to manual segmentation and semi-automatic thresholding, which encouragingly shows translatability between differently labelled PSMA ligands [54]. In another study, an ML algorithm was trained on 72 prostate cancer patients for lesion detection, analyzing 77 radiomic features in 68Ga-PSMA PET/low-dose CT to differentiate physiological from pathological radiotracer uptake, resulting in high sensitivity (97 %) but lower specificity (82 %) due to frequent misinterpretation of physiological PSMA uptake in glands [55]. To assess whole-body tumor burden, a semi-automatic software package (qPSMA) for 68Ga-PSMA PET/CT was introduced and validated, with high correlation between total lesion metabolic volume and PSA levels [56]. Using this tool, patients with very high tumor load showed significantly lower uptake of 68Ga-PSMA-11 in normal organs, confirming a tumor sink effect [57]. This has clinical implications, as similar effects might occur with PSMA-targeted radioligand therapy, making this tool interesting for pre-therapeutic stratification.
Without exceeding the radiation dose limits for organs at risk, these patients might benefit from increased therapeutic activity [57]. In another single-center cohort of 83 patients, Moazemi et al. investigated deep learning applications in pre-therapeutic 68Ga-PSMA PET/CT for lesion detection and 177Lu-PSMA therapy guidance in metastatic prostate cancer, showing high diagnostic accuracy [58]. Radiomic features (SUVmin, SUV correlation, CT min, CT busyness, and CT coarseness) in 68Ga-PSMA PET/CT and clinical parameters such as Alp1 and Gleason score yielded strong correlations with changes in prostate-specific antigen (PSA) to predict outcome [58] [59]. This finding also points toward integrated diagnostics, in which multi-source diagnostic data from medical imaging, pathology, liquid biopsy, and clinical findings are integrated and analyzed to achieve optimized diagnostic accuracy for evidence-based clinical decision guidance.

Only a few true hybrid deep learning applications have been evaluated in prostate cancer imaging, including prediction and response assessment. Papp et al. developed an ML approach to predict low vs. high lesion risk, biochemical recurrence, and overall patient risk using 68Ga-PSMA PET/MRI with excellent cross-validation performance based on a cohort of 52 patients selected from a prospective randomized trial in primary prostate cancer [60]. The algorithm yielded 89 % and 91 % accuracy for biochemical recurrence and overall patient risk, respectively. In this study, feature ranking demonstrated that molecular 68Ga-PSMA PET was the dominant in vivo feature source for lesion risk prediction compared to MRI, which yielded ADC but not T2w parameters as high-ranking features. The authors hypothesized that integration of PSMA and ADC features in a model scheme could deliver superior predictive value [60]. Notably, however, the latter study may be difficult to interpret due to methodological limitations such as the lack of an inter- and intrareader variability analysis and the omission of a final model validated on an independent test set.



Neuroendocrine tumors

Neuroendocrine neoplasms (NEN) are a heterogeneous group of malignancies with a variety of histological subtypes, primary location, and functional status. NEN are classified as differentiated neuroendocrine tumors (NET) with preserved somatostatin receptor (SSTR) status and poorly differentiated neuroendocrine carcinomas (NEC). Since NETs usually progress slowly and several treatment options are available, the prevalence of NETs is increasing along with the number of imaging examinations, which often show significant metastatic tumor burden, impacting the radiologic workload [61]. To address these difficulties and to improve standard of care, the European Neuroendocrine Tumor Society (ENETS) promotes structured reporting in radiology and molecular imaging studies [62] [63].

Hybrid PET/CT imaging with somatostatin receptor agonists, such as 68Ga-DOTA-TATE, 68Ga-DOTA-TOC, and most recently 18F-SiFAlin-TATE, allows longitudinal multimodal assessment of morphology and SSTR expression for therapy guidance in NETs [64]. While the tracer biodistribution of the established SSTR agonists is similar, minor but existing differences in physiological distribution profiles may pose a challenge for an automated image segmentation approach. Single-modality AI applications have shown their potential in the diagnostic workup to help distinguish pathology, aid lesion detection, and facilitate response assessment in NETs. Criteria-based reporting systems, including RECIST 1.1 and the Krenning score, allow patient stratification in a single-modality approach, with SSTR-RADS serving as an example of multimodal assessment criteria that could help structure the output of classification-based algorithms [65].

Promising results have been reported for AI applications for grading in both CT and MRI in preoperative morphological imaging studies [66] [67]. Liberini et al. and Atkinson et al. provide convincing data suggesting that statistical and histogram-based parameters of SSTR-ligand PET may have added value for the prediction of therapy response [68] [69]. Recently, SSTR-expressing tumor volume and total lesion SSTR expression were proposed as first-order molecular predictive biomarkers assessed by AI [69] [70] [71]. Skewness and kurtosis of tumor lesions on pretreatment 68Ga-DOTA-TATE PET/CT were shown to predict responsiveness to peptide receptor radionuclide therapy [71]. However, these first-order features do not necessarily reflect true radiomic features that exploit the hidden potential of imaging data. Wehrend et al. developed a DL algorithm to automatically detect tumor lesions in 68Ga-DOTA-TATE PET in a study of 125 patients [72]. Despite promising results, high physiological liver uptake and comparably low spatial resolution hamper the diagnostic accuracy of PET with SSTR analogs in the detection of liver lesions, making hybrid imaging with MRI desirable. Fehrenbach et al. developed a DL algorithm for gadoxetic acid (Gd-EOB)-enhanced MRI for the assessment of hepatic tumor burden in NEN based on an initial training cohort of 222 imaging studies. Their application shows high accuracy in the detection and quantification of liver metastases, facilitating clinical decision making in multidisciplinary cancer conferences [61]. Taken together, it is likely that the integration of complementary data streams could refine AI algorithms. Yet most studies focus on automated assessment of hepatic tumor burden in NEN, and very limited literature evaluates the performance and added value of fused hybrid imaging features for therapy guidance in NETs.



Discussion and conclusions

The application of AI in hybrid medical imaging offers potential for the automated delineation, noninvasive characterization, and longitudinal monitoring of oncological diseases. Yet many hurdles remain to be addressed before AI can be implemented in daily routine. Validation of AI tools will require rigorous methodological approaches and substantial evidence from prospective trials demonstrating the impact of AI tools on patient outcomes [16] [17]. In the near future, AI applications will more likely represent additional tools rather than standalone diagnostic algorithms [28].

Undoubtedly, hybrid imaging has significant advantages in diagnostic accuracy over its complementary standalone modalities, but new challenges with respect to data volume and structured analysis need to be overcome to fully exploit its potential in the context of precision medicine [20]. Task-based applications in lung cancer, prostate cancer, and neuroendocrine tumors indicate that technical implementation is feasible, with significant impact on the medical imaging workflow, and may in the future provide downstream clinical decision support in precision oncology [73] [74]. With innovative molecular and cellular oncological treatments, multi-modality applications may impact therapy guidance by assessing complex therapy response patterns and metastatic heterogeneity. AI applications represent a transformative technology poised to supersede single-modality algorithms for automated detection, noninvasive characterization, and longitudinal monitoring of oncological disease in hybrid imaging [12]. However, truly complementary multi-modality algorithms remain scarce, with the majority of applications based on single-modality approaches. Beyond the entities discussed above, AI applications have also been evaluated in a variety of other malignancies. In renal cell cancer, PET/MRI radiomic signatures analyzed in three separate feature sets showed that the combined functional and structural information of PET/MRI correlated more strongly with tumor microvascular density [75]. Additionally, fusion models that integrate data from CT, MRI, and PET have yielded promising results, outperforming separate analysis of the individual modalities [76].
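The fusion models mentioned above typically combine per-modality feature vectors before classification ("late fusion"). The following numpy-only sketch illustrates the principle with concatenated CT and PET features and a deliberately minimal nearest-centroid classifier; all data and names are purely illustrative and do not reproduce any published model:

```python
import numpy as np

def late_fusion(feature_sets):
    """Concatenate per-modality feature matrices (patients x features)."""
    return np.concatenate([np.asarray(f, dtype=float) for f in feature_sets], axis=1)

def nearest_centroid_fit(X, y):
    """Store the per-class mean feature vector as a minimal classifier."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def nearest_centroid_predict(centroids, X):
    """Assign each row of X to the class with the closest centroid."""
    classes = list(centroids)
    d = np.stack([np.linalg.norm(X - centroids[c], axis=1) for c in classes])
    return np.array([classes[i] for i in d.argmin(axis=0)])

# toy example: 4 patients, 2 CT features and 3 PET features each
ct = np.array([[1.0, 0.0], [0.9, 0.1], [5.0, 4.0], [5.2, 3.8]])
pet = np.array([[2.0, 2.0, 2.0], [2.1, 1.9, 2.0], [9.0, 8.0, 9.0], [8.8, 9.1, 9.0]])
fused = late_fusion([ct, pet])  # shape (4, 5)
```

In real applications the classifier would be a properly validated ML model and the features would be harmonized across scanners; the point of the sketch is only that the fused representation carries complementary information from both modalities.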

To close the translational gap of AI applications in medical hybrid imaging, these challenges need to be addressed to improve safety, quality, and ultimately public trust [9] [27]. While expectations regarding AI tools in medical imaging have become more critical and now focus on the limitations and weaknesses of the technology, we expect future research and development to yield valuable task-based tools for medical imaging in radiology and nuclear medicine.



Conflicts of Interest

Cyran: Speaker’s bureau AAA, Brainlab, Mint. The other authors declare that they have no conflict of interest.

* Benedikt Feuerecker and Maurice M. Heimer contributed equally as first author.


  • References

  • 1 Moore JH. et al. Preparing next-generation scientists for biomedical big data: artificial intelligence approaches. Per Med 2019; 16 (03) 247-257
  • 2 Rueckel J. et al. Artificial Intelligence Algorithm Detecting Lung Infection in Supine Chest Radiographs of Critically Ill Patients With a Diagnostic Accuracy Similar to Board-Certified Radiologists. Crit Care Med 2020; 48 (07) e574-e583
  • 3 Oosterhoff JHF, Doornberg JN, Doornberg C. Machine Learning, Artificial intelligence in orthopaedics: false hope or not? A narrative review along the line of Gartner’s hype cycle. EFORT Open Rev 2020; 5 (10) 593-603
  • 4 Sollini M. et al. Artificial intelligence and hybrid imaging: the best match for personalized medicine in oncology. Eur J Hybrid Imaging 2020; 4 (01) 24
  • 5 Capobianco E. High-dimensional role of AI and machine learning in cancer research. Br J Cancer 2022; 126 (04) 523-532
  • 6 Shiyam Sundar LK. et al. Potentials and caveats of AI in hybrid imaging. Methods 2021; 188: 4-19
  • 7 Zaharchuk G. Next generation research applications for hybrid PET/MR and PET/CT imaging using deep learning. Eur J Nucl Med Mol Imaging 2019; 46 (13) 2700-2707
  • 8 Aide N. et al. New PET technologies – embracing progress and pushing the limits. Eur J Nucl Med Mol Imaging 2021; 48 (09) 2711-2726
  • 9 Hosny A. et al. Artificial intelligence in radiology. Nat Rev Cancer 2018; 18 (08) 500-510
  • 10 Pe’er D. et al. Tumor heterogeneity. Cancer Cell 2021; 39 (08) 1015-1017
  • 11 Ju W. et al. Random Walk and Graph Cut for Co-Segmentation of Lung Tumor on PET-CT Images. IEEE Trans Image Process 2015; 24 (12) 5854-5867
  • 12 Pfannenberg C. et al. Practice-based evidence for the clinical benefit of PET/CT-results of the first oncologic PET/CT registry in Germany. Eur J Nucl Med Mol Imaging 2019; 46 (01) 54-64
  • 13 Mottet N. et al. EAU-EANM-ESTRO-ESUR-SIOG Guidelines on Prostate Cancer-2020 Update. Part 1: Screening, Diagnosis, and Local Treatment with Curative Intent. Eur Urol 2021; 79 (02) 243-262
  • 14 AWMF. Leitlinienprogramm Onkologie (Deutsche Krebsgesellschaft, Deutsche Krebshilfe, AWMF): S3-Leitlinie Prostatakarzinom, Langversion 6.0, 2021, AWMF Registernummer: 043/022OL. (abgerufen am: 02.08.2021) http://www.leitlinienprogramm-onkologie.de/leitlinien/prostatakarzinom/
  • 15 Zhao X. et al. Tumor co-segmentation in PET/CT using multi-modality fully convolutional neural network. Phys Med Biol 2018; 64 (01) 015011
  • 16 Bradshaw TJ. et al. Nuclear Medicine and Artificial Intelligence: Best Practices for Algorithm Development. J Nucl Med 2021; DOI: 10.2967/jnumed.121.262567.
  • 17 Pinto Dos Santos D, Dietzel M, Baessler B. A decade of radiomics research: are images really data or just patterns in the noise?. Eur Radiol 2021; 31 (01) 1-4
  • 18 Gatta R. et al. Integrating radiomics into holomics for personalised oncology: from algorithms to bedside. Eur Radiol Exp 2020; 4 (01) 11
  • 19 Holzinger A. et al. Causability and explainability of artificial intelligence in medicine. Wiley Interdiscip Rev Data Min Knowl Discov 2019; 9 (04) e1312
  • 20 Gillies RJ, Kinahan PE, Hricak H. Radiomics: Images Are More than Pictures, They Are Data. Radiology 2016; 278 (02) 563-577
  • 21 Sepehri S. et al. Comparison and Fusion of Machine Learning Algorithms for Prospective Validation of PET/CT Radiomic Features Prognostic Value in Stage II-III Non-Small Cell Lung Cancer. Diagnostics (Basel) 2021; 11 (04) DOI: 10.3390/diagnostics11040675.
  • 22 Chartrand G. et al. Deep Learning: A Primer for Radiologists. Radiographics 2017; 37 (07) 2113-2131
  • 23 Litjens G. et al. A survey on deep learning in medical image analysis. Med Image Anal 2017; 42: 60-88
  • 24 Sanli Y. et al. Neuroendocrine Tumor Diagnosis and Management: (68)Ga-DOTATATE PET/CT. Am J Roentgenol 2018; 211 (02) 267-277
  • 25 El-Galaly TC. et al. FDG-PET/CT in the management of lymphomas: current status and future directions. J Intern Med 2018; 284 (04) 358-376
  • 26 Rakheja R. et al. Necrosis on FDG PET/CT correlates with prognosis and mortality in sarcomas. Am J Roentgenol 2013; 201 (01) 170-177
  • 27 Bettinelli A. et al. A Novel Benchmarking Approach to Assess the Agreement among Radiomic Tools. Radiology 2022; 211604 DOI: 10.1148/radiol.211604.
  • 28 van Timmeren JE. et al. Radiomics in medical imaging-“how-to” guide and critical reflection. Insights Imaging 2020; 11 (01) 91
  • 29 Li K, Zhang R, Cai W. Deep learning convolutional neural network (DLCNN): unleashing the potential of (18)F-FDG PET/CT in lymphoma. Am J Nucl Med Mol Imaging 2021; 11 (04) 327-331
  • 30 Mongan J, Moy L, Kahn Jr CE. Checklist for Artificial Intelligence in Medical Imaging (CLAIM): A Guide for Authors and Reviewers. Radiol Artif Intell 2020; 2 (02) e200029
  • 31 Lambin P. et al. Radiomics: the bridge between medical imaging and personalized medicine. Nat Rev Clin Oncol 2017; 14 (12) 749-762
  • 32 Muehlematter UJ, Daniore P, Vokinger KN. Approval of artificial intelligence and machine learning-based medical devices in the USA and Europe (2015-20): a comparative analysis. Lancet Digit Health 2021; 3 (03) e195-e203
  • 33 Luchini C, Pea A, Scarpa A. Artificial intelligence in oncology: current applications and future perspectives. Br J Cancer 2022; 126 (01) 4-9
  • 34 Sibille L. et al. (18)F-FDG PET/CT Uptake Classification in Lymphoma and Lung Cancer by Using Deep Convolutional Neural Networks. Radiology 2020; 294 (02) 445-452
  • 35 Blanc-Durand P. et al. Fully automatic segmentation of diffuse large B cell lymphoma lesions on 3D FDG-PET/CT for total metabolic tumour volume prediction using a convolutional neural network. Eur J Nucl Med Mol Imaging 2021; 48 (05) 1362-1370
  • 36 Qu YH. et al. The correlation of (18)F-FDG PET/CT metabolic parameters, clinicopathological factors, and prognosis in breast cancer. Clin Transl Oncol 2021; 23 (03) 620-627
  • 37 Lohmann P. et al. PET/MRI Radiomics in Patients With Brain Metastases. Front Neurol 2020; 11: 1
  • 38 Nakajo M. et al. Machine learning based evaluation of clinical and pretreatment (18)F-FDG-PET/CT radiomic features to predict prognosis of cervical cancer patients. Abdom Radiol (NY) 2022; 47 (02) 838-847
  • 39 AWMF. Leitlinienprogramm Onkologie (Deutsche Krebsgesellschaft, Deutsche Krebshilfe, AWMF): Prävention, Diagnostik, Therapie und Nachsorge des Lungenkarzinoms, Langversion 1.0, 2018, AWMF-Registernummer: 020/007OL. 2018 (Zugriff am: 09.04.2022) http://leitlinienprogramm-onkologie.de/Lungenkarzinom.98.0.html
  • 40 Kandathil A. et al. Role of FDG PET/CT in the Eighth Edition of TNM Staging of Non-Small Cell Lung Cancer. Radiographics 2018; 38 (07) 2134-2149
  • 41 UyBico SJ. et al. Lung cancer staging essentials: the new TNM staging system and potential imaging pitfalls. Radiographics 2010; 30 (05) 1163-1181
  • 42 Thakur SK, Singh DP, Choudhary J. Lung cancer identification: a review on detection and classification. Cancer Metastasis Rev 2020; 39 (03) 989-998
  • 43 Venkadesh KV. et al. Deep Learning for Malignancy Risk Estimation of Pulmonary Nodules Detected at Low-Dose Screening CT. Radiology 2021; 300 (02) 438-447
  • 44 Binczyk F. et al. Radiomics and artificial intelligence in lung cancer screening. Transl Lung Cancer Res 2021; 10 (02) 1186-1199
  • 45 van Klaveren RJ. et al. Management of lung nodules detected by volume CT scanning. N Engl J Med 2009; 361 (23) 2221-2229
  • 46 Werner S. et al. Accuracy and Reproducibility of a Software Prototype for Semi-Automated Computer-Aided Volumetry of the solid and subsolid Components of part-solid Pulmonary Nodules. Fortschr Röntgenstr 2022; 194 (03) 296-305
  • 47 Borrelli P. et al. Freely available convolutional neural network-based quantification of PET/CT lesions is associated with survival in patients with lung cancer. EJNMMI Phys 2022; 9 (01) 6
  • 48 Wallis D. et al. An [18F]FDG-PET/CT deep learning method for fully automated detection of pathological mediastinal lymph nodes in lung cancer patients. Eur J Nucl Med Mol Imaging 2022; 49 (03) 881-888
  • 49 Mu W. et al. Non-invasive decision support for NSCLC treatment using PET/CT radiomics. Nat Commun 2020; 11 (01) 5228
  • 50 Fendler WP. et al. (68)Ga-PSMA PET/CT: Joint EANM and SNMMI procedure guideline for prostate cancer imaging: version 1.0. Eur J Nucl Med Mol Imaging 2017; 44 (06) 1014-1024
  • 51 Hofman MS. et al. Prostate-specific membrane antigen PET-CT in patients with high-risk prostate cancer before curative-intent surgery or radiotherapy (proPSMA): a prospective, randomised, multicentre study. Lancet 2020; 395 (10231) 1208-1216
  • 52 Calais J. et al. Potential Impact of (68)Ga-PSMA-11 PET/CT on the Planning of Definitive Radiation Therapy for Prostate Cancer. J Nucl Med 2018; 59 (11) 1714-1721
  • 53 Grubmuller B. et al. PSMA Ligand PET/MRI for Primary Prostate Cancer: Staging Performance and Clinical Impact. Clin Cancer Res 2018; 24 (24) 6300-6307
  • 54 Kostyszyn D. et al. Intraprostatic Tumor Segmentation on PSMA PET Images in Patients with Primary Prostate Cancer with a Convolutional Neural Network. J Nucl Med 2021; 62 (06) 823-828
  • 55 Erle A. et al. Evaluating a Machine Learning Tool for the Classification of Pathological Uptake in Whole-Body PSMA-PET-CT Scans. Tomography 2021; 7 (03) 301-312
  • 56 Gafita A. et al. qPSMA: Semiautomatic Software for Whole-Body Tumor Burden Assessment in Prostate Cancer Using (68)Ga-PSMA11 PET/CT. J Nucl Med 2019; 60 (09) 1277-1283
  • 57 Gafita A. et al. Tumor sink effect in (68)Ga-PSMA-11 PET: Myth or Reality?. J Nucl Med 2021; DOI: 10.2967/jnumed.121.261906.
  • 58 Moazemi S. et al. Decision-support for treatment with (177)Lu-PSMA: machine learning predicts response with high accuracy based on PSMA-PET/CT and clinical parameters. Ann Transl Med 2021; 9 (09) 818
  • 59 Moazemi S. et al. Estimating the Potential of Radiomics Features and Radiomics Signature from Pretherapeutic PSMA-PET-CT Scans and Clinical Data for Prediction of Overall Survival When Treated with (177)Lu-PSMA. Diagnostics (Basel) 2021; 11 (02) DOI: 10.3390/diagnostics11020186.
  • 60 Papp L. et al. Supervised machine learning enables non-invasive lesion characterization in primary prostate cancer with [(68)Ga]Ga-PSMA-11 PET/MRI. Eur J Nucl Med Mol Imaging 2021; 48 (06) 1795-1805
  • 61 Fehrenbach U. et al. Automatized Hepatic Tumor Volume Analysis of Neuroendocrine Liver Metastases by Gd-EOB MRI-A Deep-Learning Model to Support Multidisciplinary Cancer Conference Decision-Making. Cancers (Basel) 2021; 13 (11) DOI: 10.3390/cancers13112726.
  • 62 Hicks RJ. et al. ENETS standardized (synoptic) reporting for molecular imaging studies in neuroendocrine tumours. J Neuroendocrinol 2021; e13040 DOI: 10.1111/jne.13040.
  • 63 Dromain C. et al. ENETS standardized (synoptic) reporting for radiological imaging in neuroendocrine tumours. J Neuroendocrinol 2021; e13044 DOI: 10.1111/jne.13044.
  • 64 Ambrosini V. et al. Consensus on molecular imaging and theranostics in neuroendocrine neoplasms. Eur J Cancer 2021; 146: 56-73
  • 65 Werner RA. et al. SSTR-RADS Version 1.0 as a Reporting System for SSTR PET Imaging and Selection of Potential PRRT Candidates: A Proposed Standardization Framework. J Nucl Med 2018; 59 (07) 1085-1091
  • 66 Gao X, Wang X. Deep learning for World Health Organization grades of pancreatic neuroendocrine tumors on contrast-enhanced magnetic resonance images: a preliminary study. Int J Comput Assist Radiol Surg 2019; 14 (11) 1981-1991
  • 67 Luo Y. et al. Preoperative Prediction of Pancreatic Neuroendocrine Neoplasms Grading Based on Enhanced Computed Tomography Imaging: Validation of Deep Learning with a Convolutional Neural Network. Neuroendocrinology 2020; 110 (05) 338-350
  • 68 Atkinson C. et al. Radiomics-Based Texture Analysis of (68)Ga-DOTATATE Positron Emission Tomography and Computed Tomography Images as a Prognostic Biomarker in Adults With Neuroendocrine Cancers Treated With (177)Lu-DOTATATE. Front Oncol 2021; 11: 686235
  • 69 Liberini V. et al. Impact of segmentation and discretization on radiomic features in (68)Ga-DOTA-TOC PET/CT images of neuroendocrine tumor. EJNMMI Phys 2021; 8 (01) 21
  • 70 Liberini V. et al. The Challenge of Evaluating Response to Peptide Receptor Radionuclide Therapy in Gastroenteropancreatic Neuroendocrine Tumors: The Present and the Future. Diagnostics (Basel) 2020; 10 (12) DOI: 10.3390/diagnostics10121083.
  • 71 Onner H, Abdulrezzak U, Tutus A. Could the skewness and kurtosis texture parameters of lesions obtained from pretreatment Ga-68 DOTA-TATE PET/CT images predict receptor radionuclide therapy response in patients with gastroenteropancreatic neuroendocrine tumors?. Nucl Med Commun 2020; 41 (10) 1034-1039
  • 72 Wehrend J. et al. Automated liver lesion detection in (68)Ga DOTATATE PET/CT using a deep fully convolutional neural network. EJNMMI Res 2021; 11 (01) 98
  • 73 Sonni I. et al. Impact of (68)Ga-PSMA-11 PET/CT on Staging and Management of Prostate Cancer Patients in Various Clinical Settings: A Prospective Single-Center Study. J Nucl Med 2020; 61 (08) 1153-1160
  • 74 Barrio M. et al. The Impact of Somatostatin Receptor-Directed PET/CT on the Management of Patients with Neuroendocrine Tumor: A Systematic Review and Meta-Analysis. J Nucl Med 2017; 58 (05) 756-761
  • 75 Yin Q. et al. Associations between Tumor Vascularity, Vascular Endothelial Growth Factor Expression and PET/MRI Radiomic Signatures in Primary Clear-Cell-Renal-Cell-Carcinoma: Proof-of-Concept Study. Sci Rep 2017; 7: 43356
  • 76 Srimathi S, Yamuna G, Nanmaran R. An Efficient Cancer Classification Model for CT/MRI/PET Fused Images. Curr Med Imaging 2021; 17 (03) 319-330

Correspondence

Herr Prof. Dr. Clemens C Cyran
Klinik und Poliklinik für Radiologie, Klinikum der Ludwig-Maximilians-Universität München
Marchioninistraße 15
81377 München
Germany   
Phone: +49/8 94 40 07 36 20   

Publication History

Received: 13 March 2022

Accepted: 11 July 2022

Article published online:
28 September 2022

© 2022. Thieme. All rights reserved.

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany


Fig. 1 Applications of AI in oncological imaging along the radiology workflow: a Detection of lesions in schematic drawings of patients with lung cancer, prostate cancer, and neuroendocrine tumor. b Characterization of solitary lesions in axial PET/CT reconstructions of lung cancer and prostate cancer, with additional circles drawn to highlight the areas of interest. c Longitudinal monitoring of single lesions with regard to the aforementioned characteristics, allowing response assessment (axial PET reconstruction) in lung cancer with changing characteristics over time.
Fig. 2 Applications of AI in radiomic classification and deep learning algorithms. The first row shows the radiomics approach, comprising tumor segmentation, extraction of handcrafted features, and training of an ML model. The bottom row illustrates an automated approach of a deep learning algorithm with convolutional neural networks.