TY - JOUR
T1 - AI-powered prostate cancer detection: a multi-centre, multi-scanner validation study
AU - Giganti, Francesco
AU - da Silva, Nadia Moreira
AU - Yeung, Michael
AU - Davies, Lucy
AU - Frary, Amy
AU - Rodriguez, Mirjana Ferrer
AU - Sushentsev, Nikita
AU - Ashley, Nicholas
AU - Andreou, Adrian
AU - Bradley, Alison
AU - Wilson, Chris
AU - Maskell, Giles
AU - Brembilla, Giorgio
AU - Caglic, Iztok
AU - Suchánek, Jakub
AU - Budd, Jobie
AU - Arya, Zobair
AU - Aning, Jonathan
AU - Hayes, John
AU - De Bono, Mark
AU - Vasdev, Nikhil
AU - Sanmugalingam, Nimalan
AU - Burn, Paul
AU - Persad, Raj
AU - Woitek, Ramona
AU - Hindley, Richard
AU - Liyanage, Sidath
AU - Squire, Sophie
AU - Barrett, Tristan
AU - Barwick, Steffi
AU - Hinton, Mark
AU - Padhani, Anwar R.
AU - Rix, Antony
AU - Shah, Aarti
AU - Sala, Evis
N1 - © 2025 The Author(s). This is an open access article distributed under the Creative Commons Attribution License (CC BY), https://creativecommons.org/licenses/by/4.0/
PY - 2025/2/28
Y1 - 2025/2/28
N2 - Objectives: Multi-centre, multi-vendor validation of artificial intelligence (AI) software to detect clinically significant prostate cancer (PCa) using multiparametric magnetic resonance imaging (MRI) is lacking. We compared a new AI solution, validated on a separate dataset from different UK hospitals, to the original multidisciplinary team (MDT)-supported radiologist’s interpretations. Materials and methods: A Conformité Européenne (CE)-marked deep-learning (DL) computer-aided detection (CAD) medical device (Pi) was trained to detect Gleason Grade Group (GG) ≥ 2 cancer using retrospective data from the PROSTATEx dataset and five UK hospitals (793 patients). Our separate validation dataset was on six machines from two manufacturers across six sites (252 patients). Data included in the study were from MRI scans performed between August 2018 and October 2022. Patients with a negative MRI who did not undergo biopsy were assumed to be negative (90.4% had prostate-specific antigen density < 0.15 ng/mL²). ROC analysis was used to compare radiologists who used a 5-category suspicion score. Results: GG ≥ 2 prevalence in the validation set was 31%. Evaluated per patient, Pi was non-inferior to radiologists (considering a 10% performance difference as acceptable), with an area under the curve (AUC) of 0.91 vs. 0.95. At the predetermined risk threshold of 3.5, the AI software’s sensitivity was 95% and specificity 67%, while radiologists at Prostate Imaging-Reporting and Data System/Likert ≥ 3 identified GG ≥ 2 with a sensitivity of 99% and specificity of 73%. AI performed well per site (AUC ≥ 0.83) at the patient level, independent of scanner age and field strength. Conclusion: Real-world data testing suggests that Pi matches the performance of MDT-supported radiologists in GG ≥ 2 PCa detection and generalises to multiple sites, scanner vendors, and models. Key Points: Question The performance of artificial intelligence-based medical tools for prostate MRI has yet to be evaluated on multi-centre, multi-vendor data to assess generalisability. Findings A dedicated AI medical tool matches the performance of multidisciplinary team-supported radiologists in prostate cancer detection and generalises to multiple sites and scanners. Clinical relevance This software has the potential to support the MRI process for biopsy decision-making and target identification, but future prospective studies, where lesions identified by artificial intelligence are biopsied separately, are needed.
AB - Objectives: Multi-centre, multi-vendor validation of artificial intelligence (AI) software to detect clinically significant prostate cancer (PCa) using multiparametric magnetic resonance imaging (MRI) is lacking. We compared a new AI solution, validated on a separate dataset from different UK hospitals, to the original multidisciplinary team (MDT)-supported radiologist’s interpretations. Materials and methods: A Conformité Européenne (CE)-marked deep-learning (DL) computer-aided detection (CAD) medical device (Pi) was trained to detect Gleason Grade Group (GG) ≥ 2 cancer using retrospective data from the PROSTATEx dataset and five UK hospitals (793 patients). Our separate validation dataset was on six machines from two manufacturers across six sites (252 patients). Data included in the study were from MRI scans performed between August 2018 and October 2022. Patients with a negative MRI who did not undergo biopsy were assumed to be negative (90.4% had prostate-specific antigen density < 0.15 ng/mL²). ROC analysis was used to compare radiologists who used a 5-category suspicion score. Results: GG ≥ 2 prevalence in the validation set was 31%. Evaluated per patient, Pi was non-inferior to radiologists (considering a 10% performance difference as acceptable), with an area under the curve (AUC) of 0.91 vs. 0.95. At the predetermined risk threshold of 3.5, the AI software’s sensitivity was 95% and specificity 67%, while radiologists at Prostate Imaging-Reporting and Data System/Likert ≥ 3 identified GG ≥ 2 with a sensitivity of 99% and specificity of 73%. AI performed well per site (AUC ≥ 0.83) at the patient level, independent of scanner age and field strength. Conclusion: Real-world data testing suggests that Pi matches the performance of MDT-supported radiologists in GG ≥ 2 PCa detection and generalises to multiple sites, scanner vendors, and models. Key Points: Question The performance of artificial intelligence-based medical tools for prostate MRI has yet to be evaluated on multi-centre, multi-vendor data to assess generalisability. Findings A dedicated AI medical tool matches the performance of multidisciplinary team-supported radiologists in prostate cancer detection and generalises to multiple sites and scanners. Clinical relevance This software has the potential to support the MRI process for biopsy decision-making and target identification, but future prospective studies, where lesions identified by artificial intelligence are biopsied separately, are needed.
KW - Artificial intelligence
KW - Magnetic resonance imaging
KW - Prostatic neoplasms
UR - https://www.scopus.com/pages/publications/86000058242
U2 - 10.1007/s00330-024-11323-0
DO - 10.1007/s00330-024-11323-0
M3 - Article
SN - 0938-7994
SP - 1
EP - 10
JO - European Radiology
JF - European Radiology
M1 - e232635
ER -