Publication:
Fully Convolutional Neural Network to Assess Skeleton Tumor Burden in Prostate Cancer Using 68Ga-PSMA-11 PET/CT: Preliminary Results

cris.virtual.author-orcid0000-0002-1954-736X
cris.virtualsource.author-orcidbd991341-4294-4586-b80f-40b024a93af3
cris.virtualsource.author-orcid04b125a2-21e5-4707-826d-29316769e724
datacite.rightsopen.access
dc.contributor.authorTetteh, G.
dc.contributor.authorGafita, A.
dc.contributor.authorXu, L.
dc.contributor.authorZhao, Y.
dc.contributor.authorDong, C.
dc.contributor.authorRominger, Axel Oliver
dc.contributor.authorShi, Kuangyu
dc.contributor.authorZimmer, C.
dc.contributor.authorMenze, B.H.
dc.contributor.authorEiber, M.
dc.date.accessioned2024-10-08T15:20:31Z
dc.date.available2024-10-08T15:20:31Z
dc.date.issued2018
dc.description.abstractAim: Treatment of bone metastases plays an important role for patients with metastatic prostate cancer (mPC). PSMA-based PET imaging is increasingly used to delineate bone tumor burden before therapy. The Bone Scan Index (BSI) and Bone PET Index (BPI) are promising for assessing treatment outcome in patients with mPC. However, semiautomatic segmentation with conventional methods for skeleton tumor burden assessment can be time-consuming. The emergence of deep learning methods has provided great potential to extend the limits of conventional methods. We aimed to assess the feasibility of a deep learning network for bone lesion delineation and tumor burden quantification. Methods and Materials: A fully convolutional neural network (FCNN) concept was proposed to automatically detect bone lesions and characterize osseous tumor burden from 68Ga-PSMA-11 PET/CT imaging. A pipeline of two FCNNs was employed in cascaded form. The first part of the cascaded network generates a bone mask from CT images as anatomical regions of interest (ROI), while the second part detects and segments bone lesions on PET imaging, restricted to the anatomical regions within the generated bone mask. For a proof-of-concept test, 50 68Ga-PSMA-11 PET/CT scans from patients with mPC were included. SUVs of the PET images were calculated, and the bone lesions were semi-automatically annotated using in-house developed software. Forty 68Ga-PSMA-11 PET/CT scans were used as the training dataset for the FCNN and the remaining 10 scans were used as the test dataset for performance assessment. The performance of the developed method was evaluated by considering the overall segmentation result in the form of slice-wise lesion detection accuracy and Dice score, including the Recall and Precision scores. Results: The developed deep learning method achieved a slice-wise detection accuracy of 91% with a positive predictive value (PPV) of 78%. The average segmentation Dice score was 76%, with Recall and Precision scores of 86% and 66%, respectively. Conclusion: Our results highlight that even with a small training dataset, deep learning can successfully detect bone lesions in a 68Ga-PSMA-11 PET/CT setting. Higher accuracy for lesion segmentation should be obtainable by enlarging the training dataset and providing physiological lesion contouring to guide the training process. Accurate bone lesion detection and segmentation could be further implemented in the treatment setting.
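The cascaded design described in the abstract — one network producing a bone mask from CT, a second segmenting lesions on PET only inside that mask — can be sketched as follows. This is an illustrative sketch of the masking logic only; `bone_net` and `lesion_net` are hypothetical stand-ins (simple thresholds here), not the published FCNNs.

```python
import numpy as np

def cascade(ct, pet, bone_net, lesion_net):
    """Two-stage cascade: stage 1 derives an anatomical ROI (bone mask)
    from CT; stage 2 scores lesions on PET, restricted to that ROI."""
    bone_mask = bone_net(ct) > 0.5             # anatomical regions of interest
    lesion_prob = lesion_net(pet)              # PET-based lesion score
    return (lesion_prob > 0.5) & bone_mask     # keep lesions only inside bone

# stand-in "networks" for demonstration (assumptions, not the paper's models):
bone_net = lambda ct: (ct > 300).astype(float)      # crude CT threshold as bone mask
lesion_net = lambda pet: (pet > 3.0).astype(float)  # crude SUV threshold as lesion score

ct = np.array([[400.0, 100.0],
               [500.0,  50.0]])
pet = np.array([[5.0, 5.0],
                [1.0, 4.0]])
out = cascade(ct, pet, bone_net, lesion_net)
```

Note how the high-uptake voxel at (0, 1) is suppressed because it falls outside the bone mask: restricting stage 2 to the anatomical ROI is what filters out extra-osseous PSMA uptake.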
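The reported overlap metrics (Dice, Recall, Precision) are standard voxel-overlap statistics between a predicted and a reference binary mask. A minimal sketch, not the authors' evaluation code:

```python
import numpy as np

def dice_recall_precision(pred, truth):
    """Voxel-overlap metrics between two binary masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    tp = np.logical_and(pred, truth).sum()    # true positive voxels
    fp = np.logical_and(pred, ~truth).sum()   # false positive voxels
    fn = np.logical_and(~pred, truth).sum()   # false negative voxels
    dice = 2 * tp / (2 * tp + fp + fn) if (tp + fp + fn) else 1.0
    recall = tp / (tp + fn) if (tp + fn) else 1.0
    precision = tp / (tp + fp) if (tp + fp) else 1.0
    return dice, recall, precision

# toy 2-D "slice": 4 reference lesion voxels, 6 predicted, 4 overlapping
truth = np.zeros((4, 4), dtype=int)
truth[1:3, 1:3] = 1
pred = np.zeros((4, 4), dtype=int)
pred[1:3, 1:4] = 1
d, r, p = dice_recall_precision(pred, truth)
# → dice 0.8, recall 1.0, precision ≈ 0.667
```

The abstract's pattern (Recall 86% > Precision 66%) is consistent with an over-segmenting model: few lesion voxels are missed, but extra voxels are predicted, which Dice penalizes symmetrically.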
dc.description.numberOfPages1
dc.description.sponsorshipUniversitätsklinik für Nuklearmedizin
dc.identifier.doi10.48350/126189
dc.identifier.urihttps://boris-portal.unibe.ch/handle/20.500.12422/64023
dc.language.isoen
dc.publisherSpringer-Verlag
dc.relation.conferenceAnnual Congress of the European Association of Nuclear Medicine
dc.relation.ispartofEuropean journal of nuclear medicine and molecular imaging
dc.relation.issn1619-7070
dc.relation.organizationDCD5A442BAD5E17DE0405C82790C4DE2
dc.subject.ddc600 - Technology::610 - Medicine & health
dc.titleFully Convolutional Neural Network to Assess Skeleton Tumor Burden in Prostate Cancer Using 68Ga-PSMA-11 PET/CT: Preliminary Results
dc.typeconference_item
dspace.entity.typePublication
dspace.file.typetext
oaire.citation.conferenceDate13. - 17.10.2018
oaire.citation.conferencePlaceDüsseldorf
oaire.citation.endPageS41
oaire.citation.issueS1
oaire.citation.startPageS41
oaire.citation.volume45
oairecerif.author.affiliationUniversitätsklinik für Nuklearmedizin
oairecerif.author.affiliationUniversitätsklinik für Nuklearmedizin
oairecerif.identifier.urlhttps://doi.org/10.1007/s00259-018-4148-3
unibe.contributor.rolecreator
unibe.contributor.rolecreator
unibe.contributor.rolecreator
unibe.contributor.rolecreator
unibe.contributor.rolecreator
unibe.contributor.rolecreator
unibe.contributor.rolecreator
unibe.contributor.rolecreator
unibe.contributor.rolecreator
unibe.contributor.rolecreator
unibe.date.licenseChanged2022-09-13 11:07:39
unibe.description.ispublishedpub
unibe.eprints.legacyId126189
unibe.refereedtrue
unibe.subtype.conferenceabstract

Files

Original bundle
Name:
Tetteh_Fully_Convolutional_Neural_Network_to_Assess.pdf
Size:
147.86 KB
Format:
Adobe Portable Document Format
File Type:
text
License:
publisher
Content:
published