One-shot domain adaptation in multiple sclerosis lesion segmentation using convolutional neural networks
dc.contributor.author
dc.date.accessioned
2019-07-24T10:48:00Z
dc.date.available
2019-07-24T10:48:00Z
dc.date.issued
2019-01-01
dc.identifier.issn
2213-1582
dc.identifier.uri
dc.description.abstract
In recent years, several convolutional neural network (CNN) methods have been proposed for the automated white matter lesion segmentation of multiple sclerosis (MS) patient images, owing to their superior performance compared with that of other state-of-the-art methods. However, the accuracy of CNN methods tends to decrease significantly when they are evaluated on image domains different from those used for training, which demonstrates the lack of adaptability of CNNs to unseen imaging data. In this study, we analyzed the effect of intensity domain adaptation on our recently proposed CNN-based MS lesion segmentation method. Given a source model trained on two public MS datasets, we investigated the transferability of the CNN model when applied to other MRI scanners and protocols, evaluating the minimum number of annotated images needed from the new domain and the minimum number of layers that must be re-trained to obtain comparable accuracy. Our analysis comprised MS patient data from both a clinical center and the public ISBI2015 challenge database, which allowed us to compare the domain adaptation capability of our model with that of other state-of-the-art methods. In both datasets, our results showed the effectiveness of the proposed model in adapting previously acquired knowledge to new image domains, even when only a reduced number of training samples was available in the target dataset. For the ISBI2015 challenge, our one-shot domain adaptation model, trained using only a single case, showed performance similar to that of other CNN methods that were fully trained on the entire available training set, and comparable to that of a human expert rater. We believe that our experiments will encourage the MS community to adopt this approach in different clinical settings with reduced amounts of annotated data. This approach could be meaningful not only in terms of the accuracy in delineating MS lesions but also in the related reductions in time and economic costs derived from manual lesion labeling.
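The fine-tuning strategy the abstract describes, keeping the early layers of a pretrained network frozen and re-training only the final layer(s) on a single annotated target-domain case, can be illustrated with a minimal NumPy sketch. This is not the authors' code: the two-layer toy network, the weights `W1`/`w2`, and the single synthetic sample are all illustrative assumptions standing in for a full segmentation CNN and a real annotated image.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a pretrained feature extractor: a frozen layer whose
# weights were "learned" on the source domain and are never updated here.
W1 = rng.normal(size=(8, 4))

def features(x):
    return np.maximum(W1 @ x, 0.0)    # ReLU features from the frozen layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Trainable head: only these parameters are re-trained on the target domain.
w2 = rng.normal(size=8)
b2 = 0.0

# "One-shot" adaptation data: a single annotated target-domain sample (x, y).
x, y = rng.normal(size=4), 1.0

lr = 0.5
losses = []
for _ in range(100):
    h = features(x)
    p = sigmoid(w2 @ h + b2)
    losses.append(-(y * np.log(p) + (1 - y) * np.log(1 - p)))  # BCE loss
    grad = p - y                      # dLoss/dLogit for binary cross-entropy
    w2 -= lr * grad * h               # update the head only; W1 stays frozen
    b2 -= lr * grad

print(losses[0], losses[-1])          # loss on the single sample decreases
```

In a real CNN framework the same idea is expressed by marking the early layers as non-trainable (e.g., disabling their gradients) and optimizing only the last layers on the small target set.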
dc.description.sponsorship
Mariano Cabezas holds a Juan de la Cierva - Incorporación grant from the Spanish Government with reference number IJCI-2016-29240. This work has been partially supported by La Fundació la Marató de TV3, Spain; and by Retos de Investigación TIN2014-55710-R, TIN2015-73563-JIN and DPI2017-86696-R from the Ministerio de Ciencia y Tecnología, Spain. The authors gratefully acknowledge the support of the NVIDIA Corporation with the donation of the TITAN-X PASCAL GPU used in this research.
dc.format.mimetype
application/pdf
dc.language.iso
eng
dc.publisher
Elsevier
dc.relation
info:eu-repo/grantAgreement/MINECO//TIN2014-55710-R/ES/HERRAMIENTAS DE NEUROIMAGEN PARA MEJORAR EL DIAGNOSIS Y EL SEGUIMIENTO CLINICO DE LOS PACIENTES CON ESCLEROSIS MULTIPLE/
info:eu-repo/grantAgreement/MINECO//TIN2015-73563-JIN/ES/SEGMENTACION AUTOMATICA DE LAS ESTRUCTURAS CEREBRALES PARA SU USO COMO BIOMARCADORES DE IMAGEN/
info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2013-2016/DPI2017-86696-R/ES/MODELOS PREDICTIVOS PARA LA ESCLEROSIS MULTIPLE USANDO BIOMARCADORES DE RESONANCIA MAGNETICA DEL CEREBRO/
dc.relation.isformatof
Digital reproduction of the document published at: https://doi.org/10.1016/j.nicl.2018.101638
dc.relation.ispartof
NeuroImage: Clinical, 2019, vol. 21, p. 101638
dc.relation.ispartofseries
Published articles (D-ATC)
dc.rights
Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.uri
dc.subject
dc.title
One-shot domain adaptation in multiple sclerosis lesion segmentation using convolutional neural networks
dc.type
info:eu-repo/semantics/article
dc.rights.accessRights
info:eu-repo/semantics/openAccess
dc.type.version
info:eu-repo/semantics/publishedVersion
dc.identifier.doi
dc.identifier.idgrec
029961
dc.contributor.funder
dc.type.peerreviewed
peer-reviewed
dc.relation.FundingProgramme
dc.relation.ProjectAcronym