Authors according to the article: Payette, K; Li, HB; de Dumast, P; Licandro, R; Ji, H; Siddiquee, MMR; Xu, DG; Myronenko, A; Liu, H; Pei, YC; Wang, LS; Peng, Y; Xie, JY; Zhang, HQ; Dong, GM; Fu, H; Wang, GT; Rieu, Z; Kim, D; Kim, HG; Karimi, D; Gholipour, A; Torres, HR; Oliveira, B; Vilaca, JL; Lin, Y; Avisdris, N; Ben-Zvi, W; Ben Bashat, D; Fidon, L; Aertsen, M; Vercauteren, T; Sobotka, D; Langs, G; Alenya, M; Villanueva, MI; Camara, O; Fadida, BS; Joskowicz, L; Weibin, L; Yi, L; Xuesong, L; Mazher, M; Qayyum, A; Puig, D; Kebiri, H; Zhang, ZL; Xu, XY; Wu, D; Liao, KL; Wu, YX; Chen, JT; Xu, YZ; Zhao, L; Vasung, L; Menze, B; Cuadra, MB; Jakab, A
Department: Enginyeria Informàtica i Matemàtiques
URV author(s): Mazher, Moona / Puig Valls, Domènec Savi
Keywords: Super-resolution reconstructions; Multi-class image segmentation; MRI; Fetal brain MRI; Congenital disorders; Myelomeningocele; Fetuses; Atlas
Abstract: In-utero fetal MRI is emerging as an important tool in the diagnosis and analysis of the developing human brain. Automatic segmentation of the developing fetal brain is a vital step in the quantitative analysis of prenatal neurodevelopment in both the research and clinical context. However, manual segmentation of cerebral structures is time-consuming and prone to error and inter-observer variability. Therefore, we organized the Fetal Tissue Annotation (FeTA) Challenge in 2021 in order to encourage the development of automatic segmentation algorithms on an international level. The challenge utilized the FeTA Dataset, an open dataset of fetal brain MRI reconstructions segmented into seven different tissues (external cerebrospinal fluid, gray matter, white matter, ventricles, cerebellum, brainstem, deep gray matter). Twenty international teams participated in this challenge, submitting a total of 21 algorithms for evaluation. In this paper, we provide a detailed analysis of the results from both a technical and clinical perspective. All participants relied on deep learning methods, mainly U-Nets, with some variability present in the network architecture, optimization, and image pre- and post-processing. The majority of teams used existing medical imaging deep learning frameworks. The main differences between the submissions were the fine-tuning done during training and the specific pre- and post-processing steps performed. The challenge results showed that almost all submissions performed similarly. Four of the top five teams used ensemble learning methods. However, one team's algorithm performed significantly better than the other submissions and consisted of an asymmetrical U-Net network architecture. This paper provides a first-of-its-kind benchmark for future automatic multi-tissue segmentation algorithms for the developing human brain in utero.
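As an illustration of the U-Net family of architectures that, per the abstract, dominated the submissions, below is a minimal PyTorch sketch of a small 2D U-Net with one output channel per FeTA tissue class plus background. This is not the challenge code or any team's implementation: all layer names, channel widths, and the choice of 2D (rather than 3D) convolutions are illustrative assumptions, and actual submissions mostly built on existing medical imaging frameworks.

```python
# A minimal sketch (illustrative only, not the challenge code) of a U-Net-style
# network for multi-class fetal brain tissue segmentation. Channel widths and
# the use of 2D convolutions are assumptions for brevity.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with batch norm and ReLU, as in standard U-Nets.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    # 8 output channels: background plus the 7 FeTA tissue classes.
    def __init__(self, in_ch=1, n_classes=8):
        super().__init__()
        self.enc1 = conv_block(in_ch, 32)
        self.enc2 = conv_block(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(64, 128)
        self.up2 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec2 = conv_block(128, 64)
        self.up1 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = conv_block(64, 32)
        self.head = nn.Conv2d(32, n_classes, 1)

    def forward(self, x):
        # Encoder path with skip connections concatenated into the decoder.
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)  # per-pixel class logits

if __name__ == "__main__":
    # One single-channel 2D MRI slice; output has one logit map per class.
    logits = TinyUNet()(torch.randn(1, 1, 128, 128))
    print(logits.shape)  # torch.Size([1, 8, 128, 128])
```

The ensemble methods used by four of the top five teams would, under this sketch, amount to averaging the softmax outputs of several independently trained networks of this kind before taking the per-pixel argmax.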
Subject areas: Radiology, nuclear medicine and imaging; Radiology, nuclear medicine & medical imaging; Radiological and ultrasound technology; Materials; Health informatics; Engineering, biomedical; Engineering IV; Computer vision and pattern recognition; Computer science, interdisciplinary applications; Computer science, artificial intelligence; Computer graphics and computer-aided design; Computer science
Usage license: https://creativecommons.org/licenses/by/3.0/es/
Author email addresses: moona.mazher@estudiants.urv.cat; domenec.puig@urv.cat
Author identifiers: 0000-0003-4444-5776; 0000-0002-0562-4205
Record creation date: 2024-08-03
Version of the deposited article: info:eu-repo/semantics/publishedVersion
License document URL: https://repositori.urv.cat/ca/proteccio-de-dades/
Reference to the article according to the original source: Medical Image Analysis, 88
Item reference in APA style: Payette, K., Li, H. B., de Dumast, P., Licandro, R., Ji, H., Siddiquee, M. M. R., Xu, D. G., Myronenko, A., Liu, H., Pei, Y. C., Wang, L. S., Peng, Y., Xie, J. Y., Zhang, H. Q., Dong, G. M., ... Jakab, A. (2023). Fetal brain tissue annotation and segmentation challenge results. Medical Image Analysis, 88, 102833. https://doi.org/10.1016/j.media.2023.102833
Institution: Universitat Rovira i Virgili
Journal publication year: 2023
Publication type: Journal Publications