Authors as listed in the article: Schwarz Schuler, Joao Paulo; Romani Also, Santiago; Puig, Domenec; Rashwan, Hatem; Abdel-Nasser, Mohamed
Department: Enginyeria Informàtica i Matemàtiques
URV author(s): Abdellatif Fatahallah Ibrahim Mahmoud, Hatem / Abdelnasser Mohamed Mahmoud, Mohamed / Puig Valls, Domènec Savi / Romaní Also, Santiago / Schwarz Schuler, Joao Paulo
Keywords: Pointwise convolution; Parameter reduction; Parallel branches; Neural network; Network optimization; Image classification; Grouped convolution; EfficientNet; Deep learning; DCNN; Data analysis; Convolutional neural network; Computer vision; Channel interleaving
Abstract: In image classification with Deep Convolutional Neural Networks (DCNNs), the number of parameters in pointwise convolutions grows rapidly due to the multiplication of the number of filters by the number of input channels coming from the previous layer. Existing studies demonstrated that a subnetwork can replace pointwise convolutional layers with significantly fewer parameters and fewer floating-point computations, while maintaining the learning capacity. In this paper, we propose an improved scheme for reducing the complexity of pointwise convolutions in DCNNs for image classification based on interleaved grouped filters without divisibility constraints. The proposed scheme utilizes grouped pointwise convolutions, in which each group processes a fraction of the input channels. It requires the number of channels per group as a hyperparameter, Ch. The subnetwork of the proposed scheme contains two consecutive convolutional layers, K and L, connected by an interleaving layer in the middle and summed at the end. The number of groups of filters and filters per group for layers K and L is determined by exact divisions of the original number of input channels and filters by Ch. If the divisions were not exact, the original layer could not be substituted. In this paper, we refine the previous algorithm so that input channels are replicated and groups can have different numbers of filters to cope with non-exact divisibility situations. Thus, the proposed scheme further reduces the number of floating-point computations (by 11%) and trainable parameters (by 10%) with respect to the previous method. We tested our optimization on EfficientNet-B0 as a baseline architecture and performed classification tests on the CIFAR-10, Colorectal Cancer Histology, and Malaria datasets. For each dataset, our optimization achieves a saving of 76%, 89%, and 91% of the number of trainable parameters of EfficientNet-B0, respectively, while keeping its test classification accuracy.
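The grouping idea in the abstract can be illustrated with a rough parameter count. The sketch below is a simplified, single-layer illustration only (the full scheme uses two grouped layers K and L plus an interleaving and a summation step, which are omitted here); the function names, the rounding-up of the group count to model channel replication, and the example layer sizes are assumptions for illustration, not the authors' exact algorithm.

```python
import math

def pointwise_params(in_channels: int, filters: int) -> int:
    """Parameters of a standard pointwise (1x1) convolution, bias omitted."""
    return in_channels * filters

def grouped_pointwise_params(in_channels: int, filters: int, ch: int) -> int:
    """Rough parameter count for ONE grouped pointwise layer in which each
    group sees `ch` input channels. When `ch` does not divide `in_channels`,
    the group count is rounded up, modelling the paper's replication of
    input channels instead of requiring exact divisibility."""
    groups = math.ceil(in_channels / ch)
    # Spread the filters across groups; groups may differ by one filter
    # when `groups` does not divide `filters` exactly.
    base, extra = divmod(filters, groups)
    return sum((base + (1 if g < extra else 0)) * ch for g in range(groups))

# Hypothetical layer sizes for illustration (not taken from the paper):
standard = pointwise_params(1152, 320)            # 1152 * 320 = 368640
grouped = grouped_pointwise_params(1152, 320, 16)  # each filter sees 16 inputs
print(standard, grouped)  # 368640 5120
```

Because each filter connects to only `ch` inputs instead of all `in_channels`, one grouped layer costs roughly `filters * ch` parameters; the learning capacity lost by this restriction is what the interleaving layer and the second grouped layer are there to recover.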
Subject areas: Public health; Physics, multidisciplinary; Physics and astronomy (miscellaneous); Physics and astronomy (all); Medicine II; Medicine I; Mathematical physics; Mathematics / probability and statistics; Interdisciplinary; Information systems; Geosciences; General physics and astronomy; Philosophy; Engineering IV; Engineering III; Electrical and electronic engineering; Physical education; Biological sciences I; Computer science; Astronomy / physics
Usage license: https://creativecommons.org/licenses/by/3.0/es/
Author email addresses: mohamed.abdelnasser@urv.cat hatem.abdellatif@urv.cat joaopaulo.schwarz@estudiants.urv.cat santiago.romani@urv.cat domenec.puig@urv.cat
Author identifiers (ORCID): 0000-0002-1074-2441 0000-0001-5421-1637 0000-0002-7582-0711 0000-0001-6673-9615 0000-0002-0562-4205
Record registration date: 2024-09-21
Deposited article version: info:eu-repo/semantics/publishedVersion
License document URL: https://repositori.urv.cat/ca/proteccio-de-dades/
Article reference according to the original source: Entropy, 24(9), 1264
Item reference in APA style: Schwarz Schuler, Joao Paulo; Romani Also, Santiago; Puig, Domenec; Rashwan, Hatem; Abdel-Nasser, Mohamed (2022). An Enhanced Scheme for Reducing the Complexity of Pointwise Convolutions in CNNs for Image Classification Based on Interleaved Grouped Filters without Divisibility Constraints. Entropy, 24(9), 1264. DOI: 10.3390/e24091264
Institution: Universitat Rovira i Virgili
Journal publication year: 2022
Publication type: Journal Publications