Scopus Indexed Publications

Paper Details


Title
Self-gated rectified linear unit for performance improvement of deep neural networks
Author
Israt Jahan
Abstract

This paper proposes an activation function, the self-gated rectified linear unit (SGReLU), to achieve high classification accuracy, low loss, and low computational time. The proposed SGReLU also resolves the vanishing gradient problem, the dying-ReLU problem, and noise vulnerability. SGReLU’s performance is evaluated on the MNIST, Fashion-MNIST, and ImageNet datasets and compared with seven highly effective activation functions. The proposed SGReLU outperformed the other activation functions in most cases on VGG16, Inception v3, and ResNet50. On VGG16 and Inception v3, it achieved accuracies of 90.87% and 95.01%, respectively, exceeding the other functions while delivering the second-fastest computing time in these networks.
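The abstract does not give SGReLU’s formula, so below is only a minimal Python sketch of the general self-gating idea it describes: the input modulates its own pass-through with a sigmoid gate, so negative inputs are attenuated rather than zeroed (addressing the dying-ReLU problem) while large positive inputs pass through as in ReLU. The function name, the gating form, and the beta parameter are illustrative assumptions, not the paper’s definition.

```python
import numpy as np

def sgrelu(x, beta=1.0):
    """Hypothetical self-gated ReLU sketch (not the paper's exact formula).

    Positive inputs pass through unchanged, like ReLU; negative inputs
    are scaled by a sigmoid self-gate, leaving a small nonzero gradient.
    """
    gate = 1.0 / (1.0 + np.exp(-beta * x))  # sigmoid gate driven by the input itself
    return np.where(x > 0, x, x * gate)     # identity for x > 0, gated leak for x <= 0

# Example: negative inputs are attenuated but not zeroed out.
x = np.array([-3.0, -1.0, 0.0, 1.0, 3.0])
print(sgrelu(x))
```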

Keywords
Image classification; Activation function; Deep neural network; Accuracy; Time complexity
Journal or Conference Name
ICT Express
Publication Year
2023
Indexing
Scopus