A Novel Deep Learning Technique for Medical Image Analysis Using Improved Optimizer
Keywords:
Gradient Centralization, Medical Image Analysis, Real-ESRGAN, GFPGAN

Abstract
The application of convolutional neural networks (CNNs) across the spectrum of medical image analysis has produced benchmark results, drawing many researchers to explore the field in depth. The availability of a clean dataset is one of the most significant factors driving the efficiency of a CNN model. The recent preprocessing techniques Real-ESRGAN and GFPGAN (Generative Facial Prior GAN) have proven their mettle in producing high-resolution datasets. The optimizer also plays a vital role in improving the performance of a CNN model. A recent optimization technique, Gradient Centralization (GC), delivers strong results in terms of generalization and execution time. Our research delves into both of the aforementioned factors. Results for the first factor, the preprocessing technique, have already been reported. Continuing that effort, this paper explores the second factor: applying Gradient Centralization to our integrated framework (a model with an advanced preprocessing technique). Preprocessing-integrated DenseNet-201 and NASNet give encouraging results in terms of training loss and execution time when GC is used with the optimizer. Experiments are carried out on three types of datasets (skin, retina, and lung), and the loss curves of the two preprocessing-integrated models are compared.
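The core operation behind Gradient Centralization is simple: before the optimizer step, each weight gradient is centralized by subtracting its mean, computed over all dimensions except the output-channel dimension. The sketch below is a minimal NumPy illustration of that idea under our own naming (the function `centralize_gradient` and the toy values are ours, not the paper's implementation):

```python
import numpy as np

def centralize_gradient(grad):
    """Centralize a weight gradient: subtract the mean of each slice,
    taken over every axis except the first (output-channel) axis, so
    each centralized slice has zero mean. 1-D gradients (biases) are
    left unchanged, as in the original formulation."""
    if grad.ndim > 1:
        axes = tuple(range(1, grad.ndim))
        grad = grad - grad.mean(axis=axes, keepdims=True)
    return grad

# Toy example: a 2x3 weight gradient; each row is shifted to zero mean.
g = np.array([[1.0, 2.0, 3.0],
              [4.0, 6.0, 8.0]])
gc = centralize_gradient(g)
# Each row of gc now sums to zero: [[-1, 0, 1], [-2, 0, 2]]
```

In practice this transform is applied inside the optimizer (e.g. SGD or Adam) to every convolutional and fully connected weight gradient, which is how it would plug into the DenseNet-201 and NASNet training described above.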
License
Copyright (c) 2025 Vertika Agarwal, M.C Lohani, Qurotul Aini, Nesti Anggraini Santoso

This work is licensed under a Creative Commons Attribution 4.0 International License.




