Volume 14 Issue 3  ·  ISSN: 2319-4863  ·  Monthly Publication  ·  editor@ijdacr.com

Research Article

Evolutionary Hyperparameter Tuning for Deep Learning Models

Ashutosh Kumar Singh

IJDACR Vol.14 No.3 (October 2025) ISSN 2319-4863 Open Access Peer Reviewed

Journal

International Journal of Digital Applications and Contemporary Research (IJDACR)

ISSN

2319-4863

Volume / Issue

Vol.14 · Issue 3

Published

October 2025

Access

Open Access

Licence

CC BY-NC-SA 4.0

Authors

Ashutosh Kumar Singh

Abstract

The performance of deep learning (DL) models is highly dependent on hyperparameter selection, including learning rate, network depth, dropout, and optimizer choice. Traditional methods such as manual tuning and grid search are computationally expensive and ineffective in high-dimensional, non-convex search spaces. This paper reviews Evolutionary Strategies (ES) as a population-based meta-heuristic approach for automated hyperparameter optimization (HPO) in DL. Inspired by biological evolution, ES iteratively refines candidate solutions through selection, crossover, and mutation to explore optimal regions of the search space. The study synthesizes ES methodologies alongside related optimization approaches such as Bayesian optimization, and presents key variants including Genetic Algorithms and Covariance Matrix Adaptation Evolution Strategy (CMA-ES). Comparative insights highlight the advantages of ES over traditional HPO techniques like random search and gradient-based methods, particularly in complex and multimodal landscapes. Empirical evidence across applications—such as CNN-based image analysis, LSTM-based time-series prediction, and radiomics—demonstrates strong performance and robust exploration capabilities. However, challenges remain, including high computational cost, the need for parallelization, and efficient fitness design. Future directions include neuroevolution, multi-objective optimization, and applications in emerging domains such as finance, IoT, and multimodal AI.
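The population-based loop the abstract describes — selection of the fittest candidates, followed by mutation to produce new ones — can be sketched in miniature. The snippet below is an illustrative (μ+λ) evolutionary search over a toy two-hyperparameter space (learning rate and dropout); the `fitness` function is a synthetic stand-in for validation accuracy, not part of the paper, and in practice it would train and evaluate a model:

```python
import math
import random

# Synthetic fitness: peaks near lr = 1e-3, dropout = 0.3 (purely illustrative).
# A real fitness function would train the model with `cfg` and return a
# validation score, which is where the high computational cost arises.
def fitness(cfg):
    lr_term = -((math.log10(cfg["lr"]) + 3.0) ** 2)
    dropout_term = -10.0 * (cfg["dropout"] - 0.3) ** 2
    return lr_term + dropout_term

def mutate(cfg, rng):
    # Log-scale perturbation for the learning rate, additive for dropout,
    # both clipped to the search-space bounds.
    child = dict(cfg)
    child["lr"] = min(1e-1, max(1e-5, cfg["lr"] * 10 ** rng.gauss(0, 0.3)))
    child["dropout"] = min(0.9, max(0.0, cfg["dropout"] + rng.gauss(0, 0.05)))
    return child

def evolve(generations=20, mu=4, lam=16, seed=0):
    rng = random.Random(seed)
    # Random initial population: log-uniform learning rate, uniform dropout.
    pop = [{"lr": 10 ** rng.uniform(-5, -1), "dropout": rng.uniform(0.0, 0.9)}
           for _ in range(lam)]
    for _ in range(generations):
        parents = sorted(pop, key=fitness, reverse=True)[:mu]   # selection
        offspring = [mutate(rng.choice(parents), rng) for _ in range(lam)]
        pop = parents + offspring                               # (mu + lambda)
    return max(pop, key=fitness)

best = evolve()
```

Because every offspring requires a full fitness evaluation, the per-generation cost is λ model trainings — which is why the abstract emphasizes parallelization and efficient fitness design.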

Keywords

Evolutionary Strategies, Hyperparameter Optimization, Deep Learning, Genetic Algorithms, Meta-Heuristics, Neural Architecture Search, Automated Machine Learning (AutoML), Model Tuning.

How to Cite

Ashutosh Kumar Singh (2025). Evolutionary Hyperparameter Tuning for Deep Learning Models. International Journal of Digital Applications and Contemporary Research (IJDACR), Vol.14, Issue 3. ISSN: 2319-4863.

References

Full references are available in the PDF version of this paper.




