
ISSN: 2349-5162 | ESTD Year : 2014





Published in:

Volume 6, Issue 5, May 2019
eISSN: 2349-5162

UGC Approved Journal No. 63975 | Impact Factor: 7.95 (calculated by Google Scholar)

Unique Identifier

Published Paper ID: JETIR1905Z33
Registration ID: 557301
Page Number: 295-302


Title

An Automated Hyperparameter Tuning Approach for Optimizing Machine Learning Model Performance

Abstract

Hyperparameter tuning is a crucial step in the development of machine learning models, with a significant influence on predictive performance, model generalization, and computational efficiency. Traditional manual and grid-search approaches are time-consuming and computationally expensive, making them impractical for large-scale machine learning systems. This study proposes an automated hyperparameter tuning method that leverages advanced optimization techniques, including Bayesian Optimization, Genetic Algorithms, and Reinforcement Learning, to improve model performance systematically. The proposed technique integrates adaptive learning strategies that refine hyperparameter selection dynamically based on real-time feedback and evaluation metrics. The study compares the effectiveness, scalability, and efficiency of several state-of-the-art hyperparameter tuning frameworks, such as Optuna, Hyperopt, and AutoML, across a variety of machine learning models, including support vector machines, gradient boosting machines, and deep neural networks. We also provide a comparative analysis of the effects of hyperparameter tuning on various datasets and machine learning tasks, demonstrating gains in model robustness, accuracy, and training time. Empirical results show that automated hyperparameter tuning not only outperforms conventional methods in accuracy but also maximizes resource utilization by eliminating redundant computation. The discussion covers challenges in automated tuning, such as algorithm-specific limitations, computational complexity, and overfitting risks, along with possible solutions. The paper concludes with suggestions for future research, including meta-learning strategies, reinforcement-learning-based tuning, and the application of quantum computing to hyperparameter optimization.
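The genetic-algorithm strategy named in the abstract can be sketched in a few dozen lines. The snippet below is an illustrative, stdlib-only sketch (not the paper's implementation): it evolves two hypothetical hyperparameters, a learning rate and a regularization strength, against a synthetic stand-in for validation loss, using elitism, uniform crossover, and Gaussian mutation.

```python
import math
import random

def val_loss(log_lr, reg):
    # Toy stand-in for validation loss; a real tuner would train and
    # evaluate a model here. Minimum at lr = 1e-2, reg = 0.3.
    return (log_lr + 2.0) ** 2 + (reg - 0.3) ** 2

def random_individual(rng):
    # Genes: log10(learning rate) in [-4, 0], regularization in [0, 1].
    return (rng.uniform(-4.0, 0.0), rng.uniform(0.0, 1.0))

def mutate(ind, rng, sigma=0.2):
    # Gaussian perturbation, clamped to the search ranges.
    log_lr = min(0.0, max(-4.0, ind[0] + rng.gauss(0, sigma)))
    reg = min(1.0, max(0.0, ind[1] + rng.gauss(0, sigma)))
    return (log_lr, reg)

def crossover(a, b, rng):
    # Uniform crossover: each gene is inherited from either parent.
    return (a[0] if rng.random() < 0.5 else b[0],
            a[1] if rng.random() < 0.5 else b[1])

def tune(generations=15, pop_size=20, seed=0):
    rng = random.Random(seed)
    pop = [random_individual(rng) for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda ind: val_loss(*ind))
        elite = scored[: pop_size // 4]      # keep the best quarter
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)      # pick two elite parents
            children.append(mutate(crossover(a, b, rng), rng))
        pop = elite + children
    best = min(pop, key=lambda ind: val_loss(*ind))
    return {"learning_rate": 10 ** best[0], "reg": best[1],
            "loss": val_loss(*best)}

print(tune())
```

Because the elite individuals survive unchanged each generation, the best loss found is monotonically non-increasing; Bayesian optimization frameworks such as Optuna or Hyperopt replace this population loop with a probabilistic model of the loss surface.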

Key Words

Bayesian optimization, genetic algorithms, reinforcement learning, model performance, AutoML, neural networks, gradient boosting, hyperparameter search, computational efficiency, meta-learning, quantum computing, overfitting mitigation, scalable artificial intelligence, automated machine learning

Cite This Article

"An Automated Hyperparameter Tuning Approach for Optimizing Machine Learning Model Performance", International Journal of Emerging Technologies and Innovative Research (www.jetir.org), ISSN: 2349-5162, Vol. 6, Issue 5, pp. 295-302, May 2019. Available: http://www.jetir.org/papers/JETIR1905Z33.pdf



Publication Details

Published Paper ID: JETIR1905Z33
Registration ID: 557301
Published In: Volume 6 | Issue 5 | Year May-2019
DOI (Digital Object Identifier):
Page No: 295-302
Country: India
Area: Engineering
ISSN Number: 2349-5162
Publisher: IJ Publication

