Published in: Volume 11 | Issue 11 | November-2024
eISSN: 2349-5162

Title

Mathematical approaches to gradient descent and its variants in Machine Learning

Abstract

Gradient Descent (GD) is a foundational optimization technique widely used in machine learning for minimizing objective functions, particularly in training neural networks and other models. This paper provides a mathematical exploration of the gradient descent algorithm, covering its basic principles and essential variants, such as Stochastic Gradient Descent (SGD), Mini-Batch Gradient Descent, Momentum-based methods, AdaGrad, RMSprop, and Adam. By examining each variant’s mathematical formulation, convergence properties, and practical impact on optimization speed and accuracy, we highlight the trade-offs involved in their application. The comparative analysis reveals how these variants address limitations of traditional gradient descent, such as slow convergence and sensitivity to local minima, thereby enhancing model performance and efficiency in large-scale and complex learning tasks. This paper aims to equip machine learning practitioners with insights into choosing appropriate optimization strategies based on theoretical understanding, problem requirements, and empirical considerations.
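
To make the surveyed update rules concrete, the sketch below implements three of them (vanilla gradient descent, momentum, and Adam) on a toy problem. This is a minimal Python/NumPy sketch assumed for this summary, not code from the paper: the quadratic loss L(w) = 0.5 * ||w||^2, the function names, and the hyperparameter values are all illustrative (the Adam constants follow the commonly cited defaults from Kingma and Ba).

    import numpy as np

    def grad(w):
        # Gradient of the illustrative loss L(w) = 0.5 * ||w||^2, i.e. grad(w) = w.
        return w

    def gd_step(w, lr=0.1):
        # Vanilla gradient descent: w <- w - lr * grad(w).
        return w - lr * grad(w)

    def momentum_step(w, v, lr=0.1, beta=0.9):
        # Momentum: accumulate a velocity term to damp oscillations.
        v = beta * v + grad(w)
        return w - lr * v, v

    def adam_step(w, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
        # Adam: bias-corrected first- and second-moment estimates of the gradient.
        g = grad(w)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g**2
        m_hat = m / (1 - b1**t)  # bias correction; t starts at 1
        v_hat = v / (1 - b2**t)
        return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

    # Usage: a few Adam steps from a random start converge toward the origin.
    w = np.random.randn(3)
    m, v = np.zeros_like(w), np.zeros_like(w)
    for t in range(1, 101):
        w, m, v = adam_step(w, m, v, t)
    print(w)  # entries should be close to 0

In practice the learning rate lr remains the main tuning knob; the adaptive variants differ chiefly in how they rescale the raw gradient before taking the step.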

Key Words

Gradient Descent, Machine Learning, Optimization, SGD, Mini-Batch, Convergence, Momentum, AdaGrad, RMSprop, Adam, Neural Networks, Loss Minimization, Learning Rate, Large-scale, Non-convex.

Cite This Article

"Mathematical approaches to gradient descent and its variants in Machine Learning", International Journal of Emerging Technologies and Innovative Research (www.jetir.org), ISSN:2349-5162, Vol.11, Issue 11, page no.d823-d828, November-2024, Available :http://www.jetir.org/papers/JETIR2411396.pdf

Publication Details

Published Paper ID: JETIR2411396
Registration ID: 551134
Published In: Volume 11 | Issue 11 | November-2024
DOI (Digital Object Identifier):
Page No: d823-d828
Country: Pune, Maharashtra, India
Area: Science & Technology
ISSN Number: 2349-5162
Publisher: IJ Publication

