Research Article

A More Accurate Approach for Prediction using Gradient Descent

by Shweta Agrawal, Ravishek Kumar Singh
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 184 - Number 10
Year of Publication: 2022
DOI: 10.5120/ijca2022922036

Shweta Agrawal and Ravishek Kumar Singh. A More Accurate Approach for Prediction using Gradient Descent. International Journal of Computer Applications 184, 10 (Apr 2022), 18-22. DOI=10.5120/ijca2022922036

@article{10.5120/ijca2022922036,
author = {Shweta Agrawal and Ravishek Kumar Singh},
title = {A More Accurate Approach for Prediction using Gradient Descent},
journal = {International Journal of Computer Applications},
issue_date = {Apr 2022},
volume = {184},
number = {10},
month = {Apr},
year = {2022},
issn = {0975-8887},
pages = {18-22},
numpages = {5},
url = {https://ijcaonline.org/archives/volume184/number10/32362-2022922036/},
doi = {10.5120/ijca2022922036},
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Shweta Agrawal
%A Ravishek Kumar Singh
%T A More Accurate Approach for Prediction using Gradient Descent
%J International Journal of Computer Applications
%@ 0975-8887
%V 184
%N 10
%P 18-22
%D 2022
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Accuracy is one of the most important concerns when dealing with machine learning problems. Artificial intelligence, likewise a popular and rapidly growing field, relies on optimization to improve accuracy. Gradient descent is a powerful technique that brings optimization to both machine learning and AI. Among the many optimization techniques available, gradient descent is an iterative algorithm most commonly used to find the minimum of a cost function. It supports efficient and reliable decisions through the use of derivatives. A derivative gives the slope of a graph at a particular point; the value of the slope is defined by the tangent line to the graph at that point. Gradient descent computes this tangent and uses it to determine the direction in which to move to reach the minimum. It is a first-order optimization method built around an objective function whose parameters are updated in the direction opposite to the gradient at every iteration. This paper applies the technique to find a better linear regression line, fitting the best possible regression line to a given data set. The objective is to reduce the sum of squared errors (SSE) and improve the value of the cost function.
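The update rule the abstract describes, stepping each parameter opposite its gradient (θ ← θ − α·∇J(θ)), can be made concrete with a short sketch. The Python example below is an illustration under assumed settings, not the authors' implementation: the learning rate, epoch count, and sample data are made up for demonstration. Dividing the SSE gradients by n (i.e., minimizing the mean squared error) only rescales the step size and yields the same fitted line.

import numpy as np

def fit_line_gd(x, y, lr=0.05, epochs=1000):
    # Fit y ~ m*x + b by gradient descent on the (mean of) squared errors.
    # lr and epochs are illustrative choices, not values from the paper.
    m, b = 0.0, 0.0                           # initial slope and intercept
    n = len(x)
    for _ in range(epochs):
        error = (m * x + b) - y               # residuals of the current line
        grad_m = (2.0 / n) * np.dot(error, x) # dJ/dm for the squared-error cost
        grad_b = (2.0 / n) * error.sum()      # dJ/db
        m -= lr * grad_m                      # step opposite the gradient
        b -= lr * grad_b
    return m, b

# Small synthetic data set for demonstration
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.0, 6.2, 7.9, 10.1])
m, b = fit_line_gd(x, y)
print(f"fitted line: y = {m:.3f}x + {b:.3f}")

Because the cost is quadratic in m and b, each iteration moves the parameters along the tangent direction described above, and the line converges toward the least-squares fit.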

Index Terms

Computer Science
Information Sciences

Keywords

Machine Learning, Linear Regression, Optimization, Accuracy, Efficiency