Applied and Computational Engineering

- The Open Access Proceedings Series for Conferences


Proceedings of the 4th International Conference on Computing and Data Science (CONF-CDS 2022)

Series Vol. 2, 22 March 2023


Open Access | Article

Research on SGD Algorithm Using Momentum Strategy

Liqi Xue *1
1 College of Science, Shanghai University, Shanghai, China, 200444

* Author to whom correspondence should be addressed.

Applied and Computational Engineering, Vol. 2, 141-150
Published 22 March 2023. © 2023 The Author(s). Published by EWA Publishing
This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Citation Liqi Xue. Research on SGD Algorithm Using Momentum Strategy. ACE (2023) Vol. 2: 141-150. DOI: 10.54254/2755-2721/2/20220622.

Abstract

With the continuous development of stochastic gradient descent algorithms, many efficient momentum algorithms have emerged. Stochastic gradient descent (SGD) is one of the classic algorithms in optimization, and its accelerated version, the SGD algorithm with a momentum strategy, has been a hot research topic in recent years. This paper therefore analyzes and summarizes this family of algorithms, starting with the classical momentum algorithm and then introducing several improved versions. Numerical experiments on real problems are also conducted to evaluate the performance of these algorithms. The results show that adding momentum and an adaptive learning rate effectively improves performance. Future research should analyze some cutting-edge momentum algorithms and other basic network architectures.
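For orientation, below is a minimal sketch of the classical momentum update the abstract refers to (Polyak's heavy-ball method), not the paper's own implementation. The gradient function, learning rate, momentum coefficient, and step count are illustrative assumptions rather than values taken from the paper.

```python
import numpy as np

def sgd_momentum(grad_fn, w0, lr=0.01, beta=0.9, n_steps=100):
    """SGD with classical (heavy-ball) momentum.

    grad_fn(w) is assumed to return a (possibly stochastic) gradient
    estimate at w; lr, beta, and n_steps are illustrative defaults.
    """
    w = np.asarray(w0, dtype=float)
    v = np.zeros_like(w)          # velocity: running accumulation of gradients
    for _ in range(n_steps):
        g = grad_fn(w)            # gradient at the current iterate
        v = beta * v + g          # fold the new gradient into the velocity
        w = w - lr * v            # step along the accumulated direction
    return w

# Toy usage: minimize f(w) = ||w||^2 / 2, whose gradient is simply w.
w_star = sgd_momentum(lambda w: w, w0=np.ones(3))
print(w_star)  # ends up close to the minimizer at the origin
```

Nesterov's variant instead evaluates the gradient at a look-ahead point, and adaptive methods such as Adam additionally rescale the step coordinate-wise with running gradient statistics; the improved versions discussed in the paper build on these ideas.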

Keywords

Machine Learning, Momentum Algorithms, Stochastic Gradient Descent, SGD


Data Availability

The datasets used and/or analyzed during the current study will be available from the authors upon reasonable request.

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License. Authors who publish in this series agree to the following terms:

1. Authors retain copyright and grant the series the right of first publication, with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgment of the work's authorship and its initial publication in this series.

2. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the series's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this series.

3. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See Open Access Instruction).

Volume Title
Proceedings of the 4th International Conference on Computing and Data Science (CONF-CDS 2022)
ISBN (Print)
978-1-915371-19-5
ISBN (Online)
978-1-915371-20-1
Published Date
22 March 2023
Series
Applied and Computational Engineering
ISSN (Print)
2755-2721
ISSN (Online)
2755-273X
DOI
10.54254/2755-2721/2/20220622
Copyright
© 2023 The Author(s)
Open Access
This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
