An Asymptotic Analysis of Random Partition Based Minibatch Momentum Methods for Linear Regression Models
Professor Hansheng Wang
Head of Department of Business Statistics and Econometrics
Guanghua School of Management
Peking University
Momentum methods have been shown, both in theory and in practice, to accelerate the convergence of the standard gradient descent algorithm. In particular, random partition based minibatch gradient descent methods with momentum (MGDM) are widely used to solve large-scale optimization problems on massive datasets. Despite the great popularity of MGDM methods in practice, their theoretical properties remain underexplored. To this end, we investigate the theoretical properties of MGDM methods in the context of linear regression models. We first study the numerical convergence of the MGDM algorithm and derive conditions under which a faster numerical convergence rate is achieved. In addition, we explore the relationship between the statistical properties of the resulting MGDM estimator and the tuning parameters. Based on these theoretical findings, we provide conditions under which the resulting estimator achieves optimal statistical efficiency. Finally, extensive numerical experiments are conducted to verify our theoretical results.
(Joint work with Yuan Gao, Xuening Zhu, Haobo Qi, Guodong Li and Riquan Zhang)
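To fix ideas, the following is a minimal Python sketch of what a random partition based minibatch gradient descent procedure with momentum might look like for a linear regression model. The squared-error loss, the heavy-ball form of the momentum update, and all tuning parameter values (learning rate, momentum weight, batch size, number of epochs) are illustrative assumptions for exposition, not the authors' exact formulation.

# A minimal sketch of random partition based minibatch gradient descent
# with momentum (MGDM) for linear regression. The heavy-ball update and
# all tuning parameter values below are illustrative assumptions, not
# the authors' exact formulation.
import numpy as np

def mgdm_linear_regression(X, y, lr=0.1, momentum=0.9,
                           batch_size=64, n_epochs=50, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = np.zeros(p)       # regression coefficient estimate
    velocity = np.zeros(p)   # momentum (velocity) term

    for _ in range(n_epochs):
        # Random partition: shuffle the sample once per epoch and split
        # it into non-overlapping minibatches.
        perm = rng.permutation(n)
        for batch in np.array_split(perm, max(n // batch_size, 1)):
            Xb, yb = X[batch], y[batch]
            # Gradient of the squared-error loss on the minibatch.
            grad = Xb.T @ (Xb @ beta - yb) / len(batch)
            # Heavy-ball momentum update.
            velocity = momentum * velocity + grad
            beta = beta - lr * velocity
    return beta

# Usage on simulated data: the estimate should be close to beta_true.
rng = np.random.default_rng(1)
X = rng.standard_normal((1000, 5))
beta_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ beta_true + 0.1 * rng.standard_normal(1000)
print(mgdm_linear_regression(X, y))

In this sketch, the learning rate and momentum weight play the role of the tuning parameters whose relationship to the statistical properties of the resulting estimator the abstract refers to.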