Accelerated Convergence of Stochastic Gradient Descent under Interpolation
The authors prove new convergence rates for a generalized version of stochastic Nesterov acceleration under interpolation conditions, i.e., settings where the model can fit every training example, so stochastic gradients vanish at the minimizer. Their scheme accelerates any stochastic gradient method that makes sufficient progress in expectation, and the proof covers both convex and strongly convex functions.
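To make the setting concrete, below is a minimal sketch of plain stochastic Nesterov acceleration on an overparameterized least-squares problem, where interpolation holds by construction (more parameters than samples, targets generated by the model). This is not the authors' generalized scheme: the momentum schedule k/(k+3) and the per-sample step size 1/L_max are standard textbook choices assumed here purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Overparameterized least squares: d > n, and b is realizable by construction,
# so a zero-loss (interpolating) solution exists.
n, d = 50, 200
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d)

# Per-sample smoothness bound: max_i ||a_i||^2 (assumed step-size choice).
L_max = np.max(np.sum(A * A, axis=1))


def stochastic_nesterov(w0, steps, eta):
    """Nesterov's recursion with the full gradient replaced by a
    single-sample stochastic gradient (a common stochastic variant)."""
    w = w_prev = w0.copy()
    for k in range(steps):
        beta = k / (k + 3)               # standard momentum schedule (convex case)
        v = w + beta * (w - w_prev)      # extrapolation point
        i = rng.integers(n)              # sample one data point uniformly
        g = (A[i] @ v - b[i]) * A[i]     # stochastic gradient of 0.5*(a_i@v - b_i)^2
        w_prev, w = w, v - eta * g       # gradient step from the extrapolation point
    return w


w = stochastic_nesterov(np.zeros(d), steps=20_000, eta=1.0 / L_max)
print("final training loss:", 0.5 * np.mean((A @ w - b) ** 2))
```

Without interpolation, naively substituting stochastic gradients into Nesterov's recursion can fail to converge because momentum amplifies gradient noise; this is why the interpolation assumption (and the expected-progress condition on the base method) is central to the paper's analysis.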