Derived a theoretically optimal batch size per iteration for finite-sum mini-batch SGD on convex objectives. Compared this per-iteration optimum against a single fixed optimal batch size used for all iterations, establishing a theoretically optimal batch-size framework for SGD in which the batch size increases with the error. Established an equivalence between batch size and step size for
convergence-diagnostic-based adaptive step-size and batch-size control algorithms for SGD. Designed a new gradient-norm-based test for detecting SGD convergence. Evaluated batch-size control against the GradNorm test, obtaining comparable performance.
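A minimal sketch of the idea, on a synthetic convex least-squares problem: mini-batch SGD where a gradient-norm test triggers batch-size growth. All function names, thresholds, and hyperparameters here are illustrative assumptions, not the actual algorithms or tuning from this work.

```python
import numpy as np

def sgd_adaptive_batch(A, b, lr=0.1, batch0=8, max_batch=256,
                       norm_tol=1.0, iters=500, seed=0):
    """Mini-batch SGD with an illustrative gradient-norm-triggered
    batch-size doubling rule (hypothetical parameters)."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    w = np.zeros(d)
    batch = batch0
    for _ in range(iters):
        idx = rng.choice(n, size=min(batch, n), replace=False)
        # Mini-batch gradient of the least-squares objective 0.5*||Aw - b||^2 / n
        g = A[idx].T @ (A[idx] @ w - b[idx]) / len(idx)
        w -= lr * g
        # Gradient-norm test: a small mini-batch gradient norm suggests the
        # iterate is entering a noise-dominated region, so grow the batch
        # to reduce gradient variance (batch size increasing as SGD proceeds).
        if np.linalg.norm(g) < norm_tol and batch < max_batch:
            batch *= 2
    return w, batch

# Usage: recover the true weights on a noiseless synthetic problem.
rng = np.random.default_rng(1)
A = rng.standard_normal((500, 5))
w_true = rng.standard_normal(5)
b = A @ w_true
w_hat, final_batch = sgd_adaptive_batch(A, b)
print(np.linalg.norm(w_hat - w_true))
```

The doubling rule stands in for the batch-size control schedule; the same convergence diagnostic could equally drive a step-size decrease, reflecting the batch-size/step-size equivalence described above.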