Stochastic Gradient Descent using Zero-Order Estimators with Reduced Estimation Bias

Organiser:
Raghuvansh Saxena
Date:
Tuesday, 2 Apr 2024, 16:00 to 17:00
Venue:
via Zoom in A201
Category:
Abstract

We present a new family of generalized simultaneous perturbation stochastic gradient estimators that estimate the gradient of an objective function using noisy function measurements, where the number of function measurements and the form of the gradient estimator are guided by the desired estimator bias. Estimators with more function measurements are seen to result in lower bias. We sketch an asymptotic convergence argument, under a constant gradient estimation parameter, to the attractor of a limiting differential inclusion, and provide a finite-time bound on the mean-squared error under certain conditions.
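To make the idea concrete, the classical two-measurement simultaneous perturbation (SPSA-style) estimator below is a minimal sketch of the kind of zero-order gradient estimate the talk generalizes: it perturbs all coordinates at once with a random Rademacher direction and forms a finite difference from two noisy function evaluations. The function names, step sizes, and the quadratic test objective are illustrative, not from the talk; the generalized estimators discussed in the abstract use additional measurements to push the bias below the O(c²) level of this two-point form.

```python
import numpy as np

def spsa_gradient(f, x, c=0.1, rng=None):
    """Two-measurement simultaneous perturbation gradient estimate.

    delta is a random +/-1 (Rademacher) direction; the same two noisy
    evaluations of f give an estimate of every coordinate of the
    gradient at once. For smooth f the bias is O(c^2).
    """
    rng = rng or np.random.default_rng()
    delta = rng.choice([-1.0, 1.0], size=np.shape(x))  # Rademacher perturbation
    # Since delta_i is +/-1, dividing by delta equals multiplying by it.
    return (f(x + c * delta) - f(x - c * delta)) / (2.0 * c) * (1.0 / delta)

def sgd_zero_order(f, x0, steps=3000, a=0.02, c=0.1, seed=0):
    """Plain SGD driven by the zero-order gradient estimate above."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - a * spsa_gradient(f, x, c=c, rng=rng)
    return x

# Example: minimize a quadratic observed with small additive noise.
noise_rng = np.random.default_rng(1)
f = lambda x: np.sum((x - 2.0) ** 2) + 0.01 * noise_rng.normal()
x_star = sgd_zero_order(f, x0=[0.0, 0.0])
```

With only noisy function values available, the iterate drifts toward the minimizer at (2, 2); a constant perturbation parameter `c`, as in the convergence argument sketched above, leaves a persistent bias that the multi-measurement estimators are designed to shrink.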

Short Bio:

Shalabh Bhatnagar received a B.Sc. (Hons) in Physics from Delhi University (1988), followed by a Master's and a PhD in Electrical Engineering from IISc (1992 and 1997). He was a Research Associate at the University of Maryland, College Park (1997-2000) and at the Vrije Universiteit, Amsterdam (2000-01). He has been with the CSA Department of IISc since December 2001 and is currently a Senior Professor. His interests are in stochastic approximation algorithms, reinforcement learning, and stochastic optimization. He is a Fellow of all major science and engineering academies in India and is a J. C. Bose National Fellow.