BEGIN:VCALENDAR
PRODID:-//eluceo/ical//2.0/EN
VERSION:2.0
CALSCALE:GREGORIAN
BEGIN:VEVENT
UID:www.tcs.tifr.res.in/event/1301
DTSTAMP:20230914T125958Z
SUMMARY:Stochastic Approximations of Sampling Algorithms
DESCRIPTION:Speaker: Dheeraj Nagaraj (Google AI\, Bangalore)\n\nAbstract:
 \nWe consider stochastic approximations of sampling algorithms like Lang
 evin Monte Carlo (pathwise approximation via random batches) and Stein V
 ariational Gradient Descent (approximation in the space of distribution
 s). These algorithms are heavily deployed in Bayesian inference and th
 e physical sciences.\nWe first consider pathwise approximation in Stoch
 astic Gradient Langevin Dynamics (SGLD). We show that the noise induc
 ed by the random batches is approximately Gaussian (due to the Centr
 al Limit Theorem) while the Brownian motion driving the algorithm is ex
 actly Gaussian. We utilize this structure to provide improved guarante
 es for sampling algorithms under significantly weaker assumptions. We t
 hen propose covariance correction\, which rescales the Brownian moti
 on to approximately remove the random-batch error. We show that covari
 ance-corrected algorithms enjoy even better convergence.\nWe then consi
 der stochastic approximation in the space of probability distributio
 ns to obtain a new particle discretization of Stein Variational Gradie
 nt Descent (SVGD)\, an interacting-particle-based sampling algorithm. W
 e introduce and analyze Virtual Particle SVGD (VP-SVGD)\, which enjoy
 s provably rapid convergence to the target. Our rates provide a double
 -exponential improvement over the prior state-of-the-art convergence r
 esults for SVGD under mild conditions\, giving us the first provably fa
 st variant of SVGD.\nBased on joint work with Aniket Das (Google) and A
 nant Raj (INRIA and UIUC)\n
URL:https://www.tcs.tifr.res.in/web/events/1301
DTSTART;TZID=Asia/Kolkata:20230608T110000
DTEND;TZID=Asia/Kolkata:20230608T120000
LOCATION:A201
END:VEVENT
END:VCALENDAR