Abstract:
In this work, we upper bound the generalization error of batch learning algorithms trained on samples drawn from a mixing stochastic process (i.e., a dependent data source), both in expectation and with high probability. Unlike previous results by Mohri et al. (JMLR 2010) and Fu et al. (ICLR 2023), our work does not require any additional stability assumptions on the batch learner itself. This is made possible by our use of the Online-to-Batch (OTB) conversion framework, which allows us to shift the burden of stability from the batch learner to an artificially constructed online learner. We prove that our bounds match the corresponding bounds in the i.i.d. setting (i.e., they are optimal) up to a term that depends on the decay rate of the underlying mixing stochastic process. Central to our analysis is a new notion of algorithmic stability for online learning algorithms based on Wasserstein distances of order one. Furthermore, we prove that the Exponentially Weighted Average (EWA) family of algorithms, a textbook class of online learners, satisfies our new notion of stability.
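For concreteness, the following are the standard textbook forms of the ingredients mentioned above (the EWA weights, the online-to-batch conversion, and the order-one Wasserstein distance); the notation here is only illustrative, and the exact definitions and assumptions used in the work itself may differ. Over a finite hypothesis class $\mathcal{H}$, with loss functions $\ell_s$ and learning rate $\eta > 0$, EWA maintains the weights

\[ w_t(h) \;\propto\; \exp\!\Big(-\eta \sum_{s=1}^{t-1} \ell_s(h)\Big), \qquad h \in \mathcal{H}, \]

and predicts with the corresponding weighted average. A standard online-to-batch conversion then outputs, for instance, a hypothesis chosen uniformly at random from the online iterates $h_1, \dots, h_T$ (or their average, when averaging is meaningful) as the batch hypothesis. The Wasserstein distance of order one between distributions $\mu$ and $\nu$ on a metric space admits the Kantorovich-Rubinstein dual form

\[ W_1(\mu, \nu) \;=\; \sup_{\|f\|_{\mathrm{Lip}} \le 1} \Big| \mathbb{E}_{x \sim \mu}[f(x)] - \mathbb{E}_{x \sim \nu}[f(x)] \Big|, \]

i.e., the largest gap in expectation over 1-Lipschitz test functions, which is the quantity the new stability notion is built on.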
Short Bio:
Sagnik Chatterjee recently obtained his Ph.D. from IIIT Delhi, where he was advised by Dr. Debajyoti Bera. Prior to his Ph.D., he was a software engineer at Oracle in Bangalore. His main research interest is in Quantum Computing, with a particular focus on Learning Theory and Shallow Circuit Complexity.