Scalable and Practical Discrete Optimization for Big Data

Speaker:
Rishabh Iyer
Organiser:
Himanshu Asnani
Date:
Tuesday, 24 Sep 2019, 14:30 to 15:30
Venue:
A-201 (STCS Seminar Room)
Abstract
Data has been growing at an unprecedented rate in recent years. While this massive data is a blessing to data science, helping improve predictive accuracy, it is also a curse, since humans cannot consume such large amounts of data. Moreover, the majority of this data is plagued with redundancy. In this talk, I will present a powerful modeling abstraction that formulates these problems as a special class of combinatorial optimization called submodular optimization. I will consider two applications of this paradigm: one in summarizing massive data for human consumption, and another in making machine learning models and training processes more efficient. I will also present a unified discrete gradient-based optimization framework for solving a large class of these optimization problems, which not only comes with good theoretical guarantees but is also easy to implement and scales well to large datasets. In addition to describing the underlying algorithmic advances, I will discuss their impact on several concrete applications, including visual data summarization, data subset selection, data partitioning, and diversified active learning. Finally, I will describe a comprehensive, highly optimized discrete optimization package I developed (with my colleagues at UW) that implements most state-of-the-art submodular optimization algorithms and includes several implementation techniques to scale to large datasets.
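To make the abstraction concrete (this illustration is mine, not taken from the talk): a set function f is submodular if f(A ∪ {e}) − f(A) ≥ f(B ∪ {e}) − f(B) whenever A ⊆ B, i.e., adding an element helps less as the selected set grows. The Python sketch below shows the classic greedy algorithm for maximizing a monotone submodular function under a cardinality constraint, instantiated with a facility-location objective commonly used for data summarization; the similarity matrix and all names here are hypothetical, and the talk's actual framework and package may differ.

import numpy as np

def marginal_gain(sim, candidate, current_max):
    # Gain of adding `candidate` under the facility-location objective
    # f(S) = sum_i max_{j in S} sim[i, j] (requires sim >= 0).
    return np.maximum(current_max, sim[:, candidate]).sum() - current_max.sum()

def greedy_summarize(sim, k):
    # Greedy maximization of a monotone submodular function under a
    # cardinality constraint; Nemhauser et al. (1978) show this achieves
    # a (1 - 1/e)-approximation to the optimal summary of size k.
    n = sim.shape[0]
    selected, current_max = [], np.zeros(n)
    for _ in range(k):
        candidates = [c for c in range(n) if c not in selected]
        gains = [marginal_gain(sim, c, current_max) for c in candidates]
        best = candidates[int(np.argmax(gains))]
        selected.append(best)
        current_max = np.maximum(current_max, sim[:, best])
    return selected

# Usage: pick the 20 most representative of 1000 random points.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
sim = X @ X.T
sim -= sim.min()  # shift to nonnegative so f is monotone
summary = greedy_summarize(sim, k=20)

In practice, lazy (accelerated) greedy evaluation, which reuses stale gains as upper bounds justified by submodularity, gives large speedups on big datasets; this is the kind of implementation technique such packages exploit.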
Bio: Rishabh Iyer is currently a Research Scientist at Microsoft, where he works on problems in online learning, contextual bandits, reinforcement learning, and discrete optimization, with applications to computational advertising and computer vision. During his time at Microsoft, several of his algorithms and innovations have shipped in the Bing Ads platform, substantially improving the system's efficiency and revenue. He completed his postdoc and PhD at the University of Washington, Seattle. His work has received best paper awards at ICML and NIPS. In 2014 he won the Microsoft PhD Fellowship and the Facebook PhD Fellowship (declined in favor of the Microsoft fellowship), along with the Yang Outstanding Doctoral Student Award from the University of Washington. He has been a visitor at Microsoft Research, Redmond and at Simon Fraser University.
He has worked on several aspects of machine learning, including discrete and convex optimization, deep learning, video and image summarization, diversified active learning, data subset selection, and online learning.