ABSTRACT: In many application settings, it is of interest to numerically optimize a system over a set of decision variables. When the system contains uncertainty, it is often the case that the objective function to be minimized, the constraints on the set of feasible decision variables, or both involve expectations of random variables. One possible approach to numerical optimization is to generate a random sample from which one can effectively create a global approximation to the objective function and constraints, and to numerically optimize the approximating surface using conventional optimization methods. This approach is known as "sample-average approximation" (SAA). An alternative is to randomly sample points in the space of feasible decision variables, to conduct simulations at these points, and to approximate the optimal value on the basis of the best sample mean observed at these points. This "random search" approach has properties quite different from those associated with SAA. In this talk, we will discuss and contrast the performance of these two families of algorithms, in the setting in which the sample size and/or the number of points sampled is large. Our discussion will point out the key role that smoothness of the objective function in the decision variable plays in these algorithms.
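The two families of algorithms described above can be illustrated on a toy stochastic optimization problem. The sketch below is an assumed illustration, not taken from the talk: it minimizes E[(x - Z)^2] over x in [-2, 2] with Z normally distributed, so the true minimizer is the mean of Z. The SAA routine fixes one random sample and minimizes the resulting deterministic sample-average objective (which here has a closed-form minimizer, the sample mean), while the random-search routine samples feasible points, simulates at each one, and returns the point with the best observed sample mean.

```python
import random
import statistics

random.seed(0)

# Toy stochastic program (hypothetical example, not from the talk):
# minimize over x in [-2, 2] the expectation E[(x - Z)^2], Z ~ N(0.5, 1).
# The true minimizer is x* = E[Z] = 0.5.
MU = 0.5

def sample_z(n):
    """Draw n simulated realizations of the random variable Z."""
    return [random.gauss(MU, 1.0) for _ in range(n)]

def saa(n=10_000):
    """Sample-average approximation: fix one random sample, then minimize
    the deterministic sample-average objective with a conventional method.
    For this quadratic objective, the minimizer of the sample average
    (1/n) * sum((x - z_i)**2) is simply the sample mean of the z_i."""
    zs = sample_z(n)
    return statistics.fmean(zs)

def random_search(num_points=200, sims_per_point=1_000):
    """Random search: sample points in the feasible set, run simulations
    at each point, and return the point with the smallest sample mean."""
    best_x, best_val = None, float("inf")
    for _ in range(num_points):
        x = random.uniform(-2.0, 2.0)
        zs = sample_z(sims_per_point)
        val = statistics.fmean((x - z) ** 2 for z in zs)
        if val < best_val:
            best_x, best_val = x, val
    return best_x

x_saa = saa()
x_rs = random_search()
print(x_saa, x_rs)  # both estimates should land near the true minimizer 0.5
```

Note the different sampling budgets the two methods spend: SAA concentrates its entire sample on one global surrogate surface, while random search spreads simulation effort across many candidate points and relies only on pointwise sample means, which is one reason their large-sample behavior differs.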
ABOUT THE SPEAKER: Peter W. Glynn also holds a courtesy appointment in the Department of Electrical Engineering. He was Director of Stanford's Institute for Computational and Mathematical Engineering from 2006 until 2010. He is a Fellow of INFORMS and of the Institute of Mathematical Statistics, and was co-winner of the 2010 John von Neumann Theory Prize from INFORMS. In 2012, he was elected to the National Academy of Engineering. His research interests lie in simulation, computational probability, queueing theory, statistical inference for stochastic processes, and stochastic modeling.