Incorporating Views in Mathematical Models: An Approach Based on Entropy

Speaker:
Santanu Dey, School of Technology and Computer Science, Tata Institute of Fundamental Research, Homi Bhabha Road
Date:
Tuesday, 9 Feb 2010 (all day)
Venue:
A-212 (STCS Seminar Room)
Abstract
A mathematical model based on historical data or general past experience may at times be an unsatisfactory model for the future. One way to arrive at a more accurate model is to explicitly incorporate views that are believed to better reflect the future. We address this issue by letting $\mu$ denote the original probability measure of a mathematical model and searching for a probability measure $\nu$ that minimizes a distance to $\mu$ while satisfying certain user-specified views or constraints. We consider the Kullback-Leibler (KL) distance as well as other $f$-divergences as measures of distance between probability measures. We show that under the KL distance, the optimization problem may lack a closed-form solution when the views involve fat-tailed distributions. This drawback can be corrected if a ``polynomial divergence'' is used instead. We also discuss the structure of the optimal solution under these divergences when the views include constraints on marginal probabilities associated with the original probability measure. We apply these results to portfolio optimization, where we note that the reasonable view that a particular portfolio of assets has heavy-tailed losses leads, through our approach, to a more realistic model.
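
As a rough sketch of the setup under the KL distance (the constraint functions $g_i$, targets $c_i$, and multipliers $\lambda_i$ below are illustrative notation, not taken from the talk), the problem with moment-type views and its standard exponential-tilting solution read:
$$
\min_{\nu}\; D_{\mathrm{KL}}(\nu \,\|\, \mu) = \int \log\frac{d\nu}{d\mu}\, d\nu
\quad \text{subject to} \quad \mathbb{E}_{\nu}\big[g_i(X)\big] = c_i, \quad i = 1,\dots,k,
$$
$$
\frac{d\nu^{*}}{d\mu}(x) \;=\; \frac{\exp\!\big(\sum_{i} \lambda_i g_i(x)\big)}{\mathbb{E}_{\mu}\!\big[\exp\!\big(\sum_{i} \lambda_i g_i(X)\big)\big]},
$$
with the $\lambda_i$ chosen so that the constraints are met. If $\mu$ is fat tailed and some $g_i$ grows, say, linearly, the normalizing expectation in the denominator can be infinite for every positive $\lambda_i$, which gives one way to see why a closed-form solution may fail to exist under KL and motivates the alternative divergences considered in the talk.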