University of California at Berkeley
Electrical Engineering and Computer Sciences
271 Cory Hall
Berkeley, CA 94720
United States of America
Abstract: Shannon's entropy power inequality characterizes the minimum differential entropy achievable by the sum of two independent random variables with fixed differential entropies.
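For concreteness, the inequality described above is commonly stated as follows (a standard formulation, assuming X and Y are independent real-valued random variables with densities and h denotes differential entropy):

```latex
% Shannon's entropy power inequality (scalar form):
% for independent real-valued random variables X and Y with densities,
\[
  e^{2 h(X+Y)} \;\geq\; e^{2 h(X)} + e^{2 h(Y)},
\]
% with equality when X and Y are Gaussian.
```

Fixing h(X) and h(Y) and minimizing h(X+Y) over all such pairs recovers the "minimum differential entropy" phrasing in the abstract: the Gaussian pair achieves the minimum.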
Since Shannon's pioneering work, there has been a steady stream of results developing parallels to his entropy power inequality in other settings, such as discrete random variables and point processes.
This talk will survey that landscape, presenting both old and new results and offering some speculation about how new kinds of entropy power inequalities might be proved.