Abstract: Regularity is a notion of “pseudorandomness” that allows one to decompose a given object into a collection of simpler objects which appear random according to certain statistics. The famous regularity lemma of Szemerédi [Sze75, Sze78] says that any dense graph can be partitioned into a bounded number of “pseudorandom” bipartite graphs. The Szemerédi regularity lemma has numerous applications in combinatorics and property testing.
In a sequence of developments stemming from Gowers' proof of Szemerédi's theorem, Green and Tao introduced a notion of regularity for collections of polynomials. Variants of these ideas were famously used by Green and Tao to prove that the primes contain arbitrarily long arithmetic progressions. Over finite fields, the theory extends concepts previously used in theoretical computer science, such as small-bias random variables and Fourier analysis over finite fields.
In this talk, we will survey recent developments in the area, loosely termed "higher-order Fourier analysis". In joint work with Fischer, H. Hatami, P. Hatami, and Lovett, we developed certain bounds for exponential sums of polynomials [BFL13, BFH+13] that are useful in property testing. These results used a non-constructive form of the regularity lemma. Subsequently, in joint work with P. Hatami and Tulsiani [BHT13], we devised an algorithmic regularity lemma for polynomials. This can be used [B14] to give new polynomial-time algorithms for problems such as factoring low-degree multivariate polynomials over prime-order fields and deciding, for fixed $p$, $d$, and $r$, whether a given $d$-dimensional tensor over $F_p$ has rank at most $r$.
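For context, the exponential sums in question measure the bias of a polynomial; the following display is a standard definition from the higher-order Fourier analysis literature (not spelled out in the abstract itself):
$$
\mathrm{bias}(P) \;=\; \Bigl|\;\mathbb{E}_{x \in F_p^n}\bigl[\omega^{P(x)}\bigr]\Bigr|,
\qquad \omega = e^{2\pi i / p},
$$
where $P \colon F_p^n \to F_p$ is a polynomial of degree $d$. Informally, a collection of polynomials is regular when every nontrivial $F_p$-linear combination of its members has small bias, which is what makes the collection behave pseudorandomly in the statistics relevant to property testing.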