Communication Complexity and Quantum Machine Learning via Query Complexity

Speaker: 

Time: 

Monday, 24 August 2020, 10:30 to 11:30

Venue: 

  • Zoom link for the meeting: https://zoom.us/j/96554737206?pwd=RzBsVE56TnV2dTZ0dlNmaFp5MFRzZz09

Query algorithms are everywhere. Gradient Descent is a well-known algorithm that queries the gradient of a hidden function and moves towards the minimum of that function. Shor's factoring algorithm is based on a quantum query algorithm for period finding. Many interesting computational tasks are indeed query tasks. However, the study of the query complexity of total functions also yields interesting mathematical insights, such as the fact that the degree of a total function is polynomially related to its query complexity.
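To make the query-algorithm viewpoint concrete, here is a minimal sketch of gradient descent interacting only with a gradient oracle. The specific hidden function f(x) = (x - 3)^2 and all names below are illustrative choices, not taken from the thesis.

```python
# Gradient descent as a query algorithm: the algorithm never sees f itself,
# only the answers returned by a gradient oracle.

def grad_oracle(x):
    # Gradient of the hidden function f(x) = (x - 3)^2 (a hypothetical example).
    return 2 * (x - 3)

def gradient_descent(grad, x0, step=0.1, n_queries=100):
    x = x0
    for _ in range(n_queries):
        x -= step * grad(x)  # one oracle query per iteration
    return x

x_min = gradient_descent(grad_oracle, x0=0.0)
```

Each iteration costs exactly one query, so the number of iterations needed to reach a given accuracy is precisely the query complexity of the algorithm.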

In this thesis we show a few related results.

1. In the context of randomized communication complexity and randomized parity decision tree (RPDT) complexity, we prove that the analogues of the mathematical insight stated above do not hold. This disproves multiple long-standing conjectures in communication complexity.
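For reference, one standard concrete form of the polynomial relation alluded to above is the following known bound for total Boolean functions, stated here as background (the cubic exponent is a known upper bound, not a result of this thesis):

```latex
% For every total Boolean function f : \{0,1\}^n \to \{0,1\},
% the degree and the deterministic query complexity D(f) satisfy
\[
  \deg(f) \;\le\; D(f) \;\le\; O\!\left(\deg(f)^{3}\right).
\]
```

The results in item 1 show that no analogous polynomial relation holds in the randomized communication and RPDT settings.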

2. Our results above originated in the RPDT world and were then translated to the communication world. However, we also have stronger results in the RPDT world that we have not yet been able to translate. We pose a fundamental conjecture that would imply that the stronger results hold in the communication world as well.

3. We analyze the query complexity of minimizing a convex function in a bounded region when given access to function value and gradient oracles. Projected Gradient Descent is known to be optimal for deterministic and randomized query algorithms (although the randomized lower bound is for a limited range of parameters). We provide a randomized lower bound that works for the optimal range of parameters. We also prove that projected gradient descent is optimal among quantum query algorithms.