Speaker:
Suhail Sherif

Organiser:
Arkadev Chattopadhyay

Date:
Monday, 17 May 2021, 17:00 to 18:00

Venue:
Zoom (meeting details below)


In the first part of the thesis, we study lower bounds in communication and query complexity.

1. In the context of randomized communication complexity and randomized parity decision tree (RPDT) complexity, we prove that the relevant computational measures and the corresponding algebraic measures are not closely related. This disproves multiple long-standing conjectures in communication complexity.
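For readers unfamiliar with the model: a parity decision tree may query the XOR (parity) of any subset of input bits, rather than one bit at a time. The following minimal sketch is illustrative only and is not taken from the thesis; the function names are ours. It shows why the model can be much stronger than an ordinary decision tree: computing the XOR of n bits takes n single-bit queries but just one parity query.

```python
def parity_query(x, S):
    # One parity query: XOR of the input bits indexed by S.
    acc = 0
    for i in S:
        acc ^= x[i]
    return acc

def xor_of_all_bits(x):
    # A depth-1 parity decision tree for XOR_n: a single query
    # over all coordinates suffices, versus n single-bit queries
    # for an ordinary decision tree.
    return parity_query(x, range(len(x)))
```

A randomized parity decision tree additionally tosses coins to choose which parities to query, and its complexity is the worst-case number of such queries.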

2. We then try to quantitatively strengthen the above result. We succeed in the context of RPDTs, but translating this strengthening to randomized communication complexity presents challenges. We pose a fundamental conjecture that would imply that the strengthening holds in the communication setting as well.

In the second part of the thesis, we look at convex optimization. The goal is to optimize a convex function over a bounded region when the function can only be accessed through function-value and gradient queries, using as few queries as possible. Projected gradient descent (PGD) is a well-known algorithm that solves this task optimally when the dimension of the function's domain is large. In this part of the thesis we add to the optimality results for PGD.
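As background (this sketch is ours, not a result from the thesis), PGD repeatedly takes a gradient step and projects the iterate back into the feasible region; each iteration spends one gradient query. The step size, iteration count, and test function below are illustrative choices:

```python
import numpy as np

def project_to_ball(x, radius=1.0):
    # Euclidean projection onto the ball of the given radius.
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def projected_gradient_descent(grad, x0, steps=200, eta=0.01, radius=1.0):
    # Each iteration makes one gradient query, then projects the
    # iterate back into the feasible region.
    x = project_to_ball(np.asarray(x0, dtype=float), radius)
    for _ in range(steps):
        x = project_to_ball(x - eta * grad(x), radius)
    return x

# Toy example: minimize f(x) = ||x - c||^2 over the unit ball, with c
# outside the ball; the constrained optimum is c / ||c||.
c = np.array([2.0, 0.0])
x_star = projected_gradient_descent(lambda x: 2 * (x - c), np.zeros(2))
```

The query-complexity question the thesis addresses is how many such gradient (or function-value) queries any algorithm must spend, not just this particular algorithm.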

3. (a) For quantum algorithms, it was open whether a quadratic speedup over PGD is possible when the dimension is large. We show that no such speedup exists: PGD is optimal up to constant factors even against quantum algorithms.

(b) For randomized algorithms, we give a simple argument that extends the range of dimensions in which PGD is known to be optimal up to constant factors.

We end with a few open problems.

Join Zoom Meeting

https://zoom.us/j/93580606744?pwd=L1pCek1pSy91a0JOOXpXdjI0Z2EyQT09

Meeting ID: 935 8060 6744

Passcode: 335095