Abstract: In a coordination problem, users in a network observing correlated inputs collaborate to evaluate possibly randomized functions of the inputs. Problems of this kind have been widely studied in the information theory literature, and they find applications in several diverse areas such as parallel processing, cooperative game theory, distributed control, and function computation in networks. Many works in the literature have focused on the amount of communication needed to achieve coordination. In practice, several other aspects are also of interest, such as the amount and form of shared/correlated randomness available, the topology of the network, and security.
In this talk, we present a systematic study of several such aspects, namely shared randomness, security, and interaction, in addition to the amount of communication needed. To this end, the first problem we study is a distributed sampling problem in which a set of processors want to output correlated sequences of random variables with the help of a coordinator; the coordinator has access to several independent sources of randomness, and each processor has access to a subset of these sources. We characterize optimal communication and/or shared randomness rates in various cases of this setting. In the second problem, we study interactive secure function computation. The privacy requirement is that the communication should not reveal to either user any extra information about the other user's input and output beyond what can be inferred from the user's own input and output. We give single-letter expressions for the asymptotic rate regions. Further, we analyse the roles of common randomness and interaction. In secure function computation, in practical scenarios, it might be reasonable to give away a certain amount of information about the inputs but not about some specific functions of the inputs which the users want to keep private. We study such settings as well.
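To build intuition for the distributed sampling setting, the following toy sketch shows how processors that share only some of a coordinator's independent randomness sources can produce correlated outputs. This is a hypothetical illustration, not one of the protocols from the talk; all function names are made up, and XOR combining stands in for a generic processing rule.

```python
import random

def coordinator(num_sources=3, length=8):
    # The coordinator holds several independent sources of randomness,
    # modeled here as independent uniform bit strings.
    return [[random.randint(0, 1) for _ in range(length)]
            for _ in range(num_sources)]

def processor_output(sources, subset):
    # Each processor sees only a subset of the sources and outputs
    # the bitwise XOR of the sources it can access (a toy rule).
    out = [0] * len(sources[0])
    for i in subset:
        out = [a ^ b for a, b in zip(out, sources[i])]
    return out

random.seed(0)
srcs = coordinator()
# Two processors sharing source 0 produce correlated outputs:
# the difference (XOR) of their outputs depends only on the
# sources they do NOT share, so the shared source correlates them.
p1 = processor_output(srcs, {0, 1})
p2 = processor_output(srcs, {0, 2})
diff = [a ^ b for a, b in zip(p1, p2)]
assert diff == [a ^ b for a, b in zip(srcs[1], srcs[2])]
```

The point of the sketch is only that which subsets of the randomness the processors can access determines how correlated their outputs can be; quantifying the optimal communication and shared randomness rates for a target correlation is what the first problem addresses.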