Abstract - The goal of information theory is to understand the limits of data compression and of communication in the presence of noise. In the information theory literature it is traditional to study the underlying problems in the asymptotic setting, often assuming that the channel characteristics do not change over multiple uses. The proofs appeal to typicality of sequences or subspaces: the empirical distribution of symbols in a long sequence of trials will, with high probability, be close to the true distribution. However, information theoretic arguments based on typicality assume that the source and channel are stationary and/or ergodic (e.g., memoryless), assumptions that are not always valid.
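The typicality phenomenon invoked above can be illustrated numerically. The following is a small sketch (not part of the thesis itself): it draws i.i.d. samples from a hypothetical three-symbol distribution and measures the total variation distance between the empirical and true distributions, which shrinks roughly like $1/\sqrt{n}$.

```python
import random
from collections import Counter

def empirical_tv_distance(p, n, seed=0):
    """Draw n i.i.d. symbols from the distribution p (a dict symbol -> prob)
    and return the total variation distance between the empirical
    distribution of the sample and the true distribution p."""
    rng = random.Random(seed)
    symbols = list(p)
    weights = [p[s] for s in symbols]
    sample = rng.choices(symbols, weights=weights, k=n)
    counts = Counter(sample)
    return 0.5 * sum(abs(counts[s] / n - p[s]) for s in symbols)

# A hypothetical source distribution, chosen only for illustration.
p = {"a": 0.5, "b": 0.3, "c": 0.2}
for n in (100, 10_000, 1_000_000):
    print(n, empirical_tv_distance(p, n))
```

For long sequences the empirical distribution concentrates around the true one; this concentration is exactly what fails when the source or channel is not stationary or ergodic, motivating the one-shot approach below.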
In this thesis we study several information theoretic protocols in the one-shot setting, where the channel may be used only once and only a small probability of error is allowed. Such results are more general, for one can always view a channel used repeatedly as a single channel with a larger alphabet, and recover the asymptotic bounds as a special case. The general one-shot results, while sometimes technically harder to prove (since arguments based on typicality are no longer available), are often strong enough that nothing is lost in deriving the asymptotic results from them. We discuss the following contributions in this thesis:
1) One-shot bounds on the coding rates for source coding of correlated sources.
2) One-shot bounds on the reliability of information transmission over a quantum channel.
3) One-shot Marton inner bound for the classical-quantum broadcast channel.
4) One-shot bounds on the private capacity of a quantum channel.
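The reduction from the asymptotic to the one-shot setting mentioned above can be sketched as follows (a standard observation, stated here for illustration). A channel $W$ with input alphabet $\mathcal{X}$ and output alphabet $\mathcal{Y}$, used $n$ times, is itself a single channel $W^{(n)}$ with the larger alphabets $\mathcal{X}^n$ and $\mathcal{Y}^n$; in the memoryless special case it factorizes:
\[
W^{(n)}(y^n \mid x^n) \;=\; \prod_{i=1}^{n} W(y_i \mid x_i).
\]
Applying a one-shot bound to $W^{(n)}$ and normalizing by $n$ then recovers the familiar asymptotic rates as $n \to \infty$, while the one-shot bound itself requires no stationarity or ergodicity assumption on $W^{(n)}$.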