Great Ideas in Coding Theory: From Theory to Practice

Lalitha Vadlamani
Umang Bhaskar
Thursday, 24 Jun 2021, 18:00 to 19:00
In 1948, Claude Shannon wrote his landmark paper "A Mathematical Theory of Communication", which paved the way for the field of information theory. He introduced the term entropy as a measure of information and determined the fundamental limits of source compression and of communication over a noisy channel (the channel capacity). In his work, he proved the existence of codes that achieve channel capacity. In 1950, Richard Hamming discovered the first error-correcting code, which corrects a single error. In the decades that followed, coding theorists worked toward designing codes that achieve channel capacity. Low-density parity-check (LDPC) codes, invented by Gallager in the 1960s and rediscovered in the 1990s, proved to be very effective owing to their low decoding complexity at large block lengths. In 2009, Erdal Arikan proposed polar codes, the first class of codes provably capacity-achieving for a class of binary-input discrete memoryless channels. Coding theory has several applications beyond channel coding, including storage devices and distributed storage systems. Reed-Solomon and BCH codes have conventionally been used in storage devices. More recently, in 2012, locally repairable codes (LRCs) were introduced by researchers at Microsoft to address the repair problem in distributed storage systems.
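As a concrete illustration of the single-error-correcting code the abstract mentions, here is a minimal sketch of the standard (7,4) Hamming code. The generator and parity-check matrices below are the usual textbook choices (not taken from the talk), and all arithmetic is over GF(2).

```python
import numpy as np

# Systematic generator matrix G (4 data bits -> 7-bit codeword)
# and the matching parity-check matrix H for the (7,4) Hamming code.
G = np.array([
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
])
H = np.array([
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
])

def encode(data):
    """Encode 4 data bits into a 7-bit codeword."""
    return (np.array(data) @ G) % 2

def decode(received):
    """Correct up to one flipped bit and return the 4 data bits."""
    r = np.array(received).copy()
    syndrome = (H @ r) % 2
    if syndrome.any():
        # A single-bit error yields a nonzero syndrome equal to the
        # column of H at the flipped position, so we can locate and
        # undo the error.
        for i in range(7):
            if np.array_equal(H[:, i], syndrome):
                r[i] ^= 1
                break
    return r[:4]  # systematic code: the first 4 bits are the data
```

For example, encoding `[1, 0, 1, 1]`, flipping any single bit of the resulting codeword, and decoding recovers the original data bits.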
YouTube link: