Nexus of Information and Computation Theories IHP Spring 2016 Thematic Program
Scientific Background
From its genesis in Claude Shannon's seminal paper [10], information theory has provided the foundation for a powerful theory of communication. Most early efforts in information theory focused on developing theoretical bounds for the reliable communication of a single message or the efficient compression of a single data source. Typical questions the theory tried to answer were 'How much information can we store on a given support?' or 'How much information can we transmit across a given channel?'. Over time, research efforts have turned towards a broader class of problems, many of which involve communication within a network of users. In turn, these problems lead to an intrinsic coupling of the issues of computation and communication. For instance, a widely studied question is 'How much information can be supported by a given network?'. A network is typically composed of nodes that can not only pass on their received information but also process it, i.e., compute a function of the incoming data, before transmission. Thus, in addition to the classical error-correction problem of transmitting a message reliably from point A to point B, optimal network communication strategies also involve finding optimal data processing (computation) at the relay nodes.
In parallel, research on the theory of computation has aimed at establishing bounds on a variety of computational problems and tasks. A variety of mathematical tools have been developed to classify computational problems into complexity classes such as \(\mathbf{P}\) and \(\mathbf{NP}\), and to understand the relationships between these problems [2]. In addition, various restricted models of computation, such as the streaming model or the cell-probe model, have been developed to capture diverse aspects of practical computation. While most conjectures about the computational hardness of problems remain unanswered today, hinging on big open problems such as \(\mathbf{P} \stackrel{?}{=} \mathbf{NP}\), results in these restricted models of computation can often be proved unconditionally. One of the key technical tools in these proofs is that of communication complexity [6], which is closely related to problems studied in information theory.
We now appear to be in the midst of a partial convergence between the theories of information and computation, both in terms of problems and of analytical techniques. There is now a unique opportunity to bridge these two communities, encourage the sharing of ideas, and equip the next generation of researchers with tools to attack problems in both fields. Below, we discuss recent progress and outstanding challenges in some core topics, around which our thematic program is structured.
Computation appears in several interesting contexts within the broad problem of communication over networks. One natural question is how many bits must be exchanged by two (or more) users to compute a given function of their data. This problem is known as communication complexity in the computer science literature and as distributed functional source coding in the information theory literature [8], and it remains open in general. Its study involves quantities such as graph entropy (which also plays an important role in zero-error communication). It should be noted that the various formulations of this problem ignore complexity issues at both the transmitters and the receivers. Including a complexity component in these models would help in understanding the tradeoff between the performance and the complexity of communication protocols.
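To convey the flavor of communication complexity, here is a minimal sketch in Python of the classic randomized fingerprinting protocol for the EQUALITY function (names such as `equality_protocol` are our own illustrative choices, not taken from the cited literature): two parties holding \(n\)-bit strings can test equality with \(O(\log n)\) bits of communication, whereas any deterministic protocol needs \(n\) bits.

```python
import random

def random_prime(lo: int, hi: int) -> int:
    """Pick a random prime in [lo, hi) by trial division (fine at this scale)."""
    while True:
        p = random.randrange(lo, hi)
        if p > 1 and all(p % d for d in range(2, int(p ** 0.5) + 1)):
            return p

def equality_protocol(x: str, y: str) -> tuple[bool, int]:
    """Randomized protocol for EQUALITY on n-bit strings x (Alice) and y (Bob).

    Alice picks a random prime p of size about n^2 and sends Bob the pair
    (p, int(x) mod p); Bob compares it with int(y) mod p. If x != y, their
    difference has fewer than n prime factors, so the fingerprints collide
    with probability O((log n)/n). Returns (answer, bits exchanged).
    """
    n = len(x)
    p = random_prime(n ** 2, 2 * n ** 2)     # Bertrand: a prime exists here
    fingerprint = int(x, 2) % p              # Alice's message to Bob
    answer = fingerprint == int(y, 2) % p    # Bob's local check
    return answer, 2 * p.bit_length()        # cost of describing (p, fingerprint)

x = "1011" * 64                              # n = 256 bits
print(equality_protocol(x, x))               # (True, ~34) vs. 256 bits deterministically
```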
An intriguing recent discovery is the implicit role of computation in multi-user communication, i.e., network coding. As first shown by Ahlswede et al. [1], intermediate nodes in a network must send functions of their received messages in order to approach the capacity limits set out by the max-flow min-cut theorem. This insight continues to hold in the wireless context, especially when the interference between simultaneous transmissions is a major bottleneck. Very recently, it has been shown that interference can itself be harnessed for communication and computation, through the use of certain algebraic structures, such as nested lattices [12] and common invariant subspaces [5], as well as ideas from Diophantine approximation [7].
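The butterfly network from [1] illustrates this concretely. The sketch below (a toy simulation of ours, not code from the reference) shows that letting the bottleneck node forward the XOR of its two incoming bits allows both sinks to decode both source bits simultaneously, matching the cut-set bound, whereas pure routing cannot.

```python
def butterfly(b1: int, b2: int) -> tuple[tuple[int, int], tuple[int, int]]:
    """Butterfly network with unit-capacity edges: sources hold bits b1, b2.

    Each sink hears one bit directly (t1 hears b1, t2 hears b2) plus whatever
    the single bottleneck edge carries. Routing can serve only one sink per
    use of that edge; coding serves both at once.
    """
    coded = b1 ^ b2             # the relay *computes* a function of its inputs
    t1 = (b1, coded ^ b1)       # t1 recovers b2 = (b1 XOR b2) XOR b1
    t2 = (coded ^ b2, b2)       # t2 recovers b1 = (b1 XOR b2) XOR b2
    return t1, t2

for b1 in (0, 1):
    for b2 in (0, 1):
        assert butterfly(b1, b2) == ((b1, b2), (b1, b2))
print("both sinks recover both source bits in every case")
```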
A great deal of effort has gone into developing efficient algorithms that can attain the performance promised by information-theoretic capacity bounds. The state-of-the-art in this endeavor is the class of ‘codes on graphs’ (e.g., low-density parity-check codes) coupled with message-passing algorithms (e.g., belief propagation) [3, 9]. Part of the analysis of these algorithms uses ideas from statistical physics (e.g., Ising models) and distributed computation (e.g., gossip algorithms).
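As a minimal sketch of the message-passing flavor of these decoders (using a toy (7,4) Hamming parity-check matrix of our own choosing, not the long LDPC codes analyzed in [3, 9]), the following implements Gallager-style bit-flipping: each iteration flips the bit participating in the most unsatisfied parity checks. Belief propagation refines this idea by exchanging soft messages between bit and check nodes on the code's Tanner graph.

```python
import numpy as np

# Toy (7,4) Hamming parity-check matrix, chosen purely for illustration.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def bit_flip_decode(y: np.ndarray, max_iters: int = 20) -> np.ndarray:
    """Hard-decision bit-flipping: flip the bit touching the most failed checks.

    Corrects the single-bit error injected below; at such short block lengths
    bit-flipping carries no general guarantee, but on long sparse codes this
    style of iterative decoding can approach capacity.
    """
    c = y.copy()
    for _ in range(max_iters):
        syndrome = (H @ c) % 2         # which parity checks are violated?
        if not syndrome.any():
            return c                   # all checks satisfied: done
        votes = syndrome @ H           # per bit: number of failed checks it touches
        c[int(np.argmax(votes))] ^= 1  # flip the most-suspect bit
    return c

codeword = np.array([1, 0, 1, 1, 0, 1, 0])  # satisfies H @ c = 0 (mod 2)
received = codeword.copy()
received[3] ^= 1                            # inject one bit error
assert (bit_flip_decode(received) == codeword).all()
print("single-bit error corrected")
```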
Within the theory of computation, much success has been attained in proving unconditional tight bounds on the complexity of problems within restricted models of computation. One such example is the streaming model, which captures scenarios where the amount of data greatly exceeds the available memory, as in Internet routers. In the streaming model, the computer gets only a single pass over the data before computing the output, and it does not have sufficient memory to store the entire input. Other examples include the cell-probe model for the complexity of implementing abstract data structures, the layout model for VLSI circuit design, and bounded-depth circuits, which model highly parallel computation. Several technical tools have been developed for proving unconditional lower bounds. One of the most notable is communication complexity [6], which measures the amount of communication needed to compute a function whose inputs are distributed over a number of players. Lower bounds in communication complexity imply lower bounds in the aforementioned models of computation. Information-theoretic tools are instrumental in understanding communication complexity and in proving upper and lower bounds. Information theory has also found several other important applications in theoretical computer science, perhaps most notably in combinatorics.
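As a minimal illustration of the streaming model (the example is a standard textbook one, chosen by us rather than drawn from [6]), the classic Boyer-Moore majority vote finds the majority element of a stream, if one exists, in a single pass using a constant number of words of memory:

```python
from typing import Iterable, Optional, TypeVar

T = TypeVar("T")

def majority_vote(stream: Iterable[T]) -> Optional[T]:
    """One-pass, O(1)-memory candidate for the stream's majority element.

    Maintains a candidate and a counter; matching items add a vote and
    differing items cancel one. The output is correct whenever some element
    occupies a strict majority of the stream positions.
    """
    candidate, count = None, 0
    for item in stream:          # each item is seen once and then discarded
        if count == 0:
            candidate, count = item, 1
        elif item == candidate:
            count += 1
        else:
            count -= 1
    return candidate

print(majority_vote("abacaca"))  # 'a' holds 4 of 7 positions -> prints 'a'
```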
Privacy and secrecy have garnered significant recent interest from both the information theory and computer science communities. While the first formal results can be found in Shannon's paper on secrecy [11], much of the later groundwork was laid by cryptographers within the computer science community using a different toolkit [4]. More recently, information-theoretic tools have re-emerged as a key method for proving secrecy bounds. Concurrently, the basic results of information-theoretic secrecy have been generalized to several interesting new problems, including secure multi-party computation and secret key generation.
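Shannon's perfect-secrecy result [11] is captured by the one-time pad, sketched minimally below: with a uniformly random key as long as the message and never reused, the ciphertext is statistically independent of the message, and Shannon showed that perfect secrecy in fact requires at least this much key.

```python
import secrets

def otp_encrypt(message: bytes, key: bytes) -> bytes:
    """One-time pad: XOR the message with a fresh uniform key of equal length."""
    assert len(key) == len(message), "key must match message length, used once"
    return bytes(m ^ k for m, k in zip(message, key))

otp_decrypt = otp_encrypt              # XOR is its own inverse

msg = b"attack at dawn"
key = secrets.token_bytes(len(msg))    # uniform key, same length, never reused
ct = otp_encrypt(msg, key)
assert otp_decrypt(ct, key) == msg     # the legitimate receiver recovers msg
print(ct.hex())                        # to an eavesdropper, ct is uniform noise
```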
Overall, the time is ripe to bring these two communities together to share mathematical tools, develop a shared set of open problems and conceptual challenges, and break down the notational and language barriers that separate research topics.
References
[1] R. Ahlswede, N. Cai, S.-Y. R. Li, and R. W. Yeung, Network information flow, IEEE Transactions on Information Theory 46 (2000), no. 4, pp. 1204–1216.
[2] S. Arora and B. Barak, Computational complexity: A modern approach, Cambridge University Press, 2009.
[3] R. G. Gallager, Low-density parity-check codes, MIT Press, Cambridge, MA, 1963.
[4] O. Goldreich, Foundations of cryptography, vols. 1 and 2, Cambridge University Press, Cambridge, UK, 2004.
[5] S. A. Jafar, Interference alignment – a new look at signal dimensions in a communication network, Foundations and Trends in Communications and Information Theory, vol. 7, NOW Publishers, 2011, pp. 1–134.
[6] E. Kushilevitz and N. Nisan, Communication complexity, Cambridge University Press, 1997.
[7] A. S. Motahari, S. O. Gharan, M.-A. Maddah-Ali, and A. K. Khandani, Real interference alignment: Exploiting the potential of single antenna systems, IEEE Transactions on Information Theory 60 (2014), no. 8, pp. 4799–4810.
[8] A. Orlitsky and J. R. Roche, Coding for computing, IEEE Transactions on Information Theory 47 (2001), no. 3, pp. 903–917.
[9] T. Richardson and R. Urbanke, Modern coding theory, Cambridge University Press, 2008.
[10] C. E. Shannon, A mathematical theory of communication, Bell System Technical Journal 27 (1948), pp. 379–423.
[11] C. E. Shannon, Communication theory of secrecy systems, Bell System Technical Journal 28 (1949), no. 4, pp. 656–715.
[12] R. Zamir, Lattices are everywhere, Proceedings of the 4th Annual Workshop on Information Theory and its Applications (ITA 2009), La Jolla, CA, February 2009.