Note: This outline is a work in progress. I will be adjusting topics between now and the time of the workshop, and would appreciate feedback.
As for prerequisites: I plan to develop material ``from scratch'' but of course it will be helpful to have some prior exposure. For information theory, I suggest reviewing the basic definitions and properties of entropy, conditional entropy, mutual information, Kullback-Leibler divergence, and Shannon's noiseless coding theorem. There are many sources for this material; one good one is Chapter 2 of the book "Elements of Information Theory" by Cover and Thomas. It would also be good to have some familiarity with communication complexity (which I expect most participants have), such as the first three chapters of the Kushilevitz-Nisan book "Communication Complexity".
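For quick reference, the basic quantities mentioned above can be stated as follows (discrete case, logarithms base 2, notation as in Cover and Thomas):

```latex
\begin{align*}
H(X) &= -\sum_{x} p(x)\,\log_2 p(x), \\
H(X \mid Y) &= -\sum_{x,y} p(x,y)\,\log_2 p(x \mid y), \\
I(X;Y) &= H(X) - H(X \mid Y) \;=\; H(X) + H(Y) - H(X,Y), \\
D(p \,\|\, q) &= \sum_{x} p(x)\,\log_2 \frac{p(x)}{q(x)}.
\end{align*}
```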
For those ambitious enough to read some of the papers below in advance of the workshop, I have marked with a (**) those papers I think would be most useful to read in advance.
I. Communication complexity
- Lower bounds for communication complexity and streaming:
- (**) Z. Bar-Yossef, T.S. Jayram, R. Kumar, D. Sivakumar, An information statistics approach to data stream and communication complexity, JCSS 68 (2004) 702-732.
- Z. Bar-Yossef, T.S. Jayram, R. Kumar, and D. Sivakumar, Information theory methods in communication complexity.
- T.S. Jayram, Hellinger strikes back: A note on the multi-party information complexity of AND.
- T.S. Jayram and D.P. Woodruff, The data stream space complexity of cascaded norms
- A. Andoni, T.S. Jayram, M. Patrascu, Lower Bounds for Edit Distance and Product Metrics via Poincaré-Type Inequalities, SODA 2010, 184-192.
- N. Leonardos, M. Saks, Lower Bounds on the Randomized Communication Complexity of Read-Once Functions. Computational Complexity 19(2010), 153-181.
- Compressing interactive computation
- (**) B. Barak, M. Braverman, X. Chen and A. Rao, How to compress interactive communication.
- M. Braverman and A. Rao, Information equals amortized computation.
- Direct product theorems for communication complexity
- B. Barak, M. Braverman, X. Chen and A. Rao, How to compress interactive communication.
- R. Jain, Strong direct product conjecture holds for all relations in public coin randomized one-way communication complexity.
- Noisy broadcast
- N. Goyal, G. Kindler, M. Saks, Lower Bounds for the Noisy Broadcast Problem, SIAM J. Comput. 37 (2008), 1806-1841.
II. Data structure lower bounds
- A. Chakrabarti, O. Regev, An optimal randomised cell probe lower bound for approximate nearest neighbor searching.
- (**) M. Patrascu, Unifying the landscape of cell-probe lower bounds.
III. Sample complexity of approximation algorithms
- (**) Z. Bar-Yossef, R. Kumar, D. Sivakumar, Sampling algorithms: Lower bounds and algorithms.
- Z. Bar-Yossef, Sampling lower bounds via information theory.
IV. Graph Entropy
- Basic notions and alternative formulations; connection to perfect graphs
- (**) G. Simonyi, Graph Entropy: A survey, DIMACS Series in Discrete Mathematics and Theoretical Computer Science.
- I. Csiszár, J. Körner, L. Lovász, K. Marton, and G. Simonyi, Entropy Splitting for Antiblocking Corners and Perfect Graphs, Combinatorica 10 (1) (1990) 27-40.
- J. Körner, Coding of an information source having ambiguous alphabet and the entropy of graphs, in Transactions of the 6th Prague Conference on Information Theory, etc., Academia, Prague, 1973, 411-425.
- N. Alon and A. Orlitsky, Source coding and graph entropies,
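As orientation for this section, Körner's graph entropy (the central notion of Simonyi's survey above) admits the following formulation: for a graph G with a probability distribution P on its vertices,

```latex
H(G,P) \;=\; \min_{\substack{X \sim P,\ X \in Y \\ Y \text{ an independent set of } G}} I(X;Y),
```

where the minimum is over joint distributions of pairs (X,Y) in which X is a vertex with distribution P and Y is an independent set of G containing X. The property driving the perfect hashing and formula-size applications below is subadditivity: for graphs F and G on the same vertex set, H(F ∪ G, P) ≤ H(F,P) + H(G,P).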
- Perfect hashing families, formula size and circuit size lower bounds
- G. Simonyi, Graph Entropy: A survey, DIMACS Series in Discrete Mathematics and Theoretical Computer Science.
- J. Körner, Fredman-Komlós bounds and information theory, SIAM J. on Algebraic and Discrete Methods 7 (1986) 560-570.
- I. Newman, P. Ragde, and A. Wigderson, Perfect Hashing, Graph Entropy, and Circuit Complexity, Structure in Complexity Theory Conference 1990, 91-99.
- J. Radhakrishnan, Better Lower Bounds for Monotone Threshold Formulas, J. Comput. Syst. Sci. 54(2) (1997) 221-226.
V. Comparison searching and sorting
- J. Kahn and J.H. Kim, Entropy and sorting, J. Comput. Syst. Sci. 51 (1995) 390-399.
- J. Cardinal, S. Fiorini, G. Joret, R.M. Jungers and J.I. Munro, An efficient algorithm for partial order production, SIAM J. Computing 39 (2010) 2927-2940.
- J. Cardinal, S. Fiorini, G. Joret, R. M. Jungers, and J. I. Munro. Sorting under Partial Information (without the Ellipsoid Algorithm). In Proc. ACM Symposium on Theory of Computing (STOC), pages 359-368, 2010. ACM.
- N. Linial and M. Saks, Every poset has a central element, J. Combin. Theory Ser. A 40 (1985) 195-210. (Note: this paper has no explicit information theory; I plan to present an unpublished information-theoretic version of the proof suggested by James Shearer.)
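As background for the entropy approach in these papers: a comparison sort of n distinct keys must distinguish all n! orderings, and each comparison supplies at most one bit of information, so in the worst case it needs at least

```latex
\lceil \log_2 n! \rceil \;=\; n\log_2 n - O(n)
```

comparisons. Sorting under partial information replaces log_2 n! by log_2 e(P), where e(P) is the number of linear extensions of the known partial order P, and this is where graph entropy enters.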
VI. Other applications in (discrete) mathematics
- Inequalities via information theory: Cauchy-Schwarz, Hölder, Bregman, Shearer, hypercontractive inequalities, concentration bounds
- E. Friedgut, Hypergraphs, Entropy and Inequalities, The American Mathematical Monthly 111 (2004), no. 9, 749-760.
- E. Friedgut and V. Rödl, Proof of a Hypercontractive Estimate via Entropy, Israel J. Math. 125 (2001), 369-380.
- J. Radhakrishnan, Entropy and Counting
- R. Impagliazzo, V. Kabanets, Constructive proofs of concentration bounds
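To make Shearer's inequality concrete, here is a small self-contained numerical check (my own illustrative sketch, not code from the papers above). With three bits whose coordinates are each covered twice by the three pairs {1,2}, {2,3}, {1,3}, Shearer's lemma asserts 2·H(X1,X2,X3) ≤ H(X1,X2) + H(X2,X3) + H(X1,X3):

```python
import math
from itertools import product

def entropy(dist):
    """Shannon entropy (bits) of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(joint, coords):
    """Marginal of the given coordinates of a joint distribution over tuples."""
    out = {}
    for x, p in joint.items():
        key = tuple(x[i] for i in coords)
        out[key] = out.get(key, 0.0) + p
    return out

# Joint distribution of three bits: uniform on the four even-parity strings.
joint = {x: 0.25 for x in product((0, 1), repeat=3) if sum(x) % 2 == 0}

# Each coordinate lies in exactly two of the three pair sets, so k = 2.
lhs = 2 * entropy(joint)
rhs = sum(entropy(marginal(joint, S)) for S in [(0, 1), (1, 2), (0, 2)])
print(lhs, rhs)  # → 4.0 6.0; Shearer guarantees lhs <= rhs
```

Here the joint entropy is 2 bits while every pair of coordinates is uniform on four values (2 bits each), so the inequality reads 4 ≤ 6. It is this covering form of the lemma that powers the counting applications below, such as Kahn's bound for Dedekind's problem.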
- Approximate counting
- J. Kahn, Entropy, independent sets and antichains: a new approach to Dedekind's problem, Proc. of the AMS 130 (2) (2002), 371-378.