Information Theory and Statistics Reading Group
This group met from Fall 2018 to Winter 2019.
Winter Quarter
- [03/05/2019] Tong Yitang presents on Lecture 7 [1]. Topics include maximum entropy density estimation, information projection, and the duality of maximum entropy with maximum likelihood estimation.
- [02/26/2019] David Weber presents on Lecture 6 [1]. Topics include graphical models.
- [02/19/2019] David Weber and Dmitry Shemetov present on Lecture 6 [1]. Topics include mutual information estimators and applications of mutual information to machine learning.
- [02/12/2019] Dmitry Shemetov presents on Lectures 4 and 5 [1]. Topics include properties of submodular functions, maximizing submodular functions, and an information-theoretic clustering algorithm [2].
- [01/29/2019] Dmitry Shemetov presents on Lecture 3 [1]. Topics include the data processing inequality, submodular functions, submodularity of mutual information, maximizing submodular functions, and an application to sensor placement [3] (see the greedy-maximization sketch after this list).
- [01/22/2019] David Weber presents on the challenges of estimating mutual information [4, 5] (see the histogram-estimator sketch after this list).
- [01/15/2019] Cong Xu presents on “Chapter 5: Data Compression” [6].
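
Several of the winter meetings ([1], [3]) center on maximizing a monotone submodular function with a greedy rule, which enjoys the classical (1 - 1/e) approximation guarantee. The sketch below is a minimal illustration of that greedy rule on a toy set-coverage objective; the coverage function and element names are made-up examples, not taken from the readings.

```python
# A minimal sketch of greedy maximization of a monotone submodular function
# under a cardinality constraint, illustrated with a toy coverage objective.

def greedy_maximize(ground_set, f, k):
    """Pick k elements, each time adding the one with the largest marginal gain."""
    selected = set()
    for _ in range(k):
        best, best_gain = None, float("-inf")
        for e in ground_set - selected:
            gain = f(selected | {e}) - f(selected)  # marginal gain of adding e
            if gain > best_gain:
                best, best_gain = e, gain
        selected.add(best)
    return selected

if __name__ == "__main__":
    # Toy coverage function (hypothetical data): f(S) = number of items
    # covered by the chosen sets. Coverage functions are submodular.
    sets = {
        "a": {1, 2, 3},
        "b": {3, 4},
        "c": {4, 5, 6},
        "d": {1, 6},
    }
    coverage = lambda S: len(set().union(*(sets[s] for s in S))) if S else 0
    print(greedy_maximize(set(sets), coverage, k=2))  # {'a', 'c'}, covering all 6 items
```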
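
The difficulty of estimating mutual information from samples [4, 5] can be seen with a naive histogram (plug-in) estimator, which is badly biased at small sample sizes. The sketch below, assuming a bivariate Gaussian whose mutual information is known in closed form, compares the plug-in estimate against the exact value; the bin count and sample sizes are illustrative choices only.

```python
# A minimal sketch of the plug-in (histogram) mutual information estimator
# and its bias, checked against the closed form I(X;Y) = -0.5 * ln(1 - rho^2)
# for a bivariate Gaussian with correlation rho.
import numpy as np

def plugin_mi(x, y, bins=16):
    """Plug-in estimate of I(X;Y) in nats from a 2-D histogram."""
    counts, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = counts / counts.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X (column vector)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y (row vector)
    nz = pxy > 0                          # skip empty cells: 0 * log 0 = 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

if __name__ == "__main__":
    rho = 0.8
    true_mi = -0.5 * np.log(1 - rho**2)   # exact value for a bivariate Gaussian
    rng = np.random.default_rng(0)
    cov = [[1.0, rho], [rho, 1.0]]
    for n in (100, 1000, 10000):
        x, y = rng.multivariate_normal([0, 0], cov, size=n).T
        print(f"n={n:6d}  plug-in={plugin_mi(x, y):.3f}  true={true_mi:.3f}")
```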
Fall Quarter
- [12/13/2018] Dmitry Shemetov presents an introduction to some work by Katalin Marton [7]: the blowing-up lemma [8], single-letter characterization, Ornstein’s copying lemma, and Talagrand’s theorem.
- [12/07/2018] David Weber lays the foundations with “Chapter 8: Differential Entropy” [6] and then presents on “Estimating Mutual Information” [4].
- [11/29/2018] Yiqun Shao presents on “On choosing and bounding probability metrics” [9].
- [11/09/2018] Stephen Sheng presents on “Deep Learning and the Information Bottleneck Principle” [10].
- [11/02/2018] Group read of “Chapter 11: Information Theory and Statistics” [6].
- [10/26/2018] Group read of “Chapter 6: Gambling and Data Compression” [6].
- [10/19/2018] Dmitry Shemetov presents on “Chapter 2: Lower bounds on minimax risk” [11].
- [10/12/2018] A problem-solving session dedicated to Chapter 2 [6].
- [10/05/2018] Dmitry Shemetov presents the foundational material from “Chapter 2: Entropy, Relative Entropy, and Mutual Information” [6] (see the worked example after this list).
- [09/28/2018] Organizing meeting.
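
To accompany the Chapter 2 material [6], here is a short numerical sketch: it computes entropy and mutual information for a small, hand-picked discrete joint distribution and checks the identity I(X;Y) = H(X) + H(Y) - H(X,Y). The joint pmf is an illustrative assumption, not an example from the book.

```python
# A minimal sketch of the Chapter 2 quantities: entropy and mutual
# information of a discrete joint distribution (in bits).
import numpy as np

def entropy(p):
    """Shannon entropy in bits; 0 * log(0) is treated as 0."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(pxy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint pmf given as a 2-D array."""
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    return entropy(px) + entropy(py) - entropy(pxy)

if __name__ == "__main__":
    pxy = np.array([[1/4, 1/4],
                    [0.0, 1/2]])          # hypothetical joint pmf of (X, Y)
    px = pxy.sum(axis=1)
    print(f"H(X)   = {entropy(px):.3f} bits")      # 1.000
    print(f"H(X,Y) = {entropy(pxy):.3f} bits")     # 1.500
    print(f"I(X;Y) = {mutual_information(pxy):.3f} bits")  # 0.311
```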
References
1. A. Singh, “Data Processing and Information Theory Course Notes”
2. L. Faivishevsky, J. Goldberger, “A Nonparametric Information Theoretic Clustering Algorithm”
3. A. Krause, A. Singh, C. Guestrin, “Near-Optimal Sensor Placements in Gaussian Processes: Theory, Efficient Algorithms and Empirical Studies”
4. A. Kraskov, H. Stoegbauer, P. Grassberger, “Estimating Mutual Information”
5. J. Walters-Williams, Y. Li, “Estimating Mutual Information: A Survey”
6. T. Cover, J. Thomas, Elements of Information Theory
7. K. Marton, “Distance Divergence Inequalities”
8. K. Marton, “A simple proof of the blowing-up lemma”
9. A. Gibbs, F. Su, “On choosing and bounding probability metrics”
10. N. Tishby, N. Zaslavsky, “Deep Learning and the Information Bottleneck Principle”
11. A. Tsybakov, Introduction to Nonparametric Estimation