The Chair of Mathematical Data Science (MDS) focuses on mathematical principles and algorithms for data science and AI. The branches of mathematics mainly involved are probability, statistics, and discrete mathematics, and, as more specific fields, machine learning and information theory.

Emmanuel Abbe received his Ph.D. degree from the EECS Department at the Massachusetts Institute of Technology (MIT) in 2008, and his M.S. degree from the Department of Mathematics at the Ecole Polytechnique Fédérale de Lausanne (EPFL) in 2003. He was at Princeton University as an assistant professor from 2012 to 2016 and as a tenured associate professor from 2016, jointly in the Program in Applied and Computational Mathematics and the Department of Electrical Engineering, as well as an associated faculty member in the Department of Mathematics since 2016. He joined EPFL in 2018 as a Full Professor, jointly in the Mathematics Institute and the School of Computer and Communication Sciences, where he holds the Chair of Mathematical Data Science. He is the recipient of the Foundation Latsis International Prize, the Bell Labs Prize, the NSF CAREER Award, the Google Faculty Research Award, the Walter Curtis Johnson Prize from Princeton University, the von Neumann Fellowship from the Institute for Advanced Study, and the IEEE Information Theory Society Paper Award, and a co-recipient of the Simons-NSF Mathematics of Deep Learning Collaborative Research Award.

Prof. E. Abbe is also a Global Expert at the Geneva Science and Diplomacy Anticipator (GESDA), a member of the Steering Committee of the Center for Intelligent Systems (CIS) at EPFL, a member of the DeepFoundations collaboration on the theoretical foundations of deep learning, a consultant at Apple Artificial Intelligence and Machine Learning Research, and the director of the Bernoulli Center for Fundamental Studies at EPFL.

EPFL SB MATH

MA C2 543 (Bâtiment MA)

Station 8

CH-1015 Lausanne

Switzerland

Office: MA C2 543

- E. Abbe, S. Bengio, E. Cornacchia, J. Kleinberg, A. Lotfi, M. Raghu, C. Zhang, *Learning to Reason with Neural Networks: Generalization, Unseen Data and Boolean Measures*, **NeurIPS’22**
- E. Abbe, E. Boix-Adsera, *On the non-universality of deep learning: quantifying the cost of symmetry*, **NeurIPS’22**
- E. Abbe, E. Boix-Adsera, T. Misiakiewicz, *The merged-staircase property: a necessary and nearly sufficient condition for SGD learning of sparse functions on two-layer neural networks*, **COLT’22**
- E. Abbe, E. Cornacchia, J. Hazla, C. Marquis, *An initial alignment between neural network and target is needed for gradient descent to learn*, **ICML’22**
- E. Abbe, S. Li, A. Sly, *Binary perceptron: efficient algorithms can find solutions in a rare well-connected cluster*, **STOC’21**
- E. Abbe, S. Li, A. Sly, *Proof of the Contiguity Conjecture and Lognormal Limit for the Symmetric Perceptron*, **FOCS’21**
- E. Abbe, P. Kamath, E. Malach, C. Sandon, N. Srebro, *On the power of differentiable learning versus PAC and SQ learning*, **NeurIPS’21 Spotlight**
- E. Abbe, E. Boix-Adsera, M. Brenner, G. Bresler, D. Nagaraj, *The staircase property: how hierarchical structure can guide deep learning*, **NeurIPS’21**
- E. Abbe, E. Cornacchia, Y. Gu, Y. Polyanskiy, *Stochastic block model entropy and broadcasting on trees with survey*, **COLT’21 Best Student Paper Award**
- E. Malach, P. Kamath, E. Abbe, N. Srebro, *Quantifying the Benefit of Using Differentiable Learning over Tangent Kernels*, **ICML’21**
- A. Asadi, E. Abbe, *Maximum Multiscale Entropy and Neural Network Regularization*
- E. Abbe, J. Fan, K. Wang, *An l_p theory of PCA and spectral clustering*, **Annals of Statistics**
- E. Abbe, S. Li, A. Sly, *Learning sparse graphons and the generalized Kesten-Stigum Threshold*
- E. Abbe, C. Sandon, *Polytime universality and limitations of deep learning*, **CPAM**
- E. Abbe, A. Shpilka, M. Ye, *Reed-Muller codes: theory and algorithms*, **Information Theory Trans.**
- E. Abbe, C. Sandon, *On the universality of deep learning*, **NeurIPS’20**
- E. Abbe, M. Ye, *Reed-Muller codes polarize*, **FOCS’19**
- M. Ye, E. Abbe, *Recursive projection-aggregation decoding of Reed-Muller codes*, **ISIT’19**
- A. R. Asadi, E. Abbe, *Chaining Meets Chain Rule: Multilevel Entropic Regularization and Training of Neural Nets*, **JMLR**
- E. Abbe, E. Boix-Adserà, *Subadditivity Beyond Trees and the Chi-Squared Mutual Information*, **ISIT’19**
- E. Abbe, E. Boix, P. Ralli, C. Sandon, *Graph powering and spectral robustness*, **SIAM Journal on Mathematics of Data Science**
- E. Abbe, E. Boix, *An Information-Percolation Bound for Spin Synchronization on General Graphs*, **Annals of Applied Probability**
- A. Asadi, E. Abbe, S. Verdu, *Chaining mutual information and tightening generalization bounds*, **NIPS’18**
- E. Abbe, L. Massoulié, A. Montanari, A. Sly, N. Srivastava, *Group synchronization on grids*, **Mathematical Statistics and Learning (MSL)’18**
- M. Ye, E. Abbe, *Communication-computation efficient gradient coding*, **ICML’18**
- A. Sankararaman, E. Abbe, F. Baccelli, *Community Detection on Euclidean Random Graphs*, **Information and Inference: A Journal of the IMA**
- E. Abbe, T. Bendory, W. Leeb, J. Pereira, N. Sharon, A. Singer, *Multireference alignment is easier with an aperiodic translation distribution*, **Information Theory Trans.**
- E. Abbe, J. Fan, K. Wang, Y. Zhong, *Entrywise eigenvector analysis of random matrices with low expected rank*, **Annals of Statistics**
- E. Abbe, S. Kulkarni, E. Lee, *Generalized nonbacktracking bounds on the influence*, **JMLR**
- E. Abbe, *Community detection and stochastic block models: recent developments*, **JMLR**
- E. Abbe, S. Kulkarni, E. Lee, *Nonbacktracking bounds on the influence in cascade models*, **NIPS’17**
- E. Abbe, J. Pereira, A. Singer, *Sample complexity of the Boolean multireference alignment problem*, **ISIT’16**
- E. Abbe, C. Sandon, *Proof of the achievability conjectures in the general stochastic block model*, **CPAM**
- E. Abbe, C. Sandon, *Detection in the stochastic block model with multiple clusters: proof of the achievability conjectures, acyclic BP, and the information-computation gap*, **NIPS’16 oral**
- E. Abbe, C. Sandon, *Recovering communities in the general stochastic block model without knowing the parameters*, **NIPS’15**
- E. Abbe, C. Sandon, *Community detection in the general stochastic block model: fundamental limits and efficient recovery algorithms*, **FOCS’15**
- E. Abbe, A. Shpilka, A. Wigderson, *Reed-Muller codes for random erasures and errors*, **STOC’15**
- Y. Deshpande, E. Abbe, A. Montanari, *Asymptotic mutual information for the balanced binary SBM*, **Information and Inference: A Journal of the IMA**
- E. Abbe, Y. Wigderson, *High-girth matrices and polarization*, **ISIT’15**
- E. Abbe, A. Bandeira, G. Hall, *Exact recovery in the stochastic block model*, http://arxiv.org/abs/1405.3267, **Information Theory Trans.**
- E. Abbe, N. Alon, A. Bandeira, *Linear Boolean classification, coding and “the critical problem”*, **ISIT’14**
- E. Abbe, A. Bandeira, A. Bracher, A. Singer, *Decoding graph labels from censored correlations: phase transition and efficient recovery*, **Transactions on Network Science and Eng.**
- E. Abbe, A. Montanari, *Conditional random fields, planted constraint satisfaction and entropy concentration*, **Theory of Computing**
- E. Abbe, A. Montanari, *On the concentration of the number of solutions of random satisfiability formulas*, **Random Structures and Algorithms**
- N. Goela, E. Abbe, M. Gastpar, *Polar codes for broadcast channels*, http://arxiv.org/abs/1301.6150, **Information Theory Trans.**
- E. Abbe, A. Khandani, A. W. Lo, *Privacy-preserving methods in systemic risk*, **American Economic Review (AER)**. New York Times article: http://bits.blogs.nytimes.com/2013/09/09/a-data-weapon-to-avoid-the-next-financial-crisis/

- E. M. Chayti, Capacity of binary perceptrons with binary inputs, MDS, Semester Project, 2020
- E. Boix, Average-case statistical query algorithms, MDS, Summer internship report, 2019
- E. Bamas, Learning Monomials, MDS, Semester Project report, 2018

Revision of basic set theory and combinatorics.

Elementary probability: random experiment; probability space; conditional probability; independence.

Random variables: basic notions; density and mass functions; examples including Bern…

The class will cover statistical models and statistical learning problems involving discrete structures. It starts with an overview of basic random graphs and discrete probability results. It then covers topics such as reconstruction on trees, stochastic …
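As a minimal illustration of the stochastic block model mentioned in the course description, the sketch below samples a balanced two-community SBM and checks that within-community edges dominate cross-community ones. This is an illustrative sketch only (the function name, parameters, and the deterministic community split are our own choices, not course material):

```python
import random

def sample_sbm(n, p, q, seed=0):
    """Sample a symmetric two-community stochastic block model (SBM).

    Vertices 0..n-1 are split into two balanced communities; an edge
    appears independently with probability p within a community and
    probability q across communities. Illustrative sketch only.
    """
    rng = random.Random(seed)
    labels = [i % 2 for i in range(n)]  # deterministic balanced split
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            prob = p if labels[i] == labels[j] else q
            if rng.random() < prob:
                edges.add((i, j))
    return labels, edges

# With p much larger than q, the planted communities are visible
# in the edge counts (the "assortative" regime).
labels, edges = sample_sbm(100, 0.5, 0.05)
within = sum(1 for (i, j) in edges if labels[i] == labels[j])
across = len(edges) - within
print(within, across)
```

Recovering `labels` from `edges` alone, and the thresholds at which this is possible, is exactly the community-detection question studied in the course and in the publications above.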

- Position: Scientist, Office: MA C2 553
- Position: Scientist, Office: MA C2 553
- Position: Scientist, Office: MA C2 543
- Position: Doctoral Assistant, Office: MA B2 534
- Position: Doctoral Assistant, Office: MA C2 543

- Enric Boix
- Elena Grigorescu
- Min Ye