The Chair of Mathematical Data Science (MDS) focuses on mathematical principles and algorithms for data science and AI. Among branches of mathematics, it draws mainly on probability, statistics, and discrete mathematics, and, in more specialized fields, on machine learning and information theory.
Emmanuel Abbe received his Ph.D. degree from the EECS Department at the Massachusetts Institute of Technology (MIT) in 2008, and his M.S. degree from the Department of Mathematics at the Ecole Polytechnique Fédérale de Lausanne (EPFL) in 2003. He was at Princeton University as an assistant professor from 2012 to 2016 and as a tenured associate professor from 2016, jointly in the Program for Applied and Computational Mathematics and the Department of Electrical Engineering, as well as an associate faculty member in the Department of Mathematics since 2016. He joined EPFL in 2018 as a Full Professor, jointly in the Mathematics Institute and the School of Computer and Communication Sciences, where he holds the Chair of Mathematical Data Science.
He is the recipient or co-recipient of the Foundation Latsis International Prize; the Bell Labs Prize; the NSF CAREER Award; the Google Faculty Research Award; the Walter Curtis Johnson Prize from Princeton University; the von Neumann Fellowship from the Institute for Advanced Study; the IEEE Information Theory Society Paper Award; the Simons-NSF Mathematics of Deep Learning Collaborative Research Award; and the ICML Outstanding Paper Award.
He is also a Global Expert at the Geneva Science and Diplomacy Anticipator (GESDA), a member of the Steering Committee of the Center for Intelligent Systems (CIS) at EPFL, a member of the Deepfoundations collaboration on the theoretical foundations of deep learning, and the co-director of the Bernoulli Center for Fundamental Studies at EPFL. Emmanuel Abbe is also a senior research scientist at Apple MLR.
Revision of basic set theory and combinatorics.
Elementary probability: random experiment; probability space; conditional probability; independence.
Random variables: basic notions; density and mass functions; examples including Bern…
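As a small illustration of the mass-function material (a hedged sketch, not part of the course), one can compare the empirical frequencies of a simulated discrete random variable with its theoretical probability mass function:

```python
import random
from collections import Counter

def empirical_pmf(sample):
    """Empirical probability mass function from a list of outcomes."""
    n = len(sample)
    return {x: c / n for x, c in Counter(sample).items()}

random.seed(0)
# Simulate 10,000 rolls of a fair six-sided die; the theoretical pmf
# assigns mass 1/6 to each outcome in {1, ..., 6}.
rolls = [random.randint(1, 6) for _ in range(10_000)]
pmf = empirical_pmf(rolls)

# Each empirical mass should be close to 1/6, and they sum to 1.
for outcome in range(1, 7):
    print(outcome, round(pmf[outcome], 3))
```

By the law of large numbers, the empirical masses concentrate around 1/6 as the number of rolls grows.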
The class covers statistical models and statistical learning problems involving discrete structures. It starts with an overview of basic random graphs and discrete probability results. It then covers topics such as reconstruction on trees, stochastic …
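As a minimal sketch of the random-graph material (an illustrative assumption, not course code), here is how one could sample an Erdős–Rényi G(n, p) graph, the basic random-graph model in which each possible edge appears independently with probability p:

```python
import random

def sample_gnp(n, p, rng):
    """Sample an Erdos-Renyi G(n, p) graph: each of the n*(n-1)/2
    possible edges is included independently with probability p."""
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                edges.add((i, j))
    return edges

rng = random.Random(42)
g = sample_gnp(100, 0.05, rng)
# The expected number of edges is p * n*(n-1)/2 = 0.05 * 4950 = 247.5.
print(len(g))
```

Models such as the stochastic block model can be seen as variants of this sampler in which the edge probability depends on hidden community labels of the endpoints.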
Selected topics: statistical inference and algorithms on graphs, neural networks. …