Robert Bamler: Scalable Bayesian Inference: New Tools for New Challenges
Robert Bamler is a Professor of Data Science and Machine Learning at the University of Tübingen in Germany. This talk was part of the colloquium of the Cluster of Excellence “Machine Learning: New Perspectives for Science”.
Abstract: Scalable Bayesian inference methods and deep probabilistic models combine the principles of statistical machine learning with the expressivity of deep learning. While much recent work focuses on introducing Bayesian inference into existing applications of deep learning, in this talk Robert Bamler takes a different perspective and demonstrates that scalable Bayesian inference allows us to tackle new problems that would be difficult to address with other methods. He starts by presenting a novel inference algorithm called Perturbative Variational Inference, which draws on ideas from theoretical physics. He then shows that such foundational research opens up new frontiers in applied machine learning, discussing examples from data and model compression and giving an outlook on a new approach to the economics of machine learning that is more resilient to an institutional centralization of power.