Department of Computer Science
University of Oxford
Oxford, OX1 3QD
Originally from Romania, I grew up in Southern Germany. After studying Computer Science and Mathematics at the Technical University of Munich, I spent a couple of years in Zurich as a software engineer at Google (YouTube). I then worked as a performance research engineer at DeepMind for a year before starting my DPhil in September 2018.
I am interested in Bayesian Deep Learning, Information Theory (and its applications to Information Bottlenecks and Active Learning), and Uncertainty Quantification. I also like to think about AI Ethics and AI Safety.
Feb 23, 2021
Deterministic Neural Networks with Appropriate Inductive Biases Capture Epistemic and Aleatoric Uncertainty has been uploaded to arXiv as a preprint. Joint work with Jishnu Mukhoti, Joost van Amersfoort, Philip H.S. Torr, and Yarin Gal. We show that a single softmax neural net with minimal changes can beat the uncertainty predictions of Deep Ensembles and other more complex single-forward-pass uncertainty approaches.
Jul 17, 2020
Two workshop papers have been accepted to the Uncertainty & Robustness in Deep Learning Workshop at ICML 2020:
Both are joint work with Clare Lyle and Yarin Gal. The former is based on Unpacking Information Bottlenecks: Unifying Information-Theoretic Objectives in Deep Learning; the latter is an application of the UIB framework: we can use it to train models that perform well on CIFAR-10 without using a cross-entropy loss at all.
Mar 27, 2020
Unpacking Information Bottlenecks: Unifying Information-Theoretic Objectives in Deep Learning, together with Clare Lyle and Yarin Gal, has been uploaded to arXiv as a preprint. It examines and unifies different Information Bottleneck objectives and shows that we can introduce simple yet effective surrogate objectives without complex derivations.
Sep 4, 2019
BatchBALD: Efficient and Diverse Batch Acquisition for Deep Bayesian Active Learning has been accepted at NeurIPS 2019. See you all in Vancouver!
Jun 24, 2019
Uploaded BatchBALD: Efficient and Diverse Batch Acquisition for Deep Bayesian Active Learning to arXiv, and also published an extensive blog post about it on OATML, with code available on GitHub.
- Preprint: Deterministic Neural Networks with Appropriate Inductive Biases Capture Epistemic and Aleatoric Uncertainty. arXiv Preprint, 2021.
- ICML UDL 2020: Learning CIFAR-10 with a Simple Entropy Estimator Using Information Bottleneck Objectives. In Workshop on Uncertainty & Robustness in Deep Learning at the Int. Conf. on Machine Learning (ICML), online, 2020.
- Preprint: Unpacking Information Bottlenecks: Unifying Information-Theoretic Objectives in Deep Learning. arXiv Preprint, 2020.
- NeurIPS 2019: BatchBALD: Efficient and Diverse Batch Acquisition for Deep Bayesian Active Learning. NeurIPS 2019.