Andreas Kirsch

Bio & research interests

Department of Computer Science

University of Oxford

Wolfson Building

Oxford, OX1 3QD

I am a DPhil student with Prof Yarin Gal in the OATML group at the University of Oxford and a student in the AIMS program.

Originally from Romania, I grew up in Southern Germany. After studying Computer Science and Mathematics at the Technical University of Munich, I spent a couple of years in Zurich as a software engineer at Google (YouTube). I then worked as a performance research engineer at DeepMind for a year before starting my DPhil in September 2018.

I am interested in Bayesian Deep Learning, Information Theory (and its application within Information Bottlenecks and Active Learning), and Uncertainty Quantification. I also like to think about AI Ethics and AI Safety.

selected publications

  1. Preprint
    Deterministic Neural Networks with Appropriate Inductive Biases Capture Epistemic and Aleatoric Uncertainty
    Mukhoti*, Jishnu, Kirsch*, Andreas, van Amersfoort, Joost, Torr, Philip H.S., and Gal, Yarin
    arXiv Preprint 2021
  2. UDL 2020
    Learning CIFAR-10 with a Simple Entropy Estimator Using Information Bottleneck Objectives
    Kirsch, Andreas, Lyle, Clare, and Gal, Yarin
    In Uncertainty & Robustness in Deep Learning at Int. Conf. on Machine Learning (ICML Workshop) 2020
  3. Preprint
    Unpacking Information Bottlenecks: Unifying Information-Theoretic Objectives in Deep Learning
    Kirsch, Andreas, Lyle, Clare, and Gal, Yarin
    arXiv Preprint 2020
  4. NeurIPS 2019
    BatchBALD: Efficient and Diverse Batch Acquisition for Deep Bayesian Active Learning
    Kirsch*, Andreas, van Amersfoort*, Joost, and Gal, Yarin
    NeurIPS 2019

news

Jul 24, 2021

Seven workshop papers at ICML 2021 (five of which are first-author submissions):

Two papers (posters) at the Uncertainty & Robustness in Deep Learning workshop.

Four papers (posters, one spotlight) at the SubSetML: Subset Selection in Machine Learning: From Theory to Practice workshop.

One paper (poster) at the Neglected Assumptions In Causal Inference workshop.

Feb 23, 2021

Lecture on “Bayesian Deep Learning, Information Theory and Active Learning” for Oxford Global Exchanges. You can download the slides here.

Feb 21, 2021

Deterministic Neural Networks with Appropriate Inductive Biases Capture Epistemic and Aleatoric Uncertainty has been uploaded to arXiv as a preprint. Joint work with Jishnu Mukhoti, together with Joost van Amersfoort, Philip H.S. Torr, and Yarin Gal. We show that a single softmax neural net with minimal changes can beat the uncertainty predictions of Deep Ensembles and other, more complex single-forward-pass uncertainty approaches.
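
To sketch the general idea behind single-forward-pass uncertainty methods (a common recipe, not necessarily the paper's exact method): the softmax entropy captures aleatoric uncertainty, while a density estimate over the network's feature space captures epistemic uncertainty. The snippet below is a minimal illustration under these assumptions; the feature extractor, classifier head, and per-class Gaussian density are hypothetical stand-ins.

    # Minimal sketch (not the paper's exact recipe): aleatoric uncertainty from
    # softmax entropy, epistemic uncertainty from a class-conditional Gaussian
    # density fitted to the network's feature space.
    import torch
    import torch.nn.functional as F
    from torch.distributions import MultivariateNormal

    def fit_feature_density(features, labels, num_classes, jitter=1e-3):
        """Fit one Gaussian per class to training features (a GDA-style density).

        `features`: [N, D] tensor, `labels`: [N] tensor -- hypothetical inputs.
        """
        components = []
        for c in range(num_classes):
            fc = features[labels == c]
            mean = fc.mean(dim=0)
            # Regularize the covariance so it stays positive definite.
            cov = torch.cov(fc.T) + jitter * torch.eye(fc.shape[1])
            components.append(MultivariateNormal(mean, cov))
        return components

    @torch.no_grad()
    def uncertainties(feature_extractor, classifier, density, x):
        features = feature_extractor(x)  # [B, D]
        probs = F.softmax(classifier(features), dim=-1)
        # Aleatoric uncertainty: entropy of the softmax predictive distribution.
        aleatoric = -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)
        # Epistemic uncertainty: negative log feature density
        # (low density = far from the training data).
        log_probs = torch.stack([d.log_prob(features) for d in density], dim=-1)
        epistemic = -torch.logsumexp(log_probs, dim=-1)
        return aleatoric, epistemic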

Dec 10, 2020

Unpacking Information Bottlenecks: Unifying Information-Theoretic Objectives in Deep Learning was also presented as a poster at the “NeurIPS Europe meetup on Bayesian Deep Learning”.

You can find the poster as an image version or download it as a PDF version.

Jul 17, 2020

Two workshop papers have been accepted to the Uncertainty & Robustness in Deep Learning workshop at ICML 2020:

  1. Scalable Training with Information Bottleneck Objectives, and
  2. Learning CIFAR-10 with a Simple Entropy Estimator Using Information Bottleneck Objectives

Both are joint work with Clare Lyle and Yarin Gal. The former builds on Unpacking Information Bottlenecks: Unifying Information-Theoretic Objectives in Deep Learning, while the latter applies the UIB framework: we use it to train models that perform well on CIFAR-10 without using a cross-entropy loss at all.

Mar 27, 2020

Unpacking Information Bottlenecks: Unifying Information-Theoretic Objectives in Deep Learning, joint work with Clare Lyle and Yarin Gal, has been uploaded to arXiv as a preprint. It examines and unifies different Information Bottleneck objectives and shows that we can introduce simple yet effective surrogate objectives without complex derivations.
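
For context, the classic Information Bottleneck objective that this line of work builds on trades off compressing the input X against keeping the representation Z informative about the label Y. This is the standard formulation (due to Tishby et al.), not a formula specific to the paper:

    % Classic Information Bottleneck Lagrangian: learn a stochastic
    % representation Z of X that is compressed (small I(X; Z)) yet
    % predictive of Y (large I(Z; Y)).
    \min_{p(z \mid x)} \; I(X; Z) - \beta \, I(Z; Y)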

Sep 4, 2019

BatchBALD: Efficient and Diverse Batch Acquisition for Deep Bayesian Active Learning has been accepted at NeurIPS 2019. See you all in Vancouver!

Jun 24, 2019

BatchBALD: Efficient and Diverse Batch Acquisition for Deep Bayesian Active Learning has been uploaded to arXiv, and we have also published an extensive blog post about it on OATML with the code available on GitHub.
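
In a nutshell, BALD scores each candidate point by the mutual information between its predicted label and the model parameters, whereas BatchBALD scores a candidate batch jointly, so that redundant, near-duplicate points are not all selected. With model parameters ω and training data D_train, the two acquisition functions read:

    % BALD scores points individually; BatchBALD scores the batch jointly,
    % which accounts for correlations between the selected points.
    a_{\mathrm{BALD}}(x) = \mathbb{I}\left[ y; \omega \mid x, \mathcal{D}_{\mathrm{train}} \right]
    a_{\mathrm{BatchBALD}}(x_1, \dots, x_b) = \mathbb{I}\left[ y_1, \dots, y_b; \omega \mid x_1, \dots, x_b, \mathcal{D}_{\mathrm{train}} \right]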

Follow me on Twitter @blackhc