Input Uncertainty Propagation in Gaussian Process Regression Models

Figure: Example GP and GP with augmented predictive variance

Abstract

We often work with data that is noisy. But what happens when we want to propagate that noise through our machine learning model? Gaussian processes are well suited to this problem: their kernel-based Bayesian framework gives us predictive means and variances. I explore a few methods for handling input uncertainty, including a simple linearized approach (a first-order Taylor expansion of the predictive mean) and a variational method. While this talk is short and doesn't include many examples, I hope to outline the most important aspects so that people can start getting involved.
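As a rough sketch of the linearized idea (not code from the talk itself): for a known input-noise variance, the first-order Taylor correction inflates the GP predictive variance by the squared gradient of the predictive mean. The toy dataset, the noise level `sigma_x_sq`, and the finite-difference gradient below are all illustrative assumptions, using scikit-learn's standard GP regressor.

```python
# Minimal sketch of the linearized ("Taylor") input-noise correction:
# var_aug(x*) = var_GP(x*) + (dmu/dx)^2 * sigma_x_sq   (1-D case)
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Toy 1-D dataset (illustrative only)
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(50)

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X, y)

def augmented_predict(x_star, sigma_x_sq, eps=1e-4):
    """Predictive mean and variance with a first-order input-noise correction."""
    x_star = np.atleast_2d(x_star)
    mu, std = gp.predict(x_star, return_std=True)
    # Central finite difference of the predictive mean w.r.t. the input
    dmu_dx = (gp.predict(x_star + eps) - gp.predict(x_star - eps)) / (2 * eps)
    # Augment the standard GP variance with the propagated input variance
    var_aug = std**2 + dmu_dx**2 * sigma_x_sq
    return mu, var_aug

# Example: assume an input-noise variance of 0.05 over a test grid
x_test = np.linspace(-3, 3, 100)[:, None]
mu, var_aug = augmented_predict(x_test, sigma_x_sq=0.05)
```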

Date
Feb 18, 2020 12:00 AM
Event
KERMES 2020
Location
Santander, Spain
