In this talk, we want to bridge the gap between Deep Learning and Bayesian Inference in a practical setting.

As motivation, we will start by introducing common problems in machine learning systems that arise when we don't know how uncertain a model is about a specific input. This lack of uncertainty awareness limits applications that need robust solutions and can affect people's lives, such as healthcare, financial trading, and autonomous vehicles.

We will then present Bayesian Neural Networks and cover the fundamentals of Bayesian Inference. Dropout layers and other stochastic regularization techniques, when viewed through the lens of BNNs, give us out-of-the-box tools to measure uncertainty that we can add to existing architectures at little or no cost.
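As a taste of what this looks like in practice, here is a minimal sketch of the Monte Carlo dropout idea: keep dropout stochastic at inference time, run several forward passes, and use the spread of the predictions as an uncertainty estimate. The model, layer sizes, and sample count below are illustrative assumptions, not part of the talk's material.

```python
import torch
import torch.nn as nn

# Hypothetical model: any network that already contains nn.Dropout layers works.
model = nn.Sequential(
    nn.Linear(16, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),   # kept stochastic at inference time for MC dropout
    nn.Linear(64, 1),
)

def mc_dropout_predict(model, x, n_samples=50):
    """Run several stochastic forward passes with dropout enabled and
    return the predictive mean and standard deviation."""
    model.eval()
    # Re-enable only the dropout layers, so layers like batch norm stay in eval mode.
    for m in model.modules():
        if isinstance(m, nn.Dropout):
            m.train()
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(n_samples)])
    return preds.mean(dim=0), preds.std(dim=0)

x = torch.randn(8, 16)                   # a dummy batch of 8 inputs
mean, std = mc_dropout_predict(model, x) # per-input prediction and uncertainty
```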

To close, we will walk through real-life applications of these techniques, down to some code snippets. A better estimate of what the model doesn't know enables better handling of explore-exploit tradeoffs in reinforcement learning problems and more efficient use of annotations through active sampling.
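As one illustration of the active-sampling idea, the hypothetical snippet below reuses the `mc_dropout_predict` helper from the sketch above to rank an unlabeled pool by predictive uncertainty; the pool tensor, the choice of standard deviation as the uncertainty score, and the query budget are assumptions made for the example.

```python
# Rank an unlabeled pool by predictive uncertainty and send the top-k
# examples to annotators (a simple uncertainty-sampling heuristic).
pool = torch.randn(1000, 16)                    # hypothetical unlabeled pool
_, pool_std = mc_dropout_predict(model, pool)
uncertainty = pool_std.squeeze(-1)              # one uncertainty score per example
query_indices = uncertainty.topk(k=10).indices  # most uncertain examples to label next
```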

#DeepLearning #MachineLearning #AI #DataScience

Know what you don't know: Tools to understand uncertainty in DL