The training aims to give an introduction to Bayesian statistics.
We explain how Bayesian statistics compares to the frequentist approach, and dive into the calculations needed to build Bayesian models. We will also take a first look at the domain of probabilistic programming. In the theoretical part, we introduce Bayes' formula and give an introduction to several common statistical distributions (Bernoulli, Poisson, uniform, normal, exponential).
With this theoretical basis, we walk through the calculation of Bayesian updates, Bayesian parameter estimation, and the workings of a Naïve Bayes classifier.
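The kind of calculation covered here can be sketched with a minimal worked example of Bayes' rule; the disease/test numbers below are made up purely for illustration:

```python
# Worked example of Bayes' rule: P(D|+) = P(+|D) * P(D) / P(+).
# All probabilities are hypothetical illustration values.
prior = 0.01          # P(D): prevalence of the condition
sensitivity = 0.95    # P(+|D): likelihood of a positive test given D
false_pos = 0.05      # P(+|not D): false-positive rate

# Evidence P(+): total probability of observing a positive test
evidence = sensitivity * prior + false_pos * (1 - prior)

# Posterior P(D|+): belief in D after seeing a positive test
posterior = sensitivity * prior / evidence
print(round(posterior, 3))  # → 0.161
```

Even with a highly sensitive test, the low prior keeps the posterior small, which is exactly the kind of intuition Bayes' rule makes quantitative.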
After this theoretical part, we apply the theory in the lab. The lab has three sections:
1. Coin tossing. We take a biased coin and explore the idea of Bayesian updates to describe how our prior beliefs about the coin evolve.
2. Diabetes dataset, single variable. We implement a Naïve Bayes classifier for a single variable.
3. Multivariable predictions. We extend the single-variable prediction to a multivariable model.
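The updating idea behind the coin-tossing section can be sketched as follows; the toss data is hypothetical, and a Beta prior is used because it is conjugate to the Bernoulli likelihood, so each update is just a counter increment:

```python
# Sketch of Bayesian updating for a biased coin (hypothetical tosses).
# Beta(alpha, beta) prior over p = P(heads); conjugacy makes the
# posterior after each toss another Beta distribution.
alpha, beta = 1.0, 1.0             # Beta(1, 1): uniform prior over p
tosses = [1, 1, 0, 1, 1, 1, 0, 1]  # 1 = heads, 0 = tails

for t in tosses:
    if t == 1:
        alpha += 1  # one more observed head
    else:
        beta += 1   # one more observed tail

# Posterior mean of p after all updates: alpha / (alpha + beta)
posterior_mean = alpha / (alpha + beta)
print(posterior_mean)  # → 0.7
```

Each toss turns yesterday's posterior into today's prior, which is the evolution of beliefs the lab section explores.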
An additional lab illustrates how probabilistic programming works and introduces the pymc3 library.
After the training, students will have gained knowledge about:
- The definitions of prior, likelihood, posterior and evidence
- The definition and application of Bayes' rule
- How to iteratively update prior distributions from new observations with Bayesian Inference
- How a Gaussian Naïve Bayes algorithm can be implemented, both for single and multiple variables.
- What probabilistic modelling looks like, and how pymc3 can be used to build probabilistic models for parameter estimation.
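The single-variable Gaussian Naïve Bayes idea from the list above can be sketched in a few lines; the glucose values per class are made-up stand-ins for the lab's diabetes data:

```python
import math

# Hypothetical training data: one feature (glucose), two classes
glucose = {0: [85.0, 90.0, 100.0, 95.0],     # class 0: no diabetes
           1: [150.0, 160.0, 140.0, 155.0]}  # class 1: diabetes

def gaussian_pdf(x, mean, var):
    """Likelihood of x under a normal distribution N(mean, var)."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def predict(x):
    """Pick the class with the highest prior * Gaussian likelihood."""
    total = sum(len(v) for v in glucose.values())
    best, best_score = None, -1.0
    for cls, values in glucose.items():
        mean = sum(values) / len(values)
        var = sum((v - mean) ** 2 for v in values) / len(values)
        score = (len(values) / total) * gaussian_pdf(x, mean, var)
        if score > best_score:
            best, best_score = cls, score
    return best

print(predict(92.0), predict(148.0))  # → 0 1
```

The multivariable version of the lab extends this by multiplying one such Gaussian likelihood per feature, relying on the naïve conditional-independence assumption.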