2023

January 2023
Diffusion models II
Diffusion models such as NCSNs and DDPMs are discrete-time special cases of score-based generative models built on SDEs. Here we show how the two models fit into this framework and how it can be used for generative AI (link).
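As a minimal NumPy sketch (not the notebook's code, which uses the SDE framework directly), the DDPM forward process corrupts data with Gaussian noise according to a variance schedule; the schedule values below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear variance schedule beta_1..beta_T (illustrative values).
T = 100
betas = np.linspace(1e-4, 0.02, T)
alphas_bar = np.cumprod(1.0 - betas)  # \bar{alpha}_t, strictly decreasing


def q_sample(x0, t, rng):
    """Sample x_t ~ q(x_t | x_0) = N(sqrt(abar_t) x_0, (1 - abar_t) I)."""
    noise = rng.standard_normal(x0.shape)
    return np.sqrt(alphas_bar[t]) * x0 + np.sqrt(1.0 - alphas_bar[t]) * noise


x0 = rng.standard_normal(5)
xT = q_sample(x0, T - 1, rng)  # close to standard normal for large t
```

The reverse (generative) process then learns to undo this corruption step by step.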

2022

November 2022
Probabilistic reconciliation
Here we implement and test two recent methods for probabilistic reconciliation of hierarchical time series forecasts (link).
June 2022
Diffusion models I
In this case study we cover the basics of diffusion probabilistic models. The notebook uses Distrax and Haiku for probabilistic inference (link).
January 2022
Normalizing flows for variational inference
This case study implements an inverse autoregressive flow for variational inference of parameters. The notebook uses Distrax and Haiku for probabilistic inference (link).

2021

September 2021
Causal inference using tensor-product smoothing splines
The case study reproduces and improves upon a probabilistic model for causal inference with structured latent confounders. It uses BlackJAX, NumPyro and Stan for probabilistic inference (link).
August 2021
Stick-breaking constructions and variational inference
Here we implement common stick-breaking constructions for mean-field variational inference in nonparametric mixture and factor models. The notebook uses NumPyro and TFP for probabilistic inference (link).
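A truncated stick-breaking construction is short enough to sketch in plain NumPy (the notebook itself uses NumPyro and TFP); each weight takes a Beta-distributed fraction of the stick that remains:

```python
import numpy as np

rng = np.random.default_rng(1)


def stick_breaking(alpha, k, rng):
    """Truncated stick-breaking: w_j = nu_j * prod_{i<j} (1 - nu_i),
    with nu_j ~ Beta(1, alpha)."""
    nu = rng.beta(1.0, alpha, size=k)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - nu[:-1])])
    return nu * remaining


w = stick_breaking(alpha=2.0, k=50, rng=rng)  # mixture weights, sum <= 1
```

For moderate truncation levels the leftover stick mass is negligible, which is what makes the truncation usable in mean-field variational inference.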
July 2021
Hilbert-space approximate copula processes
In this notebook we explore the application of Hilbert-space methods for the approximation of Gaussian copula processes to model stochastic volatility. The notebook uses Stan for probabilistic inference (link).
June 2021
Variational, multivariate LSTMs
LSTMs provide an intriguing approach to time series forecasting since they naturally model temporal dependencies. In this notebook we implement a variational, multivariate LSTM using Haiku and NumPyro to predict US election outcomes (link).
May 2021
Hierarchical, coregionalized GPs
Hierarchical and coregionalized GPs are two approaches to modelling marginally correlated data. In this notebook we implement two GP models using NumPyro and compare their predictive performance as well as MCMC diagnostics on a US election data set (link).
March 2021
The basics of Bayesian optimization
Bayesian optimization provides a unified framework for the optimization of costly-to-evaluate objective functions. In this notebook we demonstrate the basics of BO using Stan (link).
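The workhorse of basic BO is an acquisition function; as a minimal stdlib-only sketch (the notebook uses Stan for the GP surrogate), expected improvement for minimization under a Gaussian posterior predictive can be written as:

```python
import math


def expected_improvement(mu, sigma, best):
    """EI for minimization given a surrogate posterior N(mu, sigma^2)
    at a candidate point and the incumbent best observed value."""
    if sigma <= 0.0:
        return 0.0
    z = (best - mu) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    return (best - mu) * cdf + sigma * pdf


# A candidate predicted well below the incumbent has high EI.
ei = expected_improvement(mu=0.5, sigma=0.2, best=1.0)
```

EI trades off exploitation (low predicted mean) against exploration (high predictive uncertainty), which is why it is a common default acquisition function.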

2020

October 2020
Normalizing flows for density estimation
Using TensorFlow Probability's Bijector API one can easily implement custom normalizing flows. This notebook shows how it can be done using masked autoregressive flows (MAFs) as an example (link).
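To illustrate the idea behind the Bijector interface without pulling in TFP, here is a toy affine bijector in NumPy together with the change-of-variables rule that every flow relies on (a sketch, not the notebook's MAF implementation):

```python
import numpy as np


class Affine:
    """A toy bijector y = exp(log_scale) * x + shift, mimicking the
    forward / inverse / log-det interface of TFP's Bijector API."""

    def __init__(self, shift, log_scale):
        self.shift, self.log_scale = shift, log_scale

    def forward(self, x):
        return np.exp(self.log_scale) * x + self.shift

    def inverse(self, y):
        return (y - self.shift) * np.exp(-self.log_scale)

    def inverse_log_det_jacobian(self, y):
        return -self.log_scale


def log_prob(y, bijector):
    """Change of variables: log p(y) = log p_base(x) + log |dx/dy|."""
    x = bijector.inverse(y)
    base = -0.5 * (x**2 + np.log(2.0 * np.pi))  # standard normal base
    return base + bijector.inverse_log_det_jacobian(y)


b = Affine(shift=1.0, log_scale=np.log(2.0))
lp = log_prob(np.array([1.0]), b)  # density of N(1, 4) at y = 1
```

A MAF replaces the single shift and scale with autoregressive networks, but the log-probability computation follows exactly this pattern.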
June 2020
Causal structure learning with VAEs
Learning cause-effect mechanisms among a set of random variables is not only of great epistemological interest but also a fascinating statistical problem. In this notebook we implement a graph variational autoencoder to learn the DAG of a structural equations model and compare it to greedy equivalent search (link).
May 2020
On sequential regression models
Sequential models are a special type of ordinal regression model that additionally assumes categories can only be reached sequentially. This case study shows how they differ from conventional ordinal models (link).
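The sequential assumption can be sketched with a continuation-ratio parameterization in plain Python (illustrative only; the case study's models are fit with probabilistic programming):

```python
import math


def sequential_probs(etas):
    """Category probabilities of a sequential (continuation-ratio) model:
    P(Y = k) = p_k * prod_{j<k} (1 - p_j), where p_j = sigmoid(eta_j) is
    the probability of stopping at category j given that j was reached;
    the last category absorbs the remaining mass."""
    probs, remaining = [], 1.0
    for eta in etas:
        p_stop = 1.0 / (1.0 + math.exp(-eta))
        probs.append(remaining * p_stop)
        remaining *= 1.0 - p_stop
    probs.append(remaining)  # final category
    return probs


pr = sequential_probs([0.2, -0.1, 0.5])  # four ordered categories
```

Unlike a cumulative (proportional-odds) model, each linear predictor here governs the transition to the next category rather than a threshold on a latent scale.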

2019

October 2019
Mixed model reference implementations
Concise reference implementations to fit (generalized) linear mixed models in Python (link).
September 2019
Structure learning for Bayesian networks
A Python notebook on structure MCMC to learn the structure of a Bayesian network using PyMC3 (link).
June 2019
Simulation-based calibration
An example notebook for simulation-based calibration for validation of Bayesian inferences (link).
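The core of SBC fits in a few lines; as a sketch with a conjugate normal-mean model where the posterior is known in closed form (the notebook applies the procedure to MCMC output instead), the rank of each prior draw among its posterior draws should be uniformly distributed:

```python
import numpy as np

rng = np.random.default_rng(2)


def sbc_ranks(n_sims=200, n_draws=99):
    """SBC for a normal mean with unit observation variance and a
    N(0, 1) prior: after one observation y the exact posterior is
    N(y / 2, 1 / 2), so ranks of the prior draw among posterior
    draws should be uniform on 0..n_draws."""
    ranks = np.empty(n_sims, dtype=int)
    for s in range(n_sims):
        theta = rng.standard_normal()       # draw from the prior
        y = theta + rng.standard_normal()   # simulate one observation
        post = y / 2.0 + np.sqrt(0.5) * rng.standard_normal(n_draws)
        ranks[s] = int(np.sum(post < theta))
    return ranks


ranks = sbc_ranks()  # should look uniform; deviations flag miscalibration
```

Systematic deviations from uniformity (e.g. U-shaped or skewed rank histograms) indicate over- or under-dispersed inferences.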
January 2019
Dirichlet process mixture models
Stan models on infinite Bayesian mixtures using the Dirichlet process (link).

2018

October 2018
Philosophy of Science
Some references on philosophy of science that are in my opinion worth reading (link).
October 2018
Gaussian Processes
Bayesian non-parametrics such as Gaussian Processes are a wonderful approach to machine learning. If you are also an enthusiast, make sure to check out my Python notebooks on regression and classification (regression, classification).
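For a taste of what the regression notebook covers, exact GP posterior prediction with an RBF kernel fits in a few lines of NumPy (a minimal sketch with illustrative hyperparameters, not the notebooks' code):

```python
import numpy as np


def rbf(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between two sets of 1-d inputs."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)


def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean and covariance of a zero-mean GP regression."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_test)
    Kss = rbf(x_test, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    v = np.linalg.solve(L, Ks)
    return Ks.T @ alpha, Kss - v.T @ v


x = np.array([-1.0, 0.0, 1.0])
y = np.sin(x)
mu, cov = gp_posterior(x, y, np.array([0.0, 0.5]))
```

Classification requires approximating a non-Gaussian posterior (e.g. with Laplace approximations or MCMC), which is where the second notebook picks up.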