etudes

A collection of études on probabilistic models.

GPL-3.0 License



About

The repository hosts some notebooks on probabilistic models, such as Gaussian processes, graphical models, normalizing flows, and so on. The notebooks are mostly on topics I am interested in or papers I happen to come across.

I make no warranty as to the correctness of these notebooks.

  • Gaussian process regression introduces non-parametric Bayesian regression.
  • Gaussian process classification extends Gaussian process regression to classification scenarios.
  • Dirichlet process mixture models extends the Bayesian mixture model to the infinite case, i.e., when the number of clusters is not known beforehand. We use the Chinese restaurant process and the stick-breaking construction for inference.
  • SBC shows simulation-based calibration, a method to validate Bayesian posterior inferences.
  • Structure MCMC shows how PyMC3 can be used to learn the structure of a Bayesian network.
  • Mixed models shows concise reference implementations for optimization of the objective of (generalized) linear mixed models.
  • Sequential regression models introduces a special class of ordinal regression models which assume a sequential response mechanism.
  • Causal structure learning using VAEs implements a novel graph variational autoencoder and compares it to greedy equivalence search, one of the state-of-the-art methods for causal discovery.
  • Normalizing flows shows how TensorFlow Probability can be used to implement a custom normalizing flow.
  • Bayesian optimization introduces the basics of optimizing costly-to-evaluate functions with probabilistic surrogate models.
  • Hierarchical, coregionalized GPs implements two GP models and compares their predictive performance as well as MCMC diagnostics on a US election data set.
  • Variational LSTMs implements a variational multivariate LSTM for time-series prediction on a US election data set.
  • Hilbert-space approximate copula processes explains how a copula process in conjunction with Hilbert-space approximations can be used to model stochastic volatility.
  • VI for stick-breaking constructions implements mean-field variational approximations for nonparametric mixture and factor models using stick-breaking constructions.
  • Tensor-product spline smoothers implements a probabilistic model for causal inference with structured latent confounders.
  • Normalizing flows for variational inference implements an inverse autoregressive flow for variational inference of the parameters of a simple bivariate Gaussian, using JAX, Distrax, Optax, and Haiku.
  • Diffusion models I introduces a novel class of generative models that are inspired by non-equilibrium thermodynamics.
  • Probabilistic reconciliation implements and tests two recent methods on reconciliation of hierarchical time series forecasts.
  • Diffusion models II introduces a new class of generative models using stochastic differential equations and denoising score matching.
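To give a flavor of the material, the first notebook above covers Gaussian process regression. The following is a minimal NumPy sketch of the standard GP posterior equations, not code taken from the notebooks; the kernel, lengthscale, and noise level are illustrative choices.

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel: k(x, x') = s^2 exp(-(x - x')^2 / (2 l^2)).
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(x_train, y_train, x_test, lengthscale=0.25, noise=0.1):
    # Posterior mean and covariance of GP regression with Gaussian noise
    # (Rasmussen & Williams, "Gaussian Processes for Machine Learning", ch. 2).
    k = lambda a, b: rbf_kernel(a, b, lengthscale=lengthscale)
    K = k(x_train, x_train) + noise**2 * np.eye(len(x_train))
    K_s = k(x_train, x_test)
    K_ss = k(x_test, x_test)
    # Cholesky-based solve for numerical stability.
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    cov = K_ss - v.T @ v
    return mean, cov

x = np.linspace(0, 1, 10)
y = np.sin(2 * np.pi * x)
mean, cov = gp_posterior(x, y, x)
```

The posterior mean interpolates the noisy observations, while the posterior covariance quantifies the remaining uncertainty; the notebooks develop this in much more detail.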

Build

Compile the qmd files via

make file=FILE.qmd

Then move the generated HTML file and its folder of images to the docs folder.
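For example, assuming the Makefile accepts the file variable as shown above and that the render step writes an HTML file plus a `*_files/` asset directory next to the source (both assumptions, as is the notebook name used here):

```shell
# Hypothetical example; the file name and output layout are assumptions.
make file=gaussian_process_regression.qmd
mv gaussian_process_regression.html gaussian_process_regression_files docs/
```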

Author

Simon Dirmeier sfyrbnd @ pm me
