Currently implemented in the 'bayes-linear' package are:
- Binary probit model using
  - mean field variational inference (VI)
  - Markov chain Monte Carlo (MCMC) sampling (Gibbs sampling). The Gibbs sampler also computes the posterior density of the residuals, which can be used for model diagnostics, outlier detection, etc.; see Albert and Chib (1995) for details. A minimal sketch of the data-augmentation sampler is shown below.
- Student-t linear regression model, fitted with Gibbs sampling, for robust inference when the response variable is heavy-tailed (see the scale-mixture sketch below).
- Bayesian Active Learning by Disagreement (BALD; see also BatchBALD) for active learning in binary classification problems with the Bayesian probit model (see the acquisition-score sketch below).
All implementations are based purely on numpy, scipy, and pandas.
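As a rough illustration, here is a minimal sketch (not the package's actual API) of the Albert and Chib data-augmentation Gibbs sampler for the binary probit model, assuming a standard normal N(0, I) prior on the coefficients; the function name `probit_gibbs` is made up for this example.

```python
import numpy as np
from scipy.stats import truncnorm

def probit_gibbs(X, y, n_iter=2000, seed=0):
    """Gibbs sampling for probit regression via truncated-normal latent variables.

    Sketch only: assumes y is a 0/1 array and a N(0, I) prior on the coefficients.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = np.zeros(p)
    # Posterior covariance of beta is fixed given the N(0, I) prior.
    V = np.linalg.inv(np.eye(p) + X.T @ X)
    L = np.linalg.cholesky(V)
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        # 1. Sample latent utilities z_i ~ N(x_i' beta, 1), truncated by y_i.
        mu = X @ beta
        lo = np.where(y == 1, -mu, -np.inf)   # z > 0 when y = 1
        hi = np.where(y == 1, np.inf, -mu)    # z < 0 when y = 0
        z = mu + truncnorm.rvs(lo, hi, size=n, random_state=rng)
        # 2. Sample beta | z from its Gaussian full conditional.
        m = V @ (X.T @ z)
        beta = m + L @ rng.standard_normal(p)
        draws[t] = beta
    return draws
```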
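Similarly, a sketch of the Student-t regression Gibbs sampler, written as the usual scale mixture of normals with fixed degrees of freedom `nu` and weakly informative priors; again, the function name and interface are illustrative, not the package's.

```python
import numpy as np

def student_t_gibbs(X, y, nu=4.0, n_iter=2000, seed=0):
    """Gibbs sampling for linear regression with Student-t errors (sketch only)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    sigma2, lam = 1.0, np.ones(n)
    draws = []
    for _ in range(n_iter):
        # 1. beta | lam, sigma2: weighted least-squares posterior (flat prior on beta).
        W = lam / sigma2
        V = np.linalg.inv((X * W[:, None]).T @ X)
        m = V @ (X.T @ (W * y))
        beta = rng.multivariate_normal(m, V)
        # 2. sigma2 | beta, lam: inverse gamma (Jeffreys prior on sigma2).
        r = y - X @ beta
        sigma2 = 1.0 / rng.gamma(n / 2.0, 2.0 / np.sum(lam * r**2))
        # 3. lam_i | rest: gamma mixing weights; small lam_i downweights outliers.
        lam = rng.gamma((nu + 1) / 2.0, 2.0 / (nu + r**2 / sigma2))
        draws.append(beta.copy())
    return np.array(draws)
```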
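Finally, a hypothetical Monte Carlo version of the BALD acquisition score for the probit model: each pool point is scored by the mutual information between its label and the weight posterior, using posterior weight samples from either VI or Gibbs. The names here (`bald_scores`, `weight_samples`) are assumptions for illustration only.

```python
import numpy as np
from scipy.stats import norm

def binary_entropy(p, eps=1e-12):
    """Entropy of a Bernoulli distribution with success probability p (in nats)."""
    p = np.clip(p, eps, 1 - eps)
    return -(p * np.log(p) + (1 - p) * np.log(1 - p))

def bald_scores(X_pool, weight_samples):
    """BALD = H[mean predictive] - mean over samples of H[per-sample predictive]."""
    # Per-sample probit probabilities, shape (n_samples, n_pool).
    probs = norm.cdf(X_pool @ weight_samples.T).T
    predictive_entropy = binary_entropy(probs.mean(axis=0))
    expected_entropy = binary_entropy(probs).mean(axis=0)
    return predictive_entropy - expected_entropy

# Query the pool point with the largest expected information gain, e.g.:
# next_idx = np.argmax(bald_scores(X_pool, draws))
```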