AVoss84/bayes-linear

Bayesian inference for linear models with continuous and discrete responses using MCMC and VI

Currently implemented in the 'bayes-linear' package are:

  1. A binary probit model, fitted via
    1. Mean-field variational inference (VI)
    2. Markov chain Monte Carlo (MCMC) sampling (Gibbs sampling). The Gibbs sampler also computes the posterior density of the residuals, which can be used for model diagnostics, outlier detection, etc.; see Albert and Chib (1995) for details.
  2. A Student-t linear regression model, fitted via Gibbs sampling, for robust inference when the response variable is heavy-tailed.
  3. The Bayesian Active Learning by Disagreement (BALD) algorithm (see also Batch BALD), implemented here for the Bayesian probit model, for active learning in binary classification problems.
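The Gibbs sampler for the binary probit model can be sketched with the classic Albert and Chib data-augmentation scheme: each response is tied to a latent Gaussian utility, which is drawn from a truncated normal given the current coefficients, after which the coefficients get a conjugate normal update. The sketch below is a minimal illustration under a flat prior on the coefficients; all variable names and the synthetic data are assumptions for the example, not the package's actual API.

```python
# Minimal sketch of Gibbs sampling for the Bayesian probit model via
# Albert & Chib data augmentation (flat prior on beta). Illustrative only.
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(0)

# Synthetic data: y_i = 1{x_i' beta + eps_i > 0}, eps_i ~ N(0, 1)
n, p = 500, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([0.5, -1.0])
y = (X @ beta_true + rng.normal(size=n) > 0).astype(int)

# With a flat prior: beta | z ~ N((X'X)^{-1} X'z, (X'X)^{-1})
XtX_inv = np.linalg.inv(X.T @ X)
chol = np.linalg.cholesky(XtX_inv)

n_iter, burn = 2000, 500
beta = np.zeros(p)
draws = np.empty((n_iter, p))
for t in range(n_iter):
    mu = X @ beta
    # Latent utilities z_i: truncated to (0, inf) if y_i = 1, (-inf, 0) if y_i = 0.
    # truncnorm takes standardized bounds, i.e. (bound - loc) / scale.
    lo = np.where(y == 1, -mu, -np.inf)
    hi = np.where(y == 1, np.inf, -mu)
    z = truncnorm.rvs(lo, hi, loc=mu, scale=1.0, random_state=rng)
    # Conjugate normal draw for beta given the latent utilities
    beta = XtX_inv @ (X.T @ z) + chol @ rng.normal(size=p)
    draws[t] = beta

post_mean = draws[burn:].mean(axis=0)
```

The latent residuals z - X @ beta collected across iterations are what yields the posterior residual density mentioned above (Albert and Chib, 1995), since both z and beta are sampled at every sweep.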

All implementations are based purely on numpy, scipy and pandas.
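In the same numpy-only spirit, the Student-t regression model can be sampled by exploiting the normal/Gamma scale-mixture representation of the t distribution: each observation gets a latent precision weight, turning the coefficient update into a weighted least-squares draw. The following is a hedged sketch with a fixed degrees-of-freedom parameter and flat priors; names and data are illustrative assumptions, not the package's code.

```python
# Sketch of Gibbs sampling for Student-t linear regression using the
# scale-mixture representation: y_i ~ N(x_i' beta, sigma2 / lambda_i),
# lambda_i ~ Gamma(nu/2, rate=nu/2). Illustrative only.
import numpy as np

rng = np.random.default_rng(1)

n, p, nu = 300, 2, 4.0  # nu: degrees of freedom, held fixed here
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.standard_t(df=nu, size=n)  # heavy-tailed noise

n_iter, burn = 2000, 500
beta, sigma2 = np.zeros(p), 1.0
draws = np.empty((n_iter, p))
for t in range(n_iter):
    # lambda_i | rest ~ Gamma((nu+1)/2, rate=(nu + e_i^2/sigma2)/2)
    # (numpy's gamma uses a scale parameter, hence scale = 1/rate)
    resid = y - X @ beta
    lam = rng.gamma((nu + 1) / 2, 2.0 / (nu + resid**2 / sigma2))
    # beta | rest: weighted least-squares normal draw (flat prior)
    w = lam / sigma2
    V = np.linalg.inv(X.T @ (X * w[:, None]))
    beta = V @ (X.T @ (w * y)) + np.linalg.cholesky(V) @ rng.normal(size=p)
    # sigma2 | rest ~ Inv-Gamma(n/2, sum(lam_i e_i^2)/2)
    resid = y - X @ beta
    sigma2 = 1.0 / rng.gamma(n / 2, 2.0 / np.sum(lam * resid**2))
    draws[t] = beta

post_mean = draws[burn:].mean(axis=0)
```

Small latent weights lambda_i flag observations the model treats as outliers, which is what makes the t model robust compared with Gaussian regression.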
