The conceptual and practical limitations of classical multiple linear regression models can be resolved naturally in a Bayesian framework. Unless based on an overly simplistic parameterization, however, exact inference in Bayesian regression models is analytically intractable. This problem can be overcome using methods for approximate inference.
This MATLAB toolbox implements variational inference for a fully Bayesian multiple linear regression model, including Bayesian model selection and prediction of unseen data points on the basis of the posterior predictive density.
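The toolbox itself is written in MATLAB, but the mean-field updates at the heart of such a model can be sketched in a few lines of NumPy. The following is an illustrative sketch, not the toolbox's implementation: it assumes a fixed prior precision `alpha` on the weights and a Gamma prior on the noise precision, with hyperparameters `a0`, `b0` chosen here for illustration only.

```python
import numpy as np

def vb_linear_regression(X, y, alpha=1.0, a0=1e-2, b0=1e-4, n_iter=100):
    """Mean-field variational Bayes for linear regression (illustrative sketch).

    Assumed model (not necessarily the toolbox's exact parameterization):
        y = X @ w + eps,  w ~ N(0, alpha^-1 I),  noise precision tau ~ Gamma(a0, b0).
    The posterior is approximated by a factorized q(w) q(tau) with
    closed-form coordinate-ascent updates.
    """
    N, D = X.shape
    XtX = X.T @ X
    Xty = X.T @ y
    E_tau = a0 / b0  # initial expected noise precision
    for _ in range(n_iter):
        # q(w) = N(m, S): Gaussian update given the current E[tau]
        S = np.linalg.inv(alpha * np.eye(D) + E_tau * XtX)
        m = E_tau * S @ Xty
        # q(tau) = Gamma(a, b): update given E[w] and Cov[w]
        resid = y - X @ m
        a = a0 + 0.5 * N
        b = b0 + 0.5 * (resid @ resid + np.trace(XtX @ S))
        E_tau = a / b
    return m, S, a, b
```

Under the factorized posterior, the predictive mean at a new input `x_new` is simply `x_new @ m`; the variational free energy (not computed in this sketch) is what would serve as the approximate log model evidence for the Bayesian model selection mentioned above.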
Kandel, Schwartz, Jessell. Principles of Neural Science. McGraw-Hill. Parts I–IV. [cell and molecular biology of the neuron, synaptic transmission, neural basis of cognition]
Bear, Connors, Paradiso. Neuroscience: Exploring the Brain. LWW.
Bühlmann, P. Computational Statistics. ETH lecture notes (website/script). [introduction to classical multiple linear regression, hypothesis tests, nonparametric regression, classification, shrinkage]
Bayesian statistics and machine learning
Bishop, C. M. (2007). Pattern Recognition and Machine Learning. Springer. Chapters 1–4. [Bayesian linear regression and classification]
Barber, D. (2012). Bayesian Reasoning and Machine Learning. Cambridge University Press.
Bayesian models of neural information processing
Kenji Doya (Editor), . . .