Bayesian Theory
-
(Bayesian) Foundations of data science
A little advertisement for a new, free online course on the foundations of data science, machine learning, and – just a little – artificial intelligence. It's designed for students in computer science and data science who may be uncomfortable with a head-on probability-theory or statistics approach, or who have a lighter background in maths. The course's main point of view is how to build an artificial-intelligence agent that must draw inferences and make decisions. As a course, it's still something of an experiment.
https://pglpm.github.io/ADA511/
In more technical terms, the course is actually about so-called "Bayesian nonparametric density inference" and Bayesian decision theory.
-
win-vector.com · What You Should Know About Linear Markov Chains
I want to collect some “great things to know about linear Markov chains.” For this note we are working with a Markov chain on states that are the integers 0 through k (k > 0). A Mark…
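Going by the preview snippet, the post concerns a Markov chain on the integer states 0 through k. A minimal sketch of simulating such a chain might look like the following (the step probability `p` and the reflecting behaviour at the boundaries are my own illustrative assumptions, not details from the post itself):

```python
import random

# Hypothetical sketch: a Markov chain on the states 0..k that steps
# +1 with probability p and -1 otherwise, reflecting at 0 and k.
# (The transition rule is an assumption for illustration only.)
def run_chain(k=10, p=0.5, steps=100_000, seed=1):
    rng = random.Random(seed)
    state = 0
    visits = [0] * (k + 1)          # count time spent in each state
    for _ in range(steps):
        if rng.random() < p:
            state = min(state + 1, k)
        else:
            state = max(state - 1, 0)
        visits[state] += 1
    return visits

visits = run_chain()
```

With `p = 0.5` and a long run, the visit counts give a rough empirical picture of the chain's stationary distribution.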
-
GPU-accelerated Gibbs sampling
doi.org · GPU-accelerated Gibbs sampling: a case study of the Horseshoe Probit model – Statistics and Computing
Gibbs sampling is a widely used Markov chain Monte Carlo (MCMC) method for numerically approximating integrals of interest in Bayesian statistics and other mathematical sciences. Many implementations of MCMC methods do not extend easily to parallel computing environments, as their inherently sequent...
Bayesian nonparametric methods include machine-learning and deep-learning methods as special cases, obtained through quite coarse approximations that keep the computation feasible. Using the fully fledged Bayesian method would give much more, but would take orders of magnitude more computational resources and time.
Papers like this, however, open up many more possibilities to use the full theory rather than approximations. I'm quite surprised to see that this paper is already four years old. I wonder if there's been any progress since then.
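For readers unfamiliar with Gibbs sampling, the basic idea is to sample each variable in turn from its conditional distribution given the others. A minimal sketch for a bivariate normal target (a standard textbook example, not the Horseshoe Probit model of the paper) could be:

```python
import random

# Gibbs sampler for a standard bivariate normal with correlation rho.
# Each full conditional is Normal(rho * other, 1 - rho^2), so we can
# alternate exact draws from the two conditionals.
def gibbs_bivariate_normal(rho=0.8, n_samples=5000, seed=0):
    rng = random.Random(seed)
    cond_sd = (1 - rho**2) ** 0.5   # sd of each conditional
    x, y = 0.0, 0.0                  # arbitrary starting point
    samples = []
    for _ in range(n_samples):
        x = rng.gauss(rho * y, cond_sd)  # draw x | y
        y = rng.gauss(rho * x, cond_sd)  # draw y | x
        samples.append((x, y))
    return samples

samples = gibbs_bivariate_normal()
```

The sequential dependence is visible here: each draw of `x` needs the latest `y` and vice versa, which is exactly why naive Gibbs sampling parallelizes poorly and why GPU-oriented reformulations like the paper's are interesting.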
-
Three non-probability books with good introductions to Bayesian probability theory
Today there's an abundance of textbooks and webbooks on Bayesian probability theory, decision theory, and statistics, at very diverse technical levels. I wanted to point out three books whose main topic is not probability theory, but which give very good introductions (even superior to those of some specialized textbooks, in my opinion) to Bayesian probability theory:
-
Artificial Intelligence: A Modern Approach by S. J. Russell, P. Norvig. Part IV is an amazing introduction to Bayesian theory – including decision theory – with many connections with Artificial Intelligence and Logic.
-
Medical Decision Making by H. C. Sox, M. C. Higgins, D. K. Owens. This is essentially a very clear and insightful textbook on Bayesian probability theory and decision theory, but targeted to clinical decision-making.
-
Sentential Probability Logic: Origins, Development, Current Status, and Technical Applications by T. Hailperin. This is a book on Bayesian probability theory, presented as a generalization of propositional logic. This point of view is the most powerful I know of. The book also has important results on methods for finding probability bounds, and on combining evidence.
-