Bayesian Inference
Bayesian Inference is a method for estimating a true probability distribution from samples drawn from that distribution.
When I read the Wikipedia article, I didn't get the point.
But I found a better article that gave me a more intuitive understanding.
My understanding of Bayesian Inference consists of the following steps:
- Come up with a hypothesis about the target distribution; this is called the prior probability distribution. Use a uniform distribution as the prior if you have no particular hypothesis.
- Update the hypothesis based on samples (called evidence) from the target distribution. The updated hypothesis is called the posterior probability distribution.
When updating the hypothesis, we use Bayes' theorem:

$$P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)}$$

- $P(H)$: the prior probability
- $P(H \mid E)$: the posterior probability

We can get the posterior probability by multiplying the prior probability by the value $\frac{P(E \mid H)}{P(E)}$, where $P(E \mid H)$ is the likelihood of the evidence $E$ given the hypothesis $H$.
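For a concrete feel (a toy example of my own, not from the article above): suppose the two hypotheses are $H_1$ = "the coin is fair" and $H_2$ = "the coin always lands heads", with a uniform prior $P(H_1) = P(H_2) = 0.5$, and the evidence $E$ is a single toss that lands heads. Then

$$P(E) = P(E \mid H_1)\,P(H_1) + P(E \mid H_2)\,P(H_2) = 0.5 \cdot 0.5 + 1.0 \cdot 0.5 = 0.75$$

$$P(H_1 \mid E) = \frac{0.5 \cdot 0.5}{0.75} = \frac{1}{3}, \qquad P(H_2 \mid E) = \frac{1.0 \cdot 0.5}{0.75} = \frac{2}{3}$$

so a single heads already shifts the belief toward $H_2$.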
The difficult part of Bayesian Inference is calculating $P(E)$.
It can be calculated as follows:

$P(E) = \sum_{H} P(E \mid H)\, P(H)$ when the target distribution is discrete.

$P(E) = \int P(E \mid H)\, P(H)\, dH$ when the target distribution is continuous.
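As a rough sketch of how this looks in practice (my own illustration; the grid, the coin-bias setup, and the numbers 7 heads out of 10 flips are all assumptions, not from the article): the continuous case can be approximated by evaluating the prior and the likelihood on a grid, so the integral for $P(E)$ turns into a sum.

```python
import numpy as np

# Hypotheses: possible values of the coin's heads probability, on a grid over [0, 1].
thetas = np.linspace(0.0, 1.0, 1001)
prior = np.ones_like(thetas) / len(thetas)   # uniform prior: no initial preference

# Evidence: suppose we observed 7 heads in 10 flips (made-up numbers).
heads, flips = 7, 10
likelihood = thetas**heads * (1 - thetas)**(flips - heads)   # P(E | H) for each grid point

# P(E): the integral over hypotheses, approximated as a sum over the grid.
evidence = np.sum(likelihood * prior)

# Posterior: the prior multiplied by P(E | H) / P(E).
posterior = prior * likelihood / evidence

print("posterior mean of theta:", np.sum(thetas * posterior))   # roughly 0.67
```

The grid makes $P(E)$ cheap to compute here; the trouble starts when the hypothesis space has many dimensions and a grid is no longer feasible.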
When this sum or integral is intractable, MCMC (Markov Chain Monte Carlo) is sometimes used to sample from the posterior without calculating $P(E)$ directly.
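Here is a minimal sketch of that idea (a Metropolis-Hastings sampler for the same assumed coin-bias problem; the step size, iteration count, and burn-in are arbitrary choices of mine). The key point is that it only ever uses the unnormalized product $P(E \mid H)\,P(H)$, so $P(E)$ is never computed: it cancels out in the acceptance ratio.

```python
import numpy as np

def unnormalized_posterior(theta, heads=7, flips=10):
    """Prior times likelihood, without dividing by P(E)."""
    if not 0.0 < theta < 1.0:
        return 0.0                      # uniform prior on (0, 1): zero outside
    return theta**heads * (1 - theta)**(flips - heads)

rng = np.random.default_rng(0)
theta = 0.5                             # arbitrary starting point
samples = []
for _ in range(20000):
    proposal = theta + rng.normal(scale=0.1)        # propose a small random move
    # Accept with probability min(1, ratio); P(E) would cancel in this ratio anyway.
    if rng.random() < unnormalized_posterior(proposal) / unnormalized_posterior(theta):
        theta = proposal
    samples.append(theta)

print("posterior mean of theta:", np.mean(samples[2000:]))   # roughly 0.67, as before
```

The samples after the burn-in period behave like draws from the posterior, so expectations such as the mean can be estimated directly from them.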