# Bayesian Probability

Bayesian probability belongs to the evidential interpretations of probability. To evaluate the probability of a hypothesis, the Bayesian probabilist first specifies a prior probability, which is then updated in the light of relevant new data. This interpretation provides a set of formulae and procedures for carrying out that calculation.

Bayesian methods can be characterized by the following concepts and procedures:

1) Random variables are used to model every source of uncertainty in statistical models, including uncertainty that arises from lack of information.

2) A prior probability distribution must be specified, taking into account the information available.

3) Step-by-step application of Bayes' formula: whenever more data become available, the posterior distribution is computed using Bayes' formula, and this posterior then serves as the prior for the next update.

These rules fix the scale on which degrees of belief are measured. Degrees of belief can be measured on any scale, but they can be transformed onto the canonical probability scale, on which the rules for conjunction and negation take the form of the simple product and sum rules.
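The step-by-step updating described in rule 3 can be sketched as a short program. This is a minimal illustration, not from the text: the coin-flip data, the grid of candidate values, and the uniform prior are all assumptions chosen to keep the arithmetic visible.

```python
# Sequential Bayesian updating: estimate a coin's heads-probability
# over a discrete grid of candidate values, feeding in one observation
# at a time. After each observation, the posterior becomes the next prior.

candidates = [0.1, 0.3, 0.5, 0.7, 0.9]           # possible heads-probabilities
prior = [1 / len(candidates)] * len(candidates)  # uniform prior (rule 2)

def update(prior, heads):
    """One application of Bayes' formula (rule 3)."""
    likelihood = [p if heads else (1 - p) for p in candidates]
    unnorm = [lk * pr for lk, pr in zip(likelihood, prior)]
    total = sum(unnorm)                          # P(data), the normalizer
    return [u / total for u in unnorm]

belief = prior
for observation in [True, True, False, True]:    # heads, heads, tails, heads
    belief = update(belief, observation)

print([round(b, 3) for b in belief])
```

After three heads and one tail, the posterior mass concentrates on the higher candidate values, as expected.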
Some common applications of Bayesian probability theory are:

1) Parameter estimation, as in speed radar.

2) Hypothesis testing and pattern matching, as in disease detection or speech recognition.

3) Model building, for example by reverse engineering.

4) Inference, as in weather forecasting or courtroom reasoning.

The essence of any such application lies in answering the following questions precisely:

a) What information is available.

b) How the information is modelled mathematically.

c) Where the information comes from, how it changes, and how it is processed.

d) How conclusions are drawn from the information, and hence how decisions are made on the basis of what is available.

## Rules

Bayesian probability can be summarized by the following elementary rules:

1) Sum rule

P (M | N) + P (M’ | N) = 1

2) Product rule

P (M N | S) = P (M | S) P (N | M S)

Here, P (M | N) denotes the probability that event M occurs given that event N has already occurred.

These rules can also be written as follows:

Sum rule (for mutually exclusive propositions m and n):  P (m + n | H) = P (m | H) + P (n | H)

Product rule: P (m n | H) = P (m | n H) P (n | H)

Bayes' rule:

$P(\text{Model} \mid \text{Data}, G) = \frac{P(\text{Data} \mid \text{Model}, G)\, P(\text{Model} \mid G)}{P(\text{Data} \mid G)}$
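The sum, product, and Bayes' rules above can be verified numerically on a small joint distribution. The table of probabilities below is an illustrative assumption, chosen only to make the arithmetic concrete:

```python
# Check the sum rule, product rule, and Bayes' rule on an assumed
# joint distribution over two binary events M and N.

joint = {  # P(M, N)
    (True, True): 0.30, (True, False): 0.20,
    (False, True): 0.10, (False, False): 0.40,
}

def p(m=None, n=None):
    """Marginal or joint probability read off the table."""
    return sum(v for (mm, nn), v in joint.items()
               if (m is None or mm == m) and (n is None or nn == n))

# Sum rule: P(M | N) + P(M' | N) = 1
p_m_given_n = p(m=True, n=True) / p(n=True)
p_notm_given_n = p(m=False, n=True) / p(n=True)
assert abs(p_m_given_n + p_notm_given_n - 1) < 1e-9

# Product rule: P(M N) = P(N) P(M | N)
assert abs(p(m=True, n=True) - p(n=True) * p_m_given_n) < 1e-9

# Bayes' rule: P(N | M) = P(M | N) P(N) / P(M)
p_n_given_m = p_m_given_n * p(n=True) / p(m=True)
assert abs(p_n_given_m - p(m=True, n=True) / p(m=True)) < 1e-9
```

Because the rules are identities of probability theory, the assertions hold for any valid joint distribution substituted into the table.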

## Examples

Example 1: Two boxes contain marbles. The first contains five green and five purple marbles; the second contains three green and seven purple marbles. A box is chosen at random and a marble is drawn from it. Given that the marble drawn is purple, what is the probability that it came from the second box?

Solution:

P (green | box 1) = $\frac{5}{10}$

P (purple | box 1) = $\frac{5}{10}$

P (green | box 2) = $\frac{3}{10}$

P (purple | box 2) = $\frac{7}{10}$

P (box 1) = P (box 2) = $\frac{1}{2}$

So, by the law of total probability, P (purple) = P (purple | box 1) P (box 1) + P (purple | box 2) P (box 2)

= $\frac{5}{10} \times \frac{1}{2} + \frac{7}{10} \times \frac{1}{2}$

= $\frac{12}{20}$

= $\frac{3}{5}$

Thus, by Bayes' rule,

P (box 2 | purple) = $\frac{P(\text{purple} \mid \text{box 2})\, P(\text{box 2})}{P(\text{purple})}$

= $\frac{\frac{7}{10} \times \frac{1}{2}}{\frac{3}{5}}$

= $\frac{7}{20} \times \frac{5}{3}$

= $\frac{7}{12}$
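The calculation above can be reproduced exactly in Python using the standard library's `Fraction` type, so the intermediate values match the fractions in the worked example:

```python
# Reproduce the marble example with exact rational arithmetic.
from fractions import Fraction

p_purple_given_box1 = Fraction(5, 10)
p_purple_given_box2 = Fraction(7, 10)
p_box1 = p_box2 = Fraction(1, 2)

# Total probability of drawing a purple marble.
p_purple = (p_purple_given_box1 * p_box1
            + p_purple_given_box2 * p_box2)

# Bayes' rule: P(box 2 | purple).
p_box2_given_purple = p_purple_given_box2 * p_box2 / p_purple

print(p_purple)             # 3/5
print(p_box2_given_purple)  # 7/12
```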