Chebyshev's inequality (also spelled Tchebysheff's inequality) says that in any probability distribution with finite variance, almost all of the values lie close to the mean. More precisely: no more than $\frac{1}{n^2}$ of a distribution's values can lie more than 'n' standard deviations away from the mean.
In short, when a distribution is not normal, Chebyshev's inequality still lets us find what percentage of the data is clustered within a given number of standard deviations of the mean. The inequality applies to any distribution of numbers.


Let Y be any random variable with finite expected value $\mu$ and non-zero variance $\sigma^2$. Then for any real number n > 0:

$P(|Y - \mu| \ge n\sigma) \le \frac{1}{n^2}$

The bound is informative only when n > 1. When n < 1, the right-hand side is greater than 1, which makes the inequality vacuous: no event can have a probability greater than 1. When n = 1, the bound says the probability is at most 1, which is always true.
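The inequality above can be checked empirically. The sketch below (an illustration, not part of the original text) draws samples from an exponential distribution, a clearly non-normal choice, and compares the observed tail fraction against the $\frac{1}{n^2}$ bound:

```python
import random
import statistics

# A quick empirical check of Chebyshev's bound on a non-normal
# distribution: samples from an exponential distribution with mean 1.
# (The distribution and sample size here are illustrative assumptions.)
random.seed(0)
samples = [random.expovariate(1.0) for _ in range(100_000)]

mu = statistics.mean(samples)
sigma = statistics.pstdev(samples)

for n in (2, 3, 4):
    # Fraction of samples at least n standard deviations from the mean
    frac = sum(abs(y - mu) >= n * sigma for y in samples) / len(samples)
    print(f"n={n}: empirical {frac:.4f} <= bound {1 / n**2:.4f}")
    assert frac <= 1 / n**2
```

The empirical tail probabilities come out far below the Chebyshev bound here, which is typical: the bound holds for every distribution, so for any particular one it is usually loose.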


Chebyshev's formula bounds the chance of outliers in a given interval: the probability that the absolute value of $Y - \mu$ is at least n times $\sigma$ is at most $\frac{1}{n^2}$. One-sided versions of the inequality also exist. The two-sided version can be proved as follows.

$P(|Y - \mu| \ge n\sigma)$

$= E\left(I_{|Y - \mu| \ge n\sigma}\right)$

where $I_B$ is the indicator function of an event B: $I_B = 1$ when B occurs and $I_B = 0$ otherwise.

$= E\left(I_{\left(\frac{Y - \mu}{n\sigma}\right)^2 \ge 1}\right)$

$\le E\left(\left(\frac{Y - \mu}{n\sigma}\right)^2\right)$

$= \frac{1}{n^2} \cdot E\left(\frac{(Y - \mu)^2}{\sigma^2}\right)$

$= \frac{1}{n^2}$

This is the direct proof; the inequality can also be obtained as a special case of Markov's inequality applied to the random variable $(Y - \mu)^2$.
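The key step in the proof, that the expectation of the indicator is bounded by the expectation of the squared ratio, can be verified exactly on a small discrete distribution. The sketch below uses a fair die and works with $n^2$ directly (choosing $n^2 = 2$ as an arbitrary illustration) so every quantity stays an exact rational:

```python
from fractions import Fraction

# Check the proof's chain on a fair die: P(|Y - mu| >= n*sigma)
# = E[indicator] <= E[((Y - mu)/(n*sigma))^2] = 1/n^2.
outcomes = [Fraction(k) for k in range(1, 7)]
mu = sum(outcomes) / 6                             # 7/2
var = sum((y - mu) ** 2 for y in outcomes) / 6     # 35/12

n_sq = Fraction(2)  # n^2 = 2, i.e. n = sqrt(2); any n > 0 works
# |Y - mu| >= n*sigma  is equivalent to  (Y - mu)^2 >= n^2 * sigma^2
prob = Fraction(sum((y - mu) ** 2 >= n_sq * var for y in outcomes), 6)
expectation = sum((y - mu) ** 2 / (n_sq * var) for y in outcomes) / 6

print(prob, expectation, 1 / n_sq)   # 1/3 <= 1/2 = 1/2
```

Note that the middle term always equals $\frac{1}{n^2}$ exactly, since $E\left((Y-\mu)^2\right) = \sigma^2$; only the first inequality in the chain is ever strict.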


An example of Chebyshev's inequality is given below:

Example: Let Y be the outcome when a single fair die is rolled, so that E(Y) = 3.5 and Var(Y) = $\frac{35}{12}$. Estimate P(Y >= 6).

Solution: In this case the exact value is available directly: P(Y >= 6) = P(Y = 6) = $\frac{1}{6}$ = 0.167 approximately.
Using Markov's inequality instead, we have

P (Y >= 6) <= $\frac{E(Y)}{6} = \frac{(\frac{21}{6})}{6} = \frac{7}{12}$ = 0.583 approximately.

Using the two-sided Chebyshev inequality, we have

$P(Y \ge 6)$

$\le P(Y \ge 6 \text{ or } Y \le 1)$

$= P(|Y - 3.5| \ge 2.5)$

$\le \frac{(\frac{35}{12})}{(2.5)^2}$

$= \frac{7}{15}$

$= 0.467$ (approximately)

Using the one-sided Chebyshev inequality (also known as Cantelli's inequality), we get a stronger bound than before:

$P(Y \ge 6)$

$= P(Y \ge 3.5 + 2.5)$

$\le \frac{(\frac{35}{12})}{(\frac{35}{12}) + (2.5)^2}$

$= \frac{7}{22}$

$= 0.318$ approximately.
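The four numbers in this example can be reproduced exactly with rational arithmetic. The sketch below recomputes each bound from E(Y) and Var(Y); the variable names are just labels for this illustration:

```python
from fractions import Fraction

# Reproduce the die example's numbers exactly with rationals.
mu = Fraction(7, 2)          # E(Y) for a fair die
var = Fraction(35, 12)       # Var(Y)

exact = Fraction(1, 6)                  # P(Y >= 6) = P(Y = 6)
markov = mu / 6                         # Markov: E(Y)/6 = 7/12
a = Fraction(5, 2)                      # deviation: 6 - 3.5 = 2.5
two_sided = var / a**2                  # Chebyshev: 7/15
one_sided = var / (var + a**2)          # Cantelli (one-sided): 7/22

print(exact, markov, two_sided, one_sided)   # 1/6 7/12 7/15 7/22
```

This makes the progression of the example concrete: each bound is tighter than the last ($\frac{7}{12} > \frac{7}{15} > \frac{7}{22}$), and all of them are valid upper bounds on the exact probability $\frac{1}{6}$.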