Chebyshev's Theorem

or Tchebysheff's Theorem

Let $X$ be a random variable with finite mean $\mu$ and variance $\sigma^2 > 0$. Then for any $k > 0$:

$$P(|X-\mu| < k\sigma) \ge 1 - \frac{1}{k^2}, \quad \text{or} \quad P(|X-\mu| \ge k\sigma) \le \frac{1}{k^2}$$

The probability that sample points sit within $k$ standard deviations of the mean is squeezed toward 1 as $k$ increases.

At $k=2$, the probability for any random variable under any distribution is at least 75% (for the normal distribution, it is about 95%).
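The $k=2$ case can be checked empirically. A minimal Monte Carlo sketch (not part of the original notes; the distributions and sample size are illustrative) comparing the normal against a skewed distribution:

```python
import random
import math

# Empirically check Chebyshev's bound at k = 2 for two distributions.
random.seed(0)
n = 100_000
k = 2

def within_k_sigma(samples, k):
    """Fraction of samples within k standard deviations of the sample mean."""
    m = sum(samples) / len(samples)
    var = sum((x - m) ** 2 for x in samples) / len(samples)
    s = math.sqrt(var)
    return sum(abs(x - m) < k * s for x in samples) / len(samples)

normal = [random.gauss(0, 1) for _ in range(n)]
exponential = [random.expovariate(1.0) for _ in range(n)]

p_normal = within_k_sigma(normal, k)     # near 0.95 for the normal
p_expo = within_k_sigma(exponential, k)  # different shape, still >= 0.75

print(p_normal, p_expo)
```

Both fractions respect the distribution-free lower bound $1 - 1/k^2 = 0.75$, while the normal lands near its exact 95%.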

The theorem also applies to a new random variable $Y = g(X)$, as long as $\mu_Y$ and $\sigma_Y^2$ exist.
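As an illustration of applying the theorem to a transformed variable (a hypothetical example, not from the notes): take $X \sim \text{Uniform}(0,1)$ and $g(x) = x^2$, whose moments exist and are easy to compute exactly.

```python
import random
import math

# Chebyshev applies to Y = g(X) whenever mu_Y and sigma_Y^2 exist.
# Illustrative choice: X ~ Uniform(0,1), g(x) = x^2.
random.seed(1)
n = 100_000
k = 2

# Exact moments of Y = X^2 for X ~ Uniform(0,1):
mu_y = 1 / 3                   # E[X^2]
var_y = 1 / 5 - (1 / 3) ** 2   # E[X^4] - E[X^2]^2 = 4/45
sigma_y = math.sqrt(var_y)

ys = [random.random() ** 2 for _ in range(n)]
frac = sum(abs(y - mu_y) < k * sigma_y for y in ys) / n

print(frac)  # should be at least 1 - 1/k^2 = 0.75
```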

If the exact value of $P(X > c)$ is not needed, we can use the theorem to quickly obtain a bound instead of doing a laborious integration.

Replacing $k\sigma$ with $k$ (i.e., plugging in $k/\sigma$ in place of $k$):

$$P(|X-\mu| \ge k) \le \frac{\sigma^2}{k^2}$$
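A quick numeric sketch of using this bound in place of an integral (the distribution and cutoff are illustrative assumptions, not from the notes): for $X \sim \text{Exponential}(1)$ we have $\mu = \sigma = 1$, and the tail event $\{X \ge 4\}$ is contained in $\{|X - \mu| \ge 3\}$.

```python
import math

# Bound a tail probability via P(|X - mu| >= k) <= sigma^2 / k^2,
# instead of integrating the density.
# Illustrative setup: X ~ Exponential(1), so mu = sigma = 1.
mu, sigma = 1.0, 1.0
k = 3.0  # the event |X - mu| >= 3 contains {X >= 4}

chebyshev_bound = sigma ** 2 / k ** 2  # 1/9, no integration needed
exact_tail = math.exp(-(mu + k))       # P(X >= 4) = e^{-4}, for comparison

print(chebyshev_bound, exact_tail)
```

The Chebyshev bound ($\approx 0.111$) is far looser than the exact tail ($\approx 0.018$), but it costs one division and holds for any distribution with the same mean and variance.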

Law of large numbers

For a binomial random variable $X$ and sample proportion $Y = \frac{X}{n} = \hat{p}$:

$$P(|Y - p| < c) = P(|Y - p| < k\sigma) \ge 1 - \frac{1}{k^2} = 1 - \frac{1}{(c/\sigma)^2} = 1 - \frac{p(1-p)}{c^2 n} \xrightarrow{\;n \to \infty\;} 1$$

Since every probability is at most 1 (squeezed at 1 by the axioms):

$$\lim_{n \to \infty} P(|Y - p| < c) = 1$$

for any arbitrary positive constant $c$.
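A small simulation sketch of this convergence (the values of $p$, $c$, and the sample sizes are illustrative assumptions): repeated coin-flip samples show $P(|\hat{p} - p| < c)$ approaching 1 as $n$ grows.

```python
import random

# Sample proportion p_hat = X/n for X ~ Binomial(n, p): the bound
# 1 - p(1-p)/(c^2 n) forces P(|p_hat - p| < c) -> 1 as n grows.
random.seed(2)
p, c = 0.5, 0.05
trials = 2000

def prob_within(n):
    """Estimate P(|p_hat - p| < c) over many samples of size n."""
    hits = 0
    for _ in range(trials):
        x = sum(random.random() < p for _ in range(n))  # binomial draw
        hits += abs(x / n - p) < c
    return hits / trials

small, large = prob_within(50), prob_within(2000)
print(small, large)  # the large-n probability sits much closer to 1
```

For $n = 2000$ the Chebyshev lower bound is $1 - \frac{0.25}{(0.05)^2 \cdot 2000} = 0.95$, and the simulated probability is essentially 1.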

With the equivalent symmetric inequality and $\mathrm{Var}(\bar{x}) = \frac{\sigma^2}{n}$:

$$P(|\bar{x} - \mu| \ge k) \le \frac{\mathrm{Var}(\bar{x})}{k^2} = \frac{\sigma^2}{n k^2} \;\Longrightarrow\; P(|\bar{x} - \mu| \ge k) \to 0 \text{ as } n \to \infty$$
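The sample-mean version can be sketched the same way (the uniform population and the choices of $k$ and $n$ are illustrative assumptions): the bound $\sigma^2/(nk^2)$ and the simulated deviation probability both shrink as $n$ grows.

```python
import random

# Sample-mean version: P(|x_bar - mu| >= k) <= sigma^2 / (n k^2).
# Illustrative population: X_i ~ Uniform(0,1), mu = 0.5, sigma^2 = 1/12.
random.seed(3)
mu, var, k = 0.5, 1 / 12, 0.05
trials = 2000

def prob_deviation(n):
    """Estimate P(|x_bar - mu| >= k) for samples of size n."""
    count = 0
    for _ in range(trials):
        xbar = sum(random.random() for _ in range(n)) / n
        count += abs(xbar - mu) >= k
    return count / trials

results = {n: prob_deviation(n) for n in (10, 100, 1000)}
for n, prob in results.items():
    bound = var / (n * k ** 2)
    print(n, prob, min(bound, 1.0))  # estimate vs. Chebyshev bound
```

For $n = 10$ the bound exceeds 1 and says nothing; by $n = 1000$ it is $\frac{1/12}{1000 \cdot (0.05)^2} \approx 0.033$, and the simulated probability is near 0.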