Mathematical Expectation

Discrete case

If X is a random variable and f(x) is the probability mass function of X at x, the mean or expected value of X is

E(X) = \sum_{x} x \, P(X = x) = \sum_{x} x \, f(x)
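A minimal Python sketch of this sum, using a fair six-sided die as an assumed example distribution:

```python
# Minimal sketch: E(X) = sum over x of x * f(x) for a discrete X.
# The fair-die PMF below is an illustrative assumption, not from the notes.
pmf = {x: 1 / 6 for x in range(1, 7)}  # f(x) = P(X = x)

expected = sum(x * p for x, p in pmf.items())
print(expected)  # 3.5
```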

Joint Probability Mass Function

E[g(X, Y)] = \sum_{x} \sum_{y} g(x, y) \, f_{X,Y}(x, y)
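A sketch of the double sum, assuming a small illustrative joint PMF (two independent fair coin flips) and g(x, y) = x + y:

```python
# Sketch: E[g(X, Y)] = sum_x sum_y g(x, y) * f_{X,Y}(x, y).
# The joint PMF and g below are illustrative assumptions.
joint_pmf = {
    (0, 0): 0.25, (0, 1): 0.25,
    (1, 0): 0.25, (1, 1): 0.25,
}

def g(x, y):
    return x + y

expected = sum(g(x, y) * p for (x, y), p in joint_pmf.items())
print(expected)  # 1.0
```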

Continuous case

If X is a random variable and f(x) is the probability density of X at x, the mean or expected value of X is

E(X) = \int_{-\infty}^{\infty} x \, f(x) \, dx

If the integral is not absolutely convergent, E(X) does not exist.
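A sketch of the continuous definition using numerical integration, with an assumed Exponential(λ = 2) density, whose exact mean is 1/λ = 0.5:

```python
# Sketch: E(X) = integral of x * f(x) dx, computed with scipy's quad.
# The exponential density with rate lam = 2 is an assumed example.
from math import exp, inf
from scipy.integrate import quad

lam = 2.0

def f(x):
    return lam * exp(-lam * x)  # density supported on [0, inf)

expected, _err = quad(lambda x: x * f(x), 0, inf)
print(expected)  # ~0.5 = 1/lam
```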

Joint Probability Density Function

E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) \, f_{X,Y}(x, y) \, dx \, dy
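The same idea as a double integral, assuming X and Y independent Uniform(0, 1) and g(x, y) = xy, so the exact answer is 1/4:

```python
# Sketch: E[g(X, Y)] as a double integral via scipy's dblquad.
# The joint density (uniform on the unit square) and g are assumptions.
from scipy.integrate import dblquad

def integrand(y, x):  # dblquad integrates over y first, then x
    f_xy = 1.0        # joint density on [0, 1] x [0, 1]
    return x * y * f_xy

expected, _err = dblquad(integrand, 0, 1, 0, 1)
print(expected)  # ~0.25
```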

Properties

Expected value of a function of X

If the series converges absolutely (discrete case) or the integral is finite (continuous case):

E[g(X)] = \int_{-\infty}^{\infty} g(x) \, f(x) \, dx (continuous)

E[g(X)] = \sum_{x} g(x) \, f(x) (discrete)

Otherwise, the expectation doesn't exist

If g(X) is something simple, E[g(X)] can often be computed directly from E[X]. But when g(X) behaves differently depending on X (piecewise functions, etc.), integrating g(x)f(x) in full is necessary, possibly splitting the bounds at the breakpoints, as in the sketch below.
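A sketch of the piecewise case, assuming X ~ Uniform(0, 2) and a g that changes form at x = 1; the bounds split at the breakpoint:

```python
# Sketch: E[g(X)] with a piecewise g, splitting the integral at x = 1.
# The Uniform(0, 2) density and the choice of g are assumptions.
from scipy.integrate import quad

def f(x):
    return 0.5  # Uniform(0, 2) density

def g(x):
    return x**2 if x < 1 else x  # piecewise behaviour

lower, _ = quad(lambda x: g(x) * f(x), 0, 1)  # = 1/6
upper, _ = quad(lambda x: g(x) * f(x), 1, 2)  # = 3/4
print(lower + upper)  # ~0.9167
```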

Simplifying the base definition:

E[aX+b]=aE[X]+b
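This follows in one line from the definition, since f integrates to 1 (continuous case shown; the discrete case is identical with a sum):

E[aX + b] = \int_{-\infty}^{\infty} (ax + b) \, f(x) \, dx = a \int_{-\infty}^{\infty} x \, f(x) \, dx + b \int_{-\infty}^{\infty} f(x) \, dx = aE[X] + b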

Linear Combination

E\left[\sum_{i=1}^{n} c_i \, g_i(X)\right] = \sum_{i=1}^{n} c_i \, E[g_i(X)]
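A quick numerical sanity check of linearity, using a fair die (an assumed example) with g_1(x) = x and g_2(x) = x^2:

```python
# Sketch: E[2X + 3X^2] should equal 2E[X] + 3E[X^2] by linearity.
# The fair-die PMF is an illustrative assumption.
pmf = {x: 1 / 6 for x in range(1, 7)}

lhs = sum((2 * x + 3 * x**2) * p for x, p in pmf.items())
rhs = 2 * sum(x * p for x, p in pmf.items()) \
    + 3 * sum(x**2 * p for x, p in pmf.items())
print(lhs, rhs)  # both 52.5
```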

Expansion with the binomial theorem

E[(aX + b)^n] = \sum_{i=0}^{n} \binom{n}{i} a^{n-i} b^{i} \, E(X^{n-i})
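For example, with n = 2:

E[(aX + b)^2] = a^2 E(X^2) + 2ab \, E(X) + b^2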

Darth Vader Rule

For a nonnegative random variable X with CDF F,

E(X) = \int_{0}^{\infty} [1 - F(x)] \, dx
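A sketch checking the rule for an assumed Exponential(λ = 2), where 1 − F(x) = e^{−λx} and E(X) = 1/λ:

```python
# Sketch: Darth Vader rule, E(X) = integral_0^inf [1 - F(x)] dx.
# The exponential survival function below is an assumed example.
from math import exp, inf
from scipy.integrate import quad

lam = 2.0

def survival(x):
    return exp(-lam * x)  # 1 - F(x) for Exponential(lam)

expected, _err = quad(survival, 0, inf)
print(expected)  # ~0.5 = 1/lam
```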