Discrete case
If X is a discrete random variable and f ( x ) is the probability mass function of X at x, the mean (expected value) of X is
E ( X ) = ∑ x x ⋅ P ( X = x ) = ∑ x x ⋅ f ( x )
Joint Probability Mass Function
E [ g ( X , Y ) ] = ∑ x ∑ y g ( x , y ) ⋅ f X , Y ( x , y )
Continuous case
If X is a random variable and f ( x ) is the probability density of X at x, the mean or expected value of X is
E ( X ) = ∫ − ∞ ∞ x ⋅ f ( x ) d x
If the integral does not converge absolutely, E ( X ) does not exist.
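As a sketch, both definitions above can be checked numerically; the fair six-sided die and the Exponential(rate = 2) density used here are hypothetical examples (true means 7/2 and 1/2):

```python
import math
from fractions import Fraction

# Discrete: hypothetical fair six-sided die, f(x) = 1/6 for x = 1..6
pmf = {x: Fraction(1, 6) for x in range(1, 7)}
discrete_mean = sum(x * p for x, p in pmf.items())  # E(X) = sum of x * f(x)

# Continuous: hypothetical Exponential(rate = 2) density, f(x) = 2 e^(-2x) for x >= 0
rate = 2.0
f = lambda x: rate * math.exp(-rate * x)
dx = 1e-4
# midpoint Riemann sum for the integral of x * f(x); the tail beyond x = 20 is negligible here
continuous_mean = sum((i + 0.5) * dx * f((i + 0.5) * dx) * dx for i in range(int(20 / dx)))

print(discrete_mean, round(continuous_mean, 4))  # 7/2 and approximately 0.5
```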
Joint Probability Density Function
E [ g ( X , Y ) ] = ∫ − ∞ ∞ ∫ − ∞ ∞ g ( x , y ) ⋅ f X , Y ( x , y ) d x d y
Properties
Expected value of a function of X
If the series is absolutely convergent (discrete case) or the integral converges absolutely (continuous case):
E [ g ( X ) ] = ∫ − ∞ ∞ g ( x ) f ( x ) d x
E [ g ( X ) ] = ∑ x g ( x ) f ( x )
Otherwise, the expectation does not exist.
If g ( X ) is simple, E [ g ( X ) ] can often be computed directly from E [ X ] , but when g ( X ) behaves differently depending on the value of X (piecewise definitions, absolute values, etc.), the full sum or integral of g ( x ) f ( x ) is necessary, possibly also splitting the bounds at the breakpoints.
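For instance (a sketch with a hypothetical pmf): take X uniform on { −1, 0, 1 } and the piecewise function g ( x ) = | x |; plugging E [ X ] into g gives the wrong answer, and the full sum over the distribution is needed:

```python
from fractions import Fraction

# Hypothetical pmf: X uniform on {-1, 0, 1}
pmf = {x: Fraction(1, 3) for x in (-1, 0, 1)}

def g(x):
    return abs(x)  # piecewise: -x for x < 0, x for x >= 0

e_x = sum(x * p for x, p in pmf.items())      # E[X] = 0, so g(E[X]) = 0
e_gx = sum(g(x) * p for x, p in pmf.items())  # E[g(X)] = 2/3
print(g(e_x), e_gx)  # 0 vs 2/3: g(E[X]) != E[g(X)]
```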
Simplifying the base definition:
E [ a X + b ] = a E [ X ] + b
Linear Combination
E [ ∑ i = 1 n c i g i ( X ) ] = ∑ i = 1 n c i E [ g i ( X ) ]
Expansion with the binomial theorem
E [ ( a X + b ) n ] = ∑ i = 0 n ( n i ) a n − i b i E ( X n − i )
Darth Vader Rule
E ( X ) = ∫ 0 ∞ [ 1 − F ( x ) ] d x , valid for a non-negative random variable X with CDF F ( x )
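As a closing sketch, two of the properties above can be verified: the binomial expansion exactly, using a hypothetical fair six-sided die with n = 2, and the Darth Vader rule numerically, using a hypothetical Exponential(rate = 2) variable, whose survival function is 1 − F ( x ) = e^(−2x) and whose mean is 1/2:

```python
import math
from fractions import Fraction
from math import comb

# Binomial expansion: E[(aX + b)^n] = sum of C(n, i) a^(n-i) b^i E(X^(n-i)),
# checked for a hypothetical fair six-sided die with a = 2, b = 3, n = 2.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}
a, b, n = 2, 3, 2
moment = lambda k: sum(x**k * p for x, p in pmf.items())  # E[X^k]
direct = sum((a * x + b)**n * p for x, p in pmf.items())
expanded = sum(comb(n, i) * a**(n - i) * b**i * moment(n - i) for i in range(n + 1))

# Darth Vader rule: E(X) = integral from 0 to infinity of [1 - F(x)] dx,
# approximated by a midpoint Riemann sum (the tail beyond x = 20 is negligible).
dx = 1e-4
mean = sum(math.exp(-2.0 * (i + 0.5) * dx) * dx for i in range(int(20 / dx)))

print(direct == expanded, round(mean, 4))  # True, and approximately 0.5
```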