3100 Cheat Sheet

Discrete

F(x,y) = P(X \le x, Y \le y) = \sum_{s \le x} \sum_{t \le y} f(s,t) \quad \text{for } x, y \in \mathbb{R}

Marginals

f_X(x) = \sum_y f(x,y) \quad \text{and} \quad f_Y(y) = \sum_x f(x,y)

For n discrete random variables,

f(x_1, x_2, \ldots, x_n) = f_{X_1}(x_1) f_{X_2}(x_2) \cdots f_{X_n}(x_n) \quad \forall (x_1, x_2, \ldots, x_n)

if and only if the n random variables are independent
uniform

f(x) = \frac{1}{k}, \quad \mu = \frac{k+1}{2}, \quad \sigma^2 = \frac{k^2 - 1}{12}
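As a quick sanity check, the discrete uniform moments can be verified exactly with `fractions` for a fair die (k = 6 is an illustrative choice):

```python
from fractions import Fraction as F

k = 6  # a fair die is uniform on {1, ..., k} (illustrative choice)
pmf = {x: F(1, k) for x in range(1, k + 1)}
mu = sum(x * p for x, p in pmf.items())
var = sum((x - mu) ** 2 * p for x, p in pmf.items())

assert mu == F(k + 1, 2)        # mu = (k+1)/2 = 7/2
assert var == F(k * k - 1, 12)  # sigma^2 = (k^2-1)/12 = 35/12
```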

Bernoulli

f(x) = p^x (1-p)^{1-x}, \quad \mu = p, \quad \sigma^2 = p(1-p)

binomial

f(x) = \binom{n}{x} p^x (1-p)^{n-x}, \quad \mu = np, \quad \sigma^2 = np(1-p), \quad M_X(t) = \left[1 + p(e^t - 1)\right]^n
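A numeric sanity check of the binomial pmf, mean, variance, and mgf; n = 10, p = 0.3, and t = 0.5 are arbitrary illustration values:

```python
import math

n, p, t = 10, 0.3, 0.5  # arbitrary illustration values

pmf = [math.comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]
mean = sum(x * q for x, q in enumerate(pmf))
var = sum(x * x * q for x, q in enumerate(pmf)) - mean**2
mgf = sum(math.exp(t * x) * q for x, q in enumerate(pmf))

assert abs(sum(pmf) - 1) < 1e-12                 # pmf sums to 1
assert abs(mean - n * p) < 1e-12                 # mu = np
assert abs(var - n * p * (1 - p)) < 1e-9         # sigma^2 = np(1-p)
assert abs(mgf - (1 + p * (math.exp(t) - 1))**n) < 1e-9
```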

negative binomial (geometric when k = 1)

f(x; k, p) = \binom{x-1}{k-1} p^k (1-p)^{x-k}, \quad \mu = \frac{k}{p}, \quad \sigma^2 = \frac{k}{p}\left(\frac{1}{p} - 1\right)

hypergeometric (sampling without replacement)

f(x; n, N, M) = \frac{\binom{M}{x} \binom{N-M}{n-x}}{\binom{N}{n}}, \quad \mu = \frac{nM}{N}, \quad \sigma^2 = \frac{nM(N-M)(N-n)}{N^2(N-1)}
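The hypergeometric moments can also be checked by direct summation; drawing n = 5 from N = 20 items of which M = 8 are marked is a made-up example:

```python
import math

n, N, M = 5, 20, 8  # made-up illustration values

pmf = {x: math.comb(M, x) * math.comb(N - M, n - x) / math.comb(N, n)
       for x in range(max(0, n - (N - M)), min(n, M) + 1)}
mean = sum(x * p for x, p in pmf.items())
var = sum(x * x * p for x, p in pmf.items()) - mean**2

assert abs(sum(pmf.values()) - 1) < 1e-12
assert abs(mean - n * M / N) < 1e-12
assert abs(var - n * M * (N - M) * (N - n) / (N**2 * (N - 1))) < 1e-9
```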

poisson

f(x; \lambda) = \frac{\lambda^x e^{-\lambda}}{x!}, \quad \mu = \sigma^2 = \lambda, \quad M_X(t) = e^{\lambda(e^t - 1)}
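The equality of the Poisson mean and variance can be verified numerically by truncating the infinite support (lambda = 2.5 is an arbitrary rate):

```python
import math

lam = 2.5  # arbitrary illustration value
# Truncate the infinite support at 60 terms; the remaining tail mass is negligible.
pmf = [lam**x * math.exp(-lam) / math.factorial(x) for x in range(60)]
mean = sum(x * q for x, q in enumerate(pmf))
var = sum(x * x * q for x, q in enumerate(pmf)) - mean**2

assert abs(mean - lam) < 1e-9  # mu = lambda
assert abs(var - lam) < 1e-9   # sigma^2 = lambda
```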

Continuous
Joint Cumulative Distribution

F(x,y) = P(X \le x, Y \le y) = \int_{-\infty}^{y} \int_{-\infty}^{x} f(s,t) \, ds \, dt \quad \text{for } x, y \in (-\infty, \infty)

f(x,y) = \frac{\partial^2}{\partial x \, \partial y} F(x,y)

marginals

f_X(x) = \int_{-\infty}^{\infty} f(x,y) \, dy \quad \text{and} \quad f_Y(y) = \int_{-\infty}^{\infty} f(x,y) \, dx

For n continuous random variables X_1, X_2, \ldots, X_n with joint probability density f(x_1, x_2, \ldots, x_n) and marginal densities f_{X_i}(x_i):

f(x_1, x_2, \ldots, x_n) = f_{X_1}(x_1) f_{X_2}(x_2) \cdots f_{X_n}(x_n) \quad \forall (x_1, x_2, \ldots, x_n)

if and only if the n random variables are independent
uniform

f(x; \alpha, \beta) = \frac{1}{\beta - \alpha} \text{ for } \alpha < x < \beta, \quad \mu = \frac{\alpha + \beta}{2}, \quad \sigma^2 = \frac{(\beta - \alpha)^2}{12}

gamma function (continuous factorial)

\Gamma(\alpha) = \int_0^{\infty} y^{\alpha - 1} e^{-y} \, dy, \quad \Gamma\!\left(\tfrac{1}{2}\right) = \sqrt{\pi}
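Python's standard library exposes the gamma function directly, so its "continuous factorial" recursion and the special value at 1/2 are easy to confirm:

```python
import math

# Gamma(n) = (n-1)! on the positive integers:
assert math.isclose(math.gamma(5), math.factorial(4))      # Gamma(5) = 4! = 24
# Gamma(1/2) = sqrt(pi):
assert math.isclose(math.gamma(0.5), math.sqrt(math.pi))
# The recursion Gamma(a) = (a-1) Gamma(a-1) holds for non-integer a too:
assert math.isclose(math.gamma(3.7), 2.7 * math.gamma(2.7))
```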

Gamma Distribution

f(x; \alpha, \beta) = \frac{1}{\beta^\alpha \Gamma(\alpha)} x^{\alpha - 1} e^{-x/\beta}, \quad \mu = \alpha\beta, \quad \sigma^2 = \alpha\beta^2

The Exponential Distribution is a gamma distribution with \alpha = 1
The Chi-Squared Distribution is a gamma distribution with \alpha = \frac{v}{2} and \beta = 2, where v = degrees of freedom
beta distribution

f(x; \alpha, \beta) = \frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)} x^{\alpha - 1} (1-x)^{\beta - 1}, \quad \mu = \frac{\alpha}{\alpha + \beta}, \quad \sigma^2 = \frac{\alpha\beta}{(\alpha + \beta)^2 (\alpha + \beta + 1)}

normal distribution

f(x; \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} e^{-(x-\mu)^2 / 2\sigma^2}, \quad M_X(t) = e^{\mu t + \frac{1}{2}\sigma^2 t^2}

If X \sim \mathrm{Bin}(n, p), then the moment-generating function of the standardized variable

Z = \frac{X - np}{\sqrt{np(1-p)}} = \frac{X - \mu}{\sigma}

approaches that of the standard normal as n \to \infty

Bayes' Rule

P(A|B) = \frac{P(B|A)P(A)}{P(B)} = \frac{P(A \cap B)}{P(B)}

Conditional

f_{X|Y}(x|y) = \frac{f(x,y)}{f_Y(y)} \quad \text{(analogous to } P(A|B) = \frac{P(A \cap B)}{P(B)}\text{)}

E[g(X)|Y=y] = \sum_x g(x) f_{X|Y}(x|y) \quad \text{(discrete)}

E[g(X)|Y=y] = \int_{-\infty}^{\infty} g(x) f_{X|Y}(x|y) \, dx \quad \text{(continuous)}

\mathrm{Var}(X|Y=y) = E(X^2|Y=y) - [E(X|Y=y)]^2
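A worked discrete example: from a small made-up joint pmf, condition on Y = 1 and compute f_{X|Y}, E[X|Y=1], and Var(X|Y=1) exactly as defined above.

```python
# Made-up joint pmf on {0,1,2} x {0,1}; probabilities sum to 1.
joint = {(0, 0): 0.10, (0, 1): 0.20, (1, 0): 0.30,
         (1, 1): 0.25, (2, 0): 0.05, (2, 1): 0.10}

y = 1
fY = sum(p for (x, yy), p in joint.items() if yy == y)          # marginal f_Y(1) = 0.55
cond = {x: p / fY for (x, yy), p in joint.items() if yy == y}   # f_{X|Y}(x|1)
EX = sum(x * p for x, p in cond.items())                        # E[X|Y=1]
EX2 = sum(x * x * p for x, p in cond.items())                   # E[X^2|Y=1]
varX = EX2 - EX**2                                              # Var(X|Y=1)

assert abs(sum(cond.values()) - 1) < 1e-12  # a conditional pmf still sums to 1
```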

Expectation

E[g(X,Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x,y) f_{X,Y}(x,y) \, dx \, dy \quad \text{(continuous)}

E[g(X,Y)] = \sum_x \sum_y g(x,y) f_{X,Y}(x,y) \quad \text{(discrete)}

properties

E\left[\sum_{i=1}^{n} c_i g_i(X)\right] = \sum_{i=1}^{n} c_i E[g_i(X)]

E[(aX+b)^n] = \sum_{i=0}^{n} \binom{n}{i} a^{n-i} b^i E(X^{n-i})

E[aX+b] = aE[X] + b

\mathrm{var}(aX+b) = a^2\left[E(X^2) - [E(X)]^2\right] = a^2\sigma^2 = a^2\,\mathrm{var}(X)

Moments

\mu_r' = E(X^r) = \sum_x x^r f(x) \quad \text{(discrete)}

\mu_r' = E(X^r) = \int_{-\infty}^{\infty} x^r f(x) \, dx \quad \text{(continuous)}

Central moments

\mu_r = E[(X-\mu)^r] = \sum_x (x-\mu)^r f(x) \quad \text{(discrete)}

\mu_r = E[(X-\mu)^r] = \int_{-\infty}^{\infty} (x-\mu)^r f(x) \, dx \quad \text{(continuous)}

\mathrm{var}(X) = \sigma^2 = \mu_2 = E[(X-\mu)^2] = E(X^2) - [E(X)]^2
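The variance shortcut E(X^2) - [E(X)]^2 can be checked against the central-moment definition exactly, here on a small made-up pmf:

```python
from fractions import Fraction as F

pmf = {0: F(1, 4), 1: F(1, 2), 3: F(1, 4)}  # made-up pmf, sums to 1

mu = sum(x * p for x, p in pmf.items())                   # E(X) = 5/4
central = sum((x - mu) ** 2 * p for x, p in pmf.items())  # E[(X-mu)^2]
shortcut = sum(x * x * p for x, p in pmf.items()) - mu**2 # E(X^2) - mu^2

assert central == shortcut == F(19, 16)  # both routes agree exactly
```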

Moment-generating functions uniquely determine distributions: two random variables with the same mgf have the same distribution

M_X(t) = E(e^{tX}) = \sum_x e^{tx} f(x) \quad \text{(discrete)}

M_X(t) = E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx} f(x) \, dx \quad \text{(continuous)}

\mu_r' = E(X^r) = \left.\frac{d^r M_X(t)}{dt^r}\right|_{t=0}

1. M_{X+a}(t) = E[e^{(X+a)t}] = e^{at} M_X(t)
2. M_{bX}(t) = E[e^{bXt}] = M_X(bt)
3. M_{\frac{X+a}{b}}(t) = E\left[e^{\left(\frac{X+a}{b}\right)t}\right] = e^{\frac{a}{b}t} M_X\!\left(\frac{t}{b}\right)
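A sketch of the derivative-of-mgf rule using a Bernoulli(p) variable, whose mgf is 1 - p + p e^t; finite differences at t = 0 recover E[X] and E[X^2] (both equal p), and property 1 is checked directly. p = 0.4, a = 2, and t = 0.3 are arbitrary illustration values.

```python
import math

p = 0.4  # arbitrary Bernoulli parameter
M = lambda t: 1 - p + p * math.exp(t)  # mgf of Bernoulli(p)

h = 1e-5
first = (M(h) - M(-h)) / (2 * h)            # ~ M'(0)  = E[X]   = p
second = (M(h) - 2 * M(0) + M(-h)) / h**2   # ~ M''(0) = E[X^2] = p
assert abs(first - p) < 1e-6
assert abs(second - p) < 1e-4

# Property 1: M_{X+a}(t) = e^{at} M_X(t), with the mgf of X + a written out directly.
a, t = 2.0, 0.3
M_shift = lambda t: (1 - p) * math.exp(a * t) + p * math.exp((1 + a) * t)
assert abs(M_shift(t) - math.exp(a * t) * M(t)) < 1e-12
```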

Product of Moments

\mu_{r,s}' = E[X^r Y^s] = \sum_x \sum_y x^r y^s f_{X,Y}(x,y) \quad \text{(discrete)}

\mu_{r,s}' = E[X^r Y^s] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x^r y^s f_{X,Y}(x,y) \, dx \, dy \quad \text{(continuous)}

Chebyshev's inequality

P(|X - \mu| < k\sigma) \ge 1 - \frac{1}{k^2}, \quad \sigma \ne 0
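The bound P(|X - mu| < k sigma) >= 1 - 1/k^2 holds for any distribution; an empirical illustration on a simulated sample (the exponential distribution, seed, and sample size are arbitrary choices):

```python
import random
import statistics

random.seed(0)
xs = [random.expovariate(1.0) for _ in range(100_000)]
mu = statistics.fmean(xs)
sigma = statistics.pstdev(xs)

k = 2
frac_within = sum(abs(x - mu) < k * sigma for x in xs) / len(xs)

# Chebyshev guarantees at least 1 - 1/k^2 = 75% within 2 sigma of the mean;
# for most distributions the observed fraction is much higher.
assert frac_within >= 1 - 1 / k**2
```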

The covariance of X and Y is the product moment about the means: X and Y are directly related (tend to move together) when the covariance is positive, and inversely related when it is negative.

\mu_{1,1} = \sigma_{XY} = \mathrm{cov}(X,Y) = E\left[(X - \mu_X)(Y - \mu_Y)\right] = E(XY) - E(X)E(Y)
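Both forms of the covariance can be computed from a small made-up joint pmf and shown to agree:

```python
# Made-up joint pmf; probabilities sum to 1.
joint = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.6}

EX = sum(x * p for (x, y), p in joint.items())
EY = sum(y * p for (x, y), p in joint.items())
EXY = sum(x * y * p for (x, y), p in joint.items())

cov_def = sum((x - EX) * (y - EY) * p for (x, y), p in joint.items())  # E[(X-muX)(Y-muY)]
cov_short = EXY - EX * EY                                              # E(XY) - E(X)E(Y)

assert abs(cov_def - cov_short) < 1e-12
assert cov_short > 0  # positive covariance: X and Y directly related
```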

If X and Y are independent, then E(XY) = E(X)E(Y), so \mathrm{cov}(X,Y) = 0 (the converse does not hold in general)

If A_1, A_2, A_3, \ldots, A_n are events in a sample space such that P(A_1) \ne 0, P(A_1 \cap A_2) \ne 0, \ldots, P(A_1 \cap A_2 \cap \cdots \cap A_{n-1}) \ne 0, then P(A_1 \cap A_2 \cap \cdots \cap A_n) = P(A_1) P(A_2|A_1) P(A_3|A_1 \cap A_2) \cdots P(A_n|A_1 \cap A_2 \cap \cdots \cap A_{n-1})
If A and B are independent, then (A and B'), (A' and B), and (A' and B') are also independent
If the sample space S can be partitioned into events B_1, B_2, \ldots, B_k with P(B_i) \ne 0 for i = 1, 2, \ldots, k,
then for any event A in S, P(A) = \sum_{i=1}^{k} P(A|B_i) P(B_i) (law of total probability)
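A worked instance of the partition formula on a made-up two-urn example: a fair coin picks urn B1 or B2, and A is the event "a red ball is drawn". The urn compositions are hypothetical numbers chosen for illustration.

```python
from fractions import Fraction as F

P_B = {1: F(1, 2), 2: F(1, 2)}           # the partition: which urn the coin picks
P_A_given_B = {1: F(3, 5), 2: F(1, 4)}   # hypothetical chance of red in each urn

# P(A) = sum_i P(A|B_i) P(B_i)
P_A = sum(P_A_given_B[i] * P_B[i] for i in P_B)
assert P_A == F(17, 40)  # (1/2)(3/5) + (1/2)(1/4) = 12/40 + 5/40
```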
Independence
If X and Y are independent:

f(x,y) = P(X=x, Y=y) = P(X=x) P(Y=y) \implies f(x,y) = f_X(x) f_Y(y) \quad \forall (x,y)

F(x,y) = P(X \le x, Y \le y) = P(X \le x) P(Y \le y) = F_X(x) F_Y(y) \quad \forall (x,y)

Either the pdfs/pmfs or the cdfs can be used to check whether the variables are independent:

f(x,y) = f_X(x) f_Y(y) \iff F(x,y) = F_X(x) F_Y(y)
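The factorization test is mechanical for a discrete joint pmf: compute both marginals and compare f(x,y) with f_X(x) f_Y(y) over the support. A minimal sketch (the helper name and example tables are made up):

```python
def independent(joint):
    """Check f(x, y) == fX(x) * fY(y) for every (x, y)."""
    xs = {x for x, _ in joint}
    ys = {y for _, y in joint}
    fX = {x: sum(joint.get((x, y), 0) for y in ys) for x in xs}  # marginal of X
    fY = {y: sum(joint.get((x, y), 0) for x in xs) for y in ys}  # marginal of Y
    return all(abs(joint.get((x, y), 0) - fX[x] * fY[y]) < 1e-12
               for x in xs for y in ys)

# A product pmf factorizes, so it is independent:
assert independent({(x, y): 0.25 for x in (0, 1) for y in (0, 1)})
# A pmf with positive covariance between X and Y does not:
assert not independent({(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.6})
```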