chaos world

Useful Random Variable Distributions

0-1 Distribution

If random variable $X$ has the distribution below:

| $X$ | $0$ | $1$ |
| :---: | :---: | :---: |
| $\mathrm{P}(X)$ | $1-p$ | $p$ |

where $0<p<1$, then $X$ obeys the 0-1 distribution. The expectation of $X$ is $\mathrm{E}(X)=p$, and the variance of $X$ is $\mathrm{D}(X)=p(1-p)$.
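As a quick numerical sketch (plain Python; the value of $p$ is an arbitrary choice), the expectation and variance above can be verified by direct enumeration over the two outcomes:

```python
p = 0.3
values = [0, 1]
probs = [1 - p, p]

E = sum(x * q for x, q in zip(values, probs))             # expectation E(X)
D = sum((x - E) ** 2 * q for x, q in zip(values, probs))  # variance D(X)

assert abs(E - p) < 1e-12
assert abs(D - p * (1 - p)) < 1e-12
```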

Binomial Distribution

If random variable $X$ obeys the distribution law:

$$\mathrm{P}\{X=k\}=\mathrm{C}_n^k p^k(1-p)^{n-k},\quad k=0,1,\cdots,n,$$

where $0<p<1$ and $\mathrm{C}_n^k$ refers to the combination number $\frac{n!}{k!(n-k)!}$, then $X$ obeys the binomial distribution, written as $X \sim \mathrm{B}(n,p)$. Especially, when $n=1$, $X$ obeys the 0-1 distribution: $X \sim \mathrm{B}(1,p)$.

The expectation of $X$ is $\mathrm{E}(X)=np$, and the variance of $X$ is $\mathrm{D}(X)=np(1-p)$. You can regard $X$ as the sum of $n$ independent random variables that each obey the 0-1 distribution.
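A small check (standard-library Python; $n$ and $p$ are arbitrary choices) that the binomial law sums to 1 and matches the stated expectation and variance, using `math.comb` for $\mathrm{C}_n^k$:

```python
import math

n, p = 10, 0.4
pmf = [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

total = sum(pmf)
E = sum(k * q for k, q in enumerate(pmf))
D = sum((k - E)**2 * q for k, q in enumerate(pmf))

assert abs(total - 1) < 1e-9
assert abs(E - n * p) < 1e-9            # E(X) = np
assert abs(D - n * p * (1 - p)) < 1e-9  # D(X) = np(1-p)
```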

Geometric Distribution

If random variable $X$ obeys the distribution law:

$$\mathrm{P}\{X=k\}=q^{k-1}p,\quad k=1,2,\cdots,$$

where $0<p<1$, $q=1-p$, then $X$ follows the geometric distribution. The expectation of $X$ is

$$\mathrm{E}(X)=\frac1p,$$

and the variance of $X$ is

$$\mathrm{D}(X)=\frac q{p^2}.$$
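A numerical sketch (plain Python; $p$ and the truncation point are arbitrary choices): truncating the geometric series far enough out approximates $\mathrm{E}(X)=1/p$ and $\mathrm{D}(X)=q/p^2$ closely:

```python
p = 0.25
q = 1 - p
K = 2000  # truncation point; the ignored tail mass q**K is negligible

E = sum(k * q**(k - 1) * p for k in range(1, K + 1))
D = sum((k - E)**2 * q**(k - 1) * p for k in range(1, K + 1))

assert abs(E - 1 / p) < 1e-6     # E(X) = 1/p
assert abs(D - q / p**2) < 1e-6  # D(X) = q/p^2
```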

Hypergeometric Distribution

If random variable $X$ obeys the distribution law:

$$\mathrm{P}\{X=k\}=\frac{\mathrm{C}_M^k\,\mathrm{C}_{N-M}^{n-k}}{\mathrm{C}_N^n},\quad k=l_1,l_1+1,\cdots,l_2,$$

where $l_1=\max(0,\,n-N+M)$, $l_2=\min(M,\,n)$, then $X$ follows the hypergeometric distribution.
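A quick sanity check (standard-library Python; the population sizes are arbitrary choices) that the hypergeometric law sums to 1 over $k=l_1,\cdots,l_2$:

```python
import math

N, M, n = 20, 7, 5  # population size, marked items, draws
l1 = max(0, n - N + M)
l2 = min(M, n)

total = sum(math.comb(M, k) * math.comb(N - M, n - k) / math.comb(N, n)
            for k in range(l1, l2 + 1))

assert abs(total - 1) < 1e-12
```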

Poisson Distribution

If random variable $X$ obeys the distribution law:

$$\mathrm{P}\{X=k\}=\frac{\lambda^k}{k!}\mathrm{e}^{-\lambda},\quad k=0,1,2,\cdots,$$

where $\lambda>0$ is a constant, then $X$ obeys the Poisson distribution with parameter $\lambda$, written as $X \sim \mathrm{P}(\lambda)$. The expectation of $X$ is

$$\mathrm{E}(X)=\lambda.$$

The variance of $X$ is

$$\mathrm{D}(X)=\lambda.$$

Especially, if $X \sim \mathrm{P}(\lambda)$, $Y \sim \mathrm{P}(\lambda)$, and $X$, $Y$ are independent of each other, then $X+Y \sim \mathrm{P}(2\lambda)$.
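The additivity claim can be checked term by term (plain Python; $\lambda$ is an arbitrary choice): the convolution of two $\mathrm{P}(\lambda)$ laws equals the $\mathrm{P}(2\lambda)$ law.

```python
import math

lam = 1.5

def pois(l, k):
    # P{X = k} for X ~ P(l)
    return l**k / math.factorial(k) * math.exp(-l)

for m in range(10):
    conv = sum(pois(lam, j) * pois(lam, m - j) for j in range(m + 1))
    assert abs(conv - pois(2 * lam, m)) < 1e-12
```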

Uniform Distribution

If continuous random variable $X$ has the probability density function below:

$$f(x)=\begin{cases}\frac1{b-a}, & a \le x \le b, \\ 0, & \text{otherwise},\end{cases}$$

then we say that $X$ obeys the uniform distribution on $[a,b]$, written as $X \sim \mathrm{U}[a,b]$.

If continuous random variable $X$ has the probability density function below:

$$f(x)=\begin{cases}\frac1{b-a}, & a < x < b, \\ 0, & \text{otherwise},\end{cases}$$

then we say that $X$ obeys the uniform distribution on $(a,b)$, written as $X \sim \mathrm{U}(a,b)$.

Whether $X \sim \mathrm{U}(a,b)$ or $X \sim \mathrm{U}[a,b]$, the distribution function is

$$F(x)=\begin{cases}0, & x<a, \\ \frac{x-a}{b-a}, & a \le x < b, \\ 1, & x \ge b.\end{cases}$$

The expectation of $X$ is

$$\mathrm{E}(X)=\frac{a+b}2,$$

and the variance of $X$ is

$$\mathrm{D}(X)=\frac{(b-a)^2}{12}.$$
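A numerical sketch (plain Python; the endpoints are arbitrary choices): midpoint-rule integration of the uniform density recovers $\mathrm{E}(X)=\frac{a+b}2$ and $\mathrm{D}(X)=\frac{(b-a)^2}{12}$.

```python
a, b = 2.0, 5.0
n = 100_000
h = (b - a) / n
xs = [a + (i + 0.5) * h for i in range(n)]  # midpoints of each sub-interval
f = 1 / (b - a)                             # the constant density on [a, b]

E = sum(x * f * h for x in xs)
D = sum((x - E)**2 * f * h for x in xs)

assert abs(E - (a + b) / 2) < 1e-6
assert abs(D - (b - a)**2 / 12) < 1e-6
```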

Exponential Distribution

If continuous random variable $X$ has the probability density below:

$$f(x)=\begin{cases}\lambda\mathrm{e}^{-\lambda x}, & x>0, \\ 0, & x \le 0,\end{cases}$$

where $\lambda>0$, then $X$ follows the exponential distribution with parameter $\lambda$, written as $X \sim E(\lambda)$. So the distribution function is

$$F(x)=\begin{cases}1-\mathrm{e}^{-\lambda x}, & x>0, \\ 0, & x \le 0.\end{cases}$$

Easily we can get $\mathrm{P}\{X>t\}=\mathrm{e}^{-\lambda t}$, so the conditional probability

$$\mathrm{P}\{X>s+t \mid X>s\}=\frac{\mathrm{P}\{X>s+t\}}{\mathrm{P}\{X>s\}}=\mathrm{e}^{-\lambda t}=\mathrm{P}\{X>t\},$$

which is known as the "memoryless" property of the exponential distribution. The expectation of $X$ is

$$\mathrm{E}(X)=\frac1\lambda,$$

and the variance of $X$ is

$$\mathrm{D}(X)=\frac1{\lambda^2}.$$
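The memoryless identity is easy to check numerically (plain Python; $\lambda$, $s$, $t$ are arbitrary choices), writing the tail probability as $\mathrm{P}\{X>u\}=\mathrm{e}^{-\lambda u}$:

```python
import math

lam, s, t = 0.7, 1.3, 2.1

def tail(u):
    return math.exp(-lam * u)   # P{X > u} for X ~ E(lam)

cond = tail(s + t) / tail(s)    # P{X > s+t | X > s}
assert abs(cond - tail(t)) < 1e-12
```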

Normal Distribution

If continuous random variable $X$ has the probability density function below:

$$f(x)=\frac1{\sqrt{2\pi}\,\sigma}\mathrm{e}^{-\frac{(x-\mu)^2}{2\sigma^2}},\quad -\infty<x<+\infty,$$

then we say that $X$ obeys the normal distribution, written as $X \sim \mathrm{N}(\mu,\sigma^2)$, where its distribution function is

$$F(x)=\frac1{\sqrt{2\pi}\,\sigma}\int_{-\infty}^x \mathrm{e}^{-\frac{(t-\mu)^2}{2\sigma^2}}\mathrm{d}t.$$

Especially when $\mu=0$ and $\sigma=1$, that is $X \sim \mathrm{N}(0,1)$, $X$ obeys the standard normal distribution. We often use $\varphi(x)=\frac1{\sqrt{2\pi}}\mathrm{e}^{-\frac{x^2}2}$ to represent its probability density, and $\Phi(x)=\frac1{\sqrt{2\pi}}\int_{-\infty}^x \mathrm{e}^{-\frac{t^2}2}\mathrm{d}t$ to denote its distribution function. It can be shown that if $Y \sim \mathrm{N}(\mu,\sigma^2)$, then $Y$'s distribution function is $\Phi(\frac{x-\mu}{\sigma})$. The density is symmetric about $x=\mu$; in particular the standard normal density is even, so the conclusion below is immediate:

$$\Phi(-x)=1-\Phi(x).$$

Finally, the expectation and variance of $X \sim \mathrm{N}(\mu,\sigma^2)$ are

$$\mathrm{E}(X)=\mu,\qquad \mathrm{D}(X)=\sigma^2.$$
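$\Phi$ has no closed form, but it can be written with the standard error function via $\Phi(x)=\frac12\left(1+\operatorname{erf}\left(\frac x{\sqrt2}\right)\right)$. A sketch (standard-library Python; the test points and parameters are arbitrary choices) checking the symmetry identity and the standardization $\Phi(\frac{x-\mu}\sigma)$:

```python
import math

def Phi(x):
    # standard normal distribution function via math.erf
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

for x in (-2.0, -0.5, 0.0, 1.3):
    assert abs(Phi(-x) - (1 - Phi(x))) < 1e-12  # Phi(-x) = 1 - Phi(x)

mu, sigma, x = 3.0, 2.0, 4.5
F = Phi((x - mu) / sigma)  # distribution function of N(mu, sigma^2) at x
assert 0 < F < 1
```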

2-Dimensional Normal Distribution

If the 2-dimensional continuous random variable $(X,Y)$ has a probability density function like:

$$f(x,y)=\frac1{2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}}\exp\left\{-\frac1{2(1-\rho^2)}\left[\frac{(x-\mu_1)^2}{\sigma_1^2}-\frac{2\rho(x-\mu_1)(y-\mu_2)}{\sigma_1\sigma_2}+\frac{(y-\mu_2)^2}{\sigma_2^2}\right]\right\},$$

where $\mu_1,\mu_2,\sigma_1>0,\sigma_2>0,-1<\rho<1$ are all constants, then $(X,Y)$ follows the 2-dimensional normal distribution with parameters $\mu_1,\mu_2,\sigma_1,\sigma_2,\rho$, written as $(X,Y) \sim \mathrm{N}(\mu_1,\mu_2;\sigma_1^2,\sigma_2^2;\rho)$. A sufficient and necessary condition for $X$ and $Y$ to be independent of each other is $\rho=0$. There is a significant property of the 2-dimensional normal distribution: the marginals are themselves normal,

$$X \sim \mathrm{N}(\mu_1,\sigma_1^2),\qquad Y \sim \mathrm{N}(\mu_2,\sigma_2^2),$$

where $(X,Y) \sim \mathrm{N}(\mu_1,\mu_2;\sigma_1^2,\sigma_2^2;\rho)$.
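A numerical sketch of the marginal property (plain Python; all parameter values are arbitrary choices): integrating the 2-dimensional density over $y$ by the midpoint rule recovers the $\mathrm{N}(\mu_1,\sigma_1^2)$ density at a chosen $x$.

```python
import math

mu1, mu2, s1, s2, rho = 0.0, 1.0, 1.0, 2.0, 0.6

def f(x, y):
    # 2-D normal density with the parameters above
    z = ((x - mu1)**2 / s1**2
         - 2 * rho * (x - mu1) * (y - mu2) / (s1 * s2)
         + (y - mu2)**2 / s2**2)
    return math.exp(-z / (2 * (1 - rho**2))) / (
        2 * math.pi * s1 * s2 * math.sqrt(1 - rho**2))

x = 0.5
lo, hi, n = mu2 - 12 * s2, mu2 + 12 * s2, 20_000  # wide enough y-range
h = (hi - lo) / n
marginal = sum(f(x, lo + (i + 0.5) * h) * h for i in range(n))

# density of N(mu1, s1^2) at x
expected = math.exp(-(x - mu1)**2 / (2 * s1**2)) / (math.sqrt(2 * math.pi) * s1)
assert abs(marginal - expected) < 1e-5
```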

$\chi^2$ Distribution

Assume $X_1, X_2, \cdots, X_n$ are independent and each obeys the standard normal distribution. Define

$$\chi^2=X_1^2+X_2^2+\cdots+X_n^2,$$

then $\chi^2$ obeys the $\chi^2$ distribution with $n$ degrees of freedom (DOF), written as $\chi^2 \sim \chi^2(n)$. The expectation of $\chi^2(n)$ is

$$\mathrm{E}(\chi^2)=n,$$

and the variance of $\chi^2(n)$ is

$$\mathrm{D}(\chi^2)=2n.$$

Assume $\chi^2 \sim \chi^2(n)$. For given $0<\alpha<1$, the point $\chi_\alpha^2(n)$ that satisfies the condition

$$\mathrm{P}\{\chi^2>\chi_\alpha^2(n)\}=\alpha$$

is the upper $\alpha$ quantile of the $\chi^2(n)$ distribution. If $\chi_1^2 \sim \chi^2(n_1)$ and $\chi_2^2 \sim \chi^2(n_2)$, and in addition $\chi_1^2$ and $\chi_2^2$ are independent, then $\chi_1^2+\chi_2^2 \sim \chi^2(n_1+n_2)$.
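A Monte Carlo sketch (standard-library Python; $n$, the seed, and the sample size are arbitrary choices): sums of $n$ squared standard normals should have sample mean close to $n$ and sample variance close to $2n$.

```python
import random

random.seed(0)
n, trials = 5, 200_000
samples = [sum(random.gauss(0, 1)**2 for _ in range(n)) for _ in range(trials)]

mean = sum(samples) / trials
var = sum((s - mean)**2 for s in samples) / trials

assert abs(mean - n) < 0.1     # E = n
assert abs(var - 2 * n) < 0.3  # D = 2n
```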

$t$ Distribution

If random variables $X$ and $Y$ are independent of each other, with $X \sim \mathrm{N}(0,1)$ and $Y \sim \chi^2(n)$, define the random variable

$$T=\frac{X}{\sqrt{Y/n}},$$

then $T$ obeys the $t$ distribution with DOF $n$, denoted as $T \sim t(n)$. The probability density function of the $t$ distribution is even, and when $n \to \infty$, $t(n)$ is close to $\mathrm{N}(0,1)$. Assume $T \sim t(n)$. For given $0<\alpha<1$, the point $t_\alpha(n)$ that satisfies the condition

$$\mathrm{P}\{T>t_\alpha(n)\}=\alpha$$

is the upper $\alpha$ quantile of the $t(n)$ distribution. Obviously $t_\alpha(n)=-t_{1-\alpha}(n)$.
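A sketch of the evenness claim (standard-library Python; the density formula below is the textbook $t(n)$ density, not stated in the text, and $n$ is an arbitrary choice): the density is even and integrates to approximately 1.

```python
import math

def t_pdf(t, n):
    # t(n) density: Gamma((n+1)/2) / (sqrt(n*pi) * Gamma(n/2)) * (1 + t^2/n)^(-(n+1)/2)
    c = math.gamma((n + 1) / 2) / (math.sqrt(n * math.pi) * math.gamma(n / 2))
    return c * (1 + t * t / n) ** (-(n + 1) / 2)

n = 7
assert abs(t_pdf(1.8, n) - t_pdf(-1.8, n)) < 1e-15  # even density

lo, hi, m = -60.0, 60.0, 200_000  # wide midpoint-rule grid
h = (hi - lo) / m
total = sum(t_pdf(lo + (i + 0.5) * h, n) * h for i in range(m))
assert abs(total - 1) < 1e-4
```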

$F$ Distribution

If random variables $X$ and $Y$ are independent of each other, with $X \sim \chi^2(n_1)$ and $Y \sim \chi^2(n_2)$, define the random variable

$$F=\frac{X/n_1}{Y/n_2},$$

then $F$ obeys the $F$ distribution with DOF $(n_1,n_2)$, denoted as $F \sim F(n_1,n_2)$, where $n_1$ and $n_2$ are the first and second DOF. Assume $F \sim F(n_1,n_2)$. For given $0<\alpha<1$, the point $F_\alpha(n_1,n_2)$ that satisfies the condition

$$\mathrm{P}\{F>F_\alpha(n_1,n_2)\}=\alpha$$

is the upper $\alpha$ quantile of the $F(n_1,n_2)$ distribution. Obviously $\frac1F \sim F(n_2,n_1)$ and

$$F_{1-\alpha}(n_1,n_2)=\frac1{F_\alpha(n_2,n_1)}.$$
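A Monte Carlo sketch of the reciprocal property (standard-library Python; the DOF, seed, and sample size are arbitrary choices): the empirical median of $1/F$ for $F \sim F(n_1,n_2)$ matches the empirical median of $F(n_2,n_1)$ samples.

```python
import random

random.seed(1)

def chi2(n):
    # one draw from chi^2(n) as a sum of n squared standard normals
    return sum(random.gauss(0, 1)**2 for _ in range(n))

n1, n2, trials = 6, 8, 50_000
f12 = sorted((chi2(n1) / n1) / (chi2(n2) / n2) for _ in range(trials))
f21 = sorted((chi2(n2) / n2) / (chi2(n1) / n1) for _ in range(trials))
inv = sorted(1 / x for x in f12)

# medians of 1/F and of F(n2, n1) samples should agree closely
assert abs(inv[trials // 2] - f21[trials // 2]) < 0.05
```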
Sampling Distribution of Normal Variable

If $X \sim \mathrm{N}(\mu,\sigma^2)$ and $X_1, X_2, \cdots, X_n$ are samples from $X$, the sample mean is $\bar X=\frac1n\sum_{i=1}^nX_i$ and the sample variance is $S^2=\frac1{n-1}\sum_{i=1}^n(X_i-\bar X)^2$, then we have the following conclusions:

$$\bar X \sim \mathrm{N}\left(\mu,\frac{\sigma^2}n\right),\qquad \frac{\bar X-\mu}{\sigma/\sqrt n} \sim \mathrm{N}(0,1),$$

$$\frac{(n-1)S^2}{\sigma^2} \sim \chi^2(n-1),\qquad \frac{\bar X-\mu}{S/\sqrt n} \sim t(n-1);$$

moreover, $\bar X$ and $S^2$ are independent of each other.
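A Monte Carlo sketch (standard-library Python; all parameter values and the seed are arbitrary choices) of the first conclusion: over many samples, $\bar X$ has mean about $\mu$ and variance about $\sigma^2/n$, and $S^2$ has mean about $\sigma^2$.

```python
import random

random.seed(2)
mu, sigma, n, trials = 1.0, 2.0, 10, 50_000

means, s2s = [], []
for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    m = sum(xs) / n
    means.append(m)
    s2s.append(sum((x - m)**2 for x in xs) / (n - 1))  # sample variance S^2

mean_of_means = sum(means) / trials
var_of_means = sum((m - mean_of_means)**2 for m in means) / trials
mean_s2 = sum(s2s) / trials

assert abs(mean_of_means - mu) < 0.02           # E(X bar) = mu
assert abs(var_of_means - sigma**2 / n) < 0.02  # D(X bar) = sigma^2 / n
assert abs(mean_s2 - sigma**2) < 0.05           # E(S^2) = sigma^2
```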