Continuous Probability Distributions

This post has two sections: (A) Moment-Generating Functions, and (B) Continuous Random Variables.

(A) Moment-Generating Functions.

Random variables whose ranges are not composed of a countable number of points, but instead are intervals or unions of intervals, are said to be of the continuous type. The probability density function (p.d.f.) of a continuous random variable \(X\) with range \(R\) (an interval or a union of intervals) is an integrable function \(f(x)\) satisfying the following conditions.

  • We have \(f(x)>0\) for \(x\in R\).
  • We have \[\int_{R} f(x)dx=1.\]
  • The probability of the event \(\{X\in A\}\) is given by \[P(X\in A)=\int_{A} f(x)dx.\]

Example. Let the random variable \(X\) be the distance in feet between bad records on a used computer tape. Suppose that a reasonable probability model for \(X\) is given by the p.d.f.

\[f(x)=\frac{1}{40}e^{-x/40}\]

for \(0\leq x<\infty\). We first note that the range \(R=\{x:0\leq x<\infty\}\) and \(f(x)>0\) for \(x\in R\). We also have

\begin{align*} \int_{R} f(x)dx & =\int_{0}^{\infty}\frac{1}{40}e^{-x/40}dx\\ & =\lim_{b\rightarrow\infty}\left [-e^{-x/40}\right ]_{0}^{b}\\ & =1-\lim_{b\rightarrow\infty} e^{-b/40}=1.\end{align*}

The probability that the distance between bad records is greater than \(40\) feet is given by

\begin{align*} P(X>40) & =\int_{40}^{\infty} \frac{1}{40}e^{-x/40}dx\\ & =e^{-1}.\end{align*}
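
As a quick numerical sanity check of this example (a minimal sketch using SciPy's `quad`; the printed values are approximations):

```python
# Numerical check for the tape example: the p.d.f.
# f(x) = (1/40) e^{-x/40} integrates to 1, and P(X > 40) = e^{-1}.
import numpy as np
from scipy.integrate import quad

f = lambda x: np.exp(-x / 40) / 40

total, _ = quad(f, 0, np.inf)   # integral of f over [0, infinity)
tail, _ = quad(f, 40, np.inf)   # P(X > 40)

print(total)                    # 1.0
print(tail, np.exp(-1))         # 0.36787... 0.36787...
```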

The distribution function of a continuous random variable \(X\), defined in terms of the p.d.f. of \(X\), is given by

\begin{align*} F(x) & =P(X\leq x)\\ & =\int_{-\infty}^{x} f(t)dt.\end{align*}

The distribution function \(F(x)\) cumulates all of the probability less than or equal to \(x\). Using the fundamental theorem of calculus, we have \(F'(x)=f(x)\) wherever \(f\) is continuous.
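
To illustrate \(F'(x)=f(x)\) on the tape example above, here is a short symbolic check (a sketch using SymPy):

```python
# Symbolic check of F'(x) = f(x) for the tape example,
# where f(x) = (1/40) e^{-x/40} on [0, infinity).
import sympy as sp

x, u = sp.symbols("x u", positive=True)
f = sp.exp(-u / 40) / 40
F = sp.integrate(f, (u, 0, x))                     # F(x) = 1 - e^{-x/40}

print(sp.simplify(sp.diff(F, x) - f.subs(u, x)))   # 0
```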

Since there are no jumps in a distribution function \(F(x)\) of the continuous type, it must be true that \(P(X=b)=0\) for all real values of \(b\). This agrees with the fact that the integral \(\int_{b}^{b} f(x)dx=0\) in calculus. Therefore, we have

\begin{align*} P(a\leq X\leq b) & =P(a<X<b)\\ & =P(a<X\leq b)\\ & =P(a\leq X<b)\\ & =F(b)-F(a)\end{align*}

provided that \(X\) is a continuous random variable.

Example. Let \(X\) be a continuous random variable with p.d.f. \(f(x)=2x\) for \(0<x<1\). The distribution function of \(X\) is given by

\[F(x)=\left\{\begin{array}{ll}
0, & x<0\\
{\displaystyle \int_{0}^{x} 2tdt=x^{2}}, & 0\leq x<1\\
1, & 1\leq x.
\end{array}\right .\]

We also have

\begin{align*} & P\left (\frac{1}{2}<X\leq\frac{3}{4}\right )\\ & \quad =F(3/4)-F(1/2)\\ & \quad =(3/4)^{2}-(1/2)^{2}\\ & \quad =\frac{5}{16}\end{align*}

and

\begin{align*} P\left (\frac{1}{4}\leq X<2\right ) & =F(2)-F(1/4)\\ & =1-(1/4)^{2}\\ & =\frac{15}{16}.\end{align*}
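
Both probabilities are easy to confirm by coding the distribution function directly (a minimal sketch):

```python
# Check P(1/2 < X <= 3/4) = 5/16 and P(1/4 <= X < 2) = 15/16
# using the distribution function F derived above.
def F(x):
    if x < 0:
        return 0.0
    if x < 1:
        return x ** 2
    return 1.0

print(F(3/4) - F(1/2), 5/16)   # 0.3125 0.3125
print(F(2) - F(1/4), 15/16)    # 0.9375 0.9375
```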

The expected value or mean of a continuous random variable \(X\) is defined by

\[\mu =\mathbb{E}(X)=\int_{-\infty}^{\infty} xf(x)dx,\]

and the variance of \(X\) is defined by

\begin{align*} \sigma^{2} & =Var(X)\\ & =\mathbb{E}[(X-\mu )^{2}]\\ & =\int_{-\infty}^{\infty} (x-\mu )^{2}f(x)dx.\end{align*}

The standard deviation of \(X\) is defined by \(\sigma =\sqrt{Var(X)}\).
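
Applied to the p.d.f. \(f(x)=2x\) from the previous example, these definitions give \(\mu =2/3\) and \(\sigma^{2}=1/18\); a quick numerical confirmation (a sketch, again using SciPy's `quad`):

```python
# Mean, variance, and standard deviation of X with p.d.f.
# f(x) = 2x on (0, 1): mu = 2/3 and sigma^2 = 1/18.
import math
from scipy.integrate import quad

f = lambda x: 2 * x
mu, _ = quad(lambda x: x * f(x), 0, 1)
var, _ = quad(lambda x: (x - mu) ** 2 * f(x), 0, 1)

print(mu, 2 / 3)        # 0.6666... 0.6666...
print(var, 1 / 18)      # 0.0555... 0.0555...
print(math.sqrt(var))   # standard deviation
```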

The moment-generating function of \(X\), if it exists, is defined by

\begin{align*} M(t) & =\mathbb{E}(e^{tX})\\ & =\int_{-\infty}^{\infty} e^{tx}f(x)dx\end{align*}

for \(-h<t<h\) for some \(h>0\). Moreover, we also have \(\sigma^{2}=\mathbb{E}(X^{2})-\mu^{2}\), \(\mu =M'(0)\), and \(\sigma^{2}=M''(0)-[M'(0)]^{2}\).

Example. Let \(X\) have the p.d.f.

\[f(x)=\left\{\begin{array}{ll}
xe^{-x}, & 0\leq x<\infty,\\
0, & \mbox{elsewhere}.
\end{array}\right .\]

Then, we have

\begin{align*} M(t) & =\int_{0}^{\infty} e^{tx}xe^{-x}dx=\lim_{b\rightarrow\infty}\int_{0}^{b} xe^{-(1-t)x}dx\\
& =\lim_{b\rightarrow\infty}\left [-\frac{xe^{-(1-t)x}}{1-t}-\frac{e^{-(1-t)x}}{(1-t)^{2}}\right ]_{0}^{b}\\ & = \lim_{b\rightarrow\infty} \left [-\frac{be^{-(1-t)b}}{1-t}-\frac{e^{-(1-t)b}}{(1-t)^{2}}\right ]+\frac{1}{(1-t)^{2}}\\ & =\frac{1}{(1-t)^{2}}\end{align*}

provided that \(t<1\). Now, we have

\[M'(t)=\frac{2}{(1-t)^{3}}\]

and

\[M''(t)=\frac{6}{(1-t)^{4}}.\]

Therefore, we obtain

\[\mu =M'(0)=2\]

and

\[\sigma^{2}=M''(0)-[M'(0)]^{2}=6-2^{2}=2.\]
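
The whole computation can be double-checked symbolically (a sketch with SymPy; `conds="none"` drops the convergence conditions, which here amount to \(t<1\)):

```python
# Verify M(t) = 1/(1-t)^2 for f(x) = x e^{-x}, then read off
# mu = M'(0) = 2 and sigma^2 = M''(0) - [M'(0)]^2 = 2.
import sympy as sp

x = sp.symbols("x", positive=True)
t = sp.symbols("t")
M = sp.integrate(sp.exp(t * x) * x * sp.exp(-x), (x, 0, sp.oo),
                 conds="none")                # valid for t < 1

mu = sp.diff(M, t).subs(t, 0)
var = sp.diff(M, t, 2).subs(t, 0) - mu ** 2

print(sp.simplify(M - 1 / (1 - t) ** 2))      # 0
print(mu, var)                                # 2 2
```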

(B) Continuous Random Variables.

The Uniform Distribution.

Let the random variable \(X\) denote the outcome when a point is selected at random from an interval \([a,b]\), \(-\infty <a<b<\infty\). If the experiment is performed in a fair manner, it is reasonable to assume that the probability that the point is selected from the interval \([a,x]\) for \(a\leq x<b\) is \((x-a)/(b-a)\). In other words, the probability is proportional to the length of the interval so that the distribution function of \(X\) is given by

\[F(x)=\left\{\begin{array}{ll}
0, & x<a,\\
{\displaystyle \frac{x-a}{b-a},} & a\leq x<b,\\
1, & b\leq x.
\end{array}\right .\]

Since \(X\) is a continuous random variable, the derivative \(F'(x)\) is equal to the p.d.f. of \(X\) whenever \(F'(x)\) exists. Therefore, for \(a<x<b\), we have

\[f(x)=F'(x)=\frac{1}{b-a}.\]

The random variable \(X\) has a uniform distribution when its p.d.f. is equal to a constant on its support. In particular, if the support is the interval \([a,b]\), then

\[f(x)=\frac{1}{b-a}\]

for \(a\leq x\leq b\). We also say that \(X\) is \(U(a,b)\). The mean and variance are given by

\[\mu =\frac{a+b}{2}\mbox{ and }\sigma^{2}=\frac{(b-a)^{2}}{12}.\]

The moment-generating function is given by

\[M(t)=\left\{\begin{array}{ll}
{\displaystyle \frac{e^{tb}-e^{ta}}{t(b-a)},} & t\neq 0,\\
1, & t=0.
\end{array}\right .\]
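
Expanding \(M(t)\) in a Taylor series around \(t=0\) gives the moments directly, since the coefficient of \(t^{n}\) is \(\mathbb{E}(X^{n})/n!\); a short symbolic sketch:

```python
# Recover the U(a, b) mean and variance from the Taylor expansion
# of M(t) = (e^{tb} - e^{ta}) / (t(b - a)) around t = 0.
import sympy as sp

a, b, t = sp.symbols("a b t")
M = (sp.exp(t * b) - sp.exp(t * a)) / (t * (b - a))

s = sp.series(M, t, 0, 3).removeO()
EX = s.coeff(t, 1)        # E(X)   = coefficient of t
EX2 = 2 * s.coeff(t, 2)   # E(X^2) = 2! * coefficient of t^2

print(sp.simplify(EX))                           # (a + b)/2
print(sp.factor(sp.simplify(EX2 - EX ** 2)))     # (a - b)**2/12
```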

Exponential Distribution.

We say that the random variable \(X\) has an exponential distribution when its p.d.f. is defined by

\[f(x)=\frac{1}{\theta}e^{-x/\theta}\]

for \(0\leq x<\infty\), where the parameter \(\theta >0\). The moment-generating function is given by

\begin{align*} M(t) & =\int_{0}^{\infty} e^{tx}\cdot\frac{1}{\theta}e^{-x/\theta}dx\\ & =\lim_{b\rightarrow\infty}\int_{0}^{b}\frac{1}{\theta}e^{-(1-\theta t)x/\theta}dx\\ & =\lim_{b\rightarrow\infty}\left [-\frac{e^{-(1-\theta t)x/\theta}}{1-\theta t}\right ]_{0}^{b}\\ & =\frac{1}{1-\theta t}\end{align*}

for \(t<1/\theta\). Therefore, we obtain

\[M'(t)=\frac{\theta}{(1-\theta t)^{2}}\]

and

\[M''(t)=\frac{2\theta^{2}}{(1-\theta t)^{3}}.\]

We also have

\[\mu =M'(0)=\theta\]

and

\[\sigma^{2}=M''(0)-[M'(0)]^{2}=2\theta^{2}-\theta^{2}=\theta^{2}.\]

Let \(X\) have an exponential distribution with mean \(\mu =\theta\). Then, the distribution function of \(X\) is given by

\[F(x)=\left\{\begin{array}{ll}
0, & -\infty <x<0,\\
1-e^{-x/\theta}, & 0\leq x<\infty .
\end{array}\right .\]

Example. Suppose that the life of a certain type of electronic component has an exponential distribution with a mean life of 500 hours. Let \(X\) denote the life of this component (or the time to failure of this component). Then, we have

\begin{align*} P(X>x) & =\int_{x}^{\infty} \frac{1}{500}e^{-t/500}dt\\ & =e^{-x/500}.\end{align*}

Suppose that the component has been in operation for 300 hours. The conditional probability that it will last for another 600 hours is given by

\begin{align*} P(X>900|X>300) & =\frac{P(X>900)}{P(X>300)}\\ & =\frac{e^{-900/500}}{e^{-300/500}}\\ & =e^{-6/5}.\end{align*}
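
Note that this equals \(P(X>600)=e^{-600/500}\): the exponential distribution is memoryless. A quick numerical check (a sketch using `scipy.stats.expon`, whose `scale` argument plays the role of \(\theta\)):

```python
# Memorylessness of the exponential distribution with theta = 500:
# P(X > 900 | X > 300) equals P(X > 600) = e^{-6/5}.
import numpy as np
from scipy.stats import expon

X = expon(scale=500)           # scale corresponds to theta

print(X.sf(900) / X.sf(300))   # sf(x) = P(X > x); 0.30119...
print(X.sf(600))               # 0.30119...
print(np.exp(-6 / 5))          # 0.30119...
```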

Gamma Distribution.

The gamma function is defined by

\[\Gamma (t)=\int_{0}^{\infty} y^{t-1}e^{-y}dy\]

for \(t>0\). This integral is positive, since the integrand is positive. For \(t>1\), integration by parts yields

\begin{align*} \Gamma (t) & =\left [-y^{t-1}e^{-y}\right ]_{0}^{\infty}+\int_{0}^{\infty}(t-1)y^{t-2}e^{-y}dy\\ & =(t-1)\int_{0}^{\infty}y^{t-2}e^{-y}dy\\ & =(t-1)\Gamma (t-1).\end{align*}

Whenever \(t=n\) is taken to be a positive integer, we have

\begin{align*} \Gamma (n) & =(n-1)\Gamma (n-1)\\ & =\cdots =(n-1)!\Gamma (1).\end{align*}

Since

\[\Gamma (1)=\int_{0}^{\infty} e^{-y}dy=1,\]

it follows that \(\Gamma (n)=(n-1)!\). The random variable \(X\) has a gamma distribution when its p.d.f. is defined by

\[f(x)=\frac{1}{\Gamma (\alpha )\theta^{\alpha}}x^{\alpha -1}e^{-x/\theta}\]

for \(0\leq x<\infty\), where the parameters \(\alpha >0\) and \(\theta >0\). It can be shown that the moment-generating function of \(X\) is given by

\[M(t)=\frac{1}{(1-\theta t)^{\alpha}}\]

for \(t<1/\theta\). The mean and variance are given by

\[\mu =\alpha\theta\mbox{ and }\sigma^{2}=\alpha\theta^{2}.\]
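
These formulas agree with a standard numerical library (a sketch using `scipy.stats.gamma`, whose shape argument `a` is \(\alpha\) and whose `scale` is \(\theta\); the values \(\alpha =3\), \(\theta =2\) are only illustrative):

```python
# Check mu = alpha*theta and sigma^2 = alpha*theta^2 for the gamma
# distribution, e.g. with alpha = 3 and theta = 2.
from scipy.stats import gamma

alpha, theta = 3, 2
X = gamma(a=alpha, scale=theta)   # SciPy: a = shape, scale = theta

print(X.mean(), alpha * theta)        # 6.0 6
print(X.var(), alpha * theta ** 2)    # 12.0 12
```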

Chi-Square Distribution.

We now consider a special case of the gamma distribution that plays an important role in statistics. Let \(X\) have a gamma distribution with \(\theta =2\) and \(\alpha =r/2\), where \(r\) is a positive integer. The p.d.f. of \(X\) is

\[f(x)=\frac{1}{\Gamma (r/2)2^{r/2}}x^{r/2-1}e^{-x/2}\]

for \(0\leq x<\infty\). We say that \(X\) has a chi-square distribution with \(r\) degrees of freedom, which we abbreviate by saying \(X\) is \(\chi^{2}(r)\). The moment-generating function is given by

\[M(t)=(1-2t)^{-r/2}\]

for \(t<1/2\). The mean and variance are given by

\[\mu =\alpha\theta =r\mbox{ and }\sigma^{2}=\alpha\theta^{2}=2r.\]
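
Since \(\chi^{2}(r)\) is just the gamma distribution with \(\alpha =r/2\) and \(\theta =2\), the two parametrizations must agree; a short sketch (with the illustrative choice \(r=5\)):

```python
# chi^2(r) is the gamma distribution with alpha = r/2 and theta = 2,
# so its mean is r and its variance is 2r.
from scipy.stats import chi2, gamma

r = 5
print(chi2(r).mean(), r)      # 5.0 5
print(chi2(r).var(), 2 * r)   # 10.0 10
print(chi2(r).cdf(3.0), gamma(a=r / 2, scale=2).cdf(3.0))  # equal
```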

Normal Distribution.

The random variable \(X\) has a normal distribution when its p.d.f. is defined by

\[f(x)=\frac{1}{\sigma\sqrt{2\pi}}\exp\left [-\frac{(x-\mu )^{2}}{2\sigma^{2}}\right ]\]

for \(-\infty <x<\infty\), where \(\mu\) and \(\sigma\) are parameters satisfying \(-\infty <\mu <\infty\), \(0<\sigma <\infty\). We also say that \(X\) is \(N(\mu ,\sigma^{2})\). It can be shown that the moment-generating function is given by

\[M(t)=\exp\left (\mu t+\frac{\sigma^{2}t^{2}}{2}\right ).\]

Then, we have

\[M'(t)=(\mu +\sigma^{2}t)\exp\left (\mu t+\frac{\sigma^{2}t^{2}}{2}\right )\]

and

\[M”(t)=[(\mu +\sigma^{2}t)^{2}+\sigma^{2}]\exp\left (\mu t+\frac{\sigma^{2}t^{2}}{2}\right ).\]

Therefore, we obtain

\[E(X)=M'(0)=\mu\]

and

\[Var(X)=M”(0)-[M'(0)]^{2}=\sigma^{2}.\]
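
A symbolic confirmation of these two derivative computations (a sketch with SymPy):

```python
# Differentiate the normal MGF M(t) = exp(mu t + sigma^2 t^2 / 2) and
# confirm M'(0) = mu and M''(0) - [M'(0)]^2 = sigma^2.
import sympy as sp

mu, sigma, t = sp.symbols("mu sigma t", positive=True)
M = sp.exp(mu * t + sigma ** 2 * t ** 2 / 2)

M1 = sp.diff(M, t).subs(t, 0)          # mu
M2 = sp.diff(M, t, 2).subs(t, 0)       # mu**2 + sigma**2
print(M1, sp.simplify(M2 - M1 ** 2))   # mu sigma**2
```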

The parameters \(\mu\) and \(\sigma^{2}\) in the p.d.f. of \(X\) are the mean and the variance of \(X\). When \(Z\) is \(N(0,1)\), we shall say that \(Z\) has a standard normal distribution. Moreover, the distribution function of \(Z\) is given by

\begin{align*} \Phi (z) & =P(Z\leq z)\\ & =\int_{-\infty}^{z}\frac{1}{\sqrt{2\pi}}e^{-w^{2}/2}dw.\end{align*}

Theorem. Let \(X\) be \(N(\mu ,\sigma^{2})\). Then \(Z=(X-\mu )/\sigma\) is \(N(0,1)\).

Proof. The distribution function of \(Z\) is given by

\begin{align*} P(Z\leq z) & =P\left (\frac{X-\mu}{\sigma}\leq z\right )\\ & =P(X\leq z\sigma+\mu )\\ & =\int_{-\infty}^{z\sigma +\mu}\frac{1}{\sigma\sqrt{2\pi}}\exp\left [-\frac{(x-\mu )^{2}}{2\sigma^{2}}\right ]dx.\end{align*}

Using the change of variable of integration \(w=(x-\mu )/\sigma\) (i.e., \(x=w\sigma +\mu\)), we obtain

\[P(Z\leq z)=\int_{-\infty}^{z}\frac{1}{\sqrt{2\pi}}e^{-w^{2}/2}dw,\]

which is the expression for \(\Phi (z)\), i.e., the distribution function of a standard normal random variable. This completes the proof. \(\blacksquare\)

This theorem can be used to find probabilities for a random variable \(X\) that is \(N(\mu ,\sigma^{2})\), as follows.

\begin{align*} P(a\leq X\leq b) & =P\left (\frac{a-\mu}{\sigma}\leq\frac{X-\mu}{\sigma}\leq\frac{b-\mu}{\sigma}\right )\\ & =\Phi\left (\frac{b-\mu}{\sigma}\right )-\Phi\left (\frac{a-\mu}{\sigma}\right ),\end{align*}

since \((X-\mu )/\sigma\) is \(N(0,1)\).
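
Numerically, this standardization is exactly what one implements; a sketch using `scipy.stats.norm` (the particular \(\mu\), \(\sigma\), \(a\), and \(b\) below are just an illustration):

```python
# P(a <= X <= b) for X that is N(mu, sigma^2), computed directly
# and via standardization; the two must agree.
from scipy.stats import norm

mu, sigma = 10.0, 2.0   # illustrative values only (sigma^2 = 4)
a, b = 8.0, 13.0

direct = norm(loc=mu, scale=sigma).cdf(b) - norm(loc=mu, scale=sigma).cdf(a)
standardized = norm.cdf((b - mu) / sigma) - norm.cdf((a - mu) / sigma)

print(direct, standardized)   # both 0.77453...
```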

Theorem. Let the random variable \(X\) be \(N(\mu ,\sigma^{2})\). Then, the random variable \(V=[(X-\mu )/\sigma ]^{2}=Z^{2}\) is \(\chi^{2}(1)\).

Proof. Since \(V=Z^{2}\), where \(Z=(X-\mu )/\sigma\) is \(N(0,1)\), the distribution function \(F(v)\) of \(V\) is given by

\begin{align*} F(v) & =P(Z^{2}\leq v)\\ & =P(-\sqrt{v}\leq Z\leq\sqrt{v}).\end{align*}

For \(v\geq 0\), we have

\begin{align*} F(v) & =\int_{-\sqrt{v}}^{\sqrt{v}}\frac{1}{\sqrt{2\pi}}e^{-z^{2}/2}dz\\ & =2\int_{0}^{\sqrt{v}}\frac{1}{\sqrt{2\pi}}e^{-z^{2}/2}dz.\end{align*}

Changing the variable of integration by writing \(z=\sqrt{y}\), we obtain

\[F(v)=\int_{0}^{v}\frac{1}{\sqrt{2\pi y}}e^{-y/2}dy\]

for \(0\leq v\). We also have \(F(v)=0\) for \(v<0\). Therefore, using the fundamental theorem of calculus, the p.d.f. \(f(v)=F'(v)\) of the continuous random variable \(V\) is given by

\[f(v)=\frac{1}{\sqrt{\pi}\sqrt{2}}v^{1/2-1}e^{-v/2}\]

for \(0<v<\infty\). Since \(f(v)\) is a p.d.f., we must have

\[\int_{0}^{\infty}\frac{1}{\sqrt{\pi}\sqrt{2}}v^{1/2-1}e^{-v/2}dv=1.\]

The change of variables \(x=v/2\) yields

\begin{align*} 1 & =\frac{1}{\sqrt{\pi}}\int_{0}^{\infty}x^{1/2-1}e^{-x}dx\\ & =\frac{1}{\sqrt{\pi}}\Gamma (1/2).\end{align*}

Since \(\Gamma (1/2)=\sqrt{\pi}\), the p.d.f. of \(V\) can be rewritten as \[f(v)=\frac{1}{\Gamma (1/2)2^{1/2}}v^{1/2-1}e^{-v/2},\] which is the p.d.f. of a gamma distribution with \(\alpha =1/2\) and \(\theta =2\). Therefore, \(V\) is \(\chi^{2}(1)\). This completes the proof. \(\blacksquare\)
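
A Monte Carlo illustration of this theorem (a sketch: squares of standard normal draws are compared against the \(\chi^{2}(1)\) distribution function):

```python
# Monte Carlo check that Z^2 is chi^2(1) when Z is N(0, 1): compare
# the empirical distribution of Z^2 with the chi2(1) CDF.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)
v = rng.standard_normal(100_000) ** 2

for q in (0.5, 1.0, 2.0, 4.0):
    print(q, (v <= q).mean(), chi2(1).cdf(q))
# the empirical and theoretical values agree to about two decimals
```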

 

Hsien-Chung Wu