Characteristic Functions and Generating Functions

Philip de Laszlo (1869-1937) was a Hungarian painter.

We have the following sections.

Recall that \(e^{ix}=\cos x+i\sin x\) for \(x\in \mathbb{R}\).

\begin{equation}{\label{a}}\tag{A}\mbox{}\end{equation}

For Random Variables.

Let \(X\) be a random variable with p.d.f. \(f\). Then the characteristic function of \(X\), denoted by \(\phi_{X}\), is a function defined on \(\mathbb{R}\), taking complex values, and defined by

\begin{align*} \phi_{X}(t) & =\mathbb{E}[e^{itX}]\\ & =\left\{\begin{array}{ll}
{\displaystyle \sum_{x}\cos (tx)\cdot f(x)+i\sum_{x}\sin (tx)\cdot f(x)}, & \mbox{if }X\mbox{ is discrete},\\
{\displaystyle \int_{\mathbb{R}}\cos (tx)\cdot f(x)dx+i\int_{\mathbb{R}}\sin (tx)\cdot f(x)dx}, & \mbox{if }X\mbox{ is continuous}.\end{array}\right .\end{align*}

We see that \(\phi_{X}(t)\) exists for all \(t\in \mathbb{R}\). The characteristic function \(\phi_{X}(t)\) is also called the Fourier transform of \(f\).
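As a quick numerical sketch of the definition (not part of the original text; the fair six-sided die below is a hypothetical example), the characteristic function of a discrete random variable can be evaluated directly from its p.d.f.:

```python
import cmath

def char_fn(pmf, t):
    """Evaluate phi_X(t) = E[e^{itX}] = sum_x e^{itx} f(x) for a discrete p.d.f."""
    return sum(cmath.exp(1j * t * x) * q for x, q in pmf.items())

# Hypothetical example: a fair six-sided die.
die = {x: 1 / 6 for x in range(1, 7)}

phi0 = char_fn(die, 0.0)   # phi_X(0) = E[1] = 1
phi1 = char_fn(die, 1.3)   # |phi_X(t)| never exceeds 1
```

Since \(|e^{itx}|=1\), the sum always converges, which is why \(\phi_{X}(t)\) exists for every \(t\).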

Theorem. Some properties of the characteristic functions are given below.

(i) We have \(\phi_{X}(0)=1\) and \(|\phi_{X}(t)|\leq 1\) for all \(t\in \mathbb{R}\).

(ii) \(\phi_{X}\) is uniformly continuous.

(iii) We have \(\phi_{cX+d}(t)=e^{itd}\cdot\phi_{X}(ct)\), where \(c\) and \(d\) are constants.

(iv) We have

\[\left .\frac{d^{n}}{dt^{n}}\phi_{X}(t)\right |_{t=0}=i^{n}\mathbb{E}[X^{n}]\]

when \(\mathbb{E}[|X^{n}|]<\infty\). \(\sharp\)
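Property (iv) can be spot-checked numerically with central finite differences; this is a sketch, not from the text, and the fair-die distribution is a hypothetical example (here \(\mathbb{E}[X]=3.5\) and \(\mathbb{E}[X^{2}]=91/6\)):

```python
import cmath

# Hypothetical example: a fair six-sided die.
pmf = {x: 1 / 6 for x in range(1, 7)}
phi = lambda t: sum(cmath.exp(1j * t * x) * q for x, q in pmf.items())

# Central finite differences of phi at t = 0 approximate i^n * E[X^n].
h = 1e-5
d1 = (phi(h) - phi(-h)) / (2 * h)            # approximately i * E[X] = 3.5i
d2 = (phi(h) - 2 * phi(0) + phi(-h)) / h**2  # approximately i^2 * E[X^2] = -91/6
```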

Theorem. Let \(X\) be a random variable with p.d.f. \(f\) and characteristic function \(\phi\).

(i) Let \(X\) be of the discrete type. Then, we have

\[\mathbb{P}(X=x)=\lim_{T\rightarrow\infty}\frac{1}{2T}\int_{-T}^{T}e^{-itx}\phi (t)dt.\]

(ii) Let \(X\) be of the continuous type. Then, we have

\[f(x)=\lim_{h\rightarrow 0}\lim_{T\rightarrow\infty}\frac{1}{2\pi}\int_{-T}^{T}\frac{1-e^{-ith}}{ith}\cdot e^{-itx}\cdot\phi (t)dt.\]

In particular, when \(\int_{\mathbb{R}}|\phi (t)|dt<\infty\), we have

\[f(x)=\frac{1}{2\pi}\int_{-\infty}^{\infty}e^{-itx}\phi (t)dt.\]
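The simpler inversion formula for integrable \(\phi\) can be checked numerically. A sketch (not from the text): the standard normal has \(\phi (t)=e^{-t^{2}/2}\), which is integrable, so we approximate the integral on a grid and compare with the known density values.

```python
import numpy as np

# Grid approximation of f(x) = (1/2pi) * integral e^{-itx} phi(t) dt
# for the standard normal, whose cf is phi(t) = exp(-t**2/2).
t = np.linspace(-40.0, 40.0, 200001)
dt = t[1] - t[0]
phi = np.exp(-t**2 / 2)

def density(x):
    integrand = np.exp(-1j * t * x) * phi
    return float(np.real(np.sum(integrand))) * dt / (2 * np.pi)

f0 = density(0.0)   # should be close to 1/sqrt(2*pi), about 0.398942
f1 = density(1.0)   # should be close to exp(-1/2)/sqrt(2*pi), about 0.241971
```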

Theorem. Whenever \(a<b\in \mathbb{R}\) are continuity points of the distribution function \(F_{X}\) of \(X\), we have

\[\mathbb{P}(a<X<b)=\lim_{T\rightarrow\infty}\frac{1}{2\pi}\int_{-T}^{T}\frac{e^{-ita}-e^{-itb}}{it}\cdot\phi_{X}(t)dt.\]

Theorem. (Uniqueness Theorem). There is a one-to-one correspondence between the characteristic function and the p.d.f. of a random variable.

The moment-generating function \(M_{X}\) of a random variable, which is also called the Laplace transform of \(f\), is defined by

\[M_{X}(t)=\mathbb{E}[e^{tX}]\]

for \(t\in \mathbb{R}\) when this expectation exists. For \(t=0\), we have \(M_{X}(0)=1\). However, for \(t\neq 0\), it may fail to exist. When \(M_{X}(t)\) exists, we have

\[\phi_{X}(t)=M_{X}(it).\]

The generating function \(\eta_{X}\) of a random variable is defined by

\[\eta_{X}(t)=\mathbb{E}[t^{X}],\]

which is sometimes known as the Mellin transform of \(f\). Clearly, we have

\[\eta_{X}(t)=M_{X}(\ln t)\]

for \(t>0\).
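The identity \(\eta_{X}(t)=M_{X}(\ln t)\) can be spot-checked numerically. A sketch, not from the text: it uses the geometric distribution's closed forms (derived later in these notes) with a hypothetical parameter \(p=0.3\).

```python
import math

p = 0.3
# Closed forms for the geometric distribution (derived later in the notes).
M = lambda t: p * math.exp(t) / (1 - (1 - p) * math.exp(t))   # valid for t < -ln(1-p)
eta = lambda t: p * t / (1 - (1 - p) * t)                     # valid for |t| < 1/(1-p)

t0 = 1.2                          # lies inside both domains when p = 0.3
gap = eta(t0) - M(math.log(t0))   # should vanish up to rounding
```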

\begin{equation}{\label{b}}\tag{B}\mbox{}\end{equation}

For Random Vectors.

Let \({\bf X}=(X_{1},\cdots ,X_{n})\) be a random vector. Then the characteristic function of the random vector \({\bf X}\), or the joint characteristic function of the random variables \(X_{1},\cdots ,X_{n}\), denoted by \(\phi_{\bf X}\) or \(\phi_{X_{1},\cdots ,X_{n}}\), is defined by

\[\phi_{X_{1},\cdots ,X_{n}}(t_{1},\cdots ,t_{n})=\mathbb{E}\left [e^{it_{1}X_{1}+it_{2}X_{2}+\cdots +it_{n}X_{n}}\right ].\]

The characteristic function \(\phi_{X_{1},\cdots ,X_{n}}\) always exists.

Theorem. Some properties of the characteristic functions are given below.

(i) We have \(\phi_{X_{1},\cdots ,X_{n}}(0,\cdots ,0)=1\) and \(|\phi_{X_{1},\cdots ,X_{n}}(t_{1},\cdots ,t_{n})|\leq 1\).

(ii) \(\phi_{X_{1},\cdots ,X_{n}}\) is uniformly continuous.

(iii) We have

\[\phi_{c_{1}X_{1}+d_{1},\cdots ,c_{n}X_{n}+d_{n}}(t_{1},\cdots ,t_{n})=e^{it_{1}d_{1}+\cdots +it_{n}d_{n}}\cdot\phi_{X_{1},\cdots ,X_{n}}(c_{1}t_{1},\cdots ,c_{n}t_{n}),\]

where \(c_{i}\) and \(d_{i}\) are constants for \(i=1,\cdots ,n\).

(iv) Suppose that the absolute \((k_{1},\cdots ,k_{n})\)-joint moment, as well as all lower order joint moments of \(X_{1},\cdots ,X_{n}\) are finite. Then, we have

\[\left .\frac{\partial^{k_{1}+\cdots +k_{n}}}{\partial t_{1}^{k_{1}}\cdots\partial t_{n}^{k_{n}}}\phi_{X_{1},\cdots ,X_{n}}(t_{1},\cdots ,t_{n})\right |_{t_{1}=\cdots =t_{n}=0}=i^{\sum_{j=1}^{n}k_{j}}\mathbb{E}[X_{1}^{k_{1}}\cdots X_{n}^{k_{n}}].\]

In particular, we also have

\[\left .\frac{\partial^{k}}{\partial t_{j}^{k}}\phi_{X_{1},\cdots ,X_{n}}(t_{1},\cdots ,t_{n})\right |_{t_{1}=\cdots =t_{n}=0}=i^{k}\mathbb{E}[X_{j}^{k}]\]

for \(j=1,\cdots ,n\).

(v) In \(\phi_{X_{1},\cdots ,X_{n}}(t_{1},\cdots ,t_{n})\), we set \(t_{j_{1}}=\cdots =t_{j_{k}}=0\). Then, the resulting expression is the joint characteristic function of the random variables \(X_{i_{1}},\cdots ,X_{i_{m}}\), where the indices \(i_{1},\cdots ,i_{m}\) and \(j_{1},\cdots ,j_{k}\) are distinct and \(k+m=n\). \(\sharp\)

Theorem. (Uniqueness Theorem). There is a one-to-one correspondence between the characteristic function and the p.d.f. of a random vector.

The moment-generating function of the random vector \({\bf X}=(X_{1},\cdots , X_{n})\) or the joint moment-generating function of the random variables \(X_{1},\cdots ,X_{n}\), denoted by \(M_{\bf X}\) or \(M_{X_{1},\cdots ,X_{n}}\), is defined by

\[M_{X_{1},\cdots ,X_{n}}(t_{1},\cdots ,t_{n})=\mathbb{E}\left [e^{t_{1}X_{1}+\cdots +t_{n}X_{n}}\right ]\]

when the expectation exists. Suppose that \(M_{X_{1},\cdots ,X_{n}}(t_{1},\cdots ,t_{n})\) exists. Then, we have

\[\phi_{X_{1},\cdots ,X_{n}}(t_{1},\cdots ,t_{n})=M_{X_{1},\cdots ,X_{n}}(it_{1},\cdots ,it_{n}).\]

Finally, the characteristic function of a measurable function \(g({\bf X})\) of the random vector \({\bf X}\) is defined by

\begin{align*} \phi_{g({\bf X})}(t) & =\mathbb{E}[e^{itg({\bf X})}]\\ & =\left\{\begin{array}{ll}
{\displaystyle \sum_{\bf x}e^{itg({\bf x})}\cdot f({\bf x})}, & \mbox{if }{\bf X}\mbox{ is discrete},\\
{\displaystyle \int_{\mathbb{R}}\cdots\int_{\mathbb{R}}e^{itg(x_{1},\cdots ,x_{n})}
\cdot f(x_{1},\cdots ,x_{n})dx_{1}\cdots dx_{n}}, & \mbox{if }{\bf X}\mbox{ is continuous},
\end{array}\right .\end{align*}

and its moment-generating function (when it exists) is given by

\begin{align*} M_{g({\bf X})}(t) & =\mathbb{E}[e^{tg({\bf X})}]\\ & =\left\{\begin{array}{ll}
{\displaystyle \sum_{\bf x}e^{tg({\bf x})}\cdot f({\bf x})}, & \mbox{if }{\bf X}\mbox{ is discrete},\\
{\displaystyle \int_{\mathbb{R}}\cdots\int_{\mathbb{R}}e^{tg(x_{1},\cdots ,x_{n})}
\cdot f(x_{1},\cdots ,x_{n})dx_{1}\cdots dx_{n}}, & \mbox{if }{\bf X}\mbox{ is continuous}.
\end{array}\right .\end{align*}

\begin{equation}{\label{c}}\tag{C}\mbox{}\end{equation}

For Discrete Random Variables.

Let \(X\) be a discrete random variable with p.d.f. \(f(x)\). If

\[\eta (t)=E(t^{X})=\sum_{x}t^{x}f(x)\]

exists and is finite for \(t\) in some interval including \(t=0\) and \(t=1\), then \(\eta (t)\) is called the generating function of \(X\).

Example. Consider a random variable \(X\) that has the geometric p.d.f. \(f(x)=p(1-p)^{x-1}\) for \(x=1,2,3,\cdots ,\) where \(0<p<1\). Then, we have

\begin{align*} \eta (t) & =\sum_{x=1}^{\infty} t^{x}p(1-p)^{x-1}\\ & =pt\sum_{x=1}^{\infty}[(1-p)t]^{x-1}\\ & =\frac{pt}{1-(1-p)t}\end{align*}

provided that \(-1<(1-p)t<1\). In other words, for \(-1/(1-p)<t<1/(1-p)\), \(\eta (t)\) exists, where the interval of \(t\) includes \(t=0\) and \(t=1\) since \(0<p<1\). \(\sharp\)
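The closed form above can be verified numerically by summing the defining series directly. A sketch, not from the text, with hypothetical values \(p=0.3\) and \(t=1.2\) (which lies inside the interval of convergence since \((1-p)t=0.84<1\)):

```python
p, t = 0.3, 1.2   # hypothetical values; note (1 - p)*t = 0.84 < 1

# Partial sum of the series defining eta(t), versus the closed form.
partial = sum(t**x * p * (1 - p)**(x - 1) for x in range(1, 500))
closed = p * t / (1 - (1 - p) * t)
```

The neglected tail is of order \((0.84)^{499}\), which is far below rounding error.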

\(\eta (t)\) is also called the probability-generating function. To see why, let us suppose that \(X\) takes only finitely many values, as is the case for the binomial distribution. We see that

\begin{align*} \eta (t) & =\sum_{x=0}^{n} t^{x}f(x)\\ & =f(0)+f(1)t+f(2)t^{2}+\cdots +f(n)t^{n}\end{align*}

is simply a polynomial of \(n\)th degree. In particular, we have

\[\eta (0)=f(0)=\mathbb{P}(X=0).\]

Now, we take the first derivative

\[\eta^{\prime}(t)=f(1)+2f(2)t+3f(3)t^{2}+\cdots +nf(n)t^{n-1}\]

and set \(t=0\). Then, we obtain

\[\eta^{\prime}(0)=f(1)=\mathbb{P}(X=1).\]

Taking the second derivative

\[\eta^{\prime\prime}(t)=2f(2)+3\cdot 2f(3)t+\cdots +n\cdot (n-1)f(n)t^{n-2}\]

and setting \(t=0\), we also obtain

\[\mathbb{P}(X=2)=\frac{\eta^{\prime\prime}(0)}{2!}.\]

Continuing this process, we have

\[\mathbb{P}(X=3)=\frac{\eta^{\prime\prime\prime}(0)}{3!}.\]

By induction, we finally obtain

\[\mathbb{P}(X=r)=\frac{\eta^{(r)}(0)}{r!}\]

for \(r=0,1,2,\cdots ,n\). Therefore, we can generate the probabilities \(\mathbb{P}(X=x)\), for each \(x\) in the support of \(X\), from \(\eta (t)\) and its derivatives.
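The recipe \(\mathbb{P}(X=r)=\eta^{(r)}(0)/r!\) can be carried out symbolically. A sketch, not from the text, using a hypothetical Binomial\((4,1/3)\) example:

```python
import sympy as sp

t = sp.symbols('t')
n, p = 4, sp.Rational(1, 3)   # hypothetical Binomial(4, 1/3) example

# Build the pgf eta(t) = sum_x t^x f(x) directly from the binomial p.d.f.
eta = sum(sp.binomial(n, x) * p**x * (1 - p)**(n - x) * t**x for x in range(n + 1))

# Recover P(X = r) = eta^{(r)}(0) / r! for every r.
probs = [sp.diff(eta, t, r).subs(t, 0) / sp.factorial(r) for r in range(n + 1)]
```

For instance, `probs[2]` reproduces \(C^{4}_{2}(1/3)^{2}(2/3)^{2}=8/27\), and the recovered probabilities sum to 1.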

Now, we consider

\[\eta (t)=\sum_{x} t^{x}f(x).\]

If we interchange the order of differentiation and summation, which we can do provided that the resulting summations exist, we obtain, for each positive integer \(r\),

\[\eta^{(r)}(t)=\sum_{x}x(x-1)\cdots (x-r+1)t^{x-r}f(x).\]

By taking \(t=1\), we obtain

\begin{align*} \eta^{(r)}(1) & =\sum_{x}x(x-1)\cdots (x-r+1)f(x)\\ & =\mathbb{E}[X(X-1)\cdots (X-r+1)].\end{align*}

In particular, we also have

\[\eta^{\prime}(1)=\mathbb{E}(X)=\mu\]

and

\begin{align*} & \eta^{\prime\prime}(1)+\eta^{\prime}(1)-[\eta^{\prime}(1)]^{2}\\ & \quad =\mathbb{E}[X(X-1)]+\mathbb{E}(X)-[\mathbb{E}(X)]^{2}\\ & \quad =\mathbb{E}(X^{2})-[\mathbb{E}(X)]^{2}\\ & \quad =\sigma^{2}.\end{align*}
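These factorial-moment identities can be confirmed symbolically. A sketch, not from the text: the Poisson pgf \(\eta (t)=e^{\lambda (t-1)}\) (a standard result) should give \(\mu =\lambda\) and \(\sigma^{2}=\lambda\).

```python
import sympy as sp

t, lam = sp.symbols('t lam', positive=True)
eta = sp.exp(lam * (t - 1))   # pgf of a Poisson(lam) variable (standard result)

mu = sp.diff(eta, t).subs(t, 1)                    # eta'(1) = E[X]
var = sp.diff(eta, t, 2).subs(t, 1) + mu - mu**2   # eta''(1) + eta'(1) - [eta'(1)]^2
```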

Example. For the geometric distribution, the generating function is given by

\[\eta (t)=\frac{pt}{[1-(1-p)t]}\]

for \(-\frac{1}{1-p}<t<\frac{1}{1-p}\). Now, we have

\[\eta^{\prime}(t)=\frac{p}{[1-(1-p)t]^{2}}\]

and

\[\eta^{\prime\prime}(t)=\frac{2p(1-p)}{[1-(1-p)t]^{3}}.\]

The mean of the geometric distribution is given by

\begin{align*} \mu & =E(X)\\ & =\eta^{\prime}(1)\\ & =\frac{p}{[1-(1-p)]^{2}}\\ & =\frac{1}{p}.\end{align*}

The variance of the geometric distribution is given by

\begin{align*} \sigma^{2} & =\eta^{\prime\prime}(1)+\eta^{\prime}(1)-[\eta^{\prime}(1)]^{2}\\ & =\frac{2p(1-p)}{[1-(1-p)]^{3}}+\frac{1}{p}-\frac{1}{p^{2}}\\ & =\frac{1-p}{p^{2}}.\end{align*}
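The closed forms \(\mu =1/p\) and \(\sigma^{2}=(1-p)/p^{2}\) can also be checked by summing the moment series directly. A sketch, not from the text, with a hypothetical \(p=0.25\):

```python
p = 0.25   # hypothetical parameter

# Partial sums of E[X] and E[X^2] under the geometric p.d.f. f(x) = p(1-p)^(x-1).
ex = sum(x * p * (1 - p)**(x - 1) for x in range(1, 2000))
ex2 = sum(x**2 * p * (1 - p)**(x - 1) for x in range(1, 2000))

mean = ex          # compare with 1/p = 4
var = ex2 - ex**2  # compare with (1-p)/p**2 = 12
```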

Example. Consider the random variable \(X\) that has the geometric p.d.f. \(f(x)=p(1-p)^{x-1}\) for \(x=1,2,3,\cdots\). We have

\begin{align*} M(t) & =\sum_{x=1}^{\infty} e^{tx}p(1-p)^{x-1}\\ & =pe^{t}\sum_{x=1}^{\infty}[(1-p)e^{t}]^{x-1}.\end{align*}

The summation is the sum of a geometric series which exists provided that \((1-p)e^{t}<1\) or, equivalently, \(t<-\ln (1-p)\), i.e.,

\[M(t)=\frac{pe^{t}}{1-(1-p)e^{t}}\]

for \(t<-\ln (1-p)\). \(\sharp\)

From the theory of mathematical analysis, it can be shown that the existence of \(M(t)\) for \(-h<t<h\) implies that the derivatives of \(M(t)\) of all orders exist at \(t=0\). Moreover, it is permissible to interchange the differentiation and summation. Therefore, we have

\[M^{(r)}(t)=\sum_{x}x^{r}e^{tx}f(x).\]

By setting \(t=0\), we also have

\[M^{(r)}(0)=\sum_{x} x^{r}f(x)=E(X^{r}).\]

In particular, when the moment-generating function exists, we have \(\mu=M^{\prime}(0)\) and

\[\sigma^{2} =M^{\prime\prime}(0)-[M^{\prime}(0)]^{2}.\]

Example. Let \(X\) have a binomial distribution \(B(n,p)\) with p.d.f. \(f(x)=C^{n}_{x}p^{x}(1-p)^{n-x}\) for \(x=0,1,2,\cdots ,n\). The moment-generating function of \(X\) is given by

\begin{align*} M(t) & =E(e^{tX})\\ & =\sum_{x=0}^{n}e^{tx}C^{n}_{x}p^{x}(1-p)^{n-x}\\ & = \sum_{x=0}^{n}C^{n}_{x}(pe^{t})^{x}(1-p)^{n-x}.\end{align*}

Using the formula for the binomial expansion with \(a=1-p\) and \(b=pe^{t}\), we have

\[M(t)=[(1-p)+pe^{t}]^{n}\]

for \(-\infty <t<\infty\). The first two derivatives of \(M(t)\) are given by

\[M'(t)=n[(1-p)+pe^{t}]^{n-1}(pe^{t})\]

and

\[M''(t)=n(n-1)[(1-p)+pe^{t}]^{n-2}(pe^{t})^{2}+n[(1-p)+pe^{t}]^{n-1}(pe^{t}).\]

Therefore, we obtain

\[\mu =E(X)=M'(0)=np\]

and

\begin{align*} \sigma^{2} & =M''(0)-[M'(0)]^{2}\\ & =n(n-1)p^{2}+np-(np)^{2}\\ & =np(1-p).\end{align*}

In the special case for \(n=1\), it follows that \(X\) has a Bernoulli distribution and \(M(t)=(1-p)+pe^{t}\) for all real values of \(t\), \(\mu =p\), and \(\sigma^{2}=p(1-p)\). \(\sharp\)
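The differentiation in this example can be reproduced symbolically, keeping \(n\) and \(p\) general. A sketch, not from the text:

```python
import sympy as sp

t, p, n = sp.symbols('t p n', positive=True)
M = (1 - p + p * sp.exp(t))**n   # binomial moment-generating function from the text

mu = sp.diff(M, t).subs(t, 0)                          # should equal n*p
var = sp.expand(sp.diff(M, t, 2).subs(t, 0) - mu**2)   # should equal n*p*(1-p)
```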

Example. Let \(X\) have a Poisson distribution with p.d.f.

\[f(x)=\frac{\lambda^{x}e^{-\lambda}}{x!}\]

for \(x=0,1,2,\cdots\). The moment-generating function of \(X\) is given by

\begin{align*} M(t) & =E(e^{tX})\\ & =\sum_{x=0}^{\infty} e^{tx}\frac{\lambda^{x}e^{-\lambda}}{x!}\\ & =e^{-\lambda}\sum_{x=0}^{\infty}\frac{(\lambda e^{t})^{x}}{x!}.\end{align*}

From the series representation of the exponential function, we have

\begin{align*} M(t) & =e^{-\lambda}e^{\lambda e^{t}}\\ & =e^{\lambda (e^{t}-1)}\end{align*}

for all values of \(t\). Now, we have

\[M'(t)=\lambda e^{t}e^{\lambda (e^{t}-1)}\]

and

\[M''(t)=(\lambda e^{t})^{2}e^{\lambda (e^{t}-1)}+\lambda e^{t}e^{\lambda (e^{t}-1)}.\]

The values of the mean and variance of \(X\) are given by \(\mu =M'(0)=\lambda\) and

\begin{align*} \sigma^{2} & =M''(0)-[M'(0)]^{2}\\ & =(\lambda^{2}+\lambda )-\lambda^{2}\\ & =\lambda .\end{align*}

Example. Let \(X\) have a negative binomial distribution with p.d.f.

\[f(x)=C^{x-1}_{r-1}p^{r}(1-p)^{x-r}\]

for \(x=r,r+1,\cdots\). The probability-generating function of \(X\) is given by

\begin{align*} \eta (t) & =\sum_{x=r}^{\infty} t^{x}C^{x-1}_{r-1}p^{r}(1-p)^{x-r}\\ & = (pt)^{r}\sum_{x=r}^{\infty}C^{x-1}_{r-1}[(1-p)t]^{x-r}\\ & =\frac{(pt)^{r}}{[1-(1-p)t]^{r}}\end{align*}

for \(|t|<1/(1-p)\). The moment-generating function of \(X\) is given by

\[M(t)=\eta (e^{t})=\frac{(pe^{t})^{r}}{[1-(1-p)e^{t}]^{r}}\]

for \(t<-\ln (1-p)\). Either of these generating functions can be used to show

\[\mu =\frac{r}{p}\mbox{ and }\sigma^{2}=\frac{r(1-p)}{p^{2}}.\]
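These negative binomial moments can be checked against partial sums of the defining series. A sketch, not from the text, with hypothetical values \(p=0.4\) and \(r=3\):

```python
from math import comb

p, r = 0.4, 3   # hypothetical parameters

# Partial sums of the mean and variance under f(x) = C(x-1, r-1) p^r (1-p)^(x-r).
terms = [(x, comb(x - 1, r - 1) * p**r * (1 - p)**(x - r)) for x in range(r, 4000)]
mean = sum(x * q for x, q in terms)              # compare with r/p = 7.5
var = sum(x**2 * q for x, q in terms) - mean**2  # compare with r(1-p)/p**2 = 11.25
```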

If \(X\) has a p.d.f. \(f(x)\) with support \(\{b_{1},b_{2},\cdots\}\), then we have

\begin{align*} M(t) & =\sum_{x}e^{tx}f(x)\\ & =f(b_{1})e^{tb_{1}}+f(b_{2})e^{tb_{2}}+\cdots .\end{align*}

Therefore, the coefficient of \(e^{tb_{i}}\) is

\[f(b_{i})=\mathbb{P}(X=b_{i}).\]

That is, if we write a moment-generating function of a discrete random variable \(X\) in the form above, the probability of any value of \(X\), say \(b_{i}\), is the coefficient of \(e^{tb_{i}}\).

Example. Let the moment-generating function of \(X\) be defined by

\[M(t)=\frac{1}{15}e^{t}+\frac{2}{15}e^{2t}+\frac{3}{15}e^{3t}+\frac{4}{15}e^{4t}+\frac{5}{15}e^{5t}.\]

Then, the coefficient of \(e^{2t}\) is \(2/15\). Therefore, we have

\[f(2)=\mathbb{P}(X=2)=\frac{2}{15}.\]

In general, we see that the p.d.f. of \(X\) is given by \(f(x)=\frac{x}{15}\) for \(x=1,2,3,4,5\). \(\sharp\)
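Reading probabilities off the coefficients of \(e^{tb_{i}}\) can be mechanized with a computer algebra system. A sketch, not from the text, using the moment-generating function of this example:

```python
import sympy as sp

t = sp.symbols('t')
# The moment-generating function from the example above.
M = sum(sp.Rational(x, 15) * sp.exp(x * t) for x in range(1, 6))

# P(X = 2) is the coefficient of exp(2*t) in M(t).
prob2 = M.coeff(sp.exp(2 * t))
```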

\begin{equation}{\label{d}}\tag{D}\mbox{}\end{equation}

For Continuous Random Variables.

The results \(\mu =M'(0)\) and \(\sigma^{2}=M''(0)-[M'(0)]^{2}\) are still valid.

Example. The moment-generating function of continuous uniform distribution is given by
\[M(t)=\left\{\begin{array}{ll}
\frac{e^{tb}-e^{ta}}{t(b-a)}, & t\neq 0,\\
1, & t=0.
\end{array}\right .\]

Example. Let \(X\) have the p.d.f.

\[f(x)=\left\{\begin{array}{ll}
xe^{-x}, & 0\leq x<\infty,\\
0, & \mbox{elsewhere}.
\end{array}\right .\]

Then, we have

\begin{align*} M(t) & =\int_{0}^{\infty} e^{tx}xe^{-x}dx=\lim_{b\rightarrow\infty}\int_{0}^{b} xe^{-(1-t)x}dx\\
& =\lim_{b\rightarrow\infty}\left [-\frac{xe^{-(1-t)x}}{1-t}-\frac{e^{-(1-t)x}}{(1-t)^{2}}\right ]_{0}^{b}\\ & =\lim_{b\rightarrow\infty} \left [-\frac{be^{-(1-t)b}}{1-t}-
\frac{e^{-(1-t)b}}{(1-t)^{2}}\right ]+\frac{1}{(1-t)^{2}}\\ & =\frac{1}{(1-t)^{2}}\end{align*}

for \(t<1\). Therefore, we obtain

\[M'(t)=\frac{2}{(1-t)^{3}}\mbox{ and }M''(t)=\frac{6}{(1-t)^{4}}\]

which give \(\mu =M'(0)=2\) and

\begin{align*} \sigma^{2} & =M''(0)-[M'(0)]^{2}\\ & =6-2^{2}=2.\end{align*}

Example. Let \(X\) have an exponential distribution with p.d.f.

\[f(x)=\frac{1}{\theta}e^{-x/\theta}\]

for \(0\leq x<\infty\), where the parameter \(\theta >0\). The moment-generating function is given by

\begin{align*} M(t) & =\int_{0}^{\infty} e^{tx}\cdot\frac{1}{\theta}e^{-x/\theta}dx=\lim_{b\rightarrow\infty}\int_{0}^{b}\frac{1}{\theta}e^{-(1-\theta t)x/\theta}dx\\
& =\lim_{b\rightarrow\infty}\left [-\frac{e^{-(1-\theta t)x/\theta}}{1-\theta t}\right ]_{0}^{b}\\ & =\frac{1}{1-\theta t}\end{align*}

for \(t<1/\theta\). Therefore, we have

\[M'(t)=\frac{\theta}{(1-\theta t)^{2}}\]

and

\[M''(t)=\frac{2\theta^{2}}{(1-\theta t)^{3}}.\]

Therefore, the mean and variance are given by

\[\mu =M'(0)=\theta\mbox{ and }\sigma^{2}=M''(0)-[M'(0)]^{2}=\theta^{2}.\]

Example. Let \(X\) have a gamma distribution with p.d.f.

\[f(x)=\frac{1}{\Gamma (\alpha )\theta^{\alpha}}x^{\alpha -1}e^{-x/\theta}\]

for \(0\leq x<\infty\). It can be shown that the moment-generating function of \(X\) is given by

\[M(t)=\frac{1}{(1-\theta t)^{\alpha}}\]

for \(t<1/\theta\). The mean and variance are given by

\[\mu =\alpha\theta\mbox{ and }\sigma^{2}=\alpha\theta^{2}.\]

For \(\theta =2\) and \(\alpha =r/2\), where \(r\) is a positive integer, we obtain the chi-square distribution, whose moment-generating function is given by

\[M(t)=(1-2t)^{-r/2}\]

for \(t<1/2\).

Example. Let \(X\) have a normal distribution with p.d.f.

\[f(x)=\frac{1}{\sigma\sqrt{2\pi}}\exp\left [-\frac{(x-\mu )^{2}}{2\sigma^{2}}\right ]\]

for \(-\infty <x<\infty\), where \(\mu\) and \(\sigma\) are parameters satisfying \(-\infty <\mu <\infty\) and \(0<\sigma <\infty\). It can be shown that the moment-generating function is given by

\[M(t)=\exp\left (\mu t+\frac{\sigma^{2}t^{2}}{2}\right ).\]

Then, we have

\[M'(t)=(\mu +\sigma^{2}t)\exp\left (\mu t+\frac{\sigma^{2}t^{2}}{2}\right )\]

and

\[M”(t)=[(\mu +\sigma^{2}t)^{2}+\sigma^{2}]\exp\left (\mu t+\frac{\sigma^{2}t^{2}}{2}\right ).\]

Therefore, we obtain

\[E(X)=M'(0)=\mu\mbox{ and }Var(X)=M''(0)-[M'(0)]^{2}=\sigma^{2}.\]

We can similarly obtain the characteristic functions, which are summarized in the following table.

\[\begin{array}{||c|c|c||}
\hline \mbox{Distribution} & \mbox{Parameters} & \mbox{Characteristic Functions}\\
\hline\hline \mbox{Bernoulli} & p & \phi (t)=1-p+pe^{it}\\
\hline \mbox{Binomial} & n,p & \phi (t)=(1-p+pe^{it})^{n}\\
\hline \mbox{Geometric} & p & {\displaystyle \phi (t)=\frac{pe^{it}}{1-(1-p)e^{it}}}\\
\hline \mbox{Poisson} & \lambda & \phi (t)=e^{\lambda (e^{it}-1)}\\
\hline \mbox{Standard normal} & & \phi (t)=e^{-t^{2}/2}\\
\hline \mbox{Normal} & \mu ,\sigma^{2} & \phi (t)=e^{i\mu t-\sigma^{2}t^{2}/2}\\
\hline \mbox{Exponential} & \lambda & {\displaystyle \phi (t)=\frac{\lambda}{\lambda -it}}\\
\hline \mbox{Gamma} & \alpha ,\lambda & {\displaystyle \phi (t)=\left (\frac{\lambda}{\lambda -it}\right )^{\alpha}}\\
\hline\end{array}\]
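Any row of the table can be spot-checked numerically against \(\phi (t)=\mathbb{E}[e^{itX}]\). A sketch, not from the text, for the exponential row with hypothetical values \(\lambda =1.5\) and \(t=0.8\):

```python
import numpy as np

lam, t0 = 1.5, 0.8   # hypothetical rate and argument

# phi(t) = E[e^{itX}] for the exponential density lam*e^{-lam*x}, x >= 0,
# approximated with the composite trapezoid rule on a truncated domain.
x = np.linspace(0.0, 60.0, 600001)
integrand = np.exp(1j * t0 * x) * lam * np.exp(-lam * x)
phi_num = complex(np.sum((integrand[:-1] + integrand[1:]) / 2) * (x[1] - x[0]))
phi_closed = lam / (lam - 1j * t0)
```

The truncation at \(x=60\) discards a tail of size roughly \(e^{-90}\), which is negligible.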

Hsien-Chung Wu