Engineering Math

Expectation

Recall that a random variable is a function \(X:\Omega\rightarrow\mathbb{R}\) that maps from the sample space to the reals. Random variables are the arguments of probability mass functions (PMFs) and probability density functions (PDFs).

The expected value (or expectation) of a random variable is akin to its “average value” and depends on its PMF or PDF. The expected value of a random variable \(X\) is denoted \(\left\langle X \right\rangle\) or \(\E{X}\). There are two definitions of the expectation: one for a discrete random variable and one for a continuous random variable. Before we define them, however, it is useful to first define the most fundamental property of a random variable: its mean.

Definition

The mean of a random variable \(X\) is defined as \[\begin{aligned} m_X = \E{X}. \end{aligned}\]

Let’s begin with a discrete random variable.

Definition

Let \(K\) be a discrete random variable and \(f\) its PMF. The expected value of \(K\) is defined as \[\begin{aligned} \E{K} = \sum_{k} k f(k), \end{aligned}\] where the sum runs over all values \(k\) that \(K\) can take.

Example 3.8

Given a discrete random variable \(K\) with the PMF shown below, what is its mean \(m_K\)?

Figure 3.9: PMF of discrete random variable \(K\).

Compute from the definitions: $$\begin{aligned} m_K &= \E{K} \\ &= \sum_{k = 1}^3 k f(k) \\ &= 1\cdot\frac{1}{6} + 2\cdot\frac{3}{6} + 3\cdot\frac{2}{6} \\ &= \frac{13}{6}. \end{aligned}$$
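This sum is easy to verify numerically. The following is a minimal Python sketch of the same computation, with the PMF values taken from Figure 3.9:

```python
# Expected value of a discrete random variable computed from its PMF.
# PMF values read off Figure 3.9: f(1) = 1/6, f(2) = 3/6, f(3) = 2/6.
pmf = {1: 1/6, 2: 3/6, 3: 2/6}

# E[K] = sum over all k of k * f(k)
m_K = sum(k * p for k, p in pmf.items())

print(m_K)  # 2.1666... = 13/6
```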

Let us now turn to the expectation of a continuous random variable.

Definition

Let \(X\) be a continuous random variable and \(f\) its PDF. The expected value of \(X\) is defined as \[\begin{aligned} \E{X} = \int_{-\infty}^\infty x f(x) dx. \end{aligned}\]

Example 3.9

Given a continuous random variable \(X\) with Gaussian PDF \(f\), what is the expected value of \(X\)?

Figure 3.10: Gaussian PDF for random variable \(X\).

Compute from the definition: $$\begin{aligned} \E{X} &= \int_{-\infty}^\infty x f(x) dx \\ &= \int_{-\infty}^\infty x \frac{1}{\sigma\sqrt{2\pi}} \exp{\frac{-(x-\mu)^2}{2\sigma^2}} dx. \end{aligned}$$ Substitute \(z = x - \mu\): $$\begin{aligned} \E{X} &= \int_{-\infty}^\infty (z+\mu) \frac{1}{\sigma\sqrt{2\pi}} \exp{\frac{-z^2}{2\sigma^2}} dz \\ &= \mu \int_{-\infty}^\infty \frac{1}{\sigma\sqrt{2\pi}} \exp{\frac{-z^2}{2\sigma^2}} dz + \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^\infty z \exp{\frac{-z^2}{2\sigma^2}} dz. \end{aligned}$$ The first integrand is a Gaussian PDF with mean \(\mu = 0\), so its integral over all \(z\) is 1 by normalization. The second integrand is an odd function of \(z\), so its improper integral over all \(z\) is 0. This leaves $$\begin{aligned} \E{X} = \mu. \end{aligned}$$
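Although the integral can be evaluated analytically, the result is also easy to check numerically. The following is a minimal Python sketch that approximates \(\int x f(x) dx\) with a Riemann sum; the values of \(\mu\) and \(\sigma\) are arbitrary illustrative choices:

```python
import numpy as np

# Numerically verify that E[X] = mu for a Gaussian PDF.
# mu and sigma are arbitrary illustrative choices.
mu, sigma = 1.5, 0.7

# Grid wide enough to capture essentially all of the probability mass.
x = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 200_001)
f = np.exp(-(x - mu) ** 2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

# Riemann-sum approximation of the integral of x f(x) dx.
dx = x[1] - x[0]
expected = np.sum(x * f) * dx

print(expected)  # approximately 1.5 = mu
```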

Due to its sum or integral form, the expected value \(\E{\cdot}\) has some familiar properties for random variables \(X\) and \(Y\) and reals \(a\) and \(b\): \[\begin{aligned} \E{a} &= a \\ \E{X + a} &= \E{X} + a \\ \E{a X} &= a \E{X} \\ \E{\E{X}} &= \E{X} \\ \E{a X + b Y} &= a \E{X} + b \E{Y}. \end{aligned}\]
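These properties are easy to demonstrate by simulation, since a sample mean approximates an expectation. The following is a minimal Python sketch of the last (linearity) property; the distributions and constants are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n = 1_000_000

# Arbitrary illustrative choices: X ~ Exponential(1) with E[X] = 1,
# Y ~ Uniform(0, 2) with E[Y] = 1, and constants a = 3, b = -2.
a, b = 3.0, -2.0
X = rng.exponential(scale=1.0, size=n)
Y = rng.uniform(0.0, 2.0, size=n)

# Linearity: E[aX + bY] should agree with a E[X] + b E[Y] (both ~ 1.0).
print(np.mean(a * X + b * Y))
print(a * np.mean(X) + b * np.mean(Y))
```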

Online Resources for Section 3.7

No online resources.