Chapter 20 Appendix: Random Variables

\(\newcommand{\E}{\mathrm{E}}\) \(\newcommand{\Var}{\mathrm{Var}}\) \(\newcommand{\Cov}{\mathrm{Cov}}\) \(\newcommand{\bmx}{\mathbf{x}}\) \(\newcommand{\bmH}{\mathbf{H}}\) \(\newcommand{\bmI}{\mathbf{I}}\) \(\newcommand{\bmX}{\mathbf{X}}\) \(\newcommand{\bmy}{\mathbf{y}}\) \(\newcommand{\bmY}{\mathbf{Y}}\) \(\newcommand{\bmbeta}{\boldsymbol{\beta}}\) \(\newcommand{\bmepsilon}{\boldsymbol{\epsilon}}\) \(\newcommand{\bmmu}{\boldsymbol{\mu}}\) \(\newcommand{\bmSigma}{\boldsymbol{\Sigma}}\) \(\newcommand{\XtX}{\bmX^\mT\bmX}\) \(\newcommand{\mT}{\mathsf{T}}\) \(\newcommand{\XtXinv}{(\bmX^\mT\bmX)^{-1}}\)

20.1 Expected value

Intuitively, the expected value is the “average” value of a random variable. The expected value of a continuous random variable \(Y\) is \[\E[Y] = \int y\,f(y)\, dy\] where \(f(y)\) is the probability density function of \(Y\). For a discrete random variable \(Y\), the expected value is \[\E[Y] = \sum_{y} y\, p(y)\] where \(p(y)\) is the probability mass function of \(Y\). In both cases, the integral or sum is taken only over the support of the random variable.
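As a concrete check of the discrete formula, the sketch below computes \(\E[Y]\) for a fair six-sided die, where each outcome in \(\{1, \dots, 6\}\) has mass \(1/6\). The die example and the use of Python/NumPy are illustrative assumptions, not part of the text above.

```python
import numpy as np

# Fair six-sided die: support {1, ..., 6}, each value with probability 1/6.
y = np.arange(1, 7)
p = np.full(6, 1 / 6)

# E[Y] = sum over the support of y * p(y)
expected_value = np.sum(y * p)
print(expected_value)  # 3.5
```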

An important property of the expected value is that it is linear: for a random variable \(Y\) and real-valued constants \(a\) and \(b\), \[\E[a + bY] = a + b\E[Y].\]
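A minimal Monte Carlo sketch can illustrate linearity. The exponential distribution for \(Y\) and the constants \(a = 2\), \(b = 3\) are arbitrary choices made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = 2.0, 3.0

# Y ~ Exponential(1), so E[Y] = 1 and E[a + bY] should be near a + b = 5.
y = rng.exponential(scale=1.0, size=1_000_000)
print(np.mean(a + b * y))   # approximately 5.0
print(a + b * np.mean(y))   # matches, up to simulation noise
```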

20.2 Variance

The variance of a random variable \(Y\) measures its spread around its mean: \[\Var(Y) = \E\left[\left(Y - \E[Y]\right)^2\right].\]
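Expanding the square inside the expectation and applying linearity gives a shortcut form that is often easier to compute: \[\Var(Y) = \E\left[Y^2 - 2Y\E[Y] + \E[Y]^2\right] = \E[Y^2] - 2\E[Y]^2 + \E[Y]^2 = \E[Y^2] - \E[Y]^2.\]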

Let \(X\) and \(Y\) be random variables and \(a\), \(b\), and \(c\) be real-valued constants. Then

  • \(\Var(a + bY) = b^2\Var(Y)\)
  • \(\Var(bY + cX) = b^2\Var(Y) + c^2\Var(X) + 2bc\Cov(Y, X)\) (see the numerical check after this list).
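Both identities can be checked by simulation. The sketch below draws correlated \((Y, X)\) pairs from a bivariate normal distribution; the particular covariance matrix and the constants \(a\), \(b\), \(c\) are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
a, b, c = 1.0, 2.0, -0.5

# Correlated (Y, X) pairs with Var(Y) = 1, Var(X) = 2, Cov(Y, X) = 0.6.
cov = np.array([[1.0, 0.6],
                [0.6, 2.0]])
samples = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=1_000_000)
Y, X = samples[:, 0], samples[:, 1]

# Var(a + bY) = b^2 Var(Y)
print(np.var(a + b * Y))   # approximately 4.0
print(b**2 * np.var(Y))    # matches, up to simulation noise

# Var(bY + cX) = b^2 Var(Y) + c^2 Var(X) + 2bc Cov(Y, X)
lhs = np.var(b * Y + c * X)
rhs = (b**2 * np.var(Y) + c**2 * np.var(X)
       + 2 * b * c * np.cov(Y, X)[0, 1])
print(lhs, rhs)            # both approximately 3.3
```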