In this section, further fundamental results regarding the expectation and variance of random variables (discrete or continuous) are stated and proved. For any constant \(c \in \real\) and any random variable \(X\),
$$ \mathbf{E}(cX) = c\mathbf{E}(X). \tag{13} $$
The proof of (13) is trivial. From (2), for \(X\) discrete
$$ \begin{align*} \mathbf{E}(cX) &= \sum_{x} cx \cdot f_{X}(x) \\ &= c \sum_{x} x \cdot f_{X}(x) \\ &= c \mathbf{E}(X), \end{align*} $$
while from (9), for \(X\) continuous
$$ \begin{align*} \mathbf{E}(cX) &= \int_{-\infty}^{\infty} cx \cdot f_{X}(x) \mathrm{d}x \\ &= c \int_{-\infty}^{\infty} x \cdot f_{X}(x) \mathrm{d}x \\ &= c \mathbf{E}(X). \end{align*} $$
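As a quick illustrative check of (13), suppose \(X\) is the score shown by a single roll of a fair six-sided die, so that \(f_{X}(x) = \frac{1}{6}\) for \(x \in \{1, 2, \ldots, 6\}\) and \(\mathbf{E}(X) = \frac{21}{6} = \frac{7}{2}\). Doubling the score gives
$$ \mathbf{E}(2X) = \sum_{x=1}^{6} 2x \cdot \frac{1}{6} = \frac{2 \cdot 21}{6} = 7 = 2 \cdot \mathbf{E}(X), $$
in agreement with (13) for \(c = 2\).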
For discrete or continuous random variables \(X\) and \(Y\),
$$ \mathbf{E}(X + Y) = \mathbf{E}(X) + \mathbf{E}(Y). \tag{14} $$
The proof of (14) is as follows. Suppose first that \(X\) and \(Y\) are discrete, with joint mass function \(f_{X, Y} : \real^2 \to [0, 1]\) given by \(f_{X,Y}(x, y) = \mathbf{P}(X = x \text{ and } Y = y)\). Then an extension of (2) gives
$$ \begin{align*} \mathbf{E}(X + Y) &= \sum_{x} \sum_{y} (x + y) \cdot f_{X, Y}(x, y) \\ &= \sum_{x} x \sum_{y} f_{X, Y}(x, y) + \sum_{y} y \sum_{x} f_{X, Y}(x, y) \\ &= \sum_{x} x \cdot f_{X}(x) + \sum_{y} y \cdot f_{Y}(y) \\ &= \mathbf{E}(X) + \mathbf{E}(Y), \end{align*} $$
since the marginal mass functions satisfy \(f_{X}(x) = \sum_{y} f_{X,Y}(x, y)\) and \(f_{Y}(y) = \sum_{x} f_{X,Y}(x, y)\).
Noting (9), for \(X\) and \(Y\) continuous the proof begins
$$ \mathbf{E}(X + Y) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x + y) \cdot f_{X, Y}(x, y) \mathrm{d}x\mathrm{d}y, $$
where \(f_{X,Y} : \real^2 \to [0, \infty)\) is the joint density function of \(X\) and \(Y\). The proof then proceeds in a similar way to the discrete case, with summations replaced by integrations.
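To illustrate (14), let \(X\) and \(Y\) be the scores shown by two fair six-sided dice. Then
$$ \mathbf{E}(X + Y) = \mathbf{E}(X) + \mathbf{E}(Y) = \frac{7}{2} + \frac{7}{2} = 7. $$
Note that (14) requires no assumption of independence: the result holds even if the two dice are somehow coupled.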
For discrete or continuous independent random variables \(X\) and \(Y\),
$$ \mathbf{E}(XY) = \mathbf{E}(X) \cdot \mathbf{E}(Y). \tag{15} $$
The proof of (15) is first presented for discrete random variables \(X\) and \(Y\). Let \(X\) and \(Y\) have joint mass function \(f_{X,Y} : \real^2 \to [0, 1]\) given by \(f_{X,Y}(x, y) = \mathbf{P}(X = x \text{ and } Y = y)\). If \(X\) and \(Y\) are independent, then (by definition) the events \(\{X = x\}\) and \(\{Y = y\}\) are independent for all \(x\) and \(y\); that is, the value taken by \(Y\) is not affected by the value taken by \(X\). For \(X\) and \(Y\) independent, therefore,
$$ \mathbf{P}(X = x \text{ and } Y = y) = \mathbf{P}(X = x) \cdot \mathbf{P}(Y = y), $$
so that \(f_{X,Y}(x, y) = f_{X}(x) \cdot f_{Y}(y)\). Therefore,
$$ \begin{align*} \mathbf{E}(XY) &= \sum_{x} \sum_{y} xy \cdot f_{X, Y}(x, y) \\ &= \sum_{x} \sum_{y} xy \cdot f_{X}(x) f_{Y}(y) \\ &= \sum_{x} \Big\{ x \cdot f_{X}(x) \cdot \sum_{y} y \cdot f_{Y}(y) \Big\} \\ &= \sum_{x} \{ x \cdot f_{X}(x) \cdot \mathbf{E}(Y) \} \\ &= \mathbf{E}(X) \cdot \mathbf{E}(Y). \end{align*} $$
For \(X\) and \(Y\) continuous, the proof begins
$$ \mathbf{E}(XY) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} xy \cdot f_{X, Y}(x, y) \mathrm{d}x\mathrm{d}y $$
where \(f_{X,Y} : \real^2 \to [0, \infty)\) is the joint density function of \(X\) and \(Y\). The proof then proceeds in a similar way to the discrete case with summations replaced by integrations.
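To illustrate (15), let \(X\) and \(Y\) be the scores shown by two independent fair six-sided dice, so that \(f_{X,Y}(x, y) = \frac{1}{36}\) for \(x, y \in \{1, 2, \ldots, 6\}\). Then
$$ \mathbf{E}(XY) = \sum_{x=1}^{6} \sum_{y=1}^{6} xy \cdot \frac{1}{36} = \frac{1}{36} \Big( \sum_{x=1}^{6} x \Big) \Big( \sum_{y=1}^{6} y \Big) = \frac{21 \cdot 21}{36} = \frac{49}{4} = \mathbf{E}(X) \cdot \mathbf{E}(Y). $$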
For the discrete or continuous random variable \(X\),
$$ \text{Var}(X) = \mathbf{E}(X^{2}) - (\mathbf{E}(X))^{2}. \tag{16} $$
The proof of (16) holds for \(X\) discrete or continuous. As a shorthand notation, let \(\mu = \mathbf{E}(X)\). Then,
$$ \text{Var}(X) = \mathbf{E}\left(\left(X - \mu\right)^{2}\right) = \mathbf{E}(X^{2} - 2 \cdot \mu \cdot X + \mu^{2}). $$
From (13) and (14),
$$ \begin{align*} \mathbf{E}(X^{2} - 2 \cdot \mu \cdot X + \mu^{2}) &= \mathbf{E}(X^{2}) - 2 \cdot \mu \cdot \mathbf{E}(X) + \mu^{2} \\ &= \mathbf{E}(X^{2}) - 2\mu^{2} + \mu^{2} \\ &= \mathbf{E}(X^{2}) - (\mathbf{E}(X))^{2}, \end{align*} $$
which proves (16).
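For the fair six-sided die example, \(\mathbf{E}(X^{2}) = \frac{1 + 4 + 9 + 16 + 25 + 36}{6} = \frac{91}{6}\), so (16) gives
$$ \text{Var}(X) = \frac{91}{6} - \left(\frac{7}{2}\right)^{2} = \frac{182 - 147}{12} = \frac{35}{12}. $$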
For the discrete or continuous random variable \(X\) and the constant \(c \in \real\),
$$ \text{Var}(cX) = c^{2} \cdot \text{Var}(X). \tag{17} $$
The proof of (17) holds for \(X\) discrete or continuous. Again, let \(\mu = \mathbf{E}(X)\). Then, \(\text{Var}(cX) = \mathbf{E}((cX - c\mu)^2) = \mathbf{E}(c^{2}\cdot(X - \mu)^{2}) = c^{2} \cdot \mathbf{E}((X - \mu)^2) = c^{2} \cdot \text{Var}(X)\).
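Continuing the die example, doubling the score quadruples (rather than doubles) the variance: by (17),
$$ \text{Var}(2X) = 2^{2} \cdot \text{Var}(X) = 4 \cdot \frac{35}{12} = \frac{35}{3}. $$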
Finally, for discrete or continuous independent random variables \(X\) and \(Y\),
$$ \text{Var}(X + Y) = \text{Var}(X) + \text{Var}(Y). \tag{18} $$
The proof of (18) holds for \(X\) and \(Y\) discrete or continuous. As a shorthand notation, let \(\mu_{X} = \mathbf{E}(X)\) and \(\mu_{Y} = \mathbf{E}(Y)\), so that \(\mathbf{E}(X + Y) = \mu_{X} + \mu_{Y}\) by (14). Then,
$$ \begin{align*} \text{Var}(X + Y) &= \mathbf{E}\left(\left((X - \mu_{X}) + (Y - \mu_{Y})\right)^{2}\right) \\ &= \mathbf{E}\left((X - \mu_{X})^{2}\right) + 2 \cdot \mathbf{E}((X - \mu_{X}) \cdot (Y - \mu_{Y})) + \mathbf{E}\left((Y - \mu_{Y})^{2}\right) \\ &= \text{Var}(X) + 2 \cdot \mathbf{E}((X - \mu_{X}) \cdot (Y - \mu_{Y})) + \text{Var}(Y), \end{align*} $$
where the second line uses (13) and (14).
However, from (15), if \(X\) and \(Y\) are independent random variables (so that \(X - \mu_{X}\) and \(Y - \mu_{Y}\) are also independent), then \(\mathbf{E}((X - \mu_{X}) \cdot (Y - \mu_{Y})) = \mathbf{E}(X - \mu_{X}) \cdot \mathbf{E}(Y - \mu_{Y}) = 0\), since \(\mathbf{E}(X - \mu_{X}) = \mathbf{E}(X) - \mu_{X} = 0\) by (13) and (14). Hence \(\text{Var}(X + Y) = \text{Var}(X) + \text{Var}(Y)\), as claimed in (18).
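To illustrate (18), let \(X\) and \(Y\) again be the scores shown by two independent fair six-sided dice. Then
$$ \text{Var}(X + Y) = \text{Var}(X) + \text{Var}(Y) = \frac{35}{12} + \frac{35}{12} = \frac{35}{6}. $$
Unlike (14), the independence of \(X\) and \(Y\) is essential here; without it, the cross term \(2 \cdot \mathbf{E}((X - \mu_{X}) \cdot (Y - \mu_{Y}))\) need not vanish.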