This Topic builds on the sum of random variables introduced in Topic 33 on auxiliary variables. The case of independent random variables and vectors is considered specifically, where it is shown that the probability density function (pdf) of the sum is the convolution of the individual pdfs. An example shown in the video, but not in the notes, is the probability mass function (pmf) of the sum of two fair dice. The video then shows how sums of independent random variables can be handled elegantly using characteristic functions, since convolution in the pdf domain corresponds to multiplication in the characteristic function domain. This result is useful in proofs such as that of the central limit theorem (CLT) in Topic 41.
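As a quick numerical illustration of the convolution result, the dice example from the video can be reproduced with NumPy (a sketch, not taken from the notes themselves): the pmf of the sum of two independent fair dice is the discrete convolution of the two individual pmfs.

```python
import numpy as np

# pmf of a single fair die: faces 1..6, each with probability 1/6
die = np.full(6, 1 / 6)

# pmf of the sum of two independent dice: the convolution of the two pmfs
sum_pmf = np.convolve(die, die)

# possible sums range from 2 (1+1) to 12 (6+6), giving 11 outcomes
for s, p in zip(range(2, 13), sum_pmf):
    print(f"P(sum = {s:2d}) = {p:.4f}")
```

The output shows the familiar triangular pmf, peaking at P(sum = 7) = 6/36. The same structure explains the characteristic function approach: since convolution becomes multiplication under the (discrete-time) Fourier transform, the characteristic function of the sum is the product of the individual characteristic functions.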
PGEE11164 Probability, Estimation Theory, and Random Signals Lectures -- School of Engineering, University of Edinburgh. Copyright James R. Hopgood and University of Edinburgh, Scotland, United Kingdom (UK). 2020.
Institute for Digital Communications, Alexander Graham Bell Building, The King's Buildings, Thomas Bayes Road, Edinburgh, EH9 3JL. UK.