In this video, the concepts in estimation theory introduced so far for
scalar random variables are extended to deal with estimating multiple
parameters, for example the mean and variance of a distribution
simultaneously. The definition of a vector parameter estimator is
introduced, and the definition of bias is extended to the vector case as
an example. The
principal focus of the video is on extending the Cramér-Rao Lower Bound
(CRLB) to real parameter vectors, by placing a bound on the covariance
matrix of the estimator. Parallels with the scalar CRLB are drawn
throughout, but the emphasis is on the key calculation of the Fisher
Information Matrix (FIM). This is the expectation of functions of the
derivatives of the log-likelihood function, but considering the
derivatives with respect to all the elements of the parameter vector.
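As a reminder of the standard definitions (written here in generic notation, which may differ slightly from the video's), the elements of the FIM and the resulting vector CRLB are:

```latex
% (i,j)-th element of the Fisher Information Matrix for parameter vector theta
[\mathbf{I}(\boldsymbol{\theta})]_{ij}
  = -\mathbb{E}\!\left[
      \frac{\partial^2 \ln p(\mathbf{x};\boldsymbol{\theta})}
           {\partial \theta_i \,\partial \theta_j}
    \right]

% Vector CRLB: the estimator covariance exceeds the inverse FIM in the
% positive-semidefinite sense, which in particular bounds each variance:
\mathbf{C}_{\hat{\boldsymbol{\theta}}} - \mathbf{I}^{-1}(\boldsymbol{\theta})
  \succeq \mathbf{0},
\qquad
\operatorname{var}(\hat{\theta}_i) \ge [\mathbf{I}^{-1}(\boldsymbol{\theta})]_{ii}.
```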
Finally, the line-fitting example is considered: estimating the
parameters of a straight line fitted to a set of data that is assumed to
follow a linear model. The FIM and CRLB are calculated, and it is shown that in this
case the minimum variance unbiased estimator (MVUE) can be found as
before. Numerical simulations are also provided to demonstrate the
correctness of the calculations.
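A numerical check along the lines described above can be sketched as follows. The parameter values (A, B, sigma2) and the sample size are illustrative assumptions, not taken from the video; the data model is the standard one, x[n] = A + B*n + w[n] with i.i.d. zero-mean Gaussian noise of variance sigma2, for which the least-squares estimator is the MVUE and the FIM has a closed form.

```python
import numpy as np

# Illustrative (assumed) true parameters of the straight line and noise variance.
A, B, sigma2 = 1.0, 0.5, 0.1
N = 50
n = np.arange(N)
H = np.column_stack([np.ones(N), n.astype(float)])  # linear-model observation matrix

# For the linear Gaussian model the FIM is H^T H / sigma2, so the CRLB on the
# estimator covariance is sigma2 * (H^T H)^{-1}; take its diagonal for the
# per-parameter variance bounds.
crlb = np.diag(sigma2 * np.linalg.inv(H.T @ H))

# Monte Carlo: generate many realisations, estimate (A, B) by least squares
# (the MVUE here), and compare the empirical variances against the CRLB.
rng = np.random.default_rng(0)
trials = 20_000
X = A + B * n + rng.normal(scale=np.sqrt(sigma2), size=(trials, N))
est = np.linalg.lstsq(H, X.T, rcond=None)[0].T  # shape (trials, 2): (A_hat, B_hat)

emp_var = est.var(axis=0)
print("CRLB diagonal:      ", crlb)
print("empirical variances:", emp_var)  # agree to within Monte Carlo error
```

Since the estimator attains the bound in this case, the printed empirical variances match the CRLB diagonal up to Monte Carlo fluctuation, which is the behaviour the video's simulations demonstrate.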
PGEE11164 Probability, Estimation Theory, and Random Signals Lectures -- School of Engineering, University of Edinburgh. Copyright James R. Hopgood and University of Edinburgh, Scotland, United Kingdom (UK). 2020.
Institute for Digital Communications, Alexander Graham Bell Building, The King's Buildings, Thomas Bayes Road, Edinburgh, EH9 3JL. UK.