Reference Texts
S.M. Kay, Fundamentals of Statistical Signal Processing, Volume I: Estimation Theory, Prentice Hall, 1993.
M.H. Hayes, Statistical Digital Signal Processing and Modeling, John Wiley & Sons, 1996.
A. Papoulis, S.U. Pillai, Probability, Random Variables, and Stochastic Processes, 4th ed., McGraw-Hill, 2002.
D. Manolakis, V.K. Ingle, S.M. Kogon, Statistical and Adaptive Signal Processing: Spectral Estimation, Signal Modeling, Adaptive Filtering and Array Processing, Artech House, 2005.
P. Stoica, R.L. Moses, Introduction to Spectral Analysis, Prentice Hall, 1997.
Learning Objectives
The course provides the basic knowledge for the treatment of stochastic signals, with particular attention to the theory of parameter estimation (both classical and Bayesian methods), the filtering of random signals, and methods for spectral estimation.
At the end of the course, the student will be able to classify the different methods and criteria used in estimation theory and to select those most suitable for extracting the parameters of interest of a signal in the presence of noise in a given application.
Prerequisites
The student is expected to have a basic knowledge of: signals and systems; probability, random variables and random processes, and their characterization in the time and frequency domains; vector and matrix representations.
Teaching Methods
Lectures
Type of Assessment
The final exam consists of two parts:
- A computer project, to be solved (preferably) in MATLAB, on a topic agreed with the teacher
- An oral test on the topics covered during the course
Course program
Review of random variables and random processes. Probability density and cumulative distribution functions. Expectations, moments, joint moments. Gaussian PDF. Stationary processes. Autocorrelation and autocovariance matrices. Power spectral density. Filtering of random processes. Spectral factorization theorem.
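A minimal sketch of the last two ideas in this block, filtering a random process and estimating its autocorrelation, using Python/NumPy as a stand-in for MATLAB (the AR(1) coefficient, noise variance, and sample size below are illustrative choices, not part of the course material):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000
a, sigma2 = 0.8, 1.0                     # AR(1) coefficient and driving-noise variance
w = rng.normal(0.0, np.sqrt(sigma2), N)  # white Gaussian input

# Filter white noise through a first-order recursion: x[n] = a*x[n-1] + w[n]
# (stable and asymptotically stationary since |a| < 1)
x = np.empty(N)
x[0] = w[0]
for n in range(1, N):
    x[n] = a * x[n - 1] + w[n]

# Theoretical autocorrelation of a stationary AR(1): r[k] = sigma2 * a^|k| / (1 - a^2)
r_theory = sigma2 * a ** np.arange(4) / (1 - a**2)

# Biased sample autocorrelation estimate: r_hat[k] = (1/N) * sum_n x[n] x[n+k]
r_hat = np.array([np.mean(x[: N - k] * x[k:]) for k in range(4)])
```

With a long enough record, `r_hat` matches `r_theory` closely, which is the consistency property the course develops formally.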
Introduction to the estimation problem. Observed data and signal models. PDF of the data. Biased and unbiased estimators. Minimum variance unbiased (MVU) estimators. Cramér-Rao lower bound (CRLB). Fisher information. CRLB and transformation of parameters. CRLB for signals in AWGN. Sufficient statistics. Neyman-Fisher factorization theorem. Rao-Blackwell-Lehmann-Scheffé theorem.
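The textbook example tying these concepts together is estimating a DC level in white Gaussian noise, where the sample mean is unbiased and attains the CRLB. A Monte Carlo sketch (parameter values are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(1)
A, sigma, N, trials = 2.0, 1.5, 50, 20_000

# Model: x[n] = A + w[n], w[n] ~ N(0, sigma^2).
# The CRLB for A is sigma^2 / N, attained by the sample mean (efficient, MVU).
crlb = sigma**2 / N

X = A + rng.normal(0.0, sigma, size=(trials, N))
A_hat = X.mean(axis=1)        # sample-mean estimator, one estimate per trial

bias = A_hat.mean() - A       # empirically ~0: the estimator is unbiased
var = A_hat.var()             # empirically ~crlb: the bound is attained
```

Repeating the experiment many times and comparing the empirical variance to `crlb` is a useful sanity check on the theory.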
Linear signal model. Estimator for linear signal model and its covariance. Generalized linear signal model. Best linear unbiased estimator (BLUE).
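For the linear model x = Hθ + w with w ~ N(0, σ²I), the MVU estimator is θ̂ = (HᵀH)⁻¹Hᵀx with covariance σ²(HᵀH)⁻¹. A sketch fitting a noisy straight line (the true parameters and noise level below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
N, sigma = 100, 0.5
theta_true = np.array([1.0, -0.3])        # [intercept, slope]

n = np.arange(N)
H = np.column_stack([np.ones(N), n])      # observation matrix of the linear model
x = H @ theta_true + rng.normal(0.0, sigma, N)

# MVU estimator for x = H theta + w, w ~ N(0, sigma^2 I):
# theta_hat = (H^T H)^{-1} H^T x, with covariance sigma^2 (H^T H)^{-1}
theta_hat = np.linalg.solve(H.T @ H, H.T @ x)
cov_theta = sigma**2 * np.linalg.inv(H.T @ H)
```

Solving the normal equations with `np.linalg.solve` rather than explicitly inverting HᵀH is the numerically preferred route.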
Maximum likelihood estimator (MLE). Asymptotic properties of the MLE. Numerical computation of the MLE. MLE for a vector of parameters. Least squares (LS) estimator. LS estimator with a linear model. Weighted LS. Geometrical interpretation of the LS estimator.
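When the MLE has no closed form it must be computed numerically. A classical case is the frequency of a sinusoid in white Gaussian noise, where maximizing the likelihood amounts to maximizing the periodogram over frequency; a simple grid search sketch (signal parameters and grid spacing are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
N, f0, sigma = 256, 0.12, 0.3
n = np.arange(N)
x = np.cos(2 * np.pi * f0 * n) + rng.normal(0.0, sigma, N)

# For a sinusoid in WGN, the MLE of the frequency maximizes the periodogram;
# evaluating it on a dense frequency grid is the simplest numerical scheme.
freqs = np.linspace(0.01, 0.49, 4801)
I = np.abs(np.exp(-2j * np.pi * np.outer(freqs, n)) @ x) ** 2 / N
f_hat = freqs[np.argmax(I)]
```

In practice the grid search is typically refined with a local optimization step; the grid here is fine enough for the estimate to land very close to the true frequency.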
Bayesian approach to estimation. Prior and posterior PDF. MMSE Bayesian estimator. Generalized linear Bayesian model. Examples of Bayesian estimation. MMSE Bayesian estimation for a vector of parameters. Bayesian risk. Maximum a posteriori (MAP) estimation, scalar and vector parameter cases. Examples of MAP estimation. Linear MMSE (LMMSE) estimation. Geometric interpretation of LMMSE estimation. Sequential LS and LMMSE estimation. Wiener filtering, prediction and smoothing.
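A standard Bayesian example is a DC level with a Gaussian prior observed in Gaussian noise: the posterior is Gaussian, so the MMSE and MAP estimators coincide and shrink the sample mean toward the prior mean. A sketch (prior and noise parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
mu_A, var_A = 0.0, 1.0          # Gaussian prior on the DC level A
sigma2, N, trials = 1.0, 10, 50_000

# Gaussian prior + Gaussian likelihood: the MMSE (= MAP) estimator is
# A_hat = alpha * x_bar + (1 - alpha) * mu_A, with alpha = var_A / (var_A + sigma2/N)
A = rng.normal(mu_A, np.sqrt(var_A), trials)                  # draw A from the prior
X = A[:, None] + rng.normal(0.0, np.sqrt(sigma2), (trials, N))
x_bar = X.mean(axis=1)

alpha = var_A / (var_A + sigma2 / N)
A_mmse = alpha * x_bar + (1 - alpha) * mu_A

bmse = np.mean((A - A_mmse) ** 2)                    # empirical Bayesian MSE
bmse_theory = var_A * sigma2 / (N * var_A + sigma2)  # posterior variance
```

The empirical Bayesian MSE matches the posterior variance and is strictly below σ²/N, the MSE of the classical sample-mean estimator, which is the payoff of exploiting prior knowledge.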
Spectral estimation. Estimation of the autocorrelation sequence. Periodogram. Performance of periodogram estimates: mean and variance. Modified periodogram. Bartlett and Welch methods. Blackman-Tukey method. Performance of periodogram-averaging and periodogram-smoothing methods. Parametric methods: spectral estimation based on AR, MA and ARMA models. Subspace methods: Pisarenko harmonic decomposition, MUSIC, ESPRIT.
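The key fact motivating Bartlett's and Welch's methods is that the raw periodogram's variance does not decrease with the record length, while averaging the periodograms of K segments reduces the variance roughly K-fold at the cost of frequency resolution. A sketch on white noise (record length and segment count are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(5)
N, K = 4096, 16
x = rng.normal(0.0, 1.0, N)     # unit-variance white noise: true PSD = 1 everywhere

def periodogram(seg):
    # I(f_k) = |DFT(seg)|^2 / len(seg)
    return np.abs(np.fft.rfft(seg)) ** 2 / len(seg)

# Raw periodogram: roughly unbiased here, but its variance stays ~PSD^2
# no matter how large N gets (it is not a consistent estimator)
I_raw = periodogram(x)

# Bartlett's method: average the periodograms of K non-overlapping segments,
# trading frequency resolution for a ~K-fold variance reduction
segs = x.reshape(K, N // K)
I_bartlett = np.mean([periodogram(s) for s in segs], axis=0)
```

Comparing `np.var(I_raw)` with `np.var(I_bartlett)` over the interior frequency bins makes the variance-reduction trade-off concrete; Welch's method adds windowing and segment overlap on top of the same idea.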