Image noise variance estimation using the mixed Lagrange time-delay autoregressive model


Bibliographic Details
Main Authors: Sim, K.-S., Tso, C.-P., Law, K.-K.
Format: Article
Published: WILEY-LISS 2008
Online Access:http://shdl.mmu.edu.my/2751/
Description
Summary: The mixed Lagrange time-delay estimation autoregressive (MLTDEAR) model is proposed for estimating image noise variance. The only information available to the estimator is the corrupted image and the knowledge that the noise is additive and white. The image autocorrelation function is computed and used to obtain the MLTDEAR model coefficients, exploiting the relationship between the MLTDEAR and linear prediction models. Forward-backward prediction is then applied to obtain the predictor coefficients, and the MLTDEAR coefficients, together with prior samples of the zero-offset autocorrelation values, are used to predict the power of the noise-free image. Finally, the fundamental performance limit of signal and noise estimation, derived from the Cramér-Rao inequality, is presented.
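The core idea behind the pipeline the summary describes is that additive white noise inflates only the zero-lag value of the image autocorrelation, so the noise-free signal power can be recovered by predicting the zero-lag autocorrelation from the nonzero lags. The sketch below is a deliberately simplified stand-in, not the paper's MLTDEAR estimator: it replaces the mixed Lagrange time-delay autoregressive fit with a plain Lagrange-polynomial extrapolation of the sample autocorrelation to lag zero, and the function name, lag order `p`, and the row-wise (raveled) autocorrelation estimate are all assumptions made for illustration.

```python
import numpy as np

def estimate_noise_variance(image, p=3):
    """Rough sketch of autocorrelation-based noise variance estimation.

    White noise adds its variance only to the zero-lag autocorrelation
    R(0); the autocorrelation of a smooth image varies slowly with lag.
    Extrapolating R(k) from lags 1..p down to k = 0 therefore
    approximates the noise-free signal power, and the gap between the
    measured R(0) and that extrapolation approximates the noise
    variance.  (Simplified illustration only -- the MLTDEAR model in
    the paper fits a time-delay autoregressive model instead of the
    polynomial extrapolation used here.)
    """
    x = image.astype(float).ravel()
    x -= x.mean()                      # work with the zero-mean signal
    n = x.size
    # Biased sample autocorrelation at lags 0..p.
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(p + 1)])
    # Lagrange-polynomial extrapolation: fit a degree-(p-1) polynomial
    # through (1, R(1)) .. (p, R(p)) and evaluate it at lag 0.
    lags = np.arange(1, p + 1)
    signal_power = np.polyval(np.polyfit(lags, r[1:], p - 1), 0.0)
    # Measured R(0) minus predicted signal power; clip at zero.
    return max(r[0] - signal_power, 0.0)
```

For a smooth test image the estimate should be near zero, while adding white Gaussian noise of variance sigma^2 should shift the estimate to roughly sigma^2; in practice the lag order `p` trades bias against sensitivity to autocorrelation sampling error.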