A risk-unbiased approach to a new Cramér-Rao bound (Conference Paper)

abstract

  • How accurately can one estimate a deterministic parameter in the presence of other unknown deterministic model parameters? The most popular answer to this question is given by the Cramér-Rao bound (CRB). The main assumption behind the derivation of the CRB is locally unbiased estimation of all model parameters. This work questions that assumption: each parameter in turn is treated as the single parameter of interest, while the remaining model parameters are treated as nuisance parameters, since uncertainty about them interferes with estimation of the parameter of interest. Accordingly, a new Cramér-Rao-type bound on the mean squared error (MSE) of non-Bayesian estimators is established with no unbiasedness condition on the nuisance parameters. Instead, Lehmann's concept of unbiasedness is imposed on a risk that measures the distance between the estimator and the locally best unbiased (LBU) estimator, which assumes perfect knowledge of the nuisance parameters. The proposed bound is compared to the CRB and to the MSE of the maximum likelihood estimator (MLE). Simulations show that the proposed bound is a tighter lower bound on the MSE of this estimator than the CRB.
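
  A minimal numerical sketch (not from the paper, just a standard textbook example) of why the CRB's unbiasedness assumption can be restrictive: when estimating the variance of a Gaussian with the mean as an unknown nuisance parameter, the biased MLE of the variance attains an MSE *below* the classical CRB, so the CRB is not a valid benchmark for it.

```python
import numpy as np

# Hypothetical illustration: x_1..x_n ~ N(mu, sigma^2), mu unknown (nuisance),
# sigma^2 the parameter of interest. The classical CRB for sigma^2 is
# 2*sigma^4/n, while the biased MLE (1/n)*sum((x - xbar)^2) has
# MSE = (2n - 1)*sigma^4/n^2, which is strictly smaller.

rng = np.random.default_rng(0)
n, mu, sigma2 = 10, 3.0, 1.0
trials = 200_000

x = rng.normal(mu, np.sqrt(sigma2), size=(trials, n))
sigma2_mle = np.var(x, axis=1)             # ddof=0: the biased MLE of sigma^2
mse = np.mean((sigma2_mle - sigma2) ** 2)  # Monte Carlo MSE of the MLE

crb = 2 * sigma2**2 / n                          # classical CRB: 0.20 for n=10
mse_theory = (2 * n - 1) * sigma2**2 / n**2      # exact MSE: 0.19 for n=10

print(f"CRB         : {crb:.4f}")
print(f"MSE (theory): {mse_theory:.4f}")
print(f"MSE (MC)    : {mse:.4f}")
```

  Here the MLE "beats" the CRB only because it is biased; bounds of the type discussed in the abstract aim to remain valid lower bounds without imposing unbiasedness on the nuisance parameters.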

publication date

  • January 1, 2016