The Risk-Unbiased Cramér-Rao Bound for Non-Bayesian Multivariate Parameter Estimation


  • How accurately can one estimate a deterministic parameter in the presence of other unknown deterministic nuisance parameters? The most popular answer to this question is given by the Cramér–Rao bound (CRB). The main assumption behind the derivation of the CRB is locally unbiased estimation of all model parameters. This paper questions that assumption. In multivariate parameter estimation, each parameter in turn can be treated as the single parameter of interest, while the other model parameters are treated as nuisance parameters, since uncertainty about them interferes with estimation of the parameter of interest. This approach is used here to provide a fresh look at deterministic parameter estimation. A new Cramér–Rao (CR) type bound is derived without assuming unbiased estimation of the nuisance parameters. Instead, we apply Lehmann's concept of unbiasedness to a risk that measures the distance between the estimator and the locally best unbiased estimator, which assumes perfect knowledge of the nuisance parameters. The proposed risk-unbiased CRB (RUCRB) is proven to be asymptotically attainable by the maximum likelihood estimator while being tighter than the conventional CRB. Furthermore, simulations verify the asymptotic achievability of the RUCRB by the maximum likelihood estimator for an array processing problem.
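The conventional CRB with nuisance parameters, which the abstract takes as its starting point, can be illustrated with a minimal numerical sketch. The model below (a linear trend with an unknown intercept acting as the nuisance parameter) is a hypothetical example chosen for simplicity, not the paper's array processing model; it shows how treating the nuisance parameter as unknown inflates the bound on the parameter of interest, and how the ML estimator's variance approaches the joint CRB.

```python
import numpy as np

# Illustrative model (assumed for this sketch, not from the paper):
# y_i = theta * t_i + b + w_i, with w_i ~ N(0, 1).
# theta is the parameter of interest; the intercept b is a nuisance parameter.
rng = np.random.default_rng(0)
n = 50
t = np.linspace(0.0, 1.0, n)
theta_true, b_true = 2.0, -0.5

# Fisher information matrix for (theta, b) under unit-variance Gaussian noise.
F = np.array([[np.sum(t**2), np.sum(t)],
              [np.sum(t),    float(n)]])

crb_joint = np.linalg.inv(F)[0, 0]  # conventional CRB for theta, b unknown
crb_known = 1.0 / np.sum(t**2)      # CRB for theta if b were known exactly

# Monte Carlo: the ML estimator of theta (least squares in this Gaussian
# linear model) with b jointly estimated as a nuisance parameter.
trials = 20000
A = np.column_stack([t, np.ones(n)])
est = np.empty(trials)
for k in range(trials):
    y = theta_true * t + b_true + rng.standard_normal(n)
    est[k] = np.linalg.lstsq(A, y, rcond=None)[0][0]

print(f"CRB, b unknown: {crb_joint:.4f}")
print(f"CRB, b known:   {crb_known:.4f}")
print(f"ML variance:    {est.var():.4f}")
```

The gap between the two bounds is the price paid for not knowing the nuisance parameter; the paper's contribution is a tighter-than-CRB bound obtained by replacing the unbiasedness assumption on the nuisance parameters with Lehmann's risk-unbiasedness.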

publication date

  • September 15, 2018