Executive Summary: C. R. Rao's differential geometric approach to statistical problems (1945) treats a statistical model as a manifold, with the Fisher information matrix serving as a Riemannian metric. A divergence is a non-negative function defined on every pair of probability distributions (p, q) that vanishes if and only if p = q. In 1992, Eguchi established a general theory by which a divergence induces a Riemannian metric; applied to relative entropy, this construction recovers the Fisher information metric. Amari and Nagaoka (2001) extended this framework, derived the Cramer-Rao lower bound from the Kullback-Leibler divergence function, and extended it to the Bayesian Cramer-Rao and Barankin bounds. They also established an α-version of the Cramer-Rao bound from the I_α-divergence, together with its Bayesian counterpart. This proposal aims to establish a unified theory for deriving Cramer-Rao type bounds from divergence functions, to extend these bounds to continuous probability densities, and to derive quantum analogues of them.
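The link between a divergence and the Fisher information metric can be illustrated numerically. The sketch below (an illustration, not part of the proposal) uses the one-parameter Bernoulli model, where the Fisher information is known in closed form as 1/(θ(1−θ)), and checks that the second derivative of the Kullback-Leibler divergence at coincident arguments reproduces it, in the spirit of Eguchi's construction.

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence between Bernoulli(p) and Bernoulli(q)."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def fisher_info(theta):
    """Fisher information of the Bernoulli model: 1 / (theta * (1 - theta))."""
    return 1.0 / (theta * (1.0 - theta))

theta = 0.3
h = 1e-4
# The metric induced by a divergence is its second derivative in the second
# argument, evaluated at coincident arguments; for the KL divergence this
# yields the Fisher information. Approximate it by a central finite difference.
second_deriv = (kl(theta, theta + h) - 2 * kl(theta, theta)
                + kl(theta, theta - h)) / h**2
print(second_deriv, fisher_info(theta))  # the two values agree closely
```

For Bernoulli(θ), differentiating KL(p, q) twice in q and setting q = p gives 1/p + 1/(1−p) = 1/(p(1−p)), so the finite-difference value matches the closed-form Fisher information up to discretization error.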