Tuesday, June 25, 2013 at 4:30 PM in SAS 4201
Johannes Rauh, Max Planck Institute
Maximizing the information divergence from an exponential family
The information divergence (or Kullback-Leibler divergence) is a natural measure of the dissimilarity of two probability distributions. Although it is not an algebraic quantity, optimizing the divergence often leads to algebraic equations. This happens, for example, when computing the closest distribution (the MLE) in an algebraic statistical model to a given distribution. In my talk, I want to discuss the following related problem: Given an exponential family E, which distributions have the largest divergence from E? The critical equations can be translated into algebraic equations. However, the resulting system of equations is in general very hard to solve. In my doctoral thesis I related this maximization problem to the maximization of an entropy-like quantity D over the boundary of a polytope, and, together with F., I could show that there is a bijection between the two sets of local maximizers. On each non-trivial face of the polytope, the function D is smooth, with algebraic critical equations. Hence, the original system of hard algebraic equations is equivalent to a family of slightly easier algebraic equations in fewer variables. I will give two examples that demonstrate that this reformulation is useful, even if the number of faces of the polytope is huge.
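(A minimal numerical sketch of the problem, not taken from the talk: for the independence model of two binary variables, the reverse I-projection of a joint distribution P onto the model is the product of its marginals, so D(P||E) can be computed in closed form. The "parity" distribution, with mass 1/2 on (0,0) and (1,1), is known to maximize this divergence, achieving the value log 2. The function names below are illustrative only.)

import numpy as np

def kl(p, q):
    # Kullback-Leibler divergence D(p || q), using the 0 log 0 = 0 convention
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def divergence_from_independence(P):
    # For the independence model on two binary variables, the closest
    # distribution (reverse I-projection) of P is the product of its marginals.
    px = P.sum(axis=1)  # marginal of the first variable
    py = P.sum(axis=0)  # marginal of the second variable
    Q = np.outer(px, py)
    return kl(P.ravel(), Q.ravel())

# Parity distribution: mass 1/2 on (0,0) and (1,1)
P = np.array([[0.5, 0.0],
              [0.0, 0.5]])
print(divergence_from_independence(P))  # ~0.6931
print(np.log(2))                        # ~0.6931, the known maximum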
Seminar Organizer: Jon Hauenstein