Location: HILL 705
Date & time: Thursday, 07 December 2017 at 12:00PM - 1:00PM
Abstract: In the first part of the talk I will discuss a new information inequality based on the Gibbs variational principle. It allows one to derive tight upper and lower bounds on the expected value of a given observable f with respect to a probability measure Q whenever Q belongs to a (relative entropy) neighborhood of a given model P. The inequality scales with time and space and thus provides uncertainty quantification for phase diagrams and/or non-equilibrium steady states. In the second part of the talk we use the Rényi relative entropy and a new variational principle to develop uncertainty quantification for rare events (and other risk-sensitive functionals). More generally, we will uncover the intimate relation between the variational principles of statistical mechanics and the general problem of uncertainty quantification as dual optimization problems.
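For orientation, the Gibbs variational principle underlying the first part is usually stated in its Donsker–Varadhan form; the abstract gives no formulas, so the following is a sketch of the standard statement and the UQ bound it implies, not necessarily the speaker's exact inequality:

```latex
% Gibbs (Donsker--Varadhan) variational principle: for a bounded observable f,
%   \log \mathbb{E}_P[e^{f}] = \sup_{Q \ll P} \big\{ \mathbb{E}_Q[f] - R(Q \,\|\, P) \big\},
% where R(Q \,\|\, P) = \mathbb{E}_Q[\log \tfrac{dQ}{dP}] is the relative entropy.
%
% Rearranging for any c > 0 yields the upper uncertainty-quantification bound
%   \mathbb{E}_Q[f] - \mathbb{E}_P[f]
%     \le \inf_{c > 0} \Big\{ \tfrac{1}{c} \log \mathbb{E}_P\!\big[e^{c(f - \mathbb{E}_P[f])}\big]
%                             + \tfrac{1}{c}\, R(Q \,\|\, P) \Big\},
% valid for all Q in the relative-entropy neighborhood R(Q \| P) \le \eta;
% the matching lower bound follows by replacing f with -f.
```

The duality visible here (expectation bounds on one side, a cumulant-generating function plus an entropy budget on the other) is the sense in which uncertainty quantification and the variational principles of statistical mechanics appear as dual optimization problems.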