hIPPYlib - Inverse Problem PYthon library
hIPPYlib implements state-of-the-art scalable algorithms for
deterministic and Bayesian inverse problems governed by partial differential equations (PDEs).
It builds on FEniCS
(a parallel finite element library) for the discretization of the PDE
and on PETSc for scalable and efficient linear
algebra operations and solvers.
For building instructions, see the file
INSTALL.md. Copyright information
and licensing restrictions can be found in the file COPYRIGHT.
The interactive tutorials in the tutorial folder are the best starting
point for new users interested in hIPPYlib's features.
hIPPYlib can be viewed as a toolbox that provides the
building blocks for experimenting with new ideas and developing scalable
algorithms for PDE-constrained deterministic and Bayesian inverse problems.
In hIPPYlib, the user can express the forward PDE and the likelihood in
weak form using the friendly, compact, near-mathematical notation of
FEniCS, which will then automatically generate efficient code for the
discretization. Linear and nonlinear, stationary and
time-dependent PDEs are supported in hIPPYlib.
For stationary problems, gradient and Hessian information can be
automatically generated by
FEniCS symbolic differentiation
of the relevant weak forms. For time-dependent problems, by contrast, symbolic
differentiation can only be used for the spatial terms, and the contribution
to gradients and Hessians arising from the time dynamics needs to be provided
by the user.
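The adjoint machinery behind these gradients can be sketched on a small discrete analogue. The following is a hypothetical toy problem in plain NumPy (not hIPPYlib's API): for a forward model A(m)u = f and misfit J(m) = ½‖Bu − d‖², a single extra adjoint solve yields the entire gradient, independently of the number of parameters.

```python
import numpy as np

# Hypothetical toy problem (plain NumPy, not hIPPYlib's API):
# forward model A(m) u = f with A(m) = A0 + sum_i m_i * Ai,
# misfit J(m) = 0.5 * ||B u(m) - d||^2.
rng = np.random.default_rng(0)
n, p = 8, 3                         # state and parameter dimensions
A0 = 5.0 * np.eye(n)
Ai = [0.1 * rng.standard_normal((n, n)) for _ in range(p)]
B = rng.standard_normal((4, n))
f = rng.standard_normal(n)
d = rng.standard_normal(4)

def assemble(m):
    return A0 + sum(mi * Am for mi, Am in zip(m, Ai))

def misfit(m):
    u = np.linalg.solve(assemble(m), f)
    r = B @ u - d
    return 0.5 * r @ r

def gradient(m):
    A = assemble(m)
    u = np.linalg.solve(A, f)                        # one forward solve
    lam = np.linalg.solve(A.T, -B.T @ (B @ u - d))   # one adjoint solve
    # dJ/dm_i = lam^T (dA/dm_i) u: no further PDE solves per parameter
    return np.array([lam @ (Am @ u) for Am in Ai])

m0 = np.array([0.3, -0.2, 0.1])
g = gradient(m0)

eps = 1e-6                                           # finite-difference check
g_fd = np.array([(misfit(m0 + eps * e) - misfit(m0 - eps * e)) / (2 * eps)
                 for e in np.eye(p)])
```

The finite-difference check at the end confirms the adjoint gradient; in hIPPYlib the analogous weak forms are generated symbolically rather than written by hand.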
Noise and prior covariance operators are modeled as inverses of elliptic differential operators, allowing hIPPYlib to build on existing fast multigrid solvers for elliptic operators without explicitly constructing the dense covariance operator.
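A toy 1D sketch of this modeling choice (plain NumPy, not hIPPYlib's API; the operator and parameter values are illustrative assumptions): the precision is a sparse discretized elliptic operator, while its inverse is a dense covariance with spatially decaying correlations that never needs to be assembled.

```python
import numpy as np

# Toy 1D sketch (plain NumPy; not hIPPYlib's API, parameter values are
# illustrative assumptions).  Take the prior precision to be a
# discretized elliptic operator K = delta*I + gamma*L, with L the 1D
# Dirichlet Laplacian.  K is sparse (tridiagonal), while its inverse
# C = K^{-1} is the dense prior covariance; applying C never requires
# forming it -- it is just one elliptic solve.
n = 100
h = 1.0 / (n + 1)
L = (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
delta, gamma = 1.0, 0.01
K = delta * np.eye(n) + gamma * L       # sparse precision operator

C = np.linalg.inv(K)                    # dense covariance (toy sizes only)
sd = np.sqrt(np.diag(C))
corr = C / np.outer(sd, sd)             # correlation decays with distance
```

In practice the explicit inverse above is replaced by an elliptic solve (e.g. multigrid), which is what makes the construction scalable.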
The key property of the algorithms underlying
hIPPYlib is that the solution
of the deterministic and Bayesian inverse problem is computed
at a cost, measured in forward PDE solves, that is independent of the
parameter dimension.
hIPPYlib provides a robust implementation of the inexact
Newton-conjugate gradient algorithm to compute the maximum a posteriori
(MAP) point. The gradient and Hessian actions are
computed via their weak form specification in FEniCS,
constraining the state and adjoint variables to satisfy the forward
and adjoint problem. The Newton system is solved inexactly by early
termination of CG iterations via Eisenstat-Walker (to prevent
oversolving) and Steihaug (to avoid negative curvature)
criteria. Two globalization techniques are available to the user:
Armijo back-tracking line search and trust region.
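The algorithmic pattern can be sketched as follows (an illustrative toy in plain NumPy, not hIPPYlib's implementation; the convex objective is an assumed stand-in for the negative log posterior). CG on the Newton system is stopped early by an Eisenstat-Walker forcing term and a Steihaug-style negative-curvature check, and an Armijo back-tracking line search globalizes the step.

```python
import numpy as np

# Sketch of inexact Newton-CG (illustrative toy, not hIPPYlib's
# ReducedSpaceNewtonCG).  The objective below is an assumed convex
# stand-in for the negative log posterior, acting coordinate-wise.

def fun(x):
    return np.sum(0.25 * x**4 + 0.5 * x**2 - x)

def grad(x):
    return x**3 + x - 1.0

def hessvec(x, v):                  # only Hessian actions are needed
    return (3.0 * x**2 + 1.0) * v

def inexact_newton_cg(x, tol=1e-8, max_iter=50):
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        eta = min(0.5, np.sqrt(gnorm))   # Eisenstat-Walker forcing term
        d = np.zeros_like(x)             # CG on H d = -g, matrix-free
        r = -g.copy()
        p = r.copy()
        rr = r @ r
        while np.sqrt(rr) > eta * gnorm:
            Hp = hessvec(x, p)
            pHp = p @ Hp
            if pHp <= 0.0:               # Steihaug: stop on negative curvature
                if not d.any():
                    d = -g               # fall back to steepest descent
                break
            alpha = rr / pHp
            d = d + alpha * p
            r = r - alpha * Hp
            rr_new = r @ r
            p = r + (rr_new / rr) * p
            rr = rr_new
        t, c = 1.0, 1e-4                 # Armijo back-tracking line search
        while fun(x + t * d) > fun(x) + c * t * (g @ d):
            t *= 0.5
        x = x + t * d
    return x

x_opt = inexact_newton_cg(np.full(5, 3.0))
```

The negative-curvature branch never fires on this convex toy, but it shows where the Steihaug safeguard sits in the CG loop; the forcing term keeps early Newton systems cheap by solving them only loosely.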
In hIPPYlib, the posterior covariance is approximated by the
inverse of the Hessian of the negative log posterior evaluated at
the MAP point. This Gaussian approximation is exact when the
parameter-to-observable map is linear; otherwise, its logarithm agrees
to two derivatives with the log posterior at the MAP point, and thus it
can serve as a proposal for Hessian-based Markov chain Monte Carlo (MCMC) methods.
hIPPYlib makes the construction of the posterior covariance
tractable by invoking a low-rank approximation of the Hessian of the
log likelihood.
hIPPYlib also offers scalable methods for sample generation.
To sample large-scale, spatially correlated Gaussian random fields from the prior distribution,
hIPPYlib implements a new method that strongly relies on the
structure of the covariance operator defined as the inverse of a differential operator:
by exploiting the assembly procedure of finite element matrices
hIPPYlib constructs a sparse Cholesky-like rectangular decomposition of the precision operator.
To sample from a local Gaussian approximation to the posterior (such as at the MAP point),
hIPPYlib exploits the low rank factorization of the Hessian of the
log likelihood to correct samples from the prior distribution.
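In whitened coordinates (an assumed simplification, with prior N(0, I); plain NumPy, not hIPPYlib's API), the identity behind this correction can be checked directly: for a rank-r misfit Hessian H = V diag(λ) Vᵀ, the map F = I + V(diag(1/√(1+λ)) − I)Vᵀ sends a prior sample z to Fz, whose covariance is FFᵀ = (I + H)⁻¹, the Gaussian (Laplace) posterior covariance.

```python
import numpy as np

# Sketch in whitened coordinates (plain NumPy; an assumed simplification
# of hIPPYlib's sampling, with prior N(0, I) and rank-r misfit Hessian).
rng = np.random.default_rng(0)
n, r = 100, 4
V = np.linalg.qr(rng.standard_normal((n, r)))[0]
lam = np.array([30.0, 10.0, 4.0, 1.0])
H = V @ np.diag(lam) @ V.T

# rank-r correction map: F F^T = (I + H)^{-1}
F = np.eye(n) + V @ np.diag(1.0 / np.sqrt(1.0 + lam) - 1.0) @ V.T

z = rng.standard_normal(n)               # sample from the prior
x = F @ z                                # sample from the Laplace posterior
```

Only a rank-r update of each prior sample is needed, so posterior sampling inherits the cost of prior sampling plus r inner products.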
To explore the full posterior distribution, hIPPYlib implements
dimension-independent MCMC sampling methods enhanced by Hessian information.
Finally, randomized and probing algorithms are available to compute the pointwise variance of the prior/posterior distribution and the trace of the covariance operator.
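A minimal sketch of the randomized trace estimation (plain NumPy; not hIPPYlib's implementation, and the test operator is an assumed stand-in): Hutchinson's estimator needs only matrix-vector products, since E[zᵀAz] = tr(A) for Rademacher probe vectors z.

```python
import numpy as np

# Sketch of Hutchinson's randomized trace estimator (plain NumPy; the
# SPD operator A is an assumed stand-in for a covariance operator).
rng = np.random.default_rng(0)
n = 60
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)             # SPD stand-in for a covariance

def hutchinson_trace(matvec, n, n_probe, rng):
    Z = rng.choice([-1.0, 1.0], size=(n, n_probe))  # Rademacher probes
    return np.mean(np.sum(Z * matvec(Z), axis=0))   # average of z^T A z

est = hutchinson_trace(lambda X: A @ X, n, 2000, rng)
```

The same probes estimate the diagonal, and hence the pointwise variance, via E[z ∘ Az] = diag(A); probing methods refine this by choosing structured probe vectors.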