Solvers¶
There are currently two GP solvers included with George, each using a different library for the underlying linear algebra. Both solvers implement the same API and should (up to some tolerance) give the same answers on the same datasets. A solver is just a class that takes a Kernel and exposes three methods:
- compute — to compute and factorize the kernel matrix,
- apply_inverse — to apply the inverse of the covariance matrix to a vector or matrix, computing \(C^{-1}\,b\) (actually implemented by solving the linear system \(C\,x = b\) rather than forming the inverse), and
- apply_sqrt — to apply the (Cholesky) square root of the covariance.
The solvers also provide the properties computed and log_determinant.
The simplest solver provided by George (BasicSolver) uses scipy’s Cholesky implementation, and the second (HODLRSolver) uses Sivaram Ambikasaran’s HODLR library. The HODLR algorithm implements a \(\mathcal{O}(N\,\log^2 N)\) direct solver for dense matrices as described here.
By default, George uses the BasicSolver but the HODLRSolver can be used as follows:
import george
kernel = ...
gp = george.GP(kernel, solver=george.HODLRSolver)
The HODLRSolver is probably best for most one-dimensional problems and some large multi-dimensional problems, but it doesn’t (in general) scale well with the number of input dimensions. In practice, it’s worth trying both solvers on your specific problem to see which runs faster.
Basic Solver¶
- class george.BasicSolver(kernel)¶
This is the most basic solver built using scipy.linalg.cholesky().
Parameters: kernel – A subclass of Kernel implementing a kernel function.
- apply_inverse(y, in_place=False)¶
Apply the inverse of the covariance matrix to the input by solving
\[C\,x = b\]
Parameters: - y – (nsamples,) or (nsamples, nrhs) The vector or matrix \(b\).
- in_place – (optional) Should the data in y be overwritten with the result \(x\)?
- apply_sqrt(r)¶
Apply the Cholesky square root of the covariance matrix to the input vector or matrix.
Parameters: r – (nsamples,) or (nsamples, nrhs) The input vector or matrix.
- compute(x, yerr)¶
Compute and factorize the covariance matrix.
Parameters: - x – (nsamples, ndim) The independent coordinates of the data points.
- yerr – (optional) (nsamples,) or scalar The Gaussian uncertainties on the data points at coordinates x. These values will be added in quadrature to the diagonal of the covariance matrix.
HODLR Solver¶
- class george.HODLRSolver¶
A solver using Sivaram Ambikasaran’s HODLR library that implements a \(\mathcal{O}(N\,\log^2 N)\) direct solver for dense matrices as described here.
Parameters: - kernel – A subclass of Kernel implementing a kernel function.
- nleaf – (optional) The size of the smallest matrix blocks. When the solver reaches this level in the tree, it directly solves these systems using Eigen’s Cholesky implementation. (default: 100)
- tol – (optional) A tuning parameter used when factorizing the matrix. The conversion between this parameter and the precision of the results is problem specific, but if you need more precision, try decreasing this number (at the cost of a longer runtime). (default: 1e-12)
- apply_inverse(y, in_place=False)¶
Apply the inverse of the covariance matrix to the input by solving
\[C\,x = b\]
Parameters: - y – (nsamples,) or (nsamples, nrhs) The vector or matrix \(b\).
- in_place – (optional) Should the data in y be overwritten with the result \(x\)?
- apply_sqrt(r)¶
This method is not implemented by this solver yet.
- compute(x, yerr, seed=None)¶
Compute and factorize the covariance matrix.
Parameters: - x – (nsamples, ndim) The independent coordinates of the data points.
- yerr – (optional) (nsamples,) or scalar The Gaussian uncertainties on the data points at coordinates x. These values will be added in quadrature to the diagonal of the covariance matrix.
- seed – (optional) There is a stochastic component in the HODLR factorization step. Use this parameter (it should be an integer) to seed this random step and ensure deterministic results. Normally the randomization shouldn’t make a big difference but as the matrix becomes poorly conditioned, it will have a larger effect.