orbithunter.core.Orbit.costhess

Orbit.costhess(**kwargs)[source]

Hessian matrix of the cost function.

Parameters
other : Orbit

Orbit instance whose state is an evaluation of the governing equations

kwargs : dict

Extra arguments for the rmatvec method.

Returns
hess : np.ndarray

Hessian matrix of Orbit.cost()

Notes

The Hessian is the Jacobian-transpose multiplied with the Jacobian, plus a second term: the Hessian of the governing equations (a rank-3 tensor) contracted with the equations themselves, which yields a 2-d matrix, \(J^TJ + F \cdot (d^2F)\). This method has not been implemented for any equation, but the recipe is given below. While dedicated tensor-product functions exist, I think the easiest way to compute this is via broadcasting and dot products.
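A minimal NumPy sketch of the recipe above, assuming small toy arrays in place of an actual equation's Jacobian and second-derivative tensor (the names `F`, `J`, and `d2F` are placeholders, not orbithunter attributes):

```python
import numpy as np

# Hypothetical dimensions: N equations in N unknowns.
N = 4
rng = np.random.default_rng(0)

F = rng.standard_normal(N)            # evaluation of the governing equations, F_i
J = rng.standard_normal((N, N))       # Jacobian, J_{ij} = dF_i / dx_j
d2F = rng.standard_normal((N, N, N))  # rank-3 tensor, (d2F)_{ijk} = d^2 F_i / dx_j dx_k

# J^T J + sum_i F_i (d^2F)_{ijk}; the contraction over i is one einsum call.
hess = J.T @ J + np.einsum('i,ijk->jk', F, d2F)
print(hess.shape)  # (4, 4)
```

The `einsum` contraction is equivalent to broadcasting `F` against the leading axis of `d2F` and summing, which is why broadcasting plus a dot product suffices here.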

Another issue is that \((d^2F)\) scales like \(N^3\) in memory, so it is untenable to define it explicitly in most cases. I recommend iterating over the index \(i\) in \(F \cdot (d^2F) = \sum_i F_i (d^2F)_{ijk}\), only ever materializing a single ith slice of \((d^2F)_{ijk}\) at a time.
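The memory-efficient iteration can be sketched as follows; `d2F_slice` is a hypothetical stand-in for whatever per-component second-derivative computation an equation would provide, so that only one \(N \times N\) slice exists at a time instead of the full \(N^3\) tensor:

```python
import numpy as np

N = 4
rng = np.random.default_rng(1)
F = rng.standard_normal(N)        # evaluation of the governing equations
J = rng.standard_normal((N, N))   # Jacobian

def d2F_slice(i):
    # Hypothetical: return the i-th N x N slice (d^2F)_{i,:,:} on demand,
    # deterministically here so the loop can be checked against the full tensor.
    return np.random.default_rng(100 + i).standard_normal((N, N))

# Accumulate sum_i F_i (d^2F)_{ijk} one slice at a time: O(N^2) memory.
second_term = np.zeros((N, N))
for i in range(N):
    second_term += F[i] * d2F_slice(i)

hess = J.T @ J + second_term
```

The trade-off is N separate slice evaluations in exchange for never holding more than one slice in memory.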