The Hessian matrix is a square matrix of second-order partial derivatives of a scalar-valued function. It’s used in the second-derivative test for multivariable functions.
The Hessian is the Jacobian matrix of the gradient.
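This relationship can be checked symbolically. A minimal sketch using SymPy, with an arbitrary example function f = x³ + xy² chosen for illustration:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**3 + x*y**2  # arbitrary smooth example function

# Gradient of f, written as a column vector
grad = sp.Matrix([f]).jacobian([x, y]).T

# Hessian computed as the Jacobian of the gradient
H_from_grad = grad.jacobian([x, y])

# Direct Hessian for comparison
H_direct = sp.hessian(f, (x, y))

assert H_from_grad == H_direct
print(H_direct)
```

Both routes produce the same 2×2 matrix of second partials, confirming that differentiating the gradient once more recovers the Hessian.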
One key idea: if the second partial derivatives are continuous at a point, the order of differentiation can be interchanged there (Schwarz's theorem, also known as Clairaut's theorem). This implies that the Hessian is symmetric at such points. Because a real symmetric matrix always has real eigenvalues and an orthonormal basis of eigenvectors, the Hessian can be decomposed accordingly.
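The decomposition can be illustrated numerically. A sketch using NumPy, assuming the example function f(x, y) = x² + xy + 2y², whose Hessian is constant:

```python
import numpy as np

# Hessian of f(x, y) = x**2 + x*y + 2*y**2 (constant, since f is quadratic)
H = np.array([[2.0, 1.0],
              [1.0, 4.0]])

# eigh exploits symmetry: it returns real eigenvalues
# and an orthonormal set of eigenvectors
eigvals, eigvecs = np.linalg.eigh(H)

# The eigenvector matrix Q is orthogonal: Q.T @ Q = I
assert np.allclose(eigvecs.T @ eigvecs, np.eye(2))

# Q @ diag(lambda) @ Q.T reconstructs H exactly
assert np.allclose(eigvecs @ np.diag(eigvals) @ eigvecs.T, H)
```

Using `eigh` rather than the generic `eig` is the idiomatic choice here: it is guaranteed to return real eigenvalues for a symmetric input.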
This eigendecomposition allows us to generalise the second-derivative test to multiple dimensions: at a critical point, all-positive eigenvalues indicate a local minimum, all-negative eigenvalues a local maximum, and mixed signs a saddle point; if any eigenvalue is zero, the test is inconclusive.
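The multivariable second-derivative test can be sketched as a small classifier over the eigenvalue signs. A minimal example, with a hypothetical `classify` helper and the saddle of f(x, y) = x² − y² at the origin as the test case:

```python
import numpy as np

def classify(H, tol=1e-9):
    """Second-derivative test via the eigenvalue signs of the Hessian H."""
    eigvals = np.linalg.eigvalsh(H)
    if np.all(eigvals > tol):
        return "local minimum"
    if np.all(eigvals < -tol):
        return "local maximum"
    if np.any(eigvals > tol) and np.any(eigvals < -tol):
        return "saddle point"
    return "inconclusive"  # some eigenvalue is (near) zero

# f(x, y) = x**2 - y**2 has a critical point at the origin;
# its Hessian there is diag(2, -2), so the test reports a saddle.
print(classify(np.array([[2.0, 0.0],
                         [0.0, -2.0]])))  # saddle point
```

The tolerance guards against classifying a numerically tiny eigenvalue as strictly positive or negative, which would hide the degenerate (inconclusive) case.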