Many algorithms for solving nonlinear optimization
problems and differential equations require the computation of Jacobians (matrices of first derivatives of vector
functions) and Hessians (matrices of second derivatives of scalar
functions). Today, derivative matrices of functions defined by computer
programs can be computed analytically, and hence accurately, using Automatic
Differentiation (AD). Jacobians and Hessians
can also be computed approximately using the older method of
Finite Differencing (FD).
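To make the FD side of this concrete, the sketch below approximates a dense Jacobian by forward differences, one column per extra function evaluation. The function names and the step size are illustrative choices, not part of the project's software.

```python
import numpy as np

def fd_jacobian(f, x, h=1e-6):
    """Approximate the Jacobian of f at x by forward differences.

    Column j is (f(x + h*e_j) - f(x)) / h, so a dense m-by-n
    Jacobian costs n + 1 evaluations of f.
    """
    fx = np.asarray(f(x), dtype=float)
    J = np.empty((fx.size, x.size))
    for j in range(x.size):
        xp = np.array(x, dtype=float)
        xp[j] += h                      # perturb one coordinate
        J[:, j] = (np.asarray(f(xp)) - fx) / h
    return J

# Example: f(x, y) = (x*y, x + y^2); exact Jacobian at (2, 3) is
# [[y, x], [1, 2*y]] = [[3, 2], [1, 6]]
f = lambda v: np.array([v[0] * v[1], v[0] + v[1] ** 2])
J = fd_jacobian(f, np.array([2.0, 3.0]))
```

Unlike AD, the result carries truncation error of order h, which is one motivation for preferring AD when exact derivatives matter.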
The overall aim of this project is to exploit the sparsity available in large Jacobian and
Hessian matrices as effectively as possible in order to make their computation
using AD and FD efficient. The sparsity
exploiting techniques here are essentially the same whether one uses AD
or FD, and involve partitioning the columns, or rows, (or both) of a
derivative matrix into a small number of groups. Depending on whether the
matrix of interest is a Jacobian (nonsymmetric)
or a Hessian (symmetric), and on the specifics of the computational
techniques employed, several partitioning problems can be identified. We
formulate these as graph coloring problems, enabling us to analyze
the complexity of the problems, and to design and implement effective
algorithms for solving them.
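As a minimal illustration of the column-partitioning idea in the nonsymmetric (Jacobian) case: two columns can share a group if they have no nonzero entry in a common row, which is exactly a distance-1 coloring of the column intersection graph. The greedy scheme below is a generic sketch of that formulation, not the project's actual algorithm.

```python
import numpy as np

def partition_columns(S):
    """Greedily partition the columns of a boolean sparsity pattern S.

    Two columns conflict if they share a nonzero row; conflicting
    columns receive different colors (groups). All columns of one
    group can then be compressed into a single seed vector, so one
    AD pass or FD evaluation recovers several Jacobian columns.
    """
    n = S.shape[1]
    color = [-1] * n
    for j in range(n):
        # colors already used by columns that share a row with column j
        forbidden = {color[k] for k in range(j)
                     if (S[:, j] & S[:, k]).any()}
        c = 0
        while c in forbidden:
            c += 1
        color[j] = c
    return color

# Pattern where columns 0 and 2 never meet in a row:
# two groups suffice instead of three.
S = np.array([[1, 1, 0],
              [0, 1, 1]], dtype=bool)
groups = partition_columns(S)   # columns 0 and 2 share a color
```

In the symmetric (Hessian) case the symmetry permits weaker coloring requirements, which is why the two matrix classes lead to distinct coloring problems.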
News & Updates
Our serial software
package ColPack has now been released. See the Software
page for more information.