Special Simplifying Approximations for Least Squares Problems

Recall the univariate minimization problem described previously. For least squares cost functions, the contribution of each individual voxel to the cost function, $c_i$, can be expressed as the square of the intensity difference function, $d_i$:

$$c_i(p) = d_i^2(p).$$

The first derivative can be computed using the chain rule as

$$c_i'(p) = 2\, d_i(p)\, d_i'(p),$$

and the second derivative as

$$c_i''(p) = 2\, [d_i'(p)]^2 + 2\, d_i(p)\, d_i''(p).$$

If the second term in the formula for the second derivative is small compared to the first term, the formula for the displacement needed to reach the minimum can be approximated as

$$p' = p - \frac{c_i'(p)}{c_i''(p)} \approx p - \frac{2\, d_i(p)\, d_i'(p)}{2\, [d_i'(p)]^2} = p - \frac{d_i(p)}{d_i'(p)}.$$
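As a concrete illustration of this approximated univariate step, consider the following minimal sketch. The intensity difference function `d` and its derivative are hypothetical stand-ins chosen only so the example runs; any smooth $d(p)$ would do.

```python
import numpy as np

# Hypothetical intensity-difference function and its derivative;
# these are illustrative only, not taken from the text.
def d(p):
    return np.sin(p) - 0.3

def d_prime(p):
    return np.cos(p)

p = 1.0  # initial parameter estimate
for _ in range(10):
    # Full Newton would use p - c'(p)/c''(p) with
    #   c'(p)  = 2 d(p) d'(p)
    #   c''(p) = 2 [d'(p)]^2 + 2 d(p) d''(p).
    # Dropping the second term of c''(p) gives the approximate step:
    p = p - d(p) / d_prime(p)

print(p, d(p))  # d(p) is driven toward 0, the minimum of c(p) = d(p)^2
```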

It turns out that the multivariate equivalent of this approximation has been used routinely in a number of least squares minimization problems. Indeed, Press et al. [16] specifically recommend this approach over including the omitted second derivative term, noting that doing so alters only the path taken through parameter space, not the final result. We have found this strategy effective in eliminating the problems associated with non-positive definite Hessian matrices that arise in full Newton-type minimization. This strategy is also used by the Levenberg-Marquardt algorithm, and some variant of this general approach has been used for registration by a number of authors [1,8,10,11,21,22].
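To make the multivariate version concrete, the sketch below shows one step of this kind of update, in which the derivative $d_i'(p)$ is replaced by the Jacobian $J$ of the residual vector and a Levenberg-Marquardt-style damping term is added. The function names and the damping interface are illustrative assumptions, not taken from any of the cited implementations.

```python
import numpy as np

def approx_newton_step(residuals, jacobian, params, damping=0.0):
    """One least squares update that omits the second derivative term.

    residuals: function mapping parameters p to the vector of d_i(p)
    jacobian:  function mapping p to the matrix J, J[i, j] = dd_i/dp_j
    damping:   0 gives the pure Gauss-Newton step; > 0 gives a
               Levenberg-Marquardt-style damped step
    """
    r = residuals(params)
    J = jacobian(params)
    # J^T J approximates the Hessian of sum_i d_i(p)^2 (up to a factor
    # of 2 that cancels out of the step); omitting the d_i * d_i''
    # term keeps it positive semi-definite, and the damping term
    # makes it safely invertible.
    H = J.T @ J + damping * np.eye(len(params))
    g = J.T @ r
    return params - np.linalg.solve(H, g)
```

A full Levenberg-Marquardt implementation would additionally increase or decrease `damping` between iterations depending on whether the step actually reduced the cost.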
