Computation of Gaussian Derivatives and Eigenvalues

The computation of the Gaussian derivatives in the Hessian matrix and the gradient vector (needed in the later chapters) can be implemented using three separate convolutions with 1-D kernels, as represented by

$$
f_{x^p y^q z^r}(\mathbf{x}; \sigma_f) = \left\{ \frac{d^p}{dx^p}\, g(x; \sigma_f) \right\} * \left\{ \frac{d^q}{dy^q}\, g(y; \sigma_f) \right\} * \left\{ \frac{d^r}{dz^r}\, g(z; \sigma_f) \right\} * f(\mathbf{x}),
$$

where $g(\cdot\,; \sigma_f)$ is the 1-D Gaussian function with standard deviation $\sigma_f$, and p, q, and r are nonnegative integers satisfying $p + q + r \le 2$. To obtain the normalized Gaussian derivatives, $f_{x^p y^q z^r}(\mathbf{x}; \sigma_f)$ further needs to be multiplied by $\sigma_f^{p+q+r}$. In our experience, the radius of the kernel should be $3\sigma_f$, $4\sigma_f$, and $5\sigma_f$ for simple smoothing ($p + q + r = 0$), the first derivatives ($p + q + r = 1$), and the second derivatives ($p + q + r = 2$), respectively, for the accurate computation of the Gaussian derivatives and smoothing. Using this decomposition, the amount of computation needed per voxel can be reduced from $O(n^3)$ to $O(3n)$, where n is the kernel diameter.
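
As an illustration, the following sketch implements the decomposition above, including the recommended kernel radii and the $\sigma_f^{p+q+r}$ normalization. The helper names (gaussian_kernel_1d, gaussian_derivative) are hypothetical, SciPy's convolve1d is assumed for the 1-D convolutions, and array axes are taken in (x, y, z) order; this is a minimal sketch, not the original implementation.

```python
import numpy as np
from scipy.ndimage import convolve1d

def gaussian_kernel_1d(sigma, order):
    """Sampled 1-D Gaussian (order 0) or its first/second derivative.
    The kernel radius follows the recommendation above:
    3*sigma, 4*sigma, 5*sigma for orders 0, 1, 2."""
    radius = int(np.ceil((3 + order) * sigma))
    x = np.arange(-radius, radius + 1, dtype=float)
    g = np.exp(-x**2 / (2.0 * sigma**2))
    g /= g.sum()                                  # unit-sum smoothing kernel
    if order == 0:
        return g
    if order == 1:
        return -x / sigma**2 * g                  # d/dx of the Gaussian
    if order == 2:
        return (x**2 / sigma**4 - 1.0 / sigma**2) * g
    raise ValueError("orders above 2 are not needed (p + q + r <= 2)")

def gaussian_derivative(f, sigma, p, q, r, normalized=True):
    """f_{x^p y^q z^r}(x; sigma) by three 1-D convolutions:
    O(3n) per voxel instead of O(n^3) for the full 3-D kernel."""
    out = np.asarray(f, dtype=float)
    for axis, order in enumerate((p, q, r)):      # one 1-D pass per axis
        out = convolve1d(out, gaussian_kernel_1d(sigma, order),
                         axis=axis, mode="nearest")
    if normalized:
        out *= sigma ** (p + q + r)               # scale normalization
    return out
```

For example, the six distinct entries of the Hessian at scale $\sigma_f$ would be obtained as gaussian_derivative(f, sigma, 2, 0, 0), gaussian_derivative(f, sigma, 1, 1, 0), and so on.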

The eigenvalues were computed using Jacobi's method in the simulations and experiments shown in this chapter. A numerical problem occasionally occurred in this computation for synthesized images of the mathematical line models without noise; however, it can be avoided by adding a very small amount of Gaussian noise. No such numerical problem arose in the experiments using MR and CT images, since noise is inherently present in real images.
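
For reference, a minimal sketch of the cyclic Jacobi method for a small symmetric matrix such as the 3 x 3 Hessian is given below. The function name jacobi_eigenvalues is illustrative, the tolerance and sweep count are arbitrary choices, and in practice a library routine such as numpy.linalg.eigh serves the same purpose; the degenerate noiseless case mentioned above can be sidestepped by perturbing the input image with very small Gaussian noise before computing the Hessian.

```python
import numpy as np

def jacobi_eigenvalues(H, tol=1e-12, max_sweeps=50):
    """Eigenvalues of a small symmetric matrix (e.g., the 3x3 Hessian)
    by cyclic Jacobi rotations."""
    A = np.array(H, dtype=float)
    n = A.shape[0]
    for _ in range(max_sweeps):
        # Stop once the off-diagonal part is negligible
        if np.sqrt(np.sum(np.tril(A, -1) ** 2)) < tol:
            break
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(A[p, q]) < tol:
                    continue
                # Rotation angle chosen to zero out A[p, q]
                theta = (A[q, q] - A[p, p]) / (2.0 * A[p, q])
                t = (1.0 if theta == 0.0 else
                     np.sign(theta) / (abs(theta) + np.hypot(theta, 1.0)))
                c = 1.0 / np.sqrt(t * t + 1.0)
                s = t * c
                J = np.eye(n)
                J[p, p] = J[q, q] = c
                J[p, q], J[q, p] = s, -s
                A = J.T @ A @ J                   # similarity transform
    return np.sort(np.diag(A))                    # ascending eigenvalues
```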

