Measuring Information

According to Studholme [15], it is useful to think of the registration process as trying to align the shared information between the images: "If structures are shared between the two images and the images are mis-aligned, then in the combined image, these structures will be duplicated. For example, when a transaxial slice through the head is mis-aligned, there may be four eyes and four ears. As the images are brought into alignment, the duplication of features is reduced and the combined image is simplified." Using this concept, registration can be thought of as reducing the amount of information in the combined image, which suggests the use of a measure of information as a registration metric. The most commonly used measure of information in signal and image processing is the Shannon-Wiener entropy measure H, originally developed as part of communication theory in the 1940s [16]:

H = -∑_i p_i log p_i

H is the average information supplied by a set of n symbols whose probabilities are given by p_1, p_2, p_3, ..., p_n. Entropy has its maximum value when all symbols are equally likely to occur (i.e., p_i = 1/n for all i), and its minimum value when one symbol occurs with probability 1 and all the others occur with probability 0.
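As a minimal illustration of this behaviour (not part of the original text), the sketch below computes the Shannon entropy of a discrete probability distribution and shows that a uniform distribution over four symbols attains the maximum (log2 4 = 2 bits), while a distribution concentrated on a single symbol attains the minimum of 0. The function name `shannon_entropy` is chosen here for clarity, not taken from any particular library.

```python
import math

def shannon_entropy(probs, base=2.0):
    """Shannon entropy H = -sum(p_i * log(p_i)) of a discrete distribution.

    Terms with p_i == 0 contribute nothing, following the convention
    0 * log(0) = 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0.0)

# Uniform distribution over 4 symbols: maximum entropy, log2(4) = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))

# One symbol certain, all others impossible: minimum entropy, 0 bits.
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))
```

In registration, the same computation is applied to the normalized grey-level histogram of the combined image, so that a simpler (better-aligned) combined image yields a lower entropy.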
