Average Distortion and SNR

By far the most common computable objective measures of image quality are mean squared error (MSE) and signal-to-noise ratio (SNR). Suppose that one has a system in which an input pixel block or vector $X = (X_0, X_1, \ldots, X_{k-1})$ is reproduced as $\hat{X} = (\hat{X}_0, \hat{X}_1, \ldots, \hat{X}_{k-1})$, and that one has a measure $d(X, \hat{X})$ of the distortion or cost resulting when $X$ is reproduced as $\hat{X}$. A natural measure of the quality or fidelity (actually the lack of quality or fidelity) of the system is the average of the distortions for all the vectors input to that system, denoted by

$$D = E\!\left[d(X, \hat{X})\right].$$

The average might be with respect to a probability model for the images or, more commonly, a sample or time-average distortion. It is common to normalize the distortion in some fashion to produce a dimensionless quantity $D/D_0$, to form the inverse $D_0/D$ as a measure of quality rather than distortion, and to describe the result in decibels. A common normalization is the minimum average distortion achievable if no bits are sent, $D_0 = \min_y E[d(X, y)]$. When the ubiquitous squared-error distortion $d(X, Y) = \|X - Y\|^2 = \sum_i (X_i - Y_i)^2$ is used, $D_0$ is simply the variance of the process, $D_0 = E\left[\|X - E(X)\|^2\right] = \sigma_X^2$.
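As a concrete illustration, the following is a minimal sketch of these quantities for images stored as NumPy arrays. The function names are illustrative, not from the text; the averaging is per pixel rather than per block, which scales $D$ and $D_0$ identically and so leaves the ratio $D_0/D$ unchanged.

```python
import numpy as np

def average_distortion(x, x_hat):
    """Sample-average squared-error distortion D (per pixel) between an
    image x and its reproduction x_hat."""
    x = np.asarray(x, dtype=np.float64)
    x_hat = np.asarray(x_hat, dtype=np.float64)
    return np.mean((x - x_hat) ** 2)

def d0_variance(x):
    """Normalization D0: the minimum average distortion achievable if no
    bits are sent, which for squared error is the variance of the source."""
    x = np.asarray(x, dtype=np.float64)
    return np.mean((x - x.mean()) ** 2)
```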

FIGURE 4 Original image and compressed image at 0.15 bpp in the ROI.

Using this as a normalization factor produces the signal-to-noise ratio

$$\mathrm{SNR} = 10 \log_{10} \frac{D_0}{D} = 10 \log_{10} \frac{\sigma_X^2}{D}.$$
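A corresponding sketch of the SNR computation, again assuming NumPy arrays and illustrative names; the usage at the bottom pairs a synthetic image with a hypothetical noisy reproduction purely for demonstration.

```python
import numpy as np

def snr_db(x, x_hat):
    """SNR in decibels: 10 * log10(D0 / D), with D the per-pixel squared-error
    distortion and D0 the sample variance of the source image."""
    x = np.asarray(x, dtype=np.float64)
    x_hat = np.asarray(x_hat, dtype=np.float64)
    d = np.mean((x - x_hat) ** 2)        # average distortion D
    d0 = np.mean((x - x.mean()) ** 2)    # normalization D0 (source variance)
    return 10.0 * np.log10(d0 / d)

# Hypothetical usage: a synthetic 8-bit image versus a noisy reproduction.
rng = np.random.default_rng(0)
x = rng.integers(0, 256, size=(64, 64)).astype(np.float64)
x_hat = x + rng.normal(0.0, 5.0, size=x.shape)
print(f"SNR = {snr_db(x, x_hat):.2f} dB")
```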
