News: Imatest 23.1 (December 2022, available in the Imatest Pilot program) — A new method for calculating information capacity from slanted-edge patterns has been developed and presented in the white paper, Measuring Camera Information Capacity with Slanted-edges. Imatest 2020.1 (March 2020) — Shannon information capacity can be calculated from images of the Siemens star, described in the white paper, Camera information capacity: a key performance indicator for Machine Vision and Artificial Intelligence systems.
The Siemens star method was presented at the Electronic Imaging 2020 conference and published in the paper, “Measuring camera Shannon information capacity from a Siemens star image”, linked from the Electronic Imaging website. The white paper, described below, is much more readable. (See also the Imatest News Post: Measuring camera Shannon information capacity with a Siemens star image.)
The 2020 white paper, Camera information capacity: a key performance indicator for Machine Vision and Artificial Intelligence systems, briefly introduces information theory, describes the Siemens star camera information capacity measurement, then shows results (including the effects of artifacts). A second white paper (November 2022), Measuring Camera Information Capacity with slanted-edges, describes a method of measuring information capacity from widely-used slanted-edges. The Siemens star method is better for observing the effects of image processing artifacts. The slanted-edge method, described here, is faster, more convenient, and better for measuring the total information capacity of an image, but doesn’t work as well for bilateral-filtered images (most consumer JPEGs).
Nothing like a challenge! There is such a metric for electronic communication channels— one that quantifies the maximum amount of information that can be transmitted through a channel without error. The metric includes sharpness (bandwidth) and noise (grain in film). And a camera— or any digital imaging system— is such a channel.
The metric, first published in 1948 by Claude Shannon* of Bell Labs, has become the basis of the electronic communication industry. It’s called the Shannon channel capacity or Shannon information transmission capacity C, and has a deceptively simple equation. (See the Wikipedia page on the Shannon–Hartley theorem for more detail.)
\(\displaystyle C = W \log_2 \left(1+\frac{S}{N}\right) = W \log_2 \left(\frac{S+N}{N}\right)\)
W is the channel bandwidth, which corresponds to image sharpness, S is the signal energy (the square of the signal voltage; proportional to MTF² in images), and N is the noise energy (the square of the RMS noise voltage), which corresponds to grain in film. It looks simple enough (only a little more complex than E = mc²), but it’s not easy to apply.
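As a quick numerical illustration (a hypothetical communication channel, not an Imatest measurement), the equation can be evaluated directly:

```python
import math

def shannon_capacity(bandwidth, snr):
    """Shannon channel capacity C = W * log2(1 + S/N).

    bandwidth: channel bandwidth W (Hz for a communication channel)
    snr: linear signal-to-noise power ratio S/N (not dB)
    """
    return bandwidth * math.log2(1.0 + snr)

# Classic textbook case: a 3 kHz telephone channel with 30 dB SNR (S/N = 1000)
c = shannon_capacity(3000.0, 10.0 ** (30.0 / 10.0))
print(f"C = {c:.0f} bits/s")  # close to 30 kbits/s
```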
*Claude Shannon was a genuine genius. The article, 10,000 Hours With Claude Shannon: How A Genius Thinks, Works, and Lives, is a great read. There are also nice articles in The New Yorker and Scientific American. The 29-minute video “Claude Shannon – Father of the Information Age” is of particular interest to me: it was produced by the UCSD Center for Memory and Recording Research, which I frequently visited in my previous career.
We will describe how to calculate information capacity from slanted-edge images, Imatest’s most widely-used test pattern, which (thanks to a recent discovery) allows signal and noise to be calculated from the same location, resulting in a superior measurement of image quality. Technical details are in the green (“for geeks”) boxes.
Meaning of Shannon information capacity
(The white paper on Camera Information Capacity has a concise definition of information.)
In electronic communication channels the information capacity is the maximum amount of information that can pass through a channel without error, i.e., it is a measure of channel “goodness.” The actual amount of information depends on the code— how information is represented. But although coding is integral to data compression (how an image is stored in a file), it is not relevant to digital cameras. What is important is the following hypothesis:
Hypothesis: Perceived image quality (assuming a well-tuned image processing pipeline) and also the performance of machine vision and Artificial Intelligence (AI) systems, is proportional to a camera’s information capacity, which is a function of MTF (sharpness), noise, and artifacts arising from demosaicing, clipping (if present), and data compression.
I stress that this statement is a hypothesis— a fancy mathematical term for a conjecture. It agrees with my experience and with numerous measurements, but it needs more testing and verification. Now that information capacity can be conveniently calculated with Imatest, we have an opportunity to learn more about it.
The information capacity, as we mentioned, is a function of both bandwidth W and signal-to-noise ratio, S/N.
In texts that introduce the Shannon capacity, bandwidth W is often assumed to be the half-power frequency, which is closely related to MTF50. Strictly speaking, W log_{2}(1+S/N) is only correct for white noise (which has a flat spectrum) and a simple low-pass filter (LPF). But digital cameras apply varying amounts of sharpening, which can result in response curves that deviate substantially from simple LPF response. For this reason we use the integral form of the Shannon-Hartley equation: \(\displaystyle C = \int_0^W \log_2 \left( 1 + \frac{S(f)}{N(f)} \right) df = \int_0^W \log_2 \left(\frac{S(f)+N(f)}{N(f)} \right) df \) S and N are mean values of signal and noise power; they are not directly tied to the camera’s dynamic range (the maximum available signal). For this reason, we reference calculations of C to the contrast ratio of the chart used for the measurement, most frequently C[4] for 4:1 contrast charts that conform to the ISO 12233 standard. For Siemens star analysis, this equation was altered to account for the two-dimensional nature of pixels by converting it to a double integral, then to polar form, then back to one dimension. This wasn’t necessary for slanted edges, which are already one-dimensional.
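A minimal numerical sketch of the integral form (with a hypothetical signal and noise spectrum, not Imatest’s internal code):

```python
import numpy as np

def info_capacity(f, S, N):
    """Frequency-dependent Shannon-Hartley capacity:
    C = integral of log2(1 + S(f)/N(f)) df, by trapezoidal integration.

    f: spatial frequencies from 0 to the bandwidth limit W (cycles/pixel)
    S: signal power at each frequency (proportional to MTF(f)^2)
    N: noise power at each frequency
    """
    integrand = np.log2(1.0 + S / N)
    # Trapezoidal rule: interval widths times mean of adjacent integrand values
    return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(f)))

# Hypothetical example: Gaussian-like MTF falloff and flat (white) noise
f = np.linspace(0.0, 0.5, 256)     # 0 to Nyquist, cycles/pixel
S = np.exp(-(f / 0.25) ** 2)       # assumed signal power spectrum
N = np.full_like(f, 1e-3)          # assumed white-noise power
print(f"C = {info_capacity(f, S, N):.2f} bits/pixel")
```

For constant S/N this reduces to the familiar W log2(1 + S/N) form above.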
The beauty of both the Siemens Star and Slanted-edge methods is that signal power S and noise power N are calculated from the same location: important because noise is not generally constant over the image.
This page describes the use of slanted-edges to calculate C. The Siemens star method is described in Shannon information capacity from Siemens stars. The standard Siemens star contrast is 50:1, specified in ISO 12233:2014/2017, Annex E, but lower contrasts may be used if needed.
The slanted-edge information capacity calculation is available in the Imatest 23.1 Pilot program starting in December 2022. It will be part of the Imatest 23.1 release in spring 2023.
Calculation summary
This section contains a concise summary of the calculation. For more detail, read the white paper.
Motivation — A major motivation for this work was the recognition that the bilateral filters used in most in-camera JPEG files — filters that sharpen images near contrasty features like edges but blur them to reduce noise elsewhere — make it appear that the image contains more information than it actually has. These filters increase apparent bandwidth by sharpening edges, while smoothing low contrast regions elsewhere — improving SNR measurements while actually removing information. Although analyzing images converted from raw with minimal processing is the most reliable method of calculating information capacity, I sought a way to get reasonable results from highly-processed JPEGs. In late October 2022, I chanced on the answer — in the overlooked capability of the slanted-edge method described briefly below and in more detail in the white paper, Measuring Camera Information Capacity from Slanted-edges.
The slanted-edge method of calculating MTF, which has been part of the ISO 12233 standard since 2000, takes each scan line in a slanted-edge Region of Interest (ROI), finds its center, fits a polynomial curve to the centers, then, depending on the relation between each line’s center and the curve, adds the line contents to one of four bins, resulting in a 4× oversampled averaged edge. This edge signal is differentiated (resulting in the Line Spread Function, LSF), windowed, then Fourier-transformed. MTF (Modulation Transfer Function), which is usually synonymous with Spatial Frequency Response, is the absolute value of the Fourier transform.
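The binning step can be sketched as follows. This is a simplified illustration (centroid-based center detection, a first-order fit to the centers, and no windowing or MTF step); Imatest’s actual implementation differs in detail:

```python
import numpy as np

def binned_esf(roi, oversample=4):
    """Simplified slanted-edge binning: returns an oversampled averaged
    edge (Edge Spread Function) from a 2-D ROI, one scan line per row."""
    rows, cols = roi.shape
    roi = roi.astype(float)

    # 1. Find each scan line's edge center: centroid of the differentiated
    #    line (the per-line Line Spread Function).
    lsf = np.abs(np.diff(roi, axis=1))
    x = np.arange(cols - 1)
    centers = (lsf * x).sum(axis=1) / lsf.sum(axis=1)

    # 2. Fit a curve to the centers (first-order polynomial here).
    fit = np.polyval(np.polyfit(np.arange(rows), centers, 1), np.arange(rows))

    # 3. Place every pixel into a bin at its distance from the fitted edge,
    #    quantized to 1/oversample pixel, and average each bin.
    nbins = cols * oversample
    sums, counts = np.zeros(nbins), np.zeros(nbins)
    for r in range(rows):
        for j in range(cols):
            b = int(round((j - fit[r]) * oversample)) + nbins // 2
            if 0 <= b < nbins:
                sums[b] += roi[r, j]
                counts[b] += 1
    return sums / np.maximum(counts, 1)  # oversampled ESF
```

Differentiating this ESF gives the LSF; windowing and a Fourier transform then give the MTF.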
The overlooked capability — In addition to summing the scan lines, the squares of the scan lines can be summed. This allows the signal-dependent noise power to be calculated.
\(\displaystyle \sigma_s^2(x) = \frac{1}{L} \sum_{l=0}^{L-1} (y_l(x)-\mu_s(x))^2 = \frac{1}{L}\sum_{l=0}^{L-1} y_l^2(x) - \left(\frac{1}{L}\sum_{l=0}^{L-1} y_l(x) \right)^2 \)
Signal-dependent noise is important because many images— including most JPEGs from consumer cameras— have bilateral filters, which sharpen the image (boosting noise) near sharp areas like edges and blur it (to reduce visible noise) elsewhere. This hides the noise at edges, which is critical to the information capacity of the system. The new technique makes signal-dependent noise visible.
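The variance identity above means the noise power can be accumulated in a single pass along with the signal: keep running sums of y and y² across the L scan lines. A small sketch with synthetic scan lines (not Imatest’s code):

```python
import numpy as np

rng = np.random.default_rng(0)
L, n = 64, 100                        # L scan lines, n samples per line
edge = np.linspace(0.2, 0.8, n)       # hypothetical noiseless edge profile
lines = edge + 0.02 * rng.standard_normal((L, n))  # noisy scan lines

# One-pass noise power via the identity: sigma^2 = mean(y^2) - mean(y)^2
sum_y = lines.sum(axis=0)
sum_y2 = (lines ** 2).sum(axis=0)
noise_power = sum_y2 / L - (sum_y / L) ** 2

# Matches the direct two-pass variance at every edge position x
assert np.allclose(noise_power, lines.var(axis=0))
```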
After removing newly-discovered binning noise, selecting the best location to measure noise, and adjusting the signal level from the edge (a square wave) to be more representative of an “average” signal, the numbers are entered into the Shannon-Hartley equation (shown above) to calculate the information capacity, which is referenced to the chart contrast.
This explanation has been extremely compressed. To learn more, read the white paper, Measuring Camera Information Capacity from Slanted-edges.
Acquire the image
Acquire a well-exposed image of any slanted-edge test chart (SFRplus, eSFR ISO, Checkerboard, SFRreg, or SFR) in even, glare-free light. Chart edge contrast ratio should be 4:1 (the ISO 12233 standard) or 10:1. Click on any of the links for detailed instructions. For the most part, acquiring the image is identical to standard MTF measurements, except that additional care must be taken with the exposure.
Exposure consistency is important because exposure affects the information capacity measurement. (It’s only a second-order effect for MTF measurements.) The mean pixel level of the slanted-edge regions should be in the range of 0.20 to 0.26. (The optimum has yet to be determined.)
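A trivial check of this exposure guideline (the helper name and 8-bit normalization are assumptions for illustration):

```python
import numpy as np

def exposure_ok(roi, lo=0.20, hi=0.26, max_val=255.0):
    """Return the mean normalized pixel level of a slanted-edge ROI and
    whether it falls inside the recommended 0.20-0.26 exposure range."""
    mean_level = float(np.mean(roi)) / max_val
    return mean_level, lo <= mean_level <= hi

# Example: a 4:1 edge ROI with dark (24) and light (96) halves
roi = np.concatenate([np.full(100, 24.0), np.full(100, 96.0)])
mean_level, ok = exposure_ok(roi)
print(f"mean level = {mean_level:.3f}, in range: {ok}")
```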
Run the MTF calculation module
Run either an appropriate Rescharts (interactive; recommended for starting and making settings) module (SFRplus Setup, eSFR ISO Setup, SFRreg Setup, or Checkerboard Setup in the second column in the Imatest main window) or an appropriate fixed, batch-capable module (SFR, SFRplus Auto, eSFR ISO Auto, SFRreg Auto, or Checkerboard Auto buttons in the left column).
In the Settings window, select the appropriate setting in the Information capacity dropdown menu. It may be somewhat inconspicuous. Calculating information capacity slows down operations very slightly. Three settings are available.
- No information capacity calculation is just slightly faster than either of the information capacity calculations. About the only time you’d want this setting would be for high-speed realtime image acquisition.
- Method (1): Calculate info cap using mean edge noise (default) is recommended for images converted from raw with minimal processing (no bilateral filtering).
- Method (2): Calculate info cap using edge noise weighted by |LSF| is recommended for JPEG images, most of which have bilateral filtering. It works for other images, but gives less stable results.
Crop of the Settings window, showing information capacity noise setting.
The full window and complete instructions are in SFRplus, eSFR ISO, Checkerboard, SFRreg, or SFR.
When OK is pressed the image is analyzed. Any of several displays can be selected in Rescharts, but only a few are related to information capacity. We show key Rescharts results below. The same results can be obtained from running the fixed modules.
From the Rescharts interactive interface, the noise calculation can always be set (or changed) from a dropdown menu on the left of the More settings window. This setting is also in the SFR and Rescharts SFR Settings windows.
Information capacity noise calculation, from left side of More settings window
Edge/MTF plot
Information capacity (for the individual edge) has been added to the upper (Edge) plot. Nothing else has changed.
The image below is from a raw-converted eSFR ISO chart image from a high quality compact camera.
Edge/MTF Results from eSFR ISO image converted from raw with minimal processing
The relatively low value of information capacity C requires comment. C is closely tied to the signal used to measure it, and the 4:1 contrast edges in the eSFR ISO chart (or any ISO 12233 chart) used for C[4] are well below the maximum signal the system can support. We created the C[50] measurement (described in the White paper) in an attempt to normalize the results to a larger signal — closer to the camera’s maximum, but the calculation ran into problems with JPEGs, likely because of their tonal response curves. C[4] gives more stable results, which are good for comparing cameras, even though it’s below the camera’s total large-signal capacity.
Edge & Info Capacity Noise
This is a variant of the Edge and MTF plot. The top plot is similar (though the Line Spread Function — the derivative of the edge — is of special interest for this plot).
Line Spread Function (LSF) and signal-dependent noise from
eSFR ISO image converted from raw with minimal processing
The solid line is the noise smoothed by a rectangular kernel with a width of 5 samples of the 4× oversampled signal (1.25 original pixels). Note that the noise is quite rough and that there is no distinct peak near the edge transition. From observing several regions of the chart, it is apparent that the bumps on the noise curve are random and not repeatable.
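The smoothing referred to above is a plain moving average; with a width of 5 samples at 4× oversampling, it spans 1.25 original pixels. A sketch (the helper name is an assumption):

```python
import numpy as np

def smooth_rect(noise, width=5):
    """Smooth a 4x-oversampled noise curve with a rectangular
    (moving-average) kernel; width=5 samples = 1.25 original pixels."""
    kernel = np.ones(width) / width
    return np.convolve(noise, kernel, mode="same")  # same length as input
```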
Now it gets interesting. The image used for the plots shown above has no bilateral filtering; it’s derived from a raw image with minimal processing (straight gamma curve; no sharpening or noise reduction — and definitely NO bilateral filtering). Here are results for the in-camera JPEG from the same acquisition. Note the large noise peak near the edge.
Line Spread Function (LSF) and signal-dependent noise from
eSFR ISO image captured as a JPEG (strongly sharpened and bilateral filtered)
Method (2): Calculate info cap using edge noise weighted by |LSF| was selected for calculating the noise used for C[4]. This method is recommended for JPEGs because it uses a relatively narrow area for measuring noise (weighted by the Line Spread Function), strongly weighing the noise near the peak. Information capacity C[4] = 3.11 b/p is only slightly higher than for the TIFF: 3.0 b/p. If Method (1) (averaging over a large area, which doesn’t emphasize the peak) is chosen, C[4] = 3.84 b/p: significantly larger than the raw measurement, and clearly not accurate.
3D plot: Edge info capacity
3D plot of Edge Information capacity C[4]
A plot of C[50] is also available, but for now, C[4] is a more reliable result.
Total information capacity
The total information capacity C_{total} (referenced to the signal level: in this case 4:1) can be calculated by multiplying the weighted mean (with all weights equal to 1) by the total number of megapixels.
For the above image, C_{total} = 2.857 bits/pixel × 10.1 megapixels = 28.9 megabits.
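The arithmetic above in code form, using the example’s values (the function name is illustrative):

```python
def total_capacity_megabits(mean_c_bits_per_pixel, megapixels):
    """Total information capacity = mean per-pixel C times pixel count.
    bits/pixel x megapixels gives megabits directly."""
    return mean_c_bits_per_pixel * megapixels

# Values from the example above: 2.857 bits/pixel x 10.1 megapixels
print(round(total_capacity_megabits(2.857, 10.1), 1))  # 28.9 megabits
```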
- Shannon information capacity C has long been used as a measure of the goodness of electronic communication channels. It specifies the maximum rate at which data can be transmitted without error if an appropriate code is used (it took nearly a half-century to find codes that approached the Shannon capacity). Coding is not an issue with imaging.
- C is ordinarily measured in bits per pixel. The total capacity is \( C_{total} = C \times \text{number of pixels}\).
- The channel must be linearized before C is calculated, i.e., an appropriate gamma correction (signal = pixel level ^ gamma, where gamma ≈ 2) must be applied to obtain correct values of S and N. The value of gamma can be determined from runs of any of the Imatest modules that analyze grayscale step charts: Stepchart, Colorcheck, Color/Tone, Multitest, SFRplus, or eSFR ISO. But in most cases it can be determined from the edge image if the chart contrast is entered and Use for MTF is checked.
- We hypothesize that C can be used as a figure of merit for evaluating camera quality, especially for machine vision and Artificial Intelligence cameras. (It doesn’t directly translate to consumer camera appearance because they have to be carefully tuned to reach their potential, i.e., to make pleasing images). It provides a fair basis for comparing cameras, especially when used with images converted from raw with minimal processing.
- Imatest calculates the Shannon capacity C for the Y (luminance; 0.212*R + 0.716*G + 0.072*B) channel of digital images, which approximates the eye’s sensitivity. It also calculates C for the individual R, G, and B channels as well as the C_{b} and C_{r} chroma channels (from YC_{b}C_{r}).
- Shannon capacity has not previously been used to characterize photographic images because it was difficult to calculate and interpret. Now that it can be calculated easily, its relationship to photographic image quality is open for study.
- Since C is a new measurement, we will be happy to work with companies or academic institutions who can verify its suitability for Artificial Intelligence systems.
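The linearization step mentioned in the notes above (signal = pixel level ^ gamma, with gamma ≈ 2) can be sketched as follows; the helper name and 8-bit normalization are assumptions:

```python
import numpy as np

def linearize(pixels, gamma=2.0, max_val=255.0):
    """Undo the encoding gamma to recover a linear signal:
    signal = (normalized pixel level) ** gamma, with gamma near 2."""
    return (np.asarray(pixels, dtype=float) / max_val) ** gamma

# A mid-gray 8-bit pixel (128) maps to about a quarter of full scale
print(linearize([0, 128, 255]))
```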
Note: Because the slanted-edge information capacity measurement used prior to Imatest 2020.1 is inaccurate (correlating poorly with the superior Siemens star measurements), it has been deprecated completely. It has been replaced with a better measurement that is, however, only recommended for use with a Siemens star in the center of the image for calculating the total image information capacity (in units of bits/image).
Links
(Historical) R. Shaw, “The Application of Fourier Techniques and Information Theory to the Assessment of Photographic Image Quality,” Photographic Science and Engineering, Vol. 6, No. 5, Sept.-Oct. 1962, pp.281-286. Reprinted in “Selected Readings in Image Evaluation,” edited by Rodney Shaw, SPSE (now SPIE), 1976. Now available for download
C. E. Shannon, “A Mathematical Theory of Communication,” Bell System Technical Journal, vol. 27, pp. 379–423, July 1948; pp. 623–656, October 1948.
C. E. Shannon, “Communication in the Presence of Noise”, Proceedings of the I.R.E., January 1949, pp. 10-21.
The University of Texas Laboratory for Image & Video Engineering is doing some interesting work on image and video quality assessment. Here are some promising papers. Challenging material!
R. Soundararajan and A. C. Bovik, “Survey of information theory and visual quality assessment (Invited Paper),” Signal, Image, and Video Processing, Special Section on Human Vision and Information Theory, vol. 7, no. 3, pp. 391-401, May 2013.
H. R. Sheikh and A. C. Bovik, “Image Information and Visual Quality,” IEEE Transactions on Image Processing, vol. 15, no. 2, pp. 430-444, February 2006.
K. Seshadrinathan and A. C. Bovik, “An information theoretic video quality metric based on motion models,” Third International Workshop on Video Processing and Quality Metrics for Consumer Electronics, Scottsdale, Arizona, January 2007.
H. R. Sheikh and A. C. Bovik, “A Visual Information Fidelity Approach to Video Quality Assessment (Invited Paper),” First International Workshop on Video Processing and Quality Metrics for Consumer Electronics, Scottsdale, AZ, January 2005.
H. R. Sheikh and A. C. Bovik, “Image information and visual quality,” Proc. IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP ’04), Montreal, Canada, vol. 3, pp. iii-709–712, May 2004.
Wikipedia – Shannon–Hartley theorem has a frequency-dependent form of Shannon’s equation that is applied to the Imatest sine pattern Shannon information capacity calculation. It is modified to a 2D equation, transformed into polar coordinates, then expressed in one dimension to account for the two-dimensional (area, not linear) nature of pixels.
\(\displaystyle C=\int_0^B \log_2 \left( 1 + \frac{S(f)}{N(f)} \right) df\)