Aliasing
  Low frequency artifacts, sometimes quite disturbing, that appear when signal energy above the Nyquist frequency reaches the digital sensor. Color aliasing in Bayer sensors can be particularly troublesome. "Moire fringing" is a type of aliasing.
Aperture
  The circular opening at the center of a lens that admits light. Generally specified by the f-stop, which is the focal length divided by the aperture diameter. A large aperture therefore corresponds to a small f-stop, which can confuse beginners.

Bayer sensor

The sensor pattern used in most digital cameras, where alternate rows of pixels are sensitive to RGRGRG and GBGBGB light. (R = Red; G = Green; B = Blue.) The sensor output has to be converted into a standard file format (for example, JPEG, TIFF, or PNG), where each pixel represents all three colors, by a RAW converter (in the camera or computer), which performs a "de-mosaicing" function. The quality of RAW converters varies: separate converter programs run in computers can provide finer results than the converters built into cameras. That is one of the reasons that RAW format is recommended when the highest image quality is required.

In Foveon sensors (used in Sigma cameras), each pixel site is sensitive to all three colors. Foveon sensors are less susceptible to color aliasing than Bayer sensors; they can tolerate greater response above Nyquist with fewer ill effects.
Chromatic aberration (CA)
  A lens characteristic that causes different colors to focus at different locations. There are two types: longitudinal, where different colors focus on different planes, and lateral, where the lens focal length, and hence magnification, differs for different colors. Lateral CA is the cause of a highly visible effect known as color fringing. It is worst in extreme wide angle, telephoto, and zoom lenses. Imatest SFR measures lateral CA (color fringing). See Chromatic aberration and Eliminating color fringing.
Cycles, Line Pairs, Line Width
  A Cycle is the period of a complete repetition of a signal. It is used for frequency measurement. A Cycle is equivalent to a Line Pair. Sometimes Line Widths are used for historical reasons. One Line Pair = 2 Line Widths. "Lines" should be avoided when describing spatial frequency because it is intrinsically ambiguous. It usually means Line Widths, but sometimes it's used (carelessly) for Line Pairs.
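The unit conversions above can be sketched as a pair of trivial helper functions. These function names are illustrative, not from any library:

```python
# 1 cycle = 1 line pair (LP) = 2 line widths (LW).

def lp_to_lw(lp_per_mm: float) -> float:
    """Convert line pairs/mm (cycles/mm) to line widths/mm."""
    return 2.0 * lp_per_mm

def lw_to_lp(lw_per_mm: float) -> float:
    """Convert line widths/mm to line pairs/mm (cycles/mm)."""
    return lw_per_mm / 2.0

print(lp_to_lw(50))  # 50 lp/mm is 100 LW/mm
```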
Density (Optical density)
  The amount of light reflected or transmitted by a given media, expressed on a logarithmic (base 10) scale. For reflective media (such as photographic or inkjet prints), density = –log10(reflected light/incident light). For transmissive media (such as film), density = –log10(transmitted light/incident light). The higher the density, the less light is transmitted or reflected. Perfect (100%) transmission or reflection corresponds to a density of 0; 10% corresponds to a density of 1; 1% corresponds to a density of 2, etc. Useful equations:
1 f-stop (1 EV) = 0.301 density units;   1 density unit = 3.32 f-stops.
When an object is photographed, Log Exposure = –density + k. (Constant k is generally ignored.)
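The density definition and the f-stop conversion above can be checked with a few lines of Python (a minimal sketch; the function names are illustrative):

```python
import math

def density(fraction: float) -> float:
    """Optical density from the fraction of light reflected or transmitted."""
    return -math.log10(fraction)

def fstops_to_density(stops: float) -> float:
    """1 f-stop (1 EV) = log10(2) ~ 0.301 density units."""
    return stops * math.log10(2)

def density_to_fstops(d: float) -> float:
    return d / math.log10(2)

print(density(0.10))                   # 10% transmission -> density 1.0
print(round(fstops_to_density(1), 3))  # 0.301
print(round(density_to_fstops(1), 2))  # 3.32
```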
Dynamic range
The range of exposure (usually measured in f-stops) over which a camera responds. Also called exposure range. Practical dynamic range is limited by noise, which tends to be worst in the darkest regions. Dynamic range can be specified as the total range over which noise remains under a specified level— the lower the level, the higher the image quality. Dynamic range is measured by Q-13 Stepchart, using transmission step charts.
Exposure value (EV)
  A measure of exposure, where a change of 1 EV means doubling or halving exposure. Often synonymous with f-stop. By definition, EV 0 corresponds to a 1 second exposure at f/1.0.
F-stop
  A measure of a lens's aperture (the circular opening that admits light). A change of "one f-stop" implies doubling or halving the exposure. This is synonymous with a change of 1 EV.

F-stop = focal length / aperture diameter. The notation, "f/8," implies (aperture diameter = ) focal length/8. The larger the f-stop, the smaller the aperture. F-stops are typically labeled in the following sequence, where the admitted light decreases by half for each stop: 1, 1.4, 2, 2.8, 4, 5.6, 8, 11, 16, 22, 32, 45, 64, ... Each f-stop is the previous f-stop multiplied by the square root of 2.
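The sequence and the EV relationship can be sketched as follows (a minimal illustration; note that marked f-stops on lenses use conventional roundings such as 5.6 and 11 rather than the exact powers of sqrt(2)):

```python
import math

def fstop_sequence(n: int) -> list:
    """First n f-stops: each is the previous one times sqrt(2),
    so the admitted light halves at every step."""
    return [round(math.sqrt(2) ** i, 1) for i in range(n)]

def exposure_value(f_number: float, shutter_s: float) -> float:
    """EV = log2(N^2 / t); EV 0 is a 1 second exposure at f/1.0."""
    return math.log2(f_number ** 2 / shutter_s)

print(fstop_sequence(8))
# [1.0, 1.4, 2.0, 2.8, 4.0, 5.7, 8.0, 11.3] -- lens markings round to 5.6, 11
print(exposure_value(1.0, 1.0))  # 0.0
```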

Gamma
  Exponent that relates pixel levels in image files to the brightness of the monitor or print. Most familiar from monitors, where luminance = pixel level^gamma. A camera (or film + scanner) encodes the gamma so that pixel level = brightness^(camera gamma). Imatest reports camera gamma. Gamma is equivalent to contrast. The reason can be observed in traditional film curves, which are represented on logarithmic scales (typically, density (log10(absorbed light)) vs. log10(exposure)). Gamma is the average slope of this curve (excluding the "toe" and "shoulder" regions near the ends of the curve), i.e., the contrast. See Kodak's definition in Sensitometric and Image-Structure Data. For more detail, see the descriptions of gamma in Using Imatest SFR and Monitor calibration.

Confusion factor: Digital camera output may not follow an exact gamma (exponential) curve: a "tone reproduction curve" (an "S" curve) may be superposed on the gamma curve to extend dynamic range while maintaining visual contrast. Such a curve boosts contrast in middle tones while reducing it in highlights and shadows. The tone reproduction curve may also be adaptive: camera gamma may be increased for low contrast scenes and decreased for contrasty scenes. This can affect the accuracy of SFR measurements. But it's not a bad idea for image making: it's quite similar to the development adjustments (N-1, N, N+1, etc.) Ansel Adams used in his zone system.
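The two gamma relationships above (display decoding and camera encoding) can be sketched with typical assumed exponents (2.2 for displays, roughly 1/2.2 for camera encoding; these are common conventions, not Imatest's measured values):

```python
def display_luminance(pixel_level: float, gamma: float = 2.2) -> float:
    """Monitor: luminance = pixel_level ** gamma (both normalized 0..1)."""
    return pixel_level ** gamma

def camera_encode(brightness: float, camera_gamma: float = 1 / 2.2) -> float:
    """Camera: pixel_level = brightness ** camera_gamma (normalized 0..1)."""
    return brightness ** camera_gamma

# Round trip: camera encoding followed by display decoding
# approximately reproduces the original scene brightness.
print(display_luminance(camera_encode(0.5)))  # ~ 0.5
```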
MTF
  Modulation Transfer Function. Another name for Spatial Frequency Response (SFR). Indicates the contrast of a pattern at a given spatial frequency relative to very low spatial frequencies. See Sharpness and Understanding image sharpness and MTF curves.
MTF50
  The spatial frequency where image contrast is half (50%) that of low frequencies. MTF50 is an excellent measure of perceived image sharpness because detail is diminished but still visible, and because it is in the region where the response of most cameras is declining most rapidly. It is especially valuable for comparing the sharpness of different cameras. See Sharpness and Understanding image sharpness and MTF curves.
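Finding MTF50 from sampled MTF data amounts to locating the 50% crossing. A minimal sketch by linear interpolation (the sample values below are made up for illustration):

```python
def mtf50(freqs, mtf):
    """Return the frequency where MTF first falls to 0.5."""
    for i in range(1, len(mtf)):
        if mtf[i] <= 0.5 <= mtf[i - 1]:
            # Linear interpolation between the bracketing samples.
            frac = (mtf[i - 1] - 0.5) / (mtf[i - 1] - mtf[i])
            return freqs[i - 1] + frac * (freqs[i] - freqs[i - 1])
    return None  # curve never crosses 50%

freqs = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]     # cycles/pixel
mtf   = [1.0, 0.9, 0.7, 0.45, 0.25, 0.1]   # hypothetical measured contrast
print(round(mtf50(freqs, mtf), 3))  # 0.28 -- contrast halves at 0.28 cycles/pixel
```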

Noise
  Random variations of image luminance arising from grain in film or electronic perturbations in digital sensors. Digital sensors suffer from a variety of noise mechanisms, for example, statistical fluctuations in the number of photons reaching an individual pixel. Noise is a major factor that limits image quality. In digital sensors it tends to have the greatest impact in dark regions.

Noise is measured as an RMS value (root mean square; an indication of noise power, equivalent to standard deviation, sigma). It is usually expressed in volts, millivolts, or pixel levels (in a digital file). In Q-13 and Colorcheck, noise is converted to relative luminance units (f-stops), which provide a more meaningful indication of its visual impact.

The visual impact of noise is also affected by the size of the image— the larger the image (the greater the magnification), the more important noise becomes. Since noise tends to be most visible at low spatial frequencies, the noise spectrum has some importance (though it is difficult to interpret).
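The RMS measurement described above is simply the standard deviation (sigma) of pixel levels in a nominally uniform patch. A minimal sketch with made-up pixel values:

```python
import math

def rms_noise(pixels):
    """RMS noise = standard deviation of levels in a uniform patch."""
    mean = sum(pixels) / len(pixels)
    return math.sqrt(sum((p - mean) ** 2 for p in pixels) / len(pixels))

patch = [118, 122, 120, 119, 121]  # hypothetical levels from a uniform gray patch
print(round(rms_noise(patch), 3))  # 1.414 pixel levels
```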
Nyquist frequency
  The highest spatial frequency where a digital sensor can capture real information. Nyquist frequency fN = 1/(2 * pixel spacing). Any information above fN that reaches the sensor is aliased to lower frequencies, creating potentially disturbing Moire patterns. Aliasing can be particularly objectionable in Bayer sensors in digital cameras, where it appears as color bands. The ideal lens/sensor system would have MTF = 1 below Nyquist and MTF = 0 above it. Unfortunately this is not achievable in optical systems; the design of anti-aliasing (lowpass) filters always involves a tradeoff that compromises sharpness.

A large MTF response above fN can indicate potential problems with aliasing, but the visibility of the aliasing depends on whether it arises from the sensor (worse) or sharpening (not as bad). The Nyquist sampling theorem and aliasing contains a complete exposition.
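The Nyquist formula and the frequency folding it implies can be sketched as follows (the 5 micron pixel pitch is an illustrative value, not from any particular sensor):

```python
def nyquist(pixel_pitch_mm: float) -> float:
    """fN = 1 / (2 * pixel spacing), in cycles/mm."""
    return 1.0 / (2.0 * pixel_pitch_mm)

def aliased_frequency(f: float, f_nyquist: float) -> float:
    """Frequency to which a signal at f is folded (aliased) by sampling."""
    fs = 2.0 * f_nyquist          # sampling frequency
    f = f % fs                    # fold into one sampling period
    return fs - f if f > f_nyquist else f

fN = nyquist(0.005)                  # 5 micron pixels
print(fN)                            # 100.0 cycles/mm
print(aliased_frequency(130.0, fN))  # a 130 cycles/mm pattern appears at 70.0
```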
Raw files
  RAW files are the unprocessed output of digital camera image sensors. For Bayer sensors, each RAW pixel represents a single color in RGRGRG, GBGBGB, ... sequence. To be converted into usable, standard file formats (TIFF, JPEG, etc.), raw files must be run through a RAW converter (de-mosaicing) program. RAW converters perform additional functions: they add the gamma curve and often an additional tonal response curve, and they reduce noise and sharpen the image. This can interfere with some of Imatest's measurements.

The only way you can be sure an image file faithfully resembles the RAW file— that it has no sharpening or noise reduction— is to read a RAW file into Imatest and convert it to PPM format using Dave Coffin's dcraw.

Resolution
  The first thing to remember about resolution is that it has no unique definition; it can be defined in many ways, some of which can be quite misleading. (It is almost as abused as the word "holistic.") In a generic sense, resolution refers to any measurement of an imaging system's ability to resolve fine detail. For traditional film cameras, it usually refers to vanishing resolution: the highest spatial frequency where the pattern on a bar chart (usually the USAF 1951 chart) is visible. Since this tells you where detail isn't, it isn't a very good indicator of perceived sharpness. The pixel per inch (PPI or DPI) rating of a scanner is often called its "resolution." This can be highly misleading. For example, inexpensive flatbed scanners can't come close to resolving the detail of decent film scanners with the same PPI rating. I prefer MTF50 as a measure of resolution. For more on resolution, see Pixels, Images, and Files and Lens testing (the old-fashioned way).
SFR
  Spatial Frequency Response. The response of a system to a pattern at a given spatial frequency, i.e., the contrast. SFR is measured relative to contrast at very low spatial frequencies. It is expressed as a fraction or percentage. Synonymous with MTF. See Sharpness and Understanding image sharpness and MTF curves.
Sharpening
  Signal processing applied to digital images to improve perceived image sharpness. It can be applied in the camera or in post-processing (image editing). Virtually all digital images benefit from some degree of sharpening, but images can be oversharpened, resulting in highly visible "halos" (overshoot) near edges. See Sharpening.
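The halo (overshoot) effect can be demonstrated with a minimal 1-D unsharp-mask sketch; the 3-tap blur kernel and the amount are arbitrary choices for illustration:

```python
def sharpen(signal, amount=1.0):
    """Unsharp mask: signal + amount * (signal - blurred signal)."""
    out = []
    for i in range(len(signal)):
        left = signal[max(i - 1, 0)]              # clamp at the edges
        right = signal[min(i + 1, len(signal) - 1)]
        blurred = (left + signal[i] + right) / 3.0
        out.append(signal[i] + amount * (signal[i] - blurred))
    return out

edge = [0, 0, 0, 1, 1, 1]    # an ideal step edge
print(sharpen(edge))
# Values dip below 0 and overshoot above 1 next to the step: the "halo".
```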
  An algorithm used by Imatest that allows cameras with different amounts of sharpening to be compared fairly. With standardized sharpening, all cameras have a similar amount of overshoot (around 5%) near edges. Without it, built-in sharpening strongly affects test results, giving oversharpened cameras an unfair advantage. Results with standardized sharpening should not be used for comparing different lenses on the same camera. See Standardized sharpening.