Stray Light (Flare) Documentation

Stray Light Test Considerations

Stray light (flare) documentation pages

Introduction: Intro to stray light testing and normalized stray light | Outputs from Imatest stray light analysis | History

Background: Examples of stray light | Root Causes | Test overview | Test factors | Test Considerations | Glossary

Calculations: Metric image | Normalization methods | Light source mask methods | Summary Metrics | Analysis Channels | Saturation

Instructions: High-level Imatest analysis instructions (Master and IT) | Computing normalized stray light with Imatest | Motorized Gimbal instructions

Settings: Settings list and INI keys/values | Standards and Recommendations | Configuration file input

This page describes the technical considerations of stray light testing.

Test Assumptions

The table below describes assumptions that may be made for stray light testing, along with the associated consequences (if the assumption is not entirely true) and possible improvements to address the consequences.

Assumption | Consequences | Possible Improvements
The room is black (0% reflectance, no other sources)
  • Reflections off of the room look like stray light to the test
  • Will give worse stray light measurements than the camera should get
  • Bigger room (1/r² falloff improves the “dissipation” of light)
  • Darker room (absorb more light)
  • Add baffles (hide the bad test hygiene and other light sources)
The source is appropriately bright
  • If the source is too bright, stronger stray light may be under-reported due to saturation
  • If the source is too dim, weaker stray light may not be measured
  • Test at appropriate levels for the DUT and scenarios the DUT will experience
  • This may require testing at multiple source levels 
The radiance of the light is spatially constant
  • Lever arms will matter in rotations when reporting angles
  • Measured stray light will be better than it should be in falloff regions
  • Use a better light source
  • Use other portions of the source
  • Apply compensation for the non-uniformity
The source illuminates the front element and the mechanical surfaces that “see” the front element via a single bounce
  • Lack of coverage
  • Measured stray light will be better than it should be
  • Use a larger-diameter source
  • Add a 2D stepping to mechanically obtain coverage
  • Use a diverging source
The light source does not have stray light
  • Manifestation: halo around the light source
    • Additional light will be present in the scene, leading to worse measured stray light performance than the camera should get from a perfect test setup
    • This additional light may hide stray light performance issues if it overlaps with a stray light feature in the camera
    • If using a larger mask than is needed, then the measured stray light will be better than the camera should get from a perfect test setup
  • Cause: Contamination (dust, fingerprints, debris, etc) on the light source
    • The light source will not be spatially uniform
  • Change the optical design of the light source to eliminate stray light paths
  • Change the source-camera distance (some of these stray light paths are distance-dependent)
  • Mask more of the image (not recommended)
  • Carefully clean the source
The image of the source is small
  • A larger source image causes repeatability issues relative to a smaller source image
  • Change the optical design of the light source
The data are linear
  • Results may be inconsistent relative to other captures in the series or other cameras
  • Measured results could be better or worse (or both in different cases) than should be measured
  • Use a linear mode for testing
  • Linearize data (decompound, gamma, pedestal subtract, apply linearization function from data)
The data are monotonic
  • Results may be inconsistent relative to other captures in the series or other cameras
  • Measured results could be better or worse (or both in different cases) than should be measured
  • Use a mode where the response is monotonic
All auto algorithms are turned off
  • Results may be inconsistent relative to other captures in the series or other cameras
  • Measured results could be better or worse (or both in different cases) than should be measured
  • Use a linear mode
  • Linearize data (decompound, gamma, pedestal subtract, apply linearization function from data)
When inside the FOV, the source is masked out
  • If the mask is too small, some of the source will be treated as stray light (metric will be worse than it should be)
  • If the mask is too big, some of the stray light will be treated as the source (metric will be better than it should be)
  • Standardization of masking methods to improve consistency (even if everyone is "wrong")
  • More testing (to figure out the consequences of masking choices)
  • Reporting of masking methods so others can re-run with the same method
Stray light is radially symmetric (a single azimuth angle is good enough for testing)
  • Asymmetries in stray light effects will be missed
  • Test at multiple azimuth angles
The test equipment is not in the FOV of the DUT
  • Portions of FOV are not covered (metric better than it should be)
  • Test equipment creates new stray light paths (metric worse than it should be)
  • Rotate source and not DUT
  • Illuminate only the DUT 
  • Change fixturing to move rotation behind the camera
  • Flip the camera upside down to get a “clear view” of the source
  • Make test equipment less reflective
The angle of the source relative to the DUT is known
  • The reported angles at which stray light occurs may not be consistent from setup to setup
  • If doing a production test (very sparse sampling) the wrong angles may be used
  • Use a collimated source
  • Report location of center of rotation with respect to a datum on the DUT
  • Pick something and stick with it
  • Plot/report summary metrics vs. capture index
All signal (outside the design path) is from stray light
  • Dust in the room may lead to "spikes" in the stray light measurements
    • Measured results are worse than they should be
  • Dark Noise and stray light are not separated
  • Test in a clean room
  • Take multiple samples at each position and “median”
  • Subtract off dark noise
The light source is collimated
  • Multiple source-DUT angles will be incorporated within a capture
    • This may shift the magnitude of and angle at which stray light features appear/disappear
  • Reporting angles will be more challenging
  • Use a collimated source
  • Report information about the test setup
The reference image compensation can be represented as a single multiplicative factor
  • The normalization factor will not be correctly calculated
  • Use fixed test modes (gain, integration time) for both the reference image and the test image(s)
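Several of the improvements above call for linearizing the data (pedestal subtraction, undoing gamma). As a minimal sketch, assuming a simple analytic model in which the camera adds a constant pedestal and applies power-law (gamma) encoding — the pedestal, gamma, and full-scale values here are illustrative, not Imatest settings:

```python
import numpy as np

def linearize(dn, pedestal=64, gamma=2.2, full_scale=1023):
    """Approximately linearize gamma-encoded digital numbers (DN).

    Assumed model (illustrative): the camera encodes
    encoded = (full_scale - pedestal) * linear**(1/gamma) + pedestal.
    Real pipelines may require a measured linearization function
    instead of this analytic approximation.
    """
    dn = np.asarray(dn, dtype=float)
    # Remove the black-level pedestal, clipping negative noise to zero
    signal = np.clip(dn - pedestal, 0.0, None)
    # Undo the gamma encoding to recover a value proportional to radiance
    return (signal / (full_scale - pedestal)) ** gamma

# A pedestal-level pixel maps to 0.0; full scale maps to 1.0
print(linearize(64), linearize(1023))
```

A measured linearization curve (from a DN-vs-exposure sweep) is preferable when the camera response does not match a simple gamma model.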

Light Source

Collimated vs Diverging Source

From a calculation perspective, stray light may be measured with either a diverging or collimated light source. However, for repeatability between test setups, a collimated light source is recommended. 

Left: Sampling with a diverging source; Right: Sampling with a collimated source. In both cases, the black bundle of rays samples the same angle/position combinations.

Two aspects over which stray light may be measured are the angle of rays and their intersection (translation) relative to the camera. Diverging and collimated light sources each sample in this space (rotating the camera/source builds up coverage).

Conjecture: For a small bundle of rays in one setup, there is an equivalent bundle of rays in the other. With sufficient (not defined here) sampling, every bundle in one setup will be covered by one or more of the captures in the other setup; however, the other bundles in that capture may not be the same. As a result, every stray light feature that appears in one setup will appear in the other, but its magnitude and location (angle) may differ.

Collimated sources are not practical for all scenarios (e.g., testing through the windshield of a car requiring a Hubble Space Telescope-sized optic), therefore in some cases, it is recommended to test with diverging light (at the cost of measurement repeatability). 

Collimated | Diverging
Advantages
  • Distance invariant (up to the transmission losses through air)
  • Translation invariant (up to the spatial uniformity of the beam)
  • All bundles of rays within the source are going in the same direction
  • Radiometrically characterized by a constant irradiance
  • Can more easily get coverage of large cameras (or through the windshield)
  • Relatively inexpensive
Disadvantages
  • Limited in the size of camera they can test
  • Optically harder to create
  • Relatively expensive
  • The angle is not constant
  • Measurement is range-dependent
  • Radiometrically characterized by radiance

Source Level (Brightness)

To first order, the stray light test is independent of the light level used to test (this is due to the normalization). However, in practice, due to the limited dynamic range of cameras, the light level will matter when the stray light becomes saturated or falls below the noise floor.
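As a toy illustration (the function name and the 12-bit full scale are assumptions for this sketch, not Imatest API): dividing the linear stray light signal by the linear source signal cancels the light level, until either value saturates.

```python
FULL_SCALE = 4095  # illustrative 12-bit sensor full scale

def normalized_stray_light(stray_dn, source_dn):
    """Ratio of stray light signal to direct source signal (linear DN).

    Returns None when either value is saturated, since clipped
    values make the ratio meaningless.
    """
    if stray_dn >= FULL_SCALE or source_dn >= FULL_SCALE:
        return None
    return stray_dn / source_dn

# Doubling the light level leaves the normalized metric unchanged...
print(normalized_stray_light(10, 1000))   # 0.01
print(normalized_stray_light(20, 2000))   # 0.01
# ...until the source (or the stray light) saturates
print(normalized_stray_light(50, 5000))   # None
```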

Integration Time | Sample Image | Notes
800 [ms]
  • Able to easily measure fainter stray light features
  • Stray light near the source is saturated (unable to be quantitatively measured)
200 [ms]
  • Able to measure fainter stray light features
  • Improved ability to measure brighter stray light features
  • The image of the source is still saturated 
25 [ms]
  • Able to measure brighter stray light features
  • Limited ability to measure fainter stray light features
1 [ms]
  • The image of the source is approximately the size it should be (no saturation blooming)
  • Unable to measure fainter stray light signals

The above table shows how stray light features can be measurable at different light levels. Note: Multiple exposure levels may be needed to cover large ranges of stray light responses.
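One way to combine captures at multiple integration times is to scale each linear capture to a common rate (DN per millisecond) and, per pixel, keep the longest unsaturated exposure. The sketch below assumes linear, pedestal-subtracted data and an illustrative saturation threshold; it is not necessarily the Imatest method.

```python
import numpy as np

SATURATION_DN = 4000  # conservative clip threshold (illustrative)

def merge_exposures(captures):
    """Merge {integration_time_ms: image_DN} into one DN/ms image.

    Per pixel, use the longest integration time that is not
    saturated: long exposures maximize signal-to-noise on faint
    stray light, while short exposures preserve bright features.
    """
    times = sorted(captures, reverse=True)  # longest first
    merged = np.full(captures[times[0]].shape, np.nan)
    for t in times:
        img = captures[t]
        # Fill only pixels that are unsaturated and not yet assigned
        usable = (img < SATURATION_DN) & np.isnan(merged)
        merged[usable] = img[usable] / t
    return merged

captures = {
    800: np.array([4095.0, 800.0]),   # bright pixel saturated at 800 ms
    25:  np.array([2500.0, 30.0]),
}
print(merge_exposures(captures))  # DN/ms per pixel
```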

Source Extent

The source extent refers to the size of the beam at the location of the camera. The ideal source extent is slightly larger than the key surfaces of the camera. Note that this extent will be different for every camera that is tested.

A key surface is any surface that is:

  • The front optical element (e.g., lens or cover glass)
  • Any surface of the camera that “directly sees” the front optical element 

Examples of key surfaces include:

  • Front Lens
  • ND/UV filter(s)
  • Baffling
  • Mechanical surfaces used to hold the lens in place
  • Cell phone case

Examples of key surfaces are shown below (non-optical key surfaces are shown in red).

The hood of the car is a key surface when testing at the system level.

Key surfaces include the front optical element and baffling. Key surfaces include the front optical element and the mechanical components used to hold it in place. Key surfaces include the front optical element and lens hood.

 

Note: if the source extent is much larger than the key surfaces, there may be test hygiene problems: the “extra” light from the beam may reflect off of the camera, support equipment, and/or the room, go back out into the “scene,” and create worse stray light measurements than the camera should get.

Note that over the course of a stray light test, all of the key surfaces should be covered (at all angles). This may be done with either a single beam or by translating a smaller beam over the key surfaces.

Note: if doing “system-level” tests, surfaces outside the camera (e.g., the hood of a car and the windshield) should also be considered key surfaces.

Rotation

Rotating Camera vs Rotating Source

To first order, rotating the camera and rotating the light source are functionally equivalent. However, as enumerated below, there are advantages and disadvantages for each.

Schematic of a test setup where the camera is rotated.

Schematic of a test setup where the light source is rotated.

Rotating Camera | Rotating Source
Advantages
  • Light source can be isolated from the camera via baffling (improved test hygiene)
  • Typically easier access to mount/unmount cameras
  • Easier to support longer camera-source distances (particularly useful for diverging sources)
  • The camera is always looking at the same background (improved test hygiene)
  • Easier to have the camera fixturing not be in the FOV (particularly for wide FOV cameras)
Disadvantages
  • Each capture position is looking at a different part of the room
  • Control, power, and cooling cables for the camera need to be rotated
  • Control, power, and cooling cables for the source need to be rotated

Note that the rotation point is usually defined relative to the DUT, even if the source is what is rotated.

Rotation Point

The rotation point is the point on the camera about which the camera (or source) is rotated. Under the assumption that the light source is collimated, large enough, and spatially uniform, and that the surround is infinitely black, the choice of rotation point does not matter.

The reasons for this are:

  • A perfectly collimated beam will have constant irradiance with distance. This means that there is no change in light level at different test distances.
  • A perfectly collimated beam will have all rays pointing in the same direction. This means that any translations will have the same intersection angle between the DUT and the beam.
  • A spatially uniform beam will have the same irradiance throughout the beam. This means that any decenters (translations orthogonal to the direction of propagation of the beam) will not cause a change in light level.
  • A large enough beam is required to get test coverage. Larger beams may be used; however, they introduce extra light into the test setup.
  • The surround being infinitely black provides a baseline level of darkness from which to measure stray light and reduces the impact of a beam that is larger than strictly necessary, which would otherwise add light to the measurement background.

Note that in practice, many of the assumptions are impossible to perfectly achieve with a real test setup.

The following are rotation points that may be considered:

  • Point of minimum projection of key surfaces (recommended)
  • Middle of the front lens element
  • Entrance Pupil Position
  • Exit Pupil Position
  • Middle of the focal plane array (detector)
  • Arbitrary point (may be useful for very wide FOVs to get the fixturing out of the way)

Because the rotation point (for an ideal collimated source) does not affect the measurement, it may be chosen for practical benefit. The optimal rotation point is the one that minimizes the cumulative (over all measurement angles) projection of the key test surfaces onto a plane orthogonal to the k-vector (direction of propagation) of the light source. This is illustrated in the animations below, where the upper-left rotation point produces the smallest projection. The cumulative projection is the minimum light source size necessary to get spatial coverage over the DUT for a rotation-only setup.

Example rotation points. In all cases, the light source is to the left side of the schematic. The current projection is the projection of the key surface(s) onto the orthogonal plane to the direction of propagation of the beam.

Minimizing the cumulative projected size allows for using smaller collimated beams. The smaller collimated light sources are easier to make than larger ones and will improve test hygiene as there is less “extra” light from the source to adversely affect measurements.
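The cumulative projection can be estimated numerically. Below is a sketch under a simplified 2D model, with the beam propagating along +x so that the projection of a rotated point onto the plane orthogonal to the k-vector is simply its y-coordinate; the key-surface samples, pivot locations, and angle range are illustrative.

```python
import math

def cumulative_projection(points, pivot, angles_deg):
    """Width of the union of projected extents over all rotation angles.

    points: (x, y) samples on the key surfaces, in DUT coordinates.
    pivot:  (x, y) candidate rotation point.
    The beam propagates along +x, so the projection onto the plane
    orthogonal to the k-vector is the y-extent of the rotated points.
    """
    lo, hi = math.inf, -math.inf
    px, py = pivot
    for a in angles_deg:
        c, s = math.cos(math.radians(a)), math.sin(math.radians(a))
        for x, y in points:
            # y-coordinate of the point after rotation about the pivot
            ry = s * (x - px) + c * (y - py) + py
            lo, hi = min(lo, ry), max(hi, ry)
    return hi - lo

# Front element spanning y = -10..10 at x = 0 (illustrative geometry)
surface = [(0.0, -10.0), (0.0, 10.0)]
angles = range(-60, 61, 5)
# A pivot on the element needs a smaller beam than one 30 mm behind it
print(cumulative_projection(surface, (0.0, 0.0), angles))
print(cumulative_projection(surface, (-30.0, 0.0), angles))
```

In practice the key surfaces would be sampled in 3D and the pivot search automated, but the comparison above shows why the lever arm of the rotation point matters for beam size.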

Miscellaneous

Relationship with other IQFs

Dynamic Range

Cameras with larger dynamic ranges are more susceptible to stray light, as they are expected to have sensitivity over more input light levels. Larger dynamic range cameras are also better able to measure stray light, as they can be sensitive to both strong stray light artifacts (with digital numbers close to the source response) and weak ones (with digital numbers far below the source response).

(Dark) Noise

The current test assumes that any “signal” is stray light. However, dark noise is technically not stray light. Future versions of Imatest will allow for the subtraction of dark frames to separate dark noise from stray light.
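Until then, dark-frame subtraction can be done externally. A hedged sketch (not the Imatest implementation), assuming a stack of dark frames captured with the same camera settings as the stray light capture:

```python
import numpy as np

def subtract_dark(capture, dark_frames):
    """Remove the fixed dark signal from a stray light capture.

    Median-combining the dark frames rejects outliers (hot pixels,
    cosmic rays) better than a mean would; clipping at zero avoids
    negative "stray light" where noise dominates.
    """
    master_dark = np.median(np.stack(dark_frames), axis=0)
    return np.clip(capture - master_dark, 0.0, None)

capture = np.array([120.0, 65.0, 70.0])
darks = [np.array([60.0, 62.0, 61.0]),
         np.array([61.0, 900.0, 60.0]),   # one frame with a hot pixel
         np.array([59.0, 63.0, 62.0])]
print(subtract_dark(capture, darks))  # hot pixel rejected by the median
```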

Distortion

Distortion may warp the image of the source. This may require more complex light source masking methods to appropriately mask out the image of the source.

Lens Falloff

Lens falloff may affect the measured stray light values. Lens falloff is not accounted for in Imatest’s stray light analysis.

SFR/MTF

For the stray light test, blur and stray light cannot be separated (i.e., to the test, stray light and a blurry image of the light both look the same). If there is a goal of having orthogonal test metrics then the image of the light source should be in focus when testing stray light, allowing for a clearer delineation between stray light and SFR.

Optical aberrations (e.g., coma and astigmatism) with asymmetric point spread functions (PSFs) may lead to elongation of the image of the source. This may cause issues when masking out the image of the light source.

Reporting recommendations

In addition to the elements necessary for the calculation, Imatest recommends recording and reporting the following test conditions:

Light Source

  • Level (e.g., the irradiance at the DUT)
  • Angular diameter (angular size of the direct image of the light source relative to the FOV of the DUT)
  • Aperture size (minimum beam size)
  • Divergence angle
  • Source spectra
  • Temporal stability measurements
  • Any configuration the source is in

Device Under Test

  • Serial number
  • Integration time
  • Modes (e.g., linear)
  • Gain State
  • Dark Noise

Test Conditions

  • Source-Camera distance
  • Alignment information (e.g., location of rotation point with respect to a datum on the DUT)
  • Ambient (room) temperature
  • Pictures of the test setup