Imatest attended the P2020 meeting on May 13 and 14, 2019, in Ann Arbor, Michigan. Paul Romanczyk, PhD, Senior Imaging Scientist, and Rob Sumner, Lead Engineer, represented Imatest. Paul co-led the discussion on Color Separation within the Image Quality for Machine Vision subgroup.
by Henry Koren, inspired by Paul Romanczyk, edited by Norman Koren
Not all MTF measurement systems will necessarily provide the same results. The quality of the test target can impact the measurements you obtain. Long-distance tests are ideally performed at the hyperfocal distance, where there is enough depth of field to have acceptable focus at infinity. Long-range tests that exceed the available space within your lab or factory are the most challenging. A collimator or relay lens system can be used to produce virtual targets at a simulated distance. The measurements obtained through a collimator can diverge from measurements obtained from free-space test targets. This article will discuss how to cope with that divergence.
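As a quick back-of-the-envelope check before setting up a long-distance test, the hyperfocal distance can be estimated from standard optics; the lens values below are illustrative assumptions, not tied to any particular system.

```python
def hyperfocal_distance_mm(focal_length_mm, f_number, coc_mm):
    """Hyperfocal distance H = f^2 / (N * c) + f, in millimeters.

    Focusing at H keeps everything from H/2 to infinity in
    acceptable focus -- the ideal setup for long-range testing.
    """
    return focal_length_mm ** 2 / (f_number * coc_mm) + focal_length_mm

# Example: a 50 mm lens at f/8 with a 0.03 mm circle of confusion
H = hyperfocal_distance_mm(50, 8, 0.03)
print(f"Hyperfocal distance: {H / 1000:.1f} m")  # ~10.5 m
```

If your lab is shorter than H, that is exactly the situation where a collimator or relay lens becomes attractive.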
Imatest sent two engineers to the IEEE P2020 Automotive Imaging Quality face-to-face meeting in Düsseldorf, Germany, in February 2019. The IEEE P2020 standard, which is still in development, aims to define KPIs and test procedures to address the many challenges relevant (and often unique) to automotive imaging.
The group hopes to publish the standard in the year 2020 (no relation to the standard’s identifying number) as a unifying guideline for developers of autonomous driving systems. Most driver-assist and autonomous-driving systems rely heavily on cameras despite the significant press other imaging modalities, such as LIDAR, have received. As Level 3, 4, and 5 autonomous vehicles start to arrive on roads in 2020 and beyond, it is vital that these systems have an agreed-upon set of rigorous performance metrics. (more…)
Recent growth in the automotive and security industries has increased the number of cameras designed for viewing both Near Infrared (NIR) and visible wavelengths of light. NIR illumination is invisible to the human eye and can light a dark scene without being visible or annoying. Because silicon sensors are sensitive out to 1100 nm — well beyond NIR wavelengths of interest — existing consumer systems require only a small modification, the removal of the infrared filter, to allow them to image NIR. This makes dual-band security cameras cost-effective and attractive to the industry. (more…)
Learn more about the development of standards for automotive camera systems.
The IEEE-SA P2020 is a working group for automotive imaging standards. Its goal is to define a set of standards that resolves the current ambiguity in the measurement of image quality in automotive imaging systems. Generally, today’s image evaluation approaches do not adequately address the unique needs of either human- or computer-vision-based automotive applications; therefore, IEEE-SA P2020 is working with people in the field, understanding the gaps in current standards, and creating a coherent set of key performance indicators by which automotive camera systems can be consistently evaluated. (more…)
By Robert Sumner
With contributions from Ranga Burada, Henry Koren, Brienna Rogers and Norman Koren
Consistency is a fundamental aspect of successful image quality testing. Each component in your system may contribute to variation in test results. For tasks such as pass/fail testing, the primary goal is to identify the variation due to the component and ignore the variation due to noise. Being able to accurately replicate test results with variability limited to 1-5% will give you a more accurate description of how your product will perform.
Since Imatest makes measurements directly from the image pixels, any source that adds noise to the image can affect measurements. A primary source of noise in images is electronic sensor noise. Photon shot noise also contributes significantly in low-light situations. Other systemic sources of measurement variability, such as autofocus hysteresis, will not be addressed in this post.
Here are 5 tips to limit noise in your test results: (more…)
It is important to test your camera system in environments which reproduce lighting conditions similar to where you intend to use the camera in the real world. Failure to test a camera under low light conditions may lead to overstating the camera’s performance.
Image sensors collect light (signal) into pixel wells, then convert the resulting analog voltage levels into digital numbers. Dark current arises when thermal activity releases electrons that are indistinguishable from electrons released via photoresponse. The dark current present in uncooled image sensors leads to dark noise. At low light levels, exposures are lengthened to gather more light, which gives dark-current electrons more time to accumulate. As a result, dark noise and read noise represent a larger portion of the overall response, which reduces the signal-to-noise ratio (SNR). Low-light conditions are most challenging for higher-resolution sensors with small pixel pitches, where the particle and wave nature of light can seriously impact the performance of your camera. (more…)
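The interplay of signal, dark current, and read noise can be sketched with a simplified sensor noise model. This is a textbook approximation, not a model of any specific sensor, and the electron counts below are illustrative assumptions.

```python
import math

def snr_db(signal_e, dark_current_e_per_s, exposure_s, read_noise_e):
    """SNR for a simplified sensor model (all quantities in electrons).

    Photon shot noise and dark-current noise are Poisson-distributed
    (variance equals the mean); read noise adds in quadrature.
    """
    dark_e = dark_current_e_per_s * exposure_s
    noise = math.sqrt(signal_e + dark_e + read_noise_e ** 2)
    return 20 * math.log10(signal_e / noise)

# Bright scene, short exposure: dark current barely matters
print(snr_db(10000, 5, 0.01, 3))   # ~40 dB
# Low light, long exposure: dark and read noise take a larger share
print(snr_db(100, 5, 2.0, 3))      # ~19 dB
```

Note how the long exposure lets dark-current electrons accumulate, dragging the low-light SNR well below the bright-scene figure.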
In this post, we will be using the Contrast Resolution Chart and Imatest Master to measure the dynamic range of a Google Pixel 2 XL. The dynamic range of a camera is the reproducible tonal range in an imaging system. Put simply, it is the range between the darkest black and the brightest white of an image and is typically measured in decibels (dB). It is an important image quality factor in many applications, from machine vision to mobile cameras and, more recently, automotive camera systems. In this use case, we will be evaluating both the HDR+ (default) and HDR off modes of the Pixel 2 XL; however, the procedure can be used to test any camera system’s dynamic range. You can read more details about Imatest’s dynamic range test solutions.
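Converting a tonal range to decibels is a one-line calculation; the sample contrast ratio below is illustrative.

```python
import math

def dynamic_range_db(max_level, min_level):
    """Dynamic range in dB between the brightest reproducible white
    and the darkest distinguishable black (20*log10 convention,
    as commonly used for image sensors)."""
    return 20 * math.log10(max_level / min_level)

# A scene spanning a 10,000:1 contrast ratio corresponds to 80 dB
print(dynamic_range_db(10000, 1))  # 80.0
```

Each factor of 10 in contrast adds 20 dB, which is why HDR sensor ratings climb so quickly past 100 dB.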
The obsolete ISO 12233:2000 standard defines a resolution test target with a high contrast ratio. These targets are typically produced at the maximum dynamic range of a printer, which can be anywhere from 40:1 to 80:1. The high contrast can lead to clipping of the signal, which produces overstated, invalid MTF values.
Some camera manufacturers who want better MTF results may take advantage of this anomaly to overstate the quality of the cameras they produce. This is why it is critical to validate cameras with a proper measurement system that includes a low-contrast target. (more…)
The dynamic range of recent HDR image sensors, defined as the range of exposure between saturation and 0 dB SNR, can be extremely high: 120 dB or more. But the dynamic range of real imaging systems is limited by veiling glare (flare light), arising from reflections inside the lens, and hence rarely approaches this level. Veiling glare measurements, such as ISO 18844, made with black cavities on white fields, result in large numbers that are difficult to relate to dynamic range. Camera dynamic range is typically measured from grayscale charts, where veiling glare depends on the design and layout of the chart, leading to inconsistent results. (more…)
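To see why veiling glare caps real-world dynamic range, consider a simple model in which flare adds a uniform pedestal proportional to the maximum scene level; the darkest patch can never fall below that floor. This is a sketch under that assumption, with an illustrative glare fraction.

```python
import math

def effective_dr_db(sensor_dr_db, glare_fraction):
    """Effective dynamic range when veiling glare adds a uniform
    pedestal equal to glare_fraction of the maximum scene level.

    The usable range is capped at roughly 1 / glare_fraction,
    regardless of how capable the sensor is.
    """
    glare_limit_db = 20 * math.log10(1 / glare_fraction)
    return min(sensor_dr_db, glare_limit_db)

# A 120 dB HDR sensor behind a lens with 0.1% veiling glare
print(effective_dr_db(120, 0.001))  # capped at 60 dB
```

Even a very clean lens with 0.1% glare halves the rated 120 dB on this model, which is why chart design and layout matter so much for grayscale-chart measurements.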
As of Imatest 5.0, Imatest Master now features image acquisition capabilities. Previously, image acquisition capabilities were supported by Imatest IS, which has been discontinued as a separate product. This provides all of our customers with access to the acquisition library. The library supports direct acquisition from a wide range of frame grabbers and cameras, as well as industry standard interfaces. Direct image acquisition cuts out several steps in the image quality testing process and allows for in-the-loop testing with Imatest.
As predicted by astronomers years in advance, a peculiar cosmic event will occur on the morning of August 21st. Passing directly in front of the sun, the moon will cast a shadow racing across North America at supersonic speeds. From Salem, Oregon to Charleston, South Carolina, the 70 mile wide shadow will darken everything in its path. Outside the path of totality, all of North America will still be able to observe a partial solar eclipse!
If you can manage it, making your way into the path of totality will be an amazing experience. A little preparation goes a long way! The talented developers at Vox have created this interactive map to illustrate the magnitude of the eclipse from your location. It will inform you of the shortest distance to the path of totality!
If you’d like to capture an image of the solar corona like the one below from the National Parks Service, you must be in the path of totality. We would recommend following this shooting guide from the American Astronomical Society. Some general tips include: using a telephoto lens, setting focus manually, and capturing with optimal exposure settings. This exposure calculator by Xavier M. Jubier is a good place to start. Solar filters should be used for partial-eclipse stages, and the sun offers nearly 12 hours a day for you to practice finding good camera settings! This image is a composite of several exposures and involves hours of post-processing on a computer.
Important: If you are observing the sun on ANY day, protect your eyes by wearing a pair of ISO 12312-2 compliant glasses. Viewing the sun through non-compliant glasses can cause significant eye damage.
Imatest’s Norman Koren presents his vision for challenges in automotive image quality testing including resolving low-contrast scenarios.
Imatest’s charts and software allow you to measure the characteristics and parameters of imaging systems. Quite often these measurements simply indicate the limits of system performance and expected image quality.
But some Imatest results let you improve image quality — and subsequent images taken by the same system — by correcting measured aberrations. No new components need to be purchased; no judgment calls need to be made. All it takes is some math and computation, which you can apply in your own external program or with an Imatest module. This is an aspect of image processing pipeline tuning, which is usually done by a dedicated Image Signal Processing (ISP) chip in a device to transform raw sensor data into an appropriate image.
At Imatest, we informally call this “closing the loop”, because it completes the cycle from the camera-under-test to measurement, back to the camera (in the form of an adjustment).
Today, we’re going to illustrate how to use radial distortion measurements from Imatest to correct for optical distortion (without buying a new lens). (more…)
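As a flavor of what such a correction looks like, here is a minimal sketch using a third-order radial model — not Imatest's actual correction pipeline. The coefficient `k1`, the optical center, and the sample points are all hypothetical stand-ins for measured values.

```python
import numpy as np

def undistort_points(points, k1, center):
    """First-order correction of radial distortion:
    r_u = r_d * (1 + k1 * r_d^2), applied about the optical center.

    points : (N, 2) array of distorted pixel coordinates
    k1     : radial distortion coefficient (hypothetical here;
             in practice taken from a distortion measurement)
    center : (cx, cy) optical center in pixels
    """
    p = np.asarray(points, dtype=float) - center
    r2 = np.sum(p ** 2, axis=1, keepdims=True)  # squared radius per point
    return p * (1 + k1 * r2) + center

# Hypothetical example: mild pincushion correction (negative k1)
pts = [[100, 100], [640, 360]]
print(undistort_points(pts, k1=-1e-7, center=(640, 360)))
```

Points at the optical center are untouched, while corner points move the most — exactly the signature of radial distortion.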
This paper was given as part of the Electronic Imaging 2017 Image Quality and System Performance XIV and Digital Photography and Mobile Imaging XIII sessions.
When: Tuesday, January 31, 2017, at 12:10 pm
By: Robert Sumner with support from Ranga Burada, Noah Kram
Abstract: The dead leaves image model is often used for measurement of the spatial frequency response (SFR) of digital cameras, where response to fine texture is of interest. It has a power spectral density (PSD) similar to natural images and image features of varying sizes, making it useful for measuring the texture-blurring effects of non-linear noise reduction which may not be well analyzed by traditional methods. The standard approach for analyzing images of this model is to compare observed PSDs to the analytically known one. However, recent works have proposed a cross-correlation based approach which promises more robust measurements via full-reference comparison with the known true pattern. A major assumption of this method is that the observed image and reference image can be aligned (registered) with sub-pixel accuracy.
Read Full Paper:
This paper was given as part of the Electronic Imaging 2017 Autonomous Vehicles and Machine session.
When: Monday, January 30, 2017, at 10:10 am
By: Norman Koren with support from Henry Koren, Robert Sumner
Abstract: The ISO 16505 standard for automotive Camera Monitor Systems uses high contrast hyperbolic wedges instead of slanted-edges to measure system resolution, defined as MTF10 (the spatial frequency where MTF = 10% of its low frequency value). Wedges were chosen based on the claim that slanted-edges are sensitive to signal processing. While this is indeed the case, we have found that wedges are also highly sensitive and present a number of measurement challenges: Sub-pixel location variations cause unavoidable inconsistencies; wedge saturation makes results more stable at the expense of accuracy; MTF10 can be boosted by sharpening, noise, and other artifacts, and may never be reached. Poor quality images can exhibit high MTF10. We show that the onset of aliasing is a more stable performance indicator, and we discuss methods of getting the most accurate results from wedges as well as misunderstandings about low contrast slanted-edges, which correlate better with system performance and are more representative of objects of interest in automotive and security imaging.
Imatest upgrades based on the paper
The recommended metric to replace MTF10 (min(MTF10, onset of aliasing, Nyquist frequency)) is displayed in the Wedge MTF plot as well as the (new in Imatest 5.0) Multi-Wedge plot, shown below.
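The recommended summary metric is simple enough to state as a one-liner; the frequencies below (in cycles/pixel) are illustrative.

```python
def wedge_resolution_metric(mtf10, aliasing_onset, nyquist):
    """Recommended wedge metric from the paper:
    min(MTF10, onset of aliasing, Nyquist frequency),
    all expressed in the same units (e.g. cycles/pixel)."""
    return min(mtf10, aliasing_onset, nyquist)

# MTF10 pushed past Nyquist by sharpening; the aliasing onset governs
print(wedge_resolution_metric(0.62, 0.41, 0.50))  # 0.41
```

The min() ensures that an MTF10 inflated by sharpening or noise can never report better resolution than the onset of aliasing or the sensor's Nyquist limit.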
Imatest’s Checkerboard module is our new flagship module for automated analysis of sharpness, distortion, and chromatic aberration from a checkerboard (AKA chessboard) pattern. The big benefit of using the checkerboard is that it has looser framing requirements than other kinds of test targets. While the checkerboard lacks the color and tone analysis provided by SFRplus and eSFR ISO, those features cannot be produced on a high-precision chrome-on-glass substrate anyway, so the checkerboard is the optimal pattern for this test.
By Ranga Burada
Autofocus plays a major role in many camera system applications with variable focus, including consumer electronic devices. Camera systems must be able to focus at a variety of distances. Optical systems on cameras only allow a certain range of distances from the camera to be in focus at once (this is often known as the depth of field, or depth of focus). The distance from the camera where objects will be most in focus, effectively the center of this range, is the focus distance—the role of the autofocus system in a camera is to set this point accurately every time.
We refer to autofocus consistency as the ability of a camera to focus on a given point correctly, repeatedly. To determine if a point is in focus, we measure the sharpness of an object (specifically, a test chart) at that distance. By taking many images of the chart—and letting the autofocus system reset each time and try to focus on the chart anew—we can tell if the camera system is focusing consistently or not. By examining the MTF50 values calculated from these images—a common objective image quality metric which correlates well with perceived sharpness—we can tell if sharpness varied between captures, and thus if focus accuracy on the chart varied. (more…)
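Once MTF50 has been measured for each capture, quantifying consistency is straightforward descriptive statistics. The readings below are hypothetical values chosen to show how a single missed focus stands out.

```python
import statistics

def autofocus_consistency(mtf50_values):
    """Summarize MTF50 across repeated autofocus captures.

    A low coefficient of variation (CV) indicates the autofocus
    system is landing on the same focus position each time.
    """
    mean = statistics.mean(mtf50_values)
    stdev = statistics.stdev(mtf50_values)  # sample standard deviation
    return {"mean": mean, "stdev": stdev, "cv_percent": 100 * stdev / mean}

# Hypothetical MTF50 readings (cycles/pixel) from 8 refocus cycles;
# the 0.198 capture is a missed focus that inflates the CV
readings = [0.245, 0.251, 0.248, 0.198, 0.250, 0.247, 0.249, 0.246]
print(autofocus_consistency(readings))
```

Dropping the outlier would bring the CV under 1%, so a large gap between with- and without-outlier statistics is itself a sign of intermittent focus failures rather than general blur.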
What is CPIQ?
IEEE-SA working group P1858 created the CPIQ standard. CPIQ seeks to standardize image quality test metrics and methodologies across the mobile device industry, correlate objective test results with human perception, and combine this data into a meaningful consumer rating system.
CPIQ serves as a way to assess and communicate image quality to the vast majority of consumers who are unsure how to judge and compare device image quality.