News – imatest
http://www.imatest.com
Image Quality Testing Software & Test Charts
Mon, 21 Aug 2017 03:52:34 +0000

Imatest featured as one of three companies changing the autonomous driving landscape
http://www.imatest.com/2017/07/imatest-featured-one-three-companies-changing-autonomous-driving-landscape/
Wed, 12 Jul 2017

Imatest was recently featured in Automoblog.net’s article, Three Companies Changing the Autonomous Driving Landscape. Carl Anthony writes:

“With driverless cars, the implication is huge because cameras will play a vital role in the forthcoming autonomous world. In order for autonomy to deliver on its promises of reducing collisions and traffic fatalities, image quality is essential. Imatest takes this into consideration as today’s automotive trends usher us further into autonomy.”

Read the full article at www.automoblog.net/2017/06/30/three-companies-changing-the-autonomous-driving-landscape/

The post Imatest featured as one of three companies changing the autonomous driving landscape appeared first on imatest.

New Documentation Available for Imatest IT
http://www.imatest.com/2017/06/new-documentation-available-imatest/
Tue, 27 Jun 2017

Imatest is pleased to unveil updated documentation for Imatest IT. The documentation has been updated to include the latest software release (Imatest 4.5), including details for new features, filtering options for preferred languages, and several new Troubleshooting articles.

Users can now filter Imatest IT instructions for their preferred languages and interfaces including C, C++, Python, .NET (C# and Visual Basic), and EXE. In addition, there are now more detailed installation and setup instructions for both Windows and Linux versions.

Imatest IT ships with several example projects in C++, Python, C#, and Visual Basic. You can find them in the samples folder of your IT installation, along with example images of Imatest test charts that can be used for each of IT’s analysis modules.

Related Content

Automating Lab and Manufacturing Processes – [Webinar]


The post New Documentation Available for Imatest IT appeared first on imatest.

Challenges in Automotive Image Quality Testing
http://www.imatest.com/2017/06/challenges-automotive-image-quality-testing/
Mon, 19 Jun 2017

Imatest’s Norman Koren presents the challenges he sees in automotive image quality testing, including:

  • The challenges human observers and machine vision algorithms face in resolving low-contrast objects over a wide range of background brightness
  • How to distinguish low-contrast patches over the full dynamic range of a test chart
  • The use of hyperbolic wedges (as specified in ISO 16505) versus slanted edges for measuring MTF10 in automotive applications
  • Common misunderstandings about low-contrast slanted edges

This video was recorded at AutoSens Detroit 2017, the world’s leading vehicle perception conference. See more at www.auto-sens.com

Related Content

For more information on Imatest’s solutions for testing image quality in the automotive industry, please visit our solutions page. 

Image Quality Testing for the Automotive Industry [Webinar]

Three Companies Changing the Autonomous Driving Landscape [Article]

The post Challenges in Automotive Image Quality Testing appeared first on imatest.

Closing the Loop: Distortion Correction
http://www.imatest.com/2017/04/distortion-correction/
Tue, 25 Apr 2017

Imatest’s charts and software allow you to measure the characteristics and parameters of imaging systems. Quite often these measurements simply indicate the limits of system performance and expected image quality.

But some Imatest results let you improve image quality, and subsequent images taken by the same system, by correcting measured aberrations. No new components need to be purchased; no judgment calls need to be made. All it takes is some math and computation, which you can apply in your own external program or with an Imatest module. This is an aspect of image processing pipeline tuning, which is usually done by a dedicated Image Signal Processing (ISP) chip in a device to transform raw sensor data into an appropriate image.

At Imatest, we informally call this “closing the loop”, because it completes the cycle from the camera-under-test to measurement, back to the camera (in the form of an adjustment). 

Today, we’re going to illustrate how to use radial distortion measurements from Imatest to correct for optical distortion (without buying a new lens).


Radial Geometric Distortion

Geometric distortion, for the purposes of this post, is roughly defined as the warping of shapes in an image compared to how those shapes would look if the camera truly followed a simple pinhole camera model. (Consequently, we’re not talking here about perspective distortion). The most obvious effect of this is that straight lines in the scene become curved lines in the image.

Geometric distortion is not always a bad thing: sometimes curvilinear lenses are chosen on purpose for artistic effect, or a wide-angle lens is used and the distortion is ignored because that is what viewers have come to expect from such images. However, subjective user studies have shown that the average viewer of everyday images has limits on the amount of distortion they are willing to accept before it reduces their perception of image quality.

Characterizing (and correcting for) distortion is also necessary for more technical applications which require precise calibration, such as localization of a point in 3-D space in computer vision or for stitching multiple images together for panoramic or immersive VR applications. 

This geometric distortion is almost always due to lens design, and because of that (and how lenses are constructed), it is typically modeled as being (1) purely radial and (2) radially symmetric. 

Purely radial distortion means that no matter where in the image field we consider a point, the only aspect of that point relevant to determining the distortion it has undergone is its distance from the center of the image. (For the sake of simplicity, we will assume here that the center of the image is the optical center of the system, though in general this needs to be measured in conjunction with, or prior to, radial distortion.) Assuming geometric distortion is radial is extremely helpful in reducing the complexity of the problem of characterizing it, because instead of a 2-dimensional vector field over two dimensions (x- and y-displacement at each pixel location) we only have to determine a 1-dimensional function over one dimension (radial displacement at each radius).

By using the SFRPlus, Checkerboard, or Dot Pattern modules, Imatest can measure radial distortion in a camera system from an image of the appropriate test charts. 


Distortion Coefficients in Imatest

Imatest can return functional descriptions of two different types of radial distortion. Both are described by polynomial approximations of the distortion function, but the two polynomials represent different things. In many cases they are functionally equivalent, and one can convert from one functional form to the other. (For simplicity, we ignore here the tan/arctan approximation Imatest can provide, and note that when it comes to distortion correction it can be applied in the same way, with a change only to the forward mapping step.)

In the rest of this post, we will use the following conventions:

  • \(r_d\) is the distorted radius of a point, i.e. its distance from center in the observed (distorted) image
  • \(r_u\) is the undistorted radius of the point, i.e. the distance from center at which it would have appeared in an undistorted image
  • The function \(r_d = f(r_u)\) is called a forward transformation because it takes an undistorted radius value and converts it to a distorted radius. That is, it applies the distortion of the lens to the point. 
  • The function \(r_u = f^{-1}(r_d)\) is called an inverse transformation because, in contrast to a forward transformation, it undoes the distortion introduced by the lens.
  • \(P(\cdot)\) indicates a polynomial function 

The SFRPlus and Checkerboard modules return the polynomial coefficients that describe the inverse transformation which corrects the distortion, \(r_u = f^{-1}(r_d)\), highlighted in Rescharts below.


The Dot Pattern module returns polynomial coefficients for a different parameterization of radial distortion, known as Local Geometric Distortion (LGD), or sometimes as optical distortion. This is the description of radial distortion used by the ISO 17850 and CPIQ standards documents.

LGD is defined as the radial error relative to the undistorted radius, expressed as a percentage (i.e., multiplied by 100):

\[LGD = 100 \cdot \frac{r_d - r_u}{r_u}\]

By considering LGD to be a polynomial function of the radius in the distorted image, \(P(r_d)\), we can rearrange this equation into a more useful form: a rational-polynomial version of the distortion-correcting inverse transformation. Thus the Dot Pattern results can be used in the same way as the SFRPlus/Checkerboard results (though in the code example below we will be directly replacing this rational polynomial with a regular polynomial fit approximation).

\[r_u = \frac{r_d}{P(r_d)/100 + 1} = f^{-1}(r_d)\]
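The rational form above can be evaluated directly. Here is a minimal Python sketch of that conversion; the LGD polynomial coefficients used here are made up purely for illustration, not taken from any real measurement:

```python
import numpy as np

def undistort_radius(r_d, lgd_coeffs):
    """Undo radial distortion described by an LGD polynomial:
    r_u = r_d / (P(r_d)/100 + 1), where P is in polyval order
    (highest power first) and expressed as a percentage."""
    return r_d / (np.polyval(lgd_coeffs, r_d) / 100.0 + 1.0)

# Hypothetical coefficients: P(r) = 2 r^2, i.e. 2% pincushion-type
# LGD at normalized radius 1
lgd = [2.0, 0.0, 0.0]
r_u = undistort_radius(1.0, lgd)   # slightly smaller than r_d = 1.0
```

Plugging the recovered pair back into the LGD definition returns the original 2%, confirming the rearrangement.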


Distortion Correction by Re-Sampling

The pixel array of an image sensor essentially takes a grid of regularly spaced samples of the light falling on it. However, that pattern of light has already been distorted by the lens, so while the sensor samples the light at regular positions, these are effectively not regular samples of the scene as it appeared before entering the lens. Our computational solution for remedying this can be described as follows:

We create a new undistorted, regularly spaced grid (a new array of pixels). At each of those “virtual sensor” pixel locations we re-sample image data from the observed image at the location where the sensor pixel would have projected in the absence of distortion. So the distorted image is re-sampled by a grid which undergoes the same distortion, but the sampled results are then presented regularly spaced again, effectively undoing the distortion. This is illustrated below.

Each of the intersection points of the upper grid lines represents a pixel location in our generated, undistorted image (the pixel locations in our “virtual sensor”). Obviously, we have reduced the number of “pixels” here to increase legibility. The lower part of the image represents the distorted image with the sampling grid overlaid on it after the grid has been distorted the same way. The regularly spaced array locations above will be populated with data sampled irregularly from the distorted image below, as indicated by the distorted grid intersection locations.

As a further visual aid, the red arrows descend from the grid intersections in the upper image to the corresponding grid intersections in the lower one. These can be contrasted with the ending locations of the blue arrows, which indicate where the pixel samples would be if undistorted. (Obviously, if the pixel sample locations were not distorted, i.e. the blue arrow locations were used, then the output image would be sampled regularly from the distorted image, and would itself remain distorted.)


An Example

The following example of how to do this re-sampling is provided as MATLAB code below. You can also download the code and example images at the bottom of this post. The code is merely one particular implementation, though: the concepts can be extracted and applied in any programming language.

Note that below, we use the convention of suffixes ‘_d’ and ‘_u’ to identify variables related to the distorted and undistorted images/coordinates respectively, and use capitalized variables, such as RHO, to identify matrices of the same size as the test and output images (a property that will be used implicitly below).

(0) Load the image of an SFRPlus chart into Imatest and analyze it to determine the inverse transformation coefficients (shown here measured in the Rescharts interactive module). (Alternatively, load an image into Dot Pattern module and retrieve the LGD coefficients from there and convert them into inverse transformation coefficients, and then follow along with the remaining steps.)  Load these into MATLAB.

inverseCoeffs = [0.2259 0 1 0]; % distortion coefficients reported by SFRPlus
im_d = double(imread('sfrplus_distortion.jpg'));
width = size(im_d, 2);
height = size(im_d, 1);
channels = size(im_d, 3);

(1) Define the spatial coordinates of each of the pixel locations of this observed (distorted) image, relative to the center of the image. For example, since this test image is 4288×2872 pixels, the upper left pixel coordinate is (-2143.5, -1435.5).

xs = ((1:width) - (width+1)/2);
ys = ((1:height) - (height+1)/2);
[X, Y] = meshgrid(xs,ys);

(2) Convert these coordinates to polar form so we can manipulate only the radial components (called RHO here). We also normalize and then scale the radial coordinates so that the center-to-corner distance of the undistorted image will ultimately be normalized to 1. 

[THETA, RHO_d] = cart2pol(X, Y);
normFactor = RHO_d(1, 1); % normalize to corner distance 1 in distorted image
scaleFactor = polyval(inverseCoeffs, 1); % scale so corner distance will be 1 after distortion correction
RHO_d = RHO_d/normFactor*scaleFactor;

(3) NOTE: As a subtle point, the pair of variables THETA and RHO_d actually define spatial coordinates two ways: explicitly and implicitly. They define explicit coordinates in their values, i.e. (THETA(1,1), RHO_d(1,1)) defines the angular and radial coordinate of the upper-left corner pixel of the image. However, they also implicitly define a set of coordinates simply by being 2-D arrays, which have a natural ordering and structure. Even if we change the value of the (1,1) entry of these two arrays, they are both still the upper-left corner entry of each array. The explicit coordinate of the point has changed, but the implicit one has remained the same.

We now apply the measured distortion to the radial coordinates, so that the explicit radial distance matches the radial distance of that point in the observed image. As pointed out above, this distorted location in the observed image is now tied to the undistorted location in the image array via the implicit location in the array. We are using the implicit array element locations as the true coordinates of the undistorted image, and the explicit array values as a map to the point in the distorted image to pull the samples from. 

Note that we don’t actually have the forward transformation polynomial yet, we have the inverse polynomial as returned by Imatest. This can be inverted by fitting a new (inverse of the inverse) polynomial, as in the provided invert_distortion_poly.m file. 

forwardCoeffs = invert_distortion_poly(inverseCoeffs); 
RHO_u = polyval(forwardCoeffs, RHO_d); 
% Convert back to cartesian coordinates to get the (x,y) distorted sample points in image space 
[X_d, Y_d] = pol2cart(THETA, RHO_u*normFactor); 
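The “inverse of the inverse” fit performed by invert_distortion_poly.m can be sketched in other languages too. Here is a rough Python equivalent using NumPy; this illustrates the fitting idea under the same coefficient conventions, and is not the contents of the actual MATLAB file:

```python
import numpy as np

def invert_distortion_poly(inverse_coeffs, degree=3, n_samples=1000):
    """Approximate the forward distortion polynomial by sampling the
    inverse polynomial r_u = P(r_d) over the normalized radius range
    and fitting a new polynomial to the swapped (r_u, r_d) data.
    Coefficients are in polyval order (highest power first)."""
    r_d = np.linspace(0.0, 1.0, n_samples)
    r_u = np.polyval(inverse_coeffs, r_d)
    # Fit r_d as a polynomial function of r_u: the inverse of the inverse
    return np.polyfit(r_u, r_d, degree)

# Using the coefficients reported by SFRplus earlier in this post
forwardCoeffs = invert_distortion_poly([0.2259, 0, 1, 0])
```

Applying the inverse polynomial and then the fitted forward polynomial should recover the original radius to within the fit error.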

(4) We now have X_d, Y_d arrays whose implicit coordinates are those of the undistorted image and whose explicit values indicate the sampling points in the observed image associated with them. We can use these directly as query (sampling) points in the interp2() function.

% Re-sample the image at the corrected points using the interp2 function. Apply to each color
% channel independently, since interp2 only works on 2-d data (hence the name). 
im_u = zeros(height,width,channels); % pre-allocate space in memory
for c = 1:channels
   im_u(:,:,c) = interp2(X, Y, im_d(:,:,c), X_d, Y_d);
end

That’s it! Now we can view the undistorted fruits of our labor. Notice the straightened lines at the top and bottom, in particular. Also note the black areas around the edges of this undistorted image: of course, there was no information in the original image to meaningfully fill them in.


Of course, we can now undistort scenes besides just test chart images. Now that we have used the test chart and Imatest to characterize the distortion caused by the camera system itself, we can undo that distortion in any other image it takes. Since the supposed-to-be-straight lines of architecture are a very common source of noticeable distortion, we demonstrate this on a photo of our office building in Boulder, CO on a day with diffuse lighting (i.e., gloom). 


These example images and a more verbose version of the MATLAB code are available here: distortion_correction_example.zip (5.4 MB)

You can measure the distortion in the images yourself in Imatest, or use the supplied values in the distortion_correct_ex.m file. We hope that this post has helped illustrate how this Imatest measurement can be immediately useful for incorporating into your pipeline to improve your images.

The Imatest Radial Geometry module (added to Imatest 5.0, August 2017)

If you don’t want to code your own, you can add or correct distortion using the Radial Geometry module, which works on single images or batches of images (using settings from the most recent single image run). Full details, including a description of the settings, are in the Radial Geometry instructions.

Here is the input window after reading a distorted image of an SFRplus test chart image.

Radial Geometry opening window.
The image is from a dcraw-converted image (with no distortion-correction).

Parameters may be obtained from Imatest runs: SFRplus is shown in the following example.

SFRplus Setup results for above image, showing distortion calculations

Here is the corrected result.

Corrected image using parameters from SFRplus Setup

The post Closing the Loop: Distortion Correction appeared first on imatest.

The Effects of misregistration on the dead leaves cross-correlation texture blur analysis
http://www.imatest.com/2017/01/the-effects-of-misregistration-on-the-dead-leaves-cross-correlation-texture-blur-analysis/
Thu, 19 Jan 2017

This paper was given as part of the Electronic Imaging 2017 Image Quality and System Performance XIV and Digital Photography and Mobile Imaging XIII sessions.

When: Tuesday, January 31, 2017, at 12:10 pm

By: Robert Sumner with support from Ranga Burada, Noah Kram

Abstract: The dead leaves image model is often used for measurement of the spatial frequency response (SFR) of digital cameras, where response to fine texture is of interest. It has a power spectral density (PSD) similar to natural images and image features of varying sizes, making it useful for measuring the texture-blurring effects of non-linear noise reduction which may not be well analyzed by traditional methods. The standard approach for analyzing images of this model is to compare observed PSDs to the analytically known one. However, recent works have proposed a cross-correlation based approach which promises more robust measurements via full-reference comparison with the known true pattern. A major assumption of this method is that the observed image and reference image can be aligned (registered) with sub-pixel accuracy.

Read Full Paper:

Effects of misregistration on the dead leaves cross-correlation texture blur analysis

The post The Effects of misregistration on the dead leaves cross-correlation texture blur analysis appeared first on imatest.

Measuring MTF with wedges: Pitfalls and best practices
http://www.imatest.com/2017/01/measuring-mtf-with-wedges-pitfalls-and-best-practices/
Thu, 19 Jan 2017

This paper was given as part of the Electronic Imaging 2017 Autonomous Vehicles and Machines session.

When: Monday, January 30, 2017, at 10:10 am

By: Norman Koren with support from Henry Koren, Robert Sumner

Abstract: The ISO 16505 standard for automotive Camera Monitor Systems uses high contrast hyperbolic wedges instead of slanted-edges to measure system resolution, defined as MTF10 (the spatial frequency where MTF = 10% of its low frequency value). Wedges were chosen based on the claim that slanted-edges are sensitive to signal processing. While this is indeed the case, we have found that wedges are also highly sensitive and present a number of measurement challenges: Sub-pixel location variations cause unavoidable inconsistencies; wedge saturation makes results more stable at the expense of accuracy; MTF10 can be boosted by sharpening, noise, and other artifacts, and may never be reached. Poor quality images can exhibit high MTF10. We show that the onset of aliasing is a more stable performance indicator, and we discuss methods of getting the most accurate results from wedges as well as misunderstandings about low contrast slanted-edges, which correlate better with system performance and are more representative of objects of interest in automotive and security imaging.

Full Text: Measuring MTF with Wedges: Pitfalls and best practices

Slides: Wedge_measurements_N_Koren_2017

Imatest upgrades based on the paper

The recommended metric to replace MTF10, min(MTF10, onset of aliasing, Nyquist frequency), is displayed in the Wedge MTF plot as well as the Multi-Wedge plot (new in Imatest 5.0), shown below.

Multi-Wedge plot for eSFR ISO, including recommended metric.
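The recommended metric is simply the minimum of three frequencies. A trivial sketch, assuming the three values have already been measured (all in cycles/pixel, where Nyquist is 0.5):

```python
def wedge_resolution_metric(mtf10, aliasing_onset, nyquist=0.5):
    """Recommended wedge summary metric from the paper:
    min(MTF10, onset of aliasing, Nyquist frequency)."""
    return min(mtf10, aliasing_onset, nyquist)

# An MTF10 value inflated by sharpening or noise cannot exceed
# the frequency where aliasing begins, or Nyquist.
capped = wedge_resolution_metric(0.62, 0.41)
```

This way a poor-quality image that happens to show a high MTF10 no longer scores better than a clean one.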

The post Measuring MTF with wedges: Pitfalls and best practices appeared first on imatest.

Best practices for using transmissive test targets
http://www.imatest.com/2017/01/best-practices-for-using-transmissive-test-targets/
Wed, 18 Jan 2017

Light Source


The most uniform light source available would be an integrating sphere, but these come with considerable size and expense. After considering how the light source uniformity impacts their results, most lab & manufacturing tests use LED or fluorescent lightboxes with flat-panel diffusers. Some customers choose a larger lightbox because that leads to greater uniformity within the central image plane area. We have performed a study of lightbox uniformity to see how different models compare. 

To perform tests with a transmissive chart setup, devices with built-in light sources will need to have that light source disabled.

Chart

The size of chart you need is based on the distance you want to test at. Test distance depends on whether you have a fixed focus or auto focus lens. You may select a single test distance or a range of distances between the minimum focus and hyperfocal distance, which may include macro and long-range resolution tests. 

You will want to determine the size of your imaging plane at your test distance(s). The Imatest Chart Finder utility can assist you in determining this. Unless you are dealing with an extreme wide-angle lens, you would typically want to choose a light source sized to illuminate an area greater than or equal to your imaging plane. Many charts we sell are sized for particular lightbox models, such as SFRplus for GTI lightboxes, or the ultra-high-precision composite Film SFRplus chart.

Masking

You may choose a lightbox that is larger than your imaging plane at the distance you are testing at, or larger than the test chart that you are affixing to it.

Many transmissive test charts have a “base density” which causes even the light areas of the target to block a significant amount of light compared to a naked lightbox. Depending on the chart material, this can range between 0.1 and 0.3 density units. If your test chart doesn’t fill your lightbox, you will be left with significantly brighter areas outside the test chart. This flood of bright light will disrupt your test by causing stray light (flare) that can reduce contrast and affect your results. This is especially true for dynamic range measurements.
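To put those density numbers in perspective, density relates to transmission logarithmically (T = 10^(-D)), so even a small base density blocks a meaningful fraction of the light. A quick check:

```python
def transmittance(density):
    """Fraction of incident light passed by a material of the given
    optical density: T = 10 ** (-D)."""
    return 10.0 ** (-density)

# A base density of 0.1 passes about 79% of the light; 0.3 passes
# about 50%, so the bare lightbox around the chart can be up to
# roughly twice as bright as the chart's light areas.
```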

In some cases where flare is not a serious concern, you may have additional translucent material added to the sides of your test chart so that it fully covers your lightbox. 

In other cases, you may construct or purchase an opaque mask made of matte black acrylic or other opaque cards to block any additional light from reaching your lens.

Mounting

Test charts can be taped to a mask, or taped directly to the lightbox diffuser panel using double-sided tape. Since many common adhesive-backed tapes include acid that may react with the film substrate, an archival tape is recommended.

Inkjet charts should have the glossy side facing the lightbox, with the matte, pigmented surface facing outwards.

Film and chrome on glass charts have a directional nature to them as well. These should be oriented with the chrome or film emulsion layer facing out towards the camera under test. This is typically confirmed by verifying that the chart model number text is not mirrored.

Some customers have their unit under test facing upwards, with the lightbox and chart facing down. In this situation, adhesive tape strips will have difficulty withstanding the weight of the chart and will eventually de-adhere. A sheet of double-sided (possibly optically clear) adhesive between the chart and the lightbox diffuser panel will increase the longevity of such a setup.

Environment

Glossy test charts are prone to specular reflections when there is ambient light within the test environment. With the exception of inkjet produced targets, most high precision and high dynamic range transmissive test targets have glossy surfaces. 

Reflections can disrupt the accuracy of measurements, especially if they occur within critical test zones such as the dark areas of a dynamic range test chart. Dynamic range testing requires an environment where all extraneous light is blocked from the test area. A complete light-proof enclosure made of low-reflectance materials such as dark opaque plastics, black felt, velvet, or anodized aluminum provides a light-free environment that enables accurate measurements.


The post Best practices for using transmissive test targets appeared first on imatest.

Automating Lab and Manufacturing Processes
http://www.imatest.com/2017/01/automating-lab-and-manufacturing-processes/
Tue, 17 Jan 2017

For many production departments tasked with balancing manufacturing outcomes against product design requirements, the process can be cumbersome, slowed in the past by manual quality control and the need to correlate results with partner tests.

Integrating image quality software with automated testing equipment in the lab and manufacturing enables maximized efficiency and enhanced quality standards throughout the development cycle.

Join us on Wednesday, March 1, 2017, for our webinar, “Automating Lab and Manufacturing Processes” to learn how to raise quality standards while maximizing efficiency.

The post Automating Lab and Manufacturing Processes appeared first on imatest.

Testing a macro lens using Checkerboard and Micro Multi-slide
http://www.imatest.com/2016/12/testing-macro-lens-using-checkerboard-micro-multi-slide/
Tue, 13 Dec 2016

The post Testing a macro lens using Checkerboard and Micro Multi-slide appeared first on imatest.

]]>
Testing a 1-5x Macro Canon MP‑E 65mm Lens

Imatest’s Checkerboard module is our new flagship module for automated analysis of sharpness, distortion, and chromatic aberration from a checkerboard (AKA chessboard) pattern. The big benefit of using the checkerboard is that its framing requirements are looser than those of other kinds of test targets. While the checkerboard lacks the color and tone analysis provided by SFRplus and eSFR ISO, those features cannot be produced on the high-precision chrome-on-glass substrate anyway, so the checkerboard is the optimal pattern for this test.

The Imatest Micro Multi-Slide contains high-precision checkerboard patterns at many different frequency scales, so a single target can effectively test a wide range of magnifications.

We obtained the cycles per object mm values below by dividing the LW/PH measured from each image by 2 × the image height in mm (two line widths make one cycle).

Magnification | Image height | Center MTF50             | Best aperture
1x            | 23.71 mm     | 29.61 cycles / object mm | f/5.6
2x            | 11.86 mm     | 57.04 cycles / object mm | f/5.6
3x            | 7.90 mm      | 74.55 cycles / object mm | f/5.6
4x            | 6.07 mm      | 81.52 cycles / object mm | f/4
5x            | 4.74 mm      | 87.61 cycles / object mm | f/4
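The LW/PH-to-cycles conversion described above can be sketched in a few lines. The LW/PH input below is illustrative, back-computed from the 1x row of the table, not a published measurement:

```python
# Convert MTF50 from line widths per picture height (LW/PH) to cycles per
# object mm: two line widths make one cycle, and dividing by the imaged
# object height expresses the frequency in object-space millimeters.
def lwph_to_cycles_per_mm(lw_ph, image_height_mm):
    return lw_ph / (2.0 * image_height_mm)

# Illustrative: an LW/PH of about 1404 at 1x (23.71 mm image height)
# lands on the ~29.6 cycles / object mm shown in the table.
mtf50_1x = lwph_to_cycles_per_mm(1404, 23.71)
```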


Thanks to this study, we now know to select the f/4 aperture at 5x magnification to maximize the resolving power of this lens. Detailed results, capture and analysis procedures are available below.


1x Magnification

This report is shown in line widths per picture height (LW/PH):

1x

1x-3d


2x Magnification

2x

2x-3d


3x Magnification

3x

3x-3d


4x Magnification

4x

4x-3d


5x Magnification

At the limit of this lens’s magnification, we obtain the most spatial detail on the target at f/4.

5x

5x-3d


Capture Procedure

We used the following camera setup to take high-precision photographs of the test targets on an LED lightbox:

micromulti_setup_scaled

We mounted the camera on an adjustable arm and used a Manfrotto 454 Micro Positioning plate for adjusting distance and focus. We masked off the extra parts of the lightbox using opaque material to prevent additional stray light from increasing flare.

We used the following framing for the various magnifications:

micromulti_site_blog_customized

To get good sharpness measurements, we selected an area of the slide with fewer than 10 vertical squares in the image. This yields reasonably large SFR regions for the most reliable calculations and keeps the number of regions manageable.

We centered the chart on the 3-dot mark in the center of the frequency zone. For optimal slanted-edge analysis, we rotated the chart by about 5° according to the ISO 12233 standard. Our current checkerboard routines can detect the checkerboard automatically, but they require complete rows and columns before including a set of regions. This means rotation can make framing a little more difficult: if a corner square of a complete row or column gets clipped off, the automatic region detection will skip regions you may have wanted to test, leaving fewer usable regions around the periphery of your image. We will be improving these selection routines in future releases of our software.

If we wanted more detail about lateral chromatic aberration and distortion (which are very low for this lens) we would have analyzed the dot pattern regions of the chart.

We used 5-megapixel downsampled JPEGs from the camera to perform this analysis, which gives us the following table of image sizes:

Magnification | Image height | Image pixel size
1x            | 23.71 mm     | 13 µm
2x            | 11.86 mm     | 6.5 µm
3x            | 7.90 mm      | 4.33 µm
3.9x          | 6.07 mm      | 3.33 µm
5x            | 4.74 mm      | 2.6 µm
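The pixel sizes above follow directly from the imaged height: object-space pixel pitch is the image height divided by the frame height in pixels. A minimal sketch, assuming the 5-megapixel downsampled frame is roughly 2736 × 1824 pixels (the exact dimensions are an assumption, not stated above):

```python
# Object-space pixel size (µm) = imaged height (mm) / frame height (px) * 1000.
# 1824 px is an assumed height for a 5-megapixel 3:2 downsampled frame.
FRAME_HEIGHT_PX = 1824

def pixel_size_um(image_height_mm, frame_height_px=FRAME_HEIGHT_PX):
    return image_height_mm / frame_height_px * 1000.0

size_1x = pixel_size_um(23.71)  # ~13 µm, matching the 1x row
size_5x = pixel_size_um(4.74)   # ~2.6 µm, matching the 5x row
```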

Analysis Procedure

We performed our Checkerboard Setup and selected all regions:

setup

After initially determining our range of expected MTF values, we disabled auto-scaling on our 3D plots and set the scale to cover the full range:

micromultiscaling

We used Imatest Batchview to coalesce the large volume of tests in order to produce the above bar graphs. We also used ImageMagick to assemble the nifty animated GIFs from the collections of 3D plots:

convert -delay 100 mag1/Results/*3D.png 1x-3D.gif

We hope that you find this write-up helpful in testing your own equipment. You can purchase the Micro Multi-Slide on our store, or contact charts@imatest.com to customize a test target that fits your unique requirements. See our Macro Solutions for other close-range testing items.

The post Testing a macro lens using Checkerboard and Micro Multi-slide appeared first on imatest.

]]>
http://www.imatest.com/2016/12/testing-macro-lens-using-checkerboard-micro-multi-slide/feed/ 0
Using Sharpness to Measure Your Autofocus Consistency http://www.imatest.com/2016/11/using-sharpness-to-measure-your-autofocus-consistency/ http://www.imatest.com/2016/11/using-sharpness-to-measure-your-autofocus-consistency/#respond Tue, 15 Nov 2016 16:57:50 +0000 http://www.imatest.com/?p=17253 By Ranga Burada Autofocus plays a major role in many camera system applications with variable focus, including consumer electronic devices. Camera systems must be able to focus at a variety of distances. Optical systems on cameras only allow a certain range of distances from the camera to be in focus at once (this is often known […]

The post Using Sharpness to Measure Your Autofocus Consistency appeared first on imatest.

]]>
By Ranga Burada

Autofocus plays a major role in many camera system applications with variable focus, including consumer electronic devices. Camera systems must be able to focus at a variety of distances. Optical systems on cameras only allow a certain range of distances from the camera to be in focus at once (this is often known as the depth of field, or depth of focus). The distance from the camera where objects will be most in focus, effectively the center of this range, is the focus distance; the role of the autofocus system in a camera is to set this point accurately every time.

We refer to autofocus consistency as the ability of a camera to focus on a given point correctly, repeatedly. To determine if a point is in focus, we measure the sharpness of an object (specifically, a test chart) at that distance. By taking many images of the chart, letting the autofocus system reset each time and try to focus on the chart anew, we can tell whether the camera system is focusing consistently. By examining the MTF50 values calculated from these images (a common objective image quality metric which correlates well with perceived sharpness), we can tell if sharpness varied between captures, and thus if focus accuracy on the chart varied.

Imatest Autofocus Consistency Module

The Imatest Autofocus Consistency module analyzes the sharpness (specifically MTF50) results from a set of images captured at a fixed distance from an Imatest sharpness chart, such as the SFRplus chart, eSFR ISO chart, or the new AutoFocus chart. The user generates MTF50 values from these images using the SFR, SFRplus, or eSFR ISO modules in Imatest; the Autofocus Consistency module is a post-processor that runs on the outputs of these analyses and consolidates them into a more useful form. You can find a more detailed description of the test procedure here.


MTF50 Value


In the above plot, each x-axis position indicates the distance from chart to camera. The colored data markers spread vertically at each position indicate the MTF50 values calculated from the images captured at that chart distance. The consistency of the autofocus system at a given distance is indicated by the tightness of that spread: the narrower the spread, the more consistent the autofocus system. To determine whether the system’s consistency depends on distance (perhaps it has an easy time focusing on nearby points, but tends to fail for faraway ones), this analysis is repeated at many test distances, as in the plot above.
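One simple way to put a number on that spread is the coefficient of variation (standard deviation divided by mean) of the MTF50 values at each distance. A minimal sketch with illustrative values, not measured data:

```python
# Summarize autofocus consistency from repeated MTF50 measurements taken at
# one chart distance. The sample values below are illustrative, not real data.
import statistics

def af_consistency(mtf50_values):
    """Return (mean, standard deviation, coefficient of variation)."""
    mean = statistics.mean(mtf50_values)
    stdev = statistics.stdev(mtf50_values)
    return mean, stdev, stdev / mean

# Five captures at one distance, MTF50 in cycles/pixel (illustrative):
runs = [0.251, 0.248, 0.255, 0.249, 0.252]
mean, stdev, cv = af_consistency(runs)  # a small cv means consistent focus
```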

To learn more about maintaining consistency while measuring sharpness with MTF values, visit Increasing the Repeatability of Your Sharpness Tests.

The post Using Sharpness to Measure Your Autofocus Consistency appeared first on imatest.

]]>
http://www.imatest.com/2016/11/using-sharpness-to-measure-your-autofocus-consistency/feed/ 0