
Introduction: The Paradigm Shift from Single-Point to Full-Field Metrology#
Under the rapid iteration of display technology—from LCD to OLED, from traditional dashboards to smart cockpits—the demands on optical metrology have fundamentally changed. In the past, a Spot Luminance Meter was sufficient for sampling tests of screen brightness. Today, when a display panel contains millions of independent light-emitting pixels, each a potential source of luminance or chromaticity non-uniformity (Mura), a point-by-point measurement strategy is no longer physically feasible.
This shift has made the Imaging Colorimeter the core tool of display metrology. This article systematically examines the essential differences between imaging colorimeters and spot luminance meters along three dimensions: measurement principle, technical architecture, and application scenarios, and explains why two-dimensional, spatially resolved optical measurement has become an irreplaceable standard in modern display quality control.
Principles and Inherent Limitations of Spot Measurement#
Basic Working Principle#

The core of a spot luminance meter is a single photodetector, usually a silicon photodiode. During measurement, the instrument converges light from a specific area (measurement spot) on the measured surface onto the detector through an optical system. The detector outputs an electrical signal proportional to the incident luminous flux. After amplification and analog-to-digital conversion, combined with the instrument’s calibration coefficient, this signal outputs the luminance value (unit: cd/m²) or chromaticity coordinate values of that measurement point.
The size of the measurement spot of a spot luminance meter is determined by both the optical magnification of the instrument and the measurement distance, typically ranging from a few millimeters to tens of millimeters in diameter. High-end spot colorimeters (such as the Konica Minolta CS-150/CS-160 series) can achieve different measurement spot sizes by changing the objective lens, but no matter how it is adjusted, each measurement always acquires photometric/chromatic data for a single spatial point.
Why Spot Measurement is No Longer Sufficient#
Spot measurement faces fundamental challenges in the following scenarios:
Efficiency bottleneck in spatial uniformity assessment. Uniformity inspection of modern display panels requires evaluating the spatial distribution of luminance and chromaticity across the entire active area. Taking a 15.6-inch automotive display as an example, if a uniformity scan is performed at a 2mm interval, approximately 25,000 points need to be measured. Even if each point takes only 1 second to measure, completing one panel would take nearly 7 hours—this is completely unacceptable in production environments where the Takt Time is usually measured in seconds.
Risk of missing local defects. Spot measurement is essentially spatial sampling, and its sampling density is far lower than the resolution capability of the human eye. A bright spot defect with a diameter of 0.5mm or a line defect with a width of only tens of microns is highly likely to fall into the “blind spot” between measurement spots. This means that even if the data for all measurement points meets specifications, the panel may still have visual defects perceivable by the user.
Lack of spatial correlation information. Many display quality metrics—such as Mura (spatial patterns of luminance non-uniformity), pixel-level luminance deviation, and view-angle-dependent color shift—are inherently features of two-dimensional spatial distribution. Spot measurement cannot provide spatial correlation information between adjacent areas, and therefore cannot calculate these critical quality metrics.
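The throughput arithmetic behind the first point can be checked with a short sketch. The helper names and the 345 mm × 194 mm active-area figures are illustrative assumptions (the exact point count depends on the panel dimensions and margins assumed); the time estimate uses the text's figure of roughly 25,000 points at one second each:

```python
import math

def scan_time_hours(points, secs_per_point):
    """Total wall-clock time for a point-by-point raster scan."""
    return points * secs_per_point / 3600.0

def grid_points(width_mm, height_mm, step_mm):
    """Number of measurement spots on a regular grid covering the active area."""
    return (math.floor(width_mm / step_mm) + 1) * (math.floor(height_mm / step_mm) + 1)

# Illustrative 15.6-inch 16:9 active area on a 2 mm grid
n = grid_points(345, 194, 2)

# With ~25,000 points at 1 s each: just under 7 hours of scanning
print(f"{scan_time_hours(25_000, 1.0):.1f} h")
```

At any realistic takt time, the conclusion is the same regardless of the exact grid assumptions: point-by-point scanning is orders of magnitude too slow for in-line uniformity inspection.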
Technical Architecture of Imaging Colorimeters#
Area Sensors: The Leap from Point to Plane#

The core component of an imaging colorimeter is an area image sensor, usually a scientific-grade CCD (Charge-Coupled Device) or a high-performance CMOS sensor. Unlike the single detector of a spot instrument, an area sensor contains millions to tens of millions of independent photosensitive units (photosites/pixels), each of which is an independent photodetector.
When an imaging colorimeter is aimed at a display panel, the lens images the two-dimensional luminance/chromaticity distribution of the panel onto the sensor plane. Each pixel on the sensor independently records the incident light signal from its corresponding panel area. Therefore, a single exposure can simultaneously acquire photometric data from millions of spatial sampling points—this is the most fundamental difference between imaging measurement and spot measurement.
For an imaging colorimeter equipped with a 20-megapixel sensor, a single shot acquires as many spatial sampling points as a spot luminance meter, measuring one point per second, could collect in more than seven months of uninterrupted operation.
Optical System and Spatial Resolution#
The optical system of an imaging colorimeter typically uses high-quality industrial lenses or customized measurement lenses. The choice of lens directly affects two critical metrological parameters:
Spatial Resolution. This refers to the smallest spatial detail size that the system can distinguish, usually quantitatively described by the Modulation Transfer Function (MTF). High spatial resolution means the system can accurately measure adjacent, extremely small display elements (such as individual pixels or sub-pixels) without signal “crosstalk” between adjacent elements due to optical blurring. In Mini-LED and Micro-LED inspection, spatial resolution is the primary parameter determining measurement effectiveness.
Field of View (FOV). This refers to the area of the measured panel that the system can cover in a single shot. The field of view is determined by both the lens focal length and the working distance. In production line applications, it is usually necessary to cover the entire panel within the field of view while maintaining sufficient spatial resolution—a fundamental conflict between these two needs that requires balanced optimization of sensor pixel count, lens design, and measurement distance.
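The trade-off can be made concrete with a one-line calculation (the sensor resolution and panel dimensions below are illustrative assumptions, not values from the text): if the field of view just spans the panel, the sampling interval at the panel plane is simply the panel size divided by the sensor pixel count, and that interval must stay comfortably below the smallest feature to be resolved.

```python
def panel_sampling_mm(sensor_px_h, sensor_px_v, fov_w_mm, fov_h_mm):
    """Spatial sampling interval at the panel plane when the FOV spans the sensor."""
    return fov_w_mm / sensor_px_h, fov_h_mm / sensor_px_v

# Hypothetical 20 MP sensor (5472 x 3648 px) imaging a 345 mm x 194 mm panel
dx, dy = panel_sampling_mm(5472, 3648, 345, 194)
print(f"~{dx * 1000:.0f} um x {dy * 1000:.0f} um of panel per sensor pixel")
```

With these assumed numbers the system samples the panel at roughly 50–65 µm per sensor pixel: ample for panel-level uniformity, but motivating a narrower FOV (or higher pixel count) for sub-pixel-level Micro-LED work.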
Three-Filter Architecture: The Hardware Foundation for Metrology-Grade Chromaticity Measurement#
In high-precision imaging colorimeters, chromaticity measurement typically adopts an architecture of a Filter Wheel + Monochrome Sensor, rather than the Bayer color filter array common in consumer cameras.

The workflow is as follows: The system is equipped with three (or more) precisely designed and manufactured optical interference filters on a filter wheel. The spectral transmittance curve of each filter is specifically designed so that its product with the sensor’s quantum efficiency is as close as possible to the CIE 1931 standard observer’s tristimulus matching functions $\bar{x}(\lambda)$, $\bar{y}(\lambda)$, and $\bar{z}(\lambda)$. During measurement, the filter wheel sequentially rotates the three filters into the optical path, and the sensor captures three images—X-channel, Y-channel, and Z-channel.
The core advantage of this architecture is that each pixel directly measures the light energy of the X, Y, and Z channels, requiring no demosaicing interpolation and avoiding false colors and spatial blurring introduced by interpolation algorithms. More importantly, the spectral matching accuracy of customized interference filters is far higher than that of RGB dye filters in Bayer arrays, thus achieving high-fidelity simulation of human visual response at the hardware level.
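Once the three channel images are captured, the per-pixel conversion to chromaticity coordinates follows directly from the CIE definitions. A minimal NumPy sketch (the array shapes and values are illustrative; real pipelines apply dark-frame, flat-field, and color-matrix corrections first):

```python
import numpy as np

def xyz_to_chromaticity(X, Y, Z, eps=1e-12):
    """Per-pixel CIE 1931 (x, y) and CIE 1976 (u', v') from X/Y/Z channel images."""
    s = np.maximum(X + Y + Z, eps)                # guard against zero-signal pixels
    d = np.maximum(X + 15.0 * Y + 3.0 * Z, eps)
    return X / s, Y / s, 4.0 * X / d, 9.0 * Y / d

# Synthetic check: an equal-energy stimulus (X = Y = Z) should land at
# x = y = 1/3 and u' = 4/19, v' = 9/19
X = Y = Z = np.ones((4, 4))
x, y, u, v = xyz_to_chromaticity(X, Y, Z)
```

Because every pixel carries a directly measured X, Y, Z triplet, this conversion involves no spatial interpolation at all, in contrast to a demosaiced Bayer image.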
Systematic Comparison of the Two Technologies#
Measurement Dimension and Data Density#
A spot luminance meter outputs a set of scalar data (one luminance value and/or a set of chromaticity coordinates) for each measurement. Even with an XY translation stage for raster scanning, its data density is limited by mechanical positioning accuracy and scanning time.
An imaging colorimeter outputs a two-dimensional image for each measurement—more accurately, a set of spatially resolved luminance/chromaticity maps. Each pixel position corresponds to complete photometric and chromatic information for a spatial point on the measured surface. Data density is directly determined by the sensor pixel count, easily reaching millions to tens of millions of measurement points.
Measurement Speed#
The speed bottleneck of spot measurement lies in mechanical movement and point-by-point integration. Even with a high-speed spot colorimeter, completing a uniformity scan containing thousands of measurement points takes several minutes to tens of minutes.
The measurement speed of an imaging colorimeter depends on the sensor’s exposure time and the filter wheel’s switching speed. For a display panel of medium brightness (several hundred cd/m²), the total acquisition time for the three channels is usually within a few seconds. Considering image processing and data calculation time, a complete measurement cycle can usually be controlled within 10 seconds. This allows imaging colorimeters to adapt to production line-level inspection cycles.
Differences in Accuracy Characteristics#
It should be noted that in terms of single-point absolute accuracy, high-end spot spectroradiometers still have an advantage. A spectroradiometer performs full spectral decomposition of incident light through a diffraction grating or interferometer, directly measuring the Spectral Power Distribution (SPD), and then calculates tristimulus values through integration with the CIE standard observer functions. This method fundamentally eliminates spectral mismatch errors (f1’ error) inherent in filter-based instruments, resulting in higher chromaticity accuracy when facing narrowband light sources (such as OLED, LED).
However, the advantage of an imaging colorimeter lies not in the absolute accuracy of a single point, but in Spatial Consistency and Relative Accuracy. In the same image, all pixels share exactly the same optical path, filter characteristics, and calibration parameters, which means that the relative measurement error between pixels is much smaller than the absolute error. For uniformity assessment and defect detection, relative accuracy is often more critical than absolute accuracy.
Summary of Quantitative Comparison#
| Dimension | Spot Luminance Meter / Colorimeter | Imaging Colorimeter |
|---|---|---|
| Sensor Type | Single Photodetector | Area CCD/CMOS (millions to tens of millions of pixels) |
| Data per Measurement | 1 spatial point | Millions of spatial points |
| Spatial Resolution | None (integral measurement) | Pixel-level (down to sub-millimeter resolution) |
| Typical Measurement Speed | 0.5-5 seconds per point | Full-field data acquisition < 10 seconds |
| Chromaticity Accuracy Basis | Spectral decomposition (spectroradiometer) or filters | Custom CIE-matched filters + calibration algorithms |
| Uniformity Assessment Capability | Requires point-by-point scanning, extremely low efficiency | Full-field uniformity map generated in a single shot |
| Defect Detection Capability | Limited by spot size and sampling density | Pixel-level defects can be detected |
| Typical Application Scenarios | Precise lab measurement, light source calibration, small area high-precision measurement | Production line quality control, display panel inspection, uniformity and defect analysis |
Analysis of Typical Application Scenarios#
Scenario 1: Final Inspection on Display Panel Production Lines#

In the Final Inspection station of a display panel production line, imaging colorimeters have become standard equipment. The system is usually installed inside a dark box, and panels are fed into the measurement station via a conveyor belt. The imaging colorimeter completes the acquisition of full-field luminance and chromaticity data within seconds, and software algorithms then perform the following analyses:
- Luminance Uniformity: Calculates the statistical distribution of luminance across the field, evaluating metrics such as the maximum/minimum luminance ratio and standard deviation.
- Chromaticity Uniformity: Evaluates the dispersion of chromaticity coordinates across the field, usually quantified by CIE 1976 u’v’ color difference (Δu’v’).
- Mura Defect Detection: Extracts low-frequency luminance non-uniformity patterns through spatial filtering algorithms to identify “cloud spot” defects perceivable by the human eye.
- Pixel-Level Defect Detection: In high-resolution mode, identifies bright pixels, dead pixels, and line defects.
In this scenario, a spot luminance meter is only suitable as a sampling tool—for example, to re-examine and confirm specific positions using a high-precision spot spectroradiometer after an imaging colorimeter has detected a suspected defective product.
Scenario 2: Optical Consistency Verification for Automotive Smart Cockpits#

Automotive dashboards and center console areas typically integrate multiple display screens, indicator lights, and ambient lighting. The user’s visual experience requires these optical elements to maintain a high degree of consistency in luminance and chromaticity.
An imaging colorimeter can cover the entire cockpit area in one or a few shots, simultaneously acquiring luminance/chromaticity data for all display screens and indicator lights, thereby directly evaluating the degree of visual matching between different elements. If a spot instrument were used, each element would need to be measured one by one, which is not only time-consuming but also difficult to ensure that all measurements are completed under the same environmental conditions.
Scenario 3: LED/Mini-LED Backlight Module Inspection#

LED and Mini-LED backlight modules contain hundreds or thousands of independent LED light sources. Deviations in luminance and color temperature of each LED will affect backlight uniformity, thereby affecting display image quality.
An imaging colorimeter can simultaneously distinguish and measure the luminance and chromaticity coordinates of each individual LED in a single shot, providing calibration data for each LED for Local Dimming Compensation. This task is physically impossible to complete at production line speeds with a spot instrument.
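Assuming a known, regular LED layout, per-LED extraction can be as simple as partitioning the luminance image into a grid of cells, one per LED. This is a simplified sketch with synthetic data; real systems first register the image to the LED layout and typically use blob detection rather than a fixed grid:

```python
import numpy as np

def per_led_luminance(img, rows, cols):
    """Partition a backlight luminance image into a rows x cols grid of cells
    (one LED per cell, assuming a known regular layout) and return each LED's
    peak luminance as a (rows, cols) map."""
    h, w = img.shape
    h, w = h - h % rows, w - w % cols            # crop so cells divide evenly
    cells = img[:h, :w].reshape(rows, h // rows, cols, w // cols)
    return cells.max(axis=(1, 3))

# Synthetic 2 x 3 LED array on a 60 x 90 image; LED (0, 1) is 10% too bright
img = np.zeros((60, 90))
for r in range(2):
    for c in range(3):
        img[r * 30 + 15, c * 30 + 15] = 1000.0
img[15, 45] *= 1.1
led_map = per_led_luminance(img, 2, 3)
deviation = led_map / np.median(led_map)         # per-LED correction factors
```

The resulting deviation map is exactly the input a local-dimming compensation table needs: a per-LED scale factor relative to the array's typical output.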
Scenario 4: Lab-Level Light Source Characterization and Standard Traceability#
In laboratories of national metrology institutes or light source manufacturers, spot spectroradiometers remain irreplaceable for absolute spectroradiometric traceability measurements of standard light sources. Their spectral resolution capability and absolute accuracy are beyond the reach of imaging colorimeters.
In this scenario, the imaging colorimeter usually serves as the object of calibration—using the standard values measured by the spectroradiometer as a reference benchmark to perform absolute calibration and calculate the Color Correction Matrix (CCM) for the imaging colorimeter.
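The CCM fit itself is typically an ordinary least-squares problem: given the imaging colorimeter's raw readings and the spectroradiometer's reference tristimulus values for a set of calibration patches, solve for the 3×3 matrix that maps one onto the other. A sketch with synthetic data (the patch values and matrix are fabricated for illustration):

```python
import numpy as np

def fit_ccm(measured, reference):
    """Least-squares 3x3 color correction matrix M with reference ~ measured @ M.T.
    measured / reference: (N, 3) arrays of XYZ triplets for N calibration patches."""
    M, *_ = np.linalg.lstsq(measured, reference, rcond=None)
    return M.T

# Fabricated example: reference values generated from a known "true" matrix
measured = np.array([[95.0, 100.0, 108.0],
                     [41.0,  21.0,   1.8],
                     [14.6,  23.2,   9.4],
                     [18.0,  12.0,  95.0]])
true_M = np.array([[ 1.02, -0.03, 0.01],
                   [ 0.01,  0.98, 0.00],
                   [-0.02,  0.01, 1.05]])
reference = measured @ true_M.T
M = fit_ccm(measured, reference)
corrected = measured @ M.T        # now matches the spectroradiometer reference
```

In practice, many more patches than three are measured, often separately per display technology, so that the matrix averages out spectral-mismatch error over the gamut actually produced by the panel under test.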
Complementary, Not Substitutive: Building a Complete Optical Metrology System#

It should be emphasized that imaging colorimeters and spot instruments are not simple substitutes for each other, but rather two complementary levels in a modern optical metrology system.
Spectroradiometers provide spectral-level absolute accuracy and standard traceability, serving as the anchor of the metrology chain. Spot colorimeters provide fast, portable single-point measurement in scenarios where spatial resolution is not required. Imaging colorimeters, with their spatial resolution capability and parallel measurement efficiency, fill the huge gap between “measuring a point” and “understanding the entire surface.”
In designing quality control solutions for display products, reasonably configuring these three levels of measurement tools to achieve an optimal balance between accuracy traceability, production line efficiency, and defect coverage is a core task faced by optical metrology engineers.
Conclusion#
From a single photodetector to area sensors containing tens of millions of independent pixels, from one measurement point to millions of spatial samplings, display metrology is undergoing a paradigm shift from “sampling inspection” to “full-field census.” With its irreplaceable spatial resolution and parallel measurement efficiency, the imaging colorimeter has become a core component of the quality control infrastructure of the modern display industry.
Understanding the technical boundaries and applicable ranges of both imaging colorimeters and spot instruments, and reasonably combining and utilizing them in practical engineering, is a prerequisite for ensuring that optical measurement data is both accurate and complete.
FAQ#
Q1: Can imaging colorimeters completely replace spot luminance meters?#
No. They are complementary rather than substitutive. High-end spot spectroradiometers still have advantages in single-point absolute accuracy and spectral resolution, serving as anchors in the calibration chain; the core value of imaging colorimeters lies in spatial resolution and parallel measurement efficiency. In practical engineering, a spectroradiometer is typically used as a reference benchmark to calibrate an imaging colorimeter, which then executes full inspection tasks on the production line.
Q2: Is a higher sensor pixel count in an imaging colorimeter always better?#
Not necessarily. The sensor pixel count needs to match the optical resolution—if the lens MTF is insufficient to support the sensor’s Nyquist frequency, increasing pixels just samples a “blurred” optical image more densely without recovering more spatial detail. Additionally, higher pixel counts mean larger data volumes, longer readout times, and higher costs. When selecting, one should start from the pixel density of the measured object and inspection needs to choose a “sufficient and matching” pixel configuration.
Q3: Why do imaging colorimeters use a filter wheel instead of a Bayer color filter array?#
Because the RGB dye filters in Bayer arrays are designed to generate “visually natural” color images. Their spectral response deviates significantly from the CIE color matching functions, and they suffer from severe spectral overlap and metamerism issues. In contrast, customized tristimulus interference filters can accurately simulate the CIE standard observer functions, achieving high-fidelity matching of human visual response at the hardware level. Furthermore, in the filter wheel architecture, each pixel directly outputs a real measurement value for a single channel without demosaicing interpolation, avoiding false colors and spatial blurring.
This article is part of the Imaging Colorimeter Technology Knowledge Base series.
