
Pixel-Level Inspection: The Role of Ultra-High Resolution Sensors

Illustration of display panel pixel defect detection—Typical appearances of bright and dead pixel defects (Image Source: Tab-TV)

Introduction: When Display Pixels are Finer than Sensor Pixels

In the quality control of display panel manufacturing, defect detection is one of the most critical processes. As display technology evolves from Full HD (1920x1080) to 4K (3840x2160), 8K (7680x4320), and even higher resolutions, the physical size of display pixels continues to shrink. The pixel pitch of a 65-inch 4K panel is approximately 0.37mm; it shrinks to about 0.19mm for an 8K panel of the same size. In the field of VR/AR microdisplays, pixel pitch can be as low as a few microns.

As a key tool for display panel inspection, an imaging colorimeter’s spatial resolution must match the pixel density of the measured panel. When the resolution of the measurement system is insufficient to resolve individual display pixels, defect information will be “averaged” into the signals of adjacent pixels, causing subtle defects to be missed. Furthermore, the spatial frequency relationship between sensor pixels and display pixels can produce Moire Patterns—a severe measurement artifact.

This article will analyze the spatial resolution requirements for high-resolution display inspection, explain the causes and suppression methods for Moire patterns, and discuss the matching between optical resolution and sensor resolution.

I. Spatial Resolution Requirements for 4K/8K Display Inspection

Samsung display panel defect type classification—Including pixel-level defects such as bright pixels, dead pixels, bright lines, and dark lines (Image Source: Tab-TV)

1.1 Understanding Resolution Requirements through Defect Types

Display panel defects can be classified into several levels based on spatial scale:

Panel-Level Defects: Such as large-area luminance or chromaticity non-uniformity (Mura), light leakage, etc. The spatial scale of these defects is in the range of millimeters to centimeters, and the resolution requirements for the measurement system are relatively loose.

Pixel-Level Defects: Such as bright dots, dead dots, bright lines, and dark lines. Detecting these defects requires the measurement system to resolve each individual display pixel independently.

Sub-Pixel Level Defects: Such as a single sub-pixel being stuck or having a color shift. A display pixel usually consists of Red (R), Green (G), Blue (B) (and sometimes more) sub-pixels. Detecting sub-pixel defects requires sufficient resolution in the measurement system to independently distinguish each sub-pixel within the same pixel.

In production line inspection, detection of pixel-level and sub-pixel level defects is a critical quality gate. A single bright pixel defect on a large screen displaying a dark scene can cause significant visual interference, even if its physical size is less than 0.1mm.

1.2 Sampling Theorem Constraints on Sensor Resolution

The resolution requirement for detecting pixel-level defects can be quantitatively analyzed using the Nyquist-Shannon Sampling Theorem.

The theorem states that to completely reconstruct a band-limited signal, the sampling frequency must be at least twice the highest frequency in the signal. Applying this principle to display inspection:

  • The display panel’s pixel array can be viewed as a two-dimensional spatial signal, with its spatial frequency determined by the pixel pitch.
  • The sensor pixel array of an imaging colorimeter constitutes the sampling system for this spatial signal.

To resolve each display pixel, the sampling interval of the imaging system on the display panel must be no more than half of the display pixel pitch—meaning each display pixel requires at least 2x2=4 sensor pixels to cover it (in 2D).

To resolve sub-pixels, the requirements are even stricter. For a typical RGB stripe arrangement, each pixel contains three horizontally arranged sub-pixels. Thus, the sub-pixel pitch in the horizontal direction is 1/3 of the pixel pitch. To resolve sub-pixels, at least 6 sensor pixel sampling points are needed for each display pixel in the horizontal direction.
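As a concrete check of these factors (2 samples per display pixel per axis for pixel-level work; 2 samples per sub-pixel, i.e. 6 per pixel horizontally for an RGB stripe, for sub-pixel work), here is a minimal Python sketch. The function name and the default of 3 sub-pixels per pixel are illustrative:

```python
def required_sensor_pixels(panel_px_h, panel_px_v, level="pixel",
                           subpixels_per_pixel=3):
    """Minimum sensor pixel counts (h, v) per the Nyquist criterion.

    "pixel": at least 2 samples per display pixel in each axis.
    "subpixel": at least 2 samples per sub-pixel; RGB-stripe layouts
    split sub-pixels horizontally only.
    """
    if level == "pixel":
        return 2 * panel_px_h, 2 * panel_px_v
    if level == "subpixel":
        return 2 * subpixels_per_pixel * panel_px_h, 2 * panel_px_v
    raise ValueError(f"unknown level: {level}")

# 4K UHD panel (3840 x 2160)
print(required_sensor_pixels(3840, 2160, "pixel"))     # (7680, 4320)
print(required_sensor_pixels(3840, 2160, "subpixel"))  # (23040, 4320)
```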

1.3 Case Study: Understanding with Figures

Taking a 65-inch 4K UHD display panel as an example:

  • Active area size is approximately 1429mm x 804mm.
  • Horizontal pixel count is 3840, with a horizontal pixel pitch of about 0.372mm.
  • Each pixel contains RGB sub-pixels, with a horizontal sub-pixel pitch of about 0.124mm.

Pixel-Level Inspection: According to the Nyquist criterion (2 sampling points per pixel), at least 3840 x 2 = 7680 sensor pixels are needed horizontally. A 29MP (6576 x 4384 pixels) imaging colorimeter, when capturing the entire panel in a single shot, provides 6576 sampling points horizontally—insufficient for Nyquist sampling of all pixels on a 4K panel. This means under-sampling regions may exist in a single full-panel capture.

Sub-Pixel Level Inspection: At least 3840 x 6 = 23,040 sensor pixels are needed horizontally. This exceeds the resolution of almost all current single sensors. Even with a 61MP (9504 x 6336 pixels) sensor, the horizontal direction still cannot satisfy the sub-pixel level sampling requirement for the entire panel in a single capture.

In practical engineering, for sub-pixel level inspection of 4K+ panels, one of the following strategies (or a combination) is usually adopted:

  • Tiled Capture: Dividing the panel into multiple regions and capturing them one by one, followed by image stitching. This increases measurement time but provides sufficient spatial resolution.
  • Using Higher Resolution Sensors: Such as 61MP or even higher-pixel sensors to reduce the number of tiles or achieve single full-panel inspection for smaller panels.
  • Selecting Appropriate Lens Focal Length: Adjusting focal length and working distance to change the panel area covered in a single shot, achieving a balance between coverage and spatial resolution.
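The tiled-capture trade-off can be sketched numerically. `tiles_needed` is a hypothetical helper, and the 10% stitching overlap is an assumed value:

```python
import math

def tiles_needed(panel_px_h, panel_px_v, sensor_h, sensor_v,
                 samples_per_pixel=2, overlap=0.1):
    """Capture tiles needed so every display pixel receives
    `samples_per_pixel` sensor samples per axis; `overlap` is the
    fraction of each tile reserved for stitching."""
    usable_h = sensor_h * (1 - overlap)
    usable_v = sensor_v * (1 - overlap)
    tiles_h = math.ceil(panel_px_h * samples_per_pixel / usable_h)
    tiles_v = math.ceil(panel_px_v * samples_per_pixel / usable_v)
    return tiles_h * tiles_v

# Pixel-level inspection of a 4K panel with a 29MP (6576 x 4384) sensor
print(tiles_needed(3840, 2160, 6576, 4384))  # 4 (a 2 x 2 grid)
# The same sensor covers an FHD panel in a single shot
print(tiles_needed(1920, 1080, 6576, 4384))  # 1
```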

II. Moire Pattern: A Visual Disaster of Improper Sampling

Canon CMOS sensor for flat panel display inspection—High-resolution sensors are the hardware foundation for pixel-level defect detection (Image Source: Canon CMOS Sensors)

2.1 Physical Causes of Moire Patterns

A Moire pattern is an interference phenomenon that occurs when two periodic patterns with similar spatial frequencies are superimposed. In display inspection, the two periodic patterns involved are:

  • The Display Panel’s Pixel Grid: Consisting of regularly arranged pixel arrays with a fixed spatial frequency.
  • The Imaging Sensor’s Pixel Grid: Consisting of regularly arranged photosensitive pixels on the sensor chip, also having a fixed spatial frequency on the image plane.

When the display pixel array is imaged onto the sensor through the lens, the relationship between the imaged pitch of the display pixels and the sensor pixel pitch determines whether a Moire pattern is generated, as well as its frequency and intensity.

In mathematical terms: If the imaged pitch of display pixels on the sensor plane is $d_{display}$ and the sensor pixel pitch is $d_{sensor}$, the spatial frequency of the Moire pattern is:

$f_{moire} = |1/d_{display} - 1/d_{sensor}|$

When $d_{display}$ and $d_{sensor}$ are close (but not equal), $f_{moire}$ is low, resulting in large-scale, low-frequency light/dark stripes—a typical Moire pattern. When $d_{display}$ exactly equals $d_{sensor}$ (1:1 mapping), the Moire frequency is zero, but this precise match is nearly impossible in practice, and even slight deviations will produce wide low-frequency interference patterns.
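The beat-frequency relationship can be evaluated directly. The pitch values below are illustrative (a 3.76μm sensor pitch, with the display pixel imaged at two different sizes):

```python
def moire_freq_cyc_per_mm(d_display_um, d_sensor_um):
    """Moire beat frequency |1/d_display - 1/d_sensor| in cycles/mm,
    for pitches given in micrometers on the sensor plane."""
    return abs(1000.0 / d_display_um - 1000.0 / d_sensor_um)

# Nearly matched pitches: low-frequency, highly visible stripes
print(moire_freq_cyc_per_mm(4.0, 3.76))   # ~16 cycles/mm
# Oversampled (display pixel imaged across ~4 sensor pixels):
# the beat moves to much higher, far less visible frequencies
print(moire_freq_cyc_per_mm(15.0, 3.76))  # ~199 cycles/mm
```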

2.2 Impact of Moire Patterns on Measurement Accuracy

Moire patterns have a severe negative impact on imaging colorimeter measurements:

Luminance Measurement Distortion: The light/dark stripes of the Moire pattern are superimposed on the true luminance distribution of the panel, causing measurement values in local areas to be higher or lower. During luminance uniformity assessment, Moire patterns might be misjudged as Mura defects inherent to the panel.

Interference with Defect Detection: False light/dark patterns generated by Moire can mask real pixel-level defects or cause false positive defect alarms. Both situations are destructive to production line inspection—the former leading to missed detections (quality risk) and the latter to overkill (yield loss).

Chromaticity Measurement Deviation: Since the RGB color channels are imaged independently and the spatial frequency of sub-pixels differs from that of whole pixels, Moire patterns may manifest differently in each color channel, causing spatially non-uniform deviations in chromaticity measurement.

2.3 Suppression Strategies for Moire Patterns

In the field of display inspection, methods to suppress Moire patterns mainly include:

Increasing Sensor Resolution (Oversampling): When the sensor sampling frequency is much higher than the spatial frequency of display pixels (i.e., significantly exceeding the Nyquist frequency), the Moire frequency is pushed near the sensor’s Nyquist frequency, and its amplitude and visibility are greatly reduced. This is the most fundamental solution and a core value of high-pixel sensors in display inspection.

Optical Low-Pass Filtering: By introducing appropriate optical defocus in the lens or using an optical low-pass filter, spatial low-pass filtering is applied to the display pixel image before it reaches the sensor, reducing the amplitude of high-frequency components. The cost is a reduction in the spatial resolution of the optical system—trading the ability to detect the finest defects for Moire suppression.

Software De-Moireing: After image capture, digital filtering algorithms are used to identify and remove frequency components corresponding to Moire patterns in the frequency domain. The effectiveness depends on how separable the Moire pattern and the real signal are in the frequency domain—if their spectra overlap, some real signal will be lost while removing Moire.

Adjusting Imaging Magnification: By changing the lens focal length or working distance, the imaged pitch of display pixels on the sensor plane is kept away from an integer ratio relationship with the sensor pixel pitch, thereby pushing the Moire frequency to higher values (less perceptible small-scale patterns). This is a practical method but requires optimization based on specific panel size and pixel density.
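A simplified way to screen candidate magnifications is to check how far the imaged-pitch/sensor-pitch ratio sits from the nearest integer. This sketch considers only the fundamental frequencies (real Moire visibility also involves harmonics and the sensor's fill factor), and the magnification values are illustrative:

```python
def integer_ratio_offset(d_display_mm, magnification, d_sensor_um):
    """Distance of (imaged display pitch / sensor pitch) from the
    nearest integer. Offsets near 0 mean wide, low-frequency Moire;
    offsets near 0.5 push the beat toward fine, low-visibility
    patterns near the Nyquist frequency."""
    ratio = d_display_mm * 1000.0 * magnification / d_sensor_um
    return abs(ratio - round(ratio))

# 0.372 mm display pitch imaged onto a 3.76 um sensor pitch
print(integer_ratio_offset(0.372, 0.0101, 3.76))  # ~0.001 (bad: near 1:1)
print(integer_ratio_offset(0.372, 0.0151, 3.76))  # ~0.49 (much safer)
```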

III. Technical Value of High-Pixel Sensors

Different pixel arrangements of display panels—High-resolution sensors need to match the inspection requirements of various pixel densities (Image Source: Radiant Vision Systems)

3.1 Evolution of Sensor Resolution: From FHD to 8K

The sensor resolution of imaging colorimeters has continuously increased over the past two decades, a trend closely related to the evolution of display panel resolutions:

  • Early 2000s: 1-2MP CCD sensors, satisfying panel-level inspection needs for CRT and early LCD panels (VGA to XGA resolution).
  • FHD Era: 5-12MP sensors gradually became mainstream, enabling pixel-level inspection of FHD panels.
  • 4K Era: 29MP sensors (e.g., 6576x4384) became the standard for pixel-level inspection of 4K panels. For FHD panels, this resolution can achieve sub-pixel level inspection.
  • 8K and High-Density Era: 61MP (e.g., 9504x6336) or even higher resolution sensors are being introduced to handle inspection needs for 8K panels and high-PPI (Pixels Per Inch) displays.

3.2 Benefits of High-Pixel Sensors

Higher Defect Detection Rate: More sensor pixels mean denser sampling of display pixels, allowing capture of finer spatial defect information. At the sub-pixel level, high-resolution sensors can distinguish anomalies in individual R, G, or B sub-pixels rather than just averaging them into whole-pixel anomalies.

More Effective Moire Suppression: As previously mentioned, oversampling is the most fundamental means of suppressing Moire. The higher the sensor resolution and oversampling factor, the more controllable the impact of Moire.

Larger Panel Coverage: While maintaining the same sampling density, higher resolution sensors can cover larger panel areas in a single shot, reducing the number of tiled captures and shortening total measurement time.

More Flexible Inspection Strategies: High-pixel sensors offer more flexibility in inspection strategies. For example, one can choose full-panel capture for rapid panel-level inspection (such as uniformity assessment) and also crop Regions of Interest (ROI) on the same image for pixel-level analysis—without re-adjusting the optical system.

3.3 The Cost of High-Pixel Sensors

High-resolution sensors are not without their costs:

Conflict between Pixel Size and Sensitivity: For a fixed sensor chip area, increasing the number of pixels means shrinking the area of each individual pixel. Smaller pixel areas correspond to smaller Full Well Capacities, leading to a decrease in single-exposure dynamic range. Also, smaller pixels collect fewer photons, resulting in weaker signals and lower SNR under the same exposure conditions.
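The SNR penalty of smaller pixels can be estimated with a shot-noise-limited model (read noise and dark current are ignored; the photon flux and quantum efficiency below are illustrative, while the pixel pitches correspond to the 29MP- and 61MP-class sensors discussed here):

```python
import math

def shot_noise_snr(flux_photons_per_um2, pixel_pitch_um, qe=0.6):
    """Shot-noise-limited SNR of a square pixel: signal = collected
    electrons, noise = sqrt(signal), so SNR = sqrt(electrons)."""
    electrons = flux_photons_per_um2 * pixel_pitch_um ** 2 * qe
    return math.sqrt(electrons)

# Same scene and exposure, different pixel pitches
snr_29mp = shot_noise_snr(1000.0, 3.76)
snr_61mp = shot_noise_snr(1000.0, 2.74)
print(snr_29mp / snr_61mp)  # ~1.37: SNR scales linearly with pitch
```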

Data Volume and Processing Speed: Image data from 61MP sensors is more than double that of 29MP. In production environments, image transmission, storage, and processing speeds directly affect inspection Takt Time. Higher data volumes require faster interfaces (e.g., Camera Link HS, CoaXPress) and stronger computing platforms.

Cost: Costs for high-resolution scientific-grade sensors (especially cooled CCD/CMOS) rise with pixel count, and resolution requirements for lenses increase accordingly.

IV. Matching Optical Resolution with Sensor Resolution

Keyence ultra-high-resolution industrial camera for high-precision vision inspection (Image Source: Keyence)

4.1 System Resolution: The “Cask Effect”

The spatial resolution of an imaging colorimeter is determined by both the optical system and the sensor, and is limited by the one with the lower resolution.

Sensor Resolution: Determined by pixel pitch. The highest spatial frequency a sensor can resolve (its Nyquist frequency) is 1 / (2 x pixel pitch).

Optical Resolution: Determined by the lens’s Point Spread Function (PSF) or Modulation Transfer Function (MTF). Optical resolution is affected by the diffraction limit, aberrations, focus accuracy, and other factors.

If the lens’s optical resolution (on the image plane) is lower than the sensor’s Nyquist frequency, increasing the number of sensor pixels will not bring a substantial improvement in spatial resolution—extra pixels are just sampling a “blurred” optical image more densely and cannot recover spatial details that the optical system failed to transmit.

4.2 Matching Principles

The ideal state of matching is for the lens MTF to maintain sufficient modulation depth at the sensor’s Nyquist frequency. Specifically:

  • Using a 29MP sensor (pixel pitch about 3.76μm), the lens resolution on the sensor plane needs to reach at least 133 lp/mm (line pairs per millimeter) to fully exploit the sensor’s resolution potential.
  • Upgrading to a 61MP sensor (pixel pitch about 2.74μm), the lens resolution requirement increases to approximately 182 lp/mm.
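Both lens figures follow directly from the Nyquist relationship f = 1 / (2 x pixel pitch); a one-line helper (name illustrative):

```python
def nyquist_lp_per_mm(pixel_pitch_um):
    """Sensor Nyquist frequency in line pairs per mm: 1/(2 * pitch)."""
    return 1000.0 / (2.0 * pixel_pitch_um)

print(round(nyquist_lp_per_mm(3.76)))  # 133 lp/mm (29MP-class pitch)
print(round(nyquist_lp_per_mm(2.74)))  # 182 lp/mm (61MP-class pitch)
```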

This explains why upgrading only the sensor without concurrently upgrading the lens often fails to yield the expected resolution boost. High-end imaging colorimeter systems typically use lenses specifically designed or selected to match sensor specifications.

4.3 Impact of Working Distance and FOV

In actual inspection scenarios, the impact of Working Distance and Field of View (FOV) must also be considered.

For large-size panels (e.g., 65-inch or larger), covering the whole panel in a single shot requires a large FOV or long working distance. This usually involves using short-focal-length lenses or increasing the distance between the camera and the panel. In both cases, the imaged size of display pixels on the sensor plane shrinks, increasing the demand for sensor resolution.

Conversely, for small-size high-PPI panels (e.g., smartphone screens or VR microdisplays), due to the small panel size, closer working distances and larger imaging magnifications can be used. At this point, optical resolution (diffraction limit) and Depth of Field (DOF) might become more significant limiting factors. For extremely small objects like microdisplays, specialized microscope lenses may be required to achieve the necessary spatial resolution.
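The effect of working distance on the imaged display-pixel size can be approximated with the thin-lens magnification m = f / (d - f); the 50 mm focal length and the distances below are illustrative:

```python
def imaged_pixel_size_um(display_pitch_mm, focal_mm, object_dist_mm):
    """Imaged size of one display pixel on the sensor plane, using
    the thin-lens magnification m = f / (d - f)."""
    m = focal_mm / (object_dist_mm - focal_mm)
    return display_pitch_mm * m * 1000.0

# 0.372 mm pitch, 50 mm lens: moving the camera back shrinks the image,
# raising the demand on sensor resolution
print(imaged_pixel_size_um(0.372, 50, 1500))  # ~12.8 um
print(imaged_pixel_size_um(0.372, 50, 3000))  # ~6.3 um
```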

V. Resolution Selection Strategy in Practical Applications

Sony IMX811 high-resolution image sensor (247 megapixels)—Targeted for high-end display inspection applications (Image Source: FRAMOS / Sony)

5.1 Selection Framework Based on Inspection Needs

When selecting the sensor resolution for an imaging colorimeter, start from the inspection needs and follow this framework:

Step 1: Define Defect Detection Requirements

  • Panel-level inspection only (uniformity, large-area Mura): Resolution requirements are relatively loose.
  • Pixel-level inspection needed (bright/dark dots, bright/dark lines): Each display pixel requires at least 2x2 sensor pixels for sampling.
  • Sub-pixel level inspection needed (individual sub-pixel defects): Each sub-pixel requires at least 2x2 sensor pixels for sampling.

Step 2: Calculate Required Sensor Resolution

Based on the pixel count of the measured panel and the defect detection requirements, calculate the minimum sensor resolution needed.

Step 3: Consider Coverage Strategy

Evaluate whether a single capture can cover the entire panel or whether tiled capture is required. Tiling increases measurement time but relaxes sensor resolution requirements.

Step 4: Verify Optical System Matching

Confirm that the optical resolution of the selected lens can support the resolution requirement of the sensor.
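The framework can be combined into a small selection sketch; the sensor list and function name are illustrative, and only steps 1-3 are automated (step 4, optical matching, still requires lens MTF data):

```python
import math

def recommend_sensor(panel_px_h, panel_px_v, level, sensors):
    """Return (sensor name, tile count): the smallest sensor meeting
    the Nyquist requirement in one shot, else the largest sensor with
    the number of tiles it needs. Sub-pixel level assumes an RGB
    stripe (6 horizontal samples per display pixel)."""
    factor_h = 6 if level == "subpixel" else 2
    need_h, need_v = panel_px_h * factor_h, panel_px_v * 2
    for name, h, v in sorted(sensors, key=lambda s: s[1] * s[2]):
        if h >= need_h and v >= need_v:
            return name, 1
    name, h, v = max(sensors, key=lambda s: s[1] * s[2])
    return name, math.ceil(need_h / h) * math.ceil(need_v / v)

sensors = [("29MP", 6576, 4384), ("61MP", 9504, 6336)]
print(recommend_sensor(3840, 2160, "pixel", sensors))     # ('61MP', 1)
print(recommend_sensor(3840, 2160, "subpixel", sensors))  # ('61MP', 3)
```

Note that this sketch ignores tile overlap for stitching, which would raise the tile counts slightly in practice.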

5.2 Typical Configuration Reference

The following table describes sensor configuration ideas for typical application scenarios (for illustration only):

| Application Scenario | Panel Measured | Recommended Sensor Resolution Level | Inspection Coverage Method |
| --- | --- | --- | --- |
| Large LCD/OLED, Panel-Level | 55"-75" 4K | 12-29MP | Single full-panel capture |
| Large LCD/OLED, Pixel-Level | 55"-75" 4K | 29MP and above | Single capture or 2-4 tiles |
| Small/Medium, Pixel-Level | 6"-15" FHD | 12-29MP | Single full-panel capture |
| 4K Panel, Sub-Pixel Level | Any size 4K | 61MP and above | Multi-tile capture |
| VR/AR Microdisplay Inspection | < 2" high PPI | 29MP + microscope lens | Multi-tile capture |

Conclusion

Bad pixel detection test screen—Used to detect bright and dark pixel defects on a display screen (Image Source: Screen Burn In)

In the era of 4K/8K display technology, the spatial resolution of imaging colorimeters has become the core parameter determining the upper limit of inspection capability. The Nyquist sampling theorem provides a quantitative framework for evaluating resolution needs, and the Moire phenomenon reminds us that oversampling is not just “better” but “necessary” in many display inspection scenarios.

However, resolution increases should not be viewed in isolation. Sensor resolution, optical resolution, pixel sensitivity, and dynamic range—these parameters are mutually constrained. Blindly pursuing the highest pixel count while ignoring optical matching or sensitivity needs can lead to an imbalance in overall system performance.

A pragmatic approach is to start from specific inspection tasks, define defect types and detection requirements, calculate the required spatial resolution, verify the optical system’s support, and then find the balance between resolution, sensitivity, speed, and cost that fits your production line’s needs.

FAQ

Q1: To detect pixel-level defects on a 4K panel, what is the minimum sensor pixel count needed?

According to the Nyquist sampling theorem, each display pixel needs at least 2×2=4 sensor pixels covering it. A 4K panel has 3840 pixels horizontally and 2160 vertically, so at least 7680 horizontal and 4320 vertical sensor pixels are needed, totaling about 33MP. A 29MP sensor (6576×4384) falls slightly short horizontally and might need tiled capture; a 61MP sensor (9504×6336) can satisfy full-panel pixel-level inspection. If sub-pixel level inspection is required, the horizontal pixel requirement triples.

Q2: What are the specific harms of Moire patterns to measurement?

Moire patterns superimpose false light/dark stripe patterns onto the true luminance distribution, leading to three problems: first, luminance uniformity measurement distortion—Moire might be misjudged as Mura defects; second, interference with defect detection—false patterns can mask real defects or cause false alarms; third, chromaticity deviation—the manifestation of Moire differs across RGB channels, causing spatially non-uniform chromaticity measurement deviations.

Q3: Can upgrading only the sensor pixel count yield higher resolution?

Not necessarily. The spatial resolution of an imaging system is limited by the lower of the optical system and the sensor (the “cask effect”). If the lens MTF has already severely attenuated at the sensor’s Nyquist frequency, increasing sensor pixels just samples a “blurred” image more densely without recovering spatial details not transmitted by the optics. For example, when upgrading from 29MP to 61MP, lens resolution requirements rise from about 133 lp/mm to 182 lp/mm. Therefore, sensor upgrades must be synchronized with lens verification or upgrades.


This article is part of the Imaging Colorimeter Technology Knowledge Base series.