
When using an imaging colorimeter to measure pixelated displays (LCD, OLED, MiniLED/MicroLED), Moiré patterns are an almost unavoidable interference factor. If left unhandled, they superimpose onto the true luminance and chromaticity distributions, severely distorting the results of uniformity measurement, Mura detection, and other analyses. Starting from the physical causes of Moiré patterns, this article details the principles, operating procedures, and applicable scenarios of two mainstream elimination methods: Optical Defocusing and Software Frequency Domain Filtering.
I. Physical Causes of Moiré Patterns: Spatial Frequency Aliasing#
1.1 Superposition of Two Periodic Structures#
A Moiré pattern is not a defect of the measured display itself, but an interference phenomenon generated after the superposition of two periodic patterns with similar spatial frequencies. In the scenario of measuring a display with an imaging colorimeter, these two periodic structures are:
- The Display Screen’s Pixel Array: The images on LCD, OLED, and other displays consist of regularly arranged pixels (or subpixels), whose pixel pitch defines an inherent spatial frequency.
- The Imaging Colorimeter’s Sensor Pixel Array: CCD/CMOS sensors likewise consist of regularly arranged photosensitive pixels, whose pixel pitch defines another spatial frequency.
When the display’s pixel array is imaged onto the sensor through the lens, a new low-frequency interference pattern—the Moiré pattern—is generated if the two spatial frequencies are close but not perfectly matched. Its frequency equals the difference between the two original frequencies.
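A quick numerical illustration of this difference-frequency relation, using hypothetical pitch values (not taken from any particular instrument): two fine gratings with nearly equal spatial frequencies produce a beat pattern far coarser than either grating.

```python
# Moiré (beat) frequency as the difference of two close spatial frequencies.
# Hypothetical values: a display grid with 0.250 mm effective pitch sampled
# by a sensor grid with 0.245 mm effective pitch (after lens magnification).
display_freq = 1 / 0.250   # 4.0 cycles/mm
sensor_freq = 1 / 0.245    # about 4.08 cycles/mm

moire_freq = abs(sensor_freq - display_freq)   # low beat frequency
moire_period = 1 / moire_freq                  # period of the visible stripes

print(round(moire_freq, 3))    # 0.082 cycles/mm
print(round(moire_period, 2))  # ~12.25 mm: coarse stripes from two fine grids
```

Note how two sub-millimeter pitches combine into stripes more than a centimeter wide, which is why Moiré is so visually prominent even when the pixel grids themselves are unresolvable.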
1.2 Perspective from the Sampling Theorem#

From a signal processing perspective, Moiré patterns are essentially a spatial-domain aliasing phenomenon. According to the Nyquist Sampling Theorem, to reconstruct a signal without distortion, the sampling frequency must be at least twice the signal’s highest frequency.
When the sensor pixel density of an imaging colorimeter is insufficient to fully sample the display's pixel structure (a common rule of thumb requires each display pixel to cover at least 3x3 sensor pixels), high-frequency information in the pixel structure is “folded” into the low-frequency range, appearing as visible Moiré stripes.
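The sampling ratio behind this rule of thumb is easy to estimate at the design stage. The helper below is an illustrative sketch (not a vendor API), with all numbers hypothetical:

```python
def sensor_pixels_per_display_pixel(display_pitch_mm, sensor_pitch_um, magnification):
    """Estimate how many sensor pixels (per axis) one display pixel projects onto.

    The display pixel's image on the sensor has size pitch * magnification;
    dividing by the sensor pixel pitch gives the per-axis sampling ratio.
    """
    image_size_um = display_pitch_mm * 1000 * magnification
    return image_size_um / sensor_pitch_um

# Hypothetical setup: 0.3 mm display pitch, 3.45 um sensor pixels, 0.05x optical magnification
ratio = sensor_pixels_per_display_pixel(0.3, 3.45, 0.05)
print(round(ratio, 2))  # ~4.35 sensor pixels per display pixel: above the 3x3 guideline
```

Ratios below about 3 signal a high aliasing (Moiré) risk for that combination of lens, working distance, and panel.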
1.3 Why Moiré Patterns are Almost Inevitable when Measuring Pixelated Displays#
In actual measurement, the following factors make Moiré patterns a universal problem:
- Limited Sensor Resolution: Even with high-resolution sensors, field-of-view requirements when measuring large-size panels mean each display pixel corresponds to only a few sensor pixels.
- Precise Integer-Multiple Relationships are Hard to Achieve: Theoretically, Moiré can be minimized if the sensor sampling rate is exactly an integer multiple of the display pixel frequency. In practice, this ideal condition is hard to meet due to continuously adjustable lens magnification, fixed sensor pixel pitch, and display pixel pitch varying by product.
- Lens Distortion and Tilt: Geometric distortion of lenses and slight tilt angles between the camera and screen cause local sampling rate variations, making Moiré patterns exhibit different intensities and orientations at various image positions.
II. Optical Defocusing#

2.1 Principle#
Optical Defocusing is the most direct and classical method for eliminating Moiré patterns. Its principle: slightly shifting the lens focus widens the imaging system's Point Spread Function (PSF), which acts as a low-pass filter at the optical level, so the high-frequency content of the display pixel structure is blurred away before the light ever reaches the sensor.
Once defocused to a certain extent, individual display pixel boundaries can no longer be resolved on the sensor, and Moiré patterns naturally disappear. At this point, the sensor captures the “macroscopic” distribution of luminance/chromaticity, which is exactly what is needed for uniformity assessment and Mura detection.
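This optical low-pass effect can be sketched numerically, assuming numpy is available. The 1-D "display" signal below combines an illustrative fine pixel grid with a slow macroscopic luminance variation; convolving it with a Gaussian PSF wider than the pixel period (a common model of mild defocus) removes the grid while preserving the macroscopic profile:

```python
import numpy as np

# Illustrative 1-D model: fine pixel grid (period 8 samples) modulated by a
# slow uniformity variation (period 1024 samples). All numbers are made up.
x = np.arange(2048)
pixel_grid = 0.5 + 0.5 * np.cos(2 * np.pi * x / 8)      # pixel structure
uniformity = 1.0 + 0.1 * np.sin(2 * np.pi * x / 1024)   # macroscopic variation
signal = uniformity * pixel_grid

# Gaussian PSF wider than the pixel period -> the grid is averaged away
sigma = 6.0
k = np.arange(-24, 25)
psf = np.exp(-k**2 / (2 * sigma**2))
psf /= psf.sum()
blurred = np.convolve(signal, psf, mode="same")

def grid_contrast(s):
    """Michelson contrast over a central region (avoids convolution edges)."""
    c = s[256:-256]
    return (c.max() - c.min()) / (c.max() + c.min())

print(grid_contrast(signal) > 0.5)    # True: sharp image shows strong pixel modulation
print(grid_contrast(blurred) < 0.15)  # True: after "defocus", only the slow variation remains
```

The residual contrast in the blurred signal comes entirely from the slow uniformity term, which is exactly the quantity uniformity measurement is after.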
2.2 Operation Method#
Step 1: Initial Focus
Aim the imaging colorimeter at the measured display and first adjust the lens to the sharpest focus. At this point, pixel structures and Moiré patterns are typically clearly visible.
Step 2: Gradual Defocusing
While maintaining the relative positions of the camera and display, slowly rotate the lens focus ring to gradually defocus the image. Observe the changes in Moiré patterns in the image.
Step 3: Determine Optimal Defocusing Amount
Continue defocusing until the Moiré pattern just disappears. Take care not to over-defocus, as excessive defocusing leads to:
- Luminance information in edge regions “leaking” out of the frame, affecting measurement accuracy at edges.
- Excessive reduction in overall spatial resolution, potentially masking real local defects (such as small-area Mura).
An empirical criterion is: the defocused image should not show pixel structures but should still allow recognition of macroscopic display features (such as test pattern boundary lines).
Step 4: Lock Focus
Once the optimal defocusing position is determined, lock the lens focus ring. In production line applications, once adjusted for a specific DUT and measurement distance, re-adjustment is usually unnecessary.
2.3 Pros and Cons#
Pros:
- Simple and intuitive principle, easy to operate.
- Eliminates Moiré at the optical level without introducing computational latency from software processing.
- No need for additional software modules, suitable for resource-constrained inline inspection systems.
- Simultaneously effective against Moiré patterns in all orientations.
Cons:
- Defocusing reduces overall spatial resolution. If pixel-level analysis (e.g., dead pixel detection) is also needed, defocusing conflicts with it.
- Determining the defocusing amount depends on operator experience and subjective judgment, leading to inconsistent results.
- For edge regions, defocusing might cause intense light (or dark areas) from outside the display border to “leak” in, affecting edge measurement.
- Once focus is locked, it cannot be flexibly adjusted for Moiré patterns with different spatial frequency characteristics.
III. Software Frequency Domain Filtering#

3.1 Principle#
Software Frequency Domain Filtering attacks the Moiré problem from the opposite direction: first capture a sharp image (containing Moiré) at normal focus, then identify and remove the frequency components corresponding to Moiré through frequency-domain analysis in software.
The core tool is the Fast Fourier Transform (FFT). FFT transforms an image from the spatial domain to the frequency domain—where each periodic structure in the image manifests as a peak at a specific position. Moiré patterns, as regular periodic patterns, appear in the spectrum as distinct frequency peaks (or clusters) related to the display pixel structure.
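The "periodic pattern becomes a spectral peak" behavior is easy to verify on synthetic data, assuming numpy. A pure stripe pattern produces a single dominant peak at exactly the stripe frequency:

```python
import numpy as np

# A stripe pattern with exactly 10 cycles across a 256-sample line.
n = 256
x = np.arange(n)
stripes = np.cos(2 * np.pi * 10 * x / n)

spectrum = np.abs(np.fft.fft(stripes))
# Search the positive-frequency half, skipping the DC bin at index 0.
peak_bin = int(np.argmax(spectrum[1:n // 2])) + 1
print(peak_bin)  # 10: the spectral peak sits at the stripe frequency
```

In a real captured image the same logic applies in two dimensions: each Moiré component contributes a localized peak (plus its conjugate mirror) whose position encodes the stripe frequency and orientation.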
3.2 Processing Workflow#
Step 1: Obtain a Sharp Image
Capture an image of the display under test at normal focus. This image contains both the true luminance/chromaticity distribution and Moiré patterns.
Step 2: Execute FFT
Perform a two-dimensional FFT on the captured image to transform it into the frequency domain. The center of the resulting spectrum represents the DC component (the average image luminance), and positions farther from the center represent progressively higher spatial frequencies.
Step 3: Identify Moiré Frequency Components
In the spectrum, Moiré patterns appear as bright spots or rings at specific positions. These frequency components are typically located near the spatial frequency corresponding to the display’s pixel pitch. For Moiré patterns in different orientations, spectrum peaks appear at different angular directions.
Step 4: Design and Apply a Low-Pass/Notch Filter
Design a suitable filter based on the identified Moiré frequency components:
- Low-Pass Filter: Sets a cutoff frequency, retaining all components below this frequency (representing macroscopic luminance distribution) and removing all components above it (including Moiré and pixel structures). The advantage is simplicity; the disadvantage is that useful high-frequency details might also be removed.
- Notch Filter: Accurately targets the positions of Moiré frequency peaks, removing only those specific frequency components while retaining all others. The advantage is minimal loss of useful information; the disadvantage is the need to precisely locate Moiré frequencies, and it is less effective when the Moiré spectrum is dispersed.
Common forms of low-pass filters include:
- Ideal Low-Pass Filter: Passes everything below the cutoff completely and blocks everything above it. Conceptually simple, but the abrupt cutoff may introduce ringing artifacts (Gibbs phenomenon).
- Gaussian Low-Pass Filter: Smooth transition, introduces no ringing, and is a commonly used choice in practice.
- Butterworth Low-Pass Filter: Adjustable order controls the steepness of the transition band, balancing cutoff characteristics and smoothness.
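The transfer functions of these filter types can be written down compactly on a centered frequency grid, assuming numpy. The cutoff, Butterworth order, and notch position below are illustrative placeholders, not values prescribed by any instrument:

```python
import numpy as np

# Centered frequency grid for a 256x256 spectrum (DC at the middle).
n = 256
u = np.fft.fftshift(np.fft.fftfreq(n))          # -0.5 .. just under 0.5 cycles/pixel
U, V = np.meshgrid(u, u)
D = np.sqrt(U**2 + V**2)                        # radial distance from DC
d0 = 0.1                                        # low-pass cutoff (illustrative)

H_ideal = (D <= d0).astype(float)               # brick wall: prone to ringing (Gibbs)
H_gauss = np.exp(-(D**2) / (2 * d0**2))         # smooth roll-off, no ringing
order = 2
H_butter = 1.0 / (1.0 + (D / d0)**(2 * order))  # transition steepness set by the order

# Gaussian notch pair targeting a hypothetical Moire peak at (u0, v0);
# the conjugate peak at (-u0, -v0) must be notched as well.
u0, v0, sigma = 0.15, 0.10, 0.02
Dp = np.sqrt((U - u0)**2 + (V - v0)**2)
Dm = np.sqrt((U + u0)**2 + (V + v0)**2)
H_notch = (1 - np.exp(-Dp**2 / (2 * sigma**2))) * (1 - np.exp(-Dm**2 / (2 * sigma**2)))

center = n // 2
print(H_gauss[center, center])   # 1.0 at DC: the mean luminance passes untouched
print(H_notch[center, center])   # ~1.0 at DC: the notch leaves low frequencies alone
```

Each array `H_*` multiplies the shifted spectrum elementwise; note that every filter equals 1 at DC, so the average luminance (the quantity colorimetric analysis ultimately depends on) is never altered.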
Step 5: Execute Inverse FFT
Transform the filtered frequency-domain data back to the spatial domain using an Inverse FFT (IFFT) to obtain an image with Moiré patterns removed.
Step 6: Perform Subsequent Analysis
Execute subsequent processing steps like uniformity analysis, Mura detection, and colorimetric evaluation on the Moiré-free image.
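Steps 1 through 6 can be exercised end-to-end on synthetic data, assuming numpy. The "captured" image below is a slow luminance gradient (the true content) plus a fine oblique stripe standing in for Moiré; a Gaussian low-pass removes the stripe while recovering the gradient almost exactly:

```python
import numpy as np

# Step 1 (simulated capture): true macroscopic distribution + Moire stripe.
n = 256
y, x = np.mgrid[0:n, 0:n]
true_image = 100 + 20 * np.sin(2 * np.pi * x / n)       # slow gradient
moire = 10 * np.cos(2 * np.pi * (38 * x + 26 * y) / n)  # fine oblique stripe
captured = true_image + moire

F = np.fft.fftshift(np.fft.fft2(captured))              # Step 2: 2-D FFT, DC centered

# Steps 3-4: the stripe sits far from DC, so a Gaussian low-pass suppresses it.
u = np.fft.fftshift(np.fft.fftfreq(n))
U, V = np.meshgrid(u, u)
D = np.sqrt(U**2 + V**2)
H = np.exp(-(D**2) / (2 * 0.05**2))                     # cutoff is illustrative

# Step 5: inverse FFT back to the spatial domain.
filtered = np.real(np.fft.ifft2(np.fft.ifftshift(F * H)))

# Step 6: compare residuals against the known true image.
rms_before = np.sqrt(np.mean((captured - true_image) ** 2))
rms_after = np.sqrt(np.mean((filtered - true_image) ** 2))
print(rms_before > 5)   # True: the stripe contributes ~7 units RMS
print(rms_after < 0.5)  # True: after filtering, the macroscopic image is recovered
```

On real measurement data the true image is of course unknown; the same pipeline applies, but filter placement is guided by inspecting the spectrum (Step 3) rather than by a ground-truth comparison.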
3.3 Pros and Cons#
Pros:
- Does not affect optical system settings; images are captured at the best focus, preserving maximum spatial resolution.
- Filter parameters can be flexibly adjusted and optimized for Moiré patterns with different frequency characteristics.
- Real high-frequency details can be preserved while removing Moiré (when using a notch filter).
- The processing is repeatable, with parameters precisely recorded, independent of operator subjective judgment.
Cons:
- Increases computational load and software processing time, affecting the Takt Time of inline inspection.
- Selection of filter parameters (cutoff frequency, filter type) requires signal processing knowledge.
- Low-pass filtering might attenuate real high-frequency spatial information (such as sharp Mura edges) while removing Moiré.
- Global filters may perform poorly for complex, non-uniform Moiré patterns (e.g., variable-frequency Moiré due to lens distortion).
IV. Comparison of Applicable Scenarios for the Two Methods#
| Dimension | Optical Defocusing | Software Frequency Domain Filtering |
|---|---|---|
| Processing Stage | Optical acquisition (Hardware) | Post-processing (Software) |
| Spatial Resolution Impact | Overall reduction | Controllable impact range |
| Takt Time Impact | No extra time overhead | Increases image processing time |
| Parameter Adjustment Flexibility | Low (continuous but imprecise ring) | High (precise digital settings) |
| Operational Consistency | Depends on operator experience | Parameterized and repeatable |
| Compatibility with Pixel-level Analysis | Incompatible (pixel structure blurred) | Compatible (can retain original sharp image) |
| Suitability for Inline Inspection | Suitable (no computational latency) | Need to evaluate processing time |
| Suitability for Offline Analysis | Suitable (simple and fast) | Suitable (can optimize via multiple adjustments) |
| Typical Application | Uniformity, Mura detection | Scenarios needing both macro and micro analysis |
V. Practical Recommendations#

5.1 Prefer Optical Defocusing when:#
- It is an inline inspection system with tight Takt Time that cannot afford extra software processing time.
- Only macroscopic optical parameter evaluation (uniformity, chromaticity consistency) is needed, with no requirement for pixel-level analysis.
- The system integration solution aims for simplicity and minimal software complexity.
5.2 Prefer Software Filtering when:#
- It is offline R&D measurement requiring multi-scale analysis of the same set of images.
- Both macroscopic uniformity and microscopic defects (e.g., dead pixels) need evaluation in the same measurement.
- Moiré exhibits complex spatial variation that simple defocusing cannot fully eliminate.
- Precise control and recording of filtering parameters are needed to meet documented quality management requirements.
5.3 Combined Use of Both Methods#
In some applications, both methods can be used together for optimal results:
- Perform moderate optical defocusing to first weaken the strongest Moiré components.
- Use frequency-domain filtering in software to clear residual weak Moiré.
This combination strategy can achieve more thorough Moiré suppression without excessively sacrificing spatial resolution.
5.4 Preventive Measures#
Besides post-capture elimination, these measures help mitigate Moiré at the source:
- Optimize Optical Magnification: When conditions permit, choose measurement distances and lens combinations that allow display pixels to correspond to more sensor pixels.
- Micro-adjust Camera Angle: Tilting the camera slightly (usually 1-3 degrees) relative to the display can change the Moiré period and intensity, sometimes significantly weakening its visibility.
- Select Appropriate Sensor: During the selection phase, evaluate Moiré risk for different sensor resolutions in combination with the target DUT’s pixel pitch.
VI. Summary#
Moiré patterns are an unavoidable physical phenomenon when using imaging colorimeters to measure pixelated displays. Optical defocusing eliminates Moiré at the source via optical means; it is simple but sacrifices spatial resolution. Software frequency domain filtering preserves the original resolution but requires additional computational resources and signal processing knowledge. Engineers should choose the appropriate method or combination strategy based on specific application needs—inline vs. offline, requirement for pixel-level analysis, and Takt Time budget.
FAQ#
Q1: Are Moiré patterns a defect of the display itself?#
No, Moiré patterns are not a display defect. They are an interference phenomenon generated when two periodic structures with similar spatial frequencies—the display’s pixel array and the imaging colorimeter’s sensor pixel array—overlap. When these frequencies are close but not perfectly matched, a low-frequency interference pattern (Moiré) appears. From a signal processing perspective, this is essentially spatial-domain aliasing related to the Nyquist Sampling Theorem.
Q2: How do I choose between optical defocusing and software frequency filtering?#
The choice depends on your application. Optical defocusing is simple, introduces no computational delay, and suits inline inspection or scenarios requiring only macroscopic optical evaluation. Software frequency filtering preserves original resolution with flexible parameters, ideal for offline R&D and scenarios needing both macro and micro analysis. In some cases, combining both methods works best—moderate defocusing first weakens the strongest Moiré, then software filtering removes residual patterns.
Q3: How do I determine the optimal defocusing amount?#
The optimal defocusing amount is reached when pixel structures are no longer visible in the image but macroscopic display features (such as test pattern boundaries) remain recognizable. Over-defocusing causes edge luminance to spread outside the frame, affecting edge measurement accuracy, and excessively reduces spatial resolution, potentially masking real defects like small-area Mura. Once the optimal position is found, lock the lens focus ring.
This article is part of the Imaging Colorimeter Technology Knowledge Base series.
