
Near-Eye Display (NED) Testing: Optical Design for AR/VR Quality Assurance

Near-eye display measurement system performing optical quality testing on VR/AR headsets (Image Source: AZoOptics)

Introduction: Why Near-Eye Display Measurement is So Unique

Measurement of Near-Eye Displays (NEDs) in VR (Virtual Reality) and AR (Augmented Reality) devices differs fundamentally from that of traditional flat-panel displays. When measuring a TV or smartphone screen, the imaging device can be placed dozens of centimeters away to capture the screen surface directly—because the screen is a physically existing light-emitting plane. However, an NED is completely different: the user’s eye is only a few dozen millimeters away from the last optical surface of the NED optical assembly. The displayed image is a virtual image generated by a combination of a microdisplay and an optical system inside the NED, and this virtual image may be projected at distances ranging from dozens of centimeters to infinity.

To measure the optical performance of such a system, the measurement equipment must “pretend to be a human eye”—placing its own optical entrance at the position where the human eye’s pupil should be, capturing images with the human eye’s perspective and aperture size. This requirement has catalyzed the design of specialized lenses for NED testing, including Front-Stop Lenses and Conoscopic Lenses. Their common feature is placing the entrance pupil at the front of the lens mechanical structure to simulate the optical entrance of the human eye.

Optical Particularities of VR/AR Near-Eye Displays

Near-eye display optical measurement system architecture—Showing the optical path coupling between an imaging colorimeter and an NED (Image Source: AZoOptics)

Light Path Structure

The light path of an NED system is entirely different from that of a flat-panel display. In a typical VR headset, an image generated by a microdisplay (such as a Micro OLED, LCoS, or LCD panel) passes through a set of near-eye optical lenses to form a magnified virtual image in front of the user’s eye. What the user sees is not the microdisplay itself, but a magnified virtual frame.

This optical architecture leads to several measurement particularities:

Extremely Short Working Distance. The distance between the human eye and the last optical surface of the NED optical assembly (known as Eye Relief) is typically only 10~20 mm. Measurement equipment must complete entrance pupil positioning and image acquisition within this short working distance.

Limited Exit Pupil Size. An NED system has a designed Exit Pupil (XP), and the human iris must be located within this exit pupil to see the complete image. The exit pupil diameter is typically 4~10 mm. Valid measurement data can only be obtained when the measurement equipment’s optical entrance is also located within the exit pupil.

Eyebox Constraints. The eyebox defines the spatial volume within which the human eye can move while still seeing acceptable image quality. Within the eyebox, lateral or axial deviation from the optimal exit pupil position degrades the image, but only within acceptable limits. Measuring image quality at different eyebox positions is an important part of NED testing.

Wide Field of View. The Field of View (FOV) of immersive VR devices often reaches 90~120 degrees, far exceeding the viewing angle range of traditional displays. Measuring such a wide FOV poses strict requirements on lens optical design.

Measurement Differences between VR and AR

There are important differences between VR and AR in terms of measurement:

VR (Virtual Reality) is a closed digital environment that does not involve ambient lighting. Measurement conditions are relatively simple, primarily focusing on the luminance, chromaticity, uniformity, distortion, and resolution of the displayed image itself.

AR (Augmented Reality) overlays digital content on top of the real world, involving the transmittance of ambient lighting and overlay effects. Measurement needs to consider the visibility, contrast, and color accuracy of the displayed image under various ambient light conditions. AR optical solutions such as transparent lightguides also introduce additional measurement requirements like dispersion, FOV limitations, and exit pupil replication.

Why Specialized Lenses are Needed to Simulate the Human Entrance Pupil

Gamma Scientific near-eye display measurement system—Precision equipment specifically designed for VR/AR headsets (Image Source: Photonics.com / Gamma Scientific)

Limitations of Traditional Lenses

The entrance pupil (the effective aperture position where light enters the lens) of standard industrial lenses is typically located inside the lens’s mechanical structure, at a distance from the front glass surface. When such a lens is placed near an NED, the lens’s physical housing conflicts spatially with the NED’s optical assembly—the entrance pupil cannot be placed at the NED’s exit pupil position.

More importantly, even if such placement were achieved, the entrance pupil shape and size of a traditional lens would not match the characteristics of the human pupil. The human pupil diameter varies between 2~8 mm (depending on lighting conditions), and the pupil sits at the very front of the eyeball. Measurement equipment must simulate these characteristics to obtain results consistent with the human visual experience.

Core Design Requirements

An NED testing lens needs to satisfy four basic requirements:

  1. Front-Positioned Entrance Pupil: The imaging colorimeter’s entrance pupil must be fully located within the NED’s exit pupil, requiring the entrance pupil to be at the front of the lens mechanical structure rather than inside.
  2. Adjustable Entrance Pupil Size: The entrance pupil size must be smaller than the NED system’s exit pupil size and needs to be adjustable to simulate the human pupil state under different lighting conditions (e.g., 3.6 mm for photopic vision, 7 mm for scotopic vision).
  3. Wide FOV Coverage: The lens field of view needs to cover the full display range of the NED (potentially exceeding 100 degrees for VR devices).
  4. Sufficient Spatial Resolution: Providing adequate angular resolution for the measurement items of interest.

These requirements have led to two main lens design solutions.

Working Principles of Front-Stop Lenses

Near-eye display measurement system paired with a medium-field lens for NED optical parameter testing (Image Source: Tepil)

Eyepiece-Based Solutions

The core idea of a Front-Stop Lens is to place the aperture stop, which defines the entrance pupil, at the very front of the lens. One implementation uses eyepiece optical design:

Single Eyepiece Solution. An eyepiece with an appropriate focal length (e.g., 50 mm) is mounted directly to the front of the imaging colorimeter. The entrance pupil is in front of the eyepiece and can be placed within the NED’s exit pupil. This solution is structurally simple and suitable for high-resolution, small-field (about 15 degrees) measurements—for example, precise analysis of pixel-level details in the center of the NED.

When testing VR devices with a 50 mm front-stop eyepiece lens, sub-pixel structures in the image can be clearly seen. Imaging quality in the center is typically higher than in corners, where sharpness decreases and distortion increases.

Dual-Eyepiece Combination Solution. To cover a larger FOV range, a combination of two eyepieces can be used. The entire entrance pupil of the imaging colorimeter is located within the NED’s eyebox, and imaging is performed with a classical lens at the exit pupil of the dual-eyepiece system. This solution offers higher flexibility: by changing the focal length of the back-end lens, the balance between field of view and imaging resolution can be tuned.

For instance, using a dual-eyepiece front-stop lens combined with an 8 mm lens to test VR devices provides a luminance image covering a wide FOV range, suitable for eyebox testing and uniformity analysis.

Trade-off between Resolution and Field of View

There is a fundamental optical trade-off in NED testing:

  • A wide FOV requires a short-focal-length lens, which reduces angular resolution.
  • High resolution requires a long-focal-length lens or a larger sensor, which limits the field of view.
  • The sensor pixel count is fixed; the larger the FOV, the fewer pixels allocated per degree of field.
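The last point can be made concrete with a simple pixels-per-degree estimate. This is a minimal sketch; the 4000-pixel sensor width and the two FOV values are hypothetical, chosen only to illustrate the trade-off:

```python
def pixels_per_degree(sensor_pixels: int, fov_degrees: float) -> float:
    """Average angular sampling density along one axis of the field."""
    return sensor_pixels / fov_degrees

# Hypothetical 4000-pixel-wide sensor: a wide-field configuration spreads
# the same pixel budget over far more degrees than a narrow one.
wide = pixels_per_degree(4000, 120.0)   # wide-field configuration
narrow = pixels_per_degree(4000, 15.0)  # high-resolution configuration
print(f"wide:   {wide:.1f} px/deg")
print(f"narrow: {narrow:.1f} px/deg")
```

The same sensor delivers roughly eight times the angular sampling density in the narrow configuration, which is why two configurations are used in practice.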

Therefore, actual testing typically requires two configurations:

  1. High-Resolution Configuration (small field, long focal length): Used for pixel-level detection, MTF measurement, and sub-pixel analysis.
  2. Wide-Field Configuration (wide field, short focal length): Used for FOV measurement, overall uniformity assessment, and distortion analysis.

Working Principles of Conoscopic Lenses

Westboro Photonics XR1 specialized near-eye display lens—Front-stop design simulating human entrance pupil position (Image Source: Westboro Photonics)

Fourier Optics Principles

Conoscopic lenses (also called hypercentric or Fourier lenses) use an entirely different optical design philosophy. They leverage Fourier optics principles to map the angular distribution of light from the emitting surface onto the image sensor: each sensor pixel corresponds to a different emission angle rather than a spatial position.

Key features of this mapping include:

  • The aperture (entrance pupil) naturally resides at the very front of the lens.
  • Very large test fields (reaching 120 degrees or more) can be achieved.
  • The lens forms an intermediate real image, and the final image on the sensor is inverted.
  • Can be combined with back-end lenses of different focal lengths to achieve measurement of various fields.
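The angle-to-position mapping can be sketched as below. The ideal r = f·tan(θ) relation and the 10 mm focal length are illustrative assumptions only; a real conoscopic lens requires a calibrated mapping that accounts for its residual distortion:

```python
import math

def pixel_to_angle(r_mm: float, focal_mm: float) -> float:
    """Map a radial distance on the sensor (mm from the optical axis)
    to an emission angle in degrees, assuming an idealized
    r = f * tan(theta) mapping (real lenses are calibrated instead)."""
    return math.degrees(math.atan2(r_mm, focal_mm))

# With an assumed 10 mm Fourier focal length, a point 10 mm off-axis
# corresponds to an emission angle of about 45 degrees.
print(pixel_to_angle(10.0, 10.0))
```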

Applicable Scenarios

Conoscopic lenses are particularly suitable for the following NED testing scenarios:

  • Wide FOV Uniformity Measurement: A single acquisition covers the full NED field of view, evaluating the angular distribution of luminance and chromaticity.
  • Eyebox Scanning: By moving the conoscopic lens within the eyebox, image quality changes can be evaluated at different eye positions.
  • Field of View (FOV) Measurement: FOV parameters are extracted directly from the angular distribution image.
  • Contrast Evaluation: Evaluating the angular dependence of contrast across the entire field of view.

Limitations

The main limitation of conoscopic lenses is restricted spatial resolution. Because their design goal is angular rather than spatial resolution, imaging resolution decreases as FOV increases. Furthermore, some barrel distortion is inevitable across a wide FOV, requiring compensation through calibration and software correction.

Thus, conoscopic lenses are not suitable for testing items requiring high spatial resolution (such as pixel-level defect detection or high-frequency MTF measurement); those tasks should use front-stop eyepiece solutions.

Measurement Methods for Field of View (FOV)

Definition of FOV

The Field of View defines the maximum angular range a user can see through the NED, typically divided into horizontal, vertical, and diagonal FOV. For VR devices, immersion depends heavily on FOV—larger FOV means a stronger sense of spatial presence.

Measurement Method

Basic workflow for FOV measurement:

  1. Display Standard Patterns on the NED: Typically using a test frame with feature points marked at the four corners and the center.
  2. Capture Virtual Image: Use a measurement system with a wide-field configuration (such as a conoscopic lens or a short-focal-length front-stop lens) to capture images from the NED exit pupil position.
  3. Angular Calibration: Establish the correspondence between image pixel positions and angles using a standard target with known angular intervals.
  4. FOV Calculation: Measure the angles corresponding to the outermost visible feature points in the image to determine the effective FOV.
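Steps 3 and 4 can be sketched numerically. All pixel coordinates and angles below are hypothetical stand-ins for real calibration data, and the linear pixel-to-angle model is a simplification:

```python
import numpy as np

# Step 3: angular calibration -- fit a pixel-to-angle model from a
# standard target with known angular intervals (hypothetical data,
# 10-degree spacing between calibration marks).
cal_pixels = np.array([200.0, 960.0, 1720.0, 2480.0, 3240.0])
cal_angles = np.array([-20.0, -10.0, 0.0, 10.0, 20.0])
coeffs = np.polyfit(cal_pixels, cal_angles, 1)  # linear fit: angle = a*px + b

def pixel_to_angle(px):
    return np.polyval(coeffs, px)

# Step 4: FOV from the outermost visible feature points (hypothetical).
left_px, right_px = 80.0, 3840.0
fov = pixel_to_angle(right_px) - pixel_to_angle(left_px)
print(f"horizontal FOV ~ {fov:.1f} degrees")
```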

Effective FOV vs. Optical FOV

It is important to distinguish between two concepts:

  • Optical FOV: The theoretical maximum field of view of the NED optical system.
  • Effective FOV: The actual usable field of view for the user, provided that minimum luminance and image quality requirements are met.

Effective FOV is typically smaller than optical FOV because degraded luminance uniformity, reduced resolution, and increased chromatic aberration at the field edges make image quality unacceptable there. Determining the effective FOV requires identifying the usable boundary based on uniformity and image quality assessments.

Application of MTF in Near-Eye Display Evaluation

Why MTF is Critical for NEDs

The Modulation Transfer Function (MTF) quantifies an optical system’s ability to transfer spatial details. For an NED system, MTF comprehensively reflects microdisplay resolution, the quality of NED optical lenses, and the imaging performance of the entire light path.

NED MTF measurement has a key difference from that of ordinary displays: NED MTF can vary significantly across different positions in the field of view. MTF is typically highest in the center and gradually decreases toward the edges. This spatially varying MTF characteristic is an important metric for the quality of NED optical design.

Measurement Method

MTF of NEDs can be measured using the following methods:

Slanted Edge Method. Display a black-and-white edge on the NED at a small angle (e.g., 5 degrees) to the horizontal or vertical direction. Capture the virtual image of this edge with a high-resolution front-stop lens, analyze the sharpness of the grayscale transition, calculate the Edge Spread Function (ESF) and Line Spread Function (LSF), and obtain the MTF curve via Fourier transform.
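The ESF → LSF → Fourier transform chain can be sketched as follows. The tanh-shaped synthetic edge is an assumption standing in for the oversampled edge profile extracted from a real captured image:

```python
import numpy as np

def mtf_from_esf(esf: np.ndarray) -> np.ndarray:
    """Slanted-edge pipeline: differentiate the Edge Spread Function to
    get the Line Spread Function, then take the Fourier transform
    magnitude and normalize so that MTF(0) = 1."""
    lsf = np.gradient(esf)
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]

# Synthetic grayscale transition standing in for a measured edge profile.
x = np.linspace(-5.0, 5.0, 256)
esf = 0.5 * (1.0 + np.tanh(x / 0.8))
mtf = mtf_from_esf(esf)
print(f"MTF at the lowest nonzero frequency: {mtf[1]:.3f}")
```

A sharper edge (steeper ESF) yields a narrower LSF and therefore an MTF curve that stays high out to larger spatial frequencies.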

Line Pair Method. Display line pair patterns of different spatial frequencies (alternating black and white stripes) on the NED, measure the contrast between stripes at each frequency, and directly construct the relationship curve of MTF vs. spatial frequency.
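For the line pair method, the stripe contrast at each frequency is typically computed as Michelson contrast; the luminance readings below are hypothetical:

```python
def michelson_contrast(i_max: float, i_min: float) -> float:
    """Stripe contrast at one spatial frequency: (Imax - Imin) / (Imax + Imin).
    Normalizing by the contrast of a very low-frequency pattern gives the
    MTF value at that frequency."""
    return (i_max - i_min) / (i_max + i_min)

# Hypothetical luminance readings (cd/m^2) from bright and dark stripes:
low_freq = michelson_contrast(200.0, 2.0)    # near-DC reference pattern
high_freq = michelson_contrast(130.0, 70.0)  # fine line pairs
print(f"MTF at this frequency ~ {high_freq / low_freq:.2f}")  # ~0.31
```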

Evaluation of MTF Spatial Distribution

To comprehensively evaluate NED imaging quality, MTF should be measured at multiple positions in the field of view. Typically, measurements are taken at 9 or more positions including the center, four edges, and four corners to map the spatial distribution of MTF.

Spatially distributed MTF data provides direct guidance for optimizing NED optical design: it reveals where in the field the optical system has aberration bottlenecks, helping designers make targeted improvements to the lens structure.

Measurement of Distortion and Chromatic Aberration

Schematic of full-color AR near-eye display optical system—Showing lightguide and color channel design (Image Source: Light: Advanced Manufacturing)

Distortion Measurement

Distortion is almost inevitable in NED systems due to short-focal-length, wide-field optical designs. Common types include:

Barrel Distortion. Image edges bulge outward, with straight lines near edges curving into arcs. This is a typical feature of short-focal-length wide-angle lenses.

Pincushion Distortion. Image edges shrink inward.

Asymmetric Distortion. Caused by decentering or tilting of optical components.

Measurement involves displaying standard patterns of known geometric shapes (such as a checkerboard or dot matrix) on the NED, capturing virtual images at the exit pupil, and calculating distortion by comparing actual image shapes with standard patterns.

Notably, VR devices typically use software pre-distortion to compensate for optical distortion—intentionally introducing distortion in the opposite direction when rendering, so it appears distortion-free after passing through the optical system. Measurement needs to distinguish between “residual distortion after pre-distortion compensation” and “original distortion of the optical system itself.”
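A minimal sketch of the comparison step, using hypothetical dot-matrix feature positions (radial distances from the field center, in degrees; sign conventions for barrel versus pincushion vary between standards):

```python
import numpy as np

def radial_distortion_pct(ideal_r: np.ndarray, measured_r: np.ndarray) -> np.ndarray:
    """Percent distortion per grid point: (r_measured - r_ideal) / r_ideal * 100.
    Positive values mean features imaged farther out than the ideal grid."""
    return (measured_r - ideal_r) / ideal_r * 100.0

# Hypothetical radial positions of dot-matrix features: ideal grid
# vs. positions located in the captured virtual image.
ideal = np.array([10.0, 20.0, 30.0, 40.0])
measured = np.array([10.05, 20.3, 30.9, 42.0])
print(radial_distortion_pct(ideal, measured))  # grows toward the field edge
```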

Chromatic Aberration Measurement

Chromatic aberration is caused by the differing refractive indices of optical materials for various wavelengths of light. In NED systems, it manifests as:

Lateral Chromatic Aberration (LCA). Images of different colors shift laterally, causing colored fringes at the edges. LCA is usually most noticeable at field edges.

Longitudinal Chromatic Aberration. Images of different colors have different focal lengths longitudinally (in depth), making it impossible to focus precisely on all colors simultaneously.

Chromatic aberration is typically measured using the following steps:

  1. Display high-contrast patterns (such as crosshairs or grids) in red, green, and blue separately on the NED.
  2. Capture images for the three color channels using an imaging colorimeter at the same position and focus conditions.
  3. Compare the positional offsets of the same feature point across color channels to calculate LCA.
  4. Analyze differences in the best focus distance across color channels to calculate longitudinal chromatic aberration.
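Step 3 reduces to a centroid offset calculation; the feature centroids below are hypothetical pixel coordinates for one crosshair near the field edge:

```python
import numpy as np

def lateral_ca(centroid_a, centroid_b) -> float:
    """Lateral chromatic aberration as the Euclidean offset (in pixels)
    between the same feature's centroid in two color channels."""
    return float(np.hypot(centroid_a[0] - centroid_b[0],
                          centroid_a[1] - centroid_b[1]))

# Hypothetical red- and blue-channel centroids of one crosshair feature:
print(lateral_ca((1502.0, 880.5), (1499.0, 876.5)))  # 5.0 pixels
```

Multiplying the pixel offset by the calibrated angle-per-pixel factor expresses LCA in angular units, which is how it relates to visible color fringing.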

Automation Trends in NED Testing

GS-E10 NED comprehensive measurement system—Supporting automated eyebox scanning and multi-parameter testing (Image Source: Gamma Scientific / YouTube)

Robot-Assisted Testing

Comprehensive quality evaluation of an NED requires measurements at multiple positions within the eyebox (e.g., a 25-point grid of 5 horizontal by 5 vertical positions), with each position potentially requiring switches in test patterns and configurations. Manual operation is inefficient and poorly repeatable.

Automated solutions use multi-axis motion platforms or robotic arms to precisely position the imaging colorimeter, completing scanning measurements across the entire eyebox according to pre-programmed position sequences. Lightweight and compact imaging colorimeters are the hardware prerequisite for achieving this.
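The pre-programmed position sequence for such a scan can be sketched as follows, assuming a hypothetical 10 mm by 8 mm eyebox and the 5 x 5 grid mentioned above:

```python
def eyebox_grid(width_mm: float, height_mm: float, nx: int, ny: int):
    """Generate (x, y) stage positions for an nx-by-ny eyebox scan,
    centered on the nominal exit pupil position (0, 0)."""
    xs = [width_mm * (i / (nx - 1) - 0.5) for i in range(nx)]
    ys = [height_mm * (j / (ny - 1) - 0.5) for j in range(ny)]
    return [(x, y) for y in ys for x in xs]

# Hypothetical 10 mm x 8 mm eyebox scanned as a 5 x 5 grid of 25 points;
# the middle point of the sequence is the nominal exit pupil center.
grid = eyebox_grid(10.0, 8.0, 5, 5)
print(len(grid), grid[0], grid[12])
```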

Production Line Integration

For mass production of VR/AR devices, NED testing must be integrated into production lines and meet Takt Time requirements. Production line testing solutions typically make the following compromises:

  • Reducing the number of eyebox sampling points (e.g., measuring only the center and four edge points).
  • Selecting critical test items rather than comprehensive testing.
  • Using wide-field configurations to cover larger areas in a single acquisition.
  • Eliminating manual operation steps through software automation.

In production environments, precise alignment between the imaging colorimeter and the NED is key to efficiency and repeatability. Automatic alignment systems analyze the spot shape and position of the NED exit pupil to adjust the spatial pose of the imaging device in real-time, ensuring each measurement is conducted at the optimal exit pupil position.

Summary of Key Quality Parameters for NED Testing

A complete optical quality assessment of an NED includes the following core parameters:

| Test Item | Evaluation Content | Recommended Lens Configuration |
| --- | --- | --- |
| Luminance Uniformity | Full-field luminance distribution | Conoscopic lens or wide-field front-stop |
| Chromaticity Uniformity | Full-field chromaticity distribution | Conoscopic lens or wide-field front-stop |
| FOV Measurement | Horizontal/Vertical/Diagonal FOV | Conoscopic lens |
| Contrast | Full On/Off contrast, local contrast | Conoscopic or front-stop lens |
| MTF | Spatial resolution at multiple positions | High-res front-stop eyepiece |
| Distortion | Barrel/Pincushion/Asymmetric distortion | Wide-field front-stop lens |
| Chromatic Aberration | Lateral/Longitudinal chromatic aberration | High-res front-stop eyepiece |
| Virtual Image Distance | Focus position and depth | Electronic focus front-stop lens |
| Eyebox Uniformity | Image quality changes at different eye positions | Conoscopic lens (with motion platform) |
| Mura Detection | Pixel-level and area-level non-uniformity | High-res front-stop eyepiece |

Different test items require different lens configurations, meaning a complete NED testing system typically needs multiple lenses and the ability to switch between them quickly.

FAQ

Q1: Why do VR/AR near-eye display tests require specially designed lenses?

NED measurement equipment must simulate the human eye by placing its optical entrance at the position where the human pupil should be. However, standard industrial lenses have their entrance pupil inside the mechanical structure, causing spatial conflicts with NED optical assemblies when placed close—unable to reach the NED’s exit pupil position at the typical 10-20mm eye relief distance. This led to two specialized designs: front-stop lenses that place the aperture at the lens front, suitable for high-resolution small-field measurements; and conoscopic lenses that use Fourier optics to map angular distribution to the sensor, covering fields of view over 120 degrees. Both feature adjustable entrance pupils (3.6-7mm) to simulate human pupil states under different lighting conditions.

Q2: How does MTF measurement for near-eye displays differ from ordinary displays?

NED MTF measurement has a key distinction: MTF can vary significantly across different field positions, typically being highest at the center and gradually decreasing toward the edges. Therefore, MTF must be measured separately at multiple positions in the field (usually 9 or more, including center, four edges, and four corners) to map its spatial distribution. Methods include the slanted edge method (analyzing grayscale transitions of a black-white edge) and the line pair method (measuring contrast of different frequency stripe patterns). This spatially distributed MTF data directly guides NED optical design optimization by revealing aberration bottleneck locations.

Q3: What are the main optical testing differences between VR and AR devices?

VR is a closed digital environment without ambient lighting, making measurement conditions relatively simple and focused on the displayed image’s luminance, chromaticity, uniformity, distortion, and resolution. AR overlays digital content on the real world, requiring additional consideration of ambient lighting transmittance and overlay effects, as well as evaluating display image visibility, contrast, and color accuracy under various ambient light conditions. AR optical solutions like transparent lightguides introduce further measurement requirements such as dispersion, FOV limitations, and exit pupil replication, making AR optical testing significantly more complex than VR.


This article is part of the Imaging Colorimeter Technology Knowledge Base series.