
Introduction: Measuring a “Non-Existent Image”#
Measurement of a Head-Up Display (HUD) differs fundamentally from that of conventional flat-panel displays: the instrument measures not a physically existing light-emitting surface, but a virtual image projected by an optical system into the driver’s line of sight. This virtual image floats several meters in front of the windshield with no physical support. Its luminance, chromaticity, clarity, and geometric accuracy depend on the coordinated performance of every optical component in a complex light path.
For traditional spot luminance meters and colorimeters, measuring a virtual image of variable distance and limited size is already challenging. A comprehensive quality assessment of a HUD—including ghosting detection, distortion measurement, MTF analysis, and virtual image distance verification—requires a system-level solution combining an imaging colorimeter with an electronic lens and specialized analysis software.
Optical System Principles of HUD#
Basic Architecture#
The core light path of a HUD system consists of three key components:
Picture Generation Unit (PGU). Responsible for generating the original display image. Common PGU technologies include TFT-LCD panels (with LED backlighting), DLP (Digital Light Processing) chips, and LCoS (Liquid Crystal on Silicon) chips. The resolution, luminance, and contrast of the PGU directly determine the basic image quality of the virtual image.
Optical Magnification and Folding System. Includes optical components such as mirrors (flat and curved) and lenses, responsible for magnifying, folding, and adjusting the direction of light emitted by the PGU, eventually directing it to the projection area on the windshield. Curved mirrors simultaneously perform the functions of magnification and aberration correction.
Windshield (Combiner). Serves as a semi-reflective, semi-transmissive optical component that overlays the projected HUD image into the driver’s forward field of view. The curvature, tilt angle, laminated structure, and coating characteristics of the windshield directly affect virtual image quality.
Three Types of HUD#

C-HUD (Combiner HUD). Uses an independent translucent combiner (usually a plastic lens) mounted above the dashboard. It has a small projection area, a short virtual image distance (about 2 meters), and low cost, common in entry-level models and the aftermarket.
W-HUD (Windshield HUD). Directly utilizes the windshield as the combiner. It has a larger projection area and a virtual image distance ranging from 2 to 10 meters, making it the current mainstream factory-installed HUD solution. Since windshields are not originally designed for optical applications, their wedge-shaped structure and curvature variations can introduce issues like distortion and double images.
AR-HUD (Augmented Reality HUD). Extends the virtual image distance to 7–15 meters or further and achieves spatial superposition with real road scenes (e.g., navigation arrows overlaid on actual lanes). AR-HUD places the highest demands on virtual image distance accuracy, geometric consistency, and dynamic response speed.
Particularities of Virtual Image Distance Measurement#
Why Virtual Image Distance is Important#
The projection distance of a HUD virtual image directly affects the driver’s experience and safety. If the distance is too short, the driver’s eyes must frequently refocus between the distant road surface and the near virtual image, increasing visual fatigue. If the distance deviates significantly from its nominal value, AR-HUD virtual information will not accurately overlay real scenes, leading to misleading information.
Measurement Principle#
Measurement of virtual image distance utilizes basic optical imaging principles: when an imaging colorimeter’s lens focuses on the virtual image plane, the distance can be back-calculated based on the lens’s focal setting.
Specifically, an imaging colorimeter equipped with an electronic lens captures virtual images at multiple focus positions. By analyzing the image sharpness (e.g., edge sharpness or contrast) at each position, the focal value corresponding to the sharpest virtual image is determined and converted into actual distance units.
The accuracy of this process depends on:
- The focal step precision and repeatability of the electronic lens.
- The sensitivity of the sharpness evaluation algorithm.
- The depth of field range of the virtual image itself (it is not an infinitely thin plane but has a certain depth).
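As an illustration of this sweep-and-score procedure, the sketch below uses a Tenengrad-style sharpness metric and parabolic interpolation of the sharpness peak. The function names and the assumption that the lens reports focus positions in diopters (1/m) are ours for illustration, not any particular instrument's API:

```python
import numpy as np

def sharpness(img):
    """Tenengrad-style sharpness score: mean squared gradient magnitude."""
    gy, gx = np.gradient(img.astype(float))
    return float(np.mean(gx**2 + gy**2))

def peak_focus(diopters, scores):
    """Locate the sharpest focus position in a focus sweep.

    diopters: uniformly spaced lens focus positions in 1/m.
    scores:   sharpness score captured at each position.
    Returns the interpolated best-focus diopter and the corresponding
    virtual image distance in meters.
    """
    i = int(np.argmax(scores))
    if 0 < i < len(scores) - 1:
        # Fit a parabola through the peak sample and its two neighbors;
        # its vertex gives sub-step focus resolution.
        y0, y1, y2 = scores[i - 1], scores[i], scores[i + 1]
        step = diopters[i + 1] - diopters[i]
        d = diopters[i] + 0.5 * step * (y0 - y2) / (y0 - 2 * y1 + y2)
    else:
        d = diopters[i]
    return d, 1.0 / d
```

Parabolic interpolation is what lets the estimated distance resolve finer than the lens's focal step size, which is why step precision and repeatability dominate the error budget.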
Challenge: Virtual Images are Not on a Flat Plane#
In W-HUDs and AR-HUDs, because the windshield has curvature, the virtual image is not projected onto a flat plane but presents a certain curved shape. This means the distance of different areas of the virtual image may vary—the center area might be at 7 meters, while edges might be at 6.5 or 7.5 meters.
Therefore, virtual image distance measurement cannot be done at a single point; it must be measured at multiple sampling positions across the virtual image to generate a distance distribution map and evaluate distance uniformity.
Automatic Focusing of Virtual Images with Electronic Lens#

Why Electronic Focus is Needed#
Traditional fixed-focus lenses can only focus at a fixed distance, making them unsuitable for HUD measurement where the virtual image distance is variable. Electronic focus lenses (also called electronically controlled focus lenses) adjust focus positions automatically via software commands within a continuous distance range.
Typical working modes for electronic lenses in HUD measurement include:
Focus Sweep Mode. The lens scans continuously from its nearest to farthest focus, capturing an image at each position. By analyzing the sharpness variation curve across the scan, the distance corresponding to the sharpest focus point is determined.
Focus Mode. Given an approximate virtual image distance, the lens focuses directly to the target distance to capture a single frame for luminance, chromaticity, or spatial quality analysis.
Calibration Requirements#
The focus settings of an electronic lens must correspond precisely to actual distances. Calibration is typically performed using standard targets at known distances: high-contrast targets (such as ISO 12233 resolution charts or checkerboard patterns) are placed at different distances, and the focal control values when the lens is focused at each distance are recorded to build a focal-distance calibration curve.
Calibration precision directly determines the accuracy of virtual image distance measurement. Ambient temperature changes can cause focal drift, so in strict applications, multi-temperature point calibration is required across the operating temperature range.
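A focal–distance calibration curve of this kind can be modeled in a few lines. The control values below are purely illustrative (real values depend on the specific lens), and the assumption that the control value varies near-linearly with diopters is a common but not universal lens characteristic:

```python
import numpy as np

# Hypothetical calibration pairs: focus control values recorded with
# high-contrast targets placed at known distances (illustrative numbers).
ctrl = np.array([500.0, 328.6, 233.3, 200.0, 180.0])
dist_m = np.array([2.0, 3.5, 6.0, 8.0, 10.0])

# Electronically focused lenses tend to respond near-linearly in
# diopters (1/m), so fit control value against reciprocal distance.
slope, intercept = np.polyfit(1.0 / dist_m, ctrl, 1)

def ctrl_to_distance(c):
    """Invert the calibration curve: focus control value -> distance (m)."""
    diopter = (c - intercept) / slope
    return 1.0 / diopter
```

In a temperature-compensated setup, one such curve would be stored per calibrated temperature point and interpolated at run time.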
Causes and Detection Methods for Ghosting#

Physical Causes of Ghosting#
Ghosting is the most common and concerning image quality issue in W-HUDs. When light from the PGU reaches the windshield, part of it is reflected at the outer surface to form a primary image, while another part penetrates the outer surface and is reflected at the inner surface, forming a secondary image with a positional offset. These two images overlap in the driver’s field of view, creating a ghosting effect.
The severity of ghosting depends on:
- Windshield Thickness: Thicker glass results in a larger offset between reflections from the inner and outer surfaces.
- Windshield Wedge Angle: A wedge-shaped laminated structure is the standard means to mitigate ghosting by making the inner and outer surfaces non-parallel, thereby separating the directions of the primary and secondary images.
- Angle of Incidence: The angle at which light hits the glass affects reflectivity and the offset direction.
- Luminance Distribution of PGU: Ghosting from high-luminance areas is more perceptible.
Detection Methods for Ghosting#
Standard ghosting detection using an imaging colorimeter involves:
- Displaying high-contrast test patterns: Displaying white blocks or line patterns (against a black background) on the PGU so both primary and ghost images are clearly captured.
- Image Acquisition: The imaging colorimeter focuses at the virtual image distance and captures the complete image containing both primary and ghost images.
- Ghosting Separation: Software algorithms automatically identify primary and ghost images and calculate the spatial offset between them (usually in pixels or angles).
- Quantification of Ghosting Intensity: Calculating the ratio of ghost image luminance to primary image luminance, known as ghosting contrast. Typical pass standards require ghosting contrast to be below a certain threshold.
- Ghosting Direction Analysis: Determining the offset direction of the ghost image relative to the primary image (usually vertical, depending on the glass wedge angle design).
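The ghost separation and contrast quantification steps can be sketched for the simple case of a single horizontal bright line, where the primary image and its ghost appear as two peaks in the vertical luminance profile. `ghost_metrics` is a hypothetical helper; production software handles 2-D patterns and sub-pixel offsets:

```python
import numpy as np

def ghost_metrics(lum, min_sep=5):
    """Vertical ghost offset (px) and ghosting contrast for a horizontal line.

    lum: captured luminance image containing the primary line and its ghost.
    Assumes the ghost is a dimmer, vertically offset copy of the line.
    """
    profile = lum.mean(axis=1)              # collapse to a vertical profile
    primary = int(np.argmax(profile))       # brightest row = primary image
    # Suppress the primary peak and its neighborhood, then find the ghost.
    masked = profile.copy()
    masked[max(0, primary - min_sep):primary + min_sep + 1] = 0.0
    ghost = int(np.argmax(masked))
    offset_px = abs(ghost - primary)
    contrast = profile[ghost] / profile[primary]   # ghosting contrast ratio
    return offset_px, contrast
```

The returned contrast ratio is what gets compared against the pass/fail threshold; the pixel offset converts to an angular offset via the calibrated pixel pitch and virtual image distance.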
Application of JND in Ghosting Evaluation#
Research shows that human perception of ghosting relates to its contrast, offset, and background luminance; simple luminance ratios do not fully reflect perceptibility. Consequently, some advanced detection solutions introduce ghosting evaluation models based on JND (just-noticeable difference), considering visual factors like spatial frequency and background adaptation to provide a ghosting severity score closer to actual human experience.
Distortion Measurement#
Sources of Distortion#
Distortion in a HUD virtual image stems from the combined effects of all optical components:
- The PGU itself may have optical distortion.
- Surface errors of mirrors cause local magnification variations.
- Non-uniform windshield curvature introduces asymmetric distortion.
- Installation tolerances lead to relative position shifts of optical components.
For AR-HUDs, distortion is particularly critical: if a navigation arrow shifts away from an actual lane due to distortion, it directly impacts driving safety.
Measurement Methods#
Distortion measurement typically follows these steps:
- Displaying Standard Patterns: Displaying a dot matrix or grid test pattern on the PGU with points arranged at uniform intervals.
- Capturing Virtual Image: The imaging colorimeter captures the dot matrix or grid image within the virtual image.
- Point Extraction: Software automatically identifies the actual position coordinates of each test point in the image.
- Deviation Calculation: Comparing the actual position of each test point with its ideal position (expected position without distortion) to calculate deviation vectors.
- Quantification of Distortion: Calculating distortion percentages using standard formulas, distinguishing between barrel, pincushion, and asymmetric distortions.
Common distortion evaluation methods refer to definitions in the SAE J1757-2 standard, using 9 or more standard points and calculating distance deviations from reference line segments.
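The point-deviation computation can be illustrated as follows. The radial-deviation percentage used here is one common formulation; SAE J1757-2 defines its own metrics, so treat this only as a sketch:

```python
import numpy as np

def grid_distortion(ideal, measured, center):
    """Per-point deviation vectors and a radial distortion percentage.

    ideal, measured: (N, 2) arrays of dot centroids (ideal vs. captured).
    center: optical/image center used as the radial reference point.
    """
    dev = measured - ideal                                # deviation vectors
    r_ideal = np.linalg.norm(ideal - center, axis=1)
    r_meas = np.linalg.norm(measured - center, axis=1)
    # Positive percentages indicate pincushion (outward) displacement,
    # negative indicate barrel (inward); the center point is defined as 0.
    with np.errstate(divide="ignore", invalid="ignore"):
        pct = np.where(r_ideal > 0,
                       100.0 * (r_meas - r_ideal) / r_ideal, 0.0)
    return dev, pct
```

A uniform positive percentage across the field indicates pure magnification error, while sign changes across the field reveal asymmetric distortion.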
Field of View (FOV) Measurement#
FOV defines the angular range of the virtual image within the driver’s field of view. Horizontal and vertical FOVs are calculated by measuring distances between standard test points in the image and combining them with the virtual image distance.
Typical W-HUD horizontal FOV is about 5–10 degrees, with vertical FOV around 2–4 degrees. AR-HUDs have larger FOVs, reaching 10–15 degrees horizontally and 5–7 degrees vertically. FOV measurement precision directly impacts the accuracy of virtual image size evaluation.
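The underlying geometry is a single arctangent: an image extent w viewed at virtual image distance d subtends an angle of 2·atan(w / 2d). A minimal helper (the numbers in the test are illustrative, not a specification):

```python
import math

def fov_deg(image_extent_m, distance_m):
    """Angular field of view (degrees) subtended by a virtual image extent
    of the given size at the given virtual image distance."""
    return 2.0 * math.degrees(math.atan(image_extent_m / (2.0 * distance_m)))
```

For example, a virtual image about 1 m wide at 7.5 m yields a horizontal FOV of roughly 7.6 degrees, squarely in the typical W-HUD range; this is also why FOV accuracy depends directly on virtual image distance accuracy.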
Evaluation of Luminance and Contrast under Different Ambient Light Conditions#
Challenges under Sunlight#
The most extreme luminance challenge for HUDs is sunlight. Under sunny conditions, the ambient luminance outside the windshield can exceed 10,000 cd/m². For the virtual image to remain readable against such a bright background, its own luminance must be high enough, and the contrast with the background must meet minimum readability thresholds.
The ISO 15008 standard defines minimum luminance and contrast requirements for automotive displays and HUDs under various ambient illuminance levels. Testing involves simulating different conditions (from night to intense sunlight) and evaluating virtual image readability for each.
Luminance Uniformity#
Like ordinary displays, HUD virtual images have luminance uniformity issues. Due to characteristics of curved mirrors in the light path, the center area of a virtual image is usually brighter than the edges. Uniformity is evaluated similarly to displays by using an imaging colorimeter to capture the luminance distribution over the whole virtual image and calculating deviation ratios from the center luminance for each sampling point.
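A minimal sketch of this sampling scheme, assuming a 3 × 3 patch grid (nine-point sampling is common, but grid layouts vary by specification):

```python
import numpy as np

def uniformity(lum, n=3):
    """Mean luminance of an n x n patch grid over the virtual image,
    expressed relative to the center patch (1.0 = same as center)."""
    h, w = lum.shape
    ys = np.linspace(0, h, n + 1).astype(int)
    xs = np.linspace(0, w, n + 1).astype(int)
    patches = np.array([[lum[ys[i]:ys[i + 1], xs[j]:xs[j + 1]].mean()
                         for j in range(n)] for i in range(n)])
    return patches / patches[n // 2, n // 2]
```

Values below 1.0 at the grid corners quantify the edge falloff introduced by the curved mirrors in the light path.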
Contrast Measurement#
HUD contrast evaluation involves two dimensions:
Full On/Full Off Contrast. Measuring luminance in the virtual image area for full white and full black frames separately and calculating the ratio. This metric reflects the PGU’s basic contrast performance.
Checkerboard Contrast. Displaying a checkerboard pattern and measuring luminance in adjacent white and black squares to calculate local contrast. This metric better reflects contrast performance between adjacent high and low brightness areas in actual use and is closer to display scenarios for text and icons.
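Both metrics reduce to luminance ratios. The sketch below assumes the checkerboard cell at row 0, column 0 is white and that the captured image is already cropped and rectified to the board; a real measurement must first correct for distortion and registration:

```python
import numpy as np

def full_on_off_contrast(lum_white, lum_black):
    """Ratio of mean luminance of a full-white frame to a full-black frame."""
    return lum_white.mean() / lum_black.mean()

def checkerboard_contrast(lum, rows, cols):
    """Local contrast: mean luminance of white cells over black cells.

    lum: captured luminance image of the checkerboard, cropped to the board.
    """
    h, w = lum.shape
    cell_means = np.array([[lum[i * h // rows:(i + 1) * h // rows,
                                j * w // cols:(j + 1) * w // cols].mean()
                            for j in range(cols)] for i in range(rows)])
    # Cells where (row + col) is even are assumed white.
    white = (np.add.outer(np.arange(rows), np.arange(cols)) % 2) == 0
    return cell_means[white].mean() / cell_means[~white].mean()
```

Checkerboard contrast is typically much lower than full on/off contrast because stray light from adjacent bright cells raises the black-cell luminance.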
MTF (Modulation Transfer Function) Measurement#
MTF measures an optical system’s ability to transfer contrast at different spatial frequencies and is a key metric for evaluating HUD virtual image clarity.
HUD MTF measurement typically uses the ISO 12233 slanted-edge method: a precise slanted edge (black-white boundary) is displayed on the PGU. By analyzing the grayscale transition profile of this edge in the virtual image, the Edge Spread Function (ESF) and Line Spread Function (LSF) are calculated, and the MTF curve is obtained via Fourier transform.
The MTF curve shows the decay of contrast as spatial frequency increases. At specific cutoff frequencies, the MTF value must meet minimum requirements (e.g., MTF > 0.3 at half the Nyquist frequency) to ensure that text and icons in the virtual image have sufficient clarity.
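The ESF → LSF → MTF chain can be demonstrated on a one-dimensional, already-oversampled edge profile. A full slanted-edge analysis first projects the 2-D edge region onto an oversampled 1-D ESF, a step omitted here for brevity:

```python
import numpy as np

def mtf_from_esf(esf):
    """Simplified slanted-edge pipeline on a 1-D edge profile:
    ESF -> LSF (numerical derivative) -> MTF (normalized FFT magnitude)."""
    lsf = np.gradient(esf)                  # line spread function
    lsf = lsf * np.hanning(len(lsf))        # window to suppress truncation ripple
    mtf = np.abs(np.fft.rfft(lsf))          # modulation vs. spatial frequency
    return mtf / mtf[0]                     # normalize so MTF(0) = 1
```

A sharp edge yields an MTF that stays high across frequencies, while a blurred edge shows the characteristic decay that the cutoff-frequency criteria (e.g., MTF > 0.3) are checked against.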
Spot Size Analysis#
Spot size testing evaluates imaging sharpness on the HUD virtual image plane. Ideally, each point on the PGU would image to an infinitesimally small point on the virtual image plane. In reality, due to various aberrations in the optical system, each point in the virtual image expands into a spot with a finite diameter.
Evaluation metrics for spot size include:
- Average: The mean spot diameter of all sampling points in the virtual image area, reflecting overall system sharpness.
- Standard Deviation: The spatial fluctuation of spot size, reflecting sharpness uniformity.
- Minimum: The smallest spot in the field, indicating the system’s best-case imaging performance.
- Maximum: The largest spot, indicating worst-case performance; crucial for identifying design weaknesses.
Smaller and more uniform spot sizes indicate higher and more consistent virtual image clarity.
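These statistics are straightforward once a per-point spot diameter estimate exists. The sketch below approximates the diameter from the area above half the peak intensity (an equal-area-circle FWHM), which is one of several possible definitions:

```python
import numpy as np

def spot_diameter_fwhm(spot):
    """Approximate spot diameter: count pixels at or above half the peak,
    then return the diameter of a circle with that area (in pixels)."""
    area = np.count_nonzero(spot >= 0.5 * spot.max())
    return 2.0 * np.sqrt(area / np.pi)

def spot_statistics(diameters):
    """Aggregate the per-sampling-point diameters into the four metrics."""
    d = np.asarray(diameters, dtype=float)
    return {"mean": d.mean(), "std": d.std(), "min": d.min(), "max": d.max()}
```

Pixel-unit diameters convert to angular or physical sizes via the calibrated pixel pitch at the measured virtual image distance.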
Components of a HUD Measurement System#

A complete HUD optical measurement system typically includes:
Imaging Colorimeter or Luminance Meter. Serves as the core measurement device, providing high-resolution, high-dynamic-range spatial optical measurement. Imaging colorimeters are used when chromaticity performance also needs evaluation; otherwise, a luminance meter suffices.
Electronic Focus Lens. Supports software-controlled continuous focus, enabling automatic distance measurement and rapid switching of focus between different distance planes.
HUD-Specific Software Module. Integrates specialized functions for ghosting analysis, distortion measurement, MTF evaluation, virtual image distance calculation, luminance/chromaticity analysis, FOV measurement, etc. Test sequences can be automatically executed according to standards like SAE J1757-2.
Darkroom or Controlled Lighting Environment. Baseline measurements are typically performed in a darkroom (ambient illuminance < 1 lux). Evaluating performance under different ambient light conditions requires a controllable ambient light simulation system.
Positioning Fixture. Precisely fixes the HUD module or vehicle windshield position, ensuring repeatability of measurement geometry. The optical axis of the imaging colorimeter needs to align with the HUD’s design eye point, the preset position of the driver’s eyes.
Solar Load Testing#
In addition to conventional optical performance tests, HUDs must pass solar load testing. Intense sunlight could enter the system via the HUD’s reverse light path and converge on the PGU surface, causing thermal damage. This test is typically conducted under controlled conditions using a solar simulator and is not measured by an imaging colorimeter, but is essential in the full HUD validation process.
From Module to Vehicle: Two Stages of HUD Measurement#

Module-Level Measurement#
On HUD manufacturers’ production lines, module-level measurement tests independent HUD units in darkroom environments. This measures the optical performance of the HUD module itself, excluding windshield effects. It can efficiently screen for PGU defects, optical alignment deviations, and mechanical assembly issues.
Vehicle-Level Measurement#
After a HUD is installed in a vehicle, vehicle-level measurement is performed through the windshield. This measures the quality of the virtual image actually seen by the driver, including all optical effects introduced by the windshield (ghosting, distortion, dispersion, transmittance attenuation, etc.). Vehicle-level measurement is more complex, requiring the imaging equipment to be precisely placed at the design eye point and considering virtual image quality variations at different positions within the eyebox.
A significant application of vehicle-level measurement is evaluating and optimizing windshield optical design—by comparing module-level and vehicle-level results, specific impacts of the windshield on virtual image quality can be isolated, providing a quantitative basis for process improvement by glass suppliers.
FAQ#
Q1: How does HUD ghosting occur and how is it detected?#
Ghosting is the most common image quality issue in W-HUDs. When light from the PGU reaches the windshield, part is reflected at the outer surface forming a primary image, while another part penetrates and reflects at the inner surface forming a positionally offset secondary image. These overlap in the driver’s view to create ghosting. Severity depends on glass thickness, wedge angle, incidence angle, and PGU luminance distribution. Detection involves displaying high-contrast patterns on the PGU, capturing images at the virtual image distance with an imaging colorimeter, then software automatically separates primary and ghost images, calculates spatial offset and luminance ratio (ghosting contrast), and compares against thresholds for pass/fail judgment.
Q2: Why does HUD measurement require an electronic focus lens?#
HUD virtual image projection distances are variable (2–10 meters for W-HUD, 7–15 meters or more for AR-HUD), and due to windshield curvature, different areas of the virtual image may be at different distances. Traditional fixed-focus lenses can only focus at a single distance and cannot accommodate these needs. Electronic focus lenses adjust focal position via software commands, supporting Focus Sweep mode (continuous scanning to determine the sharpest focus distance) and Focus Mode (directly focusing at a target distance for image capture), enabling automatic virtual image distance measurement and rapid focus switching between different distance planes—making them a core component of HUD optical measurement systems.
Q3: What is the difference between module-level and vehicle-level HUD measurement?#
Module-level measurement tests independent HUD units in a darkroom, measuring the optical performance of the HUD module itself without windshield effects. It efficiently screens for PGU defects, optical alignment deviations, and assembly issues. Vehicle-level measurement is performed after the HUD is installed in the vehicle, measuring through the windshield to assess the virtual image quality actually seen by the driver, including all optical effects introduced by the windshield such as ghosting, distortion, dispersion, and transmittance attenuation. The imaging equipment must be placed at the design eye point. Comparing results from both levels isolates the specific impact of the windshield on virtual image quality.
This article is part of the Imaging Colorimeter Technology Knowledge Base series.
