
Traceability and Calibration: Bridging Lab Standards to Production Floors

Metrological Traceability Chain Pyramid—The hierarchical transfer structure from national standards to field instruments (Image Source: Calibration Awareness)

Introduction: Why “Measuring Accurately” is More Important than “Just Measuring”

In the quality control system of the display industry, the Imaging Colorimeter serves as a core measurement instrument. However, the fact that a device can output luminance values and color coordinates does not mean those values are accurate or trustworthy. Whether a measurement instrument is "credible" depends not on its nominal accuracy, but on whether its measurement results possess Metrological Traceability.

The definition of metrological traceability is clear: the property of a measurement result whereby the result can be related to a national or international metrological standard through a documented unbroken chain of calibrations, each contributing to the measurement uncertainty. Measurement data lacking traceability is meaningless in quality comparisons across factories or suppliers—because it is impossible to determine whether the difference between two sets of data stems from actual differences in the products being tested or from the deviations of the measurement instruments themselves.

This article will elucidate the structure of the metrological traceability chain, explain the complete process for imaging colorimeters from factory calibration to user calibration, and discuss the concept of calibration uncertainty—an often overlooked but vital concept.

I. National Metrology Standards: The Top of the Traceability Chain

Hierarchical Diagram of Metrological Traceability Chain—Transfer of Calibration Certificates from International Standards to Working Standards (Image Source: Zero Instrument)

1.1 International and National Metrological Systems

The top level of the global metrological traceability system is coordinated by the International Bureau of Weights and Measures (BIPM, Bureau International des Poids et Mesures). In the fields of photometry and colorimetry, the National Metrology Institutes (NMIs) of each country are responsible for maintaining and reproducing their national photometric standards. Major NMIs include:

  • NIST (National Institute of Standards and Technology, USA): Maintains US photometric and radiometric standards. NIST has developed and maintains a new generation of transfer standards and working standard illuminance meters and tristimulus colorimeters, determining spectral responsivity on its SIRCUS (Spectral Irradiance and Radiance Responsivity Calibrations using Uniform Sources) and SCF (Spectral Comparator Facility), achieving consistency of illuminance responsivity within 0.1% and expanded uncertainty (k=2) of 0.2%.
  • NIM (National Institute of Metrology, China): Established in 1955 and affiliated with the State Administration for Market Regulation, it is China’s highest metrological science research center and a national-level legal metrological technical institution, undertaking the establishment, preservation, and transfer of national standards for photometry, colorimetry, etc.
  • PTB (Physikalisch-Technische Bundesanstalt, Germany): One of Europe’s most important metrological institutions, with deep accumulation in the fields of photometry and radiometry.

These NMIs ensure the consistency of national standards through International Comparisons. This means that whether the traceability chain of an imaging colorimeter ultimately traces back to NIST or NIM, its measurement results should theoretically be equivalent—provided the traceability chain is complete and uncertainty is correctly evaluated.

1.2 Physical Realization of Photometric Standards

The base unit of photometry is the candela (cd), one of the seven base units of the International System of Units (SI). The candela is defined by fixing the luminous efficacy of monochromatic radiation at a frequency of 540 × 10¹² Hz (≈555 nm) at exactly 683 lm/W; broadband photometric quantities are then obtained by weighting radiant power with the human photopic spectral luminous efficiency function V(λ). NMIs realize photometric standards by measuring radiant power with absolute radiometers and combining this with the V(λ) function.

In terms of colorimetry, the three color matching functions x̄(λ), ȳ(λ), and z̄(λ) of the CIE 1931 Standard Colorimetric Observer form the mathematical basis of colorimetric measurement. NMIs achieve the transfer of colorimetric values using precisely calibrated spectroradiometers.
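As a rough sketch of how these weightings turn a measured spectrum into photometric and colorimetric quantities, the following Python performs the tristimulus integration numerically. The function names are ours, and any sample spectra used with it are illustrative placeholders, not the real CIE 1931 tables.

```python
# Sketch: turn a sampled spectral power distribution (SPD) into
# tristimulus values by weighting with the colour-matching functions
# (CMFs) and integrating. Illustrative only — real code would use the
# published CIE 1931 tables at 1 nm or 5 nm steps.

KM = 683.0  # lm/W, luminous efficacy of 540 THz monochromatic radiation

def tristimulus(spd, xbar, ybar, zbar, dlam):
    """Rectangular-rule integration of the SPD against the CMFs.
    If spd is spectral radiance in W/(sr*m^2*nm), Y is luminance in cd/m^2."""
    X = KM * sum(s * w for s, w in zip(spd, xbar)) * dlam
    Y = KM * sum(s * w for s, w in zip(spd, ybar)) * dlam
    Z = KM * sum(s * w for s, w in zip(spd, zbar)) * dlam
    return X, Y, Z

def chromaticity(X, Y, Z):
    """CIE 1931 (x, y) chromaticity coordinates."""
    total = X + Y + Z
    return X / total, Y / total
```

An equal-energy spectrum weighted by equal CMF samples lands at x = y = 1/3, which is a quick sanity check for an implementation like this.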

II. Structure of the Metrological Traceability Chain

Basic Concepts of Instrument Traceability—Relationships between Standards, Calibration, and the Traceability Chain (Image Source: Dracal Technologies)

2.1 From National Standards to Field Instruments

A complete metrological traceability chain typically includes the following levels:

Level 1: National Primary Standards The highest-level metrological standards preserved and maintained by NMIs (e.g., NIST, NIM). These standards are usually not directly used for daily calibration work but for calibrating transfer standards.

Level 2: Transfer Standards High-level instruments calibrated by NMIs, such as high-precision spectroradiometers. After being sent to an NMI for calibration, these instruments return to instrument manufacturers or qualified calibration laboratories to serve as intermediate links in the calibration chain.

Level 3: Working Standards Reference-grade standard light sources or reference-grade instruments used in the calibration laboratories of instrument manufacturers. The values of working standards are obtained through transfer standards, and they participate directly in the calibration process of imaging colorimeters before they leave the factory.

Level 4: Field Instruments The imaging colorimeters used by end-users on production lines or in laboratories. Their values are obtained from working standards through factory calibration.

Each level of transfer introduces additional measurement uncertainty. Therefore, the longer the traceability chain, the greater the final uncertainty. High-quality instrument manufacturers strive to shorten the length of the traceability chain and control the propagation of uncertainty at each step.
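The growth of uncertainty along the chain can be illustrated with a root-sum-square combination of independent per-level standard uncertainties. The per-level figures below are assumptions for illustration, not values from any particular laboratory.

```python
import math

def chain_expanded_uncertainty(level_u, k=2.0):
    """Combine independent per-level standard uncertainties (in %) by
    root-sum-square, then expand with coverage factor k."""
    u_c = math.sqrt(sum(u * u for u in level_u))
    return k * u_c

# Assumed standard uncertainties, national standard down to the field
# instrument (percent) — illustration only:
levels = [0.1, 0.3, 0.5, 0.8]
U = chain_expanded_uncertainty(levels)  # expanded uncertainty, k = 2
```

Adding a fifth link to `levels` can only increase `U`, which is the quantitative form of "the longer the chain, the greater the final uncertainty."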

2.2 The Role of Standard Light Sources in the Traceability Chain

Standard light sources are critical physical media in the traceability chain. In the calibration of imaging colorimeters, commonly used standard light sources include:

  • CIE Standard Illuminant A: A tungsten filament lamp with a color temperature of approximately 2856 K, whose spectral power distribution is accurately described by Planck's radiation law. Due to its predictable spectral characteristics, Illuminant A is widely used as a basic calibration source for imaging colorimeters.
  • Integrating Sphere Uniform Light Source: Point light sources are converted into large-area uniform light sources through an integrating sphere, providing a spatially uniform calibration field for imaging colorimeters. The diameter of the integrating sphere (e.g., 0.5m or 1m) determines the grade of uniformity at its output port.

Standard light sources themselves also need to be regularly sent to qualified calibration laboratories for re-calibration to ensure that their values do not drift due to lamp aging or other factors.

III. Factory Calibration: The Starting Point of Instrument Accuracy

GL OPTICAM 2.0 4K TEC Luminance Measurement Equipment—A high-precision imaging colorimeter for lab and production line calibration (Image Source: YouTube / GL Optic)

3.1 Illuminant A Calibration Process

The basic calibration of an imaging colorimeter before leaving the factory is typically performed using Standard Illuminant A. According to industry-standard validation procedures, this process includes:

  1. Placing the Standard Illuminant A inside an integrating sphere and preheating for at least 20 minutes to ensure stable output.
  2. Using a reference-grade spot spectroradiometer (such as Konica Minolta CS-2000 or equivalent) with traceable calibration from an NMI to measure the output port of the integrating sphere and obtain reference data—including luminance (L) and color coordinates (x, y).
  3. Using the imaging colorimeter to be calibrated to measure the same light source and obtain raw data.
  4. Calculating and writing Calibration Coefficients by comparing reference data with raw data, aligning the output of the imaging colorimeter with the reference instrument.

The core logic of this process is to treat the spectroradiometer's measurement results as the "true value" and correct the imaging colorimeter's readings to match. A spectroradiometer derives its accuracy from measuring the spectrum directly, so it does not depend on how well physical filters approximate the CIE functions—which is what qualifies it as the higher-level reference.
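Step 4 above can be sketched with a deliberately simplified single-point scaling model; real instruments store per-pixel and per-channel corrections, and the function names and numeric values here are hypothetical.

```python
# Deliberately simplified single-point scaling model of "calculate and
# write the calibration coefficients". Real instruments store per-pixel
# and per-channel corrections; values below are hypothetical.

def calibration_coefficients(reference, raw):
    """Scale factors aligning raw DUT readings with the reference."""
    return {q: reference[q] / raw[q] for q in reference}

def apply_calibration(raw_reading, coeffs):
    """Apply the stored coefficients to a later raw measurement."""
    return {q: raw_reading[q] * coeffs[q] for q in raw_reading}

# Spectroradiometer reference vs. uncalibrated colorimeter readings of
# the integrating-sphere port (Illuminant A chromaticity: x = 0.4476,
# y = 0.4074):
ref = {"L": 100.0, "x": 0.4476, "y": 0.4074}
raw = {"L": 96.5,  "x": 0.4450, "y": 0.4100}
coeffs = calibration_coefficients(ref, raw)
```

After `coeffs` is written to the instrument, `apply_calibration` reproduces the reference values for this source exactly; the validation step then checks how well the correction transfers to other spectra.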

3.2 Validation of Calibration Data

After calibration is completed, its effectiveness must be validated. The validation method involves comparative measurements between the calibrated imaging colorimeter and a reference-grade spectroradiometer, covering not only the calibration source but also various pure colors (e.g., Red, Green, Blue, Cyan, Yellow, Magenta, White) of an actual display screen. Comparison metrics include differences in luminance and color coordinates.

If the imaging colorimeter shows significant deviations when measuring non-Illuminant A spectra (such as RGB primaries of a display) after Illuminant A calibration, this usually means that the instrument’s tristimulus filters do not ideally approximate the CIE color matching functions—in which case further user-level correction needs to be considered.
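Such comparisons are often reported as chromaticity differences in the more perceptually uniform CIE 1976 u′v′ plane; a minimal sketch of that metric:

```python
def uv_prime(x, y):
    """Convert CIE 1931 (x, y) to CIE 1976 UCS (u', v')."""
    d = -2.0 * x + 12.0 * y + 3.0
    return 4.0 * x / d, 9.0 * y / d

def delta_uv(xy_dut, xy_ref):
    """Euclidean chromaticity difference between DUT and reference."""
    u1, v1 = uv_prime(*xy_dut)
    u2, v2 = uv_prime(*xy_ref)
    return ((u1 - u2) ** 2 + (v1 - v2) ** 2) ** 0.5
```

Acceptance limits on Δu′v′ are application-specific; the point of using u′v′ is that equal distances there correspond more closely to equal perceived color differences than distances in xy.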

IV. User Calibration: Bridging the Gap between Universal Calibration and Specific Applications

Labsphere Spectra UT Luminance Standard Source—A traceable standard light source for photometric and colorimetric calibration (Image Source: Pro-Lite Technology / Labsphere)

4.1 Why Factory Calibration is Not Enough

Factory Illuminant A calibration provides a “universal benchmark.” However, the objects measured by imaging colorimeters in actual use—display screens—have highly diversified spectral characteristics. Different display technologies (LCD with CCFL backlight, LCD with LED backlight, OLED, QD-OLED, Mini-LED) have vastly different Spectral Power Distributions (SPD).

As noted in references: “A set of correction matrix data must be created for each type of backlight or display panel with similar spectral characteristics”—different panels must be calibrated separately. This is because there is always some degree of Spectral Mismatch between the spectral transmittance of tristimulus filters and the CIE standard observer functions. This mismatch might have a minor impact under the continuous spectrum of Illuminant A but can be significantly amplified under the excitation of a display’s narrowband spectrum.

4.2 Four-Color Calibration (User-Level Correction)

Four-color calibration is a user-level correction method for specific display types. The process is:

  1. Using a reference-grade spectroradiometer (such as the CS-2000) and the imaging colorimeter to be corrected, separately measure four pure colors—Red (R), Green (G), Blue (B), and White (W)—on the target display.
  2. Taking the spectroradiometer results as reference values, calculate a 3×3 Color Correction Matrix.
  3. In subsequent measurements, apply this matrix to the imaging colorimeter's raw tristimulus data, correcting it to match the spectroradiometer results.

The essence of four-color calibration is to mathematically compensate for the spectral mismatch of the tristimulus filters under specific spectral conditions. A four-color calibrated imaging colorimeter can therefore achieve colorimetric accuracy close to that of a spectroradiometer when measuring displays of the same type (or with similar spectral characteristics) as the one used during calibration.
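A minimal sketch of deriving the matrix from the R, G, B measurements, assuming the three primaries' tristimulus vectors are linearly independent. Plain Python is used in place of a numerics library, and the helper names are ours.

```python
def inv3(m):
    """Inverse of a 3x3 matrix (list of rows) via the adjugate."""
    (a, b, c), (d, e, f), (g, h, i) = m
    det = a * (e*i - f*h) - b * (d*i - f*g) + c * (d*h - e*g)
    return [
        [(e*i - f*h) / det, (c*h - b*i) / det, (b*f - c*e) / det],
        [(f*g - d*i) / det, (a*i - c*g) / det, (c*d - a*f) / det],
        [(d*h - e*g) / det, (b*g - a*h) / det, (a*e - b*d) / det],
    ]

def mul3(p, q):
    """Product of two 3x3 matrices."""
    return [[sum(p[r][k] * q[k][c] for k in range(3)) for c in range(3)]
            for r in range(3)]

def correction_matrix(ref_cols, raw_cols):
    """Solve M such that M @ raw = ref, where the columns of ref_cols /
    raw_cols are the (X, Y, Z) vectors of the display's R, G, B primaries
    as measured by the reference instrument and by the DUT."""
    return mul3(ref_cols, inv3(raw_cols))
```

In many four-color schemes the white measurement is held out of the fit and used to validate (or normalize) the matrix rather than to derive it, so a noticeable residual on white is a useful warning sign.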

4.3 Applicability and Limitations of User Calibration

The applicability of user calibration is limited by the degree of match between calibration conditions and actual measurement conditions. Specifically:

  • The correction matrix generated by four-color calibration is valid for a specific display technology. If the spectral characteristics of the measured display differ too much from those of the display used during calibration (e.g., using an LCD-calibrated matrix to measure OLED), the correction effect will drop significantly.
  • The uncertainty of the reference instrument used during calibration will be directly passed into the correction matrix. Therefore, the traceability and uncertainty grade of the reference instrument itself are critical.
  • Factors such as ambient temperature and instrument preheating time also affect the effectiveness of calibration. Display screens are usually required to preheat to a stable state before measurement (30-60 minutes for LED-backlit LCDs, 10-15 minutes for OLEDs).

V. Calibration Uncertainty: Quantifying “How Credible the Measurement Result Is”

Instrument Systems LED Luminance and Color Calibration Standard Source—Reference-grade equipment for production line calibration validation (Image Source: Instrument Systems / Direct Industry)

5.1 The Concept of Uncertainty

Calibration uncertainty is a quantitative description of the range within which a measurement result might deviate from the true value. It is not an “error”—an error is the difference between a measured value and the true value (usually unknown)—whereas uncertainty is an evaluation of the possible range of that difference based on statistical methods.

Uncertainty is usually expressed as Expanded Uncertainty, denoted as U = k × u_c, where u_c is the combined standard uncertainty and k is a coverage factor (usually k=2, corresponding to a confidence level of approximately 95%). For example, if the luminance measurement uncertainty of an imaging colorimeter is stated as “U = ±2% (k=2)”, it means that at a 95% confidence level, the deviation between its measured value and the true value traceable to a national standard does not exceed ±2%.

5.2 Sources of Uncertainty

During the measurement process of an imaging colorimeter, uncertainty arises from multiple stages:

Traceability Chain Transfer Uncertainty: Each level of calibration adds new uncertainty components. National standards themselves have the smallest uncertainty (potentially below 0.1%), but cumulative uncertainty gradually increases through transfer standards, working standards, and down to field instruments.

Uncertainty of Standard Light Sources: Including spectral stability, spatial uniformity, and temporal drift of the light source.

Instrument-Specific Uncertainty: Including sensor linearity, dark current noise, spectral matching degree of tristimulus filters (f1’ value), stray light in the optical system, etc.

Environmental Factors: Temperature changes, vibrations, ambient stray light, etc.

Measurement Method Uncertainty: Including operational factors such as Region of Interest (ROI) selection, exposure parameter settings, and number of averages.
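These components are typically collected into an uncertainty budget and combined by root-sum-square. The figures below are assumed for illustration only, not values from any real instrument.

```python
import math

# Assumed relative standard uncertainties (percent) — illustration only:
budget = {
    "traceability chain transfer": 0.60,
    "standard source (stability, uniformity, drift)": 0.40,
    "filter spectral mismatch (f1')": 0.80,
    "sensor linearity and dark noise": 0.20,
    "environment (temperature, stray light)": 0.30,
    "method (ROI, exposure, averaging)": 0.25,
}

u_c = math.sqrt(sum(u ** 2 for u in budget.values()))  # combined std. unc.
U = 2.0 * u_c                                          # expanded, k = 2
dominant = max(budget, key=budget.get)                 # largest contributor
```

Because components combine in quadrature, the largest term dominates: halving the f1′ term here would shrink U far more than eliminating the method term entirely.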

5.3 Practical Significance of Uncertainty in Production Decisions

In production line quality control, the concept of uncertainty has direct engineering significance. Suppose the luminance pass criterion for a display panel is 500 ± 25 cd/m² (i.e., 475–525 cd/m²), and the expanded uncertainty of the measurement instrument is ±3%. When the instrument reads 505 cd/m², the true value lies between 489.9 and 520.2 cd/m²—safely inside the pass range. However, when the reading is 520 cd/m², the true value could reach 535.6 cd/m², beyond the upper limit—at this point, the pass/fail judgment of the product carries risk.

This is why measurement uncertainty must be taken into account when establishing production line judgment criteria. International standard ISO 14253-1 proposes the concept of “Guard Bands”: at the boundary of pass/fail judgment, set a certain margin based on measurement uncertainty to avoid misjudgment.
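The guard-band decision rule can be sketched as follows, using the tolerance and uncertainty figures from the example above (the three-way outcome is a common implementation choice, not text from ISO 14253-1):

```python
def judge_with_guard_band(reading, lo, hi, u_rel):
    """Pass/fail with an ISO 14253-1 style guard band: the acceptance
    interval is shrunk by the expanded uncertainty at the reading."""
    guard = reading * u_rel  # absolute expanded uncertainty, e.g. 3%
    if lo + guard <= reading <= hi - guard:
        return "pass"           # conformity proven despite uncertainty
    if reading < lo - guard or reading > hi + guard:
        return "fail"           # non-conformity proven
    return "indeterminate"      # true value may straddle a limit
```

With a 475–525 cd/m² tolerance and ±3% uncertainty, a 505 cd/m² reading passes cleanly, while a 520 cd/m² reading falls in the indeterminate zone and warrants re-measurement or a tighter instrument.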

VI. Periodic Calibration and Maintenance of Values

Traceable Calibration Hierarchy—Uncertainty transfer from NIST national standards to field measurement instruments (Image Source: MH Force / Load Cell Calibration)

6.1 Determining Calibration Intervals

The calibration status of any measurement instrument is not permanently valid. Sensor sensitivity drifts over time, optical components may degrade due to environmental factors, and mechanical structures may shift due to frequent use. Therefore, Periodic Calibration is an essential part of maintaining measurement traceability.

Calibration intervals are typically determined based on:

  • Manufacturer recommendations (usually 12 months)
  • Frequency of use and severity of the operating environment
  • Drift trends observed in historical calibration data
  • Accuracy requirements for the parameters being measured

6.2 Field Calibration Validation

Between formal calibrations, users can monitor instrument status through daily Calibration Verification. A typical practice is to use a traceable reference grayscale standard plate or a stable light source to perform quick comparative measurements daily or weekly. If the results deviate from the reference value beyond a preset threshold, it suggests the instrument may need early factory re-calibration.

This “calibration verification” differs from “re-calibration”—it does not update the instrument’s calibration coefficients but merely confirms whether the existing calibration status remains valid.
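Such a daily verification check can be sketched as a simple drift monitor (the 2% threshold below is an assumed example, not a standard value):

```python
def verify(measured, reference, threshold=0.02):
    """Compare a quick reading of a stable reference source against its
    known value; flag the instrument when the relative deviation exceeds
    the preset threshold. Does NOT modify calibration coefficients —
    verification only."""
    deviation = abs(measured - reference) / reference
    return deviation <= threshold, deviation
```

Logging the returned deviation over time also yields exactly the historical drift trend used to justify lengthening or shortening the formal calibration interval.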

Conclusion

Metrological traceability is the fundamental guarantee for the credibility of imaging colorimeter measurement data. From national metrology institute standards to every measurement on the production line, a complete, unbroken calibration chain, where uncertainty at each step is evaluated and controlled, is the basis for ensuring that measurement results from different locations and times are comparable.

For quality engineers in the display industry, understanding the structure of the traceability chain, mastering the timing and methods of user calibration, and correctly interpreting uncertainty are not only basic metrological skills but also prerequisites for sound quality decisions amid increasingly sophisticated competition in display technology. Measurements that ignore traceability, however precise their readings appear, are merely numbers without a credible foundation.

FAQ

Q1: Why do significant chromaticity deviations still occur for some displays after factory Illuminant A calibration?

Because Illuminant A is a broadband light source with a continuous spectrum (a tungsten lamp with a color temperature of 2856 K), filter matching deviations largely cancel out when integrated across the broad spectrum. Displays (especially OLEDs and LED-backlit LCDs), however, have narrowband spectral characteristics, and filter matching deviations within those narrow bands are fully exposed and cannot be averaged away by integration. Therefore, four-color calibration (user-level correction) is required, using a display with spectral characteristics close to the actual object being measured as the calibration target.

Q2: What is the difference between measurement uncertainty and measurement error?

Measurement error is the difference between a measured value and the true value (usually unknown) and is a specific numerical value. Measurement uncertainty is an evaluation of the range within which a measurement result might deviate from the true value based on statistical methods, representing an interval estimate. For example, expanded uncertainty U = ±2% (k=2) means that at a 95% confidence level, the deviation between the measured value and the true value traceable to a national standard does not exceed ±2%. Uncertainty can be evaluated and declared, whereas error cannot be precisely known.

Q3: How can measurement uncertainty be used on a production line to avoid misjudgment?

The "Guard Band" concept proposed in ISO 14253-1 should be adopted: at the pass/fail boundary, set a margin based on measurement uncertainty. For example, if the pass criterion is 500 ± 25 cd/m² and the instrument uncertainty is ±3% (15 cd/m² at 500 cd/m²), the acceptance interval is narrowed by that amount—the upper luminance limit moves from 525 to 525 − 15 = 510 cd/m², and the lower limit from 475 to 475 + 15 = 490 cd/m². This avoids the risk of misjudgment due to instrument uncertainty.


This article is part of the Imaging Colorimeter Technology Knowledge Base series.