The Importance of Accuracy, Precision, and Predictability in Dissolved Gas Analysis
Authors:
Kate Vacca – Product Manager, Dissolved Gas Analyzers
Mark Gross – Regional Director NAM, DGA Subject Matter Expert
In modern transformer monitoring programs, the value of dissolved gas analysis (DGA) depends on one fundamental requirement: accuracy. Diagnostic methods and trending rely on precise, repeatable gas measurements to distinguish between normal operating behavior and potential failure modes associated with developing electrical or thermal stress. When measurements lack precision, diagnostic confidence is drastically reduced, and decision-making becomes more difficult.
DGA is not simply a “multi-gas” measurement; it is a complete set of individual gas measurements that must be coherent with one another. These values are used both as standalone references and in ratios to each other, so measurement certainty is critical. For many applications, DGA involves eight gases associated with specific failure modes plus nitrogen, and these values need to be produced through a single, consistent measurement chain to ensure correlation and overall measurement integrity. Mixing multiple measurement technologies can complicate the interpretation of results.
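As a simple illustration of why ratio-based diagnostics depend on measurement certainty, the sketch below forms the three basic ratios from IEC 60599 out of individual gas concentrations (the ppm values are hypothetical) and shows how a bias in a single gas propagates directly into a ratio:

```python
# Illustrative sketch (hypothetical values, not from this article):
# how individual gas concentrations combine into diagnostic ratios,
# using the three basic ratios of IEC 60599 as an example.

def iec_ratios(gases_ppm):
    """Compute the IEC 60599 basic ratios from concentrations in ppm."""
    return {
        "C2H2/C2H4": gases_ppm["C2H2"] / gases_ppm["C2H4"],
        "CH4/H2":    gases_ppm["CH4"]  / gases_ppm["H2"],
        "C2H4/C2H6": gases_ppm["C2H4"] / gases_ppm["C2H6"],
    }

sample = {"H2": 100.0, "CH4": 120.0, "C2H6": 65.0, "C2H4": 50.0, "C2H2": 5.0}
print(iec_ratios(sample))

# Because each ratio divides one gas by another, a systematic error in
# either measurement shifts the ratio: a 20% overestimate of H2 alone
# lowers CH4/H2 from 1.2 to 1.0, which can land in a different
# diagnostic zone.
biased = dict(sample, H2=sample["H2"] * 1.2)
print(iec_ratios(biased)["CH4/H2"])  # 1.0
```

The point of the sketch is structural rather than numerical: because every ratio couples two gases, inconsistency between the measurement chains producing those gases corrupts the ratio even when each individual reading looks plausible.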
Accuracy therefore remains a critical benchmark across all DGA technologies, and it is why gas chromatography (GC) is widely recognized as the Reference Method for measuring dissolved gases. GC physically separates each gas before measurement based on fundamental principles and then applies stable, proven sensing technologies. That separation reduces cross-interference and supports precise concentration values, which are essential for reliable diagnostics. For this reason, GC is the measurement approach referenced alongside laboratory DGA in the most widely used industry guidance (CIGRE, IEEE, IECi), and it is the only method used in laboratories for dissolved gas analysis.
Precision is important not only for the primary gases associated with failure modes, but also for hydrogen and oxygen. Hydrogen appears with the first signs of abnormal electrical or thermal activity and, although it is not used in the Duval triangles, it is an early indicator widely used in asset health monitoring. Oxygen (and nitrogen) also provide important insight into seal integrity and air ingress. While not direct indicators of failure modes, oxygen and nitrogen can influence diagnostic interpretation and introduce uncertainty if they are not properly accounted for.
This is where technological differences start to matter. Infrared and IR-adjacent approaches used in online DGA, including the PAS, NDIR, and FTIR families, measure gases through optical absorption rather than physical separation. In all IR-based architectures, only a subset of the gases associated with failure modes can be measured, because homonuclear diatomic molecules such as hydrogen, oxygen, and nitrogen do not absorb infrared energy. As a result, those three gases can only be handled using separate sensor technologies, such as solid-state or electrochemical sensors.
Systems that rely on fixed optical filters or broad absorption bands are also more susceptible to cross-interference from other gases and oil-borne compounds, particularly when background gases that absorb infrared energy are present at elevated concentrationsii. There is no way to anticipate these interfering compounds without prior and ongoing oil testing.
Unlike GC, all infrared approaches measure overlapping absorption features rather than isolating each gas independently. As a result, IR-based systems place greater reliance on compensation algorithms, multiple calibration paths, and assumptions about background conditions. This makes careful validation and accuracy assurance especially important when IR technologies are applied where precise failure mode identification and differentiation are required.
By contrast, GC is the only DGA technology in which all gases associated with failure modes, along with hydrogen and oxygen, are derived through a single chromatographic separation and detection framework. Nitrogen is determined separately as a calculated value based on the combined measured gas composition within the same coherent measurement system. Measuring the full gas set within one coherent measurement system supports correlation, consistency, and accuracy across all gases used in DGA diagnostics.
In discussions around online DGA, performance is often described in terms of trending rather than accuracy, because relative changes over time can appear useful even when absolute measurement accuracy is uncertain. However, trending alone does not confirm that reported gas concentrations are close to their true values. Without validated accuracy, consistent movement in data may reflect sensor bias, offsets, or cross-interference rather than actual changes inside the transformer.
Industry standards and laboratory practice make a clear distinction here. Accuracy, as defined in ISO 5725-1iii, refers to closeness to the true value and includes both trueness and precision. Trueness can only be established through comparison to an accepted Reference Method. For dissolved gas analysis, laboratory testing uses gas chromatography as the Reference Method, which is explicitly referenced in IEC, IEEE, and CIGRE guidance and remains the basis for DGA diagnostics worldwide. As a result, validation of online DGA accuracy ultimately requires correlation to laboratory GC results.
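The ISO 5725-1 distinction can be made concrete with a small numerical sketch (the readings below are invented for illustration): trueness is the offset of the mean from a reference value, while precision is the spread of repeated readings. A sensor can be very precise yet badly biased.

```python
# Illustrative sketch (assumed numbers): decomposing "accuracy" in the
# ISO 5725-1 sense into trueness (closeness of the mean to a reference
# value) and precision (spread of repeated measurements).
import statistics

reference_ppm = 100.0  # value established by the laboratory GC reference method
readings_ppm = [108.0, 111.0, 109.0, 112.0, 110.0]  # repeated online readings

mean = statistics.mean(readings_ppm)
bias = mean - reference_ppm              # trueness error (systematic offset)
spread = statistics.stdev(readings_ppm)  # precision (repeatability)

print(f"bias = {bias:+.1f} ppm, stdev = {spread:.2f} ppm")
# A tight stdev with a large bias trends smoothly yet stays far from
# the true value: precise, but not accurate.
```

This is exactly the failure mode described above: the five readings trend consistently around 110 ppm, but without the GC reference value there is no way to know they all carry a +10 ppm offset.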
Trending may provide directional insight, but only validated accuracy ensures that diagnostic thresholds, failure mode identification, and maintenance decisions are based on reliable measurements rather than relative movement alone. If offsets and drift are unacceptable for other monitoring instruments such as temperature, current, or power factor, it is reasonable to apply the same expectation for calibrated, consistent dissolved gas measurements.
The fundamental purpose of DGA is to identify failure modes early, which is why accuracy is so important. Insufficient accuracy can shift where measurements fall within diagnostic tools such as the Duval Triangles and Pentagons, changing the failure mode classification reported. This has become even more critical with the latest refinements from Dr. Duval, in which areas associated with certain failure modes have been further divided into sub-areas; carbonization, for example, has been divided into C1, C2, and C3 regions. The smaller area associated with each failure mode leaves less margin for measurement error, making accuracy even more important to avoid incorrect interpretation.
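As a rough illustration of this sensitivity (the gas values are hypothetical), Duval Triangle 1 plots the relative percentages of CH4, C2H4, and C2H2, so an error in a single gas shifts all three coordinates of the plotted point at once:

```python
# Illustrative sketch (hypothetical values): Duval Triangle 1 uses the
# relative percentages of CH4, C2H4, and C2H2, so an error in any one
# gas moves the plotted point, and potentially the zone it falls in.

def duval_coordinates(ch4, c2h4, c2h2):
    """Relative percentages of the three Triangle 1 gases (ppm inputs)."""
    total = ch4 + c2h4 + c2h2
    return tuple(round(100.0 * g / total, 1) for g in (ch4, c2h4, c2h2))

print(duval_coordinates(60.0, 30.0, 10.0))  # (60.0, 30.0, 10.0)
# A 10 ppm overestimate of C2H2 alone moves all three coordinates:
print(duval_coordinates(60.0, 30.0, 20.0))  # (54.5, 27.3, 18.2)
```

Because the coordinates are normalized percentages, the smaller the sub-area (such as C1, C2, or C3), the less coordinate shift is needed for a point to cross a boundary and produce a different classification.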
Additionally, all DGA systems require upkeep to maintain measurement integrity, whether a carrier gas, a calibration system, a mechanical fan, optical filters, or sensor components. The key distinction is not whether maintenance exists, but how it is managed. When maintenance is condition-based and delivered through structured service programs, it becomes predictable and transparent rather than reactive. Predictability allows utilities to plan maintenance activities with confidence, avoid unnecessary interventions, and ensure consistent measurement quality. As with all precision instruments, maintenance and calibration are expected; the best systems provide advance visibility to avoid reactive, costly decisions and mobilizations.
In practice, accuracy and predictable maintenance are not compromises. They are essential elements of dependable transformer diagnostics and long-term asset confidence. Taken together, that is why GC has been and continues to be the reference standard for dissolved gas measurement, both in the field and in the laboratory.
i Some examples are CIGRE Technical Brochure 783, XXVIII SNPTEE Technical Paper, IEEE Std C57.104, IEC 60567
ii Infrared-based DGA techniques rely on overlapping absorption features and are inherently susceptible to cross-interference from background gases and oil-borne compounds whose presence and concentration depend on transformer oil chemistry and aging. See Dai et al., Frontiers in Physics (2025); Tang et al., Energies (2018); Valaskivi, Aalto University (2025).
iii ISO 5725‑1:2023, Accuracy (trueness and precision) of measurement methods and results — Part 1: General principles and definitions.