Introduction

Carbon monoxide (CO) concentrations may be measured in exhaled breath, ambient air or blood. Because of the high affinity of CO for hemoglobin (Hb), it has been assumed that the majority, if not all, of CO binds with Hb when introduced into the blood circulation. As a result, carboxyhemoglobin (COHb) has traditionally been considered the most appropriate clinical marker of exposure in CO poisoning [1]. However, COHb does not represent the only reservoir of CO in the human body; CO may be found in a free state dissolved in blood and can bind to other heme-containing respiratory globins, such as myoglobin in muscle, neuroglobin in the nervous system and, to a lesser extent, cytoglobin [2]. Although CO dissolved in blood in free form is acknowledged to play a role in the pathophysiology of CO poisoning [3, 4], its influence may be more substantial than studies have revealed thus far. Neglecting this fraction would result in under- or overestimation of the true level of CO present in the analyzed blood sample, potentially explaining some of the cases in which inconsistencies between measured COHb levels and reported symptoms were found. However, little data on free CO are currently available.

COHb in blood is measured directly or indirectly using either optical methods, such as CO-oximetry, ultraviolet (UV)-spectrophotometry and pulse oximetry, or gas chromatography (GC) in combination with a variety of detectors (flame ionization detector, mass spectrometer). In clinical cases, the “gold standard” for the measurement of COHb in blood is CO-oximetry (or pulse CO-oximetry), either as a separate instrument or integrated into what is commonly known as a blood gas analyzer (BGA) or radiometer [5]. Although UV-spectrophotometry remains the most frequently used method in forensic cases, CO-oximetry and GC methods are also widely employed in this field.

Like any biomarker, the quantitative measurement of COHb is subject to a variety of factors that influence the result. Measurement error in analytical studies is described in terms of “uncertainty” or “bias”. Uncertainty arises when several predictable, but not always controllable, factors affect the measured values, resulting in a deviation from the true value. In medical practice, and especially for toxicologists, the correct and accurate determination of a biomarker is crucial in order to make the correct diagnosis and initiate the proper treatment in clinical cases, and to determine the correct cause of death in forensic cases. Failure to do so can have severe clinical and legal consequences. Therefore, in this paper, we review the accuracy of current methods for measuring CO and examine their potential sources of error and the effects of these errors on the interpretation process.

Method of literature search

PubMed was searched in November 2018 using the keywords (“carbon monoxide” or “carboxyhemoglobin”) and (“poisoning”) and (“measurement” or “determination” or “quantification” or “analysis” or “breath” or “blood” or “oximet*” or “spectro*” or “gas chromatography” or “storage”); this produced 191 hits. Systematic reviews, meta-analyses, general review articles, case reports, and retrospective, prospective, observational and clinical cohort studies were excluded, limiting the articles included to those focused specifically on describing a method for the analysis of CO or COHb in various tissues and those describing issues related to the analysis of samples (storage, sample pretreatment, etc.). This left 49 relevant articles on measurement methods and sources of error.

Measurement of CO in breath

Analytical techniques

Analysis of CO in exhaled breath was evaluated as a measurement method for clinical cases, because a good correlation between alveolar breath CO and COHb was found by several research groups [6,7,8,9]. Portable devices, called MicroCOmeters or CO monitors, are often used in smoking cessation programs [8, 10] and may be useful when a rapid on-site assessment in multiple casualties is necessary, enabling the most severe cases to be identified [11]. This measurement is based on an electrochemical fuel cell sensor, which works through the reaction of CO with an electrolyte on one electrode and oxygen (from ambient air) on the other. This reaction generates an electrical current proportional to the CO concentration. The output from the sensor is monitored by a microprocessor, which detects the peak expired CO concentration in the alveolar gas [12]. These values are then converted to COHb% using the mathematical relationships described by Jarvis et al. [8] for concentrations below 90 parts per million (ppm) and by Stewart et al. [13] for higher levels.
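The conversion itself is a simple functional mapping of the meter reading to an estimated saturation. The following minimal Python sketch illustrates the idea; the linear coefficients are illustrative placeholders only and are not the published Jarvis et al. or Stewart et al. values, which are applied internally by the devices.

```python
# Illustrative sketch only: convert an end-tidal breath CO reading (ppm) to an
# estimated COHb%. The linear coefficients are placeholders, NOT the published
# Jarvis et al. or Stewart et al. values; real CO monitors apply the calibrated
# relationships cited in the text (a different relationship above ~90 ppm).

def breath_co_to_cohb(co_ppm: float,
                      slope: float = 0.16,    # placeholder: %COHb per ppm CO
                      intercept: float = 0.3  # placeholder: %COHb offset
                      ) -> float:
    """Estimate COHb% from an end-tidal CO reading in ppm (illustrative only)."""
    if co_ppm < 0:
        raise ValueError("CO concentration cannot be negative")
    return intercept + slope * co_ppm

print(f"{breath_co_to_cohb(35):.1f} %COHb")  # 35 ppm -> ~5.9 %COHb with these placeholders
```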

Sources of error

Measurement of CO in breath cannot account for the total CO concentration present in the blood at the time of exposure. The method is very susceptible to the influence of a variety of factors that can easily alter the result, leading to under- or overestimation of the true concentration (Table 1). One major factor is the variability in subjects' breath-holding ability. To obtain the alveolar gas, it was found that the breath needs to be held for 20 s, after which only the end-tidal expired air is used for CO measurement. Given the individual differences in pulmonary function, capillary diffusion surface, and inspiration and expiration rates, coupled with the inability to fully control whether a subject is properly holding their breath, the portion of expired alveolar gas sampled and the results obtained can have a high degree of variability [6, 8, 13]. This can also pose an issue in susceptible groups of the population, such as the elderly, children or those with respiratory diseases. Furthermore, because they were initially designed for smoking cessation programs, the accuracy of CO monitors is better at lower CO concentrations and might therefore not be sufficient for acute intoxication [14]. Nevertheless, CO monitors are highly useful at mass casualty sites or for first responders. They are portable and can provide an indication of the gravity of the case, enabling both the appropriate treatment of the patient and proper precautions to be taken by first responders.

Table 1 Overview of methods used for carboxyhemoglobin/carbon monoxide analysis, their main characteristics and limitations, and reference examples

Measurement of CO in blood: optical techniques (CO-oximetry and spectrophotometry)

Analytical techniques

Spectrophotometric or optical methods measure the concentration of COHb based on the quantity of light absorbed when the compound is exposed to light of different wavelengths. Early methods involved single-beam UV or double-wavelength spectrophotometry and exploited the characteristic spectral absorbance of Hb species, in particular the distinct spectral differences between oxyhemoglobin (O2Hb) and COHb [15,16,17]. A similar method measures differences in absorbance in the visible spectrum between reduced Hb (HHb) and COHb, whereby a reducing agent is added to the blood sample that reduces O2Hb but not COHb [18, 19].

However, double-wavelength spectrophotometry was not a very accurate or specific method [16], because results were based on measurements at only two wavelengths. Automated differential spectrophotometry was later developed, which uses double laser beams to determine the difference in absorbance between a sample and a negative sample; with this method, matrix effects are accounted for, resulting in better accuracy.

CO-oximetry is a measurement technique based on multiple-wavelength spectrophotometry, which uses up to the full range of wavelengths for analysis, allowing more accurate measurement of COHb [20,21,22]. CO-oximeters are currently the standard analytical instruments used for the measurement of COHb, either as separate devices or, for hospital cases, integrated into a BGA [18, 23, 24].
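The principle underlying multiple-wavelength analysis can be illustrated with a short sketch: absorbances measured at several wavelengths are modelled as a linear combination of the Hb species according to the Beer–Lambert law, and the species fractions are recovered by least squares. The extinction coefficients and the "measured" spectrum below are arbitrary illustrative numbers, not tabulated reference values.

```python
# Minimal sketch of the multiple-wavelength principle: absorbance at each
# wavelength is modelled as a linear combination of the Hb species
# (Beer–Lambert law), and the species fractions are recovered by least squares.
# Extinction coefficients and the "measured" spectrum are arbitrary
# illustrative numbers, not tabulated reference values.
import numpy as np

species = ["O2Hb", "HHb", "COHb", "MetHb"]

# Rows: wavelengths, columns: species (path length folded into the coefficients).
E = np.array([
    [14.0,  9.0, 13.5,  5.0],
    [ 9.5, 13.0, 12.0,  4.5],
    [16.0,  9.5, 10.5,  4.0],
    [ 1.0,  5.0,  0.8,  4.2],
])

# Synthetic absorbance spectrum of a mixed sample.
absorbance = np.array([12.675, 10.400, 13.050, 1.710])

# Solve E @ c = absorbance for the species concentrations.
c, *_ = np.linalg.lstsq(E, absorbance, rcond=None)
fractions = 100 * c / c.sum()
for name, f in zip(species, fractions):
    print(f"{name}: {f:.1f} %")   # ~55 % O2Hb, 15 % HHb, 25 % COHb, 5 % MetHb
```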

Despite the advantages of CO-oximetry, for the sake of cost-efficiency, UV and double-wavelength spectrophotometers are currently still used in many developing countries and are also listed in the International Organization for Standardization (ISO) standard 27368:2008, “Analysis of blood for asphyxiant toxicants—carbon monoxide and hydrogen cyanide” [25].

Sources of error

Several issues can alter the measurement results from optical methods, mainly due to the susceptibility of these methods to changes in sample quality as a result of poor sample handling techniques and storage conditions (e.g., temperature, preservative) and biochemical alterations that occur over time [26]. Some of the most important potential errors for COHb determination are as follows:

  1. Type of preservative: the preservative in the tube used to store the blood sample can alter the results through biochemical reactions that can either increase or decrease the CO concentration [27, 28].

  2. Storage temperature: different storage temperatures were shown to alter the results; storage over prolonged periods of time can lead to degradation of the sample and in vitro CO production, resulting in overestimation of the concentration, and storage at room temperature or above leads to faster degradation than refrigerated or frozen storage [26, 28, 29].

  3. Dead volume: differences in the headspace (HS) volume of the sampling tube (known as dead volume) can alter the results because of the reversibility of the bond between CO and Hb; the greater the dead volume in the tube, the more likely the dissociation of CO from Hb and its release into the HS [30].

  4. Freeze-and-thaw cycles: whether a sample has been frozen and thawed one or more times can also alter the measurement, owing to the breakdown of erythrocytes [28].

  5. Reopening of the sampling tubes: repeated opening of the tube can lead to loss of the substance (in the gaseous state, when CO is not bound to Hb) with increasing number and duration of reopenings, as well as to increased exposure of the sample to oxygen [23, 28].

  6. Postmortem (PM) changes: thermocoagulation, putrefaction and PM CO production are all known sources of error, but they cannot be quantified owing to their biologically unpredictable nature [27, 31, 32].

  7. Instrument and personal error: errors due to the instrument or the operator are random, but they can be minimized by using an internal standard when possible [33].

These factors are applicable to both optical measurements of COHb and GC measurements of CO. Specifically for spectrophotometric methods, several of the factors listed in Fig. 1 have been investigated and are described in more detail as follows.

Fig. 1 General steps for a quantitative laboratory analysis and respective potential sources of error for carbon monoxide (CO) determination

Studies performed earlier by Chace et al. [28] and later by Kunsman et al. [27] evaluated a number of storage conditions, including the amount of air present in the sampling tube (the dead volume, which can alter the results because of the reversibility of the bond between CO and Hb and potential dissociation of the gas into the HS of the tube), storage temperature, preservatives and initial COHb saturation levels. They observed that decreases in COHb levels were related to the ratio of the exposed surface area to the volume of blood (the higher the exposed surface area, the greater the loss), the storage temperature (the higher the temperature, the greater the loss) and the initial COHb% saturation level (the higher the COHb level, the greater the loss). The influence of the HS volume in the sampling tube was hypothesized to reflect the formation of an equilibrium between CO in the blood and the air above the blood sample in the tube [28]. Storage of blood at room temperature or higher leads to faster degradation and lower sample stability, affecting spectrophotometric measurement of CO, a finding also confirmed by other research groups [26, 34]. Additionally, they found no effect of the preservative used; however, testing was performed with an insufficient number of preservatives [only two, namely sodium fluoride (NaF) and ethylenediaminetetraacetic acid (EDTA)], which were compared to samples with no preservative, and only on samples stored frozen immediately after sampling over a period of 2 years. Analysis of the samples at only two widely separated time points might fail to detect changes during short-term storage due to the use of preservatives, which is more relevant than long-term storage, since in the majority of cases samples are analyzed within a few hours to days. Nevertheless, these findings are especially relevant for forensic or legal cases, in which retrospective analyses can still provide sufficiently reliable information. The reported lack of a preservative effect might, however, be biased, because the measurements were performed with optical methods only, which are known to be influenced by the state of the blood; smaller changes due to the preservatives might therefore not have been detected by this less sensitive measurement approach. Indeed, Vreman et al. [35] determined that the use of EDTA as preservative led to falsely increased COHb values when measured by CO-oximetry. Nevertheless, these findings would have been more convincing had they been confirmed by another measurement method, such as GC.

Furthermore, these conditions may influence not only the CO levels present in the blood, but also the blood quality [28]. For samples that cannot be readily analyzed and are not stored under optimal conditions, degradation of the sample occurs, which was confirmed to hamper optical measurement methods used to determine COHb levels [36]. This can be a major issue for many laboratories where optical techniques are routinely used for sample analysis.

Additional factors influencing the measurement of COHb levels that have been reported in the literature include the presence and amount of oxygen in air [23] and, in PM samples, thermocoagulation in fire victims [34], putrefaction during a prolonged PM interval (PMI) [37], and contamination due to hemolysis, high lipid concentrations or thrombocytosis, all of which result in turbidity of the sample and hamper measurements performed with optical techniques. Another frequent and significant phenomenon to consider during evaluation of the results is the PM production of CO in the organism [32, 38]. CO has been found in significant quantities in cases unrelated to fire or CO exposure, mostly in putrefied bodies. It was confirmed that CO is formed during the decomposition of various substances present in the body and through erythrocyte catabolism, a process that also occurs in living organisms [32]. It is therefore important to differentiate such cases from real CO intoxication cases, which can be done with the help of the autopsy-determined cause of death, even though it is not always simple to completely exclude a role of CO in these cases [23]. As a result, PM decomposition currently constitutes a field with open questions that require further investigation.

Antemortem COHb measurement by pulse CO-oximetry

Analytical techniques

In clinical settings and generally for living patients, a noninvasive alternative to venous or arterial blood COHb measurement by BGA or CO-oximetry that has been widely investigated is pulse CO-oximetry [39,40,41,42,43]. Similarly to standard CO-oximetry, pulse CO-oximetry is a spectrophotometric method that quantifies multiple types of hemoglobin, including COHb, based on the absorbance of light after exposure to different wavelengths [43]. As opposed to regular CO-oximeters, pulse CO-oximeters have the ability to measure COHb continuously and without the need for blood sampling, thus allowing the monitoring of COHb levels in real time and simultaneously with the administration of treatment.

Sources of error

Noninvasiveness and cost- and time-efficiency are some evident advantages of using pulse CO-oximeters. However, for CO poisoning diagnosis, there are more important factors from a medical perspective, such as accuracy, precision and reliability. The ability to diagnose a CO poisoning case quickly is necessary, but if the results obtained over- or underestimate the true COHb levels, this can have severe and potentially fatal consequences. Several studies have reported low precision and accuracy as well as elevated false-positive and false-negative rates in comparison with regular blood measurements [5, 39,40,41,42]. Especially for COHb levels above 10%, pulse CO-oximeters significantly underestimated the COHb levels [39].

Furthermore, factors such as blood pressure, oxygen saturation and body temperature also appear to affect the accuracy of pulse CO-oximeters [42]. Feiner et al. [40] reported that the pulse CO-oximeter consistently gave low-signal-quality errors and did not report COHb levels when oxygen saturation decreased below 85%, which is indicative of hypoxia. Considering that hypoxia is one of the main effects of CO poisoning, the inability to accurately measure COHb in hypoxic states is a severe disadvantage. However, a more recent study by Kulcke et al. [43] reported good accuracy in measuring COHb even during hypoxemia using an upgraded/revised version of the pulse CO-oximeter, although slightly greater underestimation was reported for COHb concentrations above 10%. This confirms that pulse CO-oximeters can be useful for monitoring exposure to low CO levels, but accuracy and precision are not guaranteed for more severe poisoning or for smokers, whose baseline COHb levels generally range from 3 to 8% but can easily reach 10–15% in heavy smokers [1, 2].

In contrast to CO-oximetry of sampled blood, antemortem COHb measurement by pulse CO-oximetry is not affected by storage or sampling parameters, thus reducing the potential sources of error. Additionally, based on what is reported in the literature, no laborious and time-consuming calibration of the device seems to be needed, leading to simpler routine analysis, although there is little information regarding device maintenance. As with general CO-oximetry, however, and despite good accuracy and precision, measurement of only the CO bound to Hb can lead to underestimation of the total CO burden and thus to misdiagnosis. Another relevant point from a legal perspective is that pulse CO-oximetry does not provide samples that can be used for confirmation or counter-expertise in legal disputes.

Measurement of CO in blood: gas chromatography

Analytical techniques

The principle behind the GC detection of CO is the measurement of CO released from blood: both the fraction dissolved in blood and the fraction bound to Hb, which is liberated with a releasing agent after red cell lysis. The sample is therefore first treated with a hemolytic agent, such as saponin, Triton X-100 or other detergents, and subsequently acidified to liberate the CO in the blood [34, 44,45,46,47]. The reaction of COHb with a strong acid or oxidizing agent was found to efficiently release CO and water as products. The releasing agents commonly used are sulfuric acid (H2SO4), hydrochloric acid (HCl) and potassium ferricyanide (K3Fe(CN)6). Other acids, including lactic acid [48], citric acid [48, 49] and phosphoric acid [49], have also been tested.

In studies performed in earlier years (the 1970s, 1980s and 1990s), potassium ferricyanide was introduced for the release of CO and became very popular owing to its easy availability, since it was already used as a hemolytic agent in spectrophotometric methods. It was also found to be efficient in liberating the CO, and the extent of its reaction was not influenced by the presence of O2 or O2Hb over a wide pH range, in contrast to the other acids tested [30, 46, 48, 50, 51]. In more recent studies, however, sulfuric acid has been preferred, mostly because it is more readily available and cheaper than other acids of the same efficiency, and it allows the simultaneous liberation of CO and production of 13CO from the formic acid-13C used as internal standard [4, 30, 31, 47, 49, 52,53,54]. After liberation, CO is separated by GC and then detected with one of the detectors described below.

For the GC separation, a capillary column with a 5 Å molecular sieve has been found to be specific for the separation of CO from other interfering gases such as nitrogen (N2), oxygen (O2) and methane (CH4) [51]. Various packed columns were used previously, but they have been replaced by capillary columns, in part because of their significantly smaller size.

To enhance sensitivity and accuracy and increase the range of analysis, GC methods have been studied with various types of detection, such as thermal conductivity detection (TCD), flame ionization detection (FID), mass spectrometry (MS) and reduction gas analyzers (RGA) [55,56,57,58,59,60,61,62,63,64,65,66]. The most commonly used and investigated detector was FID, first reported in relation to CO determination in 1968 [51]. After GC separation, the CO was chemically reduced to methane (CH4) with a methanizer and subsequently analyzed via FID.

Sources of error

The most important sources of error in GC techniques lie in the calibration process performed before analysis and in the methods used to correlate measured CO concentrations with the COHb levels that have previously been linked to symptomatology. Generally, calibration of the instrument is performed either with pure CO gas, which is diluted to obtain the desired CO concentrations, or by fortification of blood with CO to reach different COHb% saturation levels. Additionally, excess CO has been removed by performing a “flushing” step, in which the calibrators are flushed with a stream of inert gas (usually N2). This step enables the removal of unbound CO from the sample, thus leaving only the CO bound to Hb to be analyzed, but thereby deliberately neglects the potential toxicity of free CO.
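As an illustration of the gas-phase calibration approach, the volume of pure CO needed to reach a target headspace concentration can be estimated from a simple dilution calculation, assuming ideal-gas behaviour; the vial volume and target levels below are hypothetical and not taken from a specific published protocol.

```python
# Illustrative dilution calculation (not a validated protocol): volume of pure
# CO to inject into a sealed vial of known volume to reach a target headspace
# concentration, assuming ideal-gas behaviour and negligible pressure change.

def pure_co_volume_ul(target_ppm: float, vial_volume_ml: float) -> float:
    """Microlitres of pure (10^6 ppm) CO needed to reach target_ppm in the vial."""
    return target_ppm / 1_000_000 * vial_volume_ml * 1000  # convert mL to µL

# Hypothetical calibration series in a 20 mL headspace vial:
for level_ppm in (50, 100, 250, 500, 1000):
    print(f"{level_ppm:>5} ppm -> inject {pure_co_volume_ul(level_ppm, 20.0):.1f} µL pure CO")
```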

The first changes in the calibration method were made in 1993, when Cardeal et al. [49] took advantage of the reaction of formic acid with sulfuric acid, an acid-catalyzed dehydration yielding CO and water, to generate CO for calibration. However, no details were given on how the analyzed blood was saturated with CO, nor was it explained how the formula for back-calculating the measured CO concentration to a COHb level was derived.

Czogala and Goniewicz [67] proposed a GC–FID-based method which directly correlated CO levels in air with COHb in blood through back-calculation and extrapolated the relationship to the other factors assessed (exposure time, smoking frequency, number of cigarettes smoked and ventilation conditions). The technique was designed to ensure complete release of CO from the blood samples by performing the reaction and the subsequent analysis in an airtight reactor. Similarly, air samples were transferred directly from the room to the analytical instrument, thus avoiding time delays and possible loss of CO, and allowing direct correlation of the results with the other measurements. However, no details were given about the procedure for obtaining the 100% CO-saturated blood used for calibration; such details are necessary to assess whether the method is reliable and reproducible. Furthermore, the formula used to back-calculate COHb saturation levels from the measured CO concentrations contained a Hüfner factor of 1.51, which differs from the factor reported in other studies [30, 46]. The Hüfner factor expresses the maximum amount of CO that can be bound by 1 g of Hb [68, 69]. A detailed list of additional pitfalls of GC methods is given in Table 1.
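The back-calculation itself reduces to a simple proportion between the measured CO content and the maximum CO-binding capacity of the available Hb. The sketch below illustrates this under stated assumptions; the default Hüfner factor and the example inputs are placeholders, since the appropriate value of the factor is itself debated.

```python
# Sketch of the back-calculation described above: a GC-measured CO content
# (mL CO per L blood) is converted to a COHb% saturation using a Hüfner factor
# (mL CO bound per g Hb at full saturation) and the Hb concentration. The
# factor is debated (1.51 was used in [67]); all inputs below are hypothetical.

def co_to_cohb_percent(co_ml_per_l: float,
                       hb_g_per_l: float,
                       huefner_ml_per_g: float = 1.39) -> float:
    """COHb% = measured CO / maximum bindable CO * 100."""
    max_bindable_co = huefner_ml_per_g * hb_g_per_l  # mL CO per L blood at 100 % saturation
    return 100 * co_ml_per_l / max_bindable_co

# Hypothetical example: 42 mL CO/L blood in a subject with 150 g Hb/L
print(f"{co_to_cohb_percent(42, 150):.1f} %COHb")   # ~20 % with a factor of 1.39
```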

Measurement of CO in blood: GC–MS and HS–GC–MS

Analytical techniques

MS is the method of choice for detecting CO because identification is based on both the retention time and the mass spectrum. Middleberg et al. [31] developed a method which combined GC–MS with flame atomic absorption spectroscopy (FAAS). CO was determined by GC–MS after release with sulfuric acid and heating, while FAAS was used to determine the total iron content of the blood, from which a more precise total amount of available Hb was calculated. It should be noted that this assay assumed that all the iron present in blood was part of the heme protein and capable of binding CO; however, this is not strictly true, as it depends on the state of the organs and tissues and on any diseases present. The values obtained may therefore not accurately reflect the real CO levels.
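The stoichiometric idea behind the iron-based estimate can be sketched as follows, under the same (criticized) assumption that all blood iron is heme iron; the molar masses are approximate textbook values and the example iron concentration is hypothetical.

```python
# Sketch of the stoichiometry behind estimating available Hb from total blood
# iron (FAAS), under the assumption criticized above that all blood iron is
# heme iron able to bind CO. Molar masses are approximate textbook values; the
# example iron concentration is hypothetical.

FE_MOLAR_MASS = 55.85      # g/mol
HB_MOLAR_MASS = 64_500.0   # g/mol (approximate tetramer mass)
FE_PER_HB = 4              # heme iron atoms per hemoglobin tetramer

def hb_from_iron(fe_mg_per_l: float) -> float:
    """Estimate Hb (g/L) from total iron (mg/L), assuming all iron is heme iron."""
    fe_mol_per_l = fe_mg_per_l / 1000 / FE_MOLAR_MASS
    return fe_mol_per_l / FE_PER_HB * HB_MOLAR_MASS

print(f"{hb_from_iron(520):.0f} g Hb/L")   # ~150 g/L for a hypothetical 520 mg Fe/L
```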

Sources of error

As with other GC methods, the main errors in MS-based methods derive from calibration, the subsequent back-calculation of COHb from CO, and extrapolation of existing COHb% saturation–symptom correlations (Table 1).

Hao et al. [37] published an approach based on an HS–GC–MS method for the analysis of CO in putrefied PM blood. The standard curve was constructed from putrefied blood that was saturated by CO bubbling to reach 100% COHb and then flushed to remove excess CO; COHb% levels were then calculated from the ratio between the saturated and the untreated blood. In PM cases, direct blood saturation was performed to prevent variation in Hb levels from affecting the results. The authors reported that 30 min of exposure to pure CO was necessary to fully saturate the blood, although the procedures used to assess complete saturation, the state of the putrefied blood and the PMI were not described [37]. Furthermore, according to the results of the storage condition tests (possible loss of sealing parts of the HS vial, water bath temperature, stability, storage interval and temperature), the storage temperature did not affect COHb% levels. This appears to contradict the findings of the majority of previously published studies, although those were obtained using other approaches, such as optical methods and other GC detectors.
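In essence, such a saturation-based calibration reduces to a single-point ratio between the response of the sample and that of the fully saturated aliquot. The following sketch illustrates this reading of the approach with hypothetical peak areas; it is not a reproduction of the published procedure.

```python
# Sketch of a single-point saturation calibration: the COHb% of a sample is
# estimated from its CO response relative to an aliquot of the same blood
# fully saturated with CO (taken as 100 % COHb). Peak areas are hypothetical;
# a validated method would also include replicates and an internal standard.

def cohb_percent(area_sample: float, area_saturated: float) -> float:
    """COHb% estimated as the sample response relative to the saturated response."""
    return 100 * area_sample / area_saturated

print(f"{cohb_percent(area_sample=3.1e5, area_saturated=7.8e5):.1f} %COHb")  # ~39.7 %
```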

Varlet et al. [52] developed and validated a new HS–GC–MS method which used isotopically labeled formic acid (H13COOH) to produce 13CO as internal standard. This was very advantageous, because formic acid (HCOOH) was already used for the calibration, and sulfuric acid could react with both forms of formic acid, forming a mixture of CO and 13CO from which the CO concentration could be derived mathematically and correlated with COHb levels using previously published formulae [46, 49]. However, these formulae for back-calculating COHb from CO concentrations measured by GC are debatable, as they rest on an incidentally observed good correlation between spectrophotometrically measured COHb levels and CO levels measured by GC–MS [52]. Varlet et al. [36] later improved their method and compared it with results obtained with a CO-oximeter, deriving cutoff values for different categories of back-calculated COHb% levels relative to those measured directly by CO-oximetry. However, while this approach appears reliable for both clinical and forensic cases, only a limited number of cases were tested. Oliverio and Varlet [4, 70] further developed this approach by validating it in both clinical and PM settings for the measurement of the total amount of CO in blood (TBCO) by GC–MS, using an airtight gas syringe for sampling, which minimized the potential losses that could occur with a normal syringe or HS sampler. Application to PM samples showed relevant differences between the CO and COHb contents when literature formulae were applied for back-calculation. Significant differences were also observed between flushed and non-flushed samples from a clinical cohort exposed to CO [70]. This demonstrates the presence of free CO and confirms the weaknesses of COHb for accurate determination of CO poisoning, even though the number of subjects in the cohort was limited. Thus, measurement of TBCO should be performed as an alternative to COHb and to the spectrophotometric methods currently used routinely for the determination of CO.
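The isotope-dilution arithmetic behind the 13CO internal standard can be sketched as follows, assuming equal MS response for CO and 13CO; the peak areas and internal standard amount are hypothetical and not values from the cited work.

```python
# Minimal sketch of the isotope-dilution arithmetic behind a 13CO internal
# standard generated in situ from formic acid-13C: the unknown CO amount is
# obtained from the CO/13CO peak-area ratio and the known amount of internal
# standard, assuming equal MS response for both isotopologues. All numbers are
# hypothetical and not values from the cited work.

def co_amount_nmol(area_co: float, area_13co: float, istd_nmol: float) -> float:
    """Amount of CO (nmol) from the peak-area ratio to the 13CO internal standard."""
    return istd_nmol * area_co / area_13co

print(f"{co_amount_nmol(area_co=4.2e5, area_13co=6.0e5, istd_nmol=500.0):.0f} nmol CO")  # 350 nmol
```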

Interpretation of results and choice of biomarker

After analysis of the samples, an important and challenging aspect in CO determination is the interpretation of the results. There is no consensus on cutoff values for the different levels of exposure and severity of poisoning. According to the World Health Organization (WHO), COHb levels in blood of the healthy non-smoking population should not exceed 2.5–3%, while for smokers, levels above 10% are considered abnormal [11, 71,72,73]. Values of 30–35% COHb are the upper extreme reportedly found in clinical poisoning cases. Above this limit, irreversible damage to the organs is expected, thus initiating a cascade of events eventually leading to death.

However, these values are interpreted differently from case to case. Various parameters can affect COHb% levels perimortem and during the agonal period before death, including the presence of oxidative smoke or other gases that can interfere and/or compete with the CO absorption mechanism, such as nitrogen dioxide (NO2) (which increases methemoglobin), or the formation of other toxic gases such as hydrogen cyanide (HCN) [74]. Pre-existing cardiovascular, hemolytic and respiratory diseases can also alter the mechanism and magnitude of CO absorption, with the potential to either decrease or increase the resulting COHb% levels [11, 23]. Therefore, each case must be analyzed and interpreted individually, based on all relevant information available. For example, a COHb level of 25% in a PM case may be considered a contributing factor to the cause of death, but should not be considered the exclusive cause of death. Similarly, in clinical cases, 15% COHb can be considered indicative of poisoning, yet levels up to 18% have been found in heavy smokers who showed no symptoms of CO poisoning [72]. Overall, there appear to be significant discrepancies between COHb values and reported symptoms, which makes the correct diagnosis of CO poisoning in clinical cases and the determination of the cause of death in forensic cases challenging.

A possible explanation for these phenomena is that a diagnosis of CO poisoning based only on COHb% levels might underestimate the real CO burden. There may be an unknown amount of CO that, on the one hand, dissociates back from COHb and, on the other hand, is dissolved in the blood without being bound to Hb, resulting in a higher total CO content than that determined by CO-oximetry. The conventional assumption that the CO bound to Hb causes the most significant adverse health effects has been repeatedly debated [3, 4, 75,76,77,78]. Free CO in blood could constitute a toxic reservoir of CO for the organism and could also have major implications for the central nervous system (CNS) through its known binding to other globins such as myoglobin, neuroglobin and cytoglobin [79, 80]. The ratio of COHb to dissolved and dissociated CO is also probably subject to interpersonal variability, which includes factors such as metabolic rate and age [11], and needs to be taken into account when interpreting results obtained by CO-oximetry.

Another issue is that GC assays, with the exception of those of Varlet et al. [36, 52] and Oliverio and Varlet [4, 70], include the “flushing” step in their sample preparation procedure. The excess CO which is not bound to Hb is flushed away with inert gas, allowing the determination of only the CO bound to Hb. This procedure is performed under the assumption that only CO bound to Hb is relevant and responsible for the adverse effects of CO poisoning. However, this point has been widely debated, raising the possibility that additional CO found in the blood and not bound to Hb could have an effect on an intoxicated individual. Furthermore, in routine clinical COHb analysis, blood samples are not flushed, because flushing is usually considered inconsistent with the pathophysiology of CO poisoning. In general, the use of formulae to back-calculate GC-measured CO to COHb may be prone to additional errors and could lead to misestimation of the true amount of CO present in the blood of an individual.

All these issues raise doubt as to whether the measurement of COHb is the most appropriate method for the determination of CO poisoning, and it seems plausible that a more accurate biomarker may be found. Several alternative biomarkers have been proposed, including blood concentrations of lactate [81,82,83], bilirubin [84], S100β [85] and troponin. Some of these showed good positive correlations with COHb and were reported to be potentially helpful for diagnosing CO poisoning. However, none of these biomarkers is specific to CO poisoning; rather, they are indirect markers of the toxicity caused by CO in the cardiovascular system, the nervous system and at the cellular level, toxicity that can also be attributable to other diseases.

The development of an alternative biomarker specific to CO should be directed toward a novel measurement approach that not only focuses on the CO bound to Hb, but also takes into consideration the role and toxicity of CO at the cellular level, by measuring the total amount of CO present in the sample, such as TBCO. Mainly because of the dependence of spectrophotometric methods on good-quality samples, which are not always available in forensic cases in particular, GC methods currently seem to be the most suitable techniques to explore further. With regard to detectors, MS is the most versatile, accurate and user-friendly, and is nowadays routinely present in the majority of laboratories. The ability to determine the true CO exposure and to correlate it with the symptoms reported by patients would allow for more conclusive and comprehensive determination of CO poisoning, reducing the number of misdiagnosed cases and falsely determined causes of death.

Conclusions

Although COHb is routinely measured by spectrophotometric methods, several issues concerning sample stability and the dependence of optical methods on sample quality have led to the search for alternative ways to measure CO, such as GC. In addition, there is evidence that a significant amount of the CO present in blood is in free form. Free CO has major toxic effects at the cellular level, affecting not only the respiratory system but especially the CNS. However, it is not quantified with current methods, which focus only on COHb; hence, back-calculation of COHb from CO leads to misestimation. Therefore, an alternative approach that quantifies the total amount of CO in blood directly, such as the proposed TBCO measurement by GC–MS, should be used for the determination of CO poisoning instead of CO in breath or COHb in blood. Although blood CO concentration cutoffs and their correlation with symptomatology are not yet available, and GC–MS is more time-consuming, we recommend that toxicologists use GC–MS methods to verify the results obtained by CO-oximetry or spectrophotometry, especially in doubtful or very challenging cases. This yields results closer to the true CO burden, reducing the underestimation caused by COHb measurement and thus the risk and number of misdiagnoses. Especially when analysis is delayed after sampling and storage is required, we further recommend that toxicologists document the sampling time, analysis time and storage conditions, as these factors can significantly influence the final interpretation.