The results of diagnostic tests ordered by a patient’s provider play a pivotal role in decision-making, patient outcomes, and quality of care delivered. Therefore, appropriate and meaningful patient care relies on timely, accurate, and consistent communication of test results.1-6 Professional societies promote quality through the development of guidelines and standards, which assist medical providers in reporting results.7,8 To facilitate consistency in nuclear cardiology reporting, the American Society of Nuclear Cardiology (ASNC) published guidelines in 2009 for standardized reporting. These guidelines describe key data reporting elements and definitions.9 The Intersocietal Accreditation Commission Standards and Guidelines for Nuclear/PET Accreditation, which are based on published guidelines from several professional societies, define minimum levels of care and include specific requirements for reporting.10

In 2011, Tilkemeier et al described a high degree of noncompliance with the Intersocietal Accreditation Commission (IAC) Standards in nuclear cardiology reporting.11 Laboratory characteristics associated with noncompliance included accreditation decision and cycle, laboratory type, geographic location of the laboratory, and number of myocardial perfusion imaging tests performed.

In response to this publication, professional societies undertook multiple efforts to improve the quality and compliance of nuclear cardiology reports as part of a broad quality improvement initiative. Examples include an emphasis on reporting standards and their importance at national meetings (ASNC, American College of Cardiology, and others); promotion and dissemination of the IAC Standards; and webinars to better inform laboratories about this topic. The outcomes of these efforts are unknown.

We hypothesized that compliance with the IAC Reporting Standards would improve over time with continuing medical education, increased laboratory accreditation with feedback, and other related instruction. Comparing the IAC datasets from 2008 and 2014, this study had four aims: (1) identify reporting compliance of nuclear cardiology laboratories applying for accreditation with the IAC Standards; (2) examine characteristics of nuclear cardiology laboratories applying for accreditation with regard to laboratory type, number of physicians certified by the Certification Board of Nuclear Cardiology (CBNC), and geographic region; (3) examine noncompliant reporting elements ranked according to their relative importance in relation to the nuclear cardiology laboratory characteristics; and (4) compare reporting compliance between 2008 and 2014.

Methods and Statistics

This study was a retrospective evaluation of all laboratories applying for IAC myocardial perfusion imaging accreditation in 2008 and 2014. We chose this period because it encompasses the time necessary for a laboratory to undergo multiple accreditation applications; the IAC accreditation cycle is three years, and the IAC Standards were relatively stable during this time.

The IAC database was used to extract anonymized laboratory characteristics and reporting compliance data. Laboratory compliance with the IAC Nuclear/PET Reporting Standards was determined from the findings of the peer evaluation of the 3-5 myocardial perfusion reports submitted as part of the accreditation process.

Laboratory Characteristics

Laboratory characteristics evaluated included laboratory type (non-hospital- vs hospital-based) and the geographic region of the United States (Northeast, Midwest, South, and West).12 The numbers of interpreting physicians, physicians certified by the CBNC, and technologists were also evaluated.

Reporting Elements

The 18 required reporting elements, previously defined by Tilkemeier et al,11 were assessed for compliance, ranked by experts for relative importance on a 1-5 scale (1 = most important, 5 = least important), and are shown in Table 1.

Table 1 Reporting element compliance 2008 and 2014 (N = 1816)

The results of these rankings yielded three clear categories of high (1), moderate (2), and low importance (3). We grouped the laboratories according to the highest severity noncompliant reporting element present in their reports. The four groups are as follows: high—labs noncompliant with any reporting element of high importance; moderate—labs with full compliance with all high importance reporting elements and noncompliance with any reporting element of moderate importance; low—labs with full compliance with all high and moderate importance elements and noncompliance with any reporting element of low importance; and all compliant—full compliance with the reporting standards.
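To make the grouping rule concrete, the following is a minimal R sketch (the analysis used R, although the study's actual code is not published); the function, element names, and importance labels are hypothetical illustrations of the scheme described above.

```r
# Minimal sketch of the severity grouping, assuming a hypothetical layout:
# `noncompliant_elements` is a character vector of reporting elements a lab missed,
# and `importance` is a named vector mapping each element to "high", "moderate", or "low".
assign_severity_group <- function(noncompliant_elements, importance) {
  if (length(noncompliant_elements) == 0) return("all compliant")
  levels_present <- importance[noncompliant_elements]
  if (any(levels_present == "high"))     return("high")      # any high-importance miss
  if (any(levels_present == "moderate")) return("moderate")  # else any moderate miss
  "low"                                                      # only low-importance misses
}

# Example with made-up element names and importance assignments
importance <- c(defect_quantification = "high",
                wall_motion           = "high",
                clinical_indication   = "moderate",
                patient_gender        = "low")
assign_severity_group(c("clinical_indication", "patient_gender"), importance)  # "moderate"
assign_severity_group(character(0), importance)                                # "all compliant"
```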

Statistical Analyses

The data were analyzed using SPSS for Windows (version 22.0; Chicago, IL) and R (version 3.2.3; Vienna, Austria). The data were cleaned and examined for outliers, normality of distribution, and correlations. The laboratory characteristics were summarized using number and percentage for categorical variables and the mean (±standard deviation) for continuous variables. For the 18 reporting elements, the number and percentage of laboratories with deficiencies were calculated overall and by element. The overall mean (±standard deviation) number of reporting deficiencies was also calculated. Although reporting deficiencies do not follow a normal distribution (Shapiro-Wilk test P < .001), means and standard deviations are reported rather than medians for ease of interpretation; reporting medians would yield numerous medians of zero deficiencies, which would be misleading.
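As an illustration of these descriptive summaries, the following R sketch computes per-lab deficiency counts, their mean and standard deviation, the per-element deficiency percentages, and the Shapiro-Wilk test; the matrix name and the simulated values are hypothetical stand-ins for the IAC data.

```r
# Hypothetical illustration: `element_flags` is a logical matrix with one row per lab
# and one column per reporting element (TRUE = deficient). Simulated here for demonstration.
set.seed(1)
element_flags <- matrix(runif(1816 * 18) < 0.10, nrow = 1816, ncol = 18)

deficiencies_per_lab <- rowSums(element_flags)   # total noncompliant elements per lab
mean(deficiencies_per_lab)                       # overall mean deficiencies
sd(deficiencies_per_lab)                         # standard deviation

round(colMeans(element_flags) * 100, 1)          # % of labs deficient, by element

shapiro.test(deficiencies_per_lab)               # Shapiro-Wilk normality test of the counts
```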

2008 vs 2014 Comparisons

Comparisons were made between laboratories applying for accreditation in 2008 vs 2014 by individual reporting elements and by mean reporting deficiencies. Comparisons were also grouped by laboratory type, CBNC-certified physicians on staff, and geographic region. Global comparisons based on severity group were additionally made for lab type and for CBNC-certified physicians on staff.

Differences in individual reporting elements were assessed using the chi-squared test for large samples (n > 5) and Fisher's exact test for small sample sizes (n ≤ 5). Cochran-Mantel-Haenszel tests were used to analyze differences in individual reporting elements stratified by lab type, staff type, or region. Differences in mean reporting deficiencies were assessed using Student's t test. A similar analysis was performed when analyzing total reporting issues by region and lab type. Differences in accreditation decision were analyzed using chi-squared tests. For all tests, two-sided P values < .05 were considered statistically significant.
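The following R sketch illustrates how comparisons of this kind could be run with base R functions; the data frame, column names, and simulated values are hypothetical and do not represent the study's actual code or data.

```r
# Hypothetical data frame: one row per lab, with the application year, a deficiency flag
# for a single reporting element, the lab type, and the total deficiency count (all simulated).
set.seed(2)
labs <- data.frame(
  year             = factor(rep(c("2008", "2014"), times = c(980, 836))),
  deficient        = runif(1816) < 0.15,
  lab_type         = factor(sample(c("hospital", "non-hospital"), 1816, replace = TRUE)),
  deficiency_count = rpois(1816, lambda = 2)
)

# Chi-squared test for one reporting element, falling back to Fisher's exact test
# when any cell count is small (one reading of the n <= 5 rule stated above)
tab <- table(labs$year, labs$deficient)
if (any(tab <= 5)) fisher.test(tab) else chisq.test(tab)

# Cochran-Mantel-Haenszel test for the same element, stratified by lab type
mantelhaen.test(table(labs$year, labs$deficient, labs$lab_type))

# Student's t test (equal variances) comparing mean deficiencies between years
t.test(deficiency_count ~ year, data = labs, var.equal = TRUE)
```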

Results

Laboratory Characteristics

One thousand eight hundred sixteen labs applying for accreditation or reaccreditation in 2008 (n = 980) and 2014 (n = 836) were analyzed for compliance with the IAC Standards. Lab characteristics are presented in Table 2. The percentage of facilities initially granted accreditation increased significantly from 2008 to 2014 (13.9% vs 37.4%, P < .001). In 2008, 79.0% of facilities applied for accreditation for the first time, compared with only 5.7% in 2014 (P < .001). There was a significant change in the distribution of lab type between 2008 and 2014 (P < .001), represented by an increase in the proportion of hospitals (from 6.2% to 10.9%). The distribution of labs by region between 2008 and 2014 was not statistically different. The number of interpreting physicians increased from 2008 to 2014 (mean 3.8 ± 4.2 to 4.3 ± 4.2, P = .045), while the number of technologists remained unchanged (mean 2.56 ± 2.7 to 2.54 ± 2.7, P = .865). The average number of physicians per lab with CBNC certification increased from 0.64 ± 1.5 to 1.82 ± 2.8 (P < .001), and the percentage of labs with at least one CBNC-certified physician rose from 30.2% in 2008 to 69.8% in 2014 (P < .001).

Table 2 Laboratory characteristics (N = 1816)

The overall mean number of noncompliant reporting elements per lab decreased significantly from 2008 to 2014 (2.48 ± 2.67 to 1.24 ± 1.79, P < .001). Mean reporting noncompliance decreased from 2008 to 2014 across lab types, CBNC certification status, and geographic regions (P < .001) (Figure 1).

Figure 1

Mean reporting issues by the facility characteristics of lab type, CBNC-certified physicians on staff, and region. All comparisons were significant (P < .001) except between non-hospital and hospital labs in 2008 (red bracket), between labs with and without a CBNC-certified physician on staff in 2014 (green bracket), and among all regions in 2014 (green bracket). CBNC, Certification Board of Nuclear Cardiology

Reporting Elements

The 18 elements of the IAC Reporting Standards, ranked by level of importance, are shown in Table 1. The percentage of labs with issues decreased significantly for 14 of the 18 elements (P < .05) between 2008 and 2014. The elements that showed the greatest improvement were documentation of the date of report finalization (20.4%), patient gender (16.8%), route of administration of the radiopharmaceutical (e.g., intravenous) (14.6%), and clinical indication (13.5%). No statistical difference was observed in the percentage of noncompliant labs for defect quantification, wall motion findings, or demographic items. Timeliness of reporting was the only element for which the percentage of noncompliant labs increased significantly from 2008 to 2014 (P = .014).

The four least compliant elements in 2008 were the date of report finalization (33.78%), integration of stress and imaging reports (26.22%), radiopharmaceutical route of administration (25.82%), and patient gender (21.94%). For comparison, in 2014, the least compliant elements were perfusion defect quantification (21.05%), integration of stress and imaging reports (14.71%), date of report finalization (13.4%), and radiopharmaceutical route of administration (11.24%).

Severity Score

Based on the importance score, we grouped the laboratories by the highest severity noncompliant reporting element present in their reports. The proportion of noncompliant labs decreased significantly across all severity groups between 2008 and 2014, both individually (P < .001) and overall (P < .001), with the percentage of compliant labs increasing from 35.0% in 2008 to 57.1% in 2014 (P < .001) (Figure 2).

Figure 2

Comparison of noncompliant reporting elements between 2008 and 2014 by the highest severity noncompliant element per lab. The severity of noncompliance decreased for all severity types, with a higher percentage of labs in full compliance.

Comparative Analysis

Lab Type

Mean total noncompliant reporting elements for both non-hospital- and hospital-based facilities decreased significantly between 2008 and 2014 (P < .001) (Figure 1). In 2008, no significant difference was observed in mean total noncompliant reporting elements between the two lab types. In 2014, however, a statistical difference did exist between non-hospital- and hospital-based facilities (1.3 ± 1.8 vs 0.8 ± 1.5, P = .012), with non-hospital facilities being less compliant.

Comparison of the severity of reporting issues in 2008 shows a significant difference in the percentage of labs with issues only for the lowest severity group, with more hospitals having issues than non-hospital-based labs (11.5% vs 4.6%, P = .037) (Table 3). There was no significant difference between severity groups in 2014. Comparison by facility type between 2008 and 2014 shows a significant decrease in the percentage of non-hospital facilities with issues in all three severity groups (P < .001). Conversely, the proportion of hospital labs with issues did not change significantly for either the highest or the lowest severity group between 2008 and 2014.

Table 3 Comparison of facility type by severity group and year (N = 1816)

The percentage of labs with fully compliant reports increased significantly for both non-hospital and hospital labs between 2008 and 2014. In 2008, there was no significant difference between non-hospital and hospital labs; in 2014, however, a significantly larger percentage of hospital labs had fully compliant reports.

CBNC Certification

Mean total noncompliant reporting elements decreased from 2008 to 2014 in labs both with (P < .001) and without (P < .001) CBNC-certified physicians on staff. In 2008, the mean number of noncompliant reporting elements differed significantly between labs with (2.09 ± 2.39) and without (2.65 ± 2.77) CBNC-certified physicians (P = .002). We observed no difference in compliance between the two groups in 2014 (Figure 1).

Evaluation of the severity of reporting issues in 2008 shows a significant difference based on CBNC status for high severity issues, with a smaller percentage of labs with CBNC-certified physicians on staff having issues (P = .013) (Table 4). In 2014, there was no significant difference in the proportion of labs with issues for any severity group. Comparing 2008 to 2014, a smaller proportion of labs with a CBNC-certified physician on staff had moderate and low severity issues (P < .001 and P = .010, respectively); however, there was no significant difference in the percentage of labs with high severity issues. For facilities without a CBNC-certified physician on staff, all severity groups improved, with a smaller percentage of labs having issues.

Table 4 Comparison of CBNC status by severity group and year

For labs with fully compliant reports, both labs with and without CBNC-certified physicians demonstrated significant improvement (P < .001). In 2008, a greater percentage of labs with CBNC-certified physicians had compliant reports (P = .02); in 2014, however, there was no significant difference between labs based on CBNC status.

Region

Across all regions and overall, the mean number of noncompliant reporting elements decreased from 2008 to 2014 (P < .001) (Figure 1). In 2008, there was a significant difference in mean noncompliant elements between regions (P = .003); no difference was present between regions in 2014 (P = .067).

Discussion

This study demonstrates improvement in compliance with the IAC Reporting Standards between 2008 and 2014. The results are similar to those of a study by Coleman et al., who found important reporting elements missing in positron emission tomography reports.13

Grading the relative importance of the 18 reporting elements allows a more comprehensive assessment of the quality of the reporting improvement. Every element in the low and moderate importance groups showed significant improvement. Three of the four least compliant elements in 2008 (date of report finalization, radiopharmaceutical route of administration, and patient gender) showed the greatest improvement. Yet, while these elements require little more than attention to detail, two of them (date of report finalization and route of administration) remained among the four least compliant elements in 2014.

Two elements in the high importance group did not show significant improvement in reporting compliance: we observed continued reporting noncompliance for defect quantification and wall motion findings. The published guidelines direct that myocardial perfusion defects must be described in terms of size, severity, type, and location using standardized terminology.14 The guidelines also state that global and regional left ventricular wall motion/thickening must be described qualitatively. The inclusion of both of these interpretive elements in the report, with adequate explanation, is critical in guiding appropriate patient management, as they are both important differentiators of normal and abnormal results. A study by Tragardh et al. in 2012 found that although referring physicians understand the general presence or absence of ischemia and infarction from myocardial perfusion imaging reports, they often underestimate the extent of disease.15 Thus, an adequate description of abnormal findings is critical for patient management decisions.

That there has not been significant improvement in these elements is troubling. Numerous hypotheses should be considered as to why this trend is occurring, including the need for focused education in describing abnormalities and for standardizing descriptive language. Furthermore, adoption of nuclear cardiology reporting software may facilitate more accurate reporting of these elements.

While the aforementioned noncompliance remained constant, issues related to timeliness increased. The most likely explanation is the change in the IAC Standards and Guidelines for Nuclear/PET Accreditation for report turnaround time. Previously, four working days were allowed for report interpretation, transcription, finalization, and transmission to the referring healthcare provider; in 2012, the standard was decreased to two days.10 Although this change resulted in an increase in timeliness issues, it should lead to an improvement in the quality of patient care. An additional explanation may be that more time is required to achieve compliance with the other 17 reporting elements.

In 2014, when labs were grouped according to the highest severity noncompliant reporting element present in their reports, 30.3% had reporting issues that fell into the severe category, while 11.8% and 1.2% fell into the moderate and low categories, respectively. While the concentration of noncompliant labs in the severe category is concerning, reporting noncompliance decreased across all categories between 2008 and 2014, and the percentage of compliant labs increased to 57.1%.

We observed dissimilarities between non-hospital- and hospital-based labs. In 2008, there was no significant difference in compliance between the two lab types. In 2014, however, a difference emerged, with more hospital-based labs compliant and showing greater improvement. This finding suggests that initiatives to effect change may be more impactful in the hospital setting, or perhaps more accessible to those practicing in that type of facility. Hospitals are inherently more regulated, and many have developed mechanisms for quality improvement, which would predict that they would be higher performers. Independent labs may invest fewer resources in quality improvement measures. Additionally, the increasing prevalence of the electronic medical record may contribute to these variances. Another consideration is that as more laboratories transition from non-hospital to hospital-based employment models, the potential exists for even more reporting improvement as laboratories benefit from a potentially more rigorous quality structure in the hospital setting.

Our results indicate that the percentage of labs with at least one CBNC-certified physician increased from 2008 to 2014, as did the average number of certified physicians per lab. This finding is not surprising in the currently complex cardiovascular imaging setting, where federal agencies, payers, health systems, and patients expect quality performance and interpretation of imaging studies along with accountability.16

However, the impact of this increase in the number of certified physicians is debatable. In 2008, the presence of at least one certified physician appeared to increase compliance, as labs with CBNC-certified staff had a lower average number of noncompliant elements and less severe noncompliant elements. The same difference could not be confirmed in 2014: there was no significant difference in mean noncompliant elements or in the severity of those elements, and a similar percentage of labs had compliant reports regardless of CBNC status.

Limitations

Potential limitations of this study include the criteria for submission of cases to the IAC. Because the cases are selected by the laboratory and only 3-5 reports are included per laboratory, they are intended to reflect the laboratory's best work. This may result in selection bias toward a more positive result than would be present in routine clinical work. In addition, the criteria for case selection changed from a randomized method in 2008 to a best-case method in 2014. The results are also only applicable to laboratories that have sought accreditation through the IAC Nuclear/PET pathway; results from laboratories seeking accreditation through other pathways are unknown. Finally, the effect of report quality on clinical outcomes remains unclear.

New Knowledge Gained

In 2011, Tilkemeier et al reported significant nuclear cardiology laboratory noncompliance with reporting standards. The outcome of subsequent educational efforts by professional organizations to improve reporting quality was unknown. The results of this study demonstrate that continuing medical education, accreditation, and other instructional activities aimed at improving nuclear cardiology reporting standards appear to have made a positive impact over time.

Conclusion

Continuing medical education, accreditation, and other instructional activities aimed at improving nuclear cardiology reporting standards appear to have made a positive impact over time. The percentage of labs with compliant reports increased over the period studied, and the noncompliant elements that remained were less severe. Although we detected significant improvement for most of the 18 reporting elements, more work is needed to improve reporting of myocardial perfusion defect quantification and wall motion analysis. The results of this study demonstrate significant improvement in nuclear cardiology reporting compliance with the IAC Standards; however, additional efforts remain necessary to ensure the most effective communication of nuclear cardiology results.