Introduction

According to the National Osteoporosis Foundation and a study published by Wright et al., an estimated 54 million American adults 50 years and older are affected by low bone mass or osteoporosis. The prevalence of low bone mass and osteoporosis is higher among women and non-Hispanic whites [1]. Dual-energy X-ray absorptiometry (DXA) is the most frequently performed examination to assess bone mineral density (BMD) in clinical practice.

The primary output of a DXA exam is a set of BMD values that are typically displayed as a screen capture within the PACS workstation. These values are then manually dictated into the diagnostic report, a process that takes time and is prone to transcription errors.

At our institution, we began by pulling a sample of 100 studies (described further below) to evaluate error rates. We found an error rate of 15 to 20%, depending on whether the report was preliminary or final. These errors have obvious implications for patient care and were evidence that a new solution should be explored, one that would decrease error rates and improve report turnaround time while simplifying the workflow for technologists and radiologists.

We therefore explored alternative options for reporting these values in the diagnostic radiology report. Exporting the numerical data via Health Level 7 (HL7) to the electronic medical record (EMR) was proposed to improve reporting efficiency and accuracy. This approach was also expected to cost less than some commercially available solutions. Additionally, it has the potential for wide applicability given the significant number of patients and providers using an integrated electronic medical record system.

Materials and Methods

Parkland Health and Hospital System is one of the largest public hospital systems in the country, averaging more than 1 million patient visits per year [2]. In 2014, 4040 DXA examinations were performed at Parkland Health and Hospital System.

DXA scans were performed on a GE Lunar Prodigy Advance machine with software package 13.60. The DXA modality devices and EMR interfaces were modified to support sending discrete measurement data directly from the modality through an HL7 interface [3] into the reporting application within the EMR (Epic Hyperspace, Version 2015, Verona, Wisconsin). HL7 is a set of standards used to transmit clinical data between healthcare applications, most commonly for admissions, discharges, transfers, order information, and, in our case, results. We did not have other systems on which to test the feasibility of this solution, but given the widespread use and compatibility of HL7, we think most systems could support a similar solution.
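As background for readers less familiar with HL7 v2 messaging, the following Python sketch assembles a minimal ORU^R01 result message in which each discrete DXA measurement (BMD, T-score, Z-score) travels in its own OBX segment. The segment layout, code values, and patient details are hypothetical; the actual field mapping is configured on the modality and the interface, not shown here.

# Minimal, hypothetical HL7 v2 ORU^R01 message carrying discrete DXA
# measurements as OBX segments (illustration only; not the production mapping).
CR = "\r"  # HL7 v2 separates segments with carriage returns

segments = [
    "MSH|^~\\&|DXA_MODALITY|RADIOLOGY|EMR|HOSPITAL|20150601120000||ORU^R01|MSG0001|P|2.3",
    "PID|1||1234567^^^MRN||DOE^JANE||19600101|F",
    "OBR|1||ACC123456|DXA^Bone densitometry|||20150601113000",
    "OBX|1|NM|BMD_L1L4^BMD lumbar spine L1-L4^LOCAL||1.052|g/cm2|||||P",
    "OBX|2|NM|TSCORE_L1L4^T-score lumbar spine L1-L4^LOCAL||-1.2||||||P",
    "OBX|3|NM|ZSCORE_L1L4^Z-score lumbar spine L1-L4^LOCAL||-0.8||||||P",
]

message = CR.join(segments) + CR
print(message.replace(CR, "\n"))  # print with line breaks for readability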

Our workflow, prior to any changes, was for the radiologist to launch the study from PACS after the technologist completed the exam. The DICOM-wrapped report was viewable in PACS; we would then launch our dictation software and compose the report, manually dictating the BMD values while referencing them on the PACS display. The report could be initiated as a resident preliminary report and finished by an attending, or created entirely by an attending. The report would then post to the EMR and be made available to our referring providers.

The limitations of this workflow were primarily the transcription error rate (manually reporting numbers displayed on one monitor into a report on another monitor) and the speed of reporting (transcribing numerous values was tedious and time consuming).

For our new workflow, the modality first needed to be configured to export the report via HL7, as it originally forwarded only the DICOM-wrapped report to the PACS. Once this was complete, the project team met to decide what level of detail needed to be forwarded from the DXA modality to the EMR. The team consisted of three staff radiologists and members of the IT team. As DXA reports can contain a substantial amount of ancillary information, it was decided to send the raw report data only.

To format the data so that it would be accepted by the EMR and remain legible to users, we leveraged Parkland’s HL7 interface engine (Cloverleaf Integration and Information Exchange Suite, Infor, New York, NY). The interface engine allowed several key pieces of information to be processed and manipulated, including forcing the reports to always cross with a preliminary status, inserting a generic provider, prepending the report with text indicating that it was machine generated, and adding line breaks after each segment. This processing made some of the clinically useful information immediately available in the EMR while also making the referring provider aware that the report was not yet complete.
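The Python sketch below illustrates the logic of those interface-engine transformations. It is an illustration only: the production transformations were built with Cloverleaf's own tooling, and the field positions, generic provider name, and banner text shown here are hypothetical.

def force_field(fields, index, value):
    # Pad the segment with empty fields so the target index exists, then set it.
    while len(fields) <= index:
        fields.append("")
    fields[index] = value

def transform(raw_message: str) -> str:
    segments = raw_message.strip("\r").split("\r")
    out = []
    for seg in segments:
        fields = seg.split("|")
        if fields[0] == "OBR":
            force_field(fields, 25, "P")                 # always cross as preliminary (OBR-25)
            force_field(fields, 32, "GENERIC^PROVIDER")  # insert a generic provider (OBR-32)
        out.append("|".join(fields))
    # Prepend a banner flagging the result as machine generated (placement after
    # MSH is hypothetical), and join with line breaks so each segment displays
    # on its own line in the EMR.
    out.insert(1, "NTE|1||*** MACHINE-GENERATED PRELIMINARY REPORT ***")
    return "\n".join(out) + "\n"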

At this point, standard EMR report functionality was used by the reporting radiologist to modify this machine-generated report to include the interpretive details. EMR smart text feature elements were created to streamline and standardize the interpretive reporting elements (Fig. 1). The workflow was initially developed as EMR-driven but was later integrated into the standard PACS-driven workflow (Fig. 2).

Fig. 1

Examples of the SmartText. Various smart feature elements are demonstrated, including the picklists from the dropdown menus shown above. These picklists allow the reporting physician to give the report a customized feel without spending unnecessary time on manual entry

Fig. 2

Workflow guide. The workflow from order entry to completion of DXA reporting is demonstrated. Note that while acquired images are still sent to PACS, additional information is sent through the Cloverleaf interface engine, allowing for direct reporting into the electronic medical record

In order to evaluate whether this quality improvement initiative led to decreased errors, 100 preliminary DXA radiology reports before the change and 100 after the change were examined. All reports went through a resident preliminary reporting process. These reports were analyzed for errors that included decimal-place changes, number transpositions, negative-number issues, other incorrect numerical values, and failure to include the prior exam for comparison. Errors by residents and errors by attending physicians in each category were then tabulated, and pre- and post-change scores were compared. In addition, report turnaround times were evaluated before and after the changes based on EMR timestamps for the different exam statuses (exam begin, exam end, preliminary report, and final report). Time evaluations included the 1-year volume prior to the change (3915 reports) and 1 month post-change (206 reports).
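As a minimal sketch of the turnaround-time calculation (the record structure, field names, and example timestamps below are hypothetical), average and median intervals can be derived from the EMR status timestamps as follows:

from datetime import datetime
from statistics import mean, median

FMT = "%Y-%m-%d %H:%M"

def minutes_between(start: str, end: str) -> float:
    # Elapsed time in minutes between two EMR status timestamps.
    return (datetime.strptime(end, FMT) - datetime.strptime(start, FMT)).total_seconds() / 60

# Hypothetical example records; the actual data came from EMR exam-status timestamps.
exams = [
    {"exam_end": "2015-06-01 11:30", "prelim": "2015-06-01 11:34", "final": "2015-06-01 15:10"},
    {"exam_end": "2015-06-02 09:00", "prelim": "2015-06-02 09:02", "final": "2015-06-03 08:45"},
]

end_to_prelim = [minutes_between(e["exam_end"], e["prelim"]) for e in exams]
end_to_final = [minutes_between(e["exam_end"], e["final"]) for e in exams]

print("exam end to preliminary: mean %.0f min, median %.0f min"
      % (mean(end_to_prelim), median(end_to_prelim)))
print("exam end to final: mean %.0f min, median %.0f min"
      % (mean(end_to_final), median(end_to_final)))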

Results

Out of 100 DXA exams before the change, 20 preliminary reports contained 44 errors, and 15 final reports contained 25 errors. The errors comprised incorrect numerical values and missing comparison references. The incorrect numbers were seen in both the manually entered T- and Z-score fields. Seven errors related to negative-number issues were found in both the preliminary and final reports. No errors were identified for decimal placement or simple number transposition. Thirty-five otherwise incorrectly transcribed values were found in the preliminary reports, while only 18 made it to the final reports (Table 1). The errors corrected between the preliminary and final reports were identified during the final review by the staff radiologist. Two preliminary reports were identified without the prior exam listed for comparison, but none were found in the final reports (Table 1). Out of 100 DXA exams after the change, only one preliminary report and one final report contained errors; in both cases, the prior exam was not listed for comparison (no number transcription errors were identified). This issue was subsequently resolved when the missing exam titles and imaging codes were updated in the database, allowing recognition of prior DXA exams.

Table 1 Decrease in errors between pre- and post-change DXA reporting

Average exam end to preliminary report time decreased from 1235 to 0 min (median, 153 to 4 min). Average exam end to final report time decreased from 2159 to 625 min (median, 1252 to 225 min). Average exam begin to final report time decreased from 2197 to 670 min (median, 1278 to 260 min; Figs. 6 and 7).

Compared to other available solutions, the cost of building and installing this automated solution is significantly lower. One of the commercially available options explored for automation was a third-party application that integrates with the voice recognition transcription application and costs about $160,000 (including installation and a 5-year maintenance contract). In contrast, our solution cost an estimated $6175 to install and requires no additional contractual maintenance cost.

Typical reports include the patient history, technique, comparison, findings, and clinical impression. The findings section of the pre-change, manually created reports contained 15–20 manually dictated numerical fields (Fig. 3). Because manually dictating these 15–20 numerical values was time consuming and error prone, an alternate solution was sought. The first solution attempted, a brief dictated report consisting only of the interpretive impression elements and excluding all of the manually entered values in the findings section (Fig. 4), was not well received by many of the referring clinicians, who indicated that the measurement values were needed in the reports.

Fig. 3

Original report with manually entered fields. A complete report is demonstrated, showing the various components of the report, including the error-prone manually entered numeric values. Other standard statements and interpretation are also provided

Fig. 4

Brief dictation. This example demonstrates the brief report that was initially attempted. The previously seen manually entered numeric values were not included but were made available in PACS. The referring clinicians were not satisfied with this style of reporting, and therefore other options were sought

An automated data-entry process based on the existing imaging and EMR systems was then evaluated. The new process directly pushed the pertinent information, which had originally been dictated into the findings section, from the DXA scanner into the EMR. By leveraging the smart features in the EMR environment, a customized report with the look, feel, and information of the original report could be generated without manual data entry (Fig. 5).

Fig. 5

Post-change report. Once the numeric values from the DXA scan could be sent directly to the EMR and the smart features in the EMR were leveraged, a custom report with the desired numeric values and interpretive elements could be accurately generated

The reporting change did alter the standard PACS-driven workflow normally used in the radiology department. Rather than launching the exam from PACS into dictation software and completing a report, the exam was viewed in PACS but then interpreted and reported directly in the EMR. Standard tools were incorporated in the workflow to give the customized feel that the referring providers desired while allowing the T- and Z-scores to be automated. Initially, the EMR and PACS systems were not integrated (i.e., they could not launch imaging and reporting automatically in context with each other), which was corrected about 1 month later. This allowed the radiologist to simply open the case from either the EMR or the PACS and have the other system open the correlative information automatically.

An evaluation of efficiency, namely report turnaround times, was performed comparing the initial pre-change, first post-change, and second post-change periods. We found shorter turnaround times in nearly all measurable data across the three categories (Figs. 6 and 7). The exam end to preliminary report time fell to zero in both post-change periods because the DXA modality automatically sent the raw data to the EMR as a preliminary report. The preliminary report to final report turnaround time appeared longer, but this was an artifact of the modality sending a preliminary report directly to the EMR rather than the preliminary report being created by an interpreting radiology resident. The most meaningful measure of improved efficiency was exam end to final report, which showed an incremental improvement independent of whether a resident was assisting in report creation or a staff radiologist was doing the full process.

Fig. 6

Reduction in average turnaround times

Fig. 7

Reduction in median turnaround times

An additional change seen between the pre-change and both post-change periods was an increase in the number of staff physicians interpreting exams without the help of residents. This was thought to be a direct result of the improved efficiency of the new system, the reduced burden of reporting, and greater acceptance by the faculty.

Discussion

Many health information technology tools are available to aid in report generation, including speech recognition software and structured reporting [3]. Tools such as voice recognition software have even been shown to improve report turnaround times [4]. However, the tools being used to create DXA reports at our institution, essentially templates with blank fields for DXA report entries, were tedious and error prone. Commercial solutions were available but were cost prohibitive.

The reporting of DXA examinations was historically managed by having the resident and staff physicians use a PACS-driven workflow linked to voice recognition dictation software, both at our institution and at most radiology practices [5]. This was done in a standardized format, as recommended by the 2007 intersociety conference, to report both the study findings and the procedure performed [6].

Other in-house solutions have been reported, including methods using Microsoft Windows-based macro script editing that were described as “inexpensive” [7]. However, the availability, applicability, and ease of use across multiple health system platforms were not discussed. Another benefit of our solution compared with the macro script described by Iv et al. is that no DXA-specific workstation is required for reporting, nor are there healthcare information security issues, since we use our standard workstations. Our solution also has the advantage of not being linked to the voice recognition software, a problem that Iv et al. were unable to overcome.

One problem encountered when changing the reporting of DXA examinations was the adjustment from a PACS-driven workflow to a RIS-driven workflow. Radiologists at our institution generally open studies in PACS and then dictate into voice recognition software. The initial change of sending data to the EMR allowed one to look at the images in PACS, but reporting was then done in the EMR. Initially, our PACS and EMR were not integrated, which made the workflow cumbersome. Subsequently, we were able to integrate our PACS and EMR so that exams could launch in context. With this workflow, when a study was opened in the EMR or PACS, the other system displayed the same patient’s information. This helped reduce the number of clicks a physician needed to view, interpret, and complete a report.

Abujudeh et al. have previously shown that the automated insertion of technical details improved reports and billed examinations [8]. We hypothesized that a similar method of automation would reduce the number of errors involving similar technical details. The error rate improved after the change, as no numerical errors were found in preliminary or final reports. Only one error was found in the post-change group, in which a comparison report was not included. This was caused by a naming convention change, occurring after that comparison exam, that had not been accounted for in the EMR template build; it was corrected after identification. The increase in the percentage of staff reading exams without the use of resident preliminary reports (Table 2) was felt to be due to the improved efficiency of the new system and the reduced burden of reporting. The percentage of exams reported with the help of a resident physician dropped from 92.52 to 76.56% and 83.98% after the first and second changes in reporting, respectively. The first post-change data reflect the period when reporting in the EMR began, and the second post-change data reflect the period when the PACS and EMR were fully integrated, allowing simultaneous launching of cases.

Table 2 Improved staff reporting related to efficiency

The use of electronic medical records for health information is growing (http://www.healthit.gov/sites/default/files/rtc_adoption_and_exchange9302014.pdf). Most EMRs are capable of accepting multiple downstream reporting feeds and workflows, which makes this solution applicable to a broad population. The HL7 format is an ANSI-accredited standard for network and application integration and is available with most healthcare products. Given the broad use of integrated EMRs and the standardized HL7 data format, our method of automated report generation should be applicable to a large number of physicians and medical facilities that perform DXA reporting and interpretation. Our solution also has the advantage of being more affordable than the commercial solutions mentioned previously. In some practices, mid-level health care providers such as nurse practitioners and physician or radiology assistants dictate preliminary reports; our approach obviates that need and the added cost.

Conclusion

Based on the results of our solution, including improved report turnaround time, improved accuracy, affordability, safety, and widespread applicability, we think that many other institutions would benefit from implementing similar systems. This method of directly reporting numeric values into imaging reports may be expandable to other modalities in the future, such as ultrasound, allowing increased accuracy and improved turnaround times in a cost-saving manner.