Abstract
Purpose
The vast amount of information available on the Internet, combined with its accessibility, makes it a widely used resource for Americans seeking medical information. The field of radiology is no exception. In this paper, we assess the readability of websites pertaining specifically to emergency radiology.
Methods
Using Google, 23 terms were searched, and the top 10 results were recorded. Each link was evaluated for its readability level using a set of ten reputable readability scales. The search terms included the following: abdominal ultrasound, abdominal aortic aneurysm, aortic dissection, appendicitis, cord compression, CT abdomen, cholecystitis, CT chest, diverticulitis, ectopic pregnancy, epidural hematoma, dural venous thrombosis, head CT, MRI brain, MR angiography, MRI spine, ovarian torsion, pancreatitis, pelvic ultrasound, pneumoperitoneum, pulmonary embolism, subarachnoid hemorrhage, and subdural hematoma. Any content that was not written for patients was excluded.
Results
The 230 articles assessed were written, on average, at a 12.1 grade level. Only 2 of the 230 articles (1%) were written at the third to seventh grade reading level recommended by the National Institutes of Health (NIH) and American Medical Association (AMA). Fifty-two percent of the 230 articles required at least a high school education to read (a 12th grade level or above). Additionally, 17 of the 230 articles (7.3%) were written at a level exceeding an undergraduate education (a 16th grade level or above).
Conclusions
The majority of websites with emergency radiology-related patient education materials do not adhere to the NIH and AMA's recommended reading levels, and the average reader is likely not benefiting fully from these information outlets. Given the established link between low health literacy and poor health outcomes, it is important to address the online content in this area of radiology, allowing patients to benefit more fully from their online searches.
Introduction
The Internet is ingrained in American society and is a logical outlet for seeking health-related information. Nearly 80% of Americans have reported searching for health-related information on the Internet [1]. Online health information is crucial to health literacy, which is defined as “the degree to which individuals can obtain, process, and understand the basic health information and services they need to make appropriate health decisions” [2]. Unfortunately, approximately 80 million Americans have limited health literacy [2]. Readability is a quantitative measure that correlates with health literacy and is used to measure the ease with which a patient can read and understand a particular text [3]. This measure is useful for evaluating whether textual material adheres to the National Institutes of Health (NIH) and American Medical Association’s (AMA) recommendation that patient education materials be written between a third and seventh grade reading level to accommodate the average American patient [4, 5].
The readability of online patient education materials has been studied in previous fields [6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25] but has not been analyzed in the subspecialty of emergency radiology. Health literacy has been shown to play a role in a patient’s decision to undergo radiologic testing, with one study showing an association between low health literacy in caregivers and a decreased use of radiologic testing [26]. Since health literacy is correlated with readability, online emergency radiology education materials are an important area of concern with regard to readability so as to promote online educational resources suitable for the general public. The aim of this paper is to quantitatively determine the readability of online patient education materials concerning emergency radiology imaging and diagnoses.
Methods
A Google (Mountain View, CA) search was performed for each of 23 search terms: abdominal ultrasound, abdominal aortic aneurysm, aortic dissection, appendicitis, cholecystitis, cord compression, CT abdomen, CT chest, diverticulitis, dural venous thrombosis, ectopic pregnancy, epidural hematoma, head CT, MR angiography, MRI brain, MRI spine, ovarian torsion, pancreatitis, pelvic ultrasound, pneumoperitoneum, pulmonary embolism, subarachnoid hemorrhage, and subdural hematoma. The top 10 unique links specific to patient education for each term were selected. Search results from the same Internet domain but with different unique webpages were considered separate results. Websites not specifically written for patients, such as journal articles, were excluded from the analysis. The text was then extracted, converted to plain text, and analyzed for its reading level. Only relevant educational text was included.
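The paper does not specify the tooling behind the plain-text conversion step. As an illustrative sketch only (the `TextExtractor` class and sample page below are hypothetical, not the authors' actual pipeline), visible text can be pulled from a saved webpage with Python's standard-library HTML parser:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping script/style content."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        # Keep only non-empty text outside script/style blocks
        if self._skip_depth == 0 and data.strip():
            self.chunks.append(data.strip())

def html_to_text(html: str) -> str:
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)

page = ("<html><body><h1>Appendicitis</h1><script>var x=1;</script>"
        "<p>A CT scan may be ordered.</p></body></html>")
print(html_to_text(page))  # Appendicitis A CT scan may be ordered.
```

In practice a readability tool would also need to strip navigation menus, footers, and other boilerplate, which is what the study's "only relevant educational text" criterion addresses manually.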
Analysis was performed using Readability Studio Professional Edition Version 2012.1 (Oleander Software, Ltd., Vandalia, OH). Text from each link was analyzed with ten readability assessment scales, all of which are routinely used in the assessment of medical literature. The scales included Flesch Reading Ease (FRE) [27], Flesch-Kincaid Grade Level (FKGL) [28], Simple Measure of Gobbledygook (SMOG) [29], Coleman-Liau Index (CLI) [30], Gunning Fog Index (GFI) [31], New Dale-Chall (NDC) [32], FORCAST formula [33], Fry graph [34], Raygor Reading Estimate (RRE) [35], and New Fog Count (NFC) [28].
The FRE scale [27] uses syllables, words per sentence, and number of sentences to produce a value between 0 and 100 that represents the ease of reading that text. Scores of 0–30 indicate very difficult text, 30–50 difficult, 50–60 fairly difficult, 60–70 standard, 70–80 fairly easy, 80–90 easy, and 90–100 very easy. The remaining nine tests all produce values that correspond directly with academic grade level (i.e., a score of 9.0 indicates material written at the ninth grade level). The FKGL [28] examines the number of syllables per word and the average number of words per sentence. The SMOG [29] assesses the number of words with three or more syllables and the average number of sentences. The CLI [30] looks at the number of letters per 100 words and number of sentences per 100 words. GFI [31] evaluates the number of sentences, number of words, and number of words with three or more syllables. NDC [32] examines the number of words per sentence and the percent of unfamiliar words. The FORCAST formula [33] looks at the number of single-syllable words in a 150-word sample. The Fry graph [34] assesses the average number of sentences and syllables per 100 words. The RRE [35] analyzes the average number of sentences and long words (six or more characters) per 100 words. Finally, the NFC [28] includes the number of complex words, the number of easy words, and the number of sentences.
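To make the mechanics of these scales concrete, here is a minimal sketch of three of the formulas (FRE, FKGL, and SMOG) computed from raw counts. The syllable counter is a crude vowel-group heuristic; commercial packages such as Readability Studio use more careful rules, so exact scores will differ:

```python
import math
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count contiguous vowel groups. Real tools use
    # exception dictionaries, so this over- or undercounts some words.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text: str) -> dict:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)
    n_s, n_w = len(sentences), len(words)
    syllables = sum(count_syllables(w) for w in words)
    polysyllables = sum(1 for w in words if count_syllables(w) >= 3)
    return {
        # Flesch Reading Ease: higher = easier (0-100 scale)
        "FRE": 206.835 - 1.015 * (n_w / n_s) - 84.6 * (syllables / n_w),
        # Flesch-Kincaid Grade Level: approximate US school grade
        "FKGL": 0.39 * (n_w / n_s) + 11.8 * (syllables / n_w) - 15.59,
        # SMOG: grade level driven by polysyllabic word density
        "SMOG": 1.0430 * math.sqrt(polysyllables * 30 / n_s) + 3.1291,
    }

plain = "The scan was quick. It did not hurt. We went home."
jargon = ("Pneumoperitoneum necessitates radiographic evaluation. "
          "Subarachnoid hemorrhage requires computed tomography angiography.")
print(readability(plain)["FKGL"] < readability(jargon)["FKGL"])  # True
```

The published FRE, FKGL, and SMOG coefficients are as shown; only the syllable heuristic is an approximation. The example also illustrates the jargon effect discussed later: multisyllabic medical terms sharply inflate grade-level scores.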
OriginPro (OriginLab, Northampton, MA) was used for the statistical analysis comparing readability across the 230 websites. A one-way ANOVA combined with Tukey’s Honestly Significant Difference (HSD) post hoc analysis was performed, with significance set at p < 0.05.
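As a sanity check on the statistics, the one-way ANOVA F statistic can be reproduced from first principles in plain Python. The grade-level readings below are invented for illustration and are not the study's data:

```python
def one_way_anova_f(groups):
    """Return (F, df_between, df_within) for a one-way ANOVA."""
    k = len(groups)                      # number of groups (search terms)
    n = sum(len(g) for g in groups)      # total observations (websites)
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: spread of group means around the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    # Within-group sum of squares: spread of observations around group means
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    df_between, df_within = k - 1, n - k
    f = (ss_between / df_between) / (ss_within / df_within)
    return f, df_between, df_within

# Hypothetical grade-level readings for three search terms
ct_chest = [9.5, 10.1, 9.8, 10.0]
ovarian_torsion = [14.0, 13.8, 14.5, 14.1]
pneumoperitoneum = [14.2, 14.4, 13.9, 14.3]
f, dfb, dfw = one_way_anova_f([ct_chest, ovarian_torsion, pneumoperitoneum])
print(dfb, dfw, f > 100)  # 2 9 True
```

Tukey's HSD step is omitted here because it requires studentized-range critical values; a statistics package (e.g., statsmodels' `pairwise_tukeyhsd`) would normally handle the post hoc comparisons.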
Results
The average reading level of the 230 articles was a 12.1 grade level. Only 2 of the 230 articles (1%) were written at the NIH and AMA recommended reading level (third to seventh grade). Fifty-two percent of the 230 articles required at least a high school education to read (a 12th grade level or above). Finally, 17 of the 230 articles (7.3%) were written at a level exceeding an undergraduate education (a 16th grade level or above).
Figure 1 depicts the average reading level for each search term; all topics were well above the recommended guidelines. Pneumoperitoneum websites had the highest average reading level of all searched topics, at a 14.2 grade level. CT chest websites were the most readable, with the lowest average reading level, 9.8. On the SMOG scale, none of the 230 articles fell within the guidelines; all were written above a seventh grade reading level. The FORCAST formula and CLI yielded similar findings, with all 230 articles written above a seventh grade level.
A one-way ANOVA with Tukey’s HSD post hoc analysis demonstrated statistically significant differences among search terms (F(22, 208) = 3.39, p = 0.00001). CT abdomen (mean readability of 10.1 grade level) was statistically easier to read than ovarian torsion (14.1) and pneumoperitoneum (14.2). CT chest (9.8) was statistically easier to read than dural venous thrombosis (13.5), epidural hematoma (13.5), ovarian torsion (14.1), pancreatitis (13.5), and pneumoperitoneum (14.2). Additionally, head CT (10.4) and MRI spine (10.2) were both more readable than ovarian torsion (14.1) and pneumoperitoneum (14.2).
Discussion
Patient education materials on websites related to emergency radiology imaging and diagnoses are written at a much higher readability level than the recommendations set forth by the AMA and NIH, contributing to a communication gap between healthcare providers and the general American public. Our results are consistent with other studies analyzing the readability of patient education materials in the fields of radiology [8, 14, 18, 20, 21, 36].
Radiologic imaging is a vital tool for the effective diagnosis and treatment of patients in an emergency department (ED). A study analyzing trends in diagnostic imaging usage over a 10-year period demonstrated a dramatic increase in the use of both CT and MRI: between 1997 and 2006, imaging with CT doubled and imaging with MRI tripled [37]. Additionally, another study described an increase in ED visits utilizing either CT or MRI from 6% in 1998 to 15% in 2007 [38]. The use of radiologic imaging is becoming increasingly important in patient evaluation, particularly in the ED. The dramatic increase in the number of patients requiring radiologic imaging only reinforces the importance of proper patient education in radiology.
It is evident that patients are interested in learning about radiology and are actively seeking additional information on the Internet. For example, the website radiologyinfo.org receives over one million visitors a month [36]. With this high demand for information, the field of radiology has an outstanding opportunity to improve the health literacy of its patients. Additionally, online information can be quickly distributed and updated by authors at little cost, making it a convenient outlet for healthcare providers to supply high-quality healthcare information [39].
However, the high readability level of online patient education material presents a barrier to communication. Of the 230 websites analyzed, less than 1% met the NIH and AMA readability recommendations. Additionally, more than half (52%) of the websites were written at a level requiring a high school education, and 7% required an undergraduate college education. This study suggests that patient education material may not be achieving its goal of raising health literacy among patients. The high readability of this material excludes the average American from the potential benefits of higher health literacy and may contribute to negative health outcomes. Some of the imaging terms (CT abdomen, CT chest, CT head, MRI spine) were easier to read than some of the diagnoses, such as pneumoperitoneum. This may reflect the inherent difficulty of eliminating complex medical jargon when describing terms like pneumoperitoneum, whereas CT head can likely be simplified into more recognizable terms.
Although the Internet provides an outstanding wealth of information for patients to take advantage of, a lack of comprehension can perpetuate negative health trends associated with less-informed patients. Patients with low health literacy are more likely to have poor management of their disease [40], greater disease progression [41], reduced compliance with treatment recommendations [42], and lower self-reported health [43]. A recent study also demonstrated a reduced utilization of radiologic diagnostic testing in the emergency department for pediatric patients whose caregivers had lower health literacies [26]. This helps to explain why patients with limited healthcare literacy often experience disparities in their overall health, with lower literacy level patients reporting poorer overall health compared to more literate patients [44, 45]. In today’s healthcare system, patients are encouraged to make informed healthcare decisions in conjunction with their providers [46]. In order to make informed decisions, patients must first be able to do their own research, in addition to heeding the physician’s advice. Unfortunately, seeking health literature and comprehending health literature are two vastly different phenomena. In fact, 3–5% of the total healthcare costs in the USA are attributed to poor health literacy, further highlighting the importance of health literacy and the burden it can have on the health care system [47]. As such, it is imperative that online information is understandable for the average American so as to encourage informed healthcare decision-making.
Readability is not an easily measured value, as it is dependent on the measures used to analyze the text. The authors acknowledge that the readability of the educational literature analyzed may well be under- or overestimated depending on how the material is analyzed. Readability analyses that rely on syllable counts may falsely inflate the scores assigned to the material, as they cannot differentiate the comprehension levels of multisyllabic words. For example, the word “identical” would not be differentiated from the word “cephalogram” in certain readability assessments. Additionally, the use of medical terminology in radiology education material is often unavoidable. Much of this scientific language is inherently longer and contains more syllables, inflating the scores of tests that account for the number of characters, syllables, or both. However, some studies have shown that even after eliminating some of the medical jargon, educational medical literature remains at a readability level above the average American [48]. Each readability assessment measures variables differently, and each introduces some degree of bias. Multiple assessment scales were used in this study to minimize the effect of bias inherent in each individual scale. Overall patient comprehension of education materials is multifactorial and goes beyond readability alone; factors other than textual content, such as website layout or multimedia resources, can also affect comprehension. When selecting articles for readability analysis, we chose the first ten links written specifically for patients without regard to their content, which has inherent limitations. For example, when searching for diverticulitis, several links did not mention diagnostic imaging; while this may not be immediately relevant to radiology, these are the websites patients are most likely to encounter.
Additionally, certain search terms returned multiple webpages from the same Internet domain; although each was a unique webpage, the shared domain may limit the diversity of sources represented.
Radiologists, other physicians, and other individuals responsible for website content can find it challenging to write education materials at a reading level most Americans can understand. The medical vernacular contains a plethora of long and complex terms that do not help this endeavor. However, it is vital for physicians to explain symptoms, procedures, and other health-related information in a way that avoids much of the terminology that can inhibit a patient’s understanding of the material. Providing patients with understandable education resources will allow patients to take a more active role in their healthcare by enhancing their understanding of medical literature. Patient education materials can be composed in a simpler manner that is better understood by the general public by using resources provided by the NIH, the AMA, or the US Centers for Disease Control and Prevention [3, 4].
Patients are taking an increased interest in their own healthcare and are utilizing the Internet for supplemental information on emergency radiology. However, there is a discrepancy between the complexity of online educational resources and patients’ ability to read and comprehend them. To obtain a higher level of health literacy among patients, educational resources should be written, or rewritten, according to the NIH and AMA readability recommendations. Adhering to these recommendations will augment the community’s understanding of radiology and benefit both practitioners and their patients.
Funding statement
This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.
References
Demographics of Internet Users (2011) Pew Internet and American Life Project. Pew Research Center, Washington, DC
Berkman ND et al (2011) Health literacy interventions and outcomes: an updated systematic review. Evid Rep Technol Assess (Full Rep) 199:1–941
US Department of Health and Human Services, Centers for Disease Control and Prevention (2009) Simply put: a guide for creating easy-to-understand materials. Atlanta, GA
Weiss BD (2003) Health literacy: a manual for clinicians. American Medical Association Foundation, Chicago
How to write easy-to-read health materials. MedlinePlus, US National Library of Medicine. https://www.nlm.nih.gov/medlineplus/etr.html. Accessed 2 Mar 2017
Prabhu AV et al. Online palliative care and oncology patient education resources through Google: do they meet national health literacy recommendations? Pract Radiat Oncol
Prabhu AV, Hansberry DR, Agarwal N, Clump DA, Heron DE (2016) Radiation oncology and online patient education materials: deviating from NIH and AMA recommendations. Int J Radiat Oncol Biol Phys 96(3):521–528. https://doi.org/10.1016/j.ijrobp.2016.06.2449
Hansberry DR, Ramchand T, Patel S, Kraus C, Jung J, Agarwal N, Gonzales SF, Baker SR (2014) Are we failing to communicate? Internet-based patient education materials and radiation safety. Eur J Radiol 83(9):1698–1702. https://doi.org/10.1016/j.ejrad.2014.04.013
Hansberry DR, Agarwal N, Baker SR (2014) Health literacy and online educational resources: an opportunity to educate patients. Am J Roentgenol 204(1):111–116
Agarwal N, Chaudhari A, Hansberry DR, Tomei KL, Prestigiacomo CJ (2013) A comparative analysis of neurosurgical online education materials to assess patient comprehension. J Clin Neurosci 20(10):1357–1361. https://doi.org/10.1016/j.jocn.2012.10.047
Agarwal N, Sarris C, Hansberry DR, Lin MJ, Barrese JC, Prestigiacomo CJ (2013) Quality of patient education materials for rehabilitation after neurological surgery. NeuroRehabilitation 32(4):817–821. https://doi.org/10.3233/NRE-130905
Hansberry DR, Agarwal N, Shah R, Schmitt PJ, Baredes S, Setzen M, Carmel PW, Prestigiacomo CJ, Liu JK, Eloy JA (2014) Analysis of the readability of patient education materials from surgical subspecialties. Laryngoscope 124(2):405–412. https://doi.org/10.1002/lary.24261
Prabhu AV, Kim C, Crihalmeanu T, Hansberry DR, Agarwal N, DeFrances MC, Trejo Bittar HE (2017) An online readability analysis of pathology-related patient education articles: an opportunity for pathologists to educate patients. Hum Pathol 65:15–20. https://doi.org/10.1016/j.humpath.2017.04.020
Hansberry DR, Donovan AL, Prabhu AV, Agarwal N, Cox M, Flanders AE (2017) Enhancing the radiologist-patient relationship through improved communication: a quantitative readability analysis in spine radiology. AJNR Am J Neuroradiol 38(6):1252–1256. https://doi.org/10.3174/ajnr.A5151
Hansberry DR, Agarwal N, John ES, John AM, Agarwal P, Reynolds JC, Baker SR (2017) Evaluation of internet-based patient education materials from internal medicine subspecialty organizations: will patients understand them? Intern Emerg Med 12(4):535–543. https://doi.org/10.1007/s11739-017-1611-2
Hansberry DR, Patel SR, Agarwal P, Agarwal N, John ES, John AM, Reynolds JC (2017) A quantitative readability analysis of patient education resources from gastroenterology society websites. Int J Color Dis 32(6):917–920. https://doi.org/10.1007/s00384-016-2730-3
Agarwal N, Feghhi DP, Gupta R, Hansberry DR, Quinn JC, Heary RF, Goldstein IM (2014) A comparative analysis of minimally invasive and open spine surgery patient education resources. J Neurosurg Spine 21(3):468–474. https://doi.org/10.3171/2014.5.SPINE13600
Hansberry DR, Agarwal N, Gonzales SF, Baker SR (2014) Are we effectively informing patients? A quantitative analysis of on-line patient education resources from the American Society of Neuroradiology. AJNR Am J Neuroradiol 35(7):1270–1275. https://doi.org/10.3174/ajnr.A3854
Agarwal N, Hansberry DR, Singh PL, Heary RF, Goldstein IM (2014) Quality assessment of spinal cord injury patient education resources. Spine (Phila Pa 1976) 39(11):E701–E704. https://doi.org/10.1097/BRS.0000000000000308
Hansberry DR, John A, John E, Agarwal N, Gonzales SF, Baker SR (2014) A critical review of the readability of online patient education resources from RadiologyInfo.Org. AJR Am J Roentgenol 202(3):566–575. https://doi.org/10.2214/AJR.13.11223
Hansberry DR, Kraus C, Agarwal N, Baker SR, Gonzales SF (2014) Health literacy in vascular and interventional radiology: a comparative analysis of online patient education resources. Cardiovasc Intervent Radiol 37(4):1034–1040. https://doi.org/10.1007/s00270-013-0752-6
Hansberry DR, Suresh R, Agarwal N, Heary RF, Goldstein IM (2013) Quality assessment of online patient education resources for peripheral neuropathy. J Peripher Nerv Syst 18(1):44–47. https://doi.org/10.1111/jns5.12006
Prabhu AV et al. Radiology online patient education materials provided by major university hospitals: do they conform to NIH and AMA guidelines? Curr Probl Diagn Radiol
Prabhu AV, Gupta R, Kim C, Kashkoush A, Hansberry DR, Agarwal N, Koch E (2016) Patient education materials in dermatology: addressing the health literacy needs of patients. JAMA Dermatol 152(8):946–947. https://doi.org/10.1001/jamadermatol.2016.1135
Crihalmeanu T et al. Readability of online allergy and immunology educational resources for patients: implications for physicians. J Allergy Clin Immunol Pract
Morrison AK, Brousseau DC, Brazauskas R, Levas MN (2015) Health literacy affects likelihood of radiology testing in the pediatric emergency department. J Pediatr 166(4):1037–1041 e1. https://doi.org/10.1016/j.jpeds.2014.12.009
Flesch R (1948) A new readability yardstick. J Appl Psychol 32(3):221–233. https://doi.org/10.1037/h0057532
Kincaid JP, Fishburne RP, Rogers RL, Chissom BS (1975) Derivation of new readability formulas (Automated Readability Index, Fog Count and Flesch Reading Ease Formula) for Navy enlisted personnel. National Technical Information Service, Springfield, VA
McLaughlin GH (1969) SMOG grading: a new readability formula. J Read 12:639–646
Coleman M, Liau TL (1975) A computer readability formula designed for machine scoring. J Appl Psychol 60(2):283–284. https://doi.org/10.1037/h0076540
Gunning R (1952) The technique of clear writing. McGraw-Hill, New York
Chall JS, Dale E (1995) Readability revisited: the new Dale-Chall readability formula. Brookline Books, Cambridge, MA
Caylor JS, Sticht TG, Fox LC et al (1973) Methodologies for determining reading requirements of military occupational specialties. Human Resources Research Organization, Alexandria, VA
Fry E (1968) A readability formula that saves time. J Read 11:513–578
Raygor AL (1977) The Raygor readability estimate: a quick and easy way to determine difficulty. National Reading Conference, Clemson, SC
Hansberry DR, Ayyaswami V, Sood A, Prabhu AV, Agarwal N, Deshmukh SP (2017) Abdominal imaging and patient education resources: enhancing the radiologist-patient relationship through improved communication. Abdom Radiol (NY) 42(4):1276–1280. https://doi.org/10.1007/s00261-016-0977-3
Smith-Bindman R, Miglioretti DL, Larson EB (2008) Rising use of diagnostic medical imaging in a large integrated health system. Health Aff (Millwood) 27(6):1491–1502. https://doi.org/10.1377/hlthaff.27.6.1491
Korley FK, Pham JC, Kirsch TD (2010) Use of advanced radiology during visits to US emergency departments for injury-related conditions, 1998–2007. JAMA 304(13):1465–1471. https://doi.org/10.1001/jama.2010.1408
Christensen H, Griffiths K (2000) The Internet and mental health literacy. Aust N Z J Psychiatry 34(6):975–979. https://doi.org/10.1080/000486700272
Schillinger D, Grumbach K, Piette J, Wang F, Osmond D, Daher C, Palacios J, Sullivan GD, Bindman AB (2002) Association of health literacy with diabetes outcomes. JAMA 288(4):475–482. https://doi.org/10.1001/jama.288.4.475
Juzych MS, Randhawa S, Shukairy A, Kaushal P, Gupta A, Shalauta N (2008) Functional health literacy in patients with glaucoma in urban settings. Arch Ophthalmol 126(5):718–724. https://doi.org/10.1001/archopht.126.5.718
Badarudeen S, Sabharwal S (2010) Assessing readability of patient education materials: current role in orthopaedics. Clin Orthop Relat Res 468(10):2572–2580. https://doi.org/10.1007/s11999-010-1380-y
Baker DW, Parker RM, Williams MV, Clark WS, Nurss J (1997) The relationship of patient reading ability to self-reported health and use of health services. Am J Public Health 87(6):1027–1030. https://doi.org/10.2105/AJPH.87.6.1027
Sarkar U, Karter AJ, Liu JY, Adler NE, Nguyen R, López A, Schillinger D (2010) The literacy divide: health literacy and the use of an internet-based patient portal in an integrated health system-results from the diabetes study of northern California (DISTANCE). J Health Commun 15(Suppl 2):183–196. https://doi.org/10.1080/10810730.2010.499988
Sudore RL, Mehta KM, Simonsick EM, Harris TB, Newman AB, Satterfield S, Rosano C, Rooks RN, Rubin SM, Ayonayon HN, Yaffe K, for the Health, Aging and Body Composition Study (2006) Limited literacy in older people and disparities in health and healthcare access. J Am Geriatr Soc 54(5):770–776. https://doi.org/10.1111/j.1532-5415.2006.00691.x
Gutierrez N, Kindratt TB, Pagels P, Foster B, Gimpel NE (2014) Health literacy, health information seeking behaviors and internet use among patients attending a private and public clinic in the same geographic area. J Community Health 39(1):83–89. https://doi.org/10.1007/s10900-013-9742-5
Eichler K, Wieser S, Brugger U (2009) The costs of limited health literacy: a systematic review. Int J Public Health 54(5):313–324. https://doi.org/10.1007/s00038-009-0058-2
Berland GK, Elliott MN, Morales LS, Algazy JI, Kravitz RL, Broder MS, Kanouse DE, Muñoz JA, Puyol JA, Lara M, Watkins KE, Yang H, McGlynn EA (2001) Health information on the internet: accessibility, quality, and readability in English and Spanish. JAMA 285(20):2612–2621. https://doi.org/10.1001/jama.285.20.2612
Ethics declarations
Competing interests statement
The authors declare that they have no conflict of interest.
Cite this article
Hansberry, D.R., D’Angelo, M., White, M.D. et al. Quantitative analysis of the level of readability of online emergency radiology-based patient education resources. Emerg Radiol 25, 147–152 (2018). https://doi.org/10.1007/s10140-017-1566-7