Abstract
Background Medication literacy refers to the ability of individuals to safely and appropriately access, understand and act on basic medication information. It is vital for correctly and safely using medications. General health literacy measures do not adequately address specific skills for medication literacy, and there are no general, self-administered, performance-based instruments for assessing patients’ medication literacy. Objective The aim was to develop and validate a self-administered performance-based questionnaire measuring functional medication literacy and to evaluate functional medication literacy among the Slovenian general population. Setting A random sample of adult Slovenian residents received the questionnaires at their home addresses. Method The initial content was derived from medication counselling literature. Thirteen patients and 14 healthcare professionals provided feedback about its comprehensibility, comprehensiveness, and difficulty, thus supporting content and face validity. The developed questionnaire, comprising 30 items divided into 5 categories (dosage, adverse effects, interactions, precautions, and other information), was sent to a random sample of 1500 adult Slovenian residents. The overall validity of the questionnaire was assessed via reliability, criterion and discriminant validity using the Kuder–Richardson Formula 20, multiple linear regression and the Mann–Whitney test. Descriptive statistics were used to evaluate medication literacy. Main outcome measure The psychometric properties of the questionnaire (reliability, content, face, criterion, and discriminant validity); level of functional medication literacy. Results A total of 402 residents returned eligible questionnaires (26.8% response rate). The Kuder–Richardson Formula 20 reliability coefficient for the whole questionnaire was 0.823. One item that did not demonstrate discriminant validity was deleted.
Criterion validity was supported by a significant association between age and medication literacy (β = − 0.303). Income (β = 0.243) and current self-perceived health (β = 0.187) were also associated with medication literacy. The median medication literacy score was 24 out of 29 points. Dosage-related items requiring understanding of long text instructions and the use of numeracy skills received the most incorrect answers. Conclusion A performance-based questionnaire with supported validity measuring functional medication literacy among a general population was developed. Slovenian residents encountered difficulties with items requiring prose literacy and numeracy skills, especially those related to dosing. Special attention should be paid to the low-income elderly with poor self-perceived health.
Impact on practice
-
A new, self-administered, performance-based tool for measuring functional medication literacy was developed.
-
Low-income and elderly patients display a lower level of functional medication literacy.
-
Understanding long text instructions and using numeracy skills present problems for patients.
Introduction
The European Literacy Policy Network defined literacy as “the ability to read and write at a level whereby individuals can effectively understand and use written communication in all media” [1]. Literacy can be further divided into three distinct types: document literacy (ability to understand non-continuous information, such as prescription labels), prose literacy (ability to understand continuous text, such as patient information leaflets), and numeracy (ability to perform quantitative computation, such as properly following dosage directions) [2]. Higher literacy skills enable people to improve their potential and knowledge, thus allowing fuller participation in society and the economy. Literacy is both content and context specific. Individuals with high general literacy skills may encounter difficulties when applying their skills to unfamiliar contexts or situations that require specific knowledge, such as the healthcare environment. Consequently, health literacy has emerged as a content-specific literacy in a health context [3].
It can be defined as “personal, cognitive and social skills which determine the ability of individuals to gain access, understand and use information to promote and maintain good health” [4]. Low health literacy is linked to decreased knowledge of medical conditions and treatment, low use of preventive services, poor adherence to treatment plans, high rates of hospitalization, poor health, and high healthcare costs [5,6,7]. In Nutbeam’s outcome model for health promotion, health literacy is an outcome of health promotion actions, such as patient education and counselling. In this model, health literacy includes measures such as health-related knowledge, attitudes, behavioural intentions, personal skills and self-efficacy [4]. Nutbeam also proposed a health literacy model with 3 sequential levels through which patients progress. The basic level is functional literacy, which describes patients’ basic-level skills needed to obtain relevant health information (e.g. on health risks and using the healthcare system) and to apply this knowledge to a range of healthcare situations. It is also a prerequisite for the two higher literacy levels. The second level involves interactive/communicative literacy. It focuses on advanced cognitive skills that enable patients to actively engage in health-related interactions, extract health information, derive meaning from different forms of communication and apply it to changing circumstances. The third and highest level involves critical literacy. This level builds on functional and interactive literacy and refers to patients’ ability to critically analyse health-related information from a wide range of sources and to use this information to make informed decisions and exert greater control over health-related events and situations [8].
Patients’ poor literacy skills create unique problems in different healthcare settings [2]. From a pharmacy perspective, patients must understand an increasingly complex healthcare system, particularly as it relates to medication [7]. Based on definitions of health literacy, Sauceda et al. proposed a definition of medication literacy as the “ability of individuals to safely and appropriately access, understand and act on basic medication information” [9]. Measuring such specific literacy allows better identification and understanding of literacy issues and their impact on patient outcomes [10].
Measures of health literacy can be constructed as perception-based or performance-based instruments. Perception-based instruments are subjective measures and involve respondents rating their perceived abilities to collect, understand, communicate and evaluate health information. They have the advantage of being shorter and potentially less embarrassing for patients, as their abilities are scored by themselves rather than by the researchers. They are usually used as screening tools. However, they might assess self-efficacy or behaviour rather than health literacy. Cultural norms could also affect patients’ replies, thus limiting the validity of such measures for measuring health literacy. Performance-based instruments are considered objective in their assessments. Respondents show higher or lower performance on tasks such as filling in gaps in texts, reading health-related terms out loud or completing knowledge quizzes. These quizzes are usually a set of true/false or multiple-choice questions which address patients’ knowledge of disease symptoms, causes, management etc. Health literacy level is inferred from measured performance. They are mostly used in research contexts. While they may provide a more comprehensive health literacy assessment, they are more time consuming, impose performance pressure on patients and disregard contextual or situational factors that may affect performance [11,12,13]. There are a few validated instruments which measure medication literacy. Sauceda et al. developed and validated a Medication Literacy Assessment in Spanish and English (MedLitRxSE). It is a performance-based measure targeting a distinct geographical setting, the border between the United States and Mexico. It contains items about specific cases (e.g. diabetes), which may not be equally applicable to all patients. Specific patient groups (e.g. patients with diabetes) might be more familiar with the topics covered and therefore show better results [9]. Stilley et al.
developed and validated the Medication Health Literacy Measure, a performance-based measure which focuses on understanding of prescription labels. The questionnaire uses two labels, one for an immunosuppressant medication and one for a diabetes medication, and was validated using adult liver transplant recipients and patients with diabetes [14]. Again, the chosen cases and validation method might make the questionnaire too specific. Yeh et al. developed and tested a Chinese medication literacy measure consisting of 17 items divided into 4 sections: vocabulary, non-prescription drug, prescription drug, and drug advertisement. Respondents had to read and interpret medication-related phrases and written information on prescription medications, non-prescription medications, and drug advertisements. It was administered as face-to-face interviews and may be specific to the Chinese cultural environment [15]. A recent instrument to measure medication literacy was developed by Vervloet et al. in the Netherlands. The RALPH instrument is an interview guide for practicing pharmacists to recognize patients with limited medication literacy skills. It tests functional, communicative and critical literacy [16]. The instrument is practice-oriented and therefore not intended for patient self-administration, which would allow anonymity. It is also not a solely performance-based instrument, as it contains perception-based questions as well. No other published, validated performance-based instruments to measure medication literacy have been found.
Aim of the study
The aim of this study was to develop and validate a new, self-administered, performance-based questionnaire measuring functional medication literacy and to evaluate functional medication literacy among the Slovenian general population.
Ethics approval
Ethical approval for the study was obtained from the National Medical Ethics Committee (Reference Number 55/02/16).
Methods
The content of the new questionnaire was generated from various pharmacy sources addressing patient counselling, health literacy, and medication literacy, and then validated by a panel of patients and healthcare professionals. The resulting questionnaire was validated and used on a large sample of Slovenian residents. The entire development and validation process was conducted in Slovenian.
Content generation
In line with Nutbeam’s model, where health literacy is viewed as an outcome of patient education and counselling, the initial content of the questionnaire was generated from pharmacy counselling literature, patient information booklets, as well as articles from a Pubmed search for pharmacy counselling, patient education, medication literacy and health literacy [9, 14, 17,18,19,20,21,22,23,24]. The retrieved literature was screened for generic types of key information on medication therapy (e.g. medication purpose, dose frequency, adverse effects) that patients should be able to access, understand and act upon. A list of such medication literacy elements was compiled from this literature.
The resulting list of literacy elements was examined by 3 academic pharmacy experts with knowledge of health and medication literacy who were willing to participate in the study. They selected the elements according to 4 criteria. The first criterion was in line with the chosen definition of medication literacy, which highlights safe and proper use of medications. Thus, the experts jointly selected elements that were considered essential to ensure such use, excluding elements that did not address it, e.g. knowing the terms adverse and side effects and the difference between them. Secondly, according to the chosen definition, the elements had to deal with basic medication information. Elements requiring advanced knowledge, e.g. the meaning of the precautionary symbol § on the outer packaging, were excluded. Thirdly, since the aim was to create a general questionnaire, the chosen elements had to be applicable to the general patient population, excluding specific issues such as the use of inhalers. Finally, the elements had to represent topics that patients should know about. The questionnaire was designed for patient self-administration and therefore all the issues which should not be patient’s concern were excluded, e.g. interactions with prescribed medications.
Questionnaire items were generated from the selected elements and presented as mostly close-ended test questions (e.g. which statement is true for adverse effects?). Guidelines for the formulation of the items were followed, with special consideration to plain language and brevity [25, 26]. The questionnaire items were not designed to refer to a specific medication a respondent might be taking or to a specific patient population. They were rather general to allow for relevance to all respondents.
Content validation was conducted by a panel of 27 participants (9 community pharmacists, 3 general practitioners, 2 nurses, and 13 patients) who gave feedback on questionnaire content and format. The members of the panel were asked to self-administer the questionnaire and return it to the research team with feedback on comprehensibility, missing aspects of safe and proper use of medications, redundant items, and items that might exceed the expected level of patient medication literacy. The panellists were recruited through snowball sampling until no new suggestions were proposed. Their feedback and suggestions for improvements were pooled, analysed and discussed by the research team. The questionnaire was optimized accordingly.
Validation and medication literacy evaluation
The optimised questionnaire was used in the validation study and medication literacy evaluation phase. This questionnaire comprised 30 items divided into 5 categories: dosage, adverse effects, interactions, precautions, and other information. All but 1 of the 20 close-ended items had 5 response options. The remaining 10 items were open-ended. All items had a “No answer” option. Sociodemographic data were collected at the end of the questionnaire. The questionnaire was piloted on a small opportunity sample (N = 10) of patients who were asked for additional feedback on the wording, clarity, format, and accompanying instructions. These patients were not included in the subsequent validation study. The pilot study resulted in minor modifications, mostly related to the wording of the questionnaire.
To validate the questionnaire and measure medication literacy, the questionnaire was sent by postal mail to a random sample of 1500 adult Slovenian residents. The random sample was obtained from the Statistical Office of the Republic of Slovenia. The sample size was calculated based on the number of adult Slovenian inhabitants in 2016 (1,701,642 inhabitants), a 95% confidence interval and a 5% margin of error [27, 28]. The calculated required sample size was 385. Anticipating a 25% response rate, 1500 questionnaires were sent. Participants were informed of the importance of medication literacy and the study aims, given instructions for completing the questionnaire, told the estimated time needed to complete it, and assured of the anonymity of their answers. Respondents returned the completed questionnaires to the research team in a pre-paid envelope. All returned questionnaires were analysed, and total medication literacy scores were calculated. Each correct answer was worth 1 point, with a maximum of 30 points. The criteria for evaluating the open-ended questions were defined in advance. One of the researchers used these criteria to assign the points. In case of doubt, a second researcher was consulted and a consensus was reached.
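The required sample size of 385 can be reproduced with the standard formula for estimating a proportion (worst-case p = 0.5) plus a finite-population correction. The sketch below illustrates that textbook calculation under the figures reported above; it is not code from the study:

```python
import math

# Sample size for estimating a proportion at a given confidence level and
# margin of error, with a finite-population correction. Worst-case p = 0.5
# is assumed, as is usual when no prior estimate is available.
def required_sample_size(population, z=1.96, margin=0.05, p=0.5):
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)        # finite-population correction
    return math.ceil(n)

# 1,701,642 adult inhabitants, 95% CI (z = 1.96), 5% margin of error
print(required_sample_size(1_701_642))  # 385
```

With a population this large the correction barely matters: the infinite-population value (384.16) already rounds up to 385.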
Statistical analysis
Descriptive statistics were used to assess sociodemographic data and item-level statistics (% of correct responses, total score, mean/median total score, standard deviation/interquartile range of total score). The overall validity of the questionnaire was assessed via reliability, criterion validity, discriminant validity, and content validity. Reliability was assessed using the Kuder–Richardson Formula 20, a measure of internal consistency for questionnaires with dichotomous choices (in our case, correct or incorrect responses) [29]. Coefficients above 0.7 were considered satisfactory [30]. Criterion validity was addressed by assessing the association between literacy and age, as cited in the literature [31,32,33]. For that purpose, a multiple linear regression analysis was run to determine factors associated with medication literacy. The medication literacy score was set as the dependent variable. The participant’s sex, age, education, income, current self-perceived health status, and any chronic illnesses were used as factors, as these are frequently reported as significant predictors of health literacy [34]. Dummy variables were used for categorical variables with more than 2 categories. The forced entry method of regression (SPSS: Enter method) was used. Multicollinearity was examined via variance inflation factors. To test the discriminant validity of individual items, a Mann–Whitney test was performed comparing the total scores of respondents who answered particular items correctly with those who answered incorrectly. In contrast to the other validity measures, content validity cannot be assessed statistically; the procedures used for content generation and validation (literature search, expert review, patient and healthcare professional panel, pilot study) therefore ensured that relevant content was incorporated into the questionnaire.
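As a concrete illustration of the reliability statistic, the Kuder–Richardson Formula 20 can be computed from a respondents-by-items matrix of dichotomous scores. This is a minimal sketch with toy data, not the study’s own analysis (which was run in SPSS):

```python
import numpy as np

# Kuder-Richardson Formula 20: KR-20 = k/(k-1) * (1 - sum(p*q) / var(total)),
# where p is the proportion answering each item correctly, q = 1 - p, and
# var(total) is the population variance of the total scores (ddof = 0,
# so that it matches the p*q item variances).
def kr20(scores):
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                        # number of items
    p = scores.mean(axis=0)                    # proportion correct per item
    total_var = scores.sum(axis=1).var()       # variance of total scores
    return (k / (k - 1)) * (1 - (p * (1 - p)).sum() / total_var)

# Toy data: 5 respondents, 4 items (1 = correct, 0 = incorrect)
toy = [[1, 1, 1, 0],
       [1, 0, 1, 1],
       [0, 0, 1, 0],
       [1, 1, 1, 1],
       [0, 0, 0, 0]]
print(round(kr20(toy), 3))  # 0.79
```

For dichotomous items KR-20 coincides with Cronbach’s alpha, which is why a coefficient above 0.7 is read with the same rule of thumb.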
The validation study and medication literacy evaluation were conducted in December 2016 and January 2017. All statistical analyses were performed using the statistical package SPSS v22.0. A significance level below 0.05 was considered statistically significant in all analyses.
Results
Content generation
A list of 92 medication literacy elements was compiled from the literature search. After screening using the inclusion criteria, the list was reduced to 30 medication literacy elements, from which 30 questionnaire items were generated. As a result of the panellists’ feedback on questionnaire content and format, two questionnaire items were deleted, four were modified and two were added. Thus, the optimized questionnaire comprised 30 items divided into 5 categories. Table 1 shows the medication literacy elements with the corresponding categories.
Validation and medication literacy evaluation
Participants in the pilot study took about 15 min to complete the questionnaire. Out of 1500 sent, 425 questionnaires were returned (28.3% response rate). After excluding 23 questionnaires with more than 20% missing responses, the final number of questionnaires available for analysis was 402. Table 2 shows sociodemographic data of the respondents. Their mean age was 52 years (range 18–87 years).
The reliability coefficient for the questionnaire, as assessed by the Kuder–Richardson Formula 20, was 0.823, which exceeds the 0.7 criterion.
Table 3 shows the factors used in the final regression model. The model explained 23.0% of the variance in participants’ medication literacy scores (R = 0.479, N = 336, P < .001). All variance inflation factors were below 5 and the tolerance statistics were all above 0.2, indicating only minor multicollinearity among the factors. Three factors associated with medication literacy were statistically significant: age (standardized β = − 0.303), income (standardized β = 0.243), and current self-perceived health status (standardized β = 0.187). Younger participants with higher income and better self-perceived health status demonstrated higher medication literacy scores than other respondents. The other factors were not significantly associated with medication literacy.
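The multicollinearity check reported above rests on a small piece of linear algebra: each variance inflation factor VIF_j = 1/(1 − R_j²) is the j-th diagonal element of the inverse of the predictors’ correlation matrix. The sketch below uses synthetic data, not the study’s variables:

```python
import numpy as np

# Variance inflation factors via the inverse correlation matrix:
# diag(inv(corr(X)))[j] equals 1 / (1 - R_j^2), where R_j^2 comes from
# regressing predictor j on all the other predictors.
def vifs(X):
    corr = np.corrcoef(np.asarray(X, dtype=float), rowvar=False)
    return np.diag(np.linalg.inv(corr))

# Synthetic predictors: columns 0 and 1 are strongly correlated,
# column 2 is independent of both.
rng = np.random.default_rng(1)
a = rng.normal(size=500)
X = np.column_stack([a,
                     a + rng.normal(scale=0.5, size=500),
                     rng.normal(size=500)])
print(vifs(X).round(2))  # first two VIFs well above 1, third close to 1
```

A VIF below 5, as reported for all factors in this study, is a common rule of thumb for acceptable multicollinearity.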
One item, concerning the correct interpretation of the precaution label “Keep out of reach of children”, did not demonstrate discriminant validity. Participants answered this item correctly regardless of their medication literacy score. The item was thus deleted from the questionnaire.
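The discriminant check behind this decision compares the total scores of respondents who answered an item correctly with those who answered it incorrectly. A minimal pure-Python sketch of the Mann–Whitney U statistic (midranks for ties) is given below with hypothetical scores; in practice a library routine such as scipy.stats.mannwhitneyu would also supply the p-value on which the decision rests:

```python
# Mann-Whitney U for two samples, assigning midranks to tied values.
def mann_whitney_u(a, b):
    pooled = sorted(list(a) + list(b))
    # midrank of each distinct value: average of its 1-based positions
    ranks = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2
        i = j
    n1 = len(a)
    r1 = sum(ranks[v] for v in a)      # rank sum of the first sample
    return r1 - n1 * (n1 + 1) / 2      # U statistic for sample a

# Hypothetical total scores: correct responders (a) vs. incorrect (b)
u = mann_whitney_u([26, 27, 24, 29, 25], [18, 22, 20, 24, 19])
print(u)  # 24.5 out of a maximum of 5 * 5 = 25: near-complete separation
```

An item that discriminates well shows a large U (group separation); an item answered correctly regardless of ability, like the deleted one, shows no significant difference.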
When evaluating medication literacy, the median total score was 24 points (interquartile range 4 points) out of 29 points. The minimum achieved total score was 6 points, and the maximum was 29. The scores were not normally distributed, being negatively skewed (skewness coefficient of − 1.615). Table 1 presents the percentages of correct answers for each generated element, with the accompanying category. For full item wordings, see the questionnaire in the Electronic supplementary material.
The close-ended items received the highest scores (more than 95% of participants answered correctly). Item 14, which addressed storing or consuming medications in relation to their expiration date and how to dispose of expired medications, received the most correct answers. Item 18.4, which focused on interpreting the “Shake before use” label, received the next highest number of correct responses, followed by item 9, which was about general medication storage conditions and retaining the original packaging and patient information leaflet, and item 13, which checked awareness of issues when consuming alcohol with medication.
Items related to dosing, which required numeracy skills and understanding information from long texts, received the lowest scores. Specifically, 6 items were answered correctly by fewer than 70% of respondents. Of those, 4 were open-ended questions. Items 10.1, 10.2, and 10.3 were open-ended questions related to interpreting instructions and dosage for a commonly used OTC analgesic, and items 10.1 and 10.2 required some calculations. For item 8, which was a close-ended question about the precaution symbols △ and ▲ on outer packaging, 40.3% chose the “No answer” option. For item 20, which tested whether participants knew that multi-dose preparations (e.g. eye drops) have expiration dates, irrespective of whether they had been opened, 17% selected the “No answer” option and 16.9% incorrectly answered that the medications had unlimited expiration even if opened. Item 18.3 was an open-ended question to test interpretation of the storage label “Store in a cold place at 2–8 °C” and asked participants to specify an example of a place where such a medication could be stored. Most correctly listed the refrigerator, but some incorrectly answered “in a cold place,” “on a high shelf,” or “in the basement.” On average, participants correctly answered 84.2% of the close-ended questions and 72.7% of the open-ended questions (Wilcoxon Signed Ranks Test, P < .001).
Discussion
A new performance-based questionnaire measuring functional medication literacy among the Slovenian general population was developed and validated. The median medication literacy score was 24 out of 29 points.
International comparison
Published international literature on measuring medication literacy is scarce, with only a few studies available for comparison. During the testing phase of MedLitRxSE, Sauceda et al. reported an average score of 10.7 out of 14 points (76.4%) in an English-speaking population [9]. Yeh et al. reported an average overall score of 13 out of 17 points (76.5%) using the developed Chinese medication literacy measure [15]. Using the RALPH instrument in the Netherlands, Koster et al. reported more than 90% correct responses to most questions related to functional medication literacy skills [35].
The content and results of these previous studies are similar to those of the current study: all test participants’ ability to read and correctly interpret information on prescription or non-prescription medications. They also address at least one of the three types of literacy: document literacy, prose literacy and numeracy [2].
Prose literacy and numeracy
Items related to dosing that required understanding long texts and numerical skills had the most incorrect answers. A study by Davis et al. confirms that patients may read dosing instructions but fail to correctly demonstrate the use of such information, because doing so also involves numeracy skills [36]. Numeracy-related tasks are common in healthcare situations, including the handling of medications. The correct use of medications and adherence to therapy involve many numeracy-related issues, including administration frequency, treatment duration, dose per weight, and refill scheduling [37]. Therefore, interventions to improve numeracy skills linked to prose literacy should be a priority when addressing patient medication literacy needs.
Age and education as medication literacy predictors
In this study, age was significantly associated with medication literacy, a phenomenon frequently cited in the literature [31,32,33, 38, 39]. Medication literacy among participants decreased as age increased. As patients age, their cognition, learning, and memory deteriorate, which negatively affects their medication literacy [40]. This problem is compounded by frequent use of medications. The elderly are prescribed three times as many prescriptions as younger adults, and comorbidities often require taking multiple medications several times a day [14]. The elderly are thus a target population for medication literacy interventions.
In the regression analysis, education was not significantly associated with medication literacy, although people with higher levels of education achieved higher average scores than those with lower levels of education (Primary school or less 20.0 points, Secondary school 22.6 points, College 24.1 points, University or more 25.1 points). Collinearity diagnostics indicated that education might be moderately negatively correlated with age, although all variance inflation factors were below 5. Although consistent with the study of Maniaci et al., the current study is one of only a few in which education has not been found to be associated with literacy [41]. Most other studies reported increasing health literacy with an increasing number of years of education [31, 32, 38, 42, 43]. This inconsistency might result from focusing on functional literacy. At this basic level of literacy, differences among differently educated patients may not be evident. Furthermore, Berkman et al. suggested that health literacy might not relate to years of education [44].
Questionnaire application
The initial questionnaire content was developed to be as general as possible (not region- or patient-specific). The later stages (expert panel, pilot and validation study), however, were not conducted internationally. Thus, applicability to international settings cannot be automatically assumed and should be tested separately.
The questionnaire is primarily research-oriented and thus not intended for regular clinical use, as it takes about 15 min to complete. Nevertheless, individual pharmacists can use it as part of their initial patient assessments to provide medication (use) reviews and to identify potential patient misunderstandings and concerns. Further validation could also decrease the questionnaire length, making it appropriate for routine use in outpatient settings. At a population level, the questionnaire can provide a scientific basis for medication literacy interventions, such as visual aids, customized patient information, and improvements in information accessibility [45].
Health literacy and health knowledge
Health literacy and health knowledge have been frequently linked, but their relation remains theoretically inconclusive. Health literacy theories view knowledge as either an antecedent, a dimension or a consequence of health literacy [46, 47]. Sørensen et al. systematically reviewed models of health literacy and developed an integrated conceptual model summarizing the most comprehensive evidence-based dimensions of health literacy. The model lists knowledge as one of the main dimensions of health literacy [48]. In line with Sørensen’s model, the developed instrument features questions addressing patients’ knowledge about medications.
Limitations
Although the self-administered questionnaire sent via postal mail has several advantages, it is also a source of potential bias. Those who responded might feel more competent in medication literacy, leading to an over-estimation in the results. Family and friends might also have assisted participants in completing the questionnaire. Although most close-ended questionnaire items have 4 or 5 response options, the possibility of guessing cannot be eliminated. Furthermore, the current questionnaire does not define cut-off or class values for medication literacy. This is planned for future work. Lastly, self-administration of the questionnaire prevented observation of medication use demonstration among participants.
Conclusion
A performance-based questionnaire with supported validity measuring functional medication literacy among the Slovenian general population was developed. The median medication literacy score among Slovenian respondents was 24 out of 29 points. Items that required comprehension of long texts (prose literacy) and numerical skills (dosing instructions) had the most incorrect answers. Special attention should be paid to low-income and elderly people with poor self-perceived health, as they demonstrated significantly lower medication literacy scores.
References
Valtin R, Bird V, Brooks G, Brozo B, Clement C, Ehmig S, et al. European declaration of the right to literacy: European Literacy Policy Network ELINET. 2016.
King SR, McCaffrey DJ, Bouldin AS. Health literacy in the pharmacy setting: defining pharmacotherapy literacy. Pharm Pract. 2011;9(4):213–20.
Nutbeam D. Defining, measuring and improving health literacy. Health Eval Promot. 2015;42(4):450–6.
Nutbeam D. Health literacy as a public health goal: a challenge for contemporary health education and communication strategies into the 21st century. Health Prom Int. 2000;15(3):259–67.
Weiss BD. Health literacy research: isn’t there something better we could be doing? Health Commun. 2015;30(12):1173–5.
Mitchell B, Begoray D. Electronic personal health records that promote self-management in chronic illness. OJIN Online J Issues Nurs. 2010. https://doi.org/10.3912/ojin.vol15no03ppt01.
The U.S. Department of Health & Human Services-Office of Disease Prevention and Health Promotion. Quick guide to health literacy. 2017. www.health.gov/communication/literacy/quickguide/Quickguide.pdf. Accessed 12 Dec 2017.
Okan O, Bauer U, Levin-Zamir D, Pinheiro P, Sørensen K. International handbook of health literacy: research, practice and policy across the life-span. Bristol: Policy Press; 2019.
Sauceda JA, Loya AM, Sias JJ, Taylor T, Wiebe JS, Rivera JO. Medication literacy in Spanish and English: psychometric evaluation of a new assessment tool. J Am Pharm Assoc JAPhA. 2012;52(6):e231–40.
Mackert M, Champlin S, Su Z, Guadagno M. The many health literacies: advancing research or fragmentation? Health Commun. 2015;30(12):1161–5.
Kiechle ES, Bailey SC, Hedlund LA, Viera AJ, Sheridan SL. Different measures, different outcomes? A systematic review of performance-based versus self-reported measures of health literacy and numeracy. J Gen Intern Med. 2015;30(10):1538–46.
Schulz PJ, Hartung U. The future of health literacy. In: Schaeffer D, Pelikan JM, editors. Health literacy. Forschungsstand und Perspektiven. Bern: Hogrefe; 2017. p. 79–91.
Doan J, Zakrzewski-Jakubiak H, Roy J, Turgeon J, Tannenbaum C. Prevalence and risk of potential cytochrome P450-mediated drug-drug interactions in older hospitalized patients with polypharmacy. Ann Pharmacother. 2013;47(3):324–32.
Stilley CS, Terhorst L, Flynn WB, Fiore RM, Stimer ED. Medication health literacy measure: development and psychometric properties. J Nurs Meas. 2014;22(2):213–22.
Yeh YC, Lin HW, Chang EH, Huang YM, Chen YC, Wang CY, et al. Development and validation of a Chinese medication literacy measure. Health Expect. 2017. https://doi.org/10.1111/hex.12569.
Vervloet M, van Dijk L, Rademakers J, Bouvy ML, De Smet P, Philbert D, et al. Recognizing and addressing limited pharmaceutical literacy: development of the RALPH interview guide. Res Soc Adm Pharm. 2018. https://doi.org/10.1016/j.sapharm.2018.04.031.
Puspitasari HP, Aslani P, Krass I. A review of counseling practices on prescription medicines in community pharmacies. Res Soc Adm Pharm. 2009;5(3):197–210.
Blom L, Jonkers R, Kok G, Bakker A. Patient education in 20 Dutch community pharmacies: analysis of audiotaped patient contacts. Int J Pharm Pract. 1998;6:72–6.
Schommer JC, Sullivan DL, Wiederholt JB. Comparison of methods used for estimating pharmacist counseling behaviors. J Pharm Technol. 1994;10(6):261–8.
Beardsley RS, Kimberlin CL, Tindall WN. Communication skills in pharmacy practice: a practical guide for students and practitioners. 5th ed. Philadelphia: Wolters Kluwer/Lippincott Williams & Wilkins; 2007.
Rantucci MJ. Pharmacists talking with patients: a guide to patient counseling. 2nd ed. Philadelphia: Lippincott Williams & Wilkins; 2007.
Sorensen K, van den Broucke S, Pelikan JM, Fullam J, Doyle G, Slonska Z, et al. Measuring health literacy in populations: illuminating the design and development process of the European Health Literacy Survey Questionnaire (HLS-EU-Q). BMC Public Health. 2013. https://doi.org/10.1186/1471-2458-13-948.
Baker DW, Williams MV, Parker RM, Gazmararian JA, Nurss J. Development of a brief test to measure functional health literacy. Patient Educ Couns. 1999;38(1):33–42.
Weiss BD, Mays MZ, Martz W, Castro KM, DeWalt DA, Pignone MP, et al. Quick assessment of literacy in primary care: the newest vital sign. Ann Fam Med. 2005;3(6):514–22.
Gnjidic D, Johnell K. Clinical implications from drug-drug and drug-disease interactions in older people. Clin Exp Pharmacol Physiol. 2013;40(5):320–5.
Obreli Neto PR, Nobili A, Marusic S, Pilger D, Guidoni CM, Baldoni Ade O, et al. Prevalence and predictors of potential drug-drug interactions in the elderly: a cross-sectional study in the Brazilian primary public health system. J Pharm Pharm Sci. 2012;15(2):344–54.
Daniel WW. Biostatistics: a foundation for analysis in the health sciences. 9th ed. Hoboken: Wiley; 2009.
Republic of Slovenia Statistical Office. Population by age and sex, statistical regions, Slovenia, half-yearly. Ljubljana. 2016.
Kuder GF, Richardson MW. The theory of the estimation of test reliability. Psychometrika. 1937;2(3):151–60.
Aday LA, Cornelius LJ. Designing and conducting health surveys: a comprehensive guide. 3rd ed. San Francisco: Jossey-Bass; 2006.
Jovic-Vranes A, Bjegovic-Mikanovic V, Marinkovic J, Vukovic D. Evaluation of a health literacy screening tool in primary care patients: evidence from Serbia. Health Promot Int. 2014;29(4):601–7.
Eyuboglu E, Schulz PJ. Validation of Turkish health literacy measures. Health Promot Int. 2016;31(2):355–62.
Downey LV, Zun LS. Assessing adult health literacy in urban healthcare settings. J Natl Med Assoc. 2008;100(11):1304–8.
Martin LT, Ruder T, Escarce JJ, Ghosh-Dastidar B, Sherman D, Elliott M, et al. Developing predictive models of health literacy. J Gen Intern Med. 2009;24(11):1211–6.
Koster ES, Philbert D, van Dijk L, Rademakers J, de Smet P, Bouvy ML, et al. Recognizing pharmaceutical illiteracy in community pharmacy: agreement between a practice-based interview guide and questionnaire based assessment. Res Soc Adm Pharm. 2018. https://doi.org/10.1016/j.sapharm.2018.01.009.
Davis TC, Wolf MS, Bass PF, Thompson JA, Tilson HH, Neuberger M, et al. Literacy and misunderstanding prescription drug labels. Ann Intern Med. 2006;145(12):887–94.
Rothman RL, Montori VM, Cherrington A, Pignone MP. Perspective: the role of numeracy in health care. J Health Commun. 2008;13(6):583–95.
Gazmararian JA, Baker DW, Williams MV, Parker RM, Scott TL, Green DC, et al. Health literacy among medicare enrollees in a managed care organization. JAMA. 1999;281(6):545–51.
Ginde AA, Weiner SG, Pallin DJ, Camargo CA Jr. Multicenter study of limited health literacy in emergency department patients. Acad Emerg Med. 2008;15(6):577–80.
Gazmararian JA, Kripalani S, Miller MJ, Echt KV, Ren J, Rask K. Factors associated with medication refill adherence in cardiovascular-related diseases: a focus on health literacy. J Gen Intern Med. 2006;21(12):1215–21.
Maniaci MJ, Heckman MG, Dawson NL. Functional health literacy and understanding of medications at discharge. Mayo Clin Proc. 2008;83(5):554–8.
Barber MN, Staples M, Osborne RH, Clerehan R, Elder C, Buchbinder R. Up to a quarter of the Australian population may have suboptimal health literacy depending upon the measurement tool: results from a population-based survey. Health Promot Int. 2009;24(4):445.
von Wagner C, Knight K, Steptoe A, Wardle J. Functional health literacy and health-promoting behaviour in a national sample of British adults. J Epidemiol Community Health. 2007;61(12):1086–90.
Berkman ND, Sheridan SL, Donahue KE, Halpern DJ, Viera A, Crotty K, et al. Health literacy interventions and outcomes: an updated systematic review. Evid Rep Technol Assess. 2011;199:1–941.
Wali H, Hudani Z, Wali S, Mercer K, Grindrod K. A systematic review of interventions to improve medication information for low health literate populations. Res Soc Adm Pharm. 2016;12(6):830–64.
Gellert P, Tille F. What do we know so far? The role of health knowledge within theories of health literacy. Eur Health Psychol. 2015;17(6):266–74.
Baker DW. The meaning and the measure of health literacy. J Gen Intern Med. 2006;21(8):878–83.
Sorensen K, Van den Broucke S, Fullam J, Doyle G, Pelikan J, Slonska Z, et al. Health literacy and public health: a systematic review and integration of definitions and models. BMC Public Health. 2012;12:80.
Acknowledgements
The authors would like to thank Linda Vidic and Špela Vidmar for their valuable contribution to this study, as well as patients and healthcare professionals who participated in this study.
Funding
This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sector.
Ethics declarations
Conflicts of interest
The authors declare that they have no conflict of interest.
Cite this article
Horvat, N., Kos, M. Development, validation and performance of a newly designed tool to evaluate functional medication literacy in Slovenia. Int J Clin Pharm 42, 1490–1498 (2020). https://doi.org/10.1007/s11096-020-01138-6