Background

Medical students are often placed in situations where they must care for limited English proficient (LEP) patients [1]. As medical students prepare for residency training and independent practice, they face a growing LEP patient population juxtaposed with a lack of linguistically appropriate services at many medical centers. Recent national data show that 64% of the U.S. LEP population is Spanish-speaking, whereas each of the other languages comprises between 1 and 6% [2]. Although at least three quarters of U.S. hospitals report routinely serving LEP patients, only 18% reported offering any formal assessment of staff language skills in caring for these patients, according to a nationally representative survey [3]. Medical schools are increasingly challenged to provide Medical Spanish courses to address this gap.

Projections suggest that the Hispanic population will increase by 115% in the next 50 years, at which time Hispanics are expected to number 119 million, or 29% of the U.S. population [4]. Further, research suggests that, at present, nearly three fourths of Hispanics speak Spanish at home [5]. Research has shown that provider-patient language concordance leads to improved health outcomes [6], fewer clinically significant medical errors [7], and improved patient satisfaction [8]. Despite the increasing need for culturally and linguistically competent physicians [9, 10], and the student and institutional demand for Spanish language education, there is currently no standardized curriculum in U.S. medical schools that addresses students’ Spanish skills, including performance-based assessments of non-English-speaking patient encounters [1, 6].

Appropriately assessing students’ ability to take a medical history in another language is an unresolved problem for the many institutions challenged by large numbers of non-English-speaking patients, where the ad hoc use of untrained medical students as Spanish interpreters or providers is widespread [11]. Self-evaluation has previously been used to rate students’ comfort level and preparedness to interview Spanish-speaking patients or caregivers [12, 13], though in the absence of formal Medical Spanish exposure and assessment it may have limited accuracy [14–16].

Theoretical/Conceptual Framework

We implemented a novel faculty-taught Clinical Medical Spanish elective for fourth-year medical students at the University of Illinois—College of Medicine (UI-COM). The course carries 2 weeks of credit but is taught over an extended 10-week module, rather than as a brief language immersion program, to allow for better retention of language skills. Basic Spanish grammar, conversation, and listening comprehension were incorporated into health-relevant topics through a combination of the educational methodologies described in Table 1. The curriculum mirrors the standard approach to clinical skills education in U.S. medical schools, including a step-wise approach to the clinical interview [17], an organ system-based structure for teaching medical terminology, and assessment via Standardized Patient (SP) Objective Structured Clinical Examinations (OSCEs).

Table 1 Summary of Medical Spanish curriculum

Despite the accepted use of simulation as a medical education and evaluation tool [18, 19], its use as an instructional and assessment tool for students’ medical interviewing skills in other languages has seldom been addressed or studied [20]. We hypothesized that simulation would provide a useful environment in which to test students’ medical vocabulary, interviewing, and general communication skills in a low-stakes but highly realistic patient scenario. Because fourth-year medical students are already experienced with OSCE assessments from their clinical skills education, we anticipated that applying the same methodology to newly acquired Medical Spanish skills would give them a useful and familiar way to view their Spanish clinical performance, much as they view their English clinical performance.

The purpose of this study was to evaluate the first 2 years of curriculum implementation. We examined changes in student language fluency level, changes in comfort level with various components of medical interviewing, and the utility of OSCE-based assessments of students’ medical skills in Spanish.

Methods

Participants

Data were collected over a 2-year period for 58 fourth-year UI-COM medical students enrolled in the Medical Spanish elective during the Fall and Spring of the 2013–2014 (n = 31) and 2014–2015 (n = 27) academic years. Data collected pertain to pre-course self-evaluation of Medical Spanish comfort level, post-course self-evaluation, and OSCE assessments. The institutional review board of the University of Illinois at Chicago approved this study.

Data Collection

A voluntary online pre-course and post-course self-evaluation form was sent to all students registered in the Medical Spanish elective. Fifty-five students completed two OSCE stations during the Medical Spanish elective, and three self-study track students each completed one OSCE, for a total of 113 scored encounters over the 2-year period. OSCE assessments took place at the course mid-point (week 5 of 10) and end-point (week 10 of 10). The standardized patients (SPs) were native Spanish speakers who received 4 h of training. During each OSCE, students were expected to interview, in Spanish, a patient presenting to the clinic with an acute complaint. The interviews were expected to include the history of present illness, medical and surgical history, medications, allergies, family history, and social history. The physical examination findings were provided to the students in written format at the appropriate point of the interview, allowing them to summarize the results and explain a tentative assessment and plan to the patient. Students were then asked to write a case note in English to test their comprehension of the encounter and their ability to consolidate the information as would be expected in a clinical setting.

To evaluate whether the course’s effects on student comfort level with Spanish-speaking medical encounters were sustained, and to support course quality assurance and improvement, a voluntary online survey was sent to participating students 1 year into their residency training.

Measures

Pre- and Post-Course Survey

All 58 students (100%) completed the pre-course survey, and 51 students (88%) completed the post-course survey; each survey consisted of 14 questions. Students were asked to rate their Spanish language fluency level on a scale ranging from 0 to 5 developed by the course director (Table 2). The remaining survey questions were developed in accordance with the course objectives provided in the Clinical Medical Spanish elective syllabus, in order to assess students’ self-rating of their achievement of course goals. For instance, students were asked to rate their comfort level with performing various aspects of the medical interview and examination in Spanish on a 5-point scale (strongly agree, agree, neutral, disagree, or strongly disagree). A sampling of the survey questions is available in Table 3.

Table 2 Spanish Fluency Scale used as general Spanish rating tool throughout the study
Table 3 Sample feedback and survey questions used in the study instruments

OSCE

Evaluations were completed for all 113 OSCE encounters and consisted of a three-part assessment model including (a) an SP feedback form on the students’ interviewing skills, (b) a faculty feedback form on the students’ interviewing skills, and (c) a faculty feedback form on the students’ clinical case note (Table 3). OSCE checklists were developed by the course instructor to reflect the intended goals of the Spanish simulation encounter and were based on checklist questions that are routinely asked in comparable English simulated medical encounters. Checklist items for the SP feedback form focused on whether the students performed specific tasks (e.g., Did the student elicit the chief complaint?) and also evaluated the students’ ability to establish culturally appropriate rapport (e.g., Did the student solicit my perspective regarding the problem?). SP training included education on how to provide feedback and complete the checklist form.

Checklist items for the faculty feedback form included more complex medical skill evaluation, such as whether the student asked a minimum number of pertinent positive or negative questions in consideration of specific diagnoses. A single faculty member, who had received prior training in simulation and feedback techniques, scored all the interviews in this study. Both the SP and the faculty member were asked to evaluate the students’ overall Spanish language communication skills using the same fluency scale used for student self-rating (Table 2), and both faculty and students were asked to evaluate the students’ ability to perform that particular medical interview in Spanish on a 5-point scale (able to take a history as well as in any other language; able to cover at least 80% of the history successfully; able to cover about 50% of the history; able to obtain less than 50% of the history; unable to obtain the history). Scoring cut-offs were identified by the course instructor as a guide to when a student would need to consider the use of a medical interpreter in a clinical setting. Students able to take a history in Spanish as well as in any other language would not require a medical interpreter; those able to cover between 80 and 100% of the history in Spanish may occasionally need an interpreter or other language resource, or may need to work on specific skills to bridge gaps in their knowledge; those able to cover about 50% of the history would need an interpreter unless the patient case was quite routine to their practice; and those able to cover less than half of the appropriate patient interview would be recommended to always use a medical interpreter. These percentage cut-offs allow students to compare their Spanish medical performance with their expected English medical performance and to gauge how much they need to improve in order to independently interview Spanish-speaking patients. Students received a score based on the checklist items as graded independently by the SP and the faculty member.
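
To make the cut-off logic concrete, the sketch below shows one possible way to map the 5-point interview-ability rating to the interpreter-use guidance described above and to compute a simple percentage score from checklist items. This is an illustrative example only: the variable names, checklist items, and scoring code are hypothetical and are not the instrument used in the study.

```python
# Illustrative sketch only: maps the 5-point interview-ability rating to the
# interpreter-use guidance described in the text and computes a simple
# percentage score from checklist items. Names and example data are hypothetical.

INTERPRETER_GUIDANCE = {
    "as_well_as_any_other_language": "No medical interpreter required",
    "covers_at_least_80_percent": "Interpreter or other language resource occasionally needed",
    "covers_about_50_percent": "Interpreter needed unless the case is routine to the provider",
    "covers_less_than_50_percent": "Medical interpreter always recommended",
    "unable_to_obtain_history": "Medical interpreter always recommended",
}


def checklist_score(items: dict) -> float:
    """Return the percentage of checklist items (e.g., 'elicited chief complaint') marked as done."""
    return 100.0 * sum(items.values()) / len(items)


# Example use with hypothetical checklist data for one OSCE encounter
encounter = {
    "elicited_chief_complaint": True,
    "obtained_medication_history": True,
    "asked_about_allergies": False,
    "solicited_patient_perspective": True,
}
print(f"Checklist score: {checklist_score(encounter):.0f}%")
print(INTERPRETER_GUIDANCE["covers_at_least_80_percent"])
```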

Post-Medical School Survey

Thirty-seven out of 58 students (63.8% of those contacted) completed the survey. The survey consisted of 20 questions developed by the course instructor to gauge students’ self-assessment of retention of the skills taught in the Medical Spanish course 1 year earlier and the perceived applicability of those skills to their current residency experience. For example, the survey inquired about the approximate percentage of Spanish-speaking patients each learner cared for, their self-perceived comfort level with caring for Spanish-speaking patients, and whether they felt the course had been useful to their residency experience.

Analysis

Descriptive statistics were calculated to examine overall trends in students’ survey responses and OSCE performance. T tests were used to compare means, and chi-squared tests were used to compare proportions. Specifically, the following comparisons were made: (a) pre- versus post-course fluency and comfort levels with components of the medical interview and examination; (b) student self-rated fluency versus SP-rated and faculty-rated fluency; and (c) fluency ratings versus students’ overall scores and self- and faculty-rated ability to perform the OSCE interviews. Data compilation and analyses were conducted using Stata 14 (StataCorp, College Station, TX).
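
For illustration, the following sketch shows how the comparisons of means and proportions described above could be computed. The study itself used Stata 14; this Python/SciPy version, with placeholder data, is therefore an assumed equivalent rather than the actual analysis code.

```python
# Minimal analysis sketch. The study used Stata 14; this Python/SciPy version
# with placeholder (non-study) data illustrates the same classes of tests.
import numpy as np
from scipy import stats

# (a) Pre- vs. post-course comfort ratings, paired by student (hypothetical values)
pre_comfort = np.array([2, 3, 2, 4, 3, 2, 3])
post_comfort = np.array([4, 4, 3, 5, 4, 4, 4])
t_paired, p_paired = stats.ttest_rel(pre_comfort, post_comfort)

# (b) Student self-rated vs. faculty-rated fluency, compared as two sets of ratings
self_fluency = np.array([2, 3, 2, 3, 2])
faculty_fluency = np.array([3, 3, 4, 3, 3])
t_ind, p_ind = stats.ttest_ind(self_fluency, faculty_fluency)

# Chi-squared test comparing proportions, e.g., female/male counts in the course
# sample versus the graduating class (hypothetical 2x2 contingency table)
table = np.array([[20, 11],
                  [90, 85]])
chi2, p_chi, dof, expected = stats.chi2_contingency(table)

print(f"Paired t test P = {p_paired:.3f}; independent t test P = {p_ind:.3f}; chi-squared P = {p_chi:.3f}")
```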

Results

Over 2 years, 58 students took the course, and 113 SP encounters were completed and evaluated. All were fourth-year medical students and were required to have a minimum low-intermediate Spanish fluency level (Level 2 on the scale in Table 2) to enroll in the course. Students were excluded if they did not meet the basic fluency requirement or if they were unable to attend at least 80% of the class sessions.

Bias analyses using chi-squared tests compared course participants with the characteristics of their graduating classes. Apart from a higher percentage of female versus male course participants, the class sample did not differ significantly from their respective graduating class members with regard to honors, identification as underrepresented minorities, or residency match in primary care fields (Table 4).

Table 4 Comparison of Medical Spanish Elective students to Class of 2014–2015

Comparison of pre- and post-course comfort levels using t tests showed that comfort with interviewing and examining Spanish-speaking patients improved significantly after the Medical Spanish course (Fig. 1). The improvement in Medical Spanish comfort level was sustained 1 year into residency training. Regarding their general Spanish interviewing skills, 100% of 1-year follow-up respondents reported comfort with introducing themselves to patients in Spanish, and 73.0% (27 out of 37 students) reported comfort in performing a simple, problem-focused patient interview in Spanish. In addition, 78.4% (29 out of 37 students) indicated that their Spanish skills were perceived as an asset during their internship, 89.2% (33 out of 37 students) reported that the Medical Spanish elective was useful for their intern year, and 97.3% (36 out of 37 students) reported that they would recommend the course to other fourth-year medical students.

Fig. 1 Pre- and post-course comparison of student comfort level with Spanish interviewing and physical examination skills. All pre- and post-course comfort levels were significantly different (P < 0.05, t test)

Mean OSCE assessment scores from SP-rated feedback checklists, faculty-rated feedback checklists, and faculty-rated case note ratings are presented in Table 5. The interview scores in Table 5 summarize student OSCE performance as rated by faculty and the SP. The faculty note scores reflect how accurately the students reported the information obtained from the Spanish-speaking patient in the standard English case note, and therefore reflect their comprehension of the communication.

Table 5 Descriptive statistics: OSCE assessment data

Comparison of mean SP-rated and faculty-rated fluency using t tests showed that each was significantly higher than the students’ self-rated pre-course fluency (P < 0.001). By the end of the course, the students’ self-rated post-course fluency was still significantly below the SP-rated fluency (P = 0.027) but did not differ statistically from the faculty-rated fluency (P = 0.052). Post-course self-rated fluency levels were significantly higher than faculty-rated ability to perform the medical interview (P < 0.001). The fluency score refers to the fluency of the student’s Spanish speech, including general accent, vocabulary, and conjugation skills, but does not take into account specific task performance or comprehension.

Students’ reported post-course comfort level with all elements of the medical interview was significantly higher than pre-course comfort levels, as shown in Fig. 1 (P < 0.05). We compared student self-reported post-course comfort level with performing parts of the medical interview with the faculty-rated score for the corresponding parts of the OSCE. Students’ self-reported post-course comfort level with introducing themselves to patients, eliciting the chief complaint, obtaining the medical history, obtaining the medication history, and explaining the diagnosis and plan did not show a statistical difference from the corresponding faculty OSCE rating. However, the self-reported post-course comfort level with obtaining a history of present illness (P < 0.001), obtaining the allergy history (P = 0.01), and establishing cross-cultural rapport with the patient (P = 0.03) was higher than the faculty OSCE rating for those elements.

One year into residency training, 37.8% of respondents (14 out of 37 students) reported that Spanish-speaking patients comprised, on average, more than 25% of their intern-year patient exposure. Further, 59.5% (22 out of 37 respondents) reported primarily using Spanish with their Spanish-speaking patients, while the remainder spoke Spanish with patients sometimes and used a Spanish/English interpreter as needed.

Discussion

Students enrolled in our Clinical Medical Spanish course were able to improve their general and focused Spanish interviewing and examination skills by the end of the 10 weeks of instruction. While some prior data suggests that learners may overestimate language fluency skills prior to being exposed to learning or testing the material [14], our data suggests that a robust extended-duration faculty-led course with opportunities for simulated formative assessment enables students to reach a balanced and informed understanding of their own language skill set in a medical context, and to sustain these skills into residency.

We found that there are differences between SP interview rating, faculty interview rating, and faculty case note rating of the students’ performance in the Medical Spanish OSCEs, suggesting that assessing medical skills in a second language may be intrinsically more complex than standard U.S. medical school English OSCE assessments [21]. Work on bilingualism and language acquisition has suggested that learning a second language may also have significant effects on the brain’s executive function and task-switching [22], two critical cognitive areas involved in medical interviewing and decision-making. Our three-part assessment model provides a multi-layered understanding of students’ ability to care for Spanish-speaking patients, which may be incomplete if one of these assessments is omitted. For example, SPs provide an evaluation of fluency and overall interview skills from the perspective of patient-comfort and lay-person comprehension, whereas faculty provide feedback that is focused on the medical accuracy and medical skill set involved in the communication.

Student fluency levels were rated by both faculty and SPs in the OSCE assessments as significantly higher than the students’ self-ratings prior to the course, suggesting that basic Spanish fluency improves through this extended-duration Medical Spanish elective. Data are mixed regarding the accuracy of self-reported fluency levels prior to formal testing [15]. It is possible that the students were underestimating their fluency prior to the course start, and that the OSCE experiences and feedback empowered them to gain a more nuanced understanding of their Spanish-speaking abilities in a medical context.

Fluency levels differed significantly from medical interviewing ability. This suggests that good general fluency is not necessarily equivalent to being able to use Spanish adequately in a medical setting, supporting the rationale for training medical students who already have intermediate or higher levels of basic Spanish in the appropriate use of Medical Spanish. This finding reinforces the critical notion that although hiring bilingual staff is an important step in reducing language barriers, to be effective and safe this initial effort must be supported by substantive medical language skills assessment and training, as well as staff education on appropriate use of medical-interpreter resources [10, 23].

The OSCE-based feedback provided at both the course mid-point and end-point tells students which focused skills they need to work on, rather than providing only a score that can be difficult to interpret. Examples of the qualitative feedback provided include comments on how a student can introduce himself or herself more professionally in Spanish, comments about working on past-tense conjugation, a suggestion to use formal and informal pronouns more consistently when addressing the patient, and a recommendation to develop greater awareness of psychosocial and cultural issues that may play a role for a particular patient. In some cases, faculty recommend that a student use a medical interpreter on a limited or general basis until specific Medical Spanish learning goals are achieved.

Our data suggest some differences between student self-rated comfort with specific Spanish interviewing skills and the faculty-rated performance on those tasks in the OSCEs. It is relevant to note that if a student forgot to ask a particular question (e.g., forgot to ask about allergies), this would be marked as a low score on the faculty rating, but it does not necessarily indicate whether the student is capable of asking that question. Nonetheless, remembering to ask critical pieces of the history is an important component of successfully completing a medical interview, and it is more difficult in a second language. The OSCEs thus provide another level of utility to students by showing them what deficits they may have in the context of a full medical encounter (such as a tendency to forget to inquire about allergies), even when they believe their ability to complete each element independently is adequate. A formal assessment via OSCE as part of a Medical Spanish course may help ensure that students are well prepared to care for Spanish-speaking patients, including having a safe and realistic awareness of their own limitations. The students themselves praised the Medical Spanish curriculum and OSCE experience as significant contributors to their understanding of their functional Spanish interviewing abilities and deficits.

The study is limited by a relatively small sample size of student participants from a single institution. However, data were based on two cohorts of fourth-year medical students and included a post-medical school survey. Because that survey was voluntary, respondents may have self-selected for individuals who found higher utility in the course, whereas those who did not find the course helpful in the long term may not have responded. Finding and training native Spanish-speaking SPs for the course’s OSCEs was a challenging component of the course. Inter-rater reliability could not be assessed in the current study design, since double rating by multiple SPs or faculty members was not performed, but it could be considered for future studies. The design of the OSCE feedback forms completed by the faculty and the SP had to balance reliability, which favors a larger number of questions, against accuracy of responses, which favors keeping the number of questions lower and more manageable.

New Contribution to the Literature

Our curriculum for teaching and assessing Medical Spanish is the first of its kind and can be applied at other medical schools and residency programs. Spanish OSCEs can be incorporated as part of the curriculum to assess student progress and also as a proficiency examination to evaluate students’ or other medical staff members’ ability to independently conduct medical interviews.

Future investigations should evaluate data from a larger and more heterogeneous sample of students and should compare student performance in Spanish patient interviews with their performance in English patient interviews, as a means of controlling for factors other than language in individual students’ performance. Our course’s future OSCE assessments will also include evaluation of practical patient education skills, in which students will be expected to explain a complex medical concept, such as describing a pelvic or rectal examination, explaining the proper use of a peak flow meter, or describing the risks and benefits of a recommended treatment. In addition, our OSCE scoring will be revised to correlate more closely with the scoring system for standard English OSCEs in order to facilitate comparisons between the two data sets.

Offering a substantive clinical Medical Spanish educational experience to medical students is an innovative and compelling solution to increasing the pool of effective bilingual Spanish-English physicians in the U.S. The formal evaluation of these students’ skills via Objective Structured Clinical Examination is an important element in demonstrating skill attainment and safety in interviewing patients in Spanish.