1 Introduction

Today, modern information and communication technology (ICT) is omnipresent and increasingly affects our daily lives. Due to its wide distribution, many people have constant access to the vast stock of information available on the internet and elsewhere. But to really take advantage of information as a resource, one needs information literacy. By investigating different definitions and models of information literacy, Stock and Stock [1] identify two threads: The first focuses on skills for information retrieval. “It starts with the recognition of an information need, proceeds via the search, retrieval and evaluation of information, and leads finally to the application of information deemed positive.” The second puts emphasis on skills for knowledge representation. It includes the “creation of information”, the “representation and storage of information” and issues of information ethics, law and privacy. Whichever definition of information literacy one considers, it becomes obvious that information literacy is a core competence for both social and economic participation in the information age.

This becomes especially clear at the urban level in so-called informational world cities. These “prototypical cities of the knowledge society” [2] are characterised by their “space of flows (flows of money, power, and information) [that] tend to override space of places” [2]. Compared to traditional industries, the creative industries and the knowledge economy take on greater significance in informational cities, which leads to a so-called job polarisation: Routine tasks that used to be done by employees are increasingly executed by computers, leading to the loss of middle-class jobs. This results not only in “a gap between rich and poor” but also between “educated and uneducated people” [3] – the digital divide. To manage the flows that define informational cities, companies, public authorities and citizens must be able to use technologies appropriately to search for, produce and use needed information. Here, information literacy plays a major role: it enables people to participate socially and professionally, giving them an advantage at school, at work and in their everyday lives [4].

It must be said, however, that most people today have never had any information literacy education. And although the importance of information literacy is widely recognised on an academic level, there is plenty of research showing “evidence that many students are information illiterate when they enter institutions of higher education” [5]. Furthermore, “despite clear evidence that sophisticated information literacy skills are beneficial to academic success, students are generally unsophisticated information seekers in academic contexts” [6].

The purpose of this study is not only to assess the status of information literacy among students, but also to attempt an international comparison. By means of a multiple-choice questionnaire, we assess the level of information literacy among university students in informational cities in Canada and Germany, allowing a comparison between the two countries for the different competence areas of information literacy. After presenting our results, we discuss what can be learned from this approach and whether such a comparison can help to improve information literacy education, or whether such a comparison is even possible.

Several tools to assess the state of information literacy, especially among students, already exist. The Information Literacy Test (ILT), developed at James Madison University, is one of them [7]. It is based on the ALA standards [8] and covers four of their five aspects; the actual use of information is excluded, as it cannot be covered in a multiple-choice test. Based on the total score, a student is classified as “below proficient” (< 65%), “proficient” (≥ 65%) or “advanced” (≥ 90%). Smith et al. used the ILT at high schools in Canada and found that 80 out of 103 twelfth-grade students were classified as “below proficient” [9]. Another method is the Standardized Assessment of Information Literacy Skills (SAILS) [10]. SAILS utilises eight skill sets, based on the ALA standards. Beutelspacher [11] developed a further assessment tool, a multiple-choice questionnaire available for the following target groups: “7th grade”, “10th grade”, “high-school graduates and students”, “teachers” and “scientists”. It is based on 62 indicators for information literacy, divided into seven spheres of competence:

I. to identify an information need

II. to search for and find information

III. to evaluate information

IV. to use information

V. to organise information

VI. to communicate and publish information

VII. responsible handling of information

These indicators, which represent a “generic list of skills which should be mastered in order to persist in a knowledge society” [11], were collected by evaluating contemporary definitions, models and standards of information literacy. It is important to note that Beutelspacher’s questionnaire tests skills in information retrieval, similar to the ILT and SAILS, but also includes skills in knowledge representation. This second thread of information literacy has become increasingly important and should not be missing from any assessment tool. It is the main reason this tool was chosen for our study.

2 Methods

To test information literacy skills, we used the version of Beutelspacher’s questionnaire for high-school graduates and students. It consists of 41 multiple-choice questions whose answer options yield positive and negative scores. As an example, question 10 of the test is shown below. A complete list of all questions and possible answers can be found in the appendix.

Question 10: If a search engine retrieves too many web pages, what should you do?

• Use advanced search.

• Only use one search engine.

• Only look at the first ten search results.

• Use the “help”-function.

• Add further search terms.

• Delete some search terms.

• I don’t know.

Checking the answer “I don’t know” yields 0 points. The maximum score is 69 points. Participants are classified as “not information literate” if the total score is below 50% (34.5 points). With a total score of at least 50% they count as a “beginner”, and the “advanced” level starts at a total score of at least 75% (51.75 points). Our target groups were students attending universities located in informational world cities [2]. We further limited the first round of our survey to two countries: Canada and Germany. In each of these countries, three cities are currently identified as informational world cities by Mainka [12]: Montreal, Toronto and Vancouver, and Berlin, Munich and Frankfurt, respectively. There are 14 universities located in these cities. An online survey (English and German) was set up and the link to the questionnaire was distributed among Facebook groups associated with these universities. The literature shows that many students use Social Networking Sites (SNSs) on a regular basis, and Facebook is one of the most popular SNSs [13], especially among students [14, 15]. We identified Facebook groups for this study by searching for groups whose titles contained the name of one of the universities. Beforehand, the administrators of the groups were asked for permission. The survey link was posted in 128 different Facebook groups. The distribution started in February 2014 and ended in October 2014. Due to the long processing time of the voluntary questionnaire, a low participation rate was expected [16]. A raffle (gift cards) was added to the survey as an incentive to raise the participation rate and to encourage participants to finish the questionnaire.
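
To make the scoring rule explicit, the following minimal sketch shows how a total score maps onto the three levels used in this study (Python; the function name `classify` is our own, hypothetical choice and not part of the original questionnaire):

```python
def classify(total_score, max_score=69.0):
    """Map a questionnaire total score onto the three literacy levels
    used in this study (thresholds at 50% and 75% of the maximum)."""
    share = total_score / max_score
    if share >= 0.75:   # at least 51.75 of 69 points
        return "advanced"
    if share >= 0.50:   # at least 34.5 of 69 points
        return "beginner"
    return "not information literate"

print(classify(48.62))  # German average score (70.46%) -> "beginner"
print(classify(44.63))  # Canadian average score (64.68%) -> "beginner"
```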

To test Beutelspacher’s questionnaire in terms of internal consistency, Cronbach’s Alpha (α) was calculated [17]. In addition, a t-test was used to check whether differences between the total scores of Canadian and German students are significant.
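
As an illustration of both computations, the sketch below (Python with NumPy/SciPy) shows the standard Cronbach’s alpha formula and the two-sample t-test; all variable names and the synthetic data are ours, for demonstration only, and do not reproduce the actual survey data:

```python
import numpy as np
from scipy import stats

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_participants x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(42)

# Synthetic item-score matrix (independent random items, so alpha will be
# near 0; the real questionnaire items yielded alpha = 0.814, see Sect. 3).
items = rng.integers(0, 3, size=(892, 41)).astype(float)
print(f"alpha = {cronbach_alpha(items):.3f}")

# Synthetic stand-ins for the per-group total scores (group means from Sect. 3).
canadian = rng.normal(44.63, 8.0, 291)
german = rng.normal(48.62, 8.0, 601)

# Student's t-test assumes equal variances; equal_var=False would instead
# give Welch's t-test for groups with unequal variances.
t, p = stats.ttest_ind(canadian, german, equal_var=True)
print(f"t = {t:.2f}, p = {p:.4f}")
```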

3 Results

In total, 892 students participated in the survey: 291 Canadian (109 male; 175 female; 7 preferred not to say; average age: 21.3 years) and 601 German students (203 male; 398 female; average age: 23.3 years). 154 of the 291 Canadian students were based in Montreal (52.92%), 74 in Vancouver (25.43%) and 63 in Toronto (21.65%). 395 of the German participants were studying in Berlin (65.72%), 151 in Frankfurt (25.12%) and 55 in Munich (9.15%). Since Berlin and Montreal offer more universities than the other cities, their strong participation was predictable. Most of the participants were aiming for a bachelor degree or state examination (Canada: 83.85%; Germany: 75.04%); 16.15% of the Canadian and 24.96% of the German students were in a master or PhD program at the time of the survey. Over 40 different major subjects were represented.

On average, German students scored 48.62 points (70.46%) and Canadian students 44.63 points (64.68%) out of a maximum of 69 points. This difference between the two groups is significant (p < 0.001). 13.06% of the Canadian and 4.83% of the German students were judged to be “not information literate”, while the greatest share of both groups (Canada: 65.64%; Germany: 56.24%) reached the “beginner” level. Only 21.31% of the Canadian and 38.94% of the German participants were classified as “advanced”. Table 1 in the Appendix lists all items as well as the arithmetic mean of the point scores for both countries and the significance value (p) of the t-test. A significant difference (p < 0.05) between the results was found in 25 cases. It should be noted that equal variances are given only for items 2, 9, 10, 13, 15, 17, 19, 23, 26, 30, 33c, 37 and 38; all other p-values were calculated with the students’ total scores.

Looking at the six spheres of competence tested (the actual use of information, sphere IV, cannot be covered by a multiple-choice test, cf. Sect. 1), German participants scored higher in every sphere (Fig. 1). Compared to the other spheres, the results of both groups in sphere V (“to organise information”) are noticeably low. The distribution of information literacy skill levels by gender showed that, in both countries, male participants were more strongly represented in both the “advanced” and the “not information literate” category; on average, however, female participants scored slightly better. When comparing students by desired degree, it stands out that among Canadian participants, “Bachelor of Science” and “Master of Arts” students had the best results, and no master student from Canada was classified as “not information literate”. This improvement cannot be observed for the German students, among whom the best results were achieved by “Master of Arts” and “Bachelor of Arts” students. Comparing Canadian and German students who aim for the same degree, significant differences were found within the groups “Bachelor of Arts” (p ≤ 0.001), “Master of Arts” (p ≤ 0.001) and “Master of Science” (p = 0.033). Equal variances are given in each of these groups.

Fig. 1. Average results for each sphere of competence (Canada vs. Germany)

Across all test items, Cronbach’s Alpha (α) was 0.814, which indicates “good” internal consistency and meets a “reasonable goal” [18].

4 Discussion

By means of a multiple-choice questionnaire, we are able to take a glimpse at the current status of information literacy among young citizens of informational cities in Canada and Germany. Overall, the results are in line with other studies assessing students’ information literacy [5, 19]: measured information literacy levels were relatively low. The fact that students in a master program achieved noticeably better results than their peers aiming for a bachelor degree indicates that students at least improve their information literacy skills during their academic career. The results of the comparison indicate that there are noticeable differences between the information literacy skills of German and Canadian students. On average, German students obtained a more favourable result in all six measured spheres of information literacy competence. For the most part, this pattern can also be seen in the numbers broken down by city, gender and target degree. It is necessary to investigate why the number of advanced students in Germany is so much higher than in Canada. A closer look at the results of PISA [20], a study measuring, among other things, the competences of 15-year-old students on an international scale, shows that Canadian students scored very well when it came to digital skills, while students from Germany showed an average performance. But would a similar study in the field of information literacy show the same results? Furthermore, academic and public libraries in informational cities support schools and universities in promoting information literacy among citizens and students [21], while this is not as common in Germany, where the term “information literacy” is known to few. This raises the question of what results a different method of assessment would have yielded.

The use of a questionnaire has the advantage that it requires less time and fewer resources than interviews. Also, no influence by an interviewer’s behaviour is possible, and participants perceive written surveys as more anonymous [22]. A questionnaire yields objective, reliable and comparable results: every participant is given the same questions and answer options, which are explicitly right or wrong. Moreover, the results of this survey can be compared with future surveys of the same kind [11]. While a multiple-choice questionnaire has these advantages, it also limits the assessment of competences, workflows and other aspects. Especially now that the definition of information literacy is shifting from a catalogue of standards to a framework of “core ideas” [23], it may not be easy to create questionnaires that are able to assess this “new” definition of information literacy. In general, it is difficult to measure information literacy skills in a holistic way, since these are higher-order skills far more complex than a short questionnaire can assess [24]. Since students were not monitored while filling in the questionnaire, it is possible that participants looked up answers with the help of a search engine. Most students completed the survey within the expected time, which leads us to believe that they usually did not use any help. However, the possibility that a student guessed or picked answers at random remains.

Although almost 900 students participated in our survey, it is not yet possible to draw general conclusions. A greater number of students, from different major subjects and faculties and with different degrees, is needed to obtain more representative results. Also, a more even distribution of participants across cities and universities is needed to allow a more detailed analysis. Up to now, none of these assessment tools has been used on a national scale. Fortunately, we may soon be able to see results of the International Computer and Information Literacy Study (ICILS), a computer-based international assessment and comparison of eighth-grade students’ computer and information literacy [25]. These results could provide valuable information, for example on when and how to promote information literacy skills among students, and different institutions could learn from each other. If real differences existed, what could be their reasons? Furthermore, results could be analysed for correlations with programs offered by universities and libraries to promote information literacy skills. Does the availability of such programs lead to better results?

Comparing results from different countries is by no means meant as a competition; rather, it is an opportunity to learn about differences and to learn from one another. But first, to find out more about the origin of the differences in results, further information is needed. For example, personal interviews at the participating universities, not only with students but also with teaching faculty, could help us gain further insight. We chose Canada and Germany for this study because we deemed them relatively similar. But when comparing different countries, cultural differences, linguistic peculiarities and distinctions in school systems and infrastructure have to be taken into account as well. These and other factors can turn a simple comparison into a challenge. And even if this challenge were mastered, we would still have to ask ourselves whether our definition of information literacy is the same as our neighbours’. And does it have to be?