Introduction

Citizen science is founded on the involvement of volunteers in scientific research, with the goal of increasing the scope, scale, or diversity of scientific practices [1]. However, the sustainability of citizen science depends on the degree to which members of the public are interested and provide continued contributions. As such, one of the biggest challenges in citizen science is recruiting and retaining sufficient participants to achieve a project’s goals. Participation in citizen science varies greatly across projects and platforms, but projects are at risk of being cancelled if they cannot reach a critical mass or keep their volunteers engaged [2]. Within the citizen science community, several researchers, including Newman et al. [3] and Sørensen et al. [4], have pointed to games as an opportunity for doing citizen science. While the existing literature has mainly examined motivation, quality of player contributions, aspects related to content learning and education, and system/task design in online citizen science games [5], both player experience (PX) and quality of experience (QoE) in games of this type appear to be underexplored.

In this paper, we focus on understanding the QoE of Quantum Moves (QM), an online game contributing to problem optimization in quantum physics [4, 6]. To do this, we conducted an exploratory study of the individual interactions with the game of the players who took part in the study. This premise is consistent with the argument that the individual (socio-) psychological level is the constituent aspect of QoE [7]. The paper is structured as follows. First, we identify research gaps in the existing literature; we then introduce the exploratory survey and the analytic methods we used. Next, we present our findings. We conclude with a discussion and reflections on the implications of the findings for the game QM.

Player experience and quality of experience

In this section, we discuss the concepts of player experience (PX) and quality of experience (QoE). Both refer to the notion of experience—or experiencing—which has been defined as “the individual stream of perceptions (of feelings, sensory percepts and concepts) that occurs in a particular situation of reference” [8]. In terms of the application domain of this paper, experience results from the encounter of a player with a serious game. This encounter is what we can refer to as PX. PX has been a growing field of research in the last decade [9]. In the literature, there are a number of different understandings of the term PX in games. Here, we define PX as “the individual and personal experience of playing games” [7]. For Wiemeyer et al., PX can be described in terms of the characteristics of the interaction between a player and a game and is typically investigated during and after the interaction with games. These authors distinguish three levels of PX: (1) the (socio-) psychological level (individual experience), (2) the behavioral level, and (3) the physiological level. The individual and personal experience comprises a number of dimensions, including intrinsically motivated actions, ambivalence and openness to both procedure and outcomes, immersion, satisfaction and fun of gaming, among others. The behavioral level describes specific observable behavior (like laughing, smiling or frowning), while the last level describes physiological reactions (like increased heart rate or blood pressure) [7]. According to Wiemeyer et al., PX should be distinguished from player types, because the former denotes a transient and dynamic construct (state), while the latter denotes a more or less stable and static construct.

PX is different from QoE. The notion of experience—and similarly that of PX—as defined earlier does not imply an evaluation of the quality of experience. As noted by Raake and Egger [8], evaluating the quality of experience results from cognitive processes related to the experience. They proposed a definition in which QoE results from the individual evaluation of personal expectations and needs with respect to the enjoyment or annoyance experienced while interacting with an application, a service or a system. As the application domain in this study is a serious game, this definition is adequate because it goes beyond the mere use of the game and embraces non-instrumental, hedonic and affective aspects that are central to understanding the quality of experience when playing the game. In addition, the inherent subjective and individual character and the context-dependency of QoE are also implied by this definition [8].

QoE is a highly complex and subjective concept to evaluate. QoE concerns the individual evaluation of a system’s performance. Such evaluation is influenced by, among other things, contextual factors, culture, psychological characteristics, and individual expectations and needs with respect to a system. In the context of games, previous studies have mostly focused on single dimensions of the experience with the game. Positive dimensions have included enjoyment [10], flow [11, 12], immersion [13], and effect and control [14]. Negative dimensions have included boredom and tension relating to the difficulty of a game or the competition a player feels. Game research has paid little attention to these negative dimensions, even though they contribute to the challenge of a game and are likely to influence the QoE [15]. A reason for the focus on single dimensions might rest in the need for a multi-disciplinary and multi-methodological approach to the study of QoE [16]. Methods to assess the experience with a game include psychophysiological player testing, eye tracking, game logs, surveys and questionnaires, among others [17].

QoE in online citizen science games

Online citizen science games are a subset of citizen science projects conducted entirely via the Internet, in which citizens play an enjoyable game and at the same time generate useful data by performing a computation or task which cannot (yet) be performed by computers alone [18]. Both full-fledged games and elements of games or ‘gamification’, where game-related and external motivators (such as points, leaderboards, or achievements) are applied to non-game contexts [19], are used in citizen science to develop applications that invite citizens to collect data, annotate images or documents, or solve difficult scientific problems. Research on the use of games in citizen science is relatively novel but expected to grow rapidly (for a more comprehensive review on online citizen science games, please see [20]). Most of the existing studies thus far have explored the value of games for motivating and engaging volunteers (e.g., [2, 21,22,23,24]). A systematic mapping review of empirical studies on citizens involved in research through gaming showed that, besides motivation, the topics examined included the quality of player contributions, aspects related to content learning and education, and system/task design [5], but not PX and/or QoE of citizen science games. Even though studies of PX have reported that the various features designed into games can influence experience significantly and can further result in the retention of players (e.g., the speed of player interactions and time pressure affected game immersion [25]), there has been very little research looking specifically at citizen science games [26]. A search of the Scopus and Web of Science databases retrieved no articles about QoE of citizen science games and only two empirical studies of PX in citizen science games.
The first study was conducted by Prestopnik and Tang [26], who examined how two different reward structures, one points-based and the other story-based, impacted PX in two citizen science games. Their findings showed that participants strongly preferred the story-based game over the points-based game because in the former the focus on story-motivated activities and rewards made the citizen science task more enjoyable and gave participants various reasons to continue to play. The authors argued that a story-based game allows players to engage with an entertainment-oriented game world that only occasionally requires them to act as a “citizen scientist,” unlike a points-based game where players earn points and other rewards specifically for engaging with the science, with these activities comprising the majority of the game experience. The second study was conducted by Hess et al. [27], who conducted a small user study (20 participants using convenience sampling) with a short version of the Game Experience Questionnaire to measure PX through the improved mechanics of a computer game aimed at solving protein sequence alignments. The results indicated that players had fun playing the game and also had a substantial feeling of competence and challenge in the game. The majority of the players were neither bored nor found the game tiresome.

This exploratory research on the QoE of the game QM yielded findings that can help address the gap in the existing literature regarding the motivation and engagement of citizen science game players.

The game: Quantum Moves (QM)

QM [4, 28] is a dynamic puzzle game about single-atom transport problems related to the development of a so-called quantum computer [6]. QM is the first citizen science game developed by ScienceAtHome (SAH). SAH is a research group at Aarhus University where scientists and game developers develop a platform that includes a variety of tools and games for research in different fields of citizen science. QM presents players with the task of finding the optimal (efficient-yet-quick) way of moving an atom (visualized as a liquid-like substance in the game due to its interpretation as a quantum wave phenomenon) from an initial location to a target area (Fig. 1). The path (‘hit’) created by the player is stored in a database and post-processed with further optimization as detailed in [4]. QM has been played by over 250,000 players around the world, generating over 6 million solutions to the 23 puzzles (levels). For each trial of a QM level, the player gets a score and a star rating (zero to three stars). The score positions players on a leaderboard, motivating them to improve their performance. The user research presented in this paper builds on two previous studies of QM. In 2014, Lieberoth et al. [29] conducted a study with 1190 players of QM characterizing the types of players who play the game. Lieberoth et al. found two types of players: the ‘heroes,’ who play methodically, are intrinsically motivated and connect with the science behind the game, and the ‘casual’ players, whose motivations and skill levels are fluid. In 2017, Pedersen et al. [28] conducted a study with 150,000 players of QM examining the influence of different types of leaderboards on motivation and performance in the game.

Fig. 1

Interface used by the player of QM. The player has to move the purple liquid-like substance on the wave to a specific point by means of the concentric circles (which represent an optical tweezer)

Data and methods

As part of our research on the QoE of QM, we constructed a 20-question post-play questionnaire to assess players’ motivations for playing the game and possible improvements for future updates of the game. We decided to build a custom questionnaire because the main goal of the research was to assess the game from a development perspective while addressing the challenges posed by studying citizen science games. That is, questionnaires targeting commercial entertainment games approach PX from a marketing perspective, and questionnaires targeting learning games focus on the relationship between gameplay experience and content learning. Players of citizen science games can be driven by motivations not explored in those questionnaires, such as an interest in helping the development of science or in the scientific topic behind the game.

The authors who worked at SAH and were familiar with the study population defined and discussed the questions. The purpose was to gather information about the way players relate to the game. Between March and April 2017, we distributed a cross-sectional, self-administered online questionnaire to approximately 25,600 people on the SAH mailing list. The questionnaire included two parts (Table 1). The first part consisted of 13 compulsory questions, of which three were only for participants who play games regularly. The first four questions were about general gaming habits and were included to enquire into how gaming preferences related to recruitment and retention in QM. The next six questions enquired about player perceptions of the game and player motivations before and during gameplay. The last three questions inquired into experiences with the SAH platform in general to investigate the efficacy of the infrastructure around the game. The second part of the questionnaire was voluntary and contained four questions about sociodemographic player characteristics and one free-text question (Q19) enquiring into the individual and personal PX, which is considered the constituent dimension of PX [7]. In this paper, we focus on the results from the analysis of the responses to the free-text question, while characterizing the players of the game using the sociodemographic data provided. The purpose of these analyses was to provide feedback to the project from a scientific, design, and engagement perspective, which in turn limits the generalizability of our findings. This study follows all the ethical regulations for research in the Social Sciences within the European Union [30]. Survey respondents were asked to sign an electronic consent form. The survey data did not contain personally identifiable data and was stored in a password-protected electronic format.

Table 1 Questionnaire for assessing players’ motivations for playing QM

In addition, Q20 prompted our respondents to take the Bartle test for players of MUDs (designed by Erwin Andreasen and Brandon Downey). The Bartle test is widely used in the game design industry, as it gives a general perspective on what players like about a game and how they relate to it, allowing a game to be balanced and its retention improved based on its target population. However, due to the reported limitations of the test [31] and the lack of access to its psychometric properties, we decided to report this data only at a basic level. We highlight that, of the 743 respondents, 630 took the Bartle test and reported back to us, suggesting that this type of self-assessment tool is of interest to our players. Many players reported belonging to more than one category (probably because the test reports on more than one category). The category of ‘Explorer’ was the most prevalent (528 participants), followed by the categories of ‘Achiever’ and ‘Socialiser’ (126 and 115 participants, respectively), with only 32 players in the category of ‘Killer’ (distribution of the players provided in “Appendix 1”). This suggests that players whose motivation is to explore might be the ones most attracted to QM.

We used two methods to analyze the survey data: (a) descriptive statistics for analyzing the frequency and distribution of responses according to age, gender and occupation, and (b) iterative thematic analysis [32] for identifying themes within the responses to the free-text question (Q19) about what players liked or disliked about the game, what they would like to improve, and how they played the game and created strategies. Of the 743 respondents, 121 (16%) did not answer Q19 at all, and 163 (22%) gave a response containing fewer than eight characters. Seventy-nine responses were longer than 50 words, with one response being 577 words long. The shortness of responses and the “bittiness” of the data made the identification of patterns challenging. Two of the authors coded the entire corpus of the free-text responses, using a coding frame developed inductively and iteratively after discussion. The coding frame represented the topics and subtopics identified in the responses (see “Appendix 2”). Most responses were related to multiple codes. For example, a response could be associated with “Engagement” as well as “Strategy Making.” To validate the coding consensus between the two researchers, we followed the proportion agreement method by Campbell et al. [33], finding an agreement of 70%. Additionally, we used Mezzich’s Kappa [34] as an extra reliability check, finding an agreement better than chance (p < .05).
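As an illustration, one plausible formulation of proportion agreement over multi-code annotations is sketched below; the coder data and code labels are hypothetical and do not reproduce our coding frame or the exact computation in Campbell et al.:

```python
def proportion_agreement(coder_a, coder_b):
    """Proportion agreement for multi-code annotations: codes assigned by
    both coders divided by codes assigned by either coder. Each argument
    is a list of sets, one set of codes per free-text response."""
    agreed = sum(len(a & b) for a, b in zip(coder_a, coder_b))
    assigned = sum(len(a | b) for a, b in zip(coder_a, coder_b))
    return agreed / assigned if assigned else 1.0

# Hypothetical codings of three responses by two coders
coder_a = [{"Engagement", "Strategy Making"}, {"Learning"}, {"Engagement"}]
coder_b = [{"Engagement"}, {"Learning"}, {"Engagement", "Game Design Suggestions"}]

agreement = proportion_agreement(coder_a, coder_b)  # 3 shared / 5 assigned = 0.6
```

A chance-corrected statistic such as Mezzich’s Kappa would additionally discount the agreement expected from the marginal code frequencies.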

Results

Sociodemographic characteristics of participants

We obtained responses from 817 participants (3.2%) out of a total of 25,587 people who received the invitation to participate in the survey. All 817 participants completed the first part of the survey, of whom 743 also completed the second part. Eighty-two percent (N = 607) of the respondents were male and 16% (N = 116) were female; 2% (N = 20) of the respondents did not disclose their gender. The most common age bracket was 26–35 years (N = 193, 26%) (Fig. 2). The majority of participants stated working full time (N = 310, 42%) or studying at school or university (N = 268, 36%) (Fig. 3). Lastly, 80% (N = 660) of participants reported playing games in general, while the remaining 20% reported that QM was an exception.
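The percentages reported here are simple ratios; a minimal helper for reproducing them (illustrative only) might look like:

```python
def pct(count, total, ndigits=1):
    """Share of `count` in `total`, expressed as a rounded percentage."""
    return round(100 * count / total, ndigits)

response_rate = pct(817, 25587)        # survey responses over invitations sent
male_share = pct(607, 743, ndigits=0)  # among respondents to part two
```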

Fig. 2

Distribution of respondents according to age and gender

Fig. 3

Distribution of answers to Q6. Reasons why people played QM

QoE and units of meaning

We summarized the frequency of topics identified in the responses to Q19. Although most of the answers described more than one topic and subtopic, several key topics and subtopics predominated, as depicted in Tables 2 and 3 (for a full view of the codes and units of meaning identified in the analysis please refer to the “Appendix 2”).

Table 2 Frequent topics found in the responses to Q19
Table 3 Frequent subtopics found in the responses to Q19

Additionally, we performed a co-occurrence analysis on matching pairs of topics. As Table 4 shows, “engagement” and “strategy making” co-occur relatively often (23 times in total), suggesting that the players who enjoyed the game the most were also those most focused on finding and creating strategies to solve the challenges of the game. As a close runner-up, the co-occurrence of “engagement” and “game design suggestions” reflects the fact that players who engaged either positively or negatively with the game made suggestions for improving it.

Table 4 Most frequent co-occurring topics on the surveys
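Co-occurrence tallies of this kind can be produced by counting unordered topic pairs per coded response; the sketch below uses hypothetical codings, not our survey data:

```python
from collections import Counter
from itertools import combinations

def cooccurrences(coded_responses):
    """Count how often each unordered pair of topics is assigned
    to the same response."""
    pairs = Counter()
    for codes in coded_responses:
        pairs.update(combinations(sorted(codes), 2))
    return pairs

# Hypothetical coded responses
coded = [
    {"engagement", "strategy making"},
    {"engagement", "strategy making", "game design suggestions"},
    {"learning"},
]
table = cooccurrences(coded)  # e.g. ("engagement", "strategy making") -> 2
```

Sorting the codes before pairing ensures that the same two topics always map to the same key regardless of assignment order.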

Finally, using thematic analysis we identified recurrent topics and subtopics in the responses to Q19. As noted earlier, the shortness of responses to this question and the “bittiness” of the data, rather than a longer narrative, made the identification of patterns challenging. However, we identified three broad, cross-cutting themes at the socio-psychological level, which are reflected in the results of the coding: (a) engagement; (b) learning; and (c) contribution to science. We found that aspects of the participants’ reported experience overlap across these themes. This is consistent with the complexity of how understandings and meanings of experience are constructed: they are built not on isolated concepts but relative to each other. To illustrate each theme, we selected a sample of poignant and representative responses from the survey.

Engagement This theme is defined here by how players perceived: their involvement with the game in terms of time, effort and interest; the balance between their skills and the game’s challenges; their feelings when playing the game and about their achievements (or lack thereof); and their appreciation of the game’s design features, its purpose, and whether it was valuable to them. The analysis of responses indicates that a positive (or negative) experience relates primarily to aspects of engagement, which, in turn, is often associated with the ability to play successfully, the actions chosen to perform well, and how well the design of the game is perceived. The following four responses illustrate how engagement embodies both negative and positive types of experiences. In the following response, the co-occurrence of positive engagement and game design suggestions indicates the player’s interest in improving the game through the lens of his/her own experience and in providing player support.

R01: “I liked the simplicity in the display and interface, if I were given the opportunity, i would try to add another layer to the game, would a 3d model be more accurate than a 2d in the quantum world? Perhaps more examples of quantum mechanics through mini games for players to better understand the attributes associated with the ‘liquid data.”

In the following excerpt, perceived enjoyment is associated with the challenge of the game, although the player feels being hampered by his/her limited knowledge of quantum physics:

R02: “I liked the challenge of the game. I think more help or explanation could be given early on with the option to read about how Quantum Moves relates to quantum physics. Explanation would be nice but I do concede I may not fully understand the physics anyway, that is why you made the game. [I] tried out a couple different strategies. Mostly i thought speed, displacement affected the movement. I considered playing around with oscillating the mover but I couldn’t find a easy to approach it systematically.”

This response is grounded in the observation made by the player about his/her poor understanding of physics. The player struggled to reconcile his/her appreciation for the game (“I liked the challenge of the game”) with the unsuccessful experience of trying different approaches based on a limited understanding of the game mechanics. Knowing quantum physics is not a prerequisite for playing Quantum Moves; however, learning a bit about the science can enrich the gameplay experience, as the player in R03 stated:

R03: “I liked being able to experiment and move around in random ways to see how it reacts, and I found the game more enjoyable after learning a bit more about quantum physics and having a better understanding of what i’m doing. Though I would like to see a more clean and simpler interface.”

This player used trial and error, the most frequent strategy reported by respondents (7.46%). Players seem to resort to this approach when they do not understand the rules of the game. When using trial and error, players proceed mainly by exploring and manipulating elements of the game in an effort to sort out possibilities and to run across steps that might carry them closer to the solution. This behavior is most likely to be reported when players lack knowledge about the character of the solution, or when no single rule seems to underlie the solution.

Responses often offered an assortment of experiences, linking one or more elements of appreciation with unmet expectations. In R04, the player appreciated the representation of a complex problem in what s/he considers a casual game, but s/he ended up quitting the game early on:

R04: “It conveyed a complex problem with a simple visualization, the graphics were complex enough to give a sense of ‘high tech’ but still not distracting. I think it did a fantastic job in presenting the issue(s) and made the challenges compelling for a casual game. Not fun enough to return more than once since the task was quite monotonous and ‘special mindset’ type of problems.”

Learning This theme encompasses three main strands of learning identified in the responses: learning about the game (subsuming both learning about the purpose of the game and how to play it), learning about quantum physics, and learning and developing one’s own skills. The theme is defined by how well players with various levels of experience and skill could approach and play the game, and how they felt about the opportunity to learn something about quantum physics and how the game works. The following five responses illustrate the three strands in the theme. For example, R05 displays an appreciation for the challenge the player faced when playing the game, which s/he saw as an opportunity to learn to be creative and use deductive reasoning, without much scaffolding:

R05: “What I really liked about the game was that it takes some time to really figure out how to complete a level. And that the instructions gave you just a little hint about it, but the rest is about yourself, creativity and deduction skills.”

In contrast with this response, the player in R06 perceived an imbalance between the complexity of the game and the support provided, resulting in a steep learning curve:

R06: “Both too simple tutorials and challenging game, too steep learning curve.”

In most of the answers under the learning curve label (4.9% of the total answers), players reported being stuck in the game because the curve was not adapted to their abilities. Within this set of responses, we identified several instances of frustration resulting in quitting the game. We also found some instances of enjoyment of the challenge leading to learning about a difficult topic, as stated eloquently in R07 and R08:

R07: “It was really challenging and I loved that aspect. I like problem solving and trying to come up with new ways to solve problems. I also liked learning more about quantum physics so that was really a bonus!”

R08: “I liked the steep learning curve and how it creates insight in a visual way in quantum mechanics.”

When the challenges associated with the learning curve did not result in frustration, they were turned into appreciated learning opportunities. For example, in R09 the player engaged in “discovery learning” [35, 36], drawing on his/her own experience as a method through which s/he interacted with the game by exploring and manipulating objects and wrestling with strategies.

R09: “I liked the process of creating and refining strategies to improve the score on the level. I would try something weird to see what happened, then repeat the level while slightly modifying the strategy to see what change that may have made. I discovered that smooth moves are much better than sharp, jerky moves, and that any change in the speed of the mouse would induce some ‘waves’.”

Contribution to science This theme is defined by players’ reported intrinsic interest in contributing to science, how clearly players perceived the scientific purpose of the game and how they felt about the interaction with scientists at SAH. The analysis of responses indicates that the interest in contributing to science is often combined with aspects of engagement. We discuss a sample of four responses to illustrate this theme. Of the respondents, 5.9% expressed their interest in contributing to science, as this player stated clearly:

R10: “I heard of this research on the NPR Program Science Friday. I love the idea of crowdsourced research and have a laypersons interest in the theories of quantum physics.”

To feel part of the scientific project, some of these respondents also asked for access to the gameplay data, periodic reports, and the ability to communicate with the scientists at SAH, as the following two quotes illustrate:

R11: “To improve the game experience for me would be to have periodic reports on the research and any insights, progress or problems made regarding the objectives. This would enhance the feeling of being part of the research which is the main reason I played.”

R12: “I like seeing the data being collected, it made it clear that i was actually doing something which could be analyzed. I would have liked to know more about what was being collected, though.”

Responses also revealed that the interest in contributing to science could be discouraged by a lack of understanding of the relationships between the game and its scientific purpose, which can result in players quitting the game. Below is a selection of relevant answers:

R13: “Explain how the game works, make a link with the part of physics which it concerns, it was all a bit unclear what is really all about. It worked for me but have not a clue what you accomplished with all data that is gathered. The idea to turn to the public is great, but explain more.”

R14: “Fun but frustrating, it would be nice to have a better understanding of how the data helps real life research. Enjoyable game, but I lost interest due to the perceived disconnect from the science behind it.”

The qualitative themes emerging from Q19 were also supported by the quantitative analyses of Q6, Q7, and Q10. The answers to Q6, which inquired into why players chose QM, indicate that the main motivation was the scientific nature of the game, particularly the interest in contributing to and learning from the science behind it (Fig. 3).

In addition, the answers to Q7 highlight the main motivations of the players: contributing to science, learning about the science in the game (including problem solving in a scientific environment), and the challenge posed by the game (Fig. 4).

Fig. 4

Answers to Q7. Motivations of players for playing QM

The answers to Q6 and Q7 provide a cross-sectional view of the three main themes we found as motivators to play QM (Engagement, Learning, and Contribution to Science), further underscoring the relevance of exploring these concepts when researching PX in citizen science games.

Finally, Q10 offers a quick view of the most common ideas/words associated with the game. The 10 most common words (depicted in Fig. 5) indicate that most people found the game interesting, associating it with positive feelings such as ‘good’ or ‘cool.’ However, ‘challenge’ and ‘frustration’ also play a role in people’s engagement.

Fig. 5

Answers to Q10. Most common ideas associated with QM
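A word-frequency summary of this kind can be sketched as follows; the tokenizer and stopword list are assumptions for illustration, and the sample answers are invented, not actual Q10 responses:

```python
import re
from collections import Counter

# Minimal stopword list for illustration; a real analysis would use a fuller one
STOPWORDS = {"the", "a", "an", "and", "but", "it", "was", "i", "to", "of"}

def top_words(answers, n=10):
    """Return the n most common non-stopword tokens across free-text answers."""
    tokens = (w for text in answers
                for w in re.findall(r"[a-z']+", text.lower())
                if w not in STOPWORDS)
    return Counter(tokens).most_common(n)

# Hypothetical Q10-style answers
sample = ["Interesting and fun", "fun but frustrating", "cool, challenging, fun"]
top = top_words(sample, 1)  # most frequent word across the sample answers
```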

Discussion

Given the limited literature on the QoE of online citizen science games, we engage mainly with relevant literature in game studies to interpret our results.

Main themes

While the initial aim of our study was to explore the PX of QM in order to improve the game design, our findings prove compelling for understanding gaps in QoE studies of online citizen science games. We identified three main themes (Engagement, Learning, and Contribution to Science) that help to understand this experience. With regard to engagement, multiple interrelated factors, including game design (particularly levels and features, the appeal of the task, and the strategies that players can choose within the game), appear to influence longer-term engagement. These findings were quantitatively supported by the answers to Q6 and Q7 (see Figs. 3, 4), and resonate with Tinati et al. [37], who found that the memorability/interestingness of subjects, speed, task difficulty, and the amount of feedback provided influence sustained engagement. New or improved game design features for players to use or to overcome seem to influence longer-term engagement [38]. For example, one player’s request for “more examples of quantum mechanics through mini-games for players to better understand the attributes associated with the ‘liquid data” indicates a wish for re-design, turning a complex problem into smaller problems [39] without reducing the importance of the project’s scientific purpose.

In several other instances of the survey responses, players suggested the implementation of new or improved features for enhancing the gameplay experience, including adding sound or introducing a story into the game. Our interpretation of the co-occurrence of engagement and game design suggestions found in the data is that features influence the perception of the interestingness [37] of the game. The finding that QM is perceived as monotonous and uninspiring by several players is consistent with Prestopnik and Tang’s [26] argument that citizen science tasks can sometimes be mundane or repetitive. Making challenging scientific tasks interesting, worthwhile and achievable is critical for a successful citizen science system [40], as it affects players’ decisions to leave or stay.

Our findings also indicate that the strategies used by players seem to relate to the type of engagement and to players’ decisions to stay. The prevalent use of trial-and-error can arise from various sources. For example, it can arise from the perception of the game as “hard,” in the sense that several choices appear to be good even when they are not [41]. Instances of player responses indicate that this approach is more common when players experiment with various options in the game. Nonetheless, trial-and-error has proven to be a challenging concept to define, as the term can carry two different meanings. The first [42] refers to a learning process involving inductive reasoning: the person interacts with the problem as a way of probing it, analyzing the results of their interactions to form a mental model and infer effective rules for solving the problem. The second refers to trial-and-error as solving a problem by ‘brute force,’ i.e., trying every possible combination until arriving at a solution, without reflecting on or learning from the attempts. Unfortunately, in the current research, we could not discern which of the two connotations players’ responses corresponded to, due to the short length of the units of analysis. More research is required to clarify this aspect.
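To make the distinction between the two connotations concrete, they can be sketched as two search strategies over a toy combination-lock puzzle. This is purely an illustrative example of our own construction, not drawn from QM; the puzzle, the feedback oracle, and all names are hypothetical.

```python
from itertools import product

SECRET = (3, 1, 4)  # hidden 3-digit combination (toy stand-in for a game puzzle)

def feedback(guess):
    """Oracle: how many positions of the guess are correct."""
    return sum(g == s for g, s in zip(guess, SECRET))

def brute_force():
    """Second connotation: enumerate every combination until one works,
    learning nothing from failed attempts along the way."""
    for guess in product(range(10), repeat=3):
        if feedback(guess) == 3:
            return guess

def inductive():
    """First connotation: probe one position at a time, keep what the
    feedback confirms, and so build up a partial model of the solution."""
    guess = [0, 0, 0]
    for pos in range(3):
        # Vary only this position and keep the digit that scores best.
        guess[pos] = max(range(10),
                         key=lambda d: feedback(tuple(guess[:pos] + [d] + guess[pos + 1:])))
    return tuple(guess)
```

Both strategies reach the same solution, but the inductive version needs at most 30 probes while the brute-force version may need up to 1000, which illustrates why the two connotations describe qualitatively different player behavior.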

When examining learning more closely, we found that learning about the game, in particular learning how to play QM, can be “hard” for several players. Players reported that the learning curve can be steep, driving away players who do not want to invest much effort in the game, as well as players who are interested in the game and may be willing to make some effort but get frustrated because of their low level of ability. Our analysis suggests the importance of achieving a feeling of flow [11] by balancing the skills required to solve a task, the rewards from solving it, and the continuous improvement of the skills required to solve it. The results also indicate that in-game support such as tutorials or guided task interfaces could influence the choice of strategies by equipping players with a better understanding of the game mechanics, which could further increase interest in the game and support longer-term participation.

Several players were interested in learning more about quantum physics but did not report whether they thought they had learnt something about it. These findings were supported both by the qualitative thematic analysis and by the quantitative answers to Q6 and Q7 (see Figs. 3, 4). According to Wiemeyer et al. [7], game developers should consider whether players may actually be interested in learning a subject, for example the environment of a serious game, because it may add to the attractiveness of the game. It should be noted that teaching quantum physics is not at the forefront of the design objectives of QM, and our results do not provide enough evidence of whether, through gameplay, players concurrently develop an understanding of the science behind the game. On the contrary, our data provide evidence of players who did not understand whether they were engaged in a scientific practice. We contrast this result with the study in [43] of Foldit, a citizen science game that gets people to understand protein folding through a collaborative puzzle-solving approach (www.fold.it). Foldit could be an entryway for some players into the science of proteins as well as into scientific practice. This may arise from the fact that the embodied activity of playing Foldit mirrors one of the activities in a biochemistry lab [43]. In QM, players who enter the game are unlikely to learn the science behind the game and struggle to understand its scientific significance. Arguably, this could be related to the complex nature of quantum mechanics: QM cannot afford to mirror the ‘real’ quantum behavior; instead, it provides a simplified representation of the problem for players while keeping the difficult components inside for the physicists. While QM does not constitute an entryway into quantum physics, it is an opportunity to develop skills, use intuition, and think outside the box.
This is especially evident in those responses where players report experimenting with the game features to see what they can achieve. The appreciation of this opportunity for trying and discovering things complements the enjoyment of the mental challenge created by the game and the pleasure of seeing improvements in one’s skills over time.

As regards contributing to science, this interest is an important intrinsic motivation for QM players. This is particularly visible in the answers to Q6, Q7, and Q10 (see Figs. 3, 4, 5). While this anticipated result is in line with previous studies [2, 22, 24, 44], we emphasize here that this aspect can help enrich the experience of a serious game. The use of games in citizen science suggests that game designers tend to rely on players’ civic zeal to trigger the excitement of ‘participatory science’ and the rewarding feeling of achieving something in a field considered prestigious by the general public [45].

Player motivation

Based on the analysis of the three themes, we present two main aspects that need consideration for improving the design of QM: contribution to science and extrinsic rewards.

Contribution to science When players make an effort to contribute to science, they wish to know what happens “behind the scenes,” whether to better understand the scientific problem they are asked to solve, to understand the extent of their contribution, or to help improve the game (Tables 2, 3, 4). We suggest the creation of an online community (e.g., a Facebook or Reddit group) where the scientific team and developers participate actively in the discussion about the scientific background of QM, the way players’ contributions help achieve the specific scientific goal, and the upgrades that can be implemented to make the game more fun. Additionally, we suggest involving players in scientific outreach, by reporting via newsletters how they have helped achieve the scientific purpose.

Extrinsic rewards Some elements, such as leaderboards, help engage certain types of players, but do not appeal to every player. Moreover, being an artificially introduced motivator, their effect quickly wears off, so they do not engage players over long periods of time [26, 46]. However, in our research, we found that game elements can be engaging, to a certain extent, for those players whose main motivation is not to help science but to have a game experience. The responses to the free-text question of the survey suggest that these players were not only happy with the game elements used in QM (i.e., leaderboards and achievement stars) but also asked for more such elements (e.g., unlockable content). In this case, the players’ qualitative experience contrasts with other findings [28, 46, 47] reporting no correlation between game elements and player motivation or performance. Nonetheless, these suggestions are valuable for game designers because such elements seem to give players a subjective feeling of progression, appropriation and agency, and engagement with the community, which might contribute to long-term engagement with the game.

Further steps

Although this exploratory study contributes to understanding the QoE of QM (and is therefore limited to specific experience and demographic data), we consider some of our findings to be an important stepping-stone for understanding the QoE of other citizen science games. A further study mapping experience data onto player demographics would allow a richer representation and optimization of aspects of the QoE of a single player or of a group of players with similar characteristics. In relation to the generalizability of our results, two other limitations need to be addressed. First, as the focus of this survey was on the experience of players participating in QM, our findings are informed by, and limited to, the respondents who completed the survey. These respondents were individuals who were interested in this study and may not be representative of the entire population of players. Arguably, our results are somewhat biased towards players who liked the game and/or enjoy participating in citizen science projects. Second, our results are based on the analysis of a single game, although we have compared them with, and shown similarities to, findings from previous studies of citizen science projects.

The reach of our current research allows us to point to specific aspects that are particularly present in the QoE of citizen science game players and which are not present in other types of games such as commercial or learning games [48, 49]. Acknowledging the presence of these topics in citizen science games can improve the construction of future scales or subscales used in PX evaluation. Construction and standardization of these scales will allow for (a) a better evaluation of citizen science projects in the aspects of engagement, learning, and contribution to science; and (b) designing and developing better tools for engaging citizen scientists as a particular population type.

As further research on the subject, we suggest that our findings be re-tested using standardized questionnaires [7, 49]. We also suggest applying our coding schema (see “Appendix 2”) to the analysis of open questions in other citizen science games, as a way to assess the occurrence of the three cross-cutting topics found in the current research (i.e., engagement, learning, and contribution to science), as well as of other possible topics particular to citizen science games.

Conclusion

We designed this survey study to investigate participation in, and the QoE of, QM, an online game built around an optimization problem in quantum physics. Understanding how players participate in and experience a citizen science project provides another lens for examining such projects. We found that most of the contributions were provided by a small share of players, although several players returned to QM after long periods of absence. Regarding the QoE, our findings indicate that the perceived QoE goes beyond fundamental design issues [50] to include affective aspects, such as having fun and feeling rewarded for learning something and for contributing to science, as well as other emotional responses, such as frustration at not progressing in the game. These results highlight the importance of design interventions that increase both the perceived and the factual effectiveness and efficiency of task performance. The perceived effectiveness and efficiency are crucial for the interaction and thus for the quality of experience, while creating something that players find meaningful and scientifically relevant influences both the attraction and the retention of players. The factual effectiveness and efficiency are important for generating more and better-quality scientific data. The trade-off between collecting sound scientific data and ensuring players a rewarding experience has been recognized as a challenge not only for designers of serious games but, more generally, for all citizen science projects started by domain experts [51].

In addition, by awakening the curiosity of people wanting to try a different type of game and by attracting those who wanted to help science, the game drew a broad audience and proved to be a powerful tool for collecting scientific data. Data collected in this way were rich both transversally (many different people playing) and longitudinally (a few players playing for a long time and trying to find the best solutions) [4]. The findings of this research will help not only the scientists behind QM to improve the QoE of the game, so that more and better answers can be generated, but also future studies using citizen science games.