1 Everyday Worlds as “Wicked Problem” for Robotics

Dealing with ‘real world problems’ poses a number of challenges. The first and foremost is a crucial change within the object area of robotics: humans and their interactions become part of the problem of making a robot work. Until the 1990s, humans were either a visionary reference point or a limiting condition for robotics; they were mostly considered a safety risk. But the problems that arise when robots are deployed in everyday worlds cannot be understood solely as technical challenges, e.g., in terms of obstacle avoidance. Leaving the factory buildings and laboratories does not only add a new set of tasks for the machines; it shakes the scientific (sociologists of science would say: epistemic) foundations of the field.

Making robots work in everyday worlds does not only challenge the theories and methods of robotics; it also demands a new understanding of the role of the roboticist. By aiming at the actual use of robots in everyday life, robotics suddenly becomes a discipline like architecture or urban planning, in which scientific, engineering, political, social, and aesthetic expertise meet. Robotics now shares with architecture the same resistant, some say malicious [Rit73], kind of problem: human activity in socio-technical systems is hard to operationalize and predict. Social situations and human(-robot) interactions can only be incompletely specified in technical and scientific terms, and the factors and actions that become effective are not always foreseeable. The demands of human-robot interaction are, moreover, perspective-dependent, that is, subject to the interpretations of people, which often differ from the expectations of the designers and engineers. Finally, every action of a robot or a roboticist leads to reactions by the addressed users, so the roboticists’ expectations about the expectations of future users may directly influence the resulting human-robot interaction.

Roboticists are forced to operationalize this social complexity into machine language. There is a stark contrast between the phenomenon of human-robot interaction in everyday worlds, which is difficult to standardize, on the one hand, and robotics as an established set of problem-solving strategies that relies on standardized procedures, on the other. Other domains of Computer Science, like software development or ubiquitous computing, have drawn on the concept of “wicked problems” to describe this tension [DeG90, Cou08].

“Wicked problems” demand types of knowledge, skills, and perspectives that have not been part of robotics’ self-concept, such as empathy or the ability to bear ambiguity and the contingency of interpretation. Previous studies on robotics for everyday life have identified a lack of conceptual and practical approaches for dealing with the “wickedness” of constructing robots for everyday worlds. The conceptual gap of robotics [Lin16] becomes obvious when compared to sociological and social-psychological perspectives. In summary, the criticism is that the (mostly implicit) social theories within robotics miss essential aspects of social interaction. Lindemann names the significant role of expectations for successful human-robot interaction and the indexicality of communication, both factors that are difficult to formalize [Lin16]. On a broader level, there is a strong tendency to conceptualize human-robot interaction as a dyadic exchange between two entities on the micro level [Mei14, Hoe13]. This mostly cognitive approach to human-machine interaction, corresponding to “the second paradigm of HCI” [Har07], reduces the specific complexity of the everyday world: the focus on cognitive factors marginalizes the complexity of the numerous, interacting, and context-dependent social factors of everyday worlds. Dealing with this complexity is the core challenge for robotics in everyday worlds [Mei14].

In the following sections, I want to contribute to this challenge by discussing a methodological aspect of robotics research and development that is largely underexposed: the crucial role of robotics’ and roboticists’ own (inter-)actions towards and in everyday worlds. Their actions and interpretations do not only become effective once a robot is deployed, but already at the very beginning of research and development. Therefore, I will argue for a reflective stance within robotics towards the goals of research (Sect. 2), the everyday knowledge of the researchers (Sect. 3.1), and the way they foster expectations about robotic behavior (Sect. 3.2).

2 How Does Robotics Define Social Goals?

It is rarely discussed publicly how the goals and desired effects of robots in everyday worlds are defined. Research from Science and Technology Studies shows that the definition of goals in robotics research and development is mainly influenced by two factors: culturally shared “imaginaries” of robots on the one hand, and the conditions and constraints of research funding on the other. Šabanović’s ethnographic study has shown that scientific theories and engineering problems are only a minor resource for the definition of objectives within social robotics. Instead, everyday experiences, stories, and symbols, especially from science fiction, played a decisive role and thus provided a common ground for researchers, the public, and funders [Sab07]. Studies comparing the culturally shared images of robots in Japan and in the U.S. or Europe report similar findings [Lei06, Wag14]. The underlying concept of “imaginaries”, collectively shared ideas that influence researchers as well as funding institutions, has a long tradition in Science and Technology Studies [Boe14]. Imaginaries are known to coordinate the communication of different groups of actors, e.g., the public and funding institutions [Roe08], and even the practices of researchers [Gie07]. Characteristically, imaginaries interlink the technical feasibility and the societal desirability of a technology, for example by promoting a politically desirable idea as technically feasible in order to justify investments and funding [Jas09].

When looking at robotics research and development funding programs like the EU’s SPARC [Noe14] or the National Robotics Initiative in the U.S., it becomes evident that robotics funding is fueled by such claims of political desirability [e.g., Kro14]. Major research funding programs for technology development, for example in the field of Ambient Assisted Living, refer directly to this discursive figure. In these funding programs and the accompanying speeches, press releases, and workshops, robotics is described as an instrument to control and compensate for societal developments such as demographic change. This recurrent intertwining of promised solutions and technological development has direct consequences for how robotics research and development sets its goals.

The classical epistemological path of producing knowledge in science is thus reversed. Robotics’ orientation of research and development can be called “post hoc” [Kno84]. The aim of this kind of work is not so much to discover new solutions as to successfully implement the previously defined solution “robot application”, as Meister has pointed out for service robotics [Mei11]. Whether these objectives defined upfront are actually relevant for everyday worlds is rarely a decisive criterion for setting the goals of robotics research and development.

The most problematic aspect of this framing is that its effect on actual research and development practices remains implicit. The instrumental reference to everyday worlds as a “context of application” obscures how users and usage scenarios are configured by objectives defined upfront [Woo90]. This blind spot is methodologically and ethically problematic. If robotics does not explicitly reflect on and discuss the implications of its goals, it runs the risk of becoming a self-fulfilling prophecy of the imperative of usefulness, rather than building robots that actually fit into everyday worlds.

The following questions provide a rough guide to reflect on the origin and function of goals in robotics research and development and can easily be applied to any kind of project:

  • Who defined the goal of R&D/task for the robot?

  • When was the goal defined: before or after contact with the addressed users?

  • Is there valid empirical evidence that this goal addresses actual needs of users in everyday worlds?

3 Silent Contributions by the Researchers and Engineers

Human-robot interaction and other branches of robotics have adopted a plethora of methods and methodological approaches for researching everyday worlds and users’ interactions with machines. Methods like ethnographic observations of field tests, in-depth interviews with users, or acceptance tests with focus groups are particularly suitable for dealing with the “wicked” nature of building robots for everyday worlds. More quantitative, lab-oriented methods, like usability or task tests in closed environments or questionnaires on the perceived quality of human-robot interaction, usually do not lack explicit methodological reflection either. Most of these methods, adopted from fields like psychology, anthropology, or even design research, have evolved to a comparably high standard of methodical reflection.

However, there is an unseen area of robotics research and development that is only unsatisfactorily controlled or methodologically reflected: roboticists’ actions towards their research objects and subjects. By this I mean two sets of activities that are not considered scientific or academic at all, yet have a great impact on the demands and expectations towards robots in everyday worlds. The first are the everyday activities and everyday knowledge of the researchers, which implicitly become a resource for understanding and defining demands (Sect. 3.1). The second are activities of staging robot behavior and thereby actively creating expectations of robots’ capabilities (Sect. 3.2).

3.1 Everyday Knowledge as Ambivalent Resource

It should come as no surprise that everyday knowledge is a resource for making robots work in everyday worlds. Relying on one’s own observations, implicit knowledge, and embodied abilities in order to achieve a fit between machines and everyday scenarios is an obvious and promising strategy. By doing so, roboticists themselves become instruments of robotics requirements engineering: they draw on experiences and expertise gained as everyday people. This has been confirmed in many cases during my visits to robotics laboratories in the U.S. and Europe [Bis17]. Particularly when researchers talk about their motivation and aptitude for working in the field of social robotics, it becomes obvious that everyday occurrences in their lives have epistemic value for their work. However, these everyday methods and this knowledge are most often neither documented nor questioned, which is problematic.

An example of such everyday methods is “lay ethnography”, by which I mean observations of everyday life carried out by roboticists themselves. For example, one researcher followed people through a university building in order to find out which floors students and staff typically use and what goals they have. Although the roboticist does not consider such an activity a proper scientific experiment, it still helps him or her limit the space of solvable problems and determine the further course of the robotic project. (The specific case retold here aimed at building a service robot that fulfils simple tasks in the university building.) The strategy applied by the researcher is typical of laymen’s ethnographies in robotics: “Let’s just see how people do it.”

These lay ethnographies differ from methodically controlled ethnographies in their purpose: lay ethnographies are a means of generating meaning within the researchers’ work [Wee06]. They are rarely discussed outside the laboratory, presented at conferences, or contested by competing interpretations or further empirical material; lay ethnographies serve only themselves, so to speak. They are not meant to be transformed into a discourse that explains and contrasts the researcher’s own understanding, as professional ethnographies are required to do. Without this reflective element, the everyday observations of researchers miss the core analytic quality of ethnography [Dou06]. Lay ethnographies are a rather descriptive method that does not question the point of view of the observer, yet precisely this questioning would be essential for an ethnography to be valid. There is a long tradition in Science and Technology Studies of analyzing and criticizing this “I-Methodology” of engineers and designers [Akr92], which consists in extrapolating knowledge from their own point of view.

3.2 Creating Expectations About Robots

Most robotics research teams develop a kind of “demonstration routine”. There are plenty of occasions where roboticists are asked to present their work, such as open house days for the public or funders, events to attract new students, competitions, or science fairs. Through these regular presentations of their machines, most roboticists gain a feeling for the expectations of the audience, the effects of certain behavioral patterns, and the necessity of props and scripts.

The “YouTubization” of research [Bot15] is an expression of this important role of staging robots’ behavior in robotics. The creation and circulation of “demo videos” of robots are almost obligatory for robotics research. For a “demo video”, the behavior of the robot is usually presented in a scripted scenario in which a human demonstrator or narrator comments on the machine. These presentations serve to illustrate the feasibility of a technical solution or the fault-free operation of a prototype [Ros05]. Such videos are omnipresent in robotics: they can be found on researchers’ websites, are used in lectures and conference talks, become part of publications, and circulate on YouTube, Facebook, and tech blogs as well as in researchers’ mail inboxes.

This popularity has inspired a growing body of research on the role of demo videos in research and development [Ros05, Suc07, Suc11, Suc14, Win08, Bot15]. Winthereik et al. [Win08] examined the staging of future uses in such videos. They found that the videos first and foremost serve the “life-world of the demo”, the places and occasions at which the videos are shown: above all, they are a tool for connecting to the discourses of potential industrial cooperation partners, marketers, politicians, colleagues, and user groups. Both highlighted the expressional element in the researchers’ activity of staging robotic behavior along the aesthetic conventions of video clips [Bot15]. Demo videos are thus also an expression of the identity constructions of the researchers, though this does not mean that the researchers depict themselves in the videos. Instead, the focus is almost entirely on the performance of the machines. This can be interpreted as a successful representation of the original goal of new robotics: to build machines that are suitable for everyday use. The videos and their production then aim to provide proof of the robots’ efficiency.

These practices of staging clearly include an interaction between the researchers and the addressed users. “Demo videos” are shaped by expectations and, in turn, shape expectations of human-robot interaction. Robotics does not just rely on cultural imaginaries of robots’ capabilities; it shapes them too. The staging of robot behavior that is not yet technically feasible is part of the negotiation over a technology’s potential and the purposes for which it might be used [Win08, Lat05]. Many researchers in the field are well aware that they contribute to a changing experience of sociality by developing their machines and creating presentations of them [Dau98, Tur17]. However, the implications and conditions of this change, and the roboticists’ share in it, are not explicitly reflected on and discussed in robotics research and development.

4 Conclusion: Towards a Reflective Methodology of Robotics for Everyday Worlds

In this text, I have pointed towards some critical practices of roboticists. However, I did not do this to discredit or de-legitimize the goal of building autonomous machines or support systems. Instead, the empirical findings of Science and Technology Studies and the Sociology of Science are meant to improve the ability of engineers and scientists to include themselves in the equations they make about everyday worlds. Engineering robots for everyday worlds has been described as a “wicked problem”: the objects in this area are so complex and changeable that there are no standardized solutions for treating them. Instead, addressing “wicked problems” essentially depends on how the developers themselves formulate and define the problem [Rit73].

Besides psychological theories and effects, robotics for everyday worlds should focus on the dynamics of social worlds and the diversity of perspectives of different stakeholder groups, including roboticists themselves. New theories and methods require a different understanding of the relationship between researchers and their ‘objects’ of investigation, which are alive, interpreting, responding, and involved.

The proposal of this article is to take a reflective stance on practices and assumptions within research and development, even when they seem mundane or marginal to the goal of making the robot work technically. As we have seen, within everyday worlds the fit between human and machine goes far beyond technical problems. It is also a matter of culturally shared meaning, contested definitions of usefulness, and the power to define goals. Thus, making robots work in everyday worlds requires heuristics that go far beyond ‘neutral’ measurements and technical modeling. The personal contribution of roboticists can neither be ignored nor eliminated from the task of building robots for people.

This extends to all creators of technical support systems, although their specific technical and social challenges may differ. Overarching fields like Human-Computer Interaction and concrete domains like wearable technologies or smart homes share the problem of mechanizing and digitizing practices and situations that are wicked in the sense described above. Engineers and designers of support systems have thereby been assigned a leading role in the overall process, and not only with regard to the construction of the artifact proper: their products embody social relations, for example power relations, and they are responsible for reflecting upon this. This transcends simple categories like ‘intended’ or ‘unintended’ effects of technology use. It requires critically examining technical development as a process and understanding whether it is biased in a particular direction or which social interests it favors. The process of technological development is critical in determining the politics of an artifact [Win80]; hence the importance of incorporating all stakeholders instead of viewing the actions and beliefs of scientists and engineers as somehow external to the social and normative dimensions of support systems.