
1 Introduction

Educational robots can play different roles, such as helping children to learn basic algorithms by programming the robots themselves [1]. In our HCI lab, we carried out a co-design activity with children aimed at devising an educational robot called Wolly [4]. The main goal of the robot is to act as an affective peer for children: hence, it has to be able to execute a standard set of commands, compatible with those used in coding, but also to interact both verbally and affectively with students. We are now working on controlling Wolly by means of a standard visual block environment, Blockly, which is well known to many children with some experience in coding. However, we would also like to have a simpler set of instructions, specifically designed for Wolly, so that children can use basic commands to control its behavior.
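As a concrete illustration, the following is a minimal sketch of how a Wolly-specific command could be exposed as a custom block through Blockly's JSON block-definition API. The block types, labels, colours and the wolly.* runtime helpers are hypothetical examples introduced only for illustration; they are not the command set discussed in this paper.

```typescript
import * as Blockly from 'blockly';
import {javascriptGenerator} from 'blockly/javascript';

// Two hypothetical Wolly-specific blocks declared with Blockly's JSON
// block-definition API (block names and colours are illustrative only).
Blockly.defineBlocksWithJsonArray([
  {
    type: 'wolly_move_forward',
    message0: 'move forward',      // child-friendly label shown on the block
    previousStatement: null,       // can follow another command block
    nextStatement: null,           // and be followed by one
    colour: 160,
  },
  {
    type: 'wolly_say',
    message0: 'say %1',
    args0: [{type: 'field_input', name: 'TEXT', text: 'hello'}],
    previousStatement: null,
    nextStatement: null,
    colour: 210,
  },
]);

// In recent Blockly releases, generators are registered on forBlock.
// Each block is translated into a call to a hypothetical wolly.* helper
// that would forward the command to the robot.
javascriptGenerator.forBlock['wolly_move_forward'] = () =>
  'wolly.moveForward();\n';
javascriptGenerator.forBlock['wolly_say'] = (block) =>
  `wolly.say(${JSON.stringify(block.getFieldValue('TEXT'))});\n`;
```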

The idea of developing tools that allow children to build structures, mechanisms and behaviors dates back to Resnick’s project on programmable bricks [9], which allowed children to build and program even robots and served as an inspiration for commercial products such as LEGO Mindstorms [6]. Another very common approach to robot programming for kids is the use of block-based visual programming languages, such as Scratch, Blockly, and others. Graphical and visual environments for programming the behavior of robots have also been proposed as end-user development solutions for humanoid commercial robots [8], retail contexts [5], social therapies [3], and more. In order to investigate a child-centered solution for the end-user programming of Wolly, we organized a co-design session with children, which is described in Sect. 2.

2 Background and Experiment

As a first step in the development of the robot, in November 2017 we conducted a co-design session with 25 children, described in detail in [4]. All children were in the third grade, 8 to 9 years old, and had no experience in educational robotics. Following a co-design methodology, they were asked to provide suggestions for some features of the robot: its name, physical appearance, facial expressions, personality and character. Based on the insights drawn from the co-design process, we designed the robot’s appearance and structure (see Fig. 1(a)). In particular, the robot, built using a common hobby robotics kit, is able to move on its four independently motorized wheels and can be controlled through either a web application or a set of Android apps that contact its REST APIs. Its body has been almost completely 3D printed, while its head consists of an Android-based smartphone able to show and perceive emotions, to produce verbal expressions and to understand voice commands.
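To make the control path more concrete, the following is a minimal sketch of how a client application could drive the robot over HTTP. The paper only states that Wolly exposes REST APIs; the base URL, endpoint paths and JSON payloads below are assumptions made for illustration, not the robot's documented interface.

```typescript
// Hypothetical base address of the robot's REST interface.
const WOLLY_BASE_URL = 'http://wolly.local/api';

// Send a single movement command to the robot (hypothetical endpoint).
async function sendMove(
  direction: 'forward' | 'backward' | 'left' | 'right',
): Promise<void> {
  const response = await fetch(`${WOLLY_BASE_URL}/move`, {
    method: 'POST',
    headers: {'Content-Type': 'application/json'},
    body: JSON.stringify({direction}),
  });
  if (!response.ok) {
    throw new Error(`Wolly rejected the command: ${response.status}`);
  }
}

// Ask the smartphone "head" to display an emotion and speak a sentence
// (again, a hypothetical endpoint and payload).
async function sendExpression(emotion: string, utterance: string): Promise<void> {
  await fetch(`${WOLLY_BASE_URL}/expression`, {
    method: 'POST',
    headers: {'Content-Type': 'application/json'},
    body: JSON.stringify({emotion, utterance}),
  });
}

// Example: a tiny scripted behavior combining movement and affect.
async function demo(): Promise<void> {
  await sendMove('forward');
  await sendExpression('happiness', 'I moved forward!');
}
```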

Fig. 1. (a) The Wolly robot; (b) the most frequent command proposals from the children.

As far as interactive features are concerned (see [7] for details), Wolly plays the role of an educational robot that helps kids in coding exercises, giving them suggestions on how to reach their goals and write their code, while at the same time being able to execute instructions such as moving on a chessboard, as other educational robots do. However, since Wolly is also able to interact with kids in a verbal and affective way, we would like to enable children to program its basic behaviors and social interactions, in order to teach them the basics of social robot programming.

Thus, in April 2019 we carried out another co-design session with 24 children (10 females and 14 males) belonging to the same class involved in the first robot co-design, with the aim of eliciting suggestions on the design of Wolly’s basic behavior commands. Children were in the fourth grade of elementary school, 9 to 10 years old. The command co-design activity lasted one hour and was organized in the following phases.

Introduction (10 min): the coordinator presented the activity, introduced three facilitators and answered the children’s questions. Children were asked to draw their proposals for the following robot commands: move forward, move backward, turn right, turn left, stop, repeat a command a number of times, say something, express an emotion (happiness, fear, surprise, disgust, anger, sadness, plus a neutral one).

Ideation (30 min): the kids could draw their proposals with the help of the facilitators, who went around the desks and answered any questions.

Recreation (15 min): the children, in turns and in groups, were invited to interact with Wolly, so that they could appreciate the progress of the robot they had helped to create.

Results and Discussion. The analysis of the proposed symbols was inspired by the observational methods of Bakeman and Gottman [2], adapted to this context. In particular, we borrowed the idea of coding schemes, which we used to categorize the different proposals. Then, we computed the percentages of agreement among children as follows:

$$\begin{aligned} Pa = \frac{Na}{Na+Nd} \times 100 \end{aligned}$$
(1)

where Pa is the percentage of agreement, Na the number of agreements, and Nd the number of disagreements (a minimal computation sketch is given after the list below). We found the following percentages of agreement among the children’s proposals (see Fig. 1(b)):

  • move forward: arrow (Pa = 85.7%), upwards (Pa = 62.2%), containing written direction (Pa = 57.14%);

  • move backward: arrow (Pa = 65.1%), downwards (Pa = 65.0%), containing written direction (Pa = 52.17%);

  • turn right: arrow (Pa = 80.0%), to the right (Pa = 95.8%), containing written direction (Pa = 48.00%);

  • turn left: arrow (Pa = 90.0%), to the left (Pa = 95.5%), containing written direction (Pa = 60.00%);

  • stop: stop symbol (Pa = 37.5%), stop textual instruction (Pa = 31.3%), hand up (Pa = 18.8%);

  • repeat: repetition of the symbol of the command to be repeated (Pa = 43.8%), textual repeat instruction (Pa = 31.3%), cycle block (Pa = 18.8%). Numbers indicating how many times to repeat appeared less often than expected (Pa = 31.3%), while the most frequent symbol was the arrow (Pa = 37.5%), often used as the final part of a block;

  • say something: textarea (Pa = 44.4%), balloon (Pa = 33.3%), microphone (Pa = 22.2%);

  • express an emotion: emoticon (Pa = 89.5%).
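For clarity, the following is a minimal sketch of how the agreement measure of Eq. (1) can be computed for a coding category; the counts in the usage example are illustrative only and are not the raw data of the study.

```typescript
// Pa = Na / (Na + Nd) * 100, where Na is the number of agreements and
// Nd the number of disagreements for a given coding category (Eq. 1).
function percentageOfAgreement(na: number, nd: number): number {
  if (na + nd === 0) {
    throw new Error('At least one coded proposal is required');
  }
  return (na / (na + nd)) * 100;
}

// Illustrative usage with made-up counts: 14 children agreeing on a symbol
// and 6 proposing something else would give Pa = 70.0%.
console.log(percentageOfAgreement(14, 6).toFixed(1)); // "70.0"
```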

Although the children have been coding for two years, the results show that they did not always propose typical coding instructions: e.g., they seem to prefer directional arrows to express movement commands instead of simply writing the desired direction in a block (as Blockly does), thus confirming some already encountered orientation difficulties (e.g., problems in distinguishing left from right, especially when the robot being commanded did not share their point of view, see [7]). Another surprising result is that almost 20% of them represented repetition through a cycle block, while 44% proposed repeating the symbol of the command itself. Although in the fourth grade, the children often proposed solutions typical of pre-school age, preferring pictorial symbols (often containing the textual instruction) to blocks containing only textual instructions, thus showing that they have not completely internalized abstract concepts.

As future work, we will re-propose the same co-design approach to children with little or no experience in coding, in order to compare the results. Then we will implement the most shared and suitable command proposals in a drag-and-drop interface and test the approach in the wild. We will also add other commands, such as moving the robot’s head and changing the voice volume and utterance, in order to provide increasingly refined control over the robot’s social behavior.