1 Introduction

The number of robots used in homes has been increasing. Domestic robots need to be used continuously in the home, so their acceptance by users is important. Various studies have examined the acceptance of information technologies, including robots [39]. These studies have reported many factors that affect the acceptance of robots, such as their usefulness and social intelligence and the trust users place in them [20, 44]. Domestic robots are frequently used for long periods [17], and many studies have examined the factors that affect user acceptance of robots during long-term use [8, 9, 34].

According to these studies, not only the usefulness of robots but also their enjoyment, intelligence, and sociability are important in determining the acceptance of long-term use. This suggests that it is insufficient for domestic robots merely to perform tasks; their behaviors must also make users perceive them as fun, intelligent, and social. However, cleaning robots, among the most common robots used in homes, have limited expression modalities: they cannot use the facial expressions or gestures that general communication robots employ to express such internal states. Therefore, in this study, we investigated a method to express the internal states of cleaning robots without facial expressions or gestures.

Various expression methods have been proposed for nonanthropomorphic robots [1, 2, 15, 18, 24, 25, 28, 30,31,32,33, 43]. These methods can be classified into two types. The first type uses colors to express the emotions or intentions of a robot, typically via illumination such as light-emitting diodes (LEDs). Many studies have investigated the relationship between colors and emotions [12, 19, 35]. For example, humans frequently associate bright colors (e.g., white and pink) with positive emotions (e.g., happiness and relaxation) and dark colors (e.g., black and brown) with negative emotions (e.g., anxiety and sadness) [12]. In addition, humans tend to infer the valence of a stimulus (good or bad) from its brightness [19]. Building on such relationships between colors and emotions, many studies have developed methods for expressing the internal states of robots by mounting LEDs on them [2, 25, 33].

The second type uses the motions or trajectories of a robot to express its internal states. People tend to interpret the motions of objects as emotions or intentions, even if the objects have simple shapes [11, 37]; this type of robot exploits this human tendency to express its internal states. Robots that use their motions have the advantage of not requiring any additional expression device, such as LEDs or displays. Motion-based robots can be further classified into two categories depending on how their motions are designed: robots whose motions are hand-designed by humans and robots whose motions are parameterized. For the former, humans design the motions by referring to human movements in various scenarios [1, 15, 18, 43]. For example, Yoshioka et al. designed expressive motions of a robot for four typical affective states selected based on Russell’s circumplex model of affect [27] and confirmed that the participants could infer the emotions of the robot [43]. Furthermore, several studies have designed and evaluated robot motions that express emotions [1, 15] and dominance [18] with reference to human behavior in theater, film, and dance.

In the latter type of robot, motions are parameterized using several parameters, such as acceleration and curvature, and internal states are expressed by changing these parameters [24, 28, 30]. Saerbeck et al. reported that the impressions of a robot could be changed by varying the acceleration and curvature of its motions [28]. Papenmeier et al. evaluated the effects of the velocity profile and orientation of a robot; their results showed that decreasing velocity profiles affected the predictability of the robot's motions and that a mismatch between the orientation of the robot and its motions reduced its autonomy scores [24]. Schulz et al. also evaluated the effects of the velocity profile of a robot on human impressions, comparing linear and slow-in, slow-out profiles [30].

Beyond these two types, Song et al. proposed a robot that combines motions and colors to express its internal states; they investigated the impression changes caused by combining LED blinking patterns with in situ motions such as rotation and vibration [32]. Singh et al. proposed expressing emotions with a dog-tail-like device attached to a cleaning robot and investigated the impression changes produced by the movement of the tail [31]. As described above, various studies have aimed at expressing the internal states of nonhuman/nonanimal robots.

As described above, this study aimed to establish a method for expressing the internal states of a robot without facial expressions or gestures, so that its behaviors make users perceive it as fun, intelligent, and social. Toward these objectives, we investigated expressing the internal states of a robot through parameterized motions and analyzed how the impressions of the robot change when its motion parameters are varied. In the experiments, we focused on four motion parameters: change in speed (acceleration/deceleration), change in direction (curvature), distance between the robot and a target, and duration of a pausing action.

Fig. 1 Robot used in experiments

Fig. 2 Experimental environment

Fig. 3 Experimental conditions

Table 1 Detailed factors for each motion

These parameters were selected based on previous studies, as described below. Acceleration/deceleration and curvature were chosen based on a study reporting that these parameters affect the impressions of robots [28]. Proximity is also important in human–robot interaction and has been studied extensively [16, 26, 36, 42]; thus, we introduced the distance between the robot and a target (a participant or an obstacle) as a factor in our experiments. In addition, Zhou et al. reported that pausing behaviors affect the impressions of a robot and that humans tend to interpret them as robot planning [45]; accordingly, we evaluated the duration of pausing motions. As the base movements of the robot, we employed “going straight,” “turning 90 degrees,” and “stopping,” and combined them, varying the four motion parameters above, to generate several motion patterns for evaluating the impression changes.

The experimental results revealed several tendencies in the impression changes caused by changes in the robot motions, indicating that various impressions of the robot can be altered by changing its motions. However, this study has several limitations; for example, the effects of cultural differences could not be assessed because the experiments were conducted with only Japanese participants, and many studies have reported the effects of cultural differences on human–robot interaction [4, 5, 38].

2 Experimental setup

2.1 Experimental environment

We constructed the robot used in this research based on a Turtlebot 3 waffle Pi; it is shown in Fig. 1. The robot was designed to resemble a cleaning robot. We modified the sizes and positions of its tires to increase its speed and to place the center of rotation at the center of the robot. To investigate the effects of the robot's movements alone on human impressions, we used a plain white box as its exterior cover. The experimental environment is shown in Fig. 2. The participants sat and observed the motions of the robot; after observing these motions, they evaluated their impressions of the robot.
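
The paper does not describe the control software, so the following is only an illustrative sketch: assuming a standard ROS setup in which the TurtleBot3 base accepts velocity commands on the /cmd_vel topic (geometry_msgs/Twist), the base motions "going straight" and "turning 90 degrees" could be commanded as below. The speed values are arbitrary placeholders, not the ones used in the experiments.

```python
# Minimal sketch (not the authors' code): commanding the base motions on a
# ROS-controlled TurtleBot3 via the standard /cmd_vel velocity interface.
import math
import rospy
from geometry_msgs.msg import Twist

def publish_for(pub, twist, duration, rate_hz=20):
    """Publish a constant velocity command for a fixed duration, then stop."""
    rate = rospy.Rate(rate_hz)
    end = rospy.Time.now() + rospy.Duration(duration)
    while rospy.Time.now() < end and not rospy.is_shutdown():
        pub.publish(twist)
        rate.sleep()
    pub.publish(Twist())  # zero velocity = "stopping"

def go_straight(pub, speed=0.2, distance=1.0):
    t = Twist()
    t.linear.x = speed
    publish_for(pub, t, distance / speed)

def turn_90_degrees(pub, angular_speed=0.5):
    t = Twist()
    t.angular.z = angular_speed            # rotation about the robot center
    publish_for(pub, t, (math.pi / 2) / angular_speed)

if __name__ == "__main__":
    rospy.init_node("base_motion_demo")
    pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
    rospy.sleep(1.0)                       # allow the publisher to connect
    go_straight(pub)
    turn_90_degrees(pub)
```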

Fig. 4 Age distribution of participants

Table 2 Adjective pairs
Table 3 Factor analysis results

2.2 Conditions

As mentioned in Sect. 1, we evaluated four types of parameters that define the robot motions: change in speed (acceleration/deceleration), change in direction (radius), distance between the robot and a target (distance), and pausing duration (time). For these experiments, we designed six types of motions by combining the base motions “going straight” and “turning 90 degrees,” as shown in Fig. 3. Several conditions were set for each motion; these are summarized in Table 1. In motions 1 and 2, we evaluated the effects of acceleration/deceleration, i.e., slow and fast acceleration (aS, aF) and slow and fast deceleration (dS, dF). In motions 3 and 4, we evaluated the effects of curvature and of the distance between the robot and a human/an obstacle, i.e., small and large radii (rS, rL) and small and large distances (diS, diL). In motion 5, the effects of pausing duration (2 s, 5 s, and 8 s) were evaluated. In motion 6, the distance from the participants in the frontal/side direction was examined: we compared short and long distances in the frontal (dfS, dfL) and side (dsS, dsL) directions. The total number of prepared robot motions was 23 (4 each for all motions except motion 5, and 3 for motion 5).
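
For concreteness, the condition set in Table 1 can be enumerated programmatically. The following minimal sketch is our own illustration rather than material from the study; the level labels follow the abbreviations introduced above, and it reproduces the count of 23 motion patterns.

```python
# Enumerate the 23 motion conditions described in Sect. 2.2 (illustrative only;
# level labels follow the abbreviations used in the text and Table 1).
from itertools import product

conditions = {
    "motion1": [dict(acceleration=a, deceleration=d) for a, d in product(["aS", "aF"], ["dS", "dF"])],
    "motion2": [dict(acceleration=a, deceleration=d) for a, d in product(["aS", "aF"], ["dS", "dF"])],
    "motion3": [dict(radius=r, distance=di) for r, di in product(["rS", "rL"], ["diS", "diL"])],
    "motion4": [dict(radius=r, distance=di) for r, di in product(["rS", "rL"], ["diS", "diL"])],
    "motion5": [dict(pause_s=t) for t in (2, 5, 8)],
    "motion6": [dict(front=df, side=ds) for df, ds in product(["dfS", "dfL"], ["dsS", "dsL"])],
}

total = sum(len(v) for v in conditions.values())
print(total)  # 4 conditions for five of the motions + 3 for motion 5 = 23
```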

2.3 Procedure

The experiments were conducted using the following procedure. The participants sat on a chair in the experimental environment (Fig. 2). The total number of participants was 19 (10 females and 9 males aged 20–49 years); the age distribution is shown in Fig. 4. The participants were recruited by an agency and paid for their participation, and oral informed consent was obtained from each participant before the experiments. The experimenter first explained the detailed procedure of the experiments to the participants and instructed them to regard the robot as a cleaning robot during the experiments. The participants then observed the motions of the robot and evaluated their impressions of the motions using the 17 adjective pairs listed in Table 2. Although several validated questionnaires are available for evaluating robot impressions, such as Godspeed [3] and RoSAS [6], they were not designed for evaluating the impressions of cleaning robots. Thus, we selected 17 adjective pairs that we judged suitable for expressing the impressions of cleaning robots. As described in Sect. 2.2 and shown in Fig. 3, 23 motion patterns were prepared for analyzing the effects of changes in the robot's motion parameters on the participants' impressions of the robot. The basic order of the conditions shown to the participants ran from motion 1 to motion 6 and was common to all participants: the participants first observed the conditions of motion 1, then those of motion 2, and so on through motion 6. The order of the conditions within each motion was counterbalanced across participants.
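
The exact counterbalancing scheme is not specified beyond what is stated above. As a hedged illustration only, one simple way to keep the motion order fixed while counterbalancing the condition order within each motion is to assign each participant a different permutation, as sketched below; the helper and its arguments are hypothetical and not the authors' procedure.

```python
# Illustrative sketch (assumption, not the authors' procedure): fix the motion
# order (motion 1 -> motion 6) and vary the condition order within each motion
# across participants by cycling through its permutations.
from itertools import permutations

def presentation_order(participant_id, conditions_per_motion):
    order = []
    for motion, conds in conditions_per_motion.items():   # insertion order = motion 1..6
        perms = list(permutations(conds))
        order.append((motion, perms[participant_id % len(perms)]))
    return order

# Usage with the `conditions` dictionary from the previous sketch:
# for motion, conds in presentation_order(3, conditions):
#     print(motion, conds)
```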

3 Analysis

3.1 Factor analysis

We conducted a factor analysis (maximum likelihood estimation with promax rotation) of the impression scores of all participants and adopted four factors based on a scree plot (the decrease in the eigenvalues of the initial solution). We excluded the answers for adjective pair “p6” from the analysis because their distribution was biased toward high scores. Tables 3 and 4 summarize the factor analysis results and the factor correlations, respectively. Based on the factor analysis results, the four factors are interpreted as ability, comfort, activity, and anthropomorphism. Table 3 also lists Cronbach’s alphas. Generally, Cronbach’s alpha should be greater than 0.7 [21]. The alphas of factors 1–3 exceed 0.7 and thus satisfy this criterion, whereas the alpha of factor 4 is 0.591. However, considering that factor 4 consists of only two items (adjective pairs), we regard its alpha as acceptable.
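
As a sketch of this analysis step (not the authors' code), the factor extraction and reliability check could be reproduced with the factor_analyzer package and a small Cronbach's alpha helper. The ratings DataFrame, column labels, and 7-point scale below are assumptions standing in for the real impression scores.

```python
# Sketch of the factor analysis: maximum likelihood estimation with promax
# rotation via factor_analyzer, plus a Cronbach's alpha helper. The ratings
# DataFrame is a random placeholder (17 adjective pairs minus "p6").
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for the items (columns) grouped under one factor."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
columns = [f"p{i}" for i in range(1, 18) if i != 6]        # p6 excluded
ratings = pd.DataFrame(rng.integers(1, 8, size=(19 * 23, len(columns))),
                       columns=columns)                    # placeholder data

fa = FactorAnalyzer(n_factors=4, method="ml", rotation="promax")
fa.fit(ratings)
eigenvalues, _ = fa.get_eigenvalues()                      # for the scree plot
loadings = pd.DataFrame(fa.loadings_, index=ratings.columns)

# Example: alpha for the (hypothetical) items loading on one factor.
print(cronbach_alpha(ratings[["p1", "p2", "p3"]]))
```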

Table 4 Factor correlations
Table 5 ANOVA results (Motion 1, g: gender, a: acceleration, d: deceleration)
Table 6 ANOVA results (Motion 2, g: gender, a: acceleration, d: deceleration)
Table 7 ANOVA results (Motion 3, g: gender, r: radius, di: distance)
Table 8 ANOVA results (Motion 4, g: gender, r: radius, di: distance)
Table 9 ANOVA results (Motion 5, g: gender, t: time)
Table 10 ANOVA results (Motion 6, g: gender, df: distance (front), ds: distance (side))

3.2 Analysis of variance for each motion

Based on the extracted factors, we calculated the factor scores of the participants' answers for each motion. To evaluate the effects of changing the motion parameters on the impressions, we conducted a repeated-measures analysis of variance (ANOVA) on the conditions within each motion pattern. Specifically, we evaluated the effects of the motion parameters and of participant gender (f: female, m: male) on the impressions of the robot. Thus, for motion 5, we performed a two-way ANOVA (one parameter + gender), and for the remaining motions, we conducted a three-way ANOVA (two parameters + gender).
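
As an illustration of this step, the motion 5 analysis (one within-subject parameter plus gender) could be run as a two-way mixed ANOVA, for example with pingouin as sketched below. The data layout and column names are assumptions, and the three-way analyses for the other motions would require a tool that supports two within-subject factors.

```python
# Sketch (not the authors' code) of the motion 5 analysis: a two-way mixed
# ANOVA with pausing duration as the within-subject factor and gender as the
# between-subject factor. Column names are hypothetical.
import pandas as pd
import pingouin as pg

def analyze_motion5(scores: pd.DataFrame) -> pd.DataFrame:
    """scores columns: participant, gender, pause_s, factor_score."""
    aov = pg.mixed_anova(
        data=scores,
        dv="factor_score",
        within="pause_s",
        subject="participant",
        between="gender",
    )
    return aov[["Source", "DF1", "DF2", "F", "p-unc"]]
```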

Tables 5, 6, 7, 8, 9, and 10 summarize the analysis results. In the tables, df, MS, and F denote the degrees of freedom, mean squares, and F values, respectively. Figures 5, 6, 7, 8, 9, and 10 show violin plots of the factor scores, illustrating the distributions of the data; the horizontal axes denote the conditions, and the vertical axes denote the factor scores. We confirmed several significant differences (\(p<.05\)). The results of motion 1 (Fig. 5 and Table 5) show that factor 1 is affected by gender and is evaluated higher by male participants than by female participants. Factors 2 and 4 are affected by both acceleration and deceleration changes and decrease under fast acceleration/deceleration (aF, dF). The results of motion 2 (Fig. 6 and Table 6) show that factors 1 and 3 are affected by gender and are evaluated higher by male participants (m); moreover, factor 4 is affected by deceleration changes and decreases under fast deceleration (dF). The results of motion 3 (Fig. 7 and Table 7) show that factors 1 and 3 are affected by gender and are evaluated higher by male participants (m). The results of motion 4 (Fig. 8 and Table 8) show that factor 3 is affected by the distance between the robot and the obstacle and decreases as the distance increases (diL). The results of motion 6 (Fig. 10 and Table 10) show that factors 1 and 3 are affected by the distance in the frontal direction and decrease with increasing distance (dfL); it can also be inferred that factor 3 is affected by the distance in the side direction and decreases with increasing distance (dsL).

From the significant differences described above, we identified the following three dominant tendencies:

(1) Effects of acceleration/deceleration on F2 (comfort) and F4 (anthropomorphism): F2 and F4 in motion 1 were affected by the acceleration changes, and F2 and F4 in motion 1 and F4 in motion 2 were affected by the deceleration changes. Rapid changes in the speed of the robot decreased the comfort and anthropomorphism impressions of the robot.

(2) Effects on F1 (ability) and F3 (activity) related to the distance between the robot and target: F1 in motion 6 and F3 in motions 4 and 6 were affected by the distance between the robot trajectory and the target. The ability and activity scores decreased with increasing distance.

(3) Effects of gender on F1 (ability) and F3 (activity): F1 and F3 in motions 2 and 3 and F1 in motion 1 were affected by gender. Male participants gave higher F1 (ability) and F3 (activity) scores than female participants. However, we could not confirm any interaction effects, indicating that the effects of the motion parameter changes on the impressions of the robot did not differ by gender.

Fig. 5 Motion 1 results
Fig. 6 Motion 2 results
Fig. 7 Motion 3 results
Fig. 8 Motion 4 results
Fig. 9 Motion 5 results
Fig. 10 Motion 6 results

4 Discussion

From the factor analysis, we obtained four factors: ability, comfort, activity, and anthropomorphism. Three factors, “evaluation,” “activity,” and “potency,” are known to be obtained stably from analyses based on the semantic differential (SD) method [23]. Among our factors, “activity” appears directly, whereas “evaluation” and “potency” can be interpreted as ability and comfort, respectively. The anthropomorphism factor is unique to this experimental setup. From the ANOVA results, we obtained three tendencies suggesting that several parameters may affect the impressions of the robot, as explained above. Each tendency is discussed in the following subsections.

4.1 Effects on F2 (comfort) and F4 (anthropomorphism) by acceleration/deceleration

The first tendency is the impression change caused by acceleration/deceleration changes. Comparing the F2 (comfort) results of motions 1 and 2 shows that the acceleration/deceleration changes affected comfort only in motion 1. Because motions 1 and 2 were motions toward and away from the participant, respectively, not only the acceleration/deceleration of the robot but also whether it moves toward the participant may affect the comfort impression (F2).

Many studies have been conducted on how robots approach humans [7, 22, 41]. For example, the study in [7] observed that most participants disliked being approached from the frontal direction. Although that research did not investigate the effects of acceleration/deceleration during the approach, acceleration during an approach may strongly affect the comfort impressions. Regarding previous research on acceleration, Saerbeck et al. reported that acceleration affects arousal but not valence [28]. Comfort is closer to valence than to arousal; thus, our results regarding acceleration did not follow the same tendency as those of Saerbeck et al. This difference may be caused by the difference in the moving direction of the robot: the robot in the study by Saerbeck et al. moved around the experimental space, whereas the robot in our study moved toward or away from the participant.

4.2 Effects on F1 (ability) and F3 (activity) by distance between robot and target

The second tendency is the impression difference based on the distance between the robot trajectory and the target (participant or object). For F1 (ability), we confirmed a distance effect only in motion 6 (the frontal direction, df), along with a significant tendency (\(p<.1\)) for a distance effect in motion 3. Because both motions 3 and 6 were motions toward the participants, the distance between the robot and the participant may affect ability only when the robot moves toward the participant. For F3 (activity), distance effects were confirmed in motions 4 and 6, but not in motion 3. Although the frontal distances in motions 3 and 6 appear similar, the large-distance conditions differed slightly (1.0 m in motion 3 and 0.8 m in motion 6), and motion 6 consisted of two turns. This suggests that activity evaluations may be affected not only by the distance between the robot and the target (human or object) but also by small differences in that distance and by the succeeding motions.

Discomfort caused by the approach of a robot has been analyzed in previous studies [7, 13, 40]. As mentioned above, Dautenhahn et al. reported that most participants disliked the approach of a robot from the frontal direction [7]. Several studies have also analyzed the distance at which participants feel anxious or scared when approached by a robot [13, 40]. In contrast, our study could not confirm an effect of the distance between the robot and the participant on the comfort impressions (F2). This may be because the robot was small. Hiroi et al. showed that as the size (height) of a robot increases, the comfortable human–robot distance also increases [13]. These results suggest that when a robot is small, participants may feel little anxiety when it approaches; thus, no distance effect on comfort was obtained. Conversely, we confirmed distance effects on the ability and activity factors, which suggests that the distance during a robot's approach may affect various impressions of the robot.

4.3 Effects on F1 (ability) and F3 (activity) by gender

The third tendency is the impression difference by gender. The results showed that the effects of the robot's motion changes on the impressions did not differ by gender. However, because several significant tendencies (\(p<.1\)) for interaction effects were present, further investigation of the impression differences by gender is required.

A considerable number of studies have examined gender differences in the impressions of robots [10, 29]. Schermerhorn et al. reported that males tend to perceive robots as more human-like than females do [29]. Similar results, with males evaluating the anthropomorphism of the robot more highly than females, were obtained in this research. Conversely, the study in [10] reported that females trust security robots more than males do. Considering that trust can be interpreted as the ability of security robots, their results do not match our findings. Because we instructed the participants to imagine the robot as a cleaning robot in our experiment, these results suggest that gender differences in ability evaluations may depend on the tasks of the robot. Further investigation is required to evaluate these effects.

4.4 Other parameters

In this section, we discuss the curvature (radius) and the pausing duration, whose effects on the impressions could not be confirmed in our research, contrary to previous studies [28, 45]. First, regarding the curvature, although Saerbeck et al. reported its effects on valence, arousal, and dominance [28], our study could not confirm this. This may be because the effects of curvature are sensitive to the specific values used: Saerbeck et al. analyzed three levels of curvature and showed that small differences in curvature affect the impressions. This suggests that the radii selected in our experiments may have been outside the range in which impression changes appear.

Regarding the pausing duration, the presence or absence of pauses was evaluated in [45], whereas we evaluated the pausing duration in this study. Thus, as a preliminary analysis, treating motions 3 and 5 as the motions without and with pausing behavior, respectively, we compared their impressions to evaluate the effect of the presence of pausing behavior. The results showed a significant tendency (\(p<.1\)) for F2 (comfort). Because this comparison ignores various detailed conditions, further investigation is required; nevertheless, this result suggests that pausing behavior has some effect on the impressions of our robot.
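
The statistical test used for this preliminary comparison is not stated in the text; as an assumption for illustration only, the F2 scores of motions 3 and 5 could be averaged per participant and compared with a paired test, as sketched below. The function and data layout are hypothetical.

```python
# Preliminary comparison sketch (assumption, not the authors' analysis):
# compare F2 (comfort) factor scores between motion 3 (no pause) and motion 5
# (with pause), averaging over conditions per participant, with a paired t-test.
import pandas as pd
from scipy.stats import ttest_rel

def compare_pausing(scores: pd.DataFrame):
    """scores columns: participant, motion, condition, F2 (hypothetical layout)."""
    per_participant = (
        scores[scores["motion"].isin([3, 5])]
        .groupby(["participant", "motion"])["F2"]
        .mean()
        .unstack("motion")
    )
    return ttest_rel(per_participant[3], per_participant[5])
```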

4.5 Summary and limitations

This study obtained tendencies similar to those of previous research, in that the impressions of the robot changed depending on its motion parameters and the gender of the participants. Thus, the impressions of robots, even cleaning robots, can be changed by altering their motion parameters.

Focusing on the detailed impression changes caused by the parameter changes, several tendencies differing from previous work were confirmed. Such differences may be caused not only by the reasons discussed above but also by cultural differences; however, we conducted the experiments with only Japanese participants, which is one of the limitations of this research, as mentioned in Sect. 1. Furthermore, people's exposure to robots through the media can increase their expectations regarding robot capabilities [14], and the recent spread of various robots may have changed people's impressions of them.

The small number of participants is another limitation of this research. With 19 participants, we believe the sample was sufficient to capture the overall tendencies of the impression changes and the gender differences. However, for the several significant tendencies (\(p<.1\)) we observed, we could not determine whether no true difference exists or whether statistical power was insufficient owing to the limited number of participants. To clarify this, experiments with a larger number of participants are needed.

In the experiments, we selected and used 17 adjective pairs that we judged suitable for expressing the impressions of a cleaning robot, and four factors were extracted: ability, comfort, activity, and anthropomorphism. In contrast, the Godspeed questionnaire, designed for evaluating impressions in interactions between humans and general robots, consists of five factors (subscales): anthropomorphism, animacy, likability, perceived intelligence, and perceived safety. Although our questionnaire was designed for evaluating the impressions of cleaning robots, factors similar to those of the Godspeed questionnaire, consisting of similar adjective pairs, appeared: ability and comfort correspond to perceived intelligence and likability, respectively, and anthropomorphism shares its name with a Godspeed factor. This suggests that the impressions of our robot can also be evaluated using general human–robot interaction schemes such as Godspeed or RoSAS; thus, in the future, we will conduct experiments based on such evaluation schemes.

In addition, several proximity-related parameters, such as the direction of approach and the size of the robot, were not evaluated in this study. Because these parameters may also affect the impressions of the robot, they should be investigated in future work.

5 Conclusion

In this study, we analyzed human impressions of the motions of a robot, aiming to design appropriate motions for cleaning robots that can express various internal states. We focused on trajectories as the expression modality of the robot and analyzed how impressions differ with differences in its motions, varying four motion parameters: change in speed (acceleration/deceleration), change in direction (curvature), distance between the robot and a target, and duration of a pausing action. We then evaluated the effects of these parameter changes on the impressions. In the experiments, we employed the SD method to evaluate impressions, and the analysis yielded four factors: (1) ability, (2) comfort, (3) activity, and (4) anthropomorphism. After calculating the factor scores, we conducted an ANOVA for each motion condition. The experimental results showed three tendencies in the impression changes caused by the motion changes: the effects of (i) acceleration/deceleration, (ii) the distance between the robot trajectory and the target, and (iii) gender. In contrast, we could not confirm effects of curvature or pausing duration on the impressions. Although further experiments are required to examine these effects in detail, the results suggest that various impressions of the robot can be changed by changing its motions.

Future work includes designing and implementing cleaning-robot behaviors that can appropriately express various impressions to users, based on the results obtained in this study.