1 Introduction

Explicit biofeedback effectively conveys an individual’s physiological data, allowing participants to exert direct and conscious control. Physiological signals fall into two categories: indirectly controllable signals, such as EEG, heart rate, and GSR, which users cannot fully control, and directly controllable signals, such as breathing and facial expressions, which users can control at will [1]. EEG is a non-invasive method for monitoring brain activity through electrodes placed on the scalp, recording spontaneous electrical activity over time. Commercial EEG devices commonly used in interactive experiments include Muse, Emotiv, and NeuroSky [2].

Emotions, rooted in biological action tendencies, play a crucial role in determining behavior [3]. Most theorists agree that emotions comprise three components: subjective experience (e.g., feeling happy), expressive behavior (e.g., smiling), and physiological arousal (e.g., sympathetic nervous system activation) [4]. Physiological signals can be used effectively in automatic human emotion recognition systems [5]. Thus, by gathering relevant physiological signals from participants through various measurement methods and feeding this information into a system loop, it is possible to enhance the emotional experience of interactive systems, aid emotional self-recognition, and facilitate communication among multiple participants [6].

In the current field of human-computer interaction, numerous studies and practices have applied biofeedback technology to solo experiential works to enhance emotional experiences and understanding of the work’s theme. However, the use of biofeedback technology to increase empathy among participants in multi-person collaborative systems remains relatively rare.

This paper presents an interactive device that explores the role of explicit biofeedback in the emotional connection between experiencers and the system through a collaborative interaction mechanism. By sharing physiological data (such as EEG) among participants, the study promotes mutual understanding and emotional resonance, deepening the perception of, and empathy toward, others’ emotional states. This not only offers a new perspective for innovative interactive design but also opens new avenues for exploring interpersonal relationships and emotional communication.

2 Materials and Methods

The Mind Monitor software was employed to record EEG data, and the interactive device was programmed in Python. The research followed Design Science Research (DSR) [7], a methodology that legitimizes the creation of artifacts as a means of generating technological knowledge, particularly after gaps have been identified in the technical and scientific production around audio-visual systems. A critical element of DSR is understanding the external and internal environment surrounding the artifact to be created [8]. A review of recent work reveals a lack of models, methods, or frameworks in human-computer interaction and media studies that support multi-person emotional communication in collaborative systems through biofeedback and direct or indirect control. The method’s phases comprise problem identification, objective determination, design and development, demonstration, evaluation, and conclusion.

Fig. 1. “NeuroBrush” enhances interaction and competition among players with real-time shared drawing processes and dynamic background color changes.

In the literature, EEG-based emotion recognition has proven effective, achieving an 86.97% accuracy rate through data fusion with fuzzy integration. Researchers have also explored how displaying biofeedback affects the user experience in multiplayer interaction systems. “Bacteria Hunt,” a multiplayer EEG-based game in which players score points by controlling bacteria via alpha waves and SSVEP, examines the impact of relaxation- and attention-driven brainwave bands in competitive environments, underscoring collective participation and experiential value [9]. “BrainBall” allows players to control a ball race using EEG and EMG [10], while “NeuroBrush,” a web application (see Fig. 1), facilitates competitive post-modern art creation through a brain-computer interface (BCI); its real-time shared drawing process and dynamic background color changes enhance interaction and competition among players, highlighting social collaboration and creative experience [11].

3 Goal Setting and Experiment Development

This study investigates whether explicit biofeedback in a collaborative setting can enhance empathy within a multi-user system. Key features of the experiment include:

  1. A dual-user wearable device for simultaneous experience.

  2. Capturing participants’ EEG waves.

  3. Employing existing software for real-time, aesthetic data visualization.

  4. Data storage for future analysis.

  5. Collecting user feedback through surveys and interviews.

We detail the system’s design, implementation, and testing processes, along with insights and conclusions.

The hardware utilized is the Muse headband for EEG. Data are transmitted over a WiFi LAN via OSC messages from the Mind Monitor software, processed on a website built with the Google Charts API, and turned into interaction and visualization in TouchDesigner, which is scripted in Python (see Fig. 2).

Fig. 2. Software and hardware of the system.
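As a minimal sketch of the receiving end of this pipeline, the following Python snippet subscribes to Mind Monitor’s absolute band-power OSC streams using the python-osc library. The listening port (5000) is a placeholder and must match the OSC target configured in the Mind Monitor app; the exact OSC addresses should be verified against the Mind Monitor documentation.

```python
# Minimal OSC receiver for Mind Monitor band-power streams (sketch).
# Requires: pip install python-osc
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_band(address, *args):
    # Mind Monitor sends one float per sensor (TP9, AF7, AF8, TP10)
    # for each absolute band-power message.
    band = address.rsplit("/", 1)[-1]          # e.g. "alpha_absolute"
    print(f"{band}: {[round(v, 3) for v in args]}")

dispatcher = Dispatcher()
for band in ("alpha_absolute", "beta_absolute", "theta_absolute"):
    dispatcher.map(f"/muse/elements/{band}", on_band)

# IP and port are assumptions; they must match the OSC target set in the app.
server = BlockingOSCUDPServer(("0.0.0.0", 5000), dispatcher)
server.serve_forever()
```

In the actual system this handler would forward values to the TouchDesigner visualization rather than print them.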

Muse is an advanced commercial EEG device designed for real-time brain activity feedback via wireless Bluetooth, enhancing focus and relaxation through meditation practice [12]. It features multiple reading channels located primarily in the frontal areas, such as the FP1 and FP2 positions, and uses dedicated ear clips as reference points for accuracy and stability [13]. EEG data are captured at a high sampling rate, filtered by built-in algorithms for real-time feedback, and converted into an easily understandable format. Recordings, which include EEG readings, event markers, and other physiological indicators, are saved in spreadsheet format via the Mind Monitor software and uploaded to Dropbox for further analysis [14].
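For offline analysis of a synced recording, a sketch along the following lines could load the stored file with pandas. The filename and the band/electrode column names (e.g. Alpha_AF7) follow Mind Monitor’s export convention as we understand it and are assumptions to be checked against the actual file header.

```python
# Offline analysis sketch for a Mind Monitor recording synced to Dropbox.
import pandas as pd

df = pd.read_csv("mindmonitor_session.csv")      # hypothetical filename

# Average frontal band power (AF7/AF8) per sample, then smooth with a
# rolling mean to reduce noise before plotting or event analysis.
for band in ("Alpha", "Beta", "Theta"):
    frontal = df[[f"{band}_AF7", f"{band}_AF8"]].mean(axis=1)
    df[f"{band}_frontal"] = frontal.rolling(window=50, min_periods=1).mean()

print(df[["Alpha_frontal", "Beta_frontal", "Theta_frontal"]].describe())
```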

4 Experiment Description

Against this backdrop, we designed a multi-person, multimodal biometric-data collaborative interaction system. Following the DSR method, 10 users (five males and five females) were tested in the media lab at Tongji University. To mitigate external influences on the experiment, a pre-experiment survey based on the Likert scale was conducted [15], gathering data on emotions, physical condition, sleep, hunger, caffeine intake, and prescription medication usage to identify potential sources of noise and variance in the EEG data.

The interaction and feedback process involved two participants, “the questioner” and “the respondent”, both equipped with EEG devices, headphones, and cameras, without direct interaction (see Fig. 3). The questioner’s interface displayed the respondent’s facial expressions, an emotion-triggering question selector, EEG visual effects, and guidelines, while the respondent saw the questioner’s image and their EEG graphics. EEG visualizations showcased alpha (blue spheres), beta (purple pentagons), and theta (yellow triangles) waves, with sounds from the International Affective Digitized Sounds (IADS) library inducing emotions [16].

Fig. 3. The interaction and feedback process involved two participants, “the questioner” and “the respondent”.
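The band-to-shape mapping described above (alpha → blue spheres, beta → purple pentagons, theta → yellow triangles) can be sketched as a simple normalization from band power to a visual scale parameter. The power bounds and scale range below are illustrative placeholders, not calibrated values.

```python
# Sketch: map smoothed band power to visual scale parameters for the
# alpha/sphere, beta/pentagon, theta/triangle visualization.
BANDS = {"alpha": "blue sphere", "beta": "purple pentagon", "theta": "yellow triangle"}
BOUNDS = {"alpha": (0.0, 1.2), "beta": (0.0, 1.0), "theta": (0.0, 1.1)}  # assumed ranges

def to_scale(band: str, power: float, min_scale: float = 0.2, max_scale: float = 2.0) -> float:
    """Linearly map a band-power reading to a shape scale in [min_scale, max_scale]."""
    lo, hi = BOUNDS[band]
    t = min(max((power - lo) / (hi - lo), 0.0), 1.0)   # clamp to [0, 1]
    return min_scale + t * (max_scale - min_scale)

# Example: a strong alpha reading enlarges the blue sphere.
print(BANDS["alpha"], round(to_scale("alpha", 0.9), 2))
```

In TouchDesigner this scale value would drive the geometry’s transform each frame.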

When the questioner clicks a question on the screen, the respondent hears the question and ambient sounds, eliciting an emotional response. For example, the “turtle” question triggers a calm emotion, indicated by the enlargement of the blue sphere representing calm (alpha waves), showing the respondent’s pleasure. Excessive tension triggers an auditory alarm. The ten triggering questions are divided into five positive and five negative emotional questions (see Fig. 4).

Fig. 4. The system’s interface, including brainwave visualization, the other person’s facial data, and questions to stimulate emotions.
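A hedged sketch of this trigger loop follows: a question click plays the associated sound, and sustained high beta, used here as a proxy for excessive tension, raises the auditory alarm. The sound file paths, threshold, and hold duration are hypothetical values for illustration only.

```python
# Sketch of the trigger loop: question click -> sound; sustained high
# beta -> alarm. Thresholds and paths are illustrative assumptions.
QUESTIONS = {"turtle": "iads/turtle.wav", "death": "iads/death.wav"}  # hypothetical paths
BETA_ALARM_THRESHOLD = 0.9   # assumed normalized units
ALARM_HOLD_SECONDS = 3.0

def on_question_click(key: str, play_sound) -> None:
    # The respondent hears the question together with its ambient sound.
    play_sound(QUESTIONS[key])

def check_tension(beta_samples: list, sample_rate_hz: float, play_sound) -> bool:
    """Alarm if beta stays above threshold for ALARM_HOLD_SECONDS."""
    needed = int(ALARM_HOLD_SECONDS * sample_rate_hz)
    recent = beta_samples[-needed:]
    if len(recent) == needed and all(b > BETA_ALARM_THRESHOLD for b in recent):
        play_sound("alarm.wav")
        return True
    return False

# Stand-in audio backend for demonstration:
on_question_click("turtle", play_sound=print)
```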

After a 20-minute experience, a post-experience survey with five questions based on the Social Presence Scale assesses emotional fluctuations, perception of others’ emotions, the impact of others’ behaviors, and emotional interaction [17]. Participants rate their experience on a Likert scale from “strongly disagree” to “strongly agree”.

5 Analysis of Experimental Results

During the experience, when the questioner selects the “death” option, the respondent hears unsettling sounds and sees frightening scenes on the screen. The combination of scenes and sounds can affect the viewer to some degree, causing short-term emotional fluctuations that can be identified in specific EEG bands. For example, in the frontal lobe areas represented by the AF7 and AF8 electrodes, a noticeable increase in beta waves indicates a significant amount of scare-related information processing accompanied by fear and anxiety (see Fig. 5). Additionally, larger peaks of alpha waves, associated with relaxation, suggest a reduced level of relaxation after being scared (see Fig. 6).

Fig. 5. In the frontal lobe areas represented by the AF7 and AF8 electrodes, a noticeable increase in beta waves indicates a significant amount of scare-related information processing accompanied by fear and anxiety.

Fig. 6. Larger peaks of alpha waves, associated with relaxation, suggest a reduced level of relaxation after being scared.
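The pre/post pattern in Figs. 5 and 6 can be quantified with a simple window comparison around the stimulus timestamp, as in the sketch below. The window length and the roughly 10 Hz band-stream rate are assumptions, and the demo data are synthetic.

```python
# Sketch: compare mean frontal band power before and after a stimulus.
import numpy as np

def event_response(signal: np.ndarray, fs: float, event_idx: int,
                   window_s: float = 5.0) -> float:
    """Post/pre mean power ratio; >1 means power rose after the event."""
    w = int(window_s * fs)
    pre = signal[max(event_idx - w, 0):event_idx].mean()
    post = signal[event_idx:event_idx + w].mean()
    return post / pre

# Synthetic demo: beta power steps up at sample 300 (~30 s at ~10 Hz).
fs = 10.0                     # assumed band-stream rate
t = np.arange(600)
beta = np.where(t >= 300, 0.8, 0.4) + 0.05 * np.random.randn(len(t))
print(round(event_response(beta, fs, event_idx=300), 2))   # roughly 2.0
```

Per the text, one would expect a beta ratio above 1 and an alpha ratio below 1 after the “death” stimulus.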

Emotional stimuli result from cognitive mechanisms processing sensory input: when an event occurs, this mechanism processes the stimulus and generates an emotional stimulus for each affected emotion [18] (see Fig. 7). The Valence-Arousal model describes this space [19]: valence represents the positive or negative nature of an emotion, and arousal describes the level of emotional stimulation (see Fig. 8). Assessing emotional experience therefore requires analyzing the valence (positive or negative) and arousal (intensity) associated with stimuli such as images and sounds. This study gauges valence through alpha (relaxation) and beta (anxiety) waves, and arousal through their intensities. During the experience, a questioner’s click causes short-term emotional fluctuations in the respondent, which influence the questioner’s next interaction and in turn induce emotional fluctuations in the questioner.

Fig. 7. Emotional stimuli result from cognitive mechanisms processing sensory input: when an event occurs, this mechanism processes the stimulus and generates an emotional stimulus for each affected emotion.

Fig. 8. The Valence-Arousal model: valence represents the positive or negative nature of an emotion, and arousal describes the level of emotional stimulation.
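As an illustrative heuristic consistent with this mapping, though not a validated affective model, valence and arousal indices can be derived directly from the band powers; the arousal formula below borrows the common beta/(alpha + theta) engagement index from the EEG literature.

```python
# Sketch: heuristic valence/arousal estimates from band powers, following
# the paper's mapping of alpha -> relaxation and beta -> anxiety.
def affect_indices(alpha: float, beta: float, theta: float) -> tuple[float, float]:
    valence = (alpha - beta) / (alpha + beta + 1e-9)    # positive = relaxed
    arousal = beta / (alpha + theta + 1e-9)             # engagement-style index
    return valence, arousal

print(affect_indices(alpha=0.9, beta=0.3, theta=0.4))   # calm: positive valence, low arousal
print(affect_indices(alpha=0.3, beta=0.9, theta=0.5))   # tense: negative valence, high arousal
```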

At one point, it was observed that the respondent’s emotional data changed significantly a few seconds after the questioner clicked; some time later, the questioner exhibited similar emotional fluctuations (see Fig. 9). This sequence of responses suggests that complex information processing and emotional interaction are taking place. Specifically, in an interactive environment one participant’s actions may first trigger an emotional response in the other, which then feeds back and alters the first participant’s own emotional state (see Fig. 10). This pattern of alternating fluctuations reflects, at a macro level, the dynamics and interdependence of emotional interaction.

Fig. 9. A few seconds after the questioner clicked, the respondent’s emotional data showed significant changes; some time later, the questioner exhibited similar emotional fluctuations.

Fig. 10. One participant’s actions may first trigger an emotional response in the other, which then feeds back and alters the first participant’s own emotional state.
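One way to make the lag in Fig. 9 concrete is to find the offset that maximizes the normalized cross-correlation between the two participants’ band-power traces, as in this sketch; the sampling rate and the synthetic step signals are assumptions for demonstration.

```python
# Sketch: estimate the questioner-respondent lag via cross-correlation.
import numpy as np

def best_lag_seconds(respondent: np.ndarray, questioner: np.ndarray, fs: float) -> float:
    """Lag (s) at which the questioner's trace best matches the respondent's.

    A positive value means the questioner's fluctuation follows the
    respondent's, matching the alternating pattern described above.
    """
    r = (respondent - respondent.mean()) / respondent.std()
    q = (questioner - questioner.mean()) / questioner.std()
    corr = np.correlate(q, r, mode="full")
    lag_samples = int(np.argmax(corr)) - (len(r) - 1)
    return lag_samples / fs

# Synthetic demo: respondent reacts first, questioner follows ~5 s later.
fs = 10.0
t = np.arange(600)
resp = np.where(t >= 200, 1.0, 0.0)
ques = np.where(t >= 250, 1.0, 0.0)
print(best_lag_seconds(resp, ques, fs))   # ~5.0
```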

6 Conclusion

This study explores the application of explicit biofeedback to enhancing emotional cycles and empathy through the design and implementation of a multi-user interactive system. In experiments with 10 participants wearing Muse EEG devices, we collected and analyzed brainwave data and emotional responses; the results show improved emotional communication and understanding, highlighting the potential of explicit biofeedback to foster emotional resonance.

Despite these insightful results, limitations include the small sample size, signal-stability issues of the Muse device under certain conditions, and the complexity of interpreting EEG data. Future research will require larger samples, improved equipment and algorithms, and more fine-grained emotion and biofeedback analysis methods to enhance system accuracy and user experience.