1 Introduction

Data plays a fundamental role in the communication and acquisition of knowledge. Users gain insights into various topics through their interaction with data. Initially, the field of Human-Data Interaction focused solely on analyzing the data itself. However, over time, it became apparent that understanding how users interpret the presented information was equally crucial [22], particularly given the ever-growing size and intricacy of data resulting from advancements in data collection and dissemination technologies [12].

Human-Data Interaction plays a critical role in every domain involving information transmission, as it serves as a conduit for knowledge acquisition, as outlined by Victorelli and Reis [20], whose study also highlights the contributions of methods that employ tools to support the data life cycle. In environmental studies, the ability to access data accurately and concisely is of utmost importance, particularly for accident prevention and monitoring.

Dams have received substantial attention in Brazil. The country has numerous water dams connected to hydroelectric plants, as well as mining tailings dams. Several major accidents have occurred at such dams, leading to severe environmental and human impacts. In Brazil, the National Dam Safety Policy was established by Law 12,334/2010 [2] to safeguard lives and nature in the event of dam accidents. The Brazilian National Water Agency [1] defines a dam accident as a situation in which there is a “compromise of the structural integrity of a dam, leading to the uncontrolled release of the reservoir contents”.

When conducted appropriately, the interaction with data related to the state of preservation of a dam is of great significance in preventing accidents, and in more severe cases, it enables experts to leverage their knowledge to predict risks and evacuate the area before the event occurs. Additionally, data availability allows lay users to access information about nearby dams and their danger levels.

Human-Computer Interaction concepts can be employed to measure interaction aspects [14, 22] and to assess whether users can comprehend the data as intended. Such evaluation may be conducted using methods like user tests or data-focused heuristics, such as those proposed by Victorelli and Reis [21].

Despite the increased attention to studies on Human-Data Interaction [14, 22], specific areas of knowledge have not been subjected to a detailed analysis that can derive implications for design. In the environmental field, there is insufficient understanding of the outcomes obtained through applying Human-Computer Interaction techniques aimed at data exploration platforms via inspections utilizing heuristic evaluations or user tests, for instance.

The objective of this research was to evaluate an environmental data exploration platform that aims to centralize information about Brazilian dams, using a combination of two methods: 1) tests with users familiar with the environmental and technological domains and 2) a collaborative heuristic evaluation conducted by three specialists in Human-Computer Interaction. The usability heuristics defined by Nielsen and Molich [15] and the specialized usability heuristics for Human-Data Interaction defined by Victorelli and Reis [21] were used in the evaluation. In addition to identifying issues through these heuristics, the study also developed categories that group the most significant problems to facilitate improvement.

The paper is structured as follows: Sect. 2 provides the theoretical background, defining essential terms related to dam safety and human-data interaction. Section 3 discusses related studies. Section 4 outlines the methods employed in this study. Section 5 presents the results obtained from the applied methods. Finally, Sect. 6 summarizes the contributions of this work and outlines avenues for future research.

2 Theoretical Background

This section introduces definitions relevant to dam safety in the Brazilian context and to human-data interaction, laying the foundation for the research presented in this paper. Related studies and their findings are discussed in Sect. 3.

2.1 Context of Dam Safety in Brazil

Law 12,334 [2] was enacted in Brazil in 2010, establishing the National Policy on Dam Safety. Its main objective is to preserve life and nature by anticipating accidents involving dams before they occur [17]. According to the Brazilian National Water Agency [1], a “dam” is defined as “a structure for the retention or accumulation of liquid substances or mixtures of liquids and solids”.

Dam safety is closely tied to monitoring existing dams to maintain their integrity and preserve life and the environment around them. Additionally, according to the Brazilian National Water Agency [1], an accident involving a dam can be defined as “the structural integrity compromise with the uncontrollable release of reservoir contents”.

In 2015 and 2019, the Brazilian population witnessed accidents involving dams. The first accident, in Mariana, in the state of Minas Gerais, resulted in the release of about 40 million cubic meters of tailings, causing loss of life and a significant environmental imbalance [13]. The second accident occurred in the municipality of Brumadinho, also in Minas Gerais, in 2019. It involved the release of about 12 million cubic meters of tailings, resulting in a higher number of fatalities than the 2015 incident and a smaller environmental impact [5].

Thus, in the Brazilian context, dam safety requires suitable inspections and verifications of existing dams to anticipate and forestall accidents.

2.2 Human-Data Interaction

Knaflic [7] states that in human-data interaction, it is crucial to determine the intended audience to whom the presenter will communicate. This approach establishes the context for the data presentation. Explaining what will be presented is necessary while avoiding excessive information to prevent user confusion. This information is particularly relevant in the context of the evaluated data since users often access it independently without assistance or supervision.

In their article, Mortier et al. [14] identified three main aspects of human-data interaction. The first aspect is legibility, which focuses on presenting data more transparently and understandably for readers. The second aspect is agency, which relates to the actions users take based on the information absorbed from the data. The third and final aspect is negotiability, which concerns how individuals and society change as a result of interpreting data over time.

In their study, Victorelli and Reis [21] proposed a set of heuristics related to the design of elements that utilize human-data interaction, including:

  1. Human-data interaction design guidelines for visualization systems
     1.1. Self-evidence in coordinated views
     1.2. Consistency between coordinated visualizations
     1.3. Reversible operations in visualizations
  2. Use smooth animated transitions between visualization states when they can help the user to notice the difference between the data
  3. Immediately provide visual feedback on the interaction
  4. Maximize direct manipulation with data
  5. Minimize information overload
     5.1. Show information context
     5.2. Avoid requiring data memorization
  6. Semantically enrich the interaction
     6.1. Semantically enrich search interaction
     6.2. Enriched feedback from humans incorporated into the system
     6.3. Refine and train models through user feedback

3 Related Work

This section presents related work on improving dam monitoring systems, mapping questions asked by users, adapting architectures to perform dam simulations, and analyses focused on human-data interaction.

The study by Law and Hvannberg [9] investigated the complementary nature and convergence of heuristic evaluation and usability testing in evaluating the usability of a universal brokerage platform. The case study explores how these two evaluation methods can be integrated to provide a more comprehensive assessment of the platform’s usability. The results from both methods were compared and synthesized to reveal the complementarity and convergence between heuristic evaluation and usability testing. The findings demonstrated that heuristic evaluation identified high-level usability issues and provided valuable design suggestions, while usability testing uncovered specific and contextual usability problems encountered by users during task execution. Integrating heuristic evaluation and usability testing provided a more holistic evaluation of the universal brokerage platform’s usability, with each method offering unique perspectives and insights.

Ekşioğlu et al. [4] investigated the efficacy of combining heuristic evaluation and user testing to assess user interface usability. The presented case study involved evaluating a real-world web application. Initially, heuristic evaluations were conducted, wherein usability experts analyzed the interface based on predefined heuristics. This phase identified several potential usability issues within the interface. Subsequently, user testing was performed, involving participants completing specific tasks while their behaviours and perceptions were observed. User testing helped validate and further explore the findings from the heuristic evaluation, uncovering additional problems not identified through expert analysis alone. The results suggest combining heuristic evaluation and user testing can be a powerful approach to evaluating interface usability, enabling more effective identification and resolution of problems. This highlights the significance of utilizing a multi-method approach in usability evaluation, leveraging the strengths of different approaches to achieve comprehensive and reliable insights into the quality of the user experience.

Komarkova et al. [8] present a study that focuses on the usability evaluation of the Prague Geoportal, a web-based geographic information system (GIS). The study aims to assess the usability of the Geoportal and identify potential improvements to enhance the user experience. The evaluation process involved a combination of heuristic evaluation and usability testing. Usability experts performed heuristic evaluations, applying established usability heuristics to analyze the interface design and functionality. Additionally, usability testing was conducted with real users who performed specific tasks on the Geoportal while their interactions were observed. The study found several usability issues and provided recommendations for improving the interface’s navigation, information presentation, and overall user interaction. The findings highlight the importance of considering usability principles in GIS design and development to ensure efficient access to spatial data and enhance user satisfaction. By integrating heuristic evaluation and usability testing, the study contributes to a better understanding of usability challenges and offers valuable insights for optimizing GIS interfaces.

The study by Jeon et al. [6] is motivated by the goal of increasing the safety of dams in South Korea because of the risks involved in accidents. To that end, a dam safety monitoring system was created to gather data on water systems, dams, instrumentation, hydrological information, inspections, and general dam information. Because this system is more robust and holds more information than the system currently used in Korea, the authors concluded that decisions and actions can be taken faster with more detailed and easily visualized data.

Rodrigues et al. [16] conducted a study to understand which questions users ask during their interaction with data. Twenty-two users participated in the study, producing a total of 1058 questions, which were divided into two groups for analysis: the first group contained straightforward questions, and the second group contained questions with some problem. As a result, the authors categorized the questions into five categories, described according to the authors’ definitions:

  1. ERR - questions containing conceptual errors (88 occurrences);
  2. AMB - questions that contain some ambiguity (41);
  3. DTA - questions that are technically answerable, but are difficult to answer with the visualization, i.e., questions for which the visualization was not appropriate (43);
  4. DNA - questions the visualization does not answer (28);
  5. INS - failures to follow the instructions when filling out the questionnaire (79).

The study is useful for learning about data visualization, since it maps the questions and errors raised by users during analysis, giving a better view of the questions users may ask and of how to address them when presenting data.

The study carried out by Liu et al. [11] consisted of adapting an architecture so that dam information could better support a simulation focused on verifying possibilities of dam failure and its impacts. The data flows used in this research considered both users experienced in reading this type of data and inexperienced users, to ensure that a wide range of users could access the remodelled version of the tool already in use.

Leskens et al. [10] present a system for analyzing flooding scenarios. Despite the complexity of the data involved, the tool aims to be accessible to professionals and to people with no contact with the area, an objective facilitated by the system’s 3D visualization, which helps users better estimate the scale and impact of a flood.

Calvetti et al. [3] conducted a study based on use cases that often involve large amounts of information, generating a detailed view of the data; the existing processes were then improved by specialists. They concluded that the monitoring of human activities deserves particular attention, indicating as accurately as possible which data will be collected and what results are expected.

The study carried out by Trajkova et al. [18] sought to understand which aspects related to interaction would be necessary to ensure that museum visitors understood how to use the system and how to attract people to interact with the screen, keeping their attention.

The studies mentioned in this section offer views on mapping impressions of interaction with data and on which methodologies worked, especially in the context of dam safety. In this sense, the present study aims to complement the existing literature with an approach that combines human-data interaction heuristics and user tests in a dam safety context.

4 Methods

This study aimed to investigate aspects of the interaction between humans and data in exploring environmental issues. The investigation employed user testing techniques and heuristic evaluation to assess an application that explores a dataset on the safety of Brazilian dams.

The overseeing agency responsible for monitoring Brazilian dams maintains a comprehensive database containing all relevant dam-related information. This dataset is utilized to develop a dashboard that enhances data visualization and comprehension, ultimately contributing to establishing a national repository of records concerning Brazilian dams.

Concerning the user tests, the research involved 18 participants assigned specific tasks to interact with the data. Their perceptions and opinions were solicited, and upon completion of the tasks, they were asked to complete a questionnaire and participate in a brief post-test interview.

The heuristic evaluation was conducted by three experts specializing in human-computer interaction, employing the collaborative heuristic evaluation method. This approach involved the evaluators performing the same tasks as the users and collaboratively identifying any issues or problems encountered during the evaluation process.

4.1 Task Performed in Evaluations

The objective of the study was to examine the usability of a webpage that featured a dashboard presenting data related to dam safety, accompanied by search filters.

During the execution of the assigned task, participants were instructed to interact with both the dashboard and the search filters, expressing their understanding of the provided information and articulating any inquiries that arose.

4.2 Procedures for Heuristic Evaluation

The evaluation aimed to appraise the performance of the data presented within the dashboard, along with its associated filters. The issues identified in the application were assessed and compared against general-purpose usability heuristics proposed by Molich and Nielsen [15] and specific heuristics for human-data interaction put forth by Victorelli and Reis [21], as described in Sect. 2.2. Nielsen and Molich’s heuristics [15] are: 1) Visibility of system status, 2) Match between system and the real world, 3) User control and freedom, 4) Consistency and standards, 5) Error prevention, 6) Recognition rather than recall, 7) Flexibility and efficiency of use, 8) Aesthetic and minimalist design, 9) Help users recognize, diagnose, and recover from errors, and 10) Help and documentation.

The inspection was conducted by three Human-Computer Interaction specialists through a collaborative heuristic evaluation conducted remotely. This approach sought to facilitate evaluators’ joint identification of issues and their assignment of corresponding heuristics.

4.3 Procedures for User Tests

The testing phase involved 18 participants aged between 22 and 45 years, with prior knowledge of dam safety and technology, primarily from disciplines such as Environmental and Sanitary Engineering and Computer Science, with experience in technology and environmental management. User recruitment was conducted through invitation-based selection, in which interested individuals were included as participants upon acceptance. In the event of non-acceptance, researchers proceeded to the next potential user. Each user was assigned the same task to evaluate their comprehension of the presented data. Due to the Covid-19 pandemic, the tests were conducted remotely via a videoconferencing platform. The testing protocol and post-test user interviews were approved by the Research Ethics Committee, under the code CAAE 55663422.8.0000.5148.

During the test phase, upon initial contact with the user, the researcher introduced themselves and provided an overview of the research objectives. The user was then given an explanation of the nature and procedures of the usability tests, emphasising the confidentiality of their personal information. We clarified that the purpose of the test was to assess the platform’s performance rather than the user’s ability to comprehend the dashboard. Additionally, users were informed of their right to discontinue the test at any point. Following this initial briefing, the user was presented with the assigned task and instructed to employ the Think-Aloud protocol [19] to verbalize their impressions and experiences while using the platform. Subsequently, the test session commenced. After completing the test, participants were asked to provide demographic information through a questionnaire, including their age, computer experience, and familiarity with dam safety information. Additionally, a usability questionnaire was administered, comprising a series of statements with response options ranging from “Totally disagree” to “Completely agree”, presented as follows:

  1. Overall, I was able to understand the information presented in the data;
  2. In general, the reading of the data was easy to carry out;
  3. I would use the data presented to carry out studies;
  4. I would recommend the page containing the data to friends;
  5. I believe the presentation of the data was easy to understand;
  6. I believe the information is useful in my life.

After the completion of the questionnaire, users were invited to participate in a concise interview aimed at eliciting their perceptions regarding their interaction with the platform. The interview questions were as follows:

  1. Were you able to understand the information presented in the data?
  2. Did you have any questions while interpreting the data?
  3. In your opinion, what is the best way to perform a data presentation?
  4. If you want to mention any other point you deem necessary, feel free to raise it.

4.4 The Evaluated System

The system under evaluation represents an enhanced version of the existing Brazilian dam information system, with a heightened emphasis on data presentation and consolidation. The objective is to facilitate even easier access to information for users compared to the currently utilized system.

It is important to note that the updated product has not yet been released. User testing is being conducted at an intermediate stage of the development process so that the results can be leveraged to implement enhancements and refine the features and functionalities already developed. An example of a screen used in the evaluation is shown in Fig. 1.

4.5 Analysis

The problems encountered during user testing were consolidated into unique issues so that an error identified by multiple users was not counted repeatedly. Rather than counting repetitions of the same problem across tests, the analysis considered the number of users who identified each specific problem.

Based on the previous outcome, the next step involved cross-referencing the results identified through heuristic evaluation and user testing to determine which issues were exclusively identified by one of the methods and which were identified by both. This analysis allowed for a comprehensive understanding of the aspects of interaction that were addressed by each approach.
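For illustration, the sketch below shows one possible way to organize this consolidation and cross-referencing step programmatically. It is a minimal Python example with hypothetical issue identifiers and data structures; the study does not prescribe any particular tooling for this analysis.

```python
from collections import Counter

# Hypothetical raw observations: one (participant, issue) pair per report
# collected during the user tests. The identifiers are illustrative only.
user_reports = [
    ("P01", "map-not-interactive"),
    ("P02", "map-not-interactive"),
    ("P03", "clear-filter-button-hidden"),
    ("P01", "clear-filter-button-hidden"),
]

# Hypothetical set of issues recorded by the collaborative heuristic evaluation.
heuristic_issues = {"map-not-interactive", "colour-scales-too-similar"}

# Consolidate duplicate reports into unique issues, keeping the number of
# users who mentioned each one (the count used in the analysis).
user_issue_counts = Counter(issue for _, issue in user_reports)
user_issues = set(user_issue_counts)

# Cross-reference the two methods.
found_by_both = user_issues & heuristic_issues
only_user_tests = user_issues - heuristic_issues
only_heuristic_evaluation = heuristic_issues - user_issues

print("Both methods:", sorted(found_by_both))
print("User tests only:", sorted(only_user_tests))
print("Heuristic evaluation only:", sorted(only_heuristic_evaluation))
```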

5 Results and Discussion

This section describes the findings derived from the analysis of information obtained through user testing and the outcomes yielded from the heuristic evaluation.

5.1 Heuristic Evaluation Results

The heuristic evaluation identified 41 issues by simulating the same task assigned in the user tests. Out of the 41 problems identified, 28 were not observed in the user testing. Among the critical problems, one significant issue involved comprehending the data due to its scattered presentation across the dashboard, leading to a disconnection between the titles and the corresponding data. Additionally, the interaction was hindered by the utilization of closely related colour scales for the presented data, frequently impeding the comparison of information when seeking specific details. The problems identified solely through heuristic evaluation are displayed in Table 1.

Table 1. Issues found only by the heuristic evaluation

Specialists also encountered a different category of problems related to technical terminology. Given that the system caters to diverse audiences, the expectation was that it would provide a “translation” of the technical terms into more accessible language.

Furthermore, although the page offered a Spanish translation option, selecting it did not update the dashboard.

5.2 User Tests Results

In total, 27 problems were found by users. Identical problems encountered by multiple users were consolidated to prevent duplicate analysis of the same problem. Nonetheless, the number of users who reported each issue was included in the results and examined.

The problems identified during user tests were consolidated into individual issues, their frequency of occurrence was recorded, and each issue was subsequently linked to the heuristics proposed by Victorelli and Reis [21] and by Molich and Nielsen [15] (Table 2).

Table 2. Issues found by user tests
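To make the structure behind a Table 2-style summary concrete, the sketch below represents each consolidated issue as a record holding its user count and the heuristics it was linked to. This is a hypothetical Python illustration: the issue descriptions, user counts, and heuristic assignments are placeholders and do not reproduce the actual rows of the table.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class ConsolidatedIssue:
    """One row of a Table 2-style summary (hypothetical structure)."""
    description: str
    users_reporting: int                                     # out of 18 participants
    hdi_heuristics: list = field(default_factory=list)       # Victorelli and Reis [21]
    nielsen_heuristics: list = field(default_factory=list)   # Nielsen and Molich [15]

# Illustrative placeholder rows; they do not reproduce the published table.
issues = [
    ConsolidatedIssue(
        description="Search filter behaviour unclear when combining selections",
        users_reporting=5,
        hdi_heuristics=["6.1 Semantically enrich search interaction"],
        nielsen_heuristics=["7. Flexibility and efficiency of use"],
    ),
    ConsolidatedIssue(
        description="Colour scales in the dashboard are hard to distinguish",
        users_reporting=3,
        nielsen_heuristics=["8. Aesthetic and minimalist design"],
    ),
]

# Prevalence of each heuristic across all consolidated issues.
prevalence = Counter(
    h for issue in issues for h in issue.hdi_heuristics + issue.nielsen_heuristics
)
print(prevalence.most_common())
```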

In this particular context, most of the issues experienced by users were directly associated with the use of the search filters, which proved difficult to understand, primarily owing to the extensive array of information available for selection and to the previously mentioned inconsistencies with the dashboard.

5.3 The Outcomes Derived from the Consolidation of User Tests and Heuristic Evaluation

Upon defining the task at hand, various usability issues were identified, which were found to be associated with the heuristics outlined by Victorelli and Reis [21], specifically centred around human-data interaction, as well as the heuristics proposed by Molich and Nielsen [15].

After both methods had been applied to the identical task within the system, a comparative analysis was conducted to determine which heuristics were most prevalent in each method and what insights they yielded.

The most prevalent issue encountered during interaction with the dashboard was the perceived lack of interactivity in the map of Brazil (reported by 14 users), where the ability to select and isolate data about a specific state by clicking on it was absent (Table 3).

Table 3. Issues found by both methods
Fig. 1. Example of an issue with the information contained in the dashboard, where “Em branco” means “Blank space”

As evidenced by related works, the combined approach of user testing and heuristic evaluation provides complementary insights into the issues that require attention. Furthermore, based on the obtained results, it was possible to confirm that heuristic evaluation also encompasses aspects of design suggestions, as reported by Law and Hvannberg [9]. The integration of these two aspects yielded a comprehensive overview of improvements and problem categories that can be encountered in applications like the one under evaluation.

Among the heuristics proposed by Victorelli and Reis [21], the ones most closely associated with the issues encountered by users and identified during the heuristic evaluation were: 3. Immediately provide visual feedback on the interaction, owing to the lack of feedback and the confusion caused to users at certain moments of the interaction; and 6. Semantically enrich the interaction, particularly its sub-item 6.1 Semantically enrich search interaction, since the applied search filters often did not provide adequate feedback.

Regarding the heuristics of Molich and Nielsen [15], the two heuristics most closely linked to user problems were: 7. Flexibility and efficiency of use, and 8. Aesthetic and minimalist design, since the arrangement of elements and the colour patterns used (mainly in heat maps) ended up confusing users and hindering interaction. Regarding the problems found in the heuristic evaluation, the most closely related heuristics were: 6. Recognition rather than recall, 7. Flexibility and efficiency of use, and 8. Aesthetic and minimalist design, since the interaction became harder because the presented data did not follow a consistent pattern, confusing the evaluators.

Regarding the heuristic 1 - Human-data interaction design guidelines for visualization systems from Victorelli and Reis [21], the subcategory applicable to the problem found was 1.3 - Reversible operations in visualizations. One of the issues identified during the evaluation was users having difficulty locating the button to clear their previous selections after applying filters and attempting to initiate a new search. This lack of immediate visibility led users to believe there was no option to remove the previously selected data.

During the task execution in the conducted tests, we observed that the presented information lacked fundamental contextualization on multiple occasions. For instance, within the system, updates were displayed without an accompanying explanation, thereby impeding a clear understanding of whether the updates were derived from real-time data or sourced directly from the underlying database along with the existing information. This issue relates to heuristic 5 of Victorelli and Reis [21], Minimize information overload, more precisely to subitem 5.1 - Show information context.

Another frequently encountered problem in the interaction involved the excessive spacing or scattered arrangement of substantial amounts of information on the screen. This particular issue hindered users from discerning the context and purpose of each case, thereby impeding the search process within the application. In cases where insufficient explanations were provided regarding the presented information, this problem further exacerbated the challenges faced by users. Here, heuristic 6 of Victorelli and Reis [21], Semantically enrich the interaction, was applied, in particular its subitem 6.1 - Semantically enrich search interaction.

Considering the results obtained from applying each method, it becomes evident that combining diverse methods with different heuristics yields complementary insights when evaluating safety-focused systems. Consequently, the presence of multiple converging fronts presenting similar outcomes empowers evaluators to prioritize resolving the identified issues and subsequently address the specific concerns highlighted by each method.

Utilizing heuristics within the context of data presentation usability proved pivotal in effectively capturing and translating the encountered problems, thereby offering solutions and avenues for improved interpretation of the issues, subsequent rectification, and standardization.

The application of the heuristics proposed by Victorelli and Reis [21] proved beneficial within this project’s scope. Specifically, the broader set of six upper-level heuristics demonstrated significant relevance in addressing many encountered issues. However, when utilizing these heuristics in a context different from the case studies analyzed in the initial study, it became evident that there exist gaps that require more precise guidance tailored to specific domains and data contexts, thereby enhancing the efficacy of the broader heuristics.

5.4 Categories Proposed to Represent the Issues Related to Interaction with Data

The identified problems were categorized to represent the various issues encountered during the interactions. The resultant categories derived from this analysis offer valuable insights into the nature of the identified problems and their implications on the design of data exploration systems within dam safety.

Visibility About How to Interact with Data. When data requires a specific path to be accessed, it is imperative for the steps involved in obtaining the information to be transparent. The category “Visibility about how to interact with data” encompasses issues about quickly locating and comprehending the pathway to access data and its corresponding details. This category is proposed to facilitate user interaction with the data application, enabling efficient data exploration.

An instance illustrating an issue within this category was observed in both inspections: the heatmap displayed on the dashboard failed to convey its interactive nature and, despite its intended functionality of optimizing searches by Brazilian state, it did not function as anticipated. Furthermore, once users realized that the map could be interacted with and selected an area, the remaining parts of the map disappeared, with no clear option to reset the previously made selection.

Position of Key Elements to Interact with Data. In order to interact with data, certain elements serve as keys to unlock specific information. Hence, these keys must be easily discernible and prominently visible to users. For instance, when working with data visualization accompanied by filters that necessitate parameter adjustments, the key for initiating this transition should be within the user’s visual field.

An illustrative example identified during user testing and heuristic evaluation within this category pertains to the location of the button to clear search filters (“Limpar filtro” in Portuguese). This button is situated in an inconspicuous position, making it more challenging to erase the selected filters and initiate a new search. Furthermore, depending on the size of the user’s computer screen, the button may not be immediately visible until the user scrolls down the page.

Data Presentation Pattern. Another facet of data interaction pertains to the presentation of information. Is there a discernible pattern in data presentation, or is it seemingly random? This category aims to capture this aspect, encompassing the clarity of the information and the comprehensibility of its organisational structure. Additionally, the pattern must be coherent, enhancing understanding and ease of interaction with the data narrative.

The evaluated application allows users to employ filters to locate the desired dam. However, for each filter category, the application conveys that multiple pieces of information can be selected. Nonetheless, in practice, only one item from each filter category can be selected, leading to user frustration when attempting to choose two items within the same parameter.

Operating Error to Achieve the Expectation of Interaction with the Data. Several malfunctioning issues were identified during the evaluations in the context of a recently released system. This category is introduced to address functional errors that can detrimentally impact the quality of data interaction, as the expected behaviour may not be fulfilled within the system, thus hindering the completion of the data interaction process.

It is important to acknowledge that systems generally are not exempt from malfunctions in specific functionalities. Within this application, an issue related to this category was observed concerning the zoom-in or zoom-out functionality of the dashboard. Despite the user’s attempts to increase the font size, no visible changes occurred.

Lack of Clarity in Terms that Explain the Data Presented. Certain domains necessitate the use of technical terminology to describe the presented data. Dam safety, for instance, employs specific terms to denote safety levels and risks associated with dam failure. These terms are commonly familiar to professionals engaged in daily dam safety management activities. However, considering that the system’s target audience encompasses the entire population, it is crucial to acknowledge that they may not be familiar with these terminologies.

In Brazil, dam safety policies incorporate specific terms to indicate the risk level of a potential dam failure, the associated risks in case of a breach, and the completeness of information about a particular dam. While these terms were devised to enhance safety inspection management, they may be unfamiliar to users. Hence, it is imperative to “translate” this information for users, ensuring their comprehension and alleviating concerns arising from potential misinterpretation.

5.5 Relation Between Issues Found in the Approaches and the Categories Proposed

The category with the fewest related issues was “Operating error to achieve the expectation of data interaction” (4 issues), which is expected in a system that has already been released, where malfunctions should ideally be minimized. The second-lowest category was “Position of key elements to interact with data” (6 issues), indicating that users encountered minimal difficulties locating the elements necessary to interact with the data.

The category exhibiting the highest number of related issues was “Visibility about how to interact with data,” with nine associated problems. These nine issues were observed by users in at least 30 instances during the tests, indicating that they were consistently perplexing and caught users’ attention in the majority of the evaluations. The second most prevalent category was “Data presentation pattern” (8 issues with 15 instances), underscoring the difficulties users faced in visualizing how to interact with the data application and the presentation patterns employed. Consequently, it is crucial to contemplate how data can be presented effectively, enabling users to better understand the narrative conveyed by the data.

The comparison between the category with the highest frequency and the category with the second-lowest frequency indicates that specialists perceive users to have a greater ability to navigate and interact with the system than to comprehend the data itself. This observation raises questions regarding potential improvements in data presentation to convey the intended information.

6 Final Considerations

This study aimed to contribute to comprehending data exploration platform interaction within environmental systems focused on dam safety. The evaluation encompassed the usability assessment of a platform featuring information regarding the safety of Brazilian dams, a matter of paramount importance for accident prevention. The study identified categories of usability problems concerning human-data interaction in this context, elucidating the primary issues encountered during the evaluations.

The evaluation involved 18 participants aged between 22 and 45 years who interacted with the dashboard and utilised the filters to access the presented data. Throughout this process, a total of 27 problems were identified. Additionally, a collaborative heuristic evaluation was conducted to compare the results derived from user feedback with those provided by experts in the field of human-computer interaction, leading to the identification of 41 problems.

The outcomes achieved through each method can be associated with the heuristics and subsequently compared. For the heuristics of Victorelli and Reis [21], both methods pointed to heuristic 3 - Immediately provide visual feedback on the interaction, due to the lack of feedback and the confusion caused to users at certain moments of the interaction, and heuristic 6 - Semantically enrich the interaction, focusing on its sub-item 6.1 - Semantically enrich search interaction. As for the heuristics of Nielsen and Molich [15], the user tests mainly highlighted two heuristics: 7 - Flexibility and efficiency of use, and 8 - Aesthetic and minimalist design, while the heuristic evaluation highlighted, in addition to those mentioned, heuristic 6 - Recognition rather than recall.

The categories established to encapsulate the issues encountered in both methods, namely heuristic evaluation and user tests, revealed “Visibility about how to interact with data” and “Data presentation pattern” as the most prominent. These categories shed light on the challenges users face when trying to understand the interaction process for accessing or comprehending data, indicating that current designs lack appropriate patterns. In terms of data presentation, participants expressed in interviews that an optimal approach involves combining various tools, such as dashboards, tables, and texts. The problems identified during the heuristic evaluation exhibited similarities to the issues reported by users, with emphasis on the interaction with dashboard information, which failed to clearly indicate the parameter employed for each presentation.

Thus, it becomes feasible not only to assess the usability of the application itself but also to comprehend that, within this particular context, user tests and heuristic evaluation mutually complement each other in inspecting the quality of a product with an environmental focus that necessitates thorough examination for safety reasons. Moreover, it is also possible to establish a correlation between the encountered problems and the existing heuristics in the literature.

For future studies, the attained results will be utilized by the responsible agency in charge of the system’s development to ensure the continuous enhancement and evolution of the application employed in this study. Subsequent tests can be conducted during the developmental process. Our objective is to deepen the understanding of the interaction issues identified in the evaluations and expand the knowledge in this domain, thereby elucidating the implications for design by scrutinizing other systems pertinent to human-data interaction in the environmental context.