Abstract
Despite the proposed advantages of systems accident analysis (SAA) methods for understanding incident causation, these approaches have not been widely adopted by practitioners. This represents a significant gap between research and practice in accident analysis. The Understanding and Preventing Led Outdoor Accidents Data System (UPLOADS) provides a series of tools to address this gap. The aim of this study was to evaluate the validity of UPLOADS by comparing analyses generated by risk managers with those generated by researchers experienced in SAA. Twenty-three risk managers used UPLOADS to collect and analyze incident data from their organizations over a three-month period. The reports were then analyzed by two researchers experienced in SAA, and the resulting analyses were compared to those generated by participants. Participants identified half the number of factors identified by researchers, and tended to focus on only one or two factors as the causes of each incident. The potential consequences for practitioners’ understanding of incident causation and countermeasure development are discussed, as well as ways of improving the system.
1 Introduction
There is now significant evidence that systems accident analysis (SAA) methods are required to understand the incidents that occur during “led” (i.e. facilitated or instructed) outdoor activities [1–3]. These methods are underpinned by the idea that safety in sociotechnical systems is impacted by decisions and actions made at all levels of that system, not just by human operators working at the so-called ‘sharp-end’. Therefore, accidents are caused by multiple factors that go beyond the immediate context of the incident itself. Accidents and safety are described as emergent properties arising from the interactions of components within a system [4, 5]. Studies which have applied SAA methods (i.e. STAMP and Accimap) to the analysis of both fatal [1, 2] and relatively minor injury-causing incidents [3] have demonstrated that these principles also apply to the led outdoor activity domain. Moreover, SAA methods provide a deeper understanding of how interactions across the system contribute to hazardous conditions and unsafe behaviour during led outdoor activities, compared to other “root cause” analysis methods that have been developed specifically for the domain [1].
Despite the proposed advantages of SAA for understanding incident causation in this domain, this approach has not yet become common practice. This reflects a wider research-practice gap, whereby researchers are applying more advanced methodologies than practitioners [6]. The gap is a significant issue in many safety-critical domains [6]. Researchers have demonstrated the applicability of SAA, and its advantages over non-systemic methods, in a wide range of safety-critical domains, including space exploration [7], aviation [8], rail [9], public health [10], disaster management [11] and road freight transport [12, 13]. In practice, most investigations are still underpinned by linear chain-of-events models [5, 14]. According to Leveson [5], these models oversimplify accident processes, and cannot represent situations where accidents are caused by interactions between components, rather than individual component failures. Underwood and Waterson [6] identified a number of barriers preventing the adoption and use of SAA methods by practitioners, including a lack of awareness, a lack of training opportunities, poor accessibility and communication of information, usability problems, resource constraints, and concerns over the reliability and validity of SAA methods.
The Understanding and Preventing Led Outdoor Accidents Data System (UPLOADS) was designed to address this research-practice gap by providing a series of tools that risk managers can use to collect and analyze led outdoor activity incident data. UPLOADS consists of incident reporting, storage, coding and analysis methods that formalize the application of Rasmussen’s [4] risk management framework (RRMF) and its associated Accimap technique in this domain. A software tool and incident reporting form support the collection of detailed information on the events leading up to, and during, incidents, from the perspective of those involved in the incident (i.e. activity leaders) and those involved in activity planning and organizational management (i.e. managers). Managers then enter these reports into a database within the software tool. After entering each report, managers are prompted to identify and code the causal factors present within each report, and the relationships between them, using the UPLOADS accident analysis method (described below). The software tool then produces diagrammatic representations of the system of factors involved in individual and aggregate incidents (presented in the form of Accimaps). Tables with descriptions of the contributing factors and relationships identified support the diagrams. Video and paper-based training material has also been developed to explain the underpinning theory and support each stage of this process. The system evaluated in this paper was a refined version developed after an initial six-month trial and evaluation of the prototype system [15].
The UPLOADS accident analysis method consists of a framework for representing the system of factors involved in incidents and a taxonomy for populating the framework. In a series of previous studies [2, 3], the RRMF was adapted to describe the “led outdoor activity system” as a hierarchy across multiple levels: equipment, environment and meteorological conditions; decisions and actions of leaders, participants and other actors at the scene of the incident; supervisory and management decisions and actions; activity centre management, planning and budgeting; local area government, schools and parents; regulatory bodies and associations; and government department decisions and actions. The taxonomy consists of two levels of categories. The first level describes the actors (e.g. activity participants, activity leaders, field managers, schools, parents), artefacts (e.g. equipment) and activity context. The second level describes specific contributing factors relating to each of these components. The taxonomy was developed and refined in a series of previous studies [3, 16], and initial testing has shown reasonable levels of inter-rater reliability [17]. The taxonomy is intended to guide both the collection and analysis of incident data, to allow for analysis across multiple incidents, and to help ensure the reliability of the method [2, 6].
As noted, concerns around the reliability and validity of SAA methods are a key factor in their lack of uptake by practitioners. Validity testing is a critical but often overlooked part of human factors methods development and implementation [18]. One aspect of validity testing involves evaluating whether end users are able to generate analyses that are accurate compared to a criterion, such as an expert panel’s analysis [19]. Such testing ensures that, when implemented in practice, the method can be used as intended.
The aim of this study is to evaluate the validity of UPLOADS by comparing analyses generated by risk managers to those generated by researchers experienced in SAA. This study specifically focuses on the types of contributing factors that are identified from incident reports by risk managers as opposed to researchers.
2 Method
2.1 Design
The study was a prospective trial. It involved participants using UPLOADS to collect and analyze incident data within their organization over a three-month period (June to August 2014). The study was approved by the University of the Sunshine Coast Human Ethics Committee.
2.2 Recruitment
Organizations were invited to participate in the trial via outdoor education and recreation peak-body and professional membership association newsletters. Interested organizations were asked to invite a senior staff member in a safety-related role to participate in the study. That person was responsible for entering all incident reports; analyzing and managing the data; and providing training to other staff on reporting incidents. Forty-three organizations volunteered to be involved in the study.
2.3 Materials
UPLOADS includes: paper-based incident report forms; a Software Tool for collecting, coding and analyzing data; an incident severity scale; and video and paper-based training.
Paper-Based Incident Report Forms. The fields in Table 1 were presented in the format of a paper-based incident report form, so that activity leaders (i.e. reporters) could easily report incidents. The paper-based version included an incident severity scale and a taxonomy of potential contributing factors.
Software Tool. The software was developed in FileMaker Pro 12 and Java. The software consisted of: linked databases for collecting data (incidents, staff, clients and participation); a tool for classifying contributing factors and relationships between them as identified in incident reports; a tool for summarizing the contributing factor and relationship data collected; and a tool for exporting deidentified data (e.g. names removed) to send to the research team.
Data Collection Tools. The Incident Database captured the information shown in Table 1. It was structured to record both near misses and incidents associated with adverse outcomes. A “near miss” was defined as a serious error or mishap that had the potential to cause an adverse event but failed to do so because of chance or because it was intercepted. An “adverse outcome” was defined as any injury or illness. Participants were instructed to record any near miss rated as 2 or above on the severity scale (i.e. an incident with the potential to cause moderate injuries or illnesses) and any adverse outcome rated as 1 or above (i.e. one that required localized care (non-evacuation) with short-term effects).
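The inclusion rule above can be sketched as a simple predicate. The function name, labels and threshold encoding below are illustrative assumptions, not part of the actual UPLOADS software:

```python
def should_record(report_type: str, severity: int) -> bool:
    """Return True if a report meets the recording thresholds described above.

    Near misses are recorded at potential severity 2 or above;
    adverse outcomes at actual severity 1 or above.
    """
    if report_type == "near miss":
        return severity >= 2
    if report_type == "adverse outcome":
        return severity >= 1
    raise ValueError(f"unknown report type: {report_type!r}")

print(should_record("near miss", 1))        # below threshold: False
print(should_record("adverse outcome", 1))  # at threshold: True
```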
Procedure for Analyzing Incident Reports. After incident reports had been entered into the software, the program prompted participants to identify and classify the contributing factors, and the relationships between them, from the incident report. When identifying contributing factors, participants were instructed not to speculate beyond the information provided in the incident report. Classifying the contributing factors involved manually entering a description of each contributing factor identified from the report into a list. For each description, participants were then prompted to select a Level 1 code from a drop-down list that best described that factor, and, if possible, a corresponding Level 2 code. Classifying the relationships likewise involved entering a description of each relationship identified in the report into a list. For each description, participants were then prompted to choose the pair of codes (identified when classifying the contributing factors) that best described that relationship. There was no limit on the number of contributing factors or relationships that could be entered.
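The coding records described above can be sketched with a pair of data classes. The field names, code labels and example values are hypothetical, not the actual UPLOADS schema:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ContributingFactor:
    description: str                   # free text taken from the incident report
    level1_code: str                   # actor/artefact/context category
    level2_code: Optional[str] = None  # specific factor, if identifiable

@dataclass
class Relationship:
    description: str
    factor_pair: Tuple[ContributingFactor, ContributingFactor]

# Hypothetical example of coding one report:
factor_a = ContributingFactor("Strong winds on the lake", "Activity environment")
factor_b = ContributingFactor("Leader continued the paddle", "Activity leader",
                              "Decision to continue activity")
link = Relationship("Winds made the leader's route choice unsafe",
                    (factor_a, factor_b))
```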
Once this information had been entered, summary analyses of individual and aggregate incidents could be produced. This involved performing a search to identify a set of incidents based on any field or combination of fields within the database (e.g. all incidents, or all incidents associated with injuries and kayaking). The software then generated a diagrammatic representation of the factors and relationships identified in these reports and summary tables listing each code, the frequency of that code and the descriptions associated with it.
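The search-and-summarize step can be sketched as a frequency tally over coded reports. The report fields and Level 1 code labels below are illustrative assumptions:

```python
from collections import Counter

# Hypothetical coded reports: each carries searchable fields plus the
# Level 1 codes assigned to its contributing factors.
reports = [
    {"activity": "kayaking", "outcome": "injury",
     "codes": ["Activity environment", "Activity leader"]},
    {"activity": "kayaking", "outcome": "injury",
     "codes": ["Activity environment", "Activity participant"]},
    {"activity": "hiking", "outcome": "near miss",
     "codes": ["Equipment"]},
]

def summarize(reports, **criteria):
    """Select reports matching all field criteria, then tally factor codes."""
    selected = [r for r in reports
                if all(r.get(k) == v for k, v in criteria.items())]
    return Counter(code for r in selected for code in r["codes"])

print(summarize(reports, activity="kayaking", outcome="injury"))
# Counter({'Activity environment': 2, 'Activity leader': 1, 'Activity participant': 1})
```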
Training Material. The training material consisted of: a manual explaining the accident causation model underpinning the system and how to collect data about incidents; a manual describing how to use the UPLOADS software; videos demonstrating how to use each component of the software; and a PowerPoint presentation for staff explaining the details required for the incident report forms.
2.4 Procedure
On contacting the research team, organizations and participants were asked to provide written consent to participate in the study. Participants were then sent a link to a demographics questionnaire presented on Survey Monkey. Once completed, participants were sent an email with instructions describing: how to download all the study materials from DropBox; the type of incidents to collect; and dates for submitting data to the research team. In addition, the email invited participants to contact the research team via phone or email if they had any questions or required any help.
Data Analysis. All data collected was merged into a central database. The set of incident reports analyzed by participants was then identified and extracted. These reports contained a list of contributing factors identified by participants, and each of the factors was associated with a Level 1 and Level 2 code from the taxonomy.
To compare the factors identified by participants and researchers, two researchers identified the contributing factors from each report (see Table 1). One researcher was highly experienced in SAA, and one was highly experienced in using UPLOADS and its taxonomy. Once the contributing factors had been identified, the researchers then discussed any discrepancies between the analyses and reached a consensus. Each researcher then classified the factors identified, and again discussed the coding until consensus was reached on all codes selected.
The researchers then classified the contributing factors identified by participants. Again, the coding was discussed until consensus was reached on all codes selected. An aggregate Accimap was then constructed using the UPLOADS accident analysis framework [17], comparing the factors identified by participants with the researchers’ agreed set of contributing factors.
3 Results
3.1 Sample
In total, 23 participants used UPLOADS and sent data to the research team; thirteen of these had analyzed and coded reports. This represents a 53 % response rate for the incident reporting component of the software tool, and a 30 % response rate for the SAA tools.
Of the participants who analyzed and coded reports, 7 were male and 6 were female. Two participants were aged 25 to 34 years; eight were 35 to 44 years; two were 45 to 54 years; and one was 55 to 64 years. All held a management role within their organization, and 9 led outdoor activities as part of their role.
3.2 Overview of Data Collected
Participants analyzed and coded 104 incident reports out of a total of 226 reports. Participants who analyzed reports coded, on average, 92 % of the reports that they themselves had collected (range 36 % to 100 %).
One hundred and twenty-six reports were associated with adverse outcomes, and 14 with near misses. The median rated actual severity of the reports was 1 (range 0 to 5), indicating that the majority of incidents resulted in only minor injuries or illnesses. The median rated potential severity of the reports was 2 (range 1 to 6), indicating that the majority of incidents had the potential for moderate injuries or illnesses.
3.3 Comparison of Contributing Factors Identified from Incident Reports
The median number of factors identified by participants per report was 2 (range 1 to 4). The median number of factors identified by researchers per report was 3 (range 1 to 11). Researchers identified all of the factors that were identified by participants. However, participants identified only half (53 %) of the contributing factors identified by researchers.
Figure 1 shows a comparison between the contributing factors identified from the reports by researchers and participants, classified according to the UPLOADS accident analysis framework. In total, researchers identified 51 types of contributing factors; participants identified 40. The contributing factors that participants had most difficulty identifying were related to the Activity Group and Higher-level management. Factors relating to Higher-level management were present in very few of the reports provided by participants.
The level of agreement between researchers and participants varied considerably according to the type of contributing factor. Agreement was highest for factors relating to “other people in the activity group” (Mean level of agreement = 83 %), “activity participant” (Mean level of agreement = 77 %) and “activity environment” (Mean level of agreement = 65 %). Agreement was lowest for factors relating to “higher-level management” (Mean level of agreement = 21 %), “parents/carers” (Mean level of agreement = 25 %) and the “activity group” (Mean level of agreement = 30 %).
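One way such per-category agreement figures could be computed is as the proportion of researcher-identified factors of each type that participants also identified. The paper does not state its exact formula, so this recall-style measure, and the example codes, are assumptions for illustration:

```python
from collections import defaultdict

def per_category_agreement(researcher, participant):
    """researcher/participant: lists of (report_id, category) factor codes.

    Returns, per category, the share of researcher-identified factors
    that participants also identified.
    """
    counts = defaultdict(lambda: [0, 0])  # category -> [matched, total]
    participant_set = set(participant)
    for item in researcher:
        _, category = item
        counts[category][1] += 1
        if item in participant_set:
            counts[category][0] += 1
    return {cat: matched / total for cat, (matched, total) in counts.items()}

researcher = [(1, "activity participant"), (1, "higher-level management"),
              (2, "activity participant")]
participant = [(1, "activity participant"), (2, "activity participant")]
print(per_category_agreement(researcher, participant))
# {'activity participant': 1.0, 'higher-level management': 0.0}
```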
4 Discussion
The aim of this study was to evaluate the concurrent validity of UPLOADS by comparing analyses generated by practitioners to those generated by researchers experienced in SAA. The findings showed that participants identified only half of the factors identified by researchers from incident reports. Participants tended to focus on only one or two factors as the primary causes of each incident, indicating a lack of systems thinking and general awareness regarding accident causation. Overall, the findings indicate that UPLOADS provides only a partial bridge over the research-practice gap, as the analyses generated by practitioners may provide only a partial picture of the factors contributing to accident causation in led outdoor activities.
The question is whether this level of validity is sufficient to justify the use of UPLOADS within organizations. A “valid” tool achieves the purpose for which it was designed [19]. The purpose of UPLOADS is to support organizations in developing a deeper understanding of accident causation during led outdoor activities, and thereby in identifying more appropriate and effective countermeasures.
The study provides some evidence that UPLOADS, as it is currently used, supports a deeper understanding of accident causation. First, prior to the development of UPLOADS, a survey of Australian outdoor activity providers found that only half had incident databases [20]. Thus, at the very least, organizations that use UPLOADS will benefit by being able to track incident rates and identify trends over time, even if the full causal picture may not be forthcoming. Second, other incident reporting systems for this domain either do not support the identification of contributing factors [20], or include taxonomies that are limited to factors relating to activity leaders, participants, equipment and the environment [3]. The findings show that participants identified higher-level factors in at least some reports. Therefore, while they may not have a complete picture of accident causation in their organization, they potentially have an enhanced understanding compared to not using an incident database at all, or using any of the other incident reporting systems that have been developed for the domain. Moreover, it is likely that continued use of UPLOADS and exposure to its SAA method and taxonomy will enable practitioners to develop a deeper understanding of the role of systemic factors in accident causation, which in turn should lead to improvements in the data.
One concern is that the current outputs may not be sufficient to support the development of more appropriate and effective countermeasures. From a systems perspective, appropriate and effective countermeasures need to target the higher-level factors that contribute to hazardous conditions and unsafe behaviour [4]. The findings indicate that, based on their own analyses, participants are likely to focus on only one or two factors, such as hazardous conditions and unsafe behaviours by participants and activity leaders. Thus, risk managers are unlikely to be able to identify the systemic network of factors that gives rise to these conditions and behaviours. This suggests that the way managers currently use UPLOADS is insufficient to support the identification of more appropriate and effective countermeasures. One approach that has proven useful in other areas is to use expert panels to analyze the data provided by practitioners and subsequently identify countermeasures; in this case, however, the amount of data makes that approach prohibitive. Rather, it may be that practitioners need more training in SAA to help them identify multiple factors from reports. This could potentially involve the development of interactive training videos that explain the approach, as at present the theoretical underpinnings are only explained in written documentation.
However, it should also be noted that in the current study, the period of exposure to UPLOADS and SAA concepts was very short (three months). It has taken practitioners in domains such as aviation many years to become proficient systems thinkers. This suggests that different results may be obtained once UPLOADS has been in widespread use across the sector for a number of years.
Finally, the limitations of the study and directions for future research should be considered. First, a key aspect of SAA is the identification of relationships between factors; this aspect of UPLOADS was not considered in this study. Evaluating it may shed more light on the countermeasures that practitioners using the tool are likely to produce. Second, participants with few incidents to report had less opportunity to interact with UPLOADS than those with more incidents. While this was a consequence of the naturalistic study design, it would be useful to evaluate the validity of UPLOADS under conditions where all participants analyze the same incident reports. Potentially, this would allow insights into how technical aspects of the tool could be improved. Third, training was self-directed (i.e. participants could choose which of the training materials they viewed). This may partially explain why so few organizations used the coding and analysis tools. While face-to-face training would undoubtedly have produced better results, the training approach was guided by cost considerations, as costs need to be kept as low as possible to support Australia-wide implementation.
In conclusion, it appears that implementation of the UPLOADS system has slightly narrowed the gap between accident analysis research and practice in the outdoor activity domain. Practitioners now have the opportunity to report and analyze accidents in line with systems thinking, applying state-of-the-art SAA methods. However, the findings show that practitioners’ understanding of accident causation in this domain remains limited. More work is required that targets both the usability of the tool and practitioners’ understanding of SAA. This work needs to be undertaken in tandem to ensure that use of the tool ultimately results in the identification of more appropriate and effective countermeasures to enhance accident prevention in this domain.
References
1. Salmon, P., Williamson, A., Lenné, M., Mitsopoulos-Rubens, E., Rudin-Brown, C.M.: Systems-based accident analysis in the led outdoor activity domain: application and evaluation of a risk management framework. Ergonomics 53(8), 927–939 (2010)
2. Salmon, P.M., Cornelissen, M., Trotter, M.J.: Systems-based accident analysis methods: a comparison of Accimap, HFACS, and STAMP. Saf. Sci. 50(4), 1158–1170 (2012)
3. Salmon, P.M., Goode, N., Lenné, M.G., Finch, C.F., Cassell, E.: Injury causation in the great outdoors: a systems analysis of led outdoor activity injury incidents. Accid. Anal. Prev. 63, 111–120 (2014)
4. Rasmussen, J.: Risk management in a dynamic society: a modelling problem. Saf. Sci. 27(2/3), 183–213 (1997)
5. Leveson, N.G.: Applying systems thinking to analyze and learn from events. Saf. Sci. 49(1), 55–64 (2011)
6. Underwood, P., Waterson, P.: Systemic accident analysis: examining the gap between research and practice. Accid. Anal. Prev. 55, 154–164 (2013)
7. Johnson, C.W., de Almeida, I.M.: Extending the borders of accident investigation: applying novel analysis techniques to the loss of the Brazilian space launch vehicle VLS-1 V03. Saf. Sci. 46(1), 38–53 (2008)
8. Branford, K.: Seeing the big picture of mishaps: applying the AcciMap approach to analyze system accidents. Aviat. Psychol. Appl. Hum. Factors 1(1), 31–37 (2011)
9. Underwood, P., Waterson, P.: Systems thinking, the Swiss Cheese Model and accident analysis: a comparative systemic analysis of the Grayrigg train derailment using the ATSB, AcciMap and STAMP models. Accid. Anal. Prev. 68, 75–94 (2014)
10. Cassano-Piche, A.L., Vicente, K.J., Jamieson, G.A.: A test of Rasmussen’s risk management framework in the food safety domain: BSE in the UK. Theor. Issues Ergon. Sci. 10(4), 283–304 (2009)
11. Salmon, P.M., Goode, N., Archer, F., Spencer, C., McArdle, D., McClure, R.J.: A systems approach to examining disaster response: using Accimap to describe the factors influencing bushfire response. Saf. Sci. 70, 114–122 (2014)
12. Newnam, S., Goode, N.: Do not blame the driver: a systems analysis of the causes of road freight crashes. Accid. Anal. Prev. 76, 141–151 (2015)
13. Salmon, P., Read, G., Stanton, N.A., Lenné, M.: The crash at Kerang: investigating systemic and psychological factors leading to unintentional non-compliance at rail level crossings. Accid. Anal. Prev. 50, 1278–1288 (2013)
14. Lundberg, J., Rollenhagen, C., Hollnagel, E.: What-You-Look-For-Is-What-You-Find – the consequences of underlying accident models in eight accident investigation manuals. Saf. Sci. 47(10), 1297–1311 (2009)
15. Grant, E., Goode, N., Salmon, P., Lenné, M., Finch, C., Scott-Parker, B.: How do I save it? Usability evaluation of a systems theory-based incident reporting software prototype by novice end users. In: HCI International 2015, Los Angeles, USA (2015)
16. Goode, N., Salmon, P.M., Lenné, M., Finch, C.F.: A test of a systems theory-based incident coding taxonomy for risk managers. In: 5th International Conference on Applied Human Factors and Ergonomics. The Printing House Inc, Krakow, Poland (2014)
17. Taylor, N.Z., Goode, N., Salmon, P.M., Lenné, M., Finch, C.: Which code is it? Inter-rater reliability of a systems theory-based causal factor taxonomy for the outdoor sector. In: 19th Triennial Congress of the International Ergonomics Association, Melbourne, Australia (2015)
18. Stanton, N.A., Young, M.S.: What price ergonomics? Nature 399(6733), 197–198 (1999)
19. Stanton, N.A., Young, M.S.: Giving ergonomics away? The application of ergonomics methods by novices. Appl. Ergon. 34(5), 479–490 (2003)
20. Goode, N., Finch, C., Cassell, E., Lenné, M.G., Salmon, P.M.: What would you like? Identifying the required characteristics of an industry-wide incident reporting and learning system for the led outdoor activity sector. Aust. J. Outdoor Educ. 17(2), 2–15 (2014)
Acknowledgements
This project was supported by funding from the Australian Research Council (ARC) in partnership with the Australian Camps Association, Outdoor Educators’ Association of South Australia, United Church Camping, Outdoors Victoria, Outdoor Council of Australia, Recreation South Australia, Outdoor Recreation Industry Council, Outdoors WA, YMCA Victoria, The Outdoor Education Group, Girl Guides Australia, Queensland Outdoor Recreation Federation, Christian Venues Association, Parks Victoria, Victoria Department of Planning and Community Development, Outdoor Education Australia and the Department of National Parks, Recreation, Sport and Racing Australia. Caroline Finch was supported by a NHMRC Principal Research Fellowship (ID: 1058737). The Australian Centre for Research into Injury in Sport and its Prevention (ACRISP) is one of the International Research Centres for Prevention of Injury and Protection of Athlete Health supported by the International Olympic Committee (IOC). Paul Salmon’s contribution to this article was funded through his current Australian Research Council Future Fellowship (FT140100681).
© 2015 Springer International Publishing Switzerland
Goode, N., Salmon, P.M., Taylor, N.Z., Lenné, M.G., Finch, C.F. (2015). Bridging the Research-Practice Gap: Validity of a Software Tool Designed to Support Systemic Accident Analysis by Risk Managers. In: Harris, D. (eds) Engineering Psychology and Cognitive Ergonomics. EPCE 2015. Lecture Notes in Computer Science(), vol 9174. Springer, Cham. https://doi.org/10.1007/978-3-319-20373-7_20