Abstract
This paper proposes a new approach to universal access based on the premise that humans have the universal capacity to engage emotionally with a story, whatever their ability. Our approach is to present the “story” of museum resources and knowledge as a journey, and then represent this journey physically as a smart map. The key research question is to assess the extent to which our “story to journey to smart map” (SJSM) approach provides emotional engagement as part of the museum experience. This approach is applied through the creation of a smart map for blind and partially sighted (BPS) visitors. Made in partnership with Titanic Belfast, a world-leading tourist attraction, the interactive map tells the story of Titanic’s maiden voyage. The smart map uses low-cost technologies such as laser-cut map features and software-controlled multi-function buttons for the audio description (AD). The AD is enhanced with background effects, dramatized personal stories and the ship’s last messages. The results of a reception study show that the approach enabled BPS participants to experience significant emotional engagement with museum resources. The smart map also gave BPS users a level of control over the AD, providing a greater sense of empowerment and independence that is particularly important for BPS visitors with varying sight conditions. We conclude that our SJSM approach has considerable potential as an approach to universal access, and as a means of increasing emotional engagement with museum collections. We also propose several developments which could further extend the approach and its implementation.
1 Introduction
In recent decades, museums have gone through a major transformation. Their role has shifted from collection-centred to user-centred institutions [1]. As Sylaiou and Dafiotis point out: “museums, since the advent of New Museology in the second half of the twentieth century, seek to adapt to a wider call for audiences’ empowerment, involvement and acknowledgment of visitors’ individuality” [2]. In other words, visitor experience is at the heart of the museum vision. Museums are seeking to create an interactive, engaging and participatory experience for their visitors, generating memories that will endure long after the visit.
Equal access to cultural venues such as museums, heritage centres and art galleries for people with disabilities is a human right, declared and addressed at international and national levels. For example, the UN Convention on the Rights of Persons with Disabilities (CRPD) formally recognises that accessibility is a human right and “a vital precondition for persons with disabilities to participate fully and equally in society and enjoy effectively all their human rights and fundamental freedoms” [3]. In line with human rights, museums are endeavouring to ensure equal access for visitors with varying abilities. Yet, because museums have historically been heavily invested in providing largely visual experiences, blind and partially sighted (BPS) visitors are still a significant, marginalized group in terms of the quality of their visitor experience, specifically, in our experience, in the failure to generate emotional engagement [4]. Moreover, according to a website audit by VocalEyes [5], audio described guides were mentioned by only 3% of museums and handling/tactile objects were referenced by only 11% of museums across the UK. This paper focuses on accessibility for BPS visitors in museums, but we also believe our approach could be beneficially applied to a broader group of visitors.
The design, implementation and evaluation of media accessibility (MA) have changed significantly in the past few years. Greco identifies three major changes: “a shift from particularist accounts to a universalist account of accessibility, from maker-centred to user-centred approaches, and from reactive to proactive models” [6]. Greco further explains that “according to the universalist account, MA concerns access to media and non-media objects, services and environments through media solutions, for any person who cannot or would not be able to, either partially or completely, access them in their original form” [6]. Our research contributes to each of Greco’s three shifts as follows: (1) we propose a new approach to universal access based on the premise that humans have the universal capacity to engage emotionally with a story, whatever their ability; (2) a user-centred approach was employed, focusing on users at each stage of the research; and (3) the interactive and user-driven way of controlling the AD positions the user as an active participant rather than a passive recipient.
In museums, access facilities for BPS visitors, such as AD guides, are usually provided as add-ons. Moreover, AD has traditionally been about information transmission: it is defined as a verbal description of visual components and has traditionally advocated objectivity. With museums increasingly switching from being collection-centred to visitor-centred institutions, visitors look for delightful, provoking [7] and memorable experiences [8]. Wang et al. [4] and Hutchinson and Eardley [9] thus argue that museum AD should enable visitors to access an independent, inclusive and emotionally engaging experience, providing inclusiveness and equal access in a broad sense.
Therefore, in this paper, we put forward a new approach, what we call the “story to journey to smart map” (SJSM) approach, to universal access in museum contexts. The approach first repurposes museum knowledge into a story (or stories), then uses the spatial and temporal dimensions of a journey to curate those stories through the medium of an interactive smart map. In our case study, we present a smart map of Titanic’s maiden voyage as a vehicle for telling the story of life on board the ship on its fateful journey. The map not only conveys to visitors information about the voyage, but enables them to embark on their own narrative journey as they explore the stories in an accessible and emotionally engaging way.
This research is part of a larger project, in which we are collaborating with Titanic Belfast and the Royal National Institute of Blind People (RNIB) to develop the use of new technologies to improve accessibility and visitor experience, and to evaluate their effectiveness. Titanic Belfast is recognized internationally as a global tourist attraction; it was awarded World’s Leading Tourist Attraction in 2016, and is already committed to accessibility standards, such as free-of-charge AD, handrail extensions and contrasting floor textures.
Titanic Belfast is an ideal venue for testing our proposed approach. Although Titanic Belfast has very strong museological qualities in terms of exhibition, it is not just about information: it offers a human and emotional experience, rooted in one of the great traumas of twentieth-century history. The venue and the experience it provides are especially poignant in the context of Belfast, where the ship was designed and built. Titanic Belfast, located on the very site where the Titanic was built and launched, opens up the potential for deep emotional engagement [4].
This paper is structured as follows: the next short section presents our hypothesis and the theoretical basis of our approach to universal access. Section 3 presents the challenges of converting 2D images into smart replicas and the benefits of using a multisensory approach in museums. Section 4 gives an overview of the current display in Titanic Belfast regarding the Titanic’s maiden voyage. Then, in Sect. 5 we present our smart map of the maiden voyage and, in Sect. 6, we reflect on some key design principles which we developed in producing the smart map. The initial reception study is described in Sect. 7, and we end by drawing some broader conclusions, particularly in relation to how our approach can be used in terms of universal access.
2 Storytelling and journeys as the basis of universal access
In Article 2 of the UNCRPD, universal design is defined as “the design of products, environments, programmes and services to be usable by all people, to the greatest extent possible, without the need for adaptation or specialized design. ‘Universal design’ shall not exclude assistive devices for particular groups of persons with disabilities where this is needed” [10]. However, we realise the challenge in providing one solution that is adequate for all. Given the technology revolution, Jenkins argues that such new technologies have “enabled the same content to flow through many different channels and assume many different forms at the point of reception” [11]. In other words, instead of one size fits all, storytelling can be addressed in various forms, and technologies can be customized easily to provide a personalized experience; but in a broad sense, everyone can access an experience on offer without compromising others’ experience. This is specifically relevant to the proposed universal access approach and places user-centred design at the heart of the universal access approach. As Neves argues, “this will mean foreseeing all possible profiles of potential visitors and creating ideal conditions to make them feel welcome, safe, comfortable and, above all, to make them feel that there is something there to enjoy, learn and do that has been created ‘especially for me’” [12].
Stories have always been recognized as an effective way of enhancing and engendering emotional engagement. However, following a story in a visitor attraction presents particular challenges for BPS visitors. The result can be that BPS visitors miss out on the emotional experience. For example, in an earlier study we carried out into accessibility in Titanic Belfast [4], we noted that one of the BPS participants observed other visitors responding to exhibits with emotions which she herself did not experience. After her first visit to Titanic Belfast, referring to the part of the exhibition that presents the sinking of the ship, she said: “Some of my family members were crying, but I was wondering why they cried”. This example illustrates the depth of emotional engagement on the part of at least some sighted visitors, the quality of experience from which BPS visitors might all too easily be excluded. Our hypothesis is that, since the ability to emotionally engage with narrative seems to be universal, the proposed “story to journey to smart map” approach has the potential to enhance accessibility and visitor experience for visitors of all abilities.
2.1 A “story to journey to smart map” approach to universal access
Storytelling is an approach for enhancing learning, generating emotional engagement, and inspiring imagination; it is also a way to connect to one’s personal and collective experiences. Stories are powerful tools for social inclusion and enhanced visitor experience, particularly in museum settings.
Storytelling allows the democratization of knowledge and promotes […] social integration, since the visitors can learn about their own heritage and reinforce the sense of belonging to a community and at the same time, learn about and understand cultural diversity and become acquainted to other ways of thinking [2].
The storytelling approach has been adopted widely in the context of museums, such as the Anne Frank House in Amsterdam, the BAC Moving Museum in London, and the Museum of Broken Relationships in Zagreb. It is key to the layout of Titanic Belfast, where visitors are guided through the narrative of Titanic, from its design and construction through its traumatic loss and, decades later, its rediscovery. Moreover, storytelling is such a universal and ubiquitous practice that it has, arguably, the potential to pave a universal way for museums to attract and accommodate a wide range of visitors. Storytelling “takes place within the brain all the time simply because the brain is constructed to think in terms of narratives and to relate to experiences and conversations by constructing stories” [13,14,15]. Because the capacity to follow and engage with stories is a universal cognitive ability shared by everyone, whatever their degree of ability, we adopt storytelling as the first stage of our approach to universal access. That is, our first step is to reconfigure the museum’s resources into a story.
Next, we bring the concept of journeys to storytelling. The journey could be a physical journey through space, or it may be that the more abstract flow of a story can be represented and visualized more tangibly and creatively as a journey through time, in the development of relationships between people, for example, or between communities, or between people and places. It is interesting that sea journeys in particular have a long history of appealing to people of all cultures down through the centuries (e.g. Homer’s Odyssey) or for communicating a particular view of history and heritage (e.g. Virgil’s Aeneid). Part of their appeal is that people can often relate the journey to their own stories of life’s experience and challenges. As Mieke Bal puts it, “a traveller in narrative is in a sense always an allegory of the travel that narrative is” [16]. Journeys can also exploit the fact that there can be multiple routes between the same starting point and end point, which can reflect different societal interpretations of the same facts and events depending on the teller’s/traveller’s perspective.
Once we have organized the story as a journey, we then propose the creation of a smart map to represent the journey. This gives scope for engaging the tactile and auditory senses of BPS visitors. A corresponding visual display would make the smart map more universal. The smart map also provides the basis for controlling and interacting with the AD experience. Our proposed SJSM approach to universal access can be visualized as shown in Fig. 1.
Creating our smart map required an interdisciplinary approach that draws on translation studies, disability studies, museum studies, electronics, and computer science. Translation is conceived here as a mode centrally concerned with inclusivity which, in its responsiveness to the legislative framework for enhanced accessibility, e.g., the UN Convention on the Rights of Persons with Disabilities (CRPD) 2014, European Disability Strategy 2010–2020, European Accessibility Act 2015, Equality Act 2010 (Great Britain) and Disability Discrimination Act 1995 (Northern Ireland), is increasingly turning to sophisticated technological solutions. In that regard, “ICT permeate cultural life, not only by introducing new forms of creative expression and meanings for art, but also by enriching, transforming and enhancing the museum experience” [2]. In this paper, Sect. 5 describes how ICT enables us to convert the visual, two-dimensional (2D) map into a tactile, auditory, three-dimensional (3D) smart map.
We apply the proposed SJSM approach to the maiden voyage of Titanic. In many ways, this is an ideal context, because of the nature of the story, the significance of that fateful journey, and the privileged local access we have to such a sophisticated and popular visitor attraction, whose historical relevance and emotional charge are both enhanced, as we have noted, by the fact of its location on the very site where the ship was built and launched.
3 Smart replicas and multisensory museum accessibility
A smart replica is an interactive replica of an original artefact. In museums and art galleries, “replica” usually refers to a duplicate of an original object, and “smart replica” is a term used to refer to “replicas of objects that are either part of a collection or exhibit or related to it [that] are built to embed digital components to allow for their use as part of an interactive experience” [17]. A smart map is a type of smart replica, inviting BPS visitors to participate and explore meaning in museum galleries and exhibitions both physically and cognitively. By actively participating in the exploration of meaning, and “in full control of the object and its content”, an emotional and personalized experience can be enjoyed by the visitors [18]. To explore meaning via a smart replica is like enjoying the materiality of a book, recalling how “books were, and are, held, carried, opened, thumbed, fingered, and stroked” [19]. The exploration of meaning is not only carried out through the traditional five senses, but is also constructed through the motion and movement of the body in space, including encounters with strangers and the surrounding environment, together with the senses of balance, time and direction, since “cognition is not solely a process of the mind, but rather, of the interplay between our minds, bodies, and the environment” [20].
3.1 Smart maps and the challenges of design
Tactile graphs, raised-line graphs, and braille maps are commonly used among BPS people to aid teaching and wayfinding [21]. Ungar et al. point out that “tactile maps have long been recognized by professionals involved in the education of visually impaired children and in the rehabilitation of recently blinded adults, as a useful tool in mobility training and, perhaps to a lesser extent, as a day to day wayfinding aid” [22]. In addition to improving learning experiences, the use of interactive tactile maps has been proven to enhance BPS people’s ability to orient themselves and navigate through space [21, 23, 24].
More recently, multisensory interactive storytelling maps have been designed using online multimedia to provide access to specific spatial features and stories for sighted people. This type of map is usually database-driven and typically only available in electronic versions. Such a multisensory storytelling map can serve as a creative tool for information communication, and can be employed in various contexts, such as museums, art galleries, heritage centres, outdoor sites and educational settings. For example, a web-based story map was created for people to explore the Methana Peninsula in Greece, renowned for its “specific volcanic geoforms, unique flora and rich history” [25]. The aim of this map is to enable various users to learn about the distinguishing characteristics and particular aspects of the Methana Peninsula through interacting with multimodal materials, maps, texts and images [25]. Another example is a web-based storytelling map of the Upper Mara Valley (Maramureș, Romania), which is famous for its volcanic relief and folk tales. The map was designed for young people to enhance their outdoor activities [26].
These examples illustrate the feasibility of using maps as an interactive and communicative tool for storytelling and as a way of offering users a multisensory experience. The tool offers opportunities to translate information into knowledge and into powerful and moving stories in a multimedial way, by combining the maps with photographs, audio, and videos [25]. Although web-based story maps rely mostly on visual access, the concept of the story map also shows that maps can employ various sensory modalities to transfer information. Therefore, tactile maps combined with audio such as AD, music and background sounds have the potential to be effective communicative tools for BPS visitors. When tactile maps are given interactive features, they begin to serve more purposes: BPS users can not only learn historical and physical facts, but also have emotional experiences, since they can access much more information through audio than through the limited tactile information conveyed by the map itself and/or braille (which is not accessible to all BPS visitors). Compared to 3D objects, 2D objects, such as maps, paintings and photographs, still present challenges in terms of providing access for BPS visitors [27]. Sighted visitors obtain information from 2D objects by looking at and reading them; they see words, images, shapes, colours, textures, fonts, and combinations of these and other properties. However, little to none of this visual information is available to BPS visitors. Therefore, there are challenges in making the visual information of 2D objects accessible to people without, or with limited, vision. For example, how do we bring a multisensory approach (e.g. sound and touch) to static 2D objects that are normally silent and not to be touched? A further challenge is how the visitor experience (e.g. the interactive and participatory experience) can be enhanced in this interpretive process [28].
A multisensory approach has been put forward to overcome the challenges of turning the 2D (visual) image into a 3D (tactile) object and/or turning the 2D (visual) image into audio information by using AD in various ways. For example, 3D PhotoWorks, a US commercial company, makes 2D paintings and photos accessible to BPS people by converting the images into 3D replicas, many of which have touch sensors embedded to activate audio commentaries. In another example, Neves proposes “soundpainting” as a form of artistic transcreation that aims “to convey explicit and implicit visual messages through non-visual forms” [29]. Soundpainting is an approach that explores the use of multiple sounds, as Neves explains: “carefully chosen words and a careful direction of the voice talent to guarantee adequate tone of voice, rhythm and speech modulation can all work together with specific sound effects and music to provide “stories” and emotions that a particular art may offer” [29]. Soundpainting can inspire creative AD design to offer BPS users quality experiences. Producing a smart replica of a 2D object integrates both of these solutions: touch provides one experience and soundpainting offers another for BPS visitors. With the integration of these two solutions, a smart replica of a 2D map, or a smart map, can communicate to BPS visitors the stories and emotions on offer.
Previous studies have highlighted both the advantages of tactile technologies for providing interactivity and their difficulties. Interactivity relies on technologies such as tactile buttons, conductive painting or capacitive technologies [30, 31] that provide feedback in response to a user’s input [32]. Giraud et al.’s study [30], which compared raised-line maps (RLMs) and interactive small-scale models (SSMs), found that BPS people enjoyed the interactive feature of the map, whereby the audio was triggered when they touched the maps’ conductive markers. The BPS participants in this study pointed out that “interactivity was appreciated because the back and forth movements between the map and the braille legend were not necessary” [30]. However, Giraud et al. also discovered that the touch sensors were too sensitive. Researchers have pointed out that avoiding short impact-related gestures (e.g. tap) is a significant factor in deciding the feasibility of the map [33]. Because of the natural two-hand exploration, if a single tap is used to create interactivity, it is more likely that BPS users will trigger simultaneous audio commentaries unintentionally, and they are unlikely to know which finger caused the sound outputs [34]. Thus, double-tap [35] and multiple-tap interactions [36, 37] have been shown to be more feasible.
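The double-tap interaction favoured by these studies can be sketched as a simple per-sensor filter. The sketch below is purely illustrative and is not a description of any of the cited systems’ firmware; the 0.4-second window and the sensor identifiers are our assumptions.

```python
import time

class TapFilter:
    """Fires only on a double tap on the same sensor, so that the
    incidental single contacts made during two-handed tactile
    exploration do not trigger spurious audio commentaries."""

    def __init__(self, window=0.4):  # window (seconds) is an assumed threshold
        self.window = window
        self._last_tap = {}  # sensor id -> time of the previous, unpaired tap

    def on_tap(self, sensor_id, now=None):
        """Return True when this tap completes a double tap on sensor_id."""
        now = time.monotonic() if now is None else now
        last = self._last_tap.get(sensor_id)
        if last is not None and (now - last) <= self.window:
            del self._last_tap[sensor_id]  # consume the pair
            return True
        self._last_tap[sensor_id] = now    # first tap: wait for a second one
        return False
```

A single tap merely arms the sensor; only a second tap within the window would play a clip, and taps landing on different sensors (for example, two exploring fingers) do not combine into a false trigger.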
3.2 Benefits of using a multisensory approach in museums
There are various benefits in using multisensory modalities. Multisensory techniques involving more than two sensory modalities have been found to be better solutions for helping students remember information [38]. The success of employing multisensory techniques in learning mathematics, languages and reading has been pointed out by education researchers and practitioners alike [38,39,40,41,42]. Researchers also argue that such multisensory methods are particularly effective for learners with disabilities [43,44,45,46].
According to the research cited above, encouraging visitors to use a range of sensory modalities to explore exhibitions enhances their understanding and memory. Wilson points out that this process can “facilitate the generation of memorable experiences” [47]. Other experts further support this argument and point out that “multisensory experience, based on imagery or perceptual experience, combined with semantic or factual information, would enhance memorability” [48].
Lacey and Sathian highlight the interaction between visual imagery and haptic perception and further explain the benefits to visitors’ understanding and memory of touching exhibits from a neuroscience perspective [46]. Based on their findings, they suggest that “spatial metric information is preserved in representations derived from both vision and touch, and that both modalities rely on similar, if not identical, imagery processes” [46]. By further outlining a preliminary model of visual imagery in haptic shape perception and representation, Lacey and Sathian recommend that visitors should be allowed to handle objects and conclude that “doing so may lead to more elaborative processing, thus enabling better understanding and improved recall of the museum experience and its intellectual content” [46]. Furthermore, through physical engagement, aesthetic pleasure and a deeper appreciation can be stimulated by touching a beautiful object [48,49,50,51,52].
The evidence shows how sensory perception works to compensate for the lack of sight. Sensory compensation is defined as “the functional reorganization of systems and subsystems recruited to perform a disrupted function” [53]. The cerebral reorganization of BPS people demonstrates that “a heightened sensitivity to sound, together with non-visual senses, [is] thought to be the main method by which they compensate for lack of sight” [54]. In addition, a large number of studies show that people who are blind acquire superior tactile abilities (e.g. tactile discrimination) [55] and superior auditory skills [56]. Therefore, it can be argued that an approach using multisensory modalities can bring many benefits for BPS visitors to access visual arts in museums.
4 Design context: the Maiden Voyage Gallery in Titanic Belfast
For the purposes of this paper, we chose the Maiden Voyage Gallery as a specific gallery context for the story, journey and corresponding smart map design. In the Maiden Voyage Gallery, complex narratives, including historical facts, personal stories and passengers’ experiences on board Titanic, are curated and displayed. Almost all the artefacts exhibited in this gallery are either behind glass or in various 2D forms, including photographs, letters, a first-class menu, etc. Figure 2 shows part of the picture wall in the Maiden Voyage Gallery. It is a typical example of the use of 2D images in Titanic Belfast: it shows black and white photographs taken at Southampton while Titanic was docked in the port, alongside explanatory texts describing what was happening on and around Titanic at the time. Most of these photographs were taken by a young passenger, Frank Browne, whose story is told in the smart map.
A large number of digital technology-based exhibits are also displayed in the Maiden Voyage Gallery. Digitized 2D moving visuals include: a large screen onto which sea waves are projected to give the illusion of standing on the first-class promenade deck; a video image of a chef projected behind a table set with fine china, simulating a dining experience in Titanic’s Veranda Café; and an animation clip that presents visitors with information regarding the different passengers who boarded and disembarked from Titanic at each stop and the amount and type of goods and supplies the ship carried. Almost all of the information and artefacts displayed in the Maiden Voyage Gallery can only be accessed by visual perception.
4.1 Titanic’s Maiden Voyage map
The focus of the smart map design is on employing haptic and auditory modalities to communicate multi-layered information for BPS visitors while providing access to a similar or equivalent experience to that enjoyed by sighted visitors in the Maiden Voyage Gallery. Our interactive map of Titanic’s maiden voyage aims to involve visitors in a historical and emotional journey. The map is designed to tell the story of the maiden voyage and to take visitors on their own journey.
A huge 2D map of Titanic’s maiden voyage is displayed at the entrance of the Maiden Voyage Gallery in Titanic Belfast. In fact, this map serves as a specific narrative context in several galleries, and it can be found in various 2D art forms across the whole venue. For example, in the Maiden Voyage Gallery, an animated version of the map in an interactive screen stand enables visitors to follow the route to each stop and learn assorted factual information, such as how many people got on and off the ship at each port. The map can also be found in the interactive screen stands in the Aftermath Gallery, where it is used to show the databases of passengers at each stop, and in the Titanic Beneath Gallery, where it is used to show the oceanographical information of Titanic’s route. It can also be found outside the venue, engraved into the external Plaza, where it can be seen from the Launch Gallery window.
This map serves as a link between the stories curated and displayed in these different galleries and locations. The text provided visually on the original map reads:
After successfully completing her sea trial in Belfast Lough, Titanic set sail for Southampton shortly after 8 pm on 2 April 1912. On her delivery voyage through the Irish Sea and English Channel to Southampton Titanic recorded a speed of 23 ¼ knots (approx. 26 ¾ miles per hour), the highest speed she would ever attain.
After the delivery voyage, Titanic was in Southampton for 7 days, where the first passengers embarked. Titanic then called into the French port of Cherbourg on 10 April and departed for Queenstown (now Cobh), Ireland, on the same day. From Queenstown, the ship set out for New York, although, as we know, it never arrived. Figure 3 shows the 2D map Titanic Belfast uses to depict this first and last voyage.
The dotted lines indicate Titanic’s route, and the map highlights and labels the different ports (Belfast, Southampton, Cherbourg and Queenstown) using contrasting colours. These stopping points provide the basis for information and exhibits, including photographs and databases of passenger embarkation details. For visitors unable to see the map and access this information, most of the exhibit is lost. The discolouration around each port also suggests that maps are, in fact, ideal 2D objects to make tactile, because even sighted people sometimes like to touch and explore them in a tactile way (if they are allowed).
The map shows visitors the factual information about the voyage, but also inspires us to imagine the opportunities for passengers to board and disembark from this ship; some passengers were lucky to get off and some were unlucky. Therefore, the design aim of our interactive version of this map is to link, across time and space, historical and physical facts with personal stories of Titanic. This interactive, tactile smart map aims to establish an emotional connection between the visitors of today and the passengers of the past.
4.2 AD creation
The content of the AD was collected from information and exhibits in the Maiden Voyage Gallery, including the information panels, the video of Titanic’s maiden voyage route, the existing AD guide, the multimedia guide for sighted visitors, and artefacts such as the first-class menu, letters written by passengers, and a set of black and white photographs of life on board Titanic displayed on the wall of the gallery (see examples in Sect. 4.1). When creating the AD, we followed several guidelines, such as Audio Description: Lifelong Access for the Blind (ADLAB) [57]. As the AD we created was a descriptive guide, the guidelines and hints for designing descriptive guides provided by ADLAB are useful to follow. The guidelines highlight the features of a description guide which are distinguished from other types of AD. That is, as Neves states, the information given in a descriptive guide is the narrative itself with an emphasis on “how” and “what” to say “about what” [57]. In other words, selection and interpretation are important in creating the audio descriptive guide.
The AD design process was creative and not simply a matter of transcribing existing material. The selection of information had to take into account narrative construction and the desired emotional impact of each stage of the journey. The AD was produced in a multisensory way and was combined with various sound effects, such as background music, different accents and voices. The background sound of seagulls, wind and sea was recorded on site at Titanic’s original slipway and embedded into the AD clip at the point where the ship began its journey from Belfast. We used traditional French music and Irish music as the ship voyaged towards Cherbourg and Queenstown. The Navy Hymn—‘For Those In Peril On The Sea’—and the sound effect of Morse code were used when the ship struck the iceberg. For the personal stories, various accents and voices were used: an Irish man with a local accent read a letter written by Dr. John Edward Simpson, who was Titanic’s Assistant Surgeon. Dr. Simpson was born in Belfast and graduated in Medicine from Queen’s University Belfast. His letter to his mother Elizabeth from on board Titanic was posted in Queenstown. He did not survive the sinking.
In total, there are eighteen (18) AD files: two (2) introductory files, covering the smart map itself (how to use the tactile buttons and tactile voyage routes) and a short background introduction to Titanic’s voyage route. The other sixteen (16) AD files give users brief factual information about each stop and personal passenger stories (such as Dr. Simpson’s letter), arranged chronologically and geographically.
5 A smart map for the story of Titanic’s Maiden Voyage
The smart map portraying the story of Titanic’s journey on its maiden voyage is based on the 2D map in Titanic Belfast. It consists of a raised map of (most of) the British Isles, part of northern France with Cherbourg, and the start of the Atlantic Ocean. So that the buttons in the sea can be fixed into place, the map has three layers: the land layer (made from wood), the sea layer (made from thin, dark blue acrylic plastic sheeting), and a sturdy unseen base layer (400 mm × 600 mm) for anchoring the buttons (plywood) and supporting the map (see Fig. 4).
We explored three ways to represent the voyage routes on the sea layer: physical wires, tactile stickers, and engraved lines. Because physical wires and tactile stickers are more prone to damage, we chose engraved lines. The engraving can be performed easily as part of the laser cutting process. Different types of line were tested (see Fig. 5). Solid lines were chosen for the route from Belfast to Southampton, but a dashed line (the second option in Fig. 5) was used from Southampton onwards. This was so that users can distinguish between the different routes and directions, especially when routes are close to each other (e.g. around Southampton and Cherbourg).
The smart map’s interactive AD is delivered through a set of 10 buttons, embedded at key points on the journey. Because each button is multi-functional, it can provide a series of AD files giving increasing levels of detail, should the user request this. Normally we provide one level of additional information, which can be selected by long-pressing the button (see Sect. 6.4). Figure 6 gives an overview of the 10 buttons and shows the number of AD files available at each location (indicated by the dotted loops): normally there is one “additional information” option per button, though three buttons have only one AD file, while Queenstown has three AD files.
There are two main approaches to recording AD: using a real person or using a voice synthesizer, such as Google Cloud Text-to-Speech. In our audio recording, we used real people, three female narrators and one male, using professional recording facilities in a soundproof studio at the Sonic Arts Research Centre at Queen’s University Belfast. Background music and sound effects were then added and mixed to produce the final version of the AD.
The final stage of producing the model required simple circuit design, soldering the buttons, and connecting the wires to a Raspberry Pi, a fully customizable and programmable small computer board [58], which runs our purpose-designed Python program, the QUB Smart Map Controller, as explained in Sect. 6.4. Each button is connected to one of the GPIO (general-purpose input/output) pins of the Raspberry Pi. A Bluetooth speaker and/or headphones are used for audio output (see Fig. 7).
6 Design principles for smart maps
Our experience in designing and evaluating prototypes of the smart map has led us to propose four main principles for designing interactive maps for BPS visitors: (1) employing journey-based structured storytelling, (2) purpose-oriented design, (3) use of low-cost technologies and (4) use of multi-function buttons.
6.1 Employing journey-based structured storytelling
In the existing AD guide at Titanic Belfast, users request the required AD through a keypad device. The whole layout of Titanic Belfast is structured around a chronological story of Titanic and visitors have to follow that narrative path through the venue. This suggests that the AD, in some way, is structured according to the story of the venue and the journey through the venue. However, emotional involvement is substantially reduced for BPS visitors, as the many layers of information and stories (presented visually for sighted visitors) are largely omitted from the existing AD guide.
In contrast, our smart map AD is activated by buttons placed at key geographical locations in the journey, which become key points in the story. The locations and timeline of Titanic’s maiden voyage are used to narrate to visitors a story structured along the historical journey, but also a story that becomes a personal and emotional journey. The AD is designed to create a personalized experience by giving users some control over how they interact with and explore stories independently and at a level of detail they choose. By exploring the map and following a preferred route laid down in the map, visitors are on a guided but not fixed journey: they can follow the sequence of the route or choose a different sequence. They can select the amount of information, both historical facts and personal stories, they would like to hear. The personal stories are intended to form a relationship between visitors and the passengers on board Titanic; as the visitor’s hand moves across the map, their mind follows passengers’ real stories. BPS visitors can choose to follow some passengers’ stories over time and also pick up different stories of different passengers. Across time and space, the visitors can develop a personal connection with these passengers and try to imagine what life on board was like. They can follow the story of a passenger who got on and off the ship at Queenstown, and the visitor’s foreknowledge of the doomed vessel’s imminent sinking sharpens the emotional response.
6.2 Purpose-oriented design
In designing the smart map, design decisions must always be made in full awareness of the map’s purpose. Two examples are given in what follows.
Firstly, we applied a novel approach, namely Localized Scale Magnification (LSM), to the construction of some map features. The minimum size of map features which can be tactilely distinguished is determined by the width of a fingertip. For a map covering a large area, local points of interest can be lost if they are made to scale. For example, the ports of Belfast and Southampton are small in comparison with a map of the UK drawn to scale, but they are significant for Titanic’s journey and stories of that journey. Although they can be seen visually, they are too small for a BPS visitor to feel with a finger. Therefore, we exaggerated the scale locally, rather like placing a magnifying glass over the local point of interest, so that the significant features are large enough to be felt. The goal is to give the impression of higher resolution in the locality of significant points of interest. Also, LSM was applied to points in the journey where two close routes could easily be confused (for example, the route into, and out of, Southampton). The physical width of the channels and the separation of the routes need to be large enough for the fingertip to feel the sea and to explore how the ship navigates its way.
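By way of illustration, LSM can be sketched as a simple geometric transformation. The function below is not taken from our implementation; it is a minimal, hypothetical Python version in which points within a chosen radius of a point of interest are magnified, with the magnification blending smoothly back to the original scale at the rim so that the rest of the map is unaffected.

```python
from math import hypot

def lsm(point, poi, radius=50.0, factor=2.0):
    """Localized Scale Magnification (illustrative sketch only).

    Points close to a point of interest (poi) are pushed outward, as if
    a magnifying glass were placed over the poi. The magnification is
    `factor` at the centre and decays linearly to 1 at `radius`, so
    points at or beyond the radius are left untouched and the rest of
    the map keeps its original scale.
    """
    px, py = point
    cx, cy = poi
    dx, dy = px - cx, py - cy
    d = hypot(dx, dy)
    if d == 0 or d >= radius:
        return point
    # Blend the scale from `factor` at the centre down to 1 at the rim.
    scale = factor + (1.0 - factor) * (d / radius)
    return (cx + dx * scale, cy + dy * scale)
```

Because the blended scale is monotone for reasonable factors, nearby features keep their relative order while gaining the extra separation a fingertip needs.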
One result of LSM is that the resulting map is not strictly accurate and may look a little strange to a sighted person, who may initially expect to see a map drawn to scale. In theory, this poses a challenge to the principle of universal access (where the map should be suitable for either sighted or BPS visitors). In practice, however, we have found, informally, that using LSM does not interfere with the experience of sighted visitors. Sighted visitors happily use tourist maps which often use similar scale exaggeration around points of interest.
A second application of purpose-oriented design led us to eliminate certain geographical and topographical features. We originally included some mountain ranges and inland waterways, using 3D printing; however, when we presented these to a focus group, they confused BPS participants and distracted their attention from Titanic’s route. We also simplified and smoothed the coastlines of the countries involved, because these details are not relevant to the purpose of the map. We are not providing a lesson in geography. This is also in keeping with current research which suggests that “the blind can only distinguish a limited number of tactile cartographic symbols in one representation” [59].
Another reason to eliminate unnecessary tactile detail is to reduce the cognitive load. To a greater extent than with visual perception, BPS people’s manual exploration of tactile maps is sequential and their mental picture is built up part by part. Tactile perception can also use different body parts, such as a finger, a whole hand, or both hands associated with movement of the arms; hence, the cognitive load needed for mental integration and synthesis to build a unified representation of the object is increased [60]. Also, because of the lower capacity of tactile memory than visual memory [61], the information that BPS people can assimilate through touch is limited [62]. Therefore, every effort should be made to eliminate or reduce details which are not relevant to the purpose of the map.
Purpose-oriented design also influenced the choice of colours for the sea and the land in our map. After presenting our first choice of green and light blue to our focus group, the partially sighted members requested much greater contrast between land and sea. This is another example where the resulting map might not look so realistic to sighted visitors. It also demonstrates the need to take the needs and capabilities of partially sighted visitors into account, and not treat everyone in exactly the same way.
Finally, braille is not used on the map. Only a relatively small proportion of BPS people have experience in reading braille [63]. Braille on the map reduces the legibility and immediacy of the map and increases the complexity and difficulties in map interpretation [64, 65].
6.3 Low-cost technologies
Cost is often a decisive factor for museums in deciding how to improve the accessibility of their buildings, collections and facilities. Traditionally, tactile maps are made with braille embossers and microcapsule paper; however, “the greatest flaw of standard tactile map production is that it is both a time-consuming and costly process and that the production of a larger number of copies is difficult (poor reproducibility)” [59]. Such maps also require facilities which are rarely available outside classrooms [32].
For museums, tactile objects will inevitably suffer damage over time and therefore need to be easy to replace. Convenient, low-cost reproducibility is important, so we want as much of the model as possible to be generated from a computer description.
There are several low-cost fabrication options, such as rapid prototyping, “an ‘additive’ process, combining layers of material or powder to create a solid object”, and CNC (computer numerical control) milling, in which a machine tool selectively removes material from a solid block [66]. With the introduction of laser cutting machines, 3D printers and a wide variety of available materials, the process of making tactile maps has become low-cost, time-efficient and simpler. Moreover, open-access computer models are often available, which can be a good starting point for a smart map [59]. While laser cutting equipment can be expensive, the service is often available at reasonable cost.
3D printing has become much more common in recent years, and the decreasing purchase price of 3D printers opens more possibilities in applying 3D printing to museums. However, given the fact that a 3D printer’s build volume is relatively small (e.g., 330 × 240 × 300 mm), we did not use 3D printing for the geographical regions on the map. The regions would have had to be printed as a series of smaller regions and then assembled. Laser cutting was faster, cheaper and more easily repeatable.
6.4 Multi-function buttons
We investigated three types of widely used buttons: smart buttons, capacitive touch sensors and tactile buttons. Smart buttons are usually wifi- or bluetooth-based; examples include the Echo button, the Logitech Pop and the Flic button, which have become popular, especially in the smart home. However, smart buttons are comparatively expensive and too large for our purposes. Capacitive touch sensors are triggered by a proximity touch, but are prone to unintentional triggering [30]. Tactile buttons are simple on/off electronic switches; they are cheap, come in various sizes and heights, and respond only when pressed.
Tactile buttons proved the most appropriate for our smart map. Indeed, whatever the journey to be represented, tactile buttons can be used with the supporting software we developed. Software running on the Raspberry Pi turns them into multi-function buttons, with the duration of a button press selecting the function. Users can choose the amount of information they wish for a location: requesting information (a short press), requesting more information (a longer press), or requesting less by stopping the AD fragment. Unintended short taps, or two buttons pressed simultaneously, are ignored by the software.
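The mapping from press duration to function can be sketched as a small pure function. The thresholds below (a 50 ms debounce and a 1 s long-press boundary) are illustrative assumptions, not the values used in our controller:

```python
def classify_press(duration_s, debounce_s=0.05, long_s=1.0):
    """Map the duration of a button press to an action code.

    The thresholds are illustrative assumptions, not the controller's
    actual values. Returns None for presses too brief to be
    intentional (debounce), "S" for a short press and "L" for a long
    press requesting additional information.
    """
    if duration_s < debounce_s:
        return None  # ignore accidental taps
    if duration_s < long_s:
        return "S"
    return "L"
```

Further codes (e.g. "L2" for an even longer press) could be added by extending the same threshold ladder.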
To support the multi-function buttons and play the selected AD files, we developed a Python program, the QUB Smart Map Controller, which can service any button-based AD system. The designer supplies a text file which gives, for each button, a pin connection and a location name. The location name, plus the type of button press (S, L, L2, etc.), makes up the file name of the corresponding AD file, which is then played by the Raspberry Pi through headphones or a Bluetooth speaker. There are currently 18 AD files; more could be created and added to the system without modifying the program. The program also produces a log file recording which buttons the user pressed, and when; from this it calculates statistics on how much of each AD file was listened to.
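As a sketch of this text-file-driven design, the following hypothetical Python fragment parses a designer-supplied configuration, composes an AD file name from a location and press type, and formats a log line. The field layout, file-name pattern and log format are assumptions for illustration; the actual program may differ:

```python
import time

def parse_button_config(text):
    """Parse a designer-supplied configuration: one 'pin location'
    pair per line. The real file format is not published; this sketch
    assumes whitespace-separated fields and '#' comments."""
    buttons = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and blanks
        if not line:
            continue
        pin, location = line.split(None, 1)
        buttons[int(pin)] = location.strip()
    return buttons

def ad_filename(location, press_type):
    """Compose an AD file name from location and press type (S, L, L2...).
    The naming pattern shown is an assumption for illustration."""
    return f"{location}_{press_type}.mp3"

def log_line(location, press_type, t=None):
    """Format one log entry: timestamp, location, press type."""
    t = time.time() if t is None else t
    return f"{t:.3f}\t{location}\t{press_type}"
```

Keeping all button-to-location knowledge in a plain text file is what allows a non-expert to reconfigure the map, or build a new one, without touching the program.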
7 Evaluation
In collaboration with RNIB Northern Ireland, a group of BPS participants took part in the reception study. The participants were registered members of RNIB. So far, six (6) BPS participants have taken part in the study. The group represented different levels of sight loss, with three (3) blind and three (3) partially sighted participants; two (2) were born with sight loss and four (4) had developed sight loss later. Ages ranged from 36 to 75, with an equal gender split (3 male, 3 female). Figure 8 shows one of the BPS participants evaluating the smart map.
In each evaluation session, BPS participants were asked to explore the smart map by themselves. Three methods were employed in the evaluation process: (1) BPS participants used a tactile Self Assessment Manikin (SAM) to self-report their emotional responses at each location during their exploration of the map; (2) participants completed a questionnaire afterwards; and (3) the log file generated by the QUB Smart Map Controller program was used to record and analyse how BPS participants explored the map.
7.1 Results based on SAM
SAM, a self-managed method of recording the emotional state of a person, is a picture-oriented questionnaire to measure emotional responses [67, 68]. The SAM concept is widely used to assess emotional states. Based on an existing model [69] and inspired by Gallardo and Ulrich’s tactile version of SAM [70], we developed our own SAM to evaluate BPS participants’ emotional responses while they explored the interactive map in the reception study. We simplified the SAM graphics, eliminating components unnecessary for our purpose.
We designed and produced our own tactile SAM, using a laser cutting machine to engrave the emoticons onto a plastic board (see Fig. 9). The first row of the tactile SAM depicts the shape of a person’s mouth, ranging from a “sad” to a “happy” mouth expression. This top row represents the valence of emotion, ranging from unpleasant, negative emotion (e.g. sad) to pleasant, positive emotion (e.g. happy). Note that in our journey story sad emotions are every bit as meaningful as happy emotions. The second row of the tactile SAM represents intensity, that is, the arousal degree (or strength) of emotion. The pattern of symbols becomes larger and larger as the emotional intensity it represents increases from “not much feeling” to a “very strong feeling”.
The BPS participants were given time to become familiar with the SAM before the evaluation began. We did a short initial test before the start, asking them to use the SAM to describe their current emotional state at that time. They quickly and clearly described their emotional state. Everyone confirmed they were comfortable with using the SAM and found it clear and easy to use.
Although the emotional patterns varied among BPS participants, the trend of the SAM data shows that most BPS participants started the journey from a relatively low arousal and low pleasure emotional state (calm), developed into a high pleasure and high arousal emotional state (happy) at some point during the map exploration, and usually ended with a low pleasure and high arousal emotional state (sad). The spread of the numbers in each cell in Fig. 10 also indicates that there are differences between each BPS individual’s emotional responses, strengthening the argument that BPS visitors should not be treated as a homogenous group.
More significantly, the data shows that BPS participants experienced some kind of emotional engagement. Figure 10 shows how many times each emotional state (in terms of both valence and arousal) was self-reported by each BPS participant while exploring each stage of the smart map (a total of 60 reports). For valence, we can see that 52 (87%) self-reports are medium (3) or above (calm to happy). Since there is only one button that triggers AD content regarding the sinking, there are only 8 (13%) reports of sad emotion (columns 1 and 2). For arousal, 56 (93%) are at 3 or above. Encouragingly, 28 (47%) reports were of very high intensity of emotion (arousal 5). If to this are added the 14 reports of high intensity (arousal 4), altogether, 42 (70%) reports were of high arousal. These results strongly suggest that participants were experiencing significant emotional engagement while exploring the map.
7.2 Results based on questionnaire
The questionnaire contains 15 questions: 12 five-point Likert scale questions and 3 open questions. The aim of the questionnaire was to gather participant feedback focusing on three main categories: the interactive map, the AD, and the overall experience. Thus, for example, we were interested in the extent to which BPS participants were able to feel the ports and visualize their location, the extent to which the background sounds and music used in the AD helped BPS participants follow the journey, and the extent to which the overall experience was a positive one, for instance, whether BPS participants were emotionally touched. The questionnaire was conducted as an interview. Each BPS participant was verbally asked the questions by a volunteer in a separate room. Answers were voice recorded, with the consent of all participants, and later transcribed. The responses to the 5-point Likert scale questions are presented in Fig. 11.
In summary, BPS participants were highly satisfied with the interactive map, and enjoyed their cognitive, emotional and engaged experience in using it. The results show that BPS visitors were able to experience a real degree of emotion while they explored the interactive map. They felt in control, as they were able to explore the information they liked, and they appreciated this personalized experience. Sample quotes from BPS participants demonstrate not only their appreciation of the map but their enthusiasm for its potential:
I enjoyed doing it. Using different voice actors and background music was really positive for evoking emotions. In fact, using real life stories evoked more emotional reactions too.
A very positive move. It would be a move forward to improve accessibility for visually-impaired people but also for sighted people as it can help visualize events on one route. I have been to Titanic Belfast and I would like to go back if you put it in. Just interested in people’s reaction to it because I really like this kind of thing. I think this is good.
I think it is important there are more things like that in museums, because more often the things are behind glass. You cannot touch them. There was a model of Titanic behind us in the glass, but you don’t know if it is big or small, so it is important to have a hands-on experience and you are able to touch things.
It would be an excellent idea because it certainly helps to make you feel more inclusive. Through installing this map, it made me feel very inclusive.
7.3 Results based on the log file
In order to mitigate social bias in completing the questionnaire [71], we also kept a record of participants’ interactions with the map, such as calculating the length of time each audio file was played and the length of waiting time between button presses. This information is gathered and stored in a log file on the Raspberry Pi while BPS participants explore the map. This log file enables the gathering of quantitative data on how participants explored the AD.
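The kind of quantitative analysis involved can be sketched as follows. Assuming a simplified log of (timestamp, AD file, action) rows, the hypothetical function below totals how long each AD file was played; the real log layout is not published and will differ:

```python
def listening_times(events):
    """Total playing time per AD file from simplified log rows.

    `events` is a list of (timestamp, ad_file, action) tuples, where
    action is "start" or "stop"; this row format is an assumption.
    Starting a new file implicitly stops the one currently playing.
    """
    totals = {}
    playing = None  # (ad_file, start_time) of the clip now playing
    for t, ad_file, action in events:
        if playing is not None:
            clip, t0 = playing
            totals[clip] = totals.get(clip, 0.0) + (t - t0)
            playing = None
        if action == "start":
            playing = (ad_file, t)
    return totals
```

Comparing each total against the known length of the clip shows how much of each AD file a participant actually heard.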
Analysis of the log file data, in the form of a spreadsheet, resulted in two main observations:
(1) Almost all AD was listened to by everyone. There was one main exception (discussed in the following point) and one case where the AD was stopped during the final background music of the Atlantic (9th) button (see Fig. 6).
(2) The data revealed that some users missed the Cherbourg button completely. It turned out that, in spite of our attempts to provide tactilely distinct routes, this was because the voyage routes from Southampton to Cherbourg and from Cherbourg to Queenstown are too close to each other in the Southampton region. Because BPS participants could not see that there was a button at Cherbourg, they did not know they were missing this part of the journey. This reveals an interesting pitfall in designing a smart map for BPS users: sighted users can readily see all the buttons at a glance, but BPS users have no such overview. One attractive solution would be to update the software so that it detects that the user has skipped a part of the journey (where there is a danger of doing so by accident rather than by choice) and then prompts the user with this information. This is one of the advantages of a software-controlled smart map.
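The proposed skip-detection feature could be prototyped along the following lines. This is a hypothetical sketch, not part of the current controller: given the ordered route and the buttons pressed so far, it lists any earlier stops the user has passed over:

```python
def skipped_locations(route, pressed):
    """List earlier stops the user has passed over without pressing.

    `route` is the ordered list of button locations along the voyage;
    `pressed` is the collection of locations activated so far. A stop
    is flagged once the user has pressed any later stop on the route
    without ever visiting it. Hypothetical sketch of the proposed
    prompting feature.
    """
    pressed = set(pressed)
    furthest = -1
    for i, loc in enumerate(route):
        if loc in pressed:
            furthest = i  # furthest point reached on the route
    if furthest < 0:
        return []
    return [loc for loc in route[:furthest] if loc not in pressed]
```

Called after each button press, a non-empty result could trigger a spoken prompt such as an invitation to return to Cherbourg.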
Overall, we observed a pleasing consistency between all three evaluation approaches, although each method picked up different features.
8 Conclusions
This paper has proposed a “story to journey to smart map” (SJSM) approach to universal access. We applied this approach to create a smart map of Titanic’s maiden voyage. When evaluated with a group of BPS participants in a reception study, the approach was positively received, especially regarding visitors’ emotional engagement and independence.
Although we did not carry out a formal reception study with sighted visitors, some sighted visitors found that the smart map was very attractive to use and commented that it helped their own exploration and visualization of the journey. It proved natural for sighted users to adapt to the BPS-specific modifications almost without noticing. Although this does not demonstrate universal access experimentally, it does suggest that the SJSM approach may well have potential for engaging with a larger number of participants with a range of sensory, cognitive and physical abilities and disabilities. This is a promising area for ongoing research.
One advantage of our smart map technology is that no new software needs to be written for other button-based smart maps. The QUB Smart Map Controller software (which can be made available upon request) obtains all its information from a text file, which a non-expert can easily create from a template. Another advantage is the relatively low cost of producing our type of smart map, or replacing a broken one. Provided there is access to a laser cutting service, the actual materials need not be expensive, and the cost of the Raspberry Pi and associated electronics is eminently affordable.
While the current smart map needs further reception studies and refinement, there are several possible future developments of our approach, as briefly described below.
(1) Extending to visitors with hearing loss. To extend the universality of the SJSM approach beyond the BPS, our current smart map technology could be supplemented by delivering a parallel video-based experience based on the physical map, rather than having a separate touch-screen version of the smart map. This would share the same AD text, from which the software would automatically generate the visual text. There is scope to use avatar technology for different characters in the story; this would open up lip-reading as a further modality.
(2) Rapid and multi-lingual AD generation using Translation and Text-to-Speech technology. If desirable, to avoid the cost and delay of getting multiple human readers to create the AD commentary files, and in different languages, there is considerable potential to exploit available automatic text-to-speech generators with multiple voices. As a way of obtaining a limited multi-lingual version, some museums might find it worthwhile to explore the use of automatic translation software. Another advantage of this automatic approach is that the audio-based and video-based interfaces would operate from the same core AD text.
(3) Integrated self-assessment. Additional buttons or tactile sensors, integrated into the smart map, could replace the proposed SAM, so that users can give regular feedback on their emotional experience in a natural and convenient way. The logging software would automatically record the time (and hence the point in the journey) of such feedback.
(4) Representing more abstract “stories”. With creative application of our SJSM approach, there is potential for museums to create new smart maps which help users explore more abstract “stories”. Historical and political timelines are one type of “story” which can be explored using a journey via a smart map. As a more complex example, we plan to investigate how a large collection of personal stories and other resources related to conflict situations can be moulded into a multi-person, multi-path “journey”, representing the journey which the divided communities undertake as part of a reconciliation process. A smart map can portray two (or more) different interpretations of the same events, corresponding to alternative routes on a smart map. Careful design can ensure that visitors are led to explore both, or several, “stories” (routes), rather than imposing a particular storyline which might try to create a reductive interpretation.
References
Giannini, T., Bowen, J.P.: Museums and Digital Culture: New Perspectives and Research. Springer Series on Cultural Computing. Springer, Cham (2019). https://doi.org/10.1007/978-3-319-97457-6
Sylaiou, S., Dafiotis, P.: Storytelling in virtual museums: engaging a multitude of voices. In: Liarokapis, F., Voulodimos, A., Doulamis, N., Doulamis, A. (eds.) Visual Computing for Cultural Heritage, pp. 369–388. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-37191-3_19
General comment No. 2. Article 9: Accessibility. Convention on the Rights of Persons with Disabilities (CRPD) (2014). CRPD/C/GC/2
Wang, X., Crookes, D., Harding, S., Johnston, D.: Evaluating audio description and BPS visitor experience in Titanic Belfast. J. Audiovis. Transl. 3(2), 246–263 (2020). https://doi.org/10.47476/jat.v3i2.2020.124
Cock, M., Bretton, M., Fineman, A., France, R., Madge, C., Sharpe, M.: State of Museum Access 2018 Does your museum website welcome and inform disabled visitors? VocalEyes, Stage TEXT, and Autism in Museums (2018). https://vocaleyes.co.uk/state-of-museum-access-2018/
Greco, G.M., Jankowska, A.: Framing media accessibility quality. J. Audiovis. Transl. 2(2), 1–10 (2019)
Prentice, R.: Experiential cultural tourism: museums & the marketing of the new romanticism of evoked authenticity. Museum Manag. Curatorship 19(1), 5–26 (2001)
Kim, J.H., Ritchie, J.R.B.: Cross-cultural validation of a memorable tourism experience scale (MTES). J. Travel Res. 53(3), 323–335 (2014). https://doi.org/10.1177/0047287513496468
Hutchinson, R.S., Eardley, A.F.: Museum audio description: the problem of textual fidelity. Perspect. Stud. Transl. Theory Pract. 27(1), 42–57 (2019). https://doi.org/10.1080/0907676X.2018.1473451
Article 2. Definitions. Convention on the Rights of Persons with Disabilities (CRPD) (2014)
Jenkins, H.: Convergence Culture. New York University Press, New York (2006)
Neves, J.: Cultures of accessibility: translation making cultural heritage in museums accessible to people of all abilities. In: Harding, S., Cortés, O.C. (eds.) The Routledge Handbook of Translation and Culture, pp. 415–430. Routledge, Abingdon (2018)
Nielsen, J.K.: Museum communication and storytelling: articulating understandings within the museum structure. Museum Manag. Curatorship 32(5), 440–455 (2017). https://doi.org/10.1080/09647775.2017.1284019
Dodd, J.: Museums and the health of the community. In: Sandell, R. (ed.) Museums, Society, Inequality, pp. 182–189. Routledge, London (2002)
Gottschall, J.: The Storytelling Animal: How Stories Make us Human. Houghton Mifflin Harcourt, New York (2012)
Bal, M., Van Boheemen, C.: Narratology: Introduction to the Theory of Narrative, 3rd edn. University of Toronto Press, Toronto (2009)
Marshall, M.T., Dulake, N., Ciolfi, L., Duranti, D., Kockelkorn, H., Petrelli, D.: Using tangible smart replicas as controls for an interactive museum exhibition. In: TEI '16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction, pp. 159–167 (2016). https://doi.org/10.1145/2839462.2839493
Capurro, C., Nollet, D., Pletinckx, D.: Tangible interfaces for digital museum applications. In: Digital Heritage, Granada, Spain, pp. 271–276. IEEE (2015). https://doi.org/10.1109/digitalheritage.2015.7413881
Smith, M.M.: Sensing the Past: Seeing, Hearing, Smelling, Tasting, and Touching in History. University of California Press, Berkeley (2007)
McGinnis, R.: Islands of stimulation: perspectives on the museum experience, present and future. In: Levent, N., Leone, A.P. (eds.) The Multisensory Museum: Cross-Disciplinary Perspectives on Touch, Sound, Smell, Memory, and Space, pp. 319–329. Rowman & Littlefield, Lanham (2014)
Ghodke, U., Yusim, L., Somanath, S., Coppin, P.: The cross-sensory globe: participatory design of a 3D audio-tactile globe prototype for blind and low-vision users to learn geography. In: DIS '19: The 2019 ACM Designing Interactive Systems Conference, pp. 399–412 (2019)
Ungar, S., Blades, M., Spencer, C.: The role of tactile maps in mobility training. Br. J. Vis. Impair. 11(2), 59–61 (1993)
Götzelmann, T.: LucentMaps: 3D printed audiovisual tactile maps for blind and visually impaired people. In: The 18th International ACM SIGACCESS Conference on Computers and Accessibility, pp. 81–90 (2016). https://doi.org/10.1145/2982142.2982163
Taylor, B., Dey, A., Siewiorek, D., Smailagic, A.: Customizable 3D printed tactile maps as interactive overlays. In: ASSETS '16: The 18th International ACM SIGACCESS Conference on Computers and Accessibility (2016)
Antoniou, V., Ragia, L., Nomikou, P., Bardouli, P., Lampridou, D., Ioannou, T., Kalisperakis, I., Stentoumis, C.: Creating a story map using geographic information systems to explore geomorphology and history of Methana peninsula. ISPRS Int. J. Geo-Inf. 7(12), 484 (2018). https://doi.org/10.3390/ijgi7120484
Ilies, G., Ilies, M.: A storytelling map of the upper Mara Valley. Cartogr. Geoinf. 17(30), 16–27 (2018). https://doi.org/10.32909/kg.17.30.2
Emsley, I., Graven, T., Bird, N., Griffiths, S., Suess, J., Shaw, L.: Please touch the art: experiences in developing for the visually impaired. J. Open Res. Softw. (2019). https://doi.org/10.5334/jors.231
Mesquita, S., Carneiro, M.J.: Accessibility of European museums to visitors with visual impairments. Disabil. Soc. 31(3), 373–388 (2016). https://doi.org/10.1080/09687599.2016.1167671
Neves, J.: Multi-sensory approaches to (audio) describing the visual arts. MonTI Monografías de Traducción e Interpretación 4, 277–293 (2012). https://doi.org/10.6035/monti.2012.4.12
Giraud, S., Brock, A.M., Macé, M.J.M., Jouffrais, C.: Map learning with a 3D printed interactive small-scale model: improvement of space and text memorization in visually impaired students. Front. Psychol. (2017). https://doi.org/10.3389/fpsyg.2017.00930
Sato, M., Poupyrev, I., Harrison, C.: Touché: enhancing touch interaction on humans, screens, liquids, and everyday objects. In: CHI '12: The SIGCHI Conference on Human Factors in Computing Systems, pp. 483–492 (2012)
Ducasse, J., Brock, A.M., Jouffrais, C.: Accessible interactive maps for visually impaired users. In: Pissaloux, E., Velazquez, R. (eds.) Mobility of Visually Impaired People, pp. 537–584. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-54446-5_17
McGookin, D., Brewster, S., Jiang, W.: Investigating touchscreen accessibility for people with visual impairments. In: NordiCHI '08: The 5th Nordic Conference on Human–Computer Interaction: Building Bridges, pp. 298–307 (2008). https://doi.org/10.1145/1463160.1463193
Brock, A.M., Truillet, P., Oriola, B., Picard, D., Jouffrais, C.: Interactivity improves usability of geographic maps for visually impaired people. Hum.-Comput. Interact. 30(2), 156–194 (2015). https://doi.org/10.1080/07370024.2014.924412
Kane, S.K., Wobbrock, J.O., Ladner, R.E.: Usable gestures for blind people: understanding preference and performance. In: CHI '11: The SIGCHI Conference on Human Factors in Computing Systems, pp. 413–422 (2011). https://doi.org/10.1145/1978942.1979001
Miele, J.A., Landau, S., Gilden, D.: Talking TMAP: automated generation of audio-tactile maps using Smith–Kettlewell’s TMAP software. Br. J. Vis. Impair. 24(2), 93–100 (2006). https://doi.org/10.1177/0264619606064436
Senette, C., Buzzi, M.C., Buzzi, M., Leporini, B.: Enriching graphic maps to enable multimodal interaction by blind people. In: International Conference on Universal Access in Human–Computer Interaction, pp. 576–583. Springer, Berlin (2013)
Shams, L., Seitz, A.R.: Benefits of multisensory learning. Trends Cogn. Sci. 12(11), 411–417 (2008)
Birsh, J.R.: Multisensory Teaching of Basic Language Skills. Brookes Publishing Company, Baltimore (2011)
Campbell, M.L., Helf, S., Cooke, N.L.: Effects of adding multisensory components to a supplemental reading program on the decoding skills of treatment resisters. Educ. Treat. Child. 31(3), 267–295 (2008)
Jordan, K.E., Baker, J.: Multisensory information boosts numerical matching abilities in young children. Dev. Sci. 14(2), 205–213 (2011)
Scheffel, D.L., Shaw, J.C., Shaw, R.: The efficacy of a supplementary multisensory reading program for first-grade students. J. Read. Improv. 45(3), 139–152 (2008)
Al-Hroub, A.: Programming for mathematically gifted children with learning difficulties. Roeper Rev. 32(4), 259–271 (2010)
Axel, E.S., Levent, N.S.: Art Beyond Sight: A Resource Guide to Art, Creativity, And Visual Impairment. Art Education for the Blind, Inc & AFB Press of the American Foundation for the Blind, New York (2003)
Joshi, R.M., Dahlgren, M., Boulware-Gooden, R.: Teaching reading in an inner city school through a multisensory teaching approach. Ann. Dyslexia 52(1), 229–242 (2002)
Lacey, S., Sathian, K.: Please DO touch the exhibits! Interactions between visual imagery and haptic perception. In: Levent, N., Leone, A.P. (eds.) The Multisensory Museum: Crossdisciplinary Perspectives on Touch, Sound, Smell, Memory and Space, pp. 3–16. Rowman & Littlefield, Plymouth (2014)
Wilson, P.F., Stott, J., Warnett, J.M., Attridge, A., Smith, M.P., Williams, M.A.: Museum visitor preference for the physical properties of 3D printed replicas. J. Cult. Herit. 32, 176–185 (2018). https://doi.org/10.1016/j.culher.2018.02.002
Eardley, A.F., Mineiro, C., Neves, J., Ride, P.: Redefining access: embracing multimodality, memorability and shared experience in Museums. Curator Museum J. 59(3), 263–286 (2016)
Candlin, F.: Don’t touch! Hands off! Art, blindness and the conservation of expertise. Body Soc. 10(1), 71–90 (2004). https://doi.org/10.1177/1357034X04041761
Dudley, S.: Museum Materialities: Objects, Engagements, Interpretations. Routledge, New York (2013)
Gallace, A., Spence, C.: The science of interpersonal touch: an overview. Neurosci. Biobehav. Rev. 34(2), 246–259 (2010). https://doi.org/10.1016/j.neubiorev.2008.10.004
Reeve, J.: Prioritizing audience groups. In: Lang, C., Reeve, J., Woodllard, V. (eds.) The Responsive Museum: Working with Audiences in the Twenty-First Century, pp. 43–60. Ashgate Publishing Limited, Hampshire (2006)
Lambert, S., Sampaio, E., Mauss, Y., Scheiber, C.: Blindness and brain plasticity: contribution of mental imagery? An fMRI study. Cogn. Brain Res. 20(1), 1–11 (2004). https://doi.org/10.1016/j.cogbrainres.2003.12.012
Romero-Fresco, P., Fryer, L.: Could audio-described films benefit from audio introductions? An audience response study. J. Vis. Impair. Blind. 107(4), 287–295 (2013). https://doi.org/10.1177/0145482X1310700405
Goldreich, D., Kanics, I.M.: Tactile acuity is enhanced in blindness. J. Neurosci. 23(8), 3439–3445 (2003). https://doi.org/10.1523/JNEUROSCI.23-08-03439.2003
Teng, S., Puri, A., Whitney, D.: Ultrafine spatial acuity of blind expert human echolocators. Exp. Brain Res. 216(4), 483–488 (2012)
Remael, A., Reviers, N., Vercauteren, G.: Pictures Painted in Words: ADLAB Audio Description Guidelines. EUT Edizioni Università di Trieste, Trieste (2015)
Raspberry Pi. https://www.raspberrypi.org/. Accessed 15 Mar 2021
Rener, R.: The 3D printing of tactile maps for persons with visual impairment. In: International Conference on Universal Access in Human–Computer Interaction. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-58703-5
Hatwell, Y., Streri, A., Gentaz, E.: Touching for Knowing: Cognitive Psychology of Haptic Manual Perception. John Benjamins Publishing, Amsterdam (2003)
Yoshida, T., Yamaguchi, A., Tsutsui, H., Wake, T.: Tactile search for change has less memory than visual search for change. Atten. Percept. Psychophys. 77(4), 1200–1211 (2015). https://doi.org/10.3758/s13414-014-0829-6
Magee, L.E., Kennedy, J.M.: Exploring pictures tactually. Nature 283(5744), 287–287 (1980)
Brock, A.M.: Interactive maps for visually impaired people: design, usability and spatial cognition (2013)
Hinton, R.A.L.: Tactile and audio-tactile images as vehicles for learning. In: Non-Visual Human–Computer-Interactions—Prospects for the Visually Handicapped, pp. 169–180 (1993)
Tatham, A.F.: The design of tactile maps: theoretical and practical considerations. In: The International Cartographic Association: Mapping the Nations, pp. 157–166 (1991)
Sportun, S.: The future landscape of 3D in museums. In: Levent, N., Leone, A.P. (eds.) The Multisensory Museum: Cross-Disciplinary Perspectives on Touch, Sound, Smell, Memory, and Space, pp. 331–341. Rowman & Littlefield, Plymouth (2014)
Bynion, T.-M., Feldner, M.T.: Self-Assessment Manikin. In: Encyclopedia of Personality and Individual Differences. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-28099-8
Bradley, M.M., Lang, P.J.: Measuring emotion: the self-assessment manikin and the semantic differential. J. Behav. Ther. Exp. Psychiatry 25(1), 49–59 (1994)
Lang, P.J., Greenwald, M.K., Bradley, M.M., Hamm, A.O.: Looking at pictures: affective, facial, visceral, and behavioral reactions. Psychophysiology 30(3), 261–273 (1993)
Iturregui-Gallardo, G., Méndez-Ulrich, J.L.: Towards the creation of a tactile version of the Self-Assessment Manikin (T-SAM) for the emotional assessment of visually impaired people. Int. J. Disabil. Dev. Educ. 67, 1–18 (2019). https://doi.org/10.1080/1034912X.2019.1626007
van de Mortel, T.F.: Faking it: social desirability response bias in self-report research. Aust. J. Adv. Nurs. 25(4), 40–48 (2008)
Acknowledgements
We would like to acknowledge the support and input from Laura Smithson, Siobhan McCartney, Catriona Armour and Samantha Thompson at Titanic Belfast, and from Olive Rodgers and the volunteer participants at the Royal National Institute of Blind People (RNIB). We also want to thank Craig Jackson from the Sonic Arts Research Centre at Queen's University Belfast; the volunteer AD narrators, Sarah Devlin, Jack Steers, Chrissy Marie Charleman, Sonia Katic, and Rui Sun; and Gerry Rafferty from the School of Electronics, Electrical Engineering and Computer Science at Queen's University Belfast for their support and assistance.
Funding
This research project has received funding from the European Union's Horizon 2020 research and innovation programme under Marie Skłodowska-Curie Grant Agreement No. 754507.
Ethics declarations
Conflict of interest
On behalf of all authors, the corresponding author states that there is no conflict of interest.
Code availability (software application or custom code)
The QUB Smart Map Controller software can be made available upon request.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Wang, X., Crookes, D., Harding, SA. et al. Stories, journeys and smart maps: an approach to universal access. Univ Access Inf Soc 21, 419–435 (2022). https://doi.org/10.1007/s10209-021-00832-0