Abstract
Autonomous driving promises higher traffic safety, helps address climate-related issues through energy-saving mobility, and offers more comfort for drivers. To ensure reliable and safe autonomous traffic, and to provide efficient and time-critical mobility services, data exchange between road users and systems is essential. In public perception, however, sharing data and information may pose a challenge due to perceived privacy restrictions. In this paper, we address user perceptions of, and acceptance towards, data and information distribution in autonomous driving. In a multi-step empirical procedure, qualitative (focus groups, guided interviews) and quantitative approaches (questionnaire study) were combined. The findings reveal that autonomous driving is commonly seen as a highly useful and appreciated technology. However, individual risk perceptions and potential drawbacks are manifold, described mainly in terms of data security and privacy-related issues. The findings contribute to research in human-automation interaction, technical development, and public communication strategies.
1 Introduction
Autonomous driving has already been partly implemented in many countries and will enhance daily mobility in cities in the near future (Grush and Niles 2018). Applications range from commonly used advanced driver assistance systems, such as adaptive cruise control, through prototypes of autonomous buses in public transport, to test tracks on which the possibilities of the innovative technology are explored in real-life environments (Haas et al. 2020; Portouli et al. 2017; Reid 2019). The feasibility of such applications includes intelligent transportation systems to coordinate traffic (pedestrians, vehicles, road infrastructure, etc.) (Alam et al. 2016). Based on information and communication technologies, mesh networks enable multiple connected entities to exchange data and interact (Arena and Pau 2019). To classify levels of automation, different systems have been established, such as the standard of the National Highway Traffic Safety Administration (NHTSA), the SAE standard, or the standard of the German Federal Highway Research Institute (BASt). For urban mobility, the prospects are to avoid traffic jams and reduce accidents, thus improving road safety, travel efficiency, and environmental impacts (Jiménez et al. 2016). Accordingly, scientific attention is considerable and integrates different perspectives: technical vehicle development (Alam et al. 2016), computer science (Dartmann et al. 2019), and social science (Brell et al. 2019d; 2019c).
For the public, it is difficult to weigh the real benefits that autonomous vehicle technology can bring against its potential negative consequences. The evaluation of the positive and negative consequences of autonomous mobility might depend on various conditions: the prevailing knowledge about autonomous driving, the utility of autonomous driving in different usage contexts, the alternatives to using autonomous vehicles, the availability of autonomous vehicles, and the public understanding of personal, technical, and data-law aspects associated with the novel mobility. For the overall success of autonomous driving and its seamless implementation in societies, public perceptions are a major cornerstone that should be incorporated early in the implementation process to foster a participatory procedure in line with public opinions (Hess 2018; Biegelbauer and Hansen 2011).
In order to integrate the user perspective not only into the technological development but also into public communication and information strategies at an early stage, it is central to understand which advantages and disadvantages citizens associate with autonomous driving. It is particularly important to assess the extent to which reported fears can be attributed to the novelty of the technology (due to lacking user experience), to a perceived loss of control to artificial intelligence, or to data and privacy-related issues.
In this paper, we provide deeper insights into the user perspective on autonomous driving. Special regard is given to data distribution as a key to effective vehicular communication and reliable autonomous mobility. Previous studies have shown that the handling of personal information and privacy-related issues is relevant to acceptance in this context (Brell et al. 2019a; Garidis et al. 2020; Walter and Abendroth 1011). Building on these findings, our study contributes to further conclusions, e.g., regarding the storage and sharing of sensitive user data, perceived data risks and chances, to derive reasoned recommendations for practice.
2 Acceptance of autonomous driving
Acceptance and the willingness of people to use new technologies, to accept them willingly, and to deal with the consequences of innovation are central to societies and their well-being, but of course it is also a question of policy and governance in innovation management. In the following, we detail the theoretical base of technology acceptance and its importance to integrate acceptance perspectives in technology development at an early stage (see Section 2.1). Also, the perception of privacy (see Section 2.2) and trust (see Section 2.3) is specifically addressed as both concepts seem to play a cardinal role in the public perception of autonomous driving and the willingness to use autonomous mobility.
2.1 Technology acceptance and innovation management
The question of whether, and if so under which conditions, people accept technological innovations has received attention in research and development since the 1980s. The historically most influential model—the Technology Acceptance Model (TAM) and its successors—focused on information and communication technologies in the office context. The intended use of such technologies is predominantly influenced by two key factors, the ease of use and the perceived usefulness of the technology. The TAM was subsequently extended in more differentiated versions (Venkatesh and Davis 2000; Venkatesh and Bala 2008; Venkatesh et al. 2003; Venkatesh et al. 2012): for example, demographic variables and other user characteristics as well as conditions of use (e.g., the voluntary nature of use) were added as predictive factors for technology acceptance. Another approach to theoretically model persons’ willingness to accept and use technical devices is offered by technology diffusion theories (Rogers 1995). Accordingly, users show diverse adoption reactions to innovations, from “early adopters”, i.e., persons who are much more willing to adopt an innovation, to “laggards”, i.e., users who refuse the adoption of the innovation as long as possible (Rogers 1995).
In this context, risk perceptions have been identified to impact the societal acceptance of large-scale technologies (Burger 2012; Huijts et al. 2012; Gupta et al. 2012). Public concerns or even protests against a novel technology occur as the public responds to unknown but imputed risks of a novel technology even though technology might also deliver benefits to society (Gunter and Harris 1998; Horst 2005). Thus, perceptions of risk refer to persons’ subjective evaluations of the probability of harm through technology and possible consequences of negative events (Sjöberg et al. 2004). Risk perceptions are impacted by different cultural and social values and also by individual knowledge and personal attitudes (Zaunbrecher et al. 2018; Arning et al. 2018; Linzenich et al. 2019). Recent research indicates that people weigh up the perceived risks and benefits for the decision to adopt a technology (Linzenich et al. 2016).
Recently, a large empirical study with more than 1700 adults in the USA (Ward et al. 2017) examined the risk and benefit perceptions in the context of automated vehicles. The findings corroborated that trust, risk, and benefit perceptions are related to acceptance of automated vehicles. Demographic factors, such as generation, age, and gender, also influenced the knowledge and the reported trust towards acceptance of and willingness to use automated vehicles (Ward et al. 2017; Hulse et al. 2018; Hohenberger et al. 2016).
In addition, risk assessments in autonomous driving technology were found to be influenced by prior experience with technology in general and by experience with driver assistance systems (Footnote 1) (Brell et al. 2019d; 2019c).
With increasing knowledge of and experience with automated driving systems, risk perceptions towards autonomous driving decreased and acceptance of the technology increased (Brell et al. 2019d; 2019c; Ward et al. 2017). Apparently, familiarity with the handling of advanced speed regulation systems increased trust in the reliability and safety of the system, and the discomfort towards the autonomous (uncontrollable) nature of the system decreased. Independently of experience, however, the most critical factors for broad acceptance are users’ attitudes towards invasions of privacy and distrust in transparent data handling (Brell et al. 2019d; 2019c).
2.2 Perceptions of data distribution, data handling, and privacy
The enormous advantages of intelligent vehicle technology—with respect to safety, the control of traffic jams and congestion, the conservation of fossil fuels, and the reduction of noise levels in cities—can only be exploited if data from the vehicles and their routes is connected to infrastructure and other road users. This allows the planning and management of networked, individual, adaptive, and overall efficient traffic routes within and across cities. On the one hand, the utilization of data is associated with these enormous social and societal benefits; on the other hand, it also has significant disadvantages in the context of data protection and privacy (Gantz and Reinsel 2012; Dritsas et al. 2006).
The protection of privacy and the careful handling of data represent the most sensitive part of the roll-out process and the critical point with respect to public perception and acceptance of autonomous mobility. The development of an appropriate privacy policy for citizens, their willingness to tolerate broad data collection, and the tolerance of (technical) surveillance (Tene and Polonetsky 2012; Karabey 2012) are vital in this context. Characteristically, trade-offs need to be negotiated on different levels and in different situations: for example, the trade-off between keeping personal privacy on the one hand and providing open infrastructures and open data on the other. Questions of responsibility in data handling and use are of vital importance as well, as is the critical issue of data ownership, which needs to be carefully determined in order to provide adaptive and individually tailored services (Ziefle et al. 2016; Ziefle et al. 2019).
In line with the increasing digitization of societies and the area-wide use of electronic devices, “information privacy” concerns the disclosure of personal information to third parties (Finn et al. 2013; Smith et al. 2011). It is important to note that the perception of privacy risks is not identical with the factual technical risks. Rather, users follow an affective or experience-based understanding of data or information sensitivity (Schomakers et al. 2019a; Ziefle et al. 2016). This potential mismatch between the perceived and the technical sensitivity of data provides a rich base for misconceptions. As a consequence, careless user behaviors on the one hand and exaggerated concerns on the other may arise. When it comes to the question of whether consumers want to share their data, the immediate benefits of novel services might outweigh concerns about what could happen with the data. This is referred to as the privacy calculus (Dinev and Hart 2006).
Not only minor technical knowledge levels but also over-trust in having control of one’s data might be responsible for observed privacy behaviors (Schomakers et al. 2019b; 2020). Recent research showed that the majority of users are quite sensitive in the context of data exchange and privacy issues, especially when the data is used by third parties without public transparency (Lidynia et al. 2017; Valdez and Ziefle 2019). Another critical issue for users concerns how long data may be stored and which authority is responsible for the storage. The longer the data storage and the more data is stored on servers beyond the control of the users (e.g., central servers of companies or the traffic management), the lower the willingness to share data, independently of the type of data (Schmidt et al. 2015a). Concerns are also higher the more personal the information is and the higher the probability of being identifiable (Valdez and Ziefle 2019; Ziefle et al. 2016). However, there is also empirical evidence that people are differently vulnerable to those concerns (Schmidt et al. 2015a; Schomakers et al. 2018; Schomakers et al. 2019b). All in all, however, there is a widely prevailing public distrust which seems to have two sides: one is an unspecific distrust in authorities with regard to a careful, protective, and diligent handling of data; the other is a concern towards invasions of privacy through the long-term archiving of data.
2.3 Trust—the hidden player
Whenever humans get in touch with automated systems, trust is a key to successful interaction, but it is also sensitive to uncertainties that may lead to users’ distrust and the rejection of technology (Hoff and Bashir 2015; Parasuraman and Riley 1997). However, the impact of human (dis)trust on acceptance decisions may not always be immediately apparent, but possibly hidden behind other narratives and experiences, carried, and influenced by other parameters (Siegrist 2019). For instance, trust has been identified as driving the perceived reliability and reliance on automation which are decisive for the evaluation and use behavior (Dzindolet et al. 2003; Lee and See 2004). Trust may also be expressed in terms of individual expectations or concerns determining the trade-off between perceived risks and opportunities, e.g., with regard to data exchange in autonomous driving (Schmidt et al. 2015a). To understand the dynamics of trust and acceptance in human-automation interaction, relevant predictors need to be accurately identified and carefully considered both in isolation and interaction.
According to Janssen et al. (2019), using automated systems increasingly involves time-sensitive or safety-critical settings, embodied and situated systems (i.e., subsets of automated systems), and non-professional users, which all applies to self-driving cars. Autonomous driving offers not only great usage potentials but also perceived risks from the user perspective (Kaur and Rampersad 2018; Schmidt et al. 2015b). Therefore, a research focus on trust in automation is required. Kaur and Rampersad (2018) investigated key factors influencing the adoption of driverless cars and identified performance expectations and perceived reliability as relevant determinants, pointing out the relevance of empirical studies and the inclusion of user needs in the technical development with special emphasis on trust issues. Further research has shown the influence of trust on the acceptance of autonomous mobility (Choi and Ji 2015), revealing trust as predicting the interest in using and the willingness to purchase a self-driving car (Ward et al. 2017). Consequently, studies on supporting trust in automation in this usage context are numerous (e.g., Häuslschmid et al. 2017; Koo et al. 2015; Waytz et al. 2014).
Common to many studies is that trust is directly addressed and measured through self-reporting (using, e.g., Likert scales), for instance with regard to immediate responses in experimental settings (Sheng et al. 2019) or scenario-based evaluations (Brell et al. 2019b). As there are indications that trust in the performance of a particular automated system is influenced not only by explicit but also by implicit attitudes (e.g., towards automation in general), of which users are often not aware (Merritt et al. 2013), it is of great interest to what extent trust plays a role as a hidden player in the evaluation of autonomous mobility.
Therefore, in this survey, we explored ways in which trust is (indirectly) expressed and perceived in relation to other acceptance-relevant factors, such as data security and privacy.
3 Empirical research design
The research aim was to better understand the perspective of future users on autonomous driving in terms of acceptance and the probability of use rejection. We set a specific focus to the perception of data security and privacy when using autonomous vehicles.
Yet, as autonomous driving is only partly entering real-life experience, capturing users’ mental models of autonomous driving, their perceptions, and their understanding of using this technology at this point in time offers valuable input for scientific evaluation in human-automation interaction, technical development, and public communication strategies.
3.1 Research scenario and questions addressed
The survey was conducted in Germany in 2019 in German. For the classification of automation levels, we referred to the standard of the German Federal Highway Research Institute (BASt). The standard defines the driving tasks of the driver according to automation level, from “driver only” (no automation, level 0) to “fully automated” (level 4) (BASt 2018). Following level 4 of the BASt standard, our research scenario referred to driving features that are capable of driving the vehicle themselves (i.e., performing driving tasks autonomously), allowing the human driver to pursue other activities while driving. An introduction to the topic, including the scenario and aim of research, was presented to the participants in advance (Footnote 2).
We addressed the following research questions with focus on data distribution and privacy:
-
RQ1: What types of expectations, fears, and risks do potential users face in autonomous driving?
-
RQ2: What barriers and benefits do they consider?
-
RQ3: What influences the user’s perception and evaluation of data use?
-
RQ4: How are these factors related to the intention to use autonomous vehicles?
3.2 Mixed methods approach
As users’ perceptions and technology acceptance may vary depending on the sample, context, and approach (i.e., there is an interaction between method and research object (Wilkowska et al. 2015)), mixed methods offer a reliable approach, also with regard to complex research questions (Lund 2012). Hence, we followed a multi-tiered process to develop and validate relevant assumptions using qualitative and quantitative methods. This way, methodological advantages were combined to compensate for potential shortcomings and to achieve in-depth insights.
Figure 1 shows our empirical research design. The overall approach was exploratory and structure-discovering. First, we conducted focus groups and guided interviews for a deep understanding of the multi-faceted and diverse user perspective on autonomous driving with special regard to the handling of personal data and travel information (see Section 4). Key findings (i.e., novel, often, or commonly mentioned attitudes, needs, and demands) were then operationalized, transferred into survey items, quantitatively assessed, and related to each other in two consecutive online questionnaires focusing on risk perceptions, benefits and barriers of use, and user requirements towards data exchange to gain valid conclusions in this context, also regarding the intention to use autonomous driving (see Section 5).
For the social science perspective, the iterative, consecutive implementation of qualitative and quantitative methods has already proven its value and effectiveness, but has often been limited to the application of a two-step approach (e.g., Brell et al. 2019c). The novel combination of already validated methods in an empirical four-step approach, in which the applied qualitative and quantitative surveys carefully build on and complement each other in the best possible way, is therefore the key to our research approach.
3.3 Data acquisition
In order to capture an unbiased view of the topic, ad hoc participants were addressed. We recruited volunteer participants for the interviews and focus group discussions from our personal environment, taking diverse social settings and characteristics into account. Online links to the questionnaires were distributed through social networks (e.g., on Facebook in personal feeds and groups), instant messaging, and email. Participants covered a broad age range and came from all parts of Germany. Participants took part voluntarily and were not compensated for their efforts, but participated out of interest in the topic. Before the participants started the survey, the interviews, and the focus groups, they were informed that it was central for us to understand their free opinions and perspectives on autonomous driving, as well as the opinions prevailing in the public. We stressed that there are no “wrong” answers and encouraged them to report their personal views spontaneously and honestly. Participants were also informed that their participation was completely voluntary. In line with ethical research standards, we ensured a high level of privacy protection in handling the participants’ data and assured them that none of their answers can be traced back to them as individuals (Footnote 3).
In order to ensure an overall understanding of the material provided, three independent, randomly selected pre-testers (28–36 years of age, no technical experts) checked the materials used for the empirical studies. We asked them to carefully review the information texts as well as the questionnaire items regarding (a) understandability (complicated or ambiguous wording, grammar, and orthographic issues), (b) length and perceived burden when filling in the questionnaires, and (c) bias and objectivity in introducing the topic to participants (presenting the topic in a neutral manner).
Information and content presented below to illustrate the methods and results were translated from German.
4 Understanding the users’ narratives on autonomous mobility
The use of qualitative methods in empirical research is key to addressing the individual perspectives of particular stakeholders on a specific topic and thus to providing insights into the many facets of social reality. Group discussions and interviews have proven reliable for exploring as yet unknown aspects of subjective perception, knowledge, experience, and attitude. To develop a broad and deep understanding of the users’ perspective on autonomous driving, we used an integrative qualitative research approach including focus groups (N = 14) and guided interviews (N = 7) (see Fig. 1).
The following sections describe our qualitative research approach (development and implementation) (see Section 4.1), the participants (see Section 4.2), the obtained results (see Section 4.3), and lessons learned for follow-up research (see Section 4.4).
4.1 Qualitative research approach
In focus groups, the participants exchanged personal ideas and attitudes regarding autonomous driving in a joint and lively discussion under guidance. Main topics addressed user expectations (e.g., What would be different when driving in an autonomous car?), perceived risks (e.g., Who should take the responsibility for driving?), and data privacy (e.g., Are you willing to share passenger information?) (see Section 4.3.1).
Face-to-face interviews (Footnote 4) provided deeper insights into individual perceptions of sensitive issues related to data and information distribution in autonomous driving. Here, the focus was on collecting and sharing data (e.g., Which data may (not) be stored? Who should (not) have access to your data?) as well as data security (e.g., What steps should be taken to ensure data protection in autonomous driving?) (see Section 4.3.2).
All participants provided socio-demographics (age, gender, education) and data on their mobility behavior (driver’s license, experience with driving assistance systems).
Each session lasted between 60 and 90 min. The dialogues were audio-recorded, transcribed verbatim, and analyzed by qualitative content analysis (Mayring 2015), which is particularly useful for processing large quantities of material (Mayring and Fenzl 2019). First, analysis units (coding, context, and evaluation units) are determined to systematically reduce the text material (i.e., the transcripts) to essential meanings, which are then categorized; the aim is to develop a category system that includes all relevant aspects of analysis (i.e., categories) (Mayring 2015). In the present survey, the definition of categories was primarily inductive, i.e., based on the text material, but was deductively supplemented by theoretical considerations (based on the survey guidelines).
4.2 Participants
In total, 21 participants took part in the qualitative survey, thereof n = 11 men (52.4%) and n = 10 women (47.6%). Age ranged between 16 and 67 years (M = 40.6, SD = 16.3). Education was comparatively high with n = 9 (42.9%) university graduates, n = 7 (33.3%) high school graduates, and n = 5 (23.8%) participants holding a secondary school certificate (cf. Statistisches Bundesamt (Destatis) 2020).
The majority of participants had a driver’s license (n = 20, 95.2%). Besides, the participants indicated a regular (daily to weekly) car use, whereas previous experiences with driving assistance systems (e.g., automatic parking, lane keeping assistant, and adaptive cruise control) varied.
4.3 Results
First, user expectations and risk perceptions are described with special regard to data privacy (see Section 4.3.1). Then, data-related factors relevant to mobility acceptance are outlined (see Section 4.3.2).
4.3.1 Expectations and risk perceptions
In general, the participants showed a high interest and openness towards autonomous driving. Individual perceptions and evaluations were influenced by trade-offs: The participants considered diverse expectations in detail to carefully balance between perceived disadvantages and advantages of use.
Expected advantages
In particular, not only enhanced comfort by assigning driving tasks to the autonomous vehicle but also the possibility of pursuing other activities while driving and time saving were perceived positively. Besides, increased safety, e.g., with regard to faster reaction times in critical traffic situations was appreciated.
Feared disadvantages
The participants also expressed concerns about losing their driving experience when using autonomous vehicles, often associated with a negative feeling of technology dependency. Liability risks were frequently discussed, revealing uncertainties concerning who will be legally responsible for driving, particularly in the event of damage (e.g., the human on board or the manufacturer).
Data privacy
To clarify liability and investigate accidents (including the question of fault), the participants showed a high, dedicated willingness to provide and share relevant data, for example by using a black box for journey recording. Apart from that, the distribution of personal and travel information was viewed critically due to perceived privacy restrictions. Fears regarding data theft and misuse (e.g., through hacker attacks) became apparent.
4.3.2 Data use in autonomous driving
In the following, we report the user-centered evaluation of data and information distribution in autonomous driving, in which data collection, data sharing, and data security were considered as key criteria to acceptance.
Data collection
The participants expressed considerable information needs concerning the purpose and duration of storing personal data and showed high control requirements, particularly regarding the amount of data. Besides, they strongly demanded to decide which data is collected. Concerning the type of data, some of the participants would only provide information about their destination and route, while others could also imagine having vital signs measured in the vehicle for health prevention (e.g., to facilitate communication with rescue services in an emergency).
Data sharing
Details about the data addressee were required as a condition for information exchange. Whereas sharing data with the vehicle and road infrastructure (e.g., traffic lights) was considered necessary and therefore accepted, distributing information to the manufacturer or public authorities was rather rejected out of concern for data misuse. There were tendencies that the acceptance of data distribution varied with the mobility service and was greater for car sharing (e.g., for user identification) than for private vehicles.
Data security
Concerns about data protection were repeatedly reported as an acceptance barrier. To ensure data privacy and increase the willingness to use autonomous vehicles, the participants suggested regular external security checks, also with regard to necessary software updates, for example.
4.4 Lessons learned for follow-up research
Autonomous driving is seen as a highly useful and appreciated technology envisioned for the future. Individual expectations and concerns are expressed in terms of usage benefits and barriers with special emphasis on perceived challenges in data exchange, especially in the context of potential data misuse and hacking. The following lessons learned served as a basis for follow-up studies to quantify and validate the obtained research findings (see Section 5):
-
Expectations positively relate to improved user experience and road safety.
-
Individual risk perceptions and potential drawbacks are manifold, mainly described in terms of data security and privacy-related issues.
-
The willingness to share data strongly depends on the individually perceived usefulness and necessity (e.g., smooth and safe travel).
-
Perceived data challenges relate to the handling of personal information: Transparency and the possibility to decide on the distribution of personal data seem to be a key to acceptance.
5 Measuring user attitudes and data requirements
Quantitative methods in empirical research serve to measure knowledge, opinions, and attitudes towards selected indicators in large samples. To validate the previously obtained research findings (see Section 4), we conducted a consecutive quantitative survey including two online questionnaires (see Fig. 1). The aim was to deepen our understanding of underlying concepts and relationships regarding data-related acceptance factors in autonomous driving.
The following sections describe our quantitative research approach (development and implementation) (see Section 5.1), the participants (see Section 5.2), and the obtained results (see Section 5.3).
5.1 Quantitative research approach
We requested personal information on the participants’ socio-demography (age, gender, education, income) and mobility behavior (driver’s license, experience with driving assistance systems) to identify sample characteristics. Instructions relevant for answering all questions were presented in easy-to-understand text descriptions.
In order to validate the items of the questionnaire (Footnote 5), we calculated Cronbach’s alpha (α), revealing scale consistency with α > .7, which can be interpreted as good reliability (Field 2009). Answers to the scales were given voluntarily.
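For readers unfamiliar with this reliability measure: Cronbach’s alpha relates the sum of the item variances to the variance of the respondents’ sum scores, α = k/(k−1) · (1 − Σs²ᵢ/s²ₜ). The following Python sketch illustrates the calculation; the response matrix is hypothetical example data on a 6-point Likert scale, not data from this study.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the sum scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 6-point Likert responses (5 respondents, 4 items)
responses = np.array([
    [5, 5, 6, 5],
    [2, 3, 2, 2],
    [4, 4, 5, 4],
    [6, 5, 6, 6],
    [3, 3, 3, 2],
])
alpha = cronbach_alpha(responses)  # consistent answer patterns yield a high alpha
```

Because the illustrative respondents answer the four items consistently, the resulting alpha is well above the .7 threshold used in the study.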
The first questionnaire (N = 183) addressed attitudes towards using autonomous vehicles (see Section 5.3.1). We measured perceived risks (4 items, α = .864), usage benefits (7 items, α = .882), and barriers (7 items, α = .723) identified as central before (see Section 4.3.1) on 6-point Likert scales (min = 1 full disagreement, max = 6 full agreement). Special focus was on comfort and safety as well as cyber-security. The participants were also asked whether they could imagine using an autonomous vehicle (yes/no/undecided) (Davis et al. 1989). Table 4 lists the items used in this study.
The second questionnaire (N = 100) deepened insights into users’ data requirements in autonomous driving (see Section 5.3.2). Based on pre-study results (see Section 4.3.2), we addressed preferences for data storage location (5 items, multiple choice), attitudes towards the use of health data (5 items, α = .895), and data privacy and security (5 items, α = .740). Special focus was on the in-vehicular collection of vital signs and data protection strategies. For comparative values, we also asked about general attitudes towards data use, such as sharing personal information in everyday life (5 items, α = .781).
Also, the intention to use autonomous vehicles was evaluated (3 items, α = .837) (Davis et al. 1989).
Likert items were assessed on 6-point scales (min = 1 full disagreement, max = 6 full agreement). Table 5 lists the items used in this study.
5.2 Participants
In total, 283 people participated in the quantitative survey, of whom N = 183 took part in study I and N = 100 in study II. Sample characteristics are compared in Table 1.
On average, the participants in study I (age range 20–90) were older than those in study II (age range 19–68). Gender and education distributions were similar: Overall, more men than women took part, and educational levels were comparatively high, with predominant proportions of university graduates (cf. Statistisches Bundesamt (Destatis) 2020). The monthly net household income was higher in study I, which may be explained by the sample’s higher average age and related life situation.
The overall proportion of driving license holders was high. Regarding the use of driver assistance systems in cars (e.g., lane keeping assistant, automatic parking, cruise control), the majority was experienced, especially in study II.
5.3 Results
First, attitudes towards using autonomous driving are described (see Section 5.3.1, N = 183). Then, insights into users’ data requirements are provided (see Section 5.3.2, N = 100).
For data analysis, we used descriptive and inferential statistics. The level of significance (α) was set at 5%.
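As an illustration of the inferential step, the Pearson correlations reported in the following sections can be sketched as below. The rating vectors are hypothetical examples, not our survey data; significance testing against the 5% level is omitted here for brevity.

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two rating vectors."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / sqrt(var_x * var_y)

# Hypothetical example: perceived risk vs. use intention, seven participants
risk      = [5, 4, 6, 2, 3, 5, 1]
intention = [2, 3, 1, 5, 4, 3, 6]
print(round(pearson_r(risk, intention), 3))  # strongly negative, close to -1
```

A negative r, as in this toy example, corresponds to the pattern reported below: the higher the perceived risks, the lower the intention to use autonomous vehicles.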
5.3.1 Attitudes towards using autonomous driving
In general, attitudes towards using autonomous driving were rather positive. Nearly half of the participants (47%, n = 86) could imagine using a fully automated car, 39.9% (n = 73) were undecided, and 13.1% (n = 24) refused. Figure 2 shows evaluations of perceived usage benefits and barriers (min = 1, max = 6).
Considering perceived benefits, fewer traffic jams and improved traffic flow (M = 4.9; SD = 1.4), more comfort by letting the vehicle take over driving tasks (M = 4.8; SD = 1.5), low accident risks (M = 4.6; SD = 1.5), and time savings (M = 4.5; SD = 1.5) were considered usage advantages. These also included lower fuel consumption (M = 4.4; SD = 1.6) and the expectation of improved insurance conditions (lower risk category) (M = 3.9; SD = 1.7). Privileges, such as free parking or using the bus lane, were not expected to be a benefit of use (M = 3.2; SD = 1.8).
Considering perceived barriers, legal issues such as liability in the event of damage were seen as particularly challenging (M = 5.1; SD = 1.3), followed by technical risks (e.g., errors) (M = 4.7; SD = 1.5) and insufficient data security (M = 4.7; SD = 1.5). Besides, not only ethical issues, e.g., responsibility for decisions in accident situations (M = 4.4; SD = 1.7), but also economic challenges, such as payments for infrastructure and road transport investments (M = 4.2; SD = 1.5), were identified as potential barriers to use. In contrast, incapacitation (i.e., limited self-determination) (M = 3.6; SD = 1.7) and health risks (e.g., through electrosmog) (M = 2.7; SD = 1.7) were considered less critical.
The evaluation of perceived risks (min = 1, max = 6) revealed feelings of distrust towards the innovative technology and indicated high information and education needs among the public. The participants considered that there were still many issues on autonomous driving to be clarified technically and legally in public (M = 4.8; SD = 1.3) and expressed concerns regarding technical reliability (M = 4.0; SD = 1.7). In addition, they worried about cyber-criminals who could gain control of the vehicle (M = 4.0; SD = 1.7). Accordingly, hacker attacks were perceived as a deterrent to use (M = 3.8; SD = 1.7).
Correlation analyses showed that the intention to use autonomous vehicles was related to perceived risks (r = .456, p < .001) and usage barriers (r = .406, p < .001): The stronger the agreement with perceived risks and usage barriers, the more likely participants were to decide against the use of autonomous vehicles. In detail, relations with technical unreliability (r = .471, p < .001) and health risks (r = .423, p < .001) were found (see Table 2). Usage benefits correlated weakly with the use intention (r = −.238, p < .01).
Considering user factors, gender correlated with use intention (r = .264, p < .001), perceived risks (r = .288, p < .001), and usage barriers (r = .318, p < .001), indicating that men were more willing to drive in an autonomous vehicle and had lower concerns than women, particularly regarding economic (r = .200, p < .01), legal (r = .200, p < .01), and health (r = .322, p < .01) issues. Age showed no significant correlations.
To better understand these relationships, we considered usage requirements with focus on data security and privacy in more detail in order to provide validated indications concerning future users’ willingness to drive in an autonomous vehicle (see Section 5.3.2).
5.3.2 Data use(r): distribution, safety, and privacy needs
The willingness to use autonomous vehicles was high (min = 1, max = 6): The participants indicated that they would like to experience autonomous vehicles (M = 4.7; SD = 1.4) and could imagine using them regularly in the future (M = 4.1; SD = 1.4). Less agreement was reached on the idea that there should be only autonomous vehicle transport in the future (M = 3.0; SD = 1.5).
To better understand user requirements for data distribution in mobility contexts, we took a general look at personal opinions on using and sharing data in daily life (see Fig. 3): Most participants generally cared about what happens with their data (M = 4.8; SD = 1.2) and which data are being stored (M = 4.7; SD = 1.2), indicating high control needs. It was therefore not surprising that sharing personal information was viewed rather critically (M = 2.8; SD = 1.3), especially data distribution to third parties (M = 2.2; SD = 1.3). Concerns about user profiles were also indicated (M = 3.8; SD = 1.2).
Correlation analysis revealed that attitudes towards data use in general were related to attitudes towards data privacy and security in autonomous driving (r = .492, p < .001): The stronger the general agreement on data privacy and control, the stronger it was also with regard to autonomous driving. Here (see Fig. 4), the participants considered regular security checks by an independent company important, regarding both the vehicle software (M = 5.3; SD = 0.9) and the service provider (M = 5.4; SD = 0.9). The participants were also less likely to assume well-developed data protection concepts on the part of manufacturers and service providers (M = 2.8; SD = 1.3), indicating distrust towards individual stakeholders. Besides, privacy concerns about data access by third parties (M = 4.9; SD = 1.1) and data hacking (M = 4.6; SD = 1.2) were expressed.
Opinions varied particularly concerning the collection and distribution of sensitive information (i.e., health data) (see Fig. 5). The participants tended to reject the recording of vital signs (e.g., blink rate or pulse) for safety reasons (M = 3.4; SD = 1.6) as well as driving adjustments according to vital signs (M = 3.1; SD = 1.6). However, the recording of health data for emergency situations was approved (M = 3.7; SD = 1.7), as was distributing vital signs to rescue services (M = 4.5; SD = 1.5). The participants also agreed that autonomous vehicles should be aware of disabilities (e.g., blindness) to adapt user interfaces to individual needs (M = 4.1; SD = 1.7).
Considering preferences for the data storage location of personal information, the majority (60%, n = 60) chose the country of residence, followed by 21% (n = 21) who agreed to their data being stored in the country of travel. A few participants (6%, n = 6) selected any country in the EU as data storage location, and 13% (n = 13) stated that the location did not matter.
Correlation analyses revealed that the intention to use autonomous driving was greater the lower the perceived risks to data privacy and security were (r = −.311, p < .01) and the more open participants were to sharing sensitive health data (r = .370, p < .001). Perceived risks to data privacy and security in terms of data access by unknown parties (r = −.326, p < .01) and hacker attacks (r = −.320, p < .01) were negatively related to the use intention, whereas particularly the willingness to provide health data for emergencies (r = .350, p < .001) and to share them with rescue services (r = .375, p < .001) showed positive correlations in this context (see Table 3).
Considering user factors, age was related to the use intention of autonomous vehicles (r = −.316, p < .001), which was greater the younger the participants were. Besides, age correlated with attitudes towards health data use (r = −.299, p < .01), indicating that younger participants tended to be more open about recording and using health data. Age was also related to perceived risks to data privacy and security (r = .214, p < .05), with older participants being more concerned about safety deficiencies in autonomous driving, particularly regarding unauthorized data access (r = .240, p < .05). In addition, older participants were more skeptical that vehicle manufacturers or service providers would do enough to protect the vehicles from external attacks (r = −.239, p < .05). Gender showed no significant correlations.
6 Discussion of results
Regarding the mixed-methods approach of this survey, the iterative use of qualitative and quantitative methods allowed an intense exploration of user perspectives, perceived expectations, and challenges of data distribution in autonomous driving. As user perceptions, feelings, and requirements towards innovative technology are very individual, research requires high sensitivity, especially with regard to privacy and trust. This was realized through focus groups and interviews in which relevant factors were identified, individually addressed, and consolidated for appropriate measurement. Subsequent quantification provided validated results on user evaluations and showed significant correlations, particularly between perceived (data protection) risks, usage barriers, and the willingness to use autonomous mobility. Also, significant correlations for age and gender were found, which, however, varied depending on the study: This may be due not only to different sample sizes and characteristics but also to the items used, and thus needs to be reconsidered in future work.
The results allow innovation management to properly address user requirements and compensate for potential usage barriers early in technical development. Key findings may be used to develop transparent information and communication strategies for municipalities and cities. These communication concepts may serve two different goals: One is to increase public knowledge and awareness of future urban mobility in order to empower citizens to make informed decisions on whether, to what extent, and under which (data) usage conditions they would support and use autonomous vehicles. The other is to inform technical designers and communication professionals about the public’s viewpoint and to develop an understanding that public concerns need to be taken seriously and met with care.
In the following, we discuss specific aspects which are essential to understand the broad acceptance of autonomous vehicles and the handling and perspectives on data collection and distribution in future mobility.
6.1 Perceived risks and (dis)advantages
Risk perceptions and expectations were discussed in terms of perceived disadvantages and advantages of use, with data privacy as a strongly considered challenge in information distribution. Measurements of usage benefits and barriers confirmed pre-study results and previous research (Ward et al. 2017; Schmidt et al. 2015b): Not only increases in comfort, road safety, and travel efficiency (regarding time saved by the ability to do other things while driving) but also lower traffic load and environmental pollution presented salient advantages of use. Next to legal and technical risks, data security was perceived as a barrier to use. Fears of cyber-criminality in terms of data hacking and misuse were frequently mentioned.
The evaluation of risk perceptions indicated trust issues. The participants expressed doubts not only about the vehicle technology but also towards individual stakeholders, such as manufacturers and service providers responsible for data protection. Since interpersonal trust has been identified as relevant for trust in automation (Hoff and Bashir 2015), we suggest communication concepts to establish contacts between future users, responsible companies, organizations, and policymakers for greater exchange, mutual understanding, and trusting relationships. It is also advisable to promote first-hand user experience of the innovative technology in demonstrations or trials, as experience may positively affect trust perceptions (Gold et al. 2015) and the evaluation of technology (Brell et al. 2019c). The same applies to users’ understanding of technology (Koo et al. 2015): As the participants agreed that there are still many unresolved issues on autonomous mobility, our results demonstrate the urgency of early, user-centered information and education initiatives to increase the visibility of technical progress and improve technology know-how, in particular among inexperienced users, to foster trust.
6.2 Data use evaluation
Issues related to data collection, data sharing, and data security were relevant to the evaluation of data use in autonomous driving. The willingness to provide data for specific purposes, such as accident investigation and emergency prevention, was high if deemed necessary. In contrast to Schmidt et al. (2015a) and Valdez and Ziefle (2019), health data use met with a positive reaction. It seemed as if monitoring was perceived as reasonable to compensate for perceived barriers to use (e.g., liability and health risks). Follow-up research should focus on usage situations and conditions in which the distribution of data is preferred and accepted, also with regard to diverse user groups (older people, children, etc.).
Data control needs and information requests became apparent. Especially in situations involving unknown people, the participants perceived privacy restrictions. Again, the fear of unauthorized intruders gaining access to passenger and vehicle data was predominant. To reduce data concerns, we suggest third-party inspections to ensure not only data protection but also vehicle safety and stakeholder reliability, as related uncertainties were repeatedly reported as a barrier to use. Certifications to visualize security standards and clarify regulations may also improve feelings of privacy and trust. It is up to subsequent work to explore how this may attract the interest of service providers and users, which information needs to be addressed, and how it could be visually designed. Until then, comprehensible information guidelines on data handling are just as necessary as the involvement of users in deciding which data are collected and shared.
6.3 Usage intention
Despite perceived risks of data distribution, the reported willingness to use a fully automated vehicle was high, confirming previous findings (e.g., Panagiotopoulos and Dimitrakopoulos 2018; König and Neumayr 2017). Presumably, expected advantages may increase users’ interest and curiosity to experience the new technology. However, usage concerns (especially about hacker attacks) were negatively related to the intention to use.
As a preliminary conclusion, the removal of risks and barriers (e.g., in terms of reliable data protection strategies) may be more decisive for the decision to adopt or reject autonomous mobility than incentives. However, this assumption needs to be addressed in follow-up studies in which participants have to decide which potential barrier or benefit weighs more strongly for them in which usage situation (e.g., conjoint analysis). Such decision simulations would allow us to understand the trade-offs between pro-usage and contra-usage motivations and to identify so-called no-go situations in which the public would not be willing to use autonomous vehicles under any circumstances.
In this context, it should be noted that the methodology used includes evaluations that are not based on real experience with automated vehicles. Rather, participants envision whether they would be willing to use automated mobility and, if so, under which circumstances. Of course, one could critically argue that the reliability of laypeople’s evaluations is low due to the missing experience with automated driving. However, from a social science point of view, even evaluations of laypersons without hands-on experience might be an especially valuable source of information for all institutions and persons involved in the development and implementation process of automated driving: technical planners, persons responsible in communal policy, communication professionals, the teaching and education sector, and industry. Public perceptions represent the current status of technical knowledge (which can be increased by appropriate information designs) and the prevailing affect heuristics (Slovic et al. 2005; Keller et al. 2006) in terms of trust and the emotional evaluation of technology innovations in general and automated vehicles in particular. Public perceptions can be used early in the evaluation process to steer technical decisions, to develop information and communication strategies, and to inform and consult policy and governance (Offermann-van Heek et al. 2020).
7 Limitations and future works
As the study aim was to explore data risk perceptions and expectations of future users on a broad basis, we have not yet considered and compared the needs of diverse user groups. Since other studies on the perception and acceptance of autonomous mobility indicated the importance of user diversity in this context (Brell et al. 2019a; Brell et al. 2019c), the consideration of individual user perspectives in relation to this survey’s key findings needs to be addressed in subsequent studies. User factors to be examined could include not only gender, age and technology generation, health status, education, and technology know-how (especially as the participants in this survey were comparatively highly educated and often experienced in using advanced driver assistance systems) but also preferred user roles when driving (e.g., driver vs. passenger) and the resulting requirements.
Another limitation regards our empirical methodology. We combined qualitative and quantitative procedures to capture both argumentation narratives and the quantification of user perspectives and their expectations towards benefits and challenges of autonomous driving. Still, we need to consider that our research methods provide only “anecdotal” evidence of acceptance, as the limited sample sizes do not allow a deeper applied insight into the interaction of human users with autonomous vehicles. Future studies could address this limitation in two ways: One is to cross-validate the acceptance of users who already have some experience with automated vehicles. Thereby, it could be determined whether the envisioned expectations towards benefits and challenges of autonomous driving are modulated by increasing experience in handling autonomous vehicles. The second way of cross-validating is to replicate the studies country-wide in order to understand acceptance patterns in a representative sample.
Besides, country-comparative studies should be conducted, as perceptions of urban mobility and the implementation of future mobility concepts may vary depending on culture, social shifts, and trends (Fraedrich and Lenz 2014; Theoto and Kaminski 2019).
Finally, in this study, we predominantly focused on security-related aspects, that is, expectations and risk perceptions with special regard to data privacy and data-related factors relevant to mobility acceptance. We did not include other positive effects of autonomous driving, such as environmental benefits. Future studies should also address the environmental benefits of autonomous driving (Liu et al. 2019a; Nègre and Delhomme 2017) in order to obtain a comprehensive picture of the public perception of autonomous mobility.
Two more essential aspects that need to be addressed in future work are the role of experience and individual knowledge in acceptance, as well as the role of information given to the public in the roll-out process. What we know so far is that users’ experience with automated vehicle functions as well as drivers’ knowledge about vehicle automation influences public acceptance of automated cars: Experienced persons (relying on theoretical and/or practical hands-on knowledge) tend to be more open to vehicle innovations in general and automated driving in particular (Brell et al. 2019c; Ward et al. 2017). Why experienced persons show higher acceptance levels, however, might be due to different reasons: On the one hand, users might factually know more (about benefits and risks), which allows them to evaluate autonomous driving realistically; on the other, users might feel better informed about potential factual and perceived risks, which, as a consequence, increases trust towards automated vehicle technology (Zaunbrecher et al. 2018; Petersen et al. 2018; Distler et al. 2018). Thus, both cognitive and affective factors influence public acceptance (Zaunbrecher et al. 2018; Liu et al. 2019b; Graf and Sonnberger 2020). The question, however, is how a transparent and diligent information policy could help future users to realistically evaluate not only the enormous potential of autonomous vehicle technology but also the risks and uncertainties that come with it. Future research should therefore examine different information formats, media, and contents that increase the ability of future users to deal adequately with vehicle innovations in a transparent way and allow them to draw informed and diligent decisions, thereby forming a solid and sustainable public understanding and acceptance.
Notes
Experience with driver assistance systems was operationalized as familiarity with using advanced speed regulation systems with SAE automation level 2 (SAE 2016).
See the Appendix: Introduction to the topic
See the Appendix: Privacy policy
Note that the interviews were carried out in 2019, before the Corona pandemic reached Germany; therefore, we had the chance to meet participants face to face.
References
Alam M, Ferreira J, Fonseca J (2016) Introduction to intelligent transportation systems. In: Alam M, Ferreira J, Fonseca J (eds) Intelligent transportation systems. Studies in Systems, Decision and Control, vol 52. Springer, Cham, pp 1–17
Arena F, Pau G (2019) An overview of vehicular communications. Future Internet 11(2):27. https://doi.org/10.3390/fi11020027
Arning K, van Heek J, Ziefle M (2018) Acceptance profiles for a carbon-derived foam mattress. Exploring and segmenting consumer perceptions of a carbon capture and utilization product. J Clean Prod 188:171–184. https://doi.org/10.1016/j.jclepro.2018.03.256
BASt (2018) BASt Fahraufgaben der Fahrer nach Automatisierungsgrad [Driving tasks of the drivers according to degree of automation]
Biegelbauer P, Hansen J (2011) Democratic theory and citizen participation: democracy models in the evaluation of public participation in science and technology. Sci Public Policy 38(8):589–597. https://doi.org/10.3152/030234211X13092649606404
Brell T, Biermann H, Philipsen R, Ziefle M (2019a) Conditional privacy: users’ perception of data privacy in autonomous driving. In: Proceedings of the 5th international conference on vehicle technology and intelligent transport systems (VEHITS 2019). SCITEPRESS – Science and Technology Publications, Lda., pp 352–359
Brell T, Biermann H, Philipsen R, Ziefle M (2019b) Trust in autonomous technologies. A contextual comparison of influencing user factors. In: Moallem A (ed) LNCS, vol 11594. Springer Nature, Switzerland, pp 371–384. https://doi.org/10.1007/978-3-030-22351-9
Brell T, Philipsen R, Ziefle M (2019c) sCARy! risk perceptions in autonomous driving: the influence of experience on perceived benefits and barriers. Risk Analysis 39(2):342–357. https://doi.org/10.1111/risa.13190
Brell T, Philipsen R, Ziefle M (2019d) Suspicious minds? – Users’ perceptions of autonomous and connected driving. Theoretical Issues in Ergonomics Science 20(3):301–331. https://doi.org/10.1080/1463922X.2018.1485985
Burger J (2012) Rating of worry about energy sources with respect to public health, environmental health, and workers. Journal of Risk Research 15(9):1159–1169. https://doi.org/10.1080/13669877.2012.705316
Choi JK, Ji YG (2015) Investigating the importance of trust on adopting an autonomous vehicle. International Journal of Human-Computer Interaction 31(10):692–702. https://doi.org/10.1080/10447318.2015.1070549
Dartmann G, Song H, Schmeink A (2019) Big data analytics for cyber-physical systems: machine learning for the internet of things. Elsevier
Davis FD, Bagozzi RP, Warshaw PR (1989) User acceptance of computer technology: a comparison of two theoretical models. Manag Sci 35(8):982–1003. https://doi.org/10.1287/mnsc.35.8.982
Dinev T, Hart P (2006) An extended privacy calculus model for e-commerce transactions. Information Systems Research 17(1):61–80. https://doi.org/10.1287/isre.1060.0080
Distler V, Lallemand C, Bellet T (2018) Acceptability and acceptance of autonomous mobility on demand: the impact of an immersive experience. In: Proceedings of the 2018 CHI conference on human factors in computing systems, pp 1–10
Dritsas S, Gritzalis D, Lambrinoudakis C (2006) Protecting privacy and anonymity in pervasive computing: trends and perspectives. Telematics and Informatics 23(3):196–210. https://doi.org/10.1016/j.tele.2005.07.005
Dzindolet MT, Peterson SA, Pomranky RA, Pierce LG, Beck HP (2003) The role of trust in automation reliance. International Journal of Human-Computer Studies 58(6):697–718
Field A (2009) Discovering statistics using SPSS, 3rd edn. Sage Publications Ltd, London
Finn RL, Wright D, Friedewald M (2013) Seven types of privacy. In: European data protection: coming of age. Springer, Dordrecht, pp 3–32
Fraedrich E, Lenz B (2014) Automated driving: individual and societal aspects. Transp Res Rec 2416(1):64–72. https://doi.org/10.3141/2416-08
Gantz J, Reinsel D (2012) The digital universe in 2020: big data, bigger digital shadows, and biggest growth in the far east. IDC iView: IDC Analyze the future 2007(2012):1–16
Garidis K, Ulbricht L, Rossmann A, Schmäh M (2020) Toward a user acceptance model of autonomous driving. In: Proceedings of the 53rd Hawaii international conference on system sciences, vol 3, pp 1381–1390
Gold C, Körber M, Hohenberger C, Lechner D, Bengler K (2015) Trust in automation – before and after the experience of take-over scenarios in a highly automated vehicle. Procedia Manufacturing 3:3025–3032. https://doi.org/10.1016/j.promfg.2015.07.847
Graf A, Sonnberger M (2020) Responsibility, rationality, and acceptance: how future users of autonomous driving are constructed in stakeholders’ sociotechnical imaginaries. Public Underst Sci 29(1):61–75
Grush B, Niles J (2018) The end of driving: transportation systems and public policy planning for autonomous vehicles. Elsevier
Gunter VJ, Harris CK (1998) Noisy winter: the DDT controversy in the years before Silent Spring. Rural Sociol 63(2):179–198. https://doi.org/10.1111/j.1549-0831.1998.tb00670.x
Gupta N, Fischer AR, Frewer LJ (2012) Socio-psychological determinants of public acceptance of technologies: a review. Public Underst Sci 21(7):782–795. https://doi.org/10.1177/0963662510392485
Haas R, Bhattacharjee S, Möller D (2020) Advanced driver assistance systems. In: Akhilesh K, Möller D (eds) Smart technologies. Springer, Singapore, pp 345–371
Hess DJ (2018) Social movements and energy democracy: types and processes of mobilization. Frontiers in Energy Research 6:135. https://doi.org/10.3389/fenrg.2018.00135
Hoff KA, Bashir M (2015) Trust in automation: integrating empirical evidence on factors that influence trust. Hum Factors 57(3):407–434. https://doi.org/10.1177/0018720814547570
Hohenberger C, Spörrle M, Welpe IM (2016) How and why do men and women differ in their willingness to use automated cars? the influence of emotions across different age groups. Transportation Research Part A: Policy and Practice 94:374–385. https://doi.org/10.1016/j.tra.2016.09.022
Horst M (2005) Cloning sensations: mass mediated articulation of social responses to controversial biotechnology. Public Underst Sci 14(2):185–200. https://doi.org/10.1177/0963662505050994
Huijts NM, Molin EJ, Steg L (2012) Psychological factors influencing sustainable energy technology acceptance: a review-based comprehensive framework. Renew Sust Energ Rev 16(1):525–531. https://doi.org/10.1016/j.rser.2011.08.018
Hulse LM, Xie H, Galea ER (2018) Perceptions of autonomous vehicles: relationships with road users, risk, gender and age. Saf Sci 102:1–13. https://doi.org/10.1016/j.ssci.2017.10.001
Häuslschmid R, Buelow MV, Pfleging B, Butz A (2017) Supporting trust in autonomous driving. In: Proceedings of the 22nd international conference on intelligent user interfaces, pp 319–329
Janssen CP, Donker SF, Brumby DP, Kun AL (2019) History and future of human-automation interaction. International Journal of Human-Computer Studies 131:99–107. https://doi.org/10.1016/j.ijhcs.2019.05.006
Jiménez F, Naranjo JE, Anaya JJ, García F, Ponz A, Armingol JM (2016) Advanced driver assistance system for road environments to improve safety and efficiency. Transportation Research Procedia 14:2245–2254. https://doi.org/10.1016/j.trpro.2016.05.240
Karabey B (2012) Big data and privacy issues. In: International symposium on information management in a changing world. Springer, pp 3–3
Kaur K, Rampersad G (2018) Trust in driverless cars: investigating key factors influencing the adoption of driverless cars. J Eng Technol Manag 48(April):87–96. https://doi.org/10.1016/j.jengtecman.2018.04.006
Keller C, Siegrist M, Gutscher H (2006) The role of the affect and availability heuristics in risk communication. Risk Analysis 26(3):631–639. https://doi.org/10.1111/j.1539-6924.2006.00773.x
König M, Neumayr L (2017) Users’ resistance towards radical innovations: the case of the self-driving car. Transp Res F 44:42–52. https://doi.org/10.1016/j.trf.2016.10.013
Koo J, Kwac J, Ju W, Steinert M, Leifer L, Nass C (2015) Why did my car just do that? Explaining semi-autonomous driving actions to improve driver understanding, trust, and performance. Int J Interact Des Manuf 9(4):269–275. https://doi.org/10.1007/s12008-014-0227-2
Lee J, See K (2004) Trust in automation: designing for appropriate reliance. Human Factors 46(1):50–80. https://doi.org/10.1518/hfes.46.1.50_30392
Lidynia C, Philipsen R, Ziefle M (2017) Droning on about drones—acceptance of and perceived barriers to drones in civil usage contexts. In: Savage-Knepshield P, Chen J (eds) Advances in human factors in robots and unmanned systems. Springer, Berlin, pp 317–329
Linzenich A, Zaunbrecher BS, Ziefle M (2016) Examining risk profiles for wind turbines, electricity pylons and mobile phone masts. An empirical study. In: Proceedings of the 11th conference on sustainable development of energy, water and environment systems–SDEWES
Linzenich A, Arning K, Offermann-van Heek J, Ziefle M (2019) Uncovering attitudes towards carbon capture storage and utilization technologies in germany: insights into affective-cognitive evaluations of benefits and risks. Energy Research & Social Science 48:205–218. https://doi.org/10.1016/j.erss.2018.09.017
Liu P, Ma Y, Zuo Y (2019a) Self-driving vehicles: are people willing to trade risks for environmental benefits? Transportation Research Part A: Policy and Practice 125:139–149
Liu P, Xu Z, Zhao X (2019b) Road tests of self-driving vehicles: affective and cognitive pathways in acceptance formation. Transportation Research Part A: Policy and Practice 124:354–369
Lund T (2012) Combining qualitative and quantitative approaches: some arguments for mixed methods research. Scand J Educ Res 56(2):155–165. https://doi.org/10.1080/00313831.2011.568674
Mayring P (2015) Qualitative Inhaltsanalyse. Grundlagen und Techniken [Qualitative Content Analysis Basics and Techniques], 12th edn. Beltz, Weinheim
Mayring P, Fenzl T (2019) Qualitative Inhaltsanalyse [Qualitative content analysis]. In: Handbuch methoden der empirischen sozialforschung. Springer VS, Wiesbaden, pp 633–648
Merritt SM, Heimbaugh H, LaChapell J, Lee D (2013) I trust it, but i don’t know why: effects of implicit attitudes toward automation on trust in an automated system. Hum Factors 55(3):520–534. https://doi.org/10.1177/0018720812465081
Nègre J, Delhomme P (2017) Drivers’ self-perceptions about being an eco-driver according to their concern for the environment, beliefs on eco-driving, and driving behavior. Transportation Research Part A: Policy and Practice 105:95–105
Offermann-van Heek J, Arning K, Sternberg A, Bardow A, Ziefle M (2020) Assessing public acceptance of the life cycle of co2-based fuels: does information make the difference? Energy Policy 143:111586. https://doi.org/10.1016/j.enpol.2020.111586
Panagiotopoulos I, Dimitrakopoulos G (2018) An empirical investigation on consumers’ intentions towards autonomous driving. Transportation Research Part C: Emerging Technologies 95:773–784. https://doi.org/10.1016/j.trc.2018.08.013
Parasuraman R, Riley V (1997) Humans and automation: use, misuse, disuse, abuse. Hum Factors 39(2):230–253. https://doi.org/10.1518/001872097778543886
Petersen L, Zhao H, Tilbury D, Yang XJ, Robert LP Jr (2018) The influence of risk on driver’s trust in semi-autonomous driving. In: NDIA ground vehicle systems engineering and technology symposium modeling & simulation, testing and validation (MSTV), pp 1–10
Portouli E, Karaseitanidis G, Lytrivis P, Amditis A, Raptis O, Karaberi C (2017) Public attitudes towards autonomous mini buses operating in real conditions in a Hellenic City. In: IEEE intelligent vehicles symposium, vol IV, pp 571–576
Reid R (2019) Teaching cars to drive. Civil Engineering Magazine Archive 89(3):52–74. https://doi.org/10.1061/ciegag.0001366
Rogers E (1995) Diffusion of innovations: modifications of a model for telecommunications. In: Stoetzer M, Mahler A (eds) Die Diffusion von Innovationen in der Telekommunikation [The diffusion of innovations in telecommunications]. Springer, Berlin, pp 25–38
SAE (2016) Taxonomy and definitions for terms related to driving automation systems for on-road motor vehicles j3016. Tech. rep., SAE International
Schmidt T, Philipsen R, Ziefle M (2015a) From V2X to control2trust: why trust and control are major attributes in Vehicle2X Technologies. In: Tryfonas T, Askoxylakis I (eds) HAS 2015, LNCS 9190, Springer, pp 570–581
Schmidt T, Philipsen R, Ziefle M (2015b) Safety first? V2X – Perceived benefits, barriers and trade-offs of automated driving. In: Proceedings of the 1st international conference on vehicle technology and intelligent transport systems (VEHITS-2015), SCITEPRESS, pp 39–46
Schomakers EM, Lidynia C, Vervier L, Ziefle M (2018) Of guardians, cynics, and pragmatists: a typology of privacy concerns and behavior. In: IoTBDS, pp 153–163
Schomakers EM, Lidynia C, Müllmann D, Ziefle M (2019a) Internet users’ perceptions of information sensitivity–insights from germany. Int J Inf Manag 46:142–150. https://doi.org/10.1016/j.ijinfomgt.2018.11.018
Schomakers EM, Lidynia C, Ziefle M (2019b) A typology of online privacy personalities. Journal of Grid Computing 17(4):727–747. https://doi.org/10.1007/s10723-019-09500-3
Schomakers EM, Lidynia C, Ziefle M (2020) All of me? Users’ preferences for privacy-preserving data markets and the importance of anonymity. Electron Mark, pp 1–17. https://doi.org/10.1007/s12525-020-00404-9
Sheng S, Pakdamanian E, Han K, Kim BG, Tiwari P, Kim I, Feng L (2019) A case study of trust on autonomous driving. In: 2019 IEEE intelligent transportation systems conference, ITSC, pp 4368–4373
Siegrist M (2019) Trust and risk perception: a critical review of the literature. Risk Analysis. https://doi.org/10.1111/risa.13325
Sjöberg L, Moen B, Rundmo T (2004) Explaining risk perception. An evaluation of the psychometric paradigm in risk perception research. Rotunde Publikasjoner 10(2):665–612
Slovic P, Peters E, Finucane ML, MacGregor DG (2005) Affect, risk, and decision making. Health Psychology 24(4, Suppl):S35–S40. https://doi.org/10.1037/0278-6133.24.4.S35
Smith HJ, Dinev T, Xu H (2011) Information privacy research: an interdisciplinary review. MIS Quarterly 35(4):989–1016
Statistisches Bundesamt (Destatis) (2020) Bildungsstand der Bevölkerung - Ergebnisse des Mikrozensus 2018 [Educational level of the population - results of the 2018 microcensus]. https://www.destatis.de/DE/Themen/Gesellschaft-Umwelt/Bildung-Forschung-Kultur/Bildungsstand/Publikationen/Downloads-Bildungsstand/bildungsstand-bevoelkerung-5210002187004.pdf?_blob=publicationFile
Tene O, Polonetsky J (2012) Big data for all: privacy and user control in the age of analytics. Nw J Tech & Intell Prop 11(5):xxvii
Theoto TN, Kaminski PC (2019) A country-specific evaluation on the feasibility of autonomous vehicles. Product Management & Development 17(2):123–133. https://doi.org/10.4322/pmd.2019.013
Valdez AC, Ziefle M (2019) The users’ perspective on the privacy-utility trade-offs in health recommender systems. International Journal of Human-Computer Studies 121:108–121. https://doi.org/10.1016/j.ijhcs.2018.04.003
Venkatesh V, Bala H (2008) Technology acceptance model 3 and a research agenda on interventions. Decis Sci 39(2):273–315. https://doi.org/10.1111/j.1540-5915.2008.00192.x
Venkatesh V, Davis FD (2000) A theoretical extension of the technology acceptance model: four longitudinal field studies. Management Science 46(2):186–204. https://doi.org/10.1287/mnsc.46.2.186.11926
Venkatesh V, Morris MG, Davis GB, Davis FD (2003) User acceptance of information technology: toward a unified view. MIS Q 27(3):425–478. https://doi.org/10.2307/30036540
Venkatesh V, Thong J, Xu X (2012) Consumer acceptance and use of information technology: extending the unified theory of acceptance and use of technology. MIS Q 36(1):157–178. https://doi.org/10.2307/41410412
Walter J, Abendroth B (2020) On the role of informational privacy in connected vehicles: a privacy-aware acceptance modelling approach for connected vehicular services. Telematics Inform 49:101361. https://doi.org/10.1016/j.tele.2020.101361
Ward C, Raue M, Lee C, D’Ambrosio L, Coughlin JF (2017) Acceptance of automated driving across generations: the role of risk and benefit perception, knowledge, and trust. In: Kurosu M (ed) Human-Computer Interaction. User Interface Design, Development and Multimodality. HCI 2017. Lecture Notes in Computer Science, vol 10271. Springer, Cham, pp 254–266
Waytz A, Heafner J, Epley N (2014) The mind in the machine: anthropomorphism increases trust in an autonomous vehicle. J Exp Soc Psychol 52:113–117. https://doi.org/10.1016/j.jesp.2014.01.005
Wilkowska W, Ziefle M, Himmel S (2015) Perceptions of personal privacy in smart home technologies: do user assessments vary depending on the research method?. In: Tryfonas T, Askoxylakis I (eds) HAS 2015, LNCS 9190. Springer International Publishing, Switzerland, pp 592–603
Zaunbrecher BS, Kluge J, Ziefle M (2018) Exploring mental models of geothermal energy among laypeople in Germany as hidden drivers for acceptance. Journal of Sustainable Development of Energy, Water and Environment Systems 6(3):446–463. https://doi.org/10.13044/j.sdewes.d5.0192
Ziefle M, Halbey J, Kowalewski S (2016) Users’ willingness to share data on the internet: perceived benefits and caveats. In: Proceedings of the international conference on internet of things and big data (IoTBD 2016), SCITEPRESS, pp 255–265
Ziefle M, Brell T, Philipsen R, Offermann-van Heek J, Arning K (2019) Privacy issues in smart cities: insights into citizens’ perspectives toward safe mobility in urban environments. In: Big data analytics for cyber-physical systems, Elsevier, pp 275–292
Acknowledgments
The authors would like to thank all participants for their patience and openness in sharing opinions on innovative mobility concepts in terms of autonomous driving solutions. Special thanks are given to Florian Groh and Adam Robert Michalik for research assistance and to Dr. Johanna Kluge, Dr. Simon Himmel, and Julian Hildebrandt for valuable research advice.
Funding
Open Access funding enabled and organized by Projekt DEAL. This work has been funded by the Federal Ministry of Transport and Digital Infrastructure (BMVI) within the funding guideline “Automated and Networked Driving” under the project APEROL with the funding code 16AVF2134C.
Author information
Authors and Affiliations
Corresponding author
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Ethics approval
We did not seek approval from the ethics committee, as our study falls into the category for which no such approval is required in Germany. This category covers all non-invasive, non-clinical research on human subjects in which participants are transparently informed about the purpose, aim, and risks of the study and these risks are reasonably low. Prior to participating, participants were informed that it was of high importance to us to understand free opinions and attitudes on mobility behavior from the drivers’ perspective and that we would be very happy if they shared their opinions with us. We nevertheless stressed that their participation was completely voluntary. The participants were not reimbursed for taking part in the study. Furthermore, we ensured a high standard of privacy protection and let the participants know that none of their answers could be traced back to them as persons. Demographic data were also provided voluntarily, and all participants were informed that, on request, their personal data would be deleted from our encrypted hard drives. After these careful explanations, participants reported feeling well informed about the purpose and aim of the study and about their freedom to quit participation at any time. Regarding the privacy policy explanations, the participants reported that they understood the high standards applied and deliberately accepted participation. Participant privacy is a key value that our university has committed itself to uphold. From the comments in the open question fields at the end of the survey, we learned that participants were interested in the topic and keen to see the results, which we assured them they would receive.
Consent to participate
Informed consent was obtained from all individual participants included in the study.
Consent for publication
Informed consent was obtained from all individual participants included in the study.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendix. Introductions and items used in questionnaire studies
Privacy policy
“Information obtained in this survey will be completely anonymized and treated confidentially by the research team (protection of data privacy). Conclusions about your person are not possible. Anonymized data will be used exclusively for scientific purposes in publications and presentations. With your permission, we will record the discussion. The recordings will be used for the exact documentation of the information we want to gather and will only be used within scientific environments.”
Introduction to the topic
“In this survey, we want to gain insights on the topic of autonomous driving. Please imagine that on-road situations (e.g., turning, parking, overtaking) are performed autonomously, i.e., by fully automated vehicle features, allowing the human driver to pursue other activities while driving (e.g., reading, relaxing).
Our aim is to contribute to the understanding of expectations and factors related to data distribution, data privacy, and security in autonomous driving that are of particular interest to future users. Your contribution helps to shape future mobility according to individual needs and ideas.
No prior knowledge is needed. If there are response options that do not exactly match your situation or opinion, please provide an answer that comes closest to it. We are interested in your personal impression, perception, and evaluation. Thank you for your participation and for taking your time.”
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Biermann, H., Philipsen, R., Brell, T. et al. Rolling in the deep. Hum.-Intell. Syst. Integr. 1, 53–70 (2019). https://doi.org/10.1007/s42454-020-00015-x