
1 Introduction

Advancing digitalization makes the internet ever more pervasive in the daily lives of individuals. In this context, individuals increasingly share sensitive data and use software to facilitate their everyday life. This has implications for both privacy and security in the realm of information technology. Deutsche Telekom (Europe’s largest telecommunications company) reported 46 million attacks on its honeypots in 2019 [1], an increase of 12 million compared to 2018. In addition, the Federal Criminal Police Office (Bundeskriminalamt) reported around 87,000 incidents of cybercrime, with a particularly growing focus on mobile malware and an associated financial loss of around 60 million euros [2].

Apart from such illegal activities, which reveal the need for enhanced security, advancing digitalization also fuels the interest of private companies and state institutions in collecting ever more private data about individuals. Companies are mainly interested in better understanding their customers in order to offer individualized products and enable personalized advertising. State actors, on the other hand, are expanding their surveillance activities in cyberspace to prevent or solve crimes, potentially affecting the interests of individuals who value their privacy. Negative consequences of the increased collection of private data could be observed in the case of Cambridge Analytica, where data was analyzed and misused for political purposes and thus used in a completely different context than originally intended by the users [3].

In this digital environment, individuals should therefore have an interest in maintaining their privacy and security through appropriate protective behavior. In line with this, a representative study by the German Federal Office for Information Security (Bundesamt für Sicherheit in der Informationstechnik, BSI) in 2017 showed that 97% of German internet users consider security to be very important [4]. However, only a third of those surveyed actively inform themselves about security. Further studies have confirmed that there is a growing security awareness among private individuals, especially with regard to the widespread use of smartphones [5, 6]. Similarly, users usually value their privacy highly but often do not act accordingly, a phenomenon also known as the privacy paradox [7, 8]. Thus, there is a general concern to support users in both their privacy and security needs. Privacy and security behavior have a common basis, as both deal with threats in a digital world. By avoiding public Wi-Fi hotspots, for example, one can avoid both security risks and unwanted access to private data. However, the one does not necessarily go hand in hand with the other. Regularly updating the operating system of one's computer might be effective security behavior, but it does not prevent the provider from collecting private data. The exact relationship between privacy and security therefore remains highly relevant. If both privacy and security behavior are to be effectively enhanced, it must be understood how the two are related, whether they are conceptually similar or different, and whether the same factors influence privacy and security behavior in similar ways. Only on the basis of such an understanding can appropriate interventions and software be developed that support users in their need for both privacy and security.

To address this issue, an online study representative for the German population (with regard to age, gender, state, income and education) with 1,219 participants was conducted. In the following sections, related work is presented (Sect. 2), followed by the hypotheses (Sect. 3) and the methods applied (Sect. 4). After the presentation of the results (Sect. 5), the findings are discussed in a broader context (Sect. 6) and conclusions are drawn (Sect. 7).

2 Related Work

Theoretical Conceptualizations.

The causes of existing insufficiencies and possibilities for improving both privacy and security behavior are being studied intensively. IT security in general refers to the protection of computer systems from theft of and damage to hardware, software and information, as well as from the disruption of the services they are supposed to provide [9]. A useful conceptualization of this protection is provided by the so-called CIA triad: secure IT systems should maintain confidentiality, integrity and availability [10]. Confidentiality refers to the prevention of unauthorized viewing, integrity to the prevention of unauthorized modification, and availability to the preservation of access [10]. Based on these definitions, security does not necessarily cover the privacy domain, but may incorporate it to some extent. There is a particular overlap in the factor confidentiality, since unauthorized viewing is associated both with unauthorized access as a security breach and with the possible exposure of sensitive information about individuals as a privacy breach. Integrity and availability, on the other hand, describe factors that can be distinguished from privacy more easily.

Privacy in general refers to the prevention of the exposure of sensitive information about (groups of) individuals. This includes, among other things, the nondisclosure of behavior, communications and descriptive personal data [11]. The general understanding of the term “privacy” today is still quite close to Westin’s widely known definition from 1967, which describes privacy as “the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others” [12]. However, preserving privacy in the rapidly changing digital environment is much more difficult today, which may be one reason why there is still no general agreement on the exact scope of the term. Since the focus of this study is on the exposure of sensitive information in the realm of information technology, we refer to privacy in the IT context throughout this manuscript.

Based on these conceptualizations, privacy and security can both be seen as essential protections that are related to a certain degree, especially in the factor confidentiality, which concerns the unauthorized viewing of data and is relevant for both. Nevertheless, they can also vary widely in which specific elements they protect. While security refers to protection in a more general way, privacy refers specifically to the protection of personal, informational data.

Previously, the technology threat avoidance theory (TTAT) has been introduced as a framework to better understand personal motivations when facing IT threats [13, 14]. The TTAT conceptualizes the cognitive processes that take place when individuals appraise threats and seek solutions with the goal of avoiding technology-related threats. Although the TTAT does not explicitly distinguish between privacy and security, both represent essential areas in which IT threats can be avoided. TTAT posits that, when individuals are confronted with IT threats, two processes, threat appraisal and coping appraisal, take place and determine the response to the threat [14]. While privacy and security have their common ground in representing IT-related threats, they could also differ in these processes. For example, security threats such as ransomware often have immediate negative effects for users, while privacy threats often have negative consequences only at a later stage and on a societal rather than individual level (as in the Cambridge Analytica case). Thus, depending on whether the threat is a security or a privacy threat, the threat appraisal could differ and result in different behavior. Taken together, the TTAT provides a framework on the basis of which privacy and security behavior can be expected to be related to a certain extent, but nevertheless to differ in specific aspects.

Empirical Conceptualizations.

In order to find relevant literature, we searched several databases (IEEE Xplore, Web of Science, ACM Digital Library) for the combination of the search terms privacy, security and relationship. After initially including several studies containing both privacy and security even without a specific conceptualization of their relationship, in order to illustrate the problem, we proceeded to include only studies making some kind of statement about the presumed relationship. This approach revealed that, despite the reported differences on a theoretical level, privacy and security (and the corresponding behavior) are often used together without a finer distinction. They are treated as largely identical, with the (mostly implicit) assumption that they describe a common construct. One study, for example, argues for the importance of usable privacy and security and shows how social processes play a major role in a number of privacy- and security-related behaviors [15]. However, instead of explicitly conceptualizing the relationship between security and privacy, both terms are mainly used in combination. Similar cases in which privacy is not disentangled from security behavior can be found throughout the literature [16,17,18]. Only few studies explicitly justify treating privacy and security as closely related. In one instance, for example, it is explicitly argued that they are indeed closely related and might best be conceptualized as two dimensions of a single construct [19].

Apart from studies that cover privacy and security as closely related or without an explicit conceptualization, some voices argue for a finer distinction between privacy and security and define these concepts more explicitly in distinction to each other [20,21,22,23]. Bansal, for example, distinguishes privacy and security by developing a scale with dimensions that are unique to security concerns and show no overlap with privacy concerns, such as data integrity, authentication and improper access during transmission [24]. Pavlou also explicitly distinguishes information privacy concerns and information security concerns as distinct antecedents of purchasing behavior in an online environment, representing uncertainty factors [25]. Finally, Oetzel and Krumay distinguish privacy and security conceptually and on the basis of a content analysis of company websites, even though they acknowledge that the concepts are related to a certain degree [26]. One group of studies explicitly examines the relationship between privacy and security attitudes and finds that they are not equally influenced by individual characteristics, with the correlation between privacy and security attitudes being only weak [27, 28].

Finally, some studies use a hierarchical approach to conceptualize privacy and security, although sometimes only implicitly. In one study, influencing factors on privacy and security behavior are discussed without a clear distinction between the two concepts [17]. Implicitly, however, privacy is treated as a subcategory of security concerned with protecting access to personal data. The subsumption of privacy into the security domain is confirmed by further studies which define information privacy as part of the broader construct web security [29] or, more generally, as part of a security framework [30]. The opposite hierarchical relationship has also been suggested, e.g. in the sense that improper access to data as a security concern can be considered part of the superordinate category privacy [31]. An overview of the most commonly proposed relationships is provided in Fig. 1.

Influencing Factors on Privacy and Security Behavior.

In order to better conceptualize the relationship between privacy and security behavior, it is also promising to examine it from different points of view and to analyze how factors such as age, gender, education and political ideology influence the corresponding behavior. Age and gender, for example, have previously been associated with differences in security behavior. It has mainly been shown that women report less security knowledge, experience and behavior than men [32, 33]. With regard to age, especially younger people below 25 years have been associated with weaker security behavior [34, 35]. As for education, it has previously been shown that those with higher levels of education tend to be more concerned about privacy [36] and show more security awareness [37]. Political ideology has so far mainly been reported as relevant to privacy attitudes and behavior [7, 38, 39]. There is a consensus in this respect that people who see themselves as rather left-wing are more critical of the (predominantly state-organized) collection of data on individuals. If the concepts of privacy and security were indeed as closely related as they are often portrayed, the influencing factors described above should apply to both behaviors: political ideology should influence security behavior, age and gender should influence privacy behavior, and education should have similar effects on both privacy and security behavior.

Fig. 1. Conceptualizations of the relationship between privacy and security proposed in the literature (inspired by [20])

Importantly, privacy and security attitudes and behavior can differ between cultures [40, 41]. We therefore focus on a sample from Germany, providing an opportunity for cross-country comparisons in future studies. Private users are of special interest since everyday behavior in the digital realm can have negative consequences for everyone, such as the security incidents (e.g. mobile malware) and privacy breaches (e.g. Cambridge Analytica) described above.

Research Gap.

Generally, there is no consensus with regard to the relationship between privacy and security, and plenty of studies using both terms do not conceptualize their relationship at all, but use both in parallel and assume some kind of implicit, close relationship. Importantly, the vast majority of studies that do try to conceptualize the relationship focus either on theoretical considerations or on privacy/security attitudes as opposed to behavior. Thus, there is a gap with regard to illuminating the relationship between privacy and security behavior. If the ultimate goal is to increase privacy and security behavior, which is a desirable objective as previously outlined, further empirical data on the relationship between them is needed. In particular, the question to what extent privacy behavior goes hand in hand with security behavior, and thus whether both could eventually be improved by similar interventions and technical implementations, has been neglected so far. Accordingly, this study aims to answer the following research question: “Are privacy and security behavior closely related and similarly influenced by demographic factors and political ideology?”

3 Hypotheses

In order to fill the described research gap, this study investigates the relationship between privacy and security behavior of private users in Germany, taking into account demographic factors such as gender, age and education as well as political ideology. Private users are defined as individuals who use information and communication technology, such as computers and the internet, for personal purposes. Based on the literature review, we do not expect privacy and security behavior to be entirely unrelated or to constitute completely separable domains. Instead, we aim to illuminate the relationship between the two by assessing the correlation between the corresponding behaviors and the factors influencing them. Since the literature describes privacy and security (sometimes implicitly) predominantly as closely related, a correlation and a similar influence of demographic factors and political ideology on both privacy and security behavior is expected. However, no assumptions are made about the expected strength of the correlation, as there is preliminary evidence suggesting that privacy and security may be conceptually more distinct than they are often treated in the literature. As demographic factors have previously been shown to influence security behavior in particular, these factors should influence privacy behavior in a similar way if both were conceptually closely related. Similarly, political ideology, which has previously been shown to influence privacy behavior, should also influence security behavior if both were closely related. If these factors did not influence the corresponding behaviors in a similar way, this would indicate that the two concepts need to be distinguished more thoroughly. Based on the previously reviewed literature, the following hypotheses are therefore postulated:

  • H1: Privacy behavior and security behavior correlate positively.

  • H2: Demographic factors such as gender, age, and education have a similar influence on both privacy and security behavior.

  • H3: Political ideology has a similar influence on both privacy and security behavior.

4 Method

4.1 Study Design and Participants

To assess privacy and security behavior and their relationship, a representative online survey of German citizens was conducted in May 2019, using LimeSurvey and the panel provider GapFish (Berlin). GapFish is certified according to ISO 26362, which ensures the quality of access panels in market, opinion and social research [42]. During data collection, the sample (N = 1,219) was matched by the panel provider, using corresponding quotas, to the distribution of age, gender, income, region and education in the general German population [43, 44]. The sample covers an age range from 14 to 87 years; 52% of the participants are women and 48% are men.

The survey included four questions related to security behavior and eight questions related to privacy behavior. The overall survey further included questions on security and privacy knowledge, media use in crisis situations and data misuse, which, however, are not part of this study. As the privacy and security behavior questions were posed before the other questions, a possible bias from the other questions can be ruled out. Answers to the items were given on a 5-point rating scale according to Rohrmann, ranging from 1 – I disagree to 5 – I strongly agree [45]. To obtain more reliable answers, the option no answer was provided for all questions, and all questions were posed in German.

The items were developed based on the recommendations of the German Federal Office for Information Security (BSI) on how to secure one’s computer, smartphone and online generated data [46, 47]. Some survey instruments already exist with regard to privacy and security. However, we found none to be suitable for our specific case, in which we wanted to analyze the everyday behavior of German private users. The Human Aspects of Information Security Questionnaire (HAIS-Q), for example, aims at evaluating information security threats caused by employees within organizations rather than assessing private users in their everyday life [48], and the Internet Users’ Information Privacy Concerns (IUIPC) scale focuses on attitudes rather than actual behavior [49]. For the item development we therefore focused on (1) behavioral actions rather than intentions, (2) private users in their everyday life as opposed to specific (e.g. work-related) contexts and (3) contexts suitable for German private users. The latter was the main motivation for using the recommendations of a German institution such as the BSI. The recommendations do not explicitly distinguish between privacy and security behavior but rather touch on both topics. For the purpose of this study, however, the resulting items were assigned to either privacy or security based on face validity. Since the recommendations do not explicitly distinguish between privacy and security behavior and we wanted to include all recommendations to cover enough topics, an unequal number of items for assessing privacy and security behavior resulted. With regard to their security behavior, participants answered questions such as whether they install software updates immediately or use antivirus software. With regard to their privacy behavior, participants answered questions such as whether they inform themselves about the privacy policy of apps before installing them, or avoid online services that require a name or e-mail address (an overview of all items used in the analysis can be found in Figs. 2 and 3).

Because the aim of this study was to evaluate the relationship between the privacy and security behavior of the German population with regard to demographics such as age, gender and education, but also political ideology, corresponding questions were included in the survey. For the latter, two items asked for the personal opinion regarding the responsibility for data protection on the internet (state vs. company), since different political ideologies can be expected to yield different answers here (e.g. more left-wing socialist types might expect greater state interference than more right-wing liberal types [50]). Participants were asked whether they think that the state is responsible for data protection on the internet (item 1) and whether they think that the companies collecting the data are responsible for data protection on the internet (item 2). These items were developed based on theoretical considerations, and answers were given on the same 5-point Rohrmann scale as the other items. Another item asked directly about political orientation on a left-to-right spectrum (left-wing, fairly left-wing, center, fairly right-wing, right-wing).

4.2 Analysis

The analysis was conducted using Microsoft Excel and R (version 4.0.2) in RStudio. Answers with the rating no answer were excluded as missing values from the subsequent analysis. An initial descriptive analysis of the items of both the privacy behavior scale and the security behavior scale was conducted. The reliability of the corresponding scales was investigated based on their internal consistency (Cronbach’s Alpha). In order to find group differences, participants were grouped into roughly equal age categories (15–29, 30–44, 45–59, >60). Education levels were grouped into three categories: low (no degree and German Hauptschul-degree), medium (German Realschul-degree) and high (high school and university degree). The individual level of privacy and security behavior was determined by calculating the mean across all items of the corresponding scale. The factor attribution of responsibility for data protection on the internet was derived from the two items concerning state or company responsibility: participants who reported a higher responsibility for the state than for the company were assigned to the factor level state, and vice versa.
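
To make these preparation steps concrete, the following R sketch shows how the scale scores, reliability estimates and group factors described above could be derived. All column and file names (survey_data.csv, age, edu, p1–p8, s1–s4, resp_state, resp_company) are hypothetical placeholders; the code actually used for the study may differ.

```r
# Sketch of the data preparation (hypothetical column names, not the authors' code)
library(psych)  # alpha() for internal consistency

df <- read.csv("survey_data.csv", na.strings = "no answer")  # hypothetical file

priv_items <- paste0("p", 1:8)  # eight privacy behavior items
sec_items  <- paste0("s", 1:4)  # four security behavior items

# Internal consistency (Cronbach's alpha); the item statistics in the output
# can be used to spot items with a low item-total correlation
alpha_priv <- psych::alpha(df[priv_items])
alpha_sec  <- psych::alpha(df[sec_items])

# Individual behavior scores: mean across the items of each scale
df$privacy_behavior  <- rowMeans(df[priv_items], na.rm = TRUE)
df$security_behavior <- rowMeans(df[sec_items],  na.rm = TRUE)

# Age and education categories as described in the text
df$age_group <- cut(df$age, breaks = c(-Inf, 29, 44, 59, Inf),
                    labels = c("15-29", "30-44", "45-59", ">60"))
df$edu_group <- factor(df$edu, levels = c("low", "medium", "high"))

# Attribution of responsibility for data protection on the internet:
# whichever of the two items received the higher rating (ties would need a rule)
df$responsibility <- ifelse(df$resp_state > df$resp_company, "state", "company")
```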

Differences in privacy and security behavior depending on the group factors gender, age and education were analyzed using a multivariate analysis of variance (MANOVA). A separate MANOVA was carried out for the factors political orientation and attribution of responsibility for data protection on the internet (together representing political ideology), as they can be assigned to a different theoretical framework than the former factors. Since the assumptions of multivariate normality and homogeneity of covariance matrices could not be confirmed for the available data, a parametric bootstrap resampling approach with 10,000 iterations was used to calculate the test statistics. This method was implemented using the MANOVA function from the R package “MANOVA.RM” [51].
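
As an illustration, a multivariate model of this kind could be specified with MANOVA.RM roughly as follows. The sketch uses the wide-format interface MANOVA.wide() and the hypothetical variable names from the preparation sketch above; the authors' actual call (via the MANOVA function and their own variable names) may differ.

```r
# Sketch of the multivariate tests with parametric bootstrap (MANOVA.RM)
library(MANOVA.RM)

# Demographic factors: gender, age group and education level
fit_demo <- MANOVA.wide(cbind(privacy_behavior, security_behavior) ~
                          gender * age_group * edu_group,
                        data = df, iter = 10000, resampling = "paramBS")
summary(fit_demo)  # reports the Wald-type statistic per factor

# Separate model for political ideology (responsibility attribution and
# self-reported political orientation; pol_orientation is a hypothetical name)
fit_pol <- MANOVA.wide(cbind(privacy_behavior, security_behavior) ~
                         responsibility * pol_orientation,
                       data = df, iter = 10000, resampling = "paramBS")
summary(fit_pol)
```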

Subsequent univariate analyses were conducted using factorial analyses of variance (ANOVA) when the corresponding assumptions, such as normal distribution and homogeneity of variances, were fulfilled, and robust factorial ANOVAs with trimmed means (trimming level = 0.2) when they were violated [52, 53]. The robust approach also uses bootstrapping to obtain an empirically derived critical p-value. In this context, no degrees of freedom are reported, since an adjusted critical value is used instead of a critical value based on a known sampling distribution. The reported test statistic Q refers to the robust ANOVA test statistic for trimmed means. Subsequent robust post-hoc tests (test statistic: ψ) for disentangling observed main effects are also based on percentile bootstraps for the p-values [53]. Because all tests were performed on the same sample, the 5% alpha level was corrected with the Bonferroni-Holm method [54].
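
A sketch of such follow-up analyses is given below, assuming the WRS2 package (which implements Wilcox's trimmed-means ANOVAs and percentile-bootstrap post-hoc comparisons referenced above); the function choices and variable names are illustrative rather than the authors' exact code.

```r
# Sketch of the univariate follow-up tests (illustrative, using WRS2)
library(WRS2)

# Classical factorial ANOVA where the assumptions hold (e.g. privacy behavior)
summary(aov(privacy_behavior ~ age_group * edu_group, data = df))

# Robust factorial ANOVA on 20% trimmed means (e.g. security behavior)
t2way(security_behavior ~ age_group * edu_group, data = df, tr = 0.2)

# Robust post-hoc comparisons of trimmed means for one factor,
# with percentile-bootstrap p values
mcppb20(security_behavior ~ age_group, data = df, tr = 0.2, nboot = 2000)

# Bonferroni-Holm correction across the p values of the conducted tests
p_raw <- c(0.79, 0.78, 0.01, 0.02)  # illustrative values only
p.adjust(p_raw, method = "holm")
```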

5 Results

5.1 Descriptive Analysis

To evaluate the reliability of the constructed privacy behavior and security behavior scales, their internal consistency (Cronbach’s Alpha) was analyzed. After two items of the privacy behavior scale and one item of the security behavior scale were rejected and excluded from further analyses (due to a low correlation of the item with the overall scale), the scales showed moderate values of α = .72 (privacy behavior) and α = .65 (security behavior). Internal consistency is usually considered acceptable from around α = .70 [55]. A possible underestimation of α due to few and heterogeneous items is a known phenomenon, which can be neglected to a certain degree, since the analysis does not focus on individual scores but on aggregated group scores, which are less strongly affected by measurement error resulting from lower reliability [55].

Fig. 2. Percentage frequencies for the questions of the privacy behavior category, N = 1,219.

A descriptive analysis of the responses gave a nuanced picture of the participants' self-reported privacy behavior. As shown in Fig. 2 (“Rejection” combines “I strongly disagree” and “I hardly agree” answers, while “Agreement” combines “I fairly agree” and “I strongly agree” answers), agreement and rejection are mostly balanced across the privacy-related items, with the exception of one item (“I avoid online services that require a name/email address”), with which only 12% of participants agree and 58% disagree. Moreover, a fairly high percentage of participants were undecided about their privacy behavior, with undecided responses ranging from 17% to 30% across items.

A descriptive analysis of the responses with regard to security revealed that the majority of participants reported a rather high level of security behavior. Figure 3 shows the percentage frequencies for the corresponding security behavior items. Agreement with each security-related item exceeds both rejection (70% vs. 18%, 51% vs. 25%, 58% vs. 18%) and undecided or no answer responses (which are all below 20%).

Fig. 3. Percentage frequencies for the questions of the security behavior category, N = 1,219.

5.2 Hypothesis Testing

H1: Privacy Behavior and Security Behavior Correlate.

To test H1, a Spearman’s rank correlation was calculated between the mean values of privacy and security behavior across the corresponding items. The correlation was weakly positive, r = .18, p < .001. Overall privacy behavior (M = 2.81, SD = 0.86) was considerably lower than overall security behavior (M = 3.76, SD = 1.07) across all participants.
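
For reference, the correlation and descriptive statistics reported here could be computed in R roughly as follows (variable names as in the earlier sketches; the exact code used for the study may differ).

```r
# Spearman's rank correlation between the two behavior scores
cor.test(df$privacy_behavior, df$security_behavior,
         method = "spearman", exact = FALSE)

# Means and standard deviations of the two scores
mean(df$privacy_behavior, na.rm = TRUE);  sd(df$privacy_behavior, na.rm = TRUE)
mean(df$security_behavior, na.rm = TRUE); sd(df$security_behavior, na.rm = TRUE)
```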

H2: Demographic Factors such as Gender, Age, and Education have a Similar Influence on Both Privacy and Security Behavior.

One main goal of the analysis was to assess whether privacy and security behavior can be considered conceptually closely related. If that were the case, gender, age and education should have a similar influence on both privacy and security behavior. The corresponding robust MANOVA revealed that while gender did not influence privacy and security behavior at all (Wald-type statistic: W(2) = 0.85, p = .99), both age (W(6) = 32.11, p < .001) and education (W(4) = 21.61, p = .003) influenced privacy and security behavior. To disentangle these effects, univariate ANOVAs were conducted separately for privacy behavior and security behavior. These revealed that neither age (F(1,995) = 0.07, p = .79) nor education (F(1,995) = 2.01, p = .78) influenced privacy behavior. In contrast, security behavior was significantly influenced by both age (Q = 44.94, p = .01) and education (Q = 12.88, p = .02). Subsequent robust post-hoc comparisons showed that young people below the age of 30 reported significantly less security behavior than those in the age groups 30–44 (ψ = −0.69, p < .001) and 45–59 (ψ = −0.81, p = .001). Furthermore, people in the age group over 60 reported significantly higher security behavior than those in the age group 30–44 (ψ = −1.08, p < .001), but significantly lower security behavior than those in the age group 45–59 (ψ = −1.20, p < .001) (see Table 1).

Table 1. Trimmed means (trimming level = 0.2) and standard deviations of the security behavior score per age category

With regard to the level of education, robust post-hoc comparisons revealed that those with low education reported significantly lower security behavior than both those with medium education (ψ = −0.94, p < .001) and those with high education (ψ = 1.30, p < .001) (see Table 2).

Taken together, the results show that contrary to expectations, privacy and security behavior are not similarly influenced by the categories of gender, age and education. While different groups with regard to age and education report significantly diverging security behavior, no such differences are seen for privacy behavior. Thus, the hypothesis could not be confirmed based on the current data.

Table 2. Trimmed means (trimming level = 0.2) and standard deviations of the security behavior score per education category

H3: Political Ideology has a Similar Influence on both Privacy and Security Behavior.

Besides the demographic factors gender, age and education, it was hypothesized that political ideology might influence privacy and security behavior. Again, if privacy and security can be considered conceptually closely related, political ideology should have a similar influence on both. The corresponding MANOVA included the factors attribution of responsibility for data protection on the internet (state vs. company) and political orientation (left-wing, fairly left-wing, center, fairly right-wing, right-wing). The results showed that neither the attribution of responsibility (W(8) = 17.51, p = .18) nor political orientation (W(4) = 5.56, p = .94) was significantly associated with privacy or security behavior.

6 Discussion

Summary of Results.

The main goal of this study was to quantify the relationship between privacy and security behavior and to assess whether they can be regarded as closely related. We tried to illuminate this relationship from different points of view by examining whether the corresponding behaviors correlate and whether they are similarly influenced by demographic factors and political ideology. Only then would it be valid to neither disentangle them nor explicitly explain their relationship when researching these concepts, as is often the case. However, the present results show that privacy and security behavior are only weakly correlated. Furthermore, the factors influencing privacy and security behavior are not consistent. While young people (<30) and those with low education (no degree and German Hauptschul-degree) reported significantly less security behavior than older and more educated people, no such differences could be found for privacy behavior. Political ideology had no influence on either privacy or security behavior.

Relationship Between Privacy and Security Behavior.

Based on these results, the notion that privacy and security are closely linked, and that those who behave securely necessarily also behave in a privacy-protective way, must be questioned. This finding stands in contrast to research that does not explicitly distinguish between privacy and security but uses both in parallel [16,17,18]. Thus, there is a danger that interventions which primarily target security improvements might be falsely credited with privacy improvements, and vice versa. This could be relevant, for example, both for the education of children and adults with regard to improving privacy and security behavior, and for software developers, who need to be aware of how they view the relation between privacy and security and to what extent each shall be protected.

The findings of this study add to the existing literature especially with regard to the examination of privacy and security behavior as opposed to the corresponding attitudes. They are in line with findings that attitudes towards privacy and security are also not similarly influenced by personality characteristics and that the correlation between privacy and security attitudes is only weak [27, 28]. Existing evidence according to which individuals differ in their privacy needs based on their political ideology [38, 39] could not be confirmed for the corresponding behavior. One reason for this could be that we assessed political orientation on a 5-point scale. Even though it can be argued that too many points can confuse respondents, there is evidence that 10-point and 11-point left-to-right scales can lead to higher validity [56]. Thus, we might have been able to detect corresponding effects if we had used a more fine-grained scale.

In general, there is still no consensus on the exact relationship between privacy and security. Sometimes implicitly, sometimes explicitly, hierarchical relationships are proposed (privacy as part of security [29, 30, 57], security as part of privacy [31]), both are described as rather separable constructs [25, 26], or as related dimensions of one underlying construct [15, 19]. Since we found at least some correlation between privacy and security behavior but could not identify demographic factors or political ideology as their common drivers, the question arises where the common ground between privacy and security could lie.

As previously outlined, the TTAT might provide a suitable framework for conceptualizing both the similarities and the differences between privacy and security. The TTAT makes assumptions about cognitive processes such as threat appraisal and coping appraisal, which determine subsequent behavior in the face of IT-related threats [14]. Threat appraisal includes the perceived susceptibility and the perceived severity, i.e. the gravity of the consequences associated with an IT threat. While TTAT does not explicitly distinguish between privacy- and security-related IT threats, the perceived severity of the corresponding threat could be a dimension on which privacy and security behavior are differentially influenced. Specifically, only if individuals consider the unregulated collection of personal data to have grave consequences will they engage in behavior that prevents it, and thus show high privacy behavior. However, since the consequences of a security threat such as a computer virus are usually more immediate, individuals could show high security behavior and at the same time underestimate the consequences of not protecting their privacy, and thus show low privacy behavior. Consequently, there might be a common factor such as the avoidance of technology-related threats in general, as posited by the TTAT, which explains why privacy and security behavior are correlated, albeit weakly. However, in certain aspects of this common factor, such as the exact threat appraisal via the perceived severity of the IT threat, which depends on the core beliefs of an individual, differences in privacy and security behavior might arise. This would explain why factors such as age and education have a differential influence on privacy and security behavior. Given the weak correlation and the inconsistent role of demographic factors and political ideology, it is not obvious whether our results suggest that privacy and security overlap as distinct concepts or whether they should rather be seen as two dimensions of a common construct. Combined with the considerations presented in the light of the TTAT, however, we suggest that privacy and security behavior might best be conceptualized as two dimensions of a common construct which, based on TTAT, possibly represents some form of technology threat avoidance.

Limitations.

Some limitations of this study need to be considered before drawing broad conclusions. First, (1) the results are based on the participants’ self-reported privacy and security behavior, which is not necessarily identical to their actual behavior. The discrepancy between intentions and actual behavior has been reported before [7, 58] and represents a general limitation of the survey methodology. Furthermore, (2) the items used can only be seen as an approximation of the surveyed constructs because no previously validated questionnaire was used. This caveat regarding the validity of the scales was confirmed by a rather low internal consistency, especially with regard to security behavior. The exact wording of the items could be refined; e.g., disabling Wi-Fi on one’s smartphone could be more relevant in public than at home. In addition, (3) relatively few items were used to assess complex privacy and security behavior with many potential influencing factors [59, 60], a problem exacerbated by the elimination of items due to their low correlation with the corresponding behavior scale. Consequently, the items should be reviewed and revised. However, since the items were based on recommendations of the German Federal Office for Information Security, they are still considered sufficiently suitable as an approximation to the described topic.

7 Conclusion

In view of ever-increasing threats to privacy and security, methods to improve both privacy and security behavior are being studied intensively. However, an explicit conceptualization of the relationship between privacy and security is often missing, even though both terms are usually used in combination. In general, there is no consensus on how best to describe the relationship and the extent to which one goes hand in hand with the other. Based on the results of this study, we found that the privacy and security behavior of German private users correlate only weakly and are differentially influenced by demographic factors such as age and education. Thus, even though privacy and security are often treated as closely related concepts, it is not necessarily possible to improve security behavior and rely on privacy behavior improving automatically (and vice versa). Instead, a fine-grained differentiation is necessary if privacy or security behavior in particular is to be improved. The results of this study shed light on the relationship in that there might exist a common driver which influences both privacy and security behavior to a certain degree, but which we could not show to be related to demographics or political ideology. Future studies should take a step back from the circumscribed concepts of privacy and security and explicitly try to uncover common drivers of the corresponding behaviors. The findings of this study should also be validated, taking into account the described limitations. Only through such studies and a better understanding of the concepts and the relationship between privacy and security behavior can they be effectively improved and private users be empowered to meet the challenges of the digital realm.