
1 Introduction

The relationship between privacy and security has often been understood as a zero-sum game, whereby any increase in security inevitably means a reduction in the privacy enjoyed by citizens. A typical incarnation of this thinking is the all-too-common argument: “If you have got nothing to hide, you have got nothing to fear”. This trade-off model has, however, been criticised because it approaches privacy and security in abstract terms and because it reduces public opinion to one specific attitude, which considers surveillance technologies to be useful in terms of security but potentially harmful in terms of privacy [23, 25]. While some people consider privacy and security to be intrinsically intertwined conditions, where an increase in one inevitably means a decrease in the other, there are also other views: there are those who are very sceptical about surveillance technologies and question whether their implementation can be considered beneficial in any way; there are people who do not consider monitoring technologies problematic at all and do not see their privacy threatened in any way by their proliferation; and, finally, there are those who doubt that surveillance technologies are effective enough in the prevention and detection of crime and terrorism to justify the infringement of privacy they cause [17].

Insight into the public understanding of security measures is important for decision makers in industry and politics, who are often surprised by negative public reactions showing that citizens are not willing to sacrifice their privacy for a little more potential security. Against this background the PRISMS project aimed to answer, inter alia, the following question: if there is no simple trade-off between privacy and security perceptions, what are the main factors that affect the perception and ultimately the acceptance of specific security technologies, specific security contexts and specific security-related surveillance practices?

The PRISMS project has approached this question by conducting a large-scale survey of European citizens. In [12] we have shown that the privacy and security attitudes of European citizens are largely independent of one another. Here we explore which factors influence citizens’ perceptions of surveillance-based security practices. This is, however, not simply a matter of gathering data from a public opinion survey, as such questions have intricate conceptual, methodological and empirical dimensions. Citizens are influenced by a multitude of factors; for example, privacy and security may be experienced differently in different political and socio-cultural contexts. In this chapter, however, our focus is on the survey results, not their interpretation from different disciplinary perspectives.

2 Theoretical Approach

Researchers investigating the relationship between privacy and security have to deal with the so-called privacy paradox [8]: it is well known that while European citizens are concerned about how governments and the private sector collect data about citizens and consumers, these same citizens seem happy to freely give up personal and private information when they use the Internet. This “paradox” is not really paradoxical but represents a typical value-action gap, which has been observed in other fields as well [12].

2.1 Social Facts

Measuring privacy and security perceptions thus has to deal with problems similar to those ecopsychology faced at the beginning of the environmental movement in the 1970s: what is the relationship between general values and concrete (environmental) concerns, and how do they translate into individual behaviour? In PRISMS we have been inspired by the “theory of planned behaviour” (TPB), which suggests that if people evaluate a suggested behaviour as positive (attitude), and if they think their significant others want them to perform the behaviour (subjective norm), this results in a higher intention, and they are more likely to behave in a certain way (Fig. 1).

Fig. 1. Model of “planned behaviour” [1, p. 194]

TPB is a positivist approach, as it assumes that there are rules structuring the way people think and that these “social facts”, as Émile Durkheim called them, can be verified by scientific observation and experimentation [9]. We assume that the privacy and security perceptions of human beings are such social facts and that they can be explained by other attributes (variables) on an aggregated level. We are aware that this assumption has been criticised from other epistemological perspectives such as the critical school, cultural studies and science and technology studies (STS), which highlight that attitudes and values may be situationally determined rather than stable dispositions and that a number of contextual factors may limit individual choice [7]. On the other hand, a high correlation of attitudes and subjective norms with behavioural intention, and subsequently with behaviour, has been confirmed in many studies [1].

2.2 Operationalisation of Central Concepts

As a consequence the PRISMS survey comprises questions exploring respondents’ perceptions of privacy and security issues as well as questions on values, including political views, attitudes to rights and perceptions of technology. For the operationalisation of the central concepts we rely on the privacy typology by Finn et al. [10] and the security typology by Lagazio [18], each distinguishing seven different dimensions. These typologies could be used to design batteries of questions addressing the wide spectrum of meanings of privacy and security.

To address the ambiguity and context dependence of the central concepts, the PRISMS survey works with so-called vignettes, which are used when survey respondents may understand survey questions in different ways, due to the abstractness of the presented concepts (privacy, security), their complexity (security technologies and practices) and because respondents come from different cultures. Vignettes translate theoretical definitions of complicated concepts into hypothetical situations; respondents are then asked questions about these situations to reveal their perceptions and values [22]. We have developed eight different vignettes (very short narratives of 50 to 100 words) presenting different types of security situations and surveillance technologies. Together they cover all dimensions of privacy and security. For each of the vignettes citizens were asked whether they think that the respective security-oriented surveillance practice should be used (“acceptance”) and to what extent these practices threaten people’s rights and freedoms (“intrusiveness”).
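To make this concrete, the following minimal sketch (in Python; the field names and the five-point scales are our own illustrative assumptions, not the actual PRISMS coding) shows how a single vignette reaction could be recorded as an acceptance and an intrusiveness rating:

```python
from dataclasses import dataclass

@dataclass
class VignetteResponse:
    """One respondent's reaction to one vignette (illustrative encoding only)."""
    respondent_id: str
    vignette_id: int    # 1..8, one of the eight short narratives
    acceptance: int     # e.g. 1 = "should definitely not be used" ... 5 = "should definitely be used"
    intrusiveness: int  # e.g. 1 = "does not threaten rights at all" ... 5 = "threatens rights a lot"

# Example: a respondent who finds the practice acceptable but somewhat intrusive
response = VignetteResponse(respondent_id="DE-00123", vignette_id=3,
                            acceptance=4, intrusiveness=3)
print(response)
```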

2.3 Questionnaire and Variables

For our research question we have modified and extended the general TPB model (see Fig. 2); the extended model includes demographic and structural factors and already suggests some interrelationships between the model elements (cf. [4, 20, 24] for similar attempts).

Fig. 2. Suggested relationships between variables explaining privacy and security perceptions and acceptance of security practices

The questionnaire used for the fieldwork therefore asked not only for an assessment of the central concepts of privacy and security and of the acceptability and perceived intrusiveness of different security-oriented surveillance practices, but also covered the variables needed for the model, grouped as follows (a schematic grouping is sketched after the list):

  • Individual characteristics: Age, gender, education, political orientation, geographic area (country, region), employment status, trust in people, attitude towards the benefits and risks of science and technology, membership of a minority (self-assessment)

  • Experience, behaviour: Intensity of Internet use, experience with privacy invasions, experience with privacy preserving measures, perceived intrusiveness of security practice

  • Knowledge: Privacy and data protection knowledge

  • Interim target variables: Trust in institutions, security perceptions, privacy perceptions

  • Final target variable: Vignette acceptance
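The variable groups listed above can be read as the column layout of the analysis data set. The sketch below (Python; the shorthand names are our own, not the official PRISMS variable names) groups them into model inputs, interim target variables and the final target variable:

```python
# Illustrative grouping of the model variables (shorthand names are assumptions)
MODEL_VARIABLES = {
    "individual_characteristics": [
        "age", "gender", "education", "political_orientation", "geographic_area",
        "employment_status", "trust_in_people", "science_technology_attitude",
        "minority_self_assessment",
    ],
    "experience_behaviour": [
        "internet_use_intensity", "privacy_invasion_experience",
        "privacy_protection_experience", "perceived_intrusiveness",
    ],
    "knowledge": ["privacy_data_protection_knowledge"],
    "interim_targets": ["trust_in_institutions", "security_perception",
                        "privacy_perception"],
    "final_target": ["vignette_acceptance"],
}

# Quick sanity check: the count is close to the 17 variables of the final model
# described below (the exact composition of the fitted model may differ).
print(sum(len(group) for group in MODEL_VARIABLES.values()))
```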

2.4 Fieldwork

Fieldwork took place between February and June 2014. The survey company Ipsos MORI conducted around 1,000 thirty-minute telephone interviews in each EU member state except Croatia (27,195 in total), using a representative sample (based on age, gender, work status and region) within each country. For economic reasons each interviewee was presented with only four randomly selected vignettes, resulting in approximately 13,600 responses for each vignette (around 500 per country).
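As a quick plausibility check, the reported numbers can be reproduced with a short calculation (a sketch; the actual per-country counts varied slightly during fieldwork):

```python
# Rough arithmetic behind the reported sample sizes (illustrative only)
countries = 27                  # all EU member states at the time except Croatia
total_interviews = 27_195       # reported total (~1,000 per country)

vignettes_total = 8
vignettes_per_interview = 4     # each respondent saw four randomly selected vignettes

responses_per_vignette = total_interviews * vignettes_per_interview / vignettes_total
print(round(responses_per_vignette))              # ~13,600 responses per vignette
print(round(responses_per_vignette / countries))  # ~500 responses per country and vignette
```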

3 Empirical Results

3.1 Concept and Methodology

Structural equation modelling (SEM) is a method used to study the relationships among multiple outcomes involving latent variables. In this respect SEM is similar to the regression models that were used to test whether linear correlations exist between the different variables. However, SEM makes it possible to estimate and test direct and indirect effects in a more complex system of regression equations and to verify (or falsify) theories about the absence of relationships among latent variables [15]. For instance, in developing the SEM we tested not only the direct influence of demographic variables such as age on constructs such as privacy and security perceptions and on the acceptance of the vignettes, but also the indirect influence of these demographic variables on acceptance via the constructs.
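To illustrate what “direct” and “indirect” effects mean in such a system of regression equations, the following self-contained sketch (Python with NumPy; the data are synthetic and the coefficients are not PRISMS estimates) decomposes the total effect of a demographic variable on acceptance into a direct part and an indirect part routed via a construct:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5_000

# Synthetic data: age influences a construct (privacy perception), and both
# influence the target variable (acceptance).
age = rng.normal(size=n)
privacy_perception = 0.4 * age + rng.normal(size=n)                       # path a
acceptance = -0.2 * age + 0.6 * privacy_perception + rng.normal(size=n)   # paths c' and b

def ols(y, predictors):
    """Ordinary least squares with an intercept; returns the slope coefficients."""
    X = np.column_stack([np.ones(len(y))] + predictors)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

a, = ols(privacy_perception, [age])                       # age -> construct
c_direct, b = ols(acceptance, [age, privacy_perception])  # direct path and construct -> acceptance

indirect = a * b             # the part of the age effect routed via the construct
total = c_direct + indirect
print(f"direct={c_direct:.2f}, indirect={indirect:.2f}, total={total:.2f}")
```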

The main task in the development of an SEM is to reduce the large number of possible connections between the variables by deleting connections that do not show a statistically significant impact on the target variable. This is done iteratively until a number of benchmarks indicate a good model fit.
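The pruning logic can be sketched as a simple backward-elimination loop. The example below (Python with statsmodels and pandas; a single-equation analogy rather than a full SEM, with an assumed 5% significance threshold instead of the fit benchmarks used in the project) repeatedly drops the least significant predictor until only significant ones remain:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def prune(y, X, alpha=0.05):
    """Backward elimination: drop the weakest predictor until all p-values are below alpha."""
    X = X.copy()
    while True:
        results = sm.OLS(y, sm.add_constant(X)).fit()
        pvalues = results.pvalues.drop("const")
        worst = pvalues.idxmax()
        if pvalues[worst] < alpha:
            return results               # all remaining connections are significant
        X = X.drop(columns=[worst])      # delete the weakest connection and refit

# Synthetic illustration: x3 is pure noise and should be eliminated.
rng = np.random.default_rng(1)
X = pd.DataFrame(rng.normal(size=(1_000, 3)), columns=["x1", "x2", "x3"])
y = 0.5 * X["x1"] - 0.3 * X["x2"] + rng.normal(size=1_000)
print(prune(y, X).params.index.tolist())   # typically ['const', 'x1', 'x2']
```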

The model explores the relationships between the different variables in order to explain which of them influence the acceptance or rejection of the surveillance-based security practices outlined in the scenarios. At the highest level the model no longer distinguishes between the vignettes, between virtual and physical forms of surveillance, or between public and private operators. Even with these generalisations or simplifications the resulting model is rather complex; it includes 17 variables with more than 40 significant correlations. The coefficient of determination \(\mathrm {R}^2\) indicates the fraction by which the variance of the errors is smaller than the variance of the dependent variable. In our case the target variable “acceptance of surveillance oriented security measure” shows \(\mathrm {R}^2=0.484\), which means that almost half of the variability can be explained through the other variables in the model. This is a good value, comparable to similar studies such as [4] or [24].
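Spelled out as a formula (a standard definition of the coefficient of determination, added here for clarity; the numerical reading is ours): \(\mathrm{R}^2 = 1 - \operatorname{Var}(\hat{\varepsilon})/\operatorname{Var}(y)\), so \(\mathrm{R}^2 = 0.484\) implies \(\operatorname{Var}(\hat{\varepsilon}) \approx 0.516 \cdot \operatorname{Var}(y)\); in other words, the model leaves roughly half of the variance of the acceptance variable unexplained.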

Due to the complexity of the model it will be presented in four parts or sub-models to single out important influence factors. Three of the sub-models focus on the main constructs (security perceptions, privacy perceptions and trust in institutions) while the last one discusses the “acceptance of security practices” as the target variable. The data used for the model can be found in Table 1.

The nodes in the following diagrams represent the (influencing) variables that have a significant influence on the target variable (“acceptance of a concrete security practice”). Elliptical nodes represent general demographic variables such as age, gender or education. Rectangular boxes stand for variables that are closely related to the context of surveillance and security practices, such as knowledge about data protection rights or experience with privacy invasions. Hexagonal nodes stand for the main constructs, which are also important mediating variables. The trapezoidal nodes, finally, stand for the target variable(s). The coefficients listed in the second column of Table 1 also appear next to the edges.

3.2 Factors Influencing “Security Concerns”

Figure 3 shows the influences that constitute citizens’ personal security perception (in the context of surveillance-oriented security practices). In contrast to the other constructs, the security perception is strongly influenced by a number of factors.

Experience with prior privacy infringements has a strong positive effect on the security perception; this is in line with the notion that privacy and security are not perceived as competing values, but that privacy is rather seen as an element of security. On the other hand, there are three factors that have a negative influence on the security perception. The higher the education, the less worried citizens are about their security. The other negative influence factors are related to trust: the more people trust their fellow citizens and, in particular, institutions, the lower their security concerns. Apart from these strong influence factors, age, gender and rural-urban classification have a weaker influence on the formation of security perceptions.

Security perceptions in turn have a strong influence on privacy perception (in concrete security contexts!) and finally on the target variables.

Fig. 3. Sub-model for security concerns. Dotted lines = negative influence, solid lines = positive influence, thickness of the line = strength of the influence

3.3 Factors Influencing “Privacy Concerns”

The influence factors on privacy perceptions, the second important construct, are shown in Fig. 4. Privacy perception is shaped by a large number of influence factors, none of them very dominant. The rather strong influence of the personal security perception has already been mentioned. Experience with privacy infringements and with privacy-protecting measures (privacy activism) has a similarly strong influence on privacy perceptions. Minor influence factors include trust, political orientation and privacy knowledge. The educational level has a relatively strong indirect influence, moderated by trust and intensity of Internet use. In summary, the formation of privacy perceptions depends on experience in the context where surveillance takes place and on general knowledge. These two elements help citizens to comprehend the complexity and rationale of a surveillance measure and to assess the possibilities for safeguards.

Fig. 4. Sub-model for privacy concerns. Dotted lines = negative influence, solid lines = positive influence, thickness of the line = strength of the influence

Privacy perceptions are the most important influence factor for citizens’ acceptance or rejection of concrete surveillance-oriented security measures, acting either directly or indirectly via the assessment of intrusiveness.

3.4 Factors Influencing “Trust in Institutions”

As already mentioned, trust in institutions is another important moderating factor in citizens’ assessment of security technologies and practices. Figure 5 shows how the trust construct is influenced by other factors. The most dominant influence is the other dimension of trust, trust in persons, which turns out to be highly correlated with trust in institutions. Another important factor is a person’s political orientation: more conservative (right-wing) persons have a higher trust in institutions such as state agencies, companies and the press. Trust in concrete surveillance/security situations is also influenced by the experiences citizens have had: people who have found their privacy invaded are less trusting of institutions in general. The direct influence of education, gender and rural-urban classification on the formation of trust in institutions is less important. Blinkert has pointed out that trust in institutions is related to the “relative structural effectiveness”, which he defines as a combination of the effectiveness of the state’s monopoly on the legitimate use of force and the extent of social welfare and distributive justice, and which varies greatly between countries and regions [4].

Trust in institutions has no immediate influence on the acceptance of a specific security measure, but plays a strong role in people’s assessment of whether such a measure is intrusive, i.e. whether it threatens or protects people’s fundamental rights. It also has minor effects on the perception of personal security and the perception of privacy, which in turn have a strong effect on acceptance.

Fig. 5. Sub-model for citizens’ trust in institutions. Dotted lines = negative influence, solid lines = positive influence, thickness of the line = strength of the influence

3.5 Factors Influencing Acceptance of Surveillance-Oriented Security Technologies

Figure 6, finally, shows which variables and constructs influence European citizens’ acceptance or rejection of security practices. The most striking result is that the perceived impact of a practice on citizens’ rights (here called intrusiveness) is the most critical factor for its acceptance or rejection, and is itself strongly influenced by trust in institutions. Privacy and security perceptions follow as the next most important factors, albeit with much smaller coefficients. Apart from these three factors, most of the other variables play a direct or indirect role, but with a rather small contribution. The only new demographic variable that has a significant (but still small) influence on acceptance is the general attitude towards science and technology: people with a more positive assessment of its benefits show a greater acceptance.

Fig. 6. Sub-model for acceptance of surveillance-oriented security technologies. Dotted lines = negative influence, solid lines = positive influence, thickness of the line = strength of the influence

Fig. 7. Model of factors influencing acceptance of SOSTs (simplified). Dotted lines = negative influence, solid lines = positive influence, thickness of the line = strength of the influence, size of nodes = overall influence of a factor on acceptance

3.6 The Full Picture

The combination of these sub-models shows not only the impacts described above but also the indirect and cumulative effects. Figure 7 gives a comprehensive picture of the different factors influencing people’s perceptions of privacy and security in the context of concrete applications of surveillance-based security technologies. In this picture each variable (node) also reflects the share it contributes to the manifestation of the target variable: the higher this contribution, the bigger the respective node.

Apart from the perceived intrusiveness, trust in institutions and the general perceptions of privacy and personal security, which have already been discussed, play a significant role in the acceptance of security-oriented surveillance practices. The picture also gives a better impression of the relevance of the different personal characteristics.

Among the individual characteristics, education plays the most important role: the higher the education level, the lower the acceptance of security technology. The influence of education is moderated mainly through three channels: (1) more educated people have a higher level of trust, which influences the perception of intrusiveness; (2) more educated people usually use the Internet more intensively and thus have more experience with the possibilities of online surveillance; and (3) more educated people worry less about their personal security.

The other influential personal characteristic is political orientation: more conservative people have a higher level of trust in institutions, including those operating surveillance-oriented security technology, and thus tend to accept such technologies to a higher degree than more left-wing persons.

It is also noteworthy that age plays a significant role in the model; its influence on the acceptance of surveillance-based security technologies, however, is small.

4 Discussion of Results

Our analysis of the questions that aimed to measure European citizens’ attitudes towards specific examples of surveillance technologies and practices yielded the following main results:

Trust in the operating institution is the essential factor for the acceptability of a security practice. The important role of trust, in people, in institutions as well as in the whole societal environment, is regularly confirmed in surveys [4, 11, 14].

The SurPRISE project, for instance, clearly confirmed that “the more people trust scientific and political institutions ... the more acceptable a technology would be”. In their explanatory model, institutional trust is the strongest positive influence factor for the acceptability of surveillance-oriented security technologies [24, p. 135f.].

The PACT project, on the other hand, stresses the strong impact that distrust has on the likelihood that citizens reject a given security measure [21, p. v].

Table 1. Structural equation model data. Estimation Method = ADF; Number of obs = 12,196; Discrepancy = 0.1244

Finally, a recent Eurobarometer study on Europeans’ attitudes towards security also found that institutions’ respect for fundamental rights and freedoms strongly impacts the perception of security [26, p. 15f.].

Transparency or openness has a positive effect on the willingness of citizens to accept security practices. This can be understood on different levels:

  • Citizens tend to accept security practices when they are convinced that a security measure is necessary, proportionate and effective.

  • People are more easily convinced when a security practice is embedded in a context that citizens are familiar with and where they understand who is surveilling whom and how.

  • As a result, the surveillance activity should not be covert but perceivable by the citizen and communicated in a responsible way by the operator.

  • Understanding and acceptance are also a question of proper knowledge and education, though not only in one direction: while education contributes to understanding the technicalities and complexities of a security practice, it also drives critical reflection. The SurPRISE project also confirmed most of these observations [24, p. 154f.].

  • Current security practices, however, often do not seem to take this lesson seriously. In a Eurobarometer survey a majority of European citizens said they think that the security technologies and practices used in the fight against terrorism and crime have restricted their rights and freedoms, which in turn negatively impacts citizens’ trust [26, p. 45ff.].

All these factors also involve an inherent risk of manipulation, since a security practice can be designed to create false trust among citizens in order to gain acceptance [3].

On the downside, our empirical results also showed that many citizens do not care about surveillance that does not negatively affect them personally but only others. The SurPRISE project similarly concludes that “the more participants perceive SOSTs to be targeted at others rather than themselves, the more likely they are to find a SOST more acceptable” [24, p. 138].

5 Conclusions

For the design and introduction of security measures it is useful to consider some of the main socio-demographic determinants of the acceptance of these measures, since poorly designed measures can consume significant resources without achieving either security or privacy, while others can increase security at the expense of privacy. However, since there is no natural trade-off between privacy and security, carefully designed solutions can benefit both.

Law enforcement and government officials often weight security heavily. We have shown in our analysis of the vignettes, however, that citizens’ opinions on security measures vary and are influenced by some crucial factors. Apart from trust in the operating agency or company, we observed mainly four different types of reactions [6]:

  1. Citizens may consider a measure as useless to enhance security, and at the same time invasive for their privacy. Such a situation has to be absolutely avoided.

  2. Citizens may consider a measure useless to enhance security but with no risk for their privacy.

  3. Citizens may consider a measure as useful in terms of security, but privacy invasive.

  4. Finally, citizens may consider a measure both useful to increase security and with no risk for their privacy.

However, citizens’ perceptions do not (always) reflect the real effectiveness of a security measure or its real impact on privacy. Considering the importance of trust for acceptability and acceptance, the responsible parties should aim to reconcile perceived and real impacts. Potential for conflict is mainly found at the border between reaction types 2 and 3, when citizens fear an invasion of their privacy or perceive a technology as ineffective. Citizens’ reactions are mostly based on perceptions rather than on rational, fact-based assessments. As we have shown, these perceptions are influenced by a multitude of factors: trust in institutions is one, perceived self-interest another, and whether the measure is overt or covert a potential third. These three elements should be taken into account in the design of new security technologies and in specific security investments. For these cases PRISMS has developed a participatory and discursive technique that can help decision-makers in industry, public authorities and politics to implement security measures that raise fewer concerns in the population and are thus more acceptable, along the lines stated in many policy documents [2, 19].