
1 Introduction

The unprecedented deployment of information and communication technologies has made possible the development of a myriad of new services, but it has also given rise to a variety of threats to individual rights that must be taken very seriously:

  • Data protection rights: the extensive use of data collection and knowledge inference techniquesFootnote 1 undermines the effectiveness of existing data protection regulations.

  • Privacy: the facilities offered by the internet to publish and have access to information lead to more and more privacy infringements.

  • Non-discrimination: automatic data processing techniques can be applied to huge amounts of available information to build (individual and group) profiles which can be used to treat people differently, making it easier to commit large-scale, discreet discrimination.

Simultaneously, the evolution of technology has also increased the interactions between these three types of rights. For example, there is no doubt that misuses of personal data can adversely affect privacy and self-development (resulting in the unwanted disclosure of personal data to third parties, in identity theft, or in harassment through email or phone calls), or lead to a loss of choices or opportunities (e.g. enabling a recruiter to obtain information over the internet about the political opinions or religious beliefs of a candidate and to use this information against him). As a matter of fact, privacy breaches and discrimination based on data processing are probably the two most frequent and most serious consequences of misuses of personal data.

In this chapter, we focus on one of these interactions, the relation between personal data protection and anti-discrimination regulations, and argue that an extended application of the latter can help strengthen the former. We first review the increasing need for data protection before identifying the weaknesses of existing regulations and their consequences (Sect. 15.2). We then turn to anti-discrimination regulations and compare them with personal data protection, considering both the types of data concerned and their respective modus operandi (Sect. 15.3). From this comparison, we make proposals for a stronger synergy between data protection and anti-discrimination regulations (Sect. 15.4) and draw some conclusions (Sect. 15.5). As far as legal systems are concerned, we focus on European regulations in this chapter, with examples mostly drawn from French legislation and jurisprudence.

2 Data Protection: Increasing Needs, Decreasing Effectiveness

As Simon Nora and Alain Minc emphasised as early as 1978 in their report on the computerisation of society, “this growing overlap between computers and telecommunications, which we will call “telematics”, will not form an additional network, but another kind of network (…) It will transform our cultural model (…) it constitutes a common factor enabling and accelerating all other technologies. Especially insofar as it radically alters the processing and storage of information, it will change the nervous system of organisations and of society as a whole. [Telematics], unlike electricity, does not carry an inert current, but information, i.e. power” (Nora and Minc 1978). Associating information with power naturally raises a major issue: how to control this power and how to establish adequate counter-powers in order to keep a balance between the entities which can collect, produce and access information and the individuals who do not have the same abilities or can be the targets of such collection or processing of information.

Looking at it more closely, information actually confers two different, yet complementary, types of power: the power of knowledge and the power of action.Footnote 2 As a first approximation, the collection of information can be associated with the power of knowledge, while the use of information is more related to the power of action. Obviously, personal information is the first type of information which confers power over individuals. Personal data regulations therefore constitute a significant part of the necessary counter-powers. From the legal point of view, the European regulation on personal data protection is based on a priori procedures (typically notifications and authorisation requests): no matter whether any individual suffers any actual loss or harm, the failure to complete prior formalities, even without malicious intent, is sufficient to constitute a criminal offence.Footnote 3 We can thus argue that, to some extent, personal data protection was originally intended to control the power of knowledge. In contrast, privacy protection and anti-discrimination regulations both relate more directly to the control over the power of action: an offence is established only if an individual has actually suffered from a privacy breachFootnote 4 or if a detrimental decision has been made unlawfully on the grounds of a discriminatory criterion.Footnote 5 These differences of approach are well justified by historical and theoretical reasons and could lead to complementary means of protecting individual rights. We argue, however, that the a priori procedures which form the basis of data protection regulations are weakened by recent technological and legal evolutions (Sect. 15.2.1) and that this weakening has in turn an impact in terms of privacy and discrimination (Sect. 15.2.2).

2.1 A Priori Checks: A Too High and Too Low Barrier

Under European regulation, the completion of prior formalities by the data controller is one of the conditions for personal data processing to be lawful.Footnote 6 These formalities, however, do not necessarily lead to thorough checks by the national data protection authorities. For example, in France the notification procedureFootnote 7 does not require any verification by the French data protection authority (CNIL), which only has to record it and issue a receipt. In contrast, the authorisation procedure under Art. 25 of the French data protection lawFootnote 8 does require more extensive checks, as the CNIL has to provide a reasoned decision. In practice, the CNIL may prohibit the processing, authorise it, or issue an authorisation with reservations, which amounts to authorising the processing provided that specific modifications or additional guarantees are implemented.

However, the a priori control procedures have been weakened by the transposition of Directive 95/46/EC into French law, which led to the revision of 6 August 2004 of the data protection law of 6 January 1978. In fact, not only have notifications become the common, default procedure, but the appointment of a “personal data protection official”Footnote 9 releases organisations from any obligation of notification. This weakening of a priori controls has been offset by an increased emphasis on a posteriori checks, at least with respect to personal data processing in the private sector.Footnote 10 This evolution is justified by the unprecedented proliferation of data processing in the new information society and the practical impossibility of submitting all these processing operations to a priori checks.

This is already the case today with the internet, but the phenomenon will take on new proportions with “ubiquitous computing”Footnote 11 or “ambient intelligence” (RFID chips, sensors, the “internet of things”, etc.): information and communication technologies will make it increasingly easy to collect vast amounts of personal data automatically and without the knowledge of the individuals concerned. The impact of these technologies is further amplified by increasingly sophisticated knowledge inference and data mining techniques which make it possible to produce new personal data and accurate profiles, or to de-anonymise data, using ever larger volumes of available information. It should be pointed out that the origins of this evolution are not exclusively technical but also social, since many internet users deliberately contribute to populating this gigantic database.Footnote 12 Another consequence of the development of knowledge inference techniques is that the frontier between anonymous data and identifying data tends to blur and to evolve: data which can be considered anonymous at a given time in a given context can become identifying later on because new, seemingly unrelated data has been released, generated or forwarded to a third party, giving rise to the possibility of “re-identification” (see Ohm 2010; Narayanan and Shmatikov 2010). Several authors have already pointed out that, far from being a panacea, anonymisation should rather be viewed with extreme caution.Footnote 13 Actually, as stressed by Serge Gutwirth and Mireille Hildebrandt (Gutwirth and Hildebrandt 2010), the legal status of the profiles themselves is another striking illustration of the limitations of European data protection regulation: one could argue that group profiles built from anonymised data fall outside the scope of Directive 95/46/EC and are instead governed by intellectual property laws, thus offering protection to those building these profiles rather than to the individuals, even when these profiles may be used to run procedures or take decisions (unilaterally, possibly unfairly, and generally without any requirement to provide justifications) affecting them.

To summarise, we have argued in this section that a priori checks, even though they could in theory represent a very strong protection, are no longer effective enough: they are increasingly both too high a barrier (considering the huge volume of data flows in the digital society) and too low a barrier (because certain types of data which can have an impact on our individual lives, such as anonymous profiles, can still escape their scope). In the next section, we study the consequences of these limitations in terms of privacy and discrimination.

2.2 Impact in Terms of Privacy Breaches and Discrimination

As argued in the introduction, the increasing inadequacy of the a priori controls which form the basis of data protection regulations can lead to misuses of personal data with a strong impact in terms of privacy and discrimination. As an illustration, the teacher grading website “note2be.com” was prosecuted for two offences: failing to obtain prior consent for processing personal data and privacy breach. The French data protection authority and the judge took the view that consent was necessary, but they came to different conclusions with respect to the alleged privacy breach: the CNIL considered the disclosure of the workplace address a privacy breach, while the judge held the opposite view.Footnote 14 Another recent case illustrates the potential risks in terms of discrimination: in a public report, the non-profit organisation “SOS Racisme” claimed that ethno-racial data records were a tool for discrimination (Thomas 2009) and criticised Air France for processing ethnic data records on its cabin personnel to meet customers’ requests.Footnote 15 More generally, the development of profiling techniques, which are typically based on the analysis of personal data (even if the data may subsequently be “anonymised” in a more or less robust way), has the effect of increasing the differences of treatment between individuals, both in the private sector (e.g. services offered or prices set on the basis of profiles) and in the public sector (e.g. monitoring for security purposes). As a matter of fact, the first reason for building profiles is often to be able to provide personalised services, which in many situations can be perfectly legitimateFootnote 16 but can also imperceptibly turn into various forms of discrimination. This widespread use of profiling and the associated risks, especially as regards discrimination, have already been studied and denounced by a number of lawyers (Gutwirth and Hildebrandt 2008; Gutwirth and Hildebrandt 2010; Hildebrandt 2009; Zarsky 2002) and sociologists (Lyon 2003).

3 Non-discrimination: Protecting Individuals

Even though they have different origins and objectives and are governed by different bodies of rules, the rights to personal data protection and non-discrimination interact in various ways, and it is helpful to review their differences and similarities before drawing some lessons and providing suggestions to improve their effectiveness. Considering the scope of this study, namely the new risks posed by data processing technologies, we distinguish two complementary aspects of data protection and non-discrimination rights: the types of data which are covered by the protections (Sect. 15.3.1) and the types of controls which are provided over these data (Sect. 15.3.2).

3.1 Similar Types of Data

Let us first consider the types of data covered by non-discrimination regulations. Article 225–1 of the French Penal Code prohibits the use of certain types of data in specific contexts. These categories of data and contexts are exhaustively enumerated in the law:

  • Art. 225–1 (excerpt): “Discrimination comprises any distinction applied between natural persons by reason of their origin, sex, family situation, state of health, handicap, sexual morals, political opinions, union activities, or their membership or non-membership, true or supposed, of a given ethnic group, nation, race or religion”.

  • Art. 225–2: “Discrimination defined by article 225–1, committed against a natural or legal person, is punished by two years’ imprisonment and a fine of € 30,000 when it consists:

    1. of the refusal to supply goods or services;

    2. of obstructing the normal exercise of any given economic activity;

    3. of the refusal to hire, to sanction or to dismiss a person;

    4. of subjecting the supply of goods or services to a condition based on one of the factors referred to under article 225–1;

    5. of subjecting an offer of employment to a condition based on one of the factors referred to under article 225–1”.

As for the European Directive 2000/43/EC “implementing the principle of equal treatment between persons irrespective of racial or ethnic origin”, its scope includes “conditions for access to employment”, “social protection”, “social advantages”, “education” and “access to and supply of goods and services which are available to the public, including housing”. As far as the European Convention on Human Rights is concerned, its Art. 14 states that “The enjoyment of the rights and freedoms set forth in this Convention shall be secured without discrimination on any ground such as sex, race, colour, language, religion, political or other opinion, national or social origin, association with a national minority, property, birth or other status.” Its scope is thus larger than that of the Directive, the expression “other status” leaving the door open to a non-exhaustive list of potential grounds.

Interestingly, Art. 8 § 1 of European Directive 95/46/EC sets out a list of sensitive data (“special categories of data”) which, by default, may not be collected or processed. These sensitive data include any data “revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, and the processing of data concerning health or sex life”. Beyond these types of data, which are clearly common to both areas of law, information about pregnancy or disability, which is considered a discriminatory ground, is also related to health and can therefore be considered as sensitive data in the sense of the data protection Directive. The same can be said about sexual preferences, which are considered both as sensitive data and as a discriminatory ground.

As far as differences are concerned, one may notice that gender and age are considered as discriminatory factors but not as sensitive data in the sense of the data protection Directive. On the other hand, the data protection Directive states that “Processing of data relating to offences, criminal convictions or security measures may be carried out only under the control of official authority, or if suitable specific safeguards are provided under national law” and “Member States may provide that data relating to administrative sanctions or judgments in civil cases shall also be processed under the control of official authority.” Offences and criminal convictions are thus considered as sensitive data in the sense of the data protection Directive but not as discriminatory factors. This observation naturally leads to the following question: are these residual differences really justified by the respective goals of these two types of regulations, or should they rather be seen as the fortuitous result of different histories? First, the fact that certain types of data can be seen as potential sources of discrimination without necessarily being considered as sensitive in the sense of data protection regulations seems quite reasonable: for example, it may be considered unfair to treat people differently based on information such as gender, but this information, even if undoubtedly personal (attached to a person), can hardly be considered as sensitive. The other type of difference (sensitive data not considered in anti-discrimination regulations) may be more questionable though: for example, wouldn’t it be unfair to use information about an offence concerning a person who has already been sanctioned in court and who would thus suffer a “double punishment” (unless, of course, this information has an objective value in the context where it is used, for example the fact that a person has been convicted for armed robbery in the context of the recruitment of a bank security agent)? Indeed, the specific status granted to sensitive data in European data protection regulation is justified by the risks that could result from the processing of such data, and this should also lead to banning the use of such data as discriminatory (i.e. selection) criteria.

More generally, one can argue that the close proximity between the types of data covered by data protection and non-discrimination rights stems from their common objective of ensuring fairness and re-establishing some kind of balance, or at the very least reducing the imbalance of powers, between individuals and those who may threaten their rights. Paradoxically, this shared reference to sensitive data also introduces difficulties in the creation of data files for the assessment of discriminatory practices: although Directive 95/46/EC provides exceptions to the ban on collecting sensitive data, the legality of such processing and the conditions under which it is permitted are still under discussion in some European countries (Ringelheim 2010).

3.2 A Priori Versus A Posteriori Controls

As stated in Sect. 15.2.1, the first measures for data protection in Europe are the formalities to be completed by data controllers before embarking on the collection or processing of personal data. Anti-discrimination laws operate rather differently. These laws prohibit the use of certain discriminatory criteria for specific purposes, but it would be difficult, if not impossible, to require that all actions falling under these purposes (e.g. service provision or hiring) go through an administrative procedure to confirm that they are not prohibited by law. Indeed, one can hardly conceive of a system, other than in the context of a totalitarian regime, in which all actions which could potentially fall under anti-discrimination laws would have to be declared beforehand in order to confirm that they are lawful. For this basic reason, anti-discrimination regulations put more emphasis on compensation for damages than on a priori verifications. This practical reason is reinforced by the civil law origin of anti-discrimination regulations in countries like France (even though they have since found their way into criminal law as well).

In conclusion, one should notice that the differences identified here between data protection laws and anti-discrimination laws are diminishing over time: as suggested in Sect. 15.2.1, the implementation of data protection laws is evolving towards a stronger emphasis on a posteriori checks, this shift of emphasis being justified by the growing difficulty of controlling data collection, which makes it necessary to be extremely vigilant about the use made of the data.

4 Towards a Synergy Between Data Protection and Anti-discrimination Regulations

In order to address the issues raised by technological developments and the new threats to individual rights that they make possible, it can be helpful to distinguish two very different types of data collection:

  1. The collection of data as part of formal procedures with clearly identified parties or in the course of clearly identified events, recognised as such by the subjects (e.g. when submitting a file, filling in a questionnaire, using a smart card or providing one’s fingerprint to get access to a building).

  2. The apparently insignificant and almost continuous collection of data that will become more and more common in the digital society (digital audit trails, audio and video recordings, etc.). This collection may be more or less perceived or suspected by the subject (which does not mean that he is necessarily aware of the potential risks concerning the subsequent use of the data or its disclosure to third parties), or remain completely invisible and unsuspected, the frontier between the two situations depending on the quality of the information provided by the controller and the level of awareness of the subject. Another worrying phenomenon (which could in fact be considered as a processing as well as a collection) is the automatic generation of new knowledge using data mining and knowledge inference techniques. In this kind of situation, the subject may be unaware not only of the process but also of the generated knowledge itself, even if this knowledge concerns him (e.g. his preferences, the probability that he would accept a given offer or the risk that he could develop a given disease) and could be used to take actions affecting him (e.g. not offering him a job or an insurance contract, or adjusting the price of a service up to the level he would be prepared to pay).

The regulations on personal data protection were originally designed to address the first type of situation. Efforts are being made to adapt them to the complex issues raised by the second type of data collection, but they tend to be increasingly ineffective in these situations. The main cause of this ineffectiveness is their underlying philosophy of a priori and procedural controls. The digital world is based on the circulation and processing of data, and data collection is no longer a one-off event but a commonplace process, which will even become a permanent phenomenon with the advent of ubiquitous computing. Furthermore, the boundaries between personal and non-personal data are increasingly blurred,Footnote 17 as are the frontiers between the private and public domains and the differences between data collection and data processing.Footnote 18 In view of these developments, a priori checks and procedures are too rigid or simply impossible to implement. As a result, requirements which may once have represented genuine protections are becoming purely formal obligations, leaving individuals more and more helpless to protect their personal data in the digital world. To take just one example, on the internet the requirement for prior consent generally turns into the mindless acceptance by users eager to gain access to a website or a service, who hardly take the time to read the question, not to mention the privacy policy of the site.

In order to better address the second type of situation mentioned above, we believe that two types of evolutions are necessary:

  1. The first one is to put greater emphasis on the protection of the subjects against data misuse, which would involve more stringent a posteriori checks and the integration within the scope of the law of all types of discriminatory processing, i.e. all processing resulting in significant differences of treatment between individuals whenever such differences are not justified by objective grounds that are not solely based on the interests of the data collector (e.g. cost effectiveness or revenue maximisationFootnote 19).

  2. The second one is to assess the data processing by the yardstick of its potential harm to individuals, which suggests relying more on civil law than on criminal law and applying a thorough “risks versus benefits” analysis to evaluate the legitimacy of the data processing.

As regards potential harm to individuals, one may observe that most infringements of personal data protection regulations result either in privacy breachesFootnote 20 (excessive disclosure of information, intrusive actions such as phone calls or emails, etc.) or in various forms of discrimination (in the common sense of the term, even if such discrimination is not necessarily recognised as such in the legal sense or sanctioned by existing anti-discrimination regulations), such as the loss of chances to get access to certain offers, benefits or services (a job, insurance, etc.) or to get such access under reasonable conditions.Footnote 21 This observation, combined with the convergence sketched in the previous sections, leads us to call for the establishment of stronger connections between personal data protection regulations and these two other categories of rights, in particular the right to be protected against unfair discrimination. Anti-discrimination laws also present significant advantages for coping with the continuous information flows which characterise the new digital society:

  • They are more flexible, as they are not based on a priori procedures and administrative formalities.

  • Being rooted in civil law, they put the emphasis on compensation for damages.

In addition, in certain countries like France, anti-discrimination laws explicitly provide for collective legal proceedingsFootnote 22 (akin to American “class actions”) which may, to a certain extent, help restore the balance of powers between the organisations in a position to collect and process personal data or to apply differential treatment and the individuals who may suffer from it.

It must be recalled, however, that, under substantive law, the protection against discrimination is restricted to very specific types of information (sex, handicap, race, religion, etc.) and purposes (recruitment, supply of services, etc.) which are comprehensively enumerated in the law. The precision of this definition contributes to the effectiveness of the law because it makes it possible to apply clear criteria, but it is also a strong limitation, especially in the perspective suggested here of applying anti-discrimination laws to all kinds of unfair differences of treatment based on the processing of personal data. Indeed, this generalisation would make it necessary to lift the current restrictions on the legal definition of discrimination. But such an expansion of the scope of anti-discrimination regulations should of course be handled with great care to maintain the effectiveness of the protection.

Another possible instrument for establishing stronger connections between personal data protection and anti-discrimination regulations is Art. 15 of European Directive 95/46/EC, which applies to decisions producing legal effects on individuals or significantly affecting them.Footnote 23 One might think that this article could be applied to cases of discrimination (such as the refusal to supply a service or its supply on very disadvantageous terms) based on personal data processing. To make this provision genuinely effective, however, it would be useful to clarify and broaden its scope, in particular to ensure that processing involving insignificant or purely formal human interventions does not systematically fall outside its scope (Bygrave 2001). Another prerequisite for its application is that the individuals concerned are actually in a position to exercise their rights, the first condition being that they are aware or informed of the fact that a decision is based on automatic data processing, for instance the use of a profile (Hildebrandt 2009). As already argued by Lee Bygrave (Bygrave 2001), even if the principles underlying this article are sound and highly commendable, much still has to be done to make it truly effective. To make things worse, the scope of this article may be further reduced in the transposition of the Directive by member states. For example, in France, it is limited to decisions with legal consequences: Art. 10 of the law 78–17 states that “No legal ruling involving the appraisal of an individual’s conduct can be taken on the grounds of any automated processing of personal data intended to assess certain aspects of his or her character. No other decision producing legal effects on an individual can be taken solely on the grounds of any automated processing of data intended to establish the profile of that individual or assess certain aspects of his or her character”, without reference to decisions which “significantly affect him”, as stated in the Directive. One option for implementing the approach suggested here would thus be to take the opportunity of the future revision of European Directive 95/46/EC to reinforce and revivify Art. 15 and to clarify its scope so that it can be applied to all cases of discrimination based on personal data processing.

Two complementary measures are needed to make this approach truly realistic. The first one is to strengthen the means of the national data protection authorities to implement a posteriori controls which are sufficient to dissuade data controllers from misusing data. These means concern first funding and manpower, but they should also include enhanced legal support with respect to the accountability of data controllersFootnote 24 and technical solutions enabling more secure and effective verifications (Le Métayer et al. 2009; Le Métayer and Monteleone 2009).

The second complementary measure concerns the possibilities for individuals to obtain real compensation in the event of unlawful use of personal data. Again, this possibility is a prerequisite to ensure that a posteriori controls can effectively have a deterrent effect on data controllers.Footnote 25 One desirable development in this respect would be to make it possible for victims of misuse of personal data to resort to collective legal proceedings (“class actions”), as they can already do for specific kinds of discrimination in certain European countries.

From an institutional viewpoint, we can notice that the former French high authority against discrimination and for equality (HALDE)Footnote 26 and the French data protection authority (CNIL) signed a partnership agreement in March 2006 on the grounds that the “legal competencies of both authorities may prove complementary in many cases, as discriminatory practices are in fact likely to be based on processing personal data, whether or not computerised”.Footnote 27 This agreement provides for the exchange of information, including the disclosure by one authority of information required for the other to take action.Footnote 28 It also includes provisions for the organisation of joint inspections, staff training and communications.

5 Conclusion

To sum up, starting from the observation that it is increasingly difficult to effectively control a priori all data collections or the production of new knowledge on individuals, we have argued that a possible option is to strengthen a posteriori controls on the use of personal data and to ensure that the victims of data misuses can obtain compensation which is significant enough to act as a deterrent for data controllers. We have also argued that the consequences of such misuses of personal data often take the form of unfair discrimination, and that this trend is likely to increase with the generalisation of the use of profiles. For this reason, we advocate the establishment of stronger connections between anti-discrimination and data protection laws, in particular to ensure that any data processing resulting in unfair differences of treatment between individuals is prohibited and subject to effective compensation and sanctions.

Needless to say, the evolution suggested here is by no means a final statement or answer to the very complex issues addressed in this chapter. As discussed in Sect. 15.4 and by Schreurs et al. (2008), the scope and conditions of application of current anti-discrimination laws are too narrow for them to provide sufficient protection in the context of automated processing. One of the most challenging tasks for the future will be the extension of the definition of discrimination criteria to ensure that the scope of non-discrimination regulations covers all (or most) situations where computers could be used to introduce unfair differences of treatment between people. But where to draw the red line between acceptable business practices and unfair discrimination is a very delicate (and political) matter.

It should also be clear that the evolution advocated here is meant to provide complementary means of protecting individual rights and should not lead to a weakening of existing protections, including a priori controls when such controls are possible. The shift of emphasis from a priori to a posteriori checks should apply essentially to situations in the second category mentioned above (the unobtrusive and almost continuous collection of apparently insignificant data) and must not undermine notification obligations, authorisation requests or the requirement for consent for the first category (the collection of data as part of formal procedures with clearly identified parties or in the course of clearly identified events recognised as such by the individuals concerned), where they remain appropriate. It is also clear that one of the primary purposes and the very raison d’être of personal data regulations is to protect a model of democratic society (Rouvroy and Poullet 2009), and this objective must in no way be jeopardised by the evolutions suggested here. In particular, it is necessary to maintain the principle of an absolute barrier, a personal data sanctuary, ensuring that in certain situations or for certain types of data, because the common interest is at stake, the subject’s consent is not a sufficient condition to make personal data processing legitimate and that prior authorisation from the data protection authority is expressly required.