
1 Introduction

Online social networking sites (SNS) provide individuals with the opportunity to share information on a previously unimaginable scale, by creating profiles, disclosing facts, emotions or pictures, and by interacting in a highly sophisticated manner. During the past 5 years, the popularity of these SNS has expanded spectacularly, attracting an extraordinary number of users (e.g. almost 1.25 billion users for Facebook).Footnote 1 Among children and young people, social networking has become one of the preferred online activities.Footnote 2 The EU Kids Online study found that 77 % of 13–16 year olds and 38 % of 9–12 year olds have a social networking profile,Footnote 3 even though most SNS set the minimum age for creating a profile at 13.

It has been argued that the blurring between ‘public’ and ‘private’ in SNS, the invisibility of audiences and the fact that information in such networks is persistent, replicable, searchable and visible on a large scaleFootnote 4 entail that risks in an SNS environment are significantly more complex than equivalent offline risks. Aside from providing greater access to certain (illegal or harmful) categories of content (e.g. hate speech) and facilitating certain behaviour such as sexting or grooming, an added complexity lies in the fact that the increased interaction between minors may lead to more reciprocal harassment, blurring the lines between victims and offenders to a greater extent than in the offline world.Footnote 5, Footnote 6

This changing role of minors within social networks raises questions with regard to the applicability of the current legislative framework and liability for certain acts. This chapter will assess this applicability, examining to what extent minors may be held liable under existing criminal or civil law for certain risk behaviour (e.g. bullying, posting harmful or illegal comments or pictures) and whether third parties, such as parents, may be held liable for the behaviour of minors on SNS. This will be illustrated by means of provisions from Belgian (civil and criminal) law. Given the specific nature of SNS, the use, and especially the implementation and enforcement, of traditional types of legislation are neither obvious nor desirable.Footnote 7 It is the aim of this chapter to explore a number of regulatory and other strategies that may be adopted instead to deal with risk behaviour of young SNS users without depriving them of the undeniable benefits and opportunities that SNS provide. This includes a discussion of the potential of self- and co-regulation and of non-regulatory mechanisms, such as improving media literacy skills or providing efficient reporting tools, for the development of (regulatory) strategies that reduce peer-to-peer risks in user-centric environments.

2 Children and SNS Risks: Victims, Participants, Offenders, …?

As in most environments in which children and young people are present (e.g. the street, school, playground, in front of the television, etc.), risks may occur in social networks. Recent research has found that children who use SNS encounter more risks online than those who do not,Footnote 8 but this substantially depends on how they use these services. Moreover, the exposure to risks does not automatically lead to harm. As Staksrud et al. put it: “Risk may, therefore, be safely encountered by many, and only in a proportion of cases (depending on the action of both protective and risk factors) does it result in harm.”Footnote 9

Examples of risks that may occur in SNS are bullying, sexting, posting hurtful comments, sharing of or being tagged in pictures without permission and targeted advertising.Footnote 10 Whereas children used to be regarded predominantly as victims of certain risks, in need of protection, it has become very visible in the SNS environment that they may increasingly adopt different roles depending on the activities they perform. According to the ‘three C’s’ classification of online risks developed by the EU Kids Online study, children may be recipients (Content), participants (Contact) or actors (Conduct) with regard to various risks.Footnote 11

Although (inappropriate or harmful) content and contact (with adult strangers, so-called ‘stranger danger’Footnote 12) have traditionally been the major causes of concern for policy makers and parents, empirical research has shown that the risks that upset minors the most often occur between peers.Footnote 13 Sexting, i.e. the communication of “sexually explicit content […] via text messages, smart phones or visual and Web 2.0. activities such as social networking sites”,Footnote 14 and cyberbullying are prime examples of this type of risk. Scholars have found, for instance, that, similarly to bullying that occurs offline, around 60 % of children that bully online have been the target of bullies themselves.Footnote 15, Footnote 16 Children thus engage in both functions (victim—bully),Footnote 17 and these multiple roles are considered to be “fluid over time and across different contexts”.Footnote 18 Other situations where peers may be considered ‘actors’ are when pictures or video clips are posted, shared or tagged without the consent of the child that is portrayed. This may, for instance, be the case with regard to types of secondary sexting,Footnote 19 where sexually suggestive pictures that someone sent voluntarily are forwarded by the receiver of the picture, or incidents of ‘happy slapping’, where bullying scenes or assaults in the offline world are filmed with a digital camera or, often, a phone camera and then shared online. Especially in SNS, which are increasingly accessed through smart phones,Footnote 20 such actions may have significant consequences that are often not anticipated by the ‘actor’. Once content is uploaded on an SNS it can be shared very quickly with a very large audience,Footnote 21 and it is often very difficult to remove it completely. Moreover, content may already have been copied or forwarded before it is erased by an SNS provider, something of which children and young people are often not aware.Footnote 22

3 Legal Implications

Translated to the legal context, the various roles a child may adopt in the SNS environment, e.g. active creator, perpetrator or ‘data controller’, may, in theory, entail different legal consequences and the applicability of specific legislation. Of course, it is necessary to take into account that, because of their age or potential lack of legal capacity,Footnote 23 children may not be considered liable for their actions. This, in turn, raises questions with regard to the liability of parents or other caretakers.

3.1 Applying Existing Legislative Provisions to Unwanted Behaviour

The fact that SNS constitute a vast, global communication platform with millions of users in countries across the world does not mean that this environment finds itself in a legal vacuum. If an offence is committed on an SNS, the existing legal framework may be applied if certain conditions laid down in the law in question are fulfilled. Depending on the national circumstances, legislation may be formulated in a technology-neutral manner, may be interpreted in an evolutionary manner or may even be drafted especially with the new media environment in mind (e.g. provisions with regard to electronic stalking or harassment).Footnote 24

In Belgium, for instance, a number of articles of the Criminal Code may be relevant with regard to sexting. Article 383 criminalises the display, sale or distribution of writings or images that are indecent. If this is done in the presence of minors below the age of 16, more severe sentences are imposed according to Article 386. In addition, Article 384 stipulates that the production of indecent writings or images is also a criminal offence. Child pornography is addressed in Article 383bis of the Criminal Code. This article criminalises the display, sale, rental, distribution, transmission, delivery, possession of, or (knowing) obtainment of access to, images that depict poses or sexual acts of a pornographic character which involve or depict minors. These articles are formulated in a technology-neutral and broad manner, so in theory they could be applied to cases of sexting in SNS. In addition to criminal provisions, legislation with regard to the processing of personal data or portrait rights may be violated in cases where images are shared or distributed without the consent of the person portrayed. Whether these existing legislative provisions are applicable in a particular situation will be judged on a case-by-case basis.

However, the application of existing legislation may have unintended consequences. In the case of primary sexting, for instance, it seems disproportionate to apply legislative provisions that aim to address child pornography and to punish adults who intend to sexually abuse children to situations where minors send or post sexually suggestive pictures to each other.Footnote 25 Not only may such behaviour possibly “be part of the developmentally necessary exploration and experimentation that enables the emergence of sexual identity”,Footnote 26 but even when it is considered imprudent or unwise, taking into account the spiralling loss of control over a picture once it is uploaded on an SNS, criminally prosecuting and punishing minors may be counterproductive and over-reaching.Footnote 27

In addition, practical obstacles may arise when an attempt is made to enforce national legislative provisions. SNS providers are often located in a different jurisdiction from the victim and declare in their Terms and Conditions that disputes need to be brought before the courts of their country of establishment.Footnote 28 Moreover, in the SNS environment perpetrators may act anonymously (for instance by means of a fake profile), making it very difficult to find and punish them. Other complicating factors may be that victims have not succeeded in obtaining evidence of certain acts (because the offending content has been deleted by the perpetrator or by the victims themselves), that law enforcement and magistrates are not sufficiently aware of the characteristics of the SNS environment, or that children simply do not report or file a complaint when they have been the victim of harmful acts.

3.2 Responsibility and Liability of Minors, Parents and Teachers

  1. Criminal liability

Even if certain acts may fall within the scope of application of existing criminal provisions, this will not automatically mean that children can be held responsible or liable. This will depend on the age of criminal responsibility, i.e. “the age at which children are deemed to have the capacity to be legally responsible for breaches of the criminal law”,Footnote 29 that is adopted in each national jurisdiction, as there is no commonly accepted age of criminal responsibility in international or European legislative or policy documents.Footnote 30 In order to determine this age it should be assessed “whether a child, by virtue of her or his individual discernment and understanding, can be held responsible for essentially anti-social behaviour”.Footnote 31

In Belgium, for instance, the Youth Protection Act of 1965 states that minors cannot be put on a par with adults with regard to the degree of liability and the consequences of their actions (Preamble, para 4). However, if a minor commits an ‘act described as a crime’, he or she should be made aware of the consequences of that act. The Youth Protection Act therefore imposes other measures, including supervision, education, disciplinary measures, guidance, advice or support, instead of the punishments of the Criminal Code.Footnote 32 Measures can be imposed on parents or on the minors themselves. The age of the minor in question is taken into account; different measures will be imposed before and after the age of 12 years (Article 37). If possible, the judge may give preference to victim-offender mediation (Article 37bis).

  2. Civil liability

In addition to potential criminal liability, depending on the system of law of a particular country, minors may be held civilly liable for ‘wrongful acts’ or acts that have caused damage. In order to assess whether this is the case in a specific situation, a child’s age and maturity will be taken into account to determine whether he or she had the ability to discern the scope of his or her actions. In Belgium, for instance, judges have held that this may be as early as the age of seven.Footnote 33 On the basis of Articles 1382 and 1383 of the Belgian Civil Code, for the offender to be held liable the victim must prove the offence and the causal link with the damage that this offence has caused. This entails that the offender has not acted as a normal, reasonable and careful person, that he or she acted freely and consciously, and that he or she must have been able to foresee that his or her behaviour would cause damage to the victim.Footnote 34 Judges will need to evaluate this element of foreseeability, taking into account the specific and concrete circumstances of each case. One may wonder, for instance in the case of sexting, whether minors can reasonably foresee the consequences of their actions. It is conceivable that it is hard for minors (or even adults) to grasp what it means to forward or post an intimate picture of someone else, as the loss of control over content that is made public in the digital sphere is so vast and irreversible.

  3. Liability of parents and teachers

Moreover, in certain circumstances parents and teachers may be held liable for the acts of their children or pupils. In Belgium, a presumption of liability for both parents and teachers has been included in Article 1384 of the Civil Code. This means that, in order not to be held liable, the parents or teachers in question must prove that they did not commit a fault in raising or supervising the child.Footnote 35 Walrave et al. have argued that supervising a child’s online activities is very difficult and advocate evolving towards a no-fault liability system that would require obligatory insurance.Footnote 36

3.3 Reflections

First of all, it is important to emphasise that many cases that may be perceived as involving a peer-to-peer risk will not fall within the scope of the legislative framework because they lack gravity (even if the victim in question may experience harm). Second, whereas in theory it is possible to apply existing legislation to peer-to-peer risks in SNS, in practice enforcement may run into various obstacles, rendering the law ineffective or producing undesirable side-effects. In addition, it may be argued that with regard to such risks the application of criminal law provisions and the use of court procedures should be considered an ultimum remedium,Footnote 37 limited to very serious cases where malignant intent is undeniable and the (moral) damage significant. Other types of intervention, both ex ante and ex post, will in most cases be more appropriate and effective.

4 The Use of Self- and Co-regulation

Policies aimed at a safer Internet for children have over the past 15 years put significant emphasis on alternative regulatory instruments such as self- and co-regulation.Footnote 38, Footnote 39 This was again confirmed in the Commission Communication on a European strategy for a better Internet for children of May 2012, which stated that “[l]egislation will not be discarded, but preference will be given to self-regulation, which remains the most flexible framework for achieving tangible results in this area”.Footnote 40 Furthermore, the Commission underlined that “[o]ngoing effective industry self-regulation for the protection and empowerment of young people, with the appropriate benchmarks and independent monitoring systems in place, is needed to build trust in a sustainable and accountable governance model that could bring more flexible, timely and market-appropriate solutions than any regulatory initiatives”.Footnote 41

Examples of ‘ongoing’ industry self-regulation are the Safer Social Networking Principles for Europe,Footnote 42 the CEO CoalitionFootnote 43 and the ICT Coalition,Footnote 44 three ‘coalitions’ that consist of different constellations of companies (some companies, such as Facebook, are members of all three). They put forward largely similar principles, albeit with different emphases, to make the Internet in general, or SNS in particular, safer for children, such as the promotion of privacy-friendly default settings, age-appropriate content, reporting mechanisms, content classification and parental controls.

The reference in the Commission Communication to the need for independent monitoring systems with regard to self-regulatory initiatives is crucial. To date, however, the results of the coalitions’ work leave significant room for improvement. Independent assessments of the implementation of the Safer Social Networking Principles, for instance, have shown that with regard to reporting mechanisms, in 2010 only 9 out of 22 sites responded to complaints submitted by minors asking for help,Footnote 45 and in 2011 only 17 out of 23 services responded to complaints or reports, sometimes taking up to 10 days to do so.Footnote 46 The results of the CEO Coalition were assessed in July 2012 and in 2013. The conclusion of these assessments was that progress can be observed, but that tangible results remain limited.Footnote 47

The results of these evaluations raise the question of the effectiveness of this type of regulatory initiative: although the commitment of the SNS providers to take steps to make their services safer is to be applauded, the concrete implementation of such safety measures is of course crucial in order to achieve actual protection. It is our view that the European Commission should play a role in observing and guiding the various existing initiatives in order to avoid fragmentation, discrepancies or contradictions, and should consider moving towards a stronger co-regulatoryFootnote 48 framework if independent evaluations keep demonstrating that self-regulation does not reach the policy objectives in this area.

In addition to monitoring and evaluating self- (and/or co-)regulatory systems in this domain, it is very important, from a human rights perspective, to be aware that if such systems have an impact on fundamental rights, such as freedom of expression and the right to privacy,Footnote 49 certain safeguards or procedural guarantees, laid down for instance in the European Convention on Human Rights (ECHR), need to be respected.Footnote 50 It has been emphasised by the Council of Europe Committee of Ministers in its Recommendation on the protection of human rights with regard to social networking services that it is important that “procedural safeguards are respected by these mechanisms, in line with the right to be heard and to review or appeal against decisions, including in appropriate cases the right to a fair trial, within a reasonable time, and starting with the presumption of innocence”.Footnote 51

5 Empowerment Strategies

Influenced by increasingly available high-quality empirical social science research, such as the EU Kids Online project, the debate on the ‘regulation’ of the digital and social media environment to protect minors has, over the past 5 years, shifted its focus away from legislation and regulation towards empowerment and the improvement of digital skills and media literacy. As Livingstone et al. have emphasised: “[w]hile recognising that measures to reduce specific risks have their place, it is also important to develop strategies to build children’s resilience and to provide resources which help children to cope with or recover from the effects of harm”.Footnote 52 Moreover, “the more that children are equipped to work out solutions for themselves – through skills, greater resilience or access to online resources to support them – the less others will need to step in to guide or restrict their online activities”.Footnote 53

Policy documents in this area at different levels highlight the importance of empowering children and young people. The Council of Europe already issued a Recommendation on empowering children in the new information and communications environment in 2006.Footnote 54 More recently, the OECD Council Recommendation on the protection of children online, for instance, stated that “policies to protect children online should empower children and parents to evaluate and minimise risks and engage online in a secure, safe and responsible manner”.Footnote 55 Furthermore, in its Strategy for a better Internet for children the European Commission very clearly emphasises that “[r]egulation remains an option, but, where appropriate, it should preferably be avoided, in favour of more adaptable self-regulatory tools, and of education and empowerment”.Footnote 56

A number of empowerment strategies could help reduce peer-to-peer risks in SNS.

  1. Improving media literacy skills

Media literacyFootnote 57 and skills are of the utmost importance to children’s use of the Internet.Footnote 58 In the context of SNS, media literacy has been argued to be especially important “in order to make the users aware of their rights when using these tools, and also help them acquire or reinforce human rights values and develop the behaviour necessary to respect other people’s rights and freedoms”.Footnote 59 With regard to peer-to-peer risks such as bullying or sexting, this last element is of particular importance. It relates to a basic principle that children are taught in the offline world as well: ‘do not do to others what you would not want others to do to you’. This should also be a golden rule with regard to SNS, but for children and young people it is much more difficult to estimate the consequences and potentially grave impact of their actions in this environment. Hence, raising children’s awareness from a very early age of the particular characteristics of SNS and the potential long-term impact of a seemingly trivial act is crucial. Furthermore, children are often completely unaware of a number of basic legal principles, such as portrait rights or the right to privacy. It is therefore crucial that they have a clear understanding of the fact that certain acts in SNS may have legal implications, and this should be conveyed to them in an age-appropriate, clear and understandable manner.

  2. Providing information

The idiom ‘knowledge is power’ is often used in relation to the information society. It is undeniable that if we want to empower children and young people to act appropriately in SNS, providing them with information is essential. Parents and teachers are not the only ones who can play a role in this: SNS providers should provide understandable and accessible information about the types of behaviour that are not tolerated in their networks or that may infringe legal provisions.Footnote 60 Such information is now often included in the Terms and Conditions section of their networks. However, these Terms and Conditions remain notoriously unread and poorly understood.

“Rais[ing] awareness of safety education messages and acceptable use policies to users, parents, teachers and carers in a prominent, clear and age-appropriate manner” is the first principle of the Safer Social Networking Principles (supra). An independent assessment of the implementation of this principle found that whereas safety information is often available on SNS (although easy to find in only half of the cases), the Terms of Use, Community guidelines, Statement of Rights and Responsibilities and/or House rules are “either difficult to access and/or difficult to understand, especially for younger audiences”.Footnote 61 There are, however, SNS that provide child-friendly versions of their Terms of Use, sometimes even presented in audio-visual format. Given the importance of this information, not only with regard to peer-to-peer risks but for instance also with regard to the protection of personal data and privacy, SNS providers should be encouraged to adopt innovative strategies to make children read and, above all, understand their Terms of Use. Empirical research on such strategies is urgently needed.

  3. Providing efficient reporting mechanisms

Whereas the provision of information usually takes place before certain acts are carried out (ex ante), reporting mechanisms allow users to complain about certain content or conduct after the fact (ex post). With regard to social media, the use of reporting mechanisms is increasingly promoted. The Council of Europe, for example, has emphasised that in order to protect children and young people against harmful content and behaviour, “while not being required to control, supervise and/or rate all content uploaded by its users,Footnote 62 social networking service providers may be required to adopt certain precautionary measures (for example, comparable to ‘adult content’ rules applicable in certain member States) or take diligent action in response to complaints (ex-post moderation)”.Footnote 63 To do this, the setting up of easily accessible reporting mechanisms is actively encouraged.Footnote 64 The CEO Coalition (supra) also put forward the development of simple and robust reporting tools for users as one of its action points, and the provision of such tools is one of the core Safer Social Networking Principles as well. In addition, the European Commission has advocated the establishment and deployment of reporting tools for users, and added that for children in particular, these mechanisms should be “visible, easy to find, recognisable, accessible to all and available at any stage of the online experience where a child may need it”.Footnote 65

With regard to peer-to-peer risks such as sexting or cyberbullying, the importance of mechanisms to report behaviour or acts that are experienced as harmful or hurtful cannot be overestimated. At the moment, research shows that the use of these tools by children is still rather low. The EU Kids Online study found, for instance, that only 9 % of those upset by bullying messages have used the available reporting tools, leaving significant scope for awareness-raising concerning their availability and use.Footnote 66 SNS providers, however, seem to be increasingly committed to providing users with reporting possibilities.Footnote 67, Footnote 68 Of course, in addition to making such tools available, it is also essential that if SNS providers receive complaints about problematic peer-to-peer behaviour they act upon them promptly,Footnote 69 provide support for the victims, warn the offenders that this type of behaviour is not tolerated and apply sanctions if necessary. Such sanctions (such as removing content, or suspending or deleting accounts) are often set out in the Terms and Conditions, another reason why it is very important that steps be taken to ensure that users are aware of these terms and understand them.

Moreover, the action that is taken by an SNS provider should be carefully considered. First, with regard to the removal of content, there should be transparent procedures that include the possibility to appeal certain decisions, in order to keep private censorship at bay. Second, with regard to serious cases, where actual harm seems to occur,Footnote 70 SNS providers should cooperate with other actors such as law enforcement agencies (LEA). Whereas many SNS providers already work together with LEA, it is important that the criteria used to assess content and to decide whether to escalate reports to LEA are made very clear.Footnote 71 Assessing to what extent certain behaviour has actual legal implications should not and cannot be left to private actors.

  4. Peer-to-peer strategies

In order to address peer-to-peer risks, advantage could also be taken of peer-to-peer opportunities, such as peer mentoring schemes,Footnote 72 peer-based learningFootnote 73 or peer education.Footnote 74 In such schemes, (usually older) children provide support and advice to other (younger) children, based on the idea that they may be better able to get a certain message across than parents or teachers.

Currently, these types of systems are being promoted with regard to the online environment, for instance by the European Commission in its European strategy for a better Internet for children.Footnote 75 In some member states initiatives have already been taken, such as the CybermentorsFootnote 76 project in the United Kingdom, which is an online forum where ‘cybermentors’ chat with their peers about negative online experiences such as cyberbullying and provide support for each other.Footnote 77

Peer-monitoring or peer support mechanisms have been used for some time to address traditional bullying, and have generally proven effective in reducing bullying, empowering children and creating more positive peer relations, for instance in a school environment.Footnote 78 However, some scholars have warned that, when implemented in certain circumstances, such schemes may also have unintended consequences, such as the reinforcement of aggressive behaviour and thus an increase in bullying, and that their implementation should therefore be carefully considered.Footnote 79 Empirical research into the uptake and success of these types of systems to reduce peer-to-peer risks in the SNS environment should be encouraged.

6 Conclusion

In the context of the protection of children and young people in the digital environment, multi-stakeholder involvement has long been put forward as a key principle. It should also be considered as such with regard to peer-to-peer risks. In many cases that involve such risks, legal, and certainly criminal, action is not desirable. Moreover, although certain acts may fall within the scope of the legislative framework, its enforcement, where warranted, may run into practical obstacles that undermine its effectiveness. Notwithstanding these findings, policy makers at the national level need to make clear decisions regarding the application of criminal law to situations where peer-to-peer risks lead to serious harm (such as secondary sexting), taking into account the interests of both minor victims and minor offenders. Ambiguous situations, which leave doubt as to the possible application of certain criminal law provisions (e.g. with regard to child pornography) to minor offenders, should be avoided.

Many steps can be taken, however, to attempt to prevent peer-to-peer risks from occurring: starting from instilling in children from a very early age basic principles such as empathy and respect for one another,Footnote 80 to gradually increasing their media literacy skills, which should include an understanding of the characteristics of the digital environment in general, and SNS environments in particular, as well as a basic insight into and awareness of a number of potential legal implications of certain behaviour. Whereas parents and teachers would seem the most suitable actors to do this, research has found that their capacities are limited,Footnote 81 in terms of both skills and time. Their efforts should thus be complemented by industry commitment to keep improving services and to provide users with clear information and tools that empower them. In addition, governmental actors must encourage educational institutions and civil society organisations to implement empowerment strategies and provide them with the resources to do so.

Given the importance of providing children with a positive online environment, in our view, policy makers should consider establishing strong co-regulatory mechanisms in this context. Such mechanisms can take various forms and can be organised at different levels, but should at least entail that all initiatives that are taken are evaluated independently, and that there are safety-net procedures if these evaluations show that the actors involved do not take up their responsibility. Although the European Commission hints at such an approach in its 2012 European strategy for a better Internet for children, by stating that it will consider regulatory or legislative measures if industry initiatives fail to deliver,Footnote 82 this remains too vague and open-ended. A transparent co-regulatory framework, with an unambiguous division of responsibilities, strong incentives to comply and clearly defined evaluation criteria, would lead to more accountability and certainty for all actors involved and would thus, in our view, be in everyone’s best interest.

Finally, it remains crucial that both quantitative and qualitative empirical research is undertaken in this domain. A comprehensive and efficient strategy to reduce risks and empower young users can only be developed on the basis of sound evidence on the occurrence of certain practices, the actual harm they cause, and the concrete impact of initiatives of different actors.