Abstract
This paper approaches regulatory strategies against disinformation with two main goals: (i) to explore the policies recently implemented in different legal contexts, providing insight into both the risks they pose to free speech and their potential to address the rationales that motivated them, and (ii) to do so by bridging policy debates with recent findings on disinformation from social and communication studies. An interdisciplinary theoretical framework informs both the paper’s scope (anchored in understandings of regulatory strategies and disinformation) and the analysis of the legitimate motivations for states to establish statutory regulation aimed at disinformation. Departing from this analysis, I suggest organising recently implemented and proposed policies into three groups according to their regulatory target: content, data, and structure. Combining the analysis of these three types of policies with the theoretical framework, I will argue that, in the realm of statutory regulation, state action is better targeted at data or structure, as aiming at content poses disproportionate risks to freedom of expression. Furthermore, content-targeted regulation shows little potential to address the structural transformations of the public sphere of communications that, among other factors, influence current production practices and the spread of disinformation.
This is an updated version of a journal article previously published as “Don’t Shoot the Message: Regulating Disinformation Beyond Content”, in Revista de Direito Público, Volume 18, 99, pp. 486–515, 2021.
I would like to thank my colleagues at the Digital Disinformation Hub of the Leibniz Institute for Media Research, Stephan Dreyer and Amélie Heldt, for the conversations that inspired my ideas and supported the findings in this paper. Furthermore, I would like to thank Stephan Dreyer and Leonard Kamps for their revisions and suggestions, as well as Lena Hinrichs and Mara Barthelmes for their help with the empirical research.
Notes
- 1.
- 2.
Marwick and Lewis (2020).
- 3.
- 4.
- 5.
- 6.
Rauchfleisch and Kaiser (2020), p. e0241045.
- 7.
Faris et al. (2017).
- 8.
- 9.
Black (2001), pp. 103–146.
- 10.
Baldwin et al. (2012) p. 105.
- 11.
Marwick et al. (2021).
- 12.
Wardle and Derakhshan (2017).
- 13.
For exceptions, Dan et al. (2021), pp. 641–664.
- 14.
For all, see Wardle and Derakhshan (2017).
- 15.
Guess and Lyons (2020).
- 16.
- 17.
Habermas (2003).
- 18.
Jungherr and Schroeder (2021), p. 2.
- 19.
Karpf (2019).
- 20.
Barberá (2020), p. 345.
- 21.
Faris et al. (2017).
- 22.
For an overview of these disputes, see Barberá (2020).
- 23.
Jungherr and Schroeder (2021), p. 2.
- 24.
Ibid.
- 25.
Jungherr and Schroeder (2021), p. 4.
- 26.
Jungherr and Schroeder (2021), p. 4.
- 27.
Jungherr and Schroeder (2021), pp. 5–8.
- 28.
Hofmann (2019).
- 29.
Ibid.
- 30.
Benkler (2019).
- 31.
Valente (2019).
- 32.
Karpf (2019).
- 33.
- 34.
Ognyanova et al. (2020).
- 35.
Jungherr and Schroeder (2021), p. 3.
- 36.
Khan (2021).
- 37.
Ognyanova et al. (2020).
- 38.
Ognyanova et al. (2020).
- 39.
The role of states in disinformation counteraction is not restricted to formal legislation. It also includes other sorts of public policy beyond this paper’s scope, such as police task forces, institutional support, encouraging fact-checking and media literacy initiatives (Marsden et al. 2020, p. 3), and even enhancing cybersecurity. Further, as this paper looks exclusively at statutory legislation, it will not approach institutional solutions decentred from the state, such as the negotiation of voluntary measures, for example, the European Code of Conduct (Durach et al. 2020) and the Australian Code of Practice on Disinformation and Misinformation (the Code was elaborated by digital platform providers represented by the Digital Industry Group Inc. (DIGI) upon a recommendation of the Australian Media and Communications Authority (ACMA); it is available at: https://digi.org.au/disinformation-code/). For more on the Australian framework, see Carson and Fallon (2021).
- 40.
- 41.
Valente (2019).
- 42.
The repositories that were initially consulted are the Poynter Institute’s, A guide to anti-misinformation actions around the world. Available at: https://www.poynter.org/ifcn/anti-misinformation-actions/; the Law Library of Congress Reports “Initiatives to Counter Fake News in Selected Countries”. Available at: https://digitalcommons.unl.edu/scholcom/179/; and “Government Responses to Disinformation on Social Media Platforms”. Available at: https://digitalcommons.unl.edu/cgi/viewcontent.cgi?article=1180&context=scholcom; and Carson and Fallon (2021), available at: https://www.latrobe.edu.au/__data/assets/pdf_file/0019/1203553/carson-fake-news.pdf. Besides these repositories, further academic literature and media reports were used to provide insights and context for regulatory experiences. All these other sources are cited throughout the paper.
- 43.
- 44.
Law 13.834/2019, art. 2o. Available at: http://www.planalto.gov.br/ccivil_03/_ato2019-2022/2019/lei/L13834.htm. Accessed on: 30 Sept. 2021.
- 45.
Proclamation 1185/2020, as per the translation available at: https://chilot.me/2020/04/05/proclamation-no-1185-2020-hate-speech-and-disinformation-prevention-and-suppression/.
- 46.
Schuldt (2021). Available at: https://verfassungsblog.de/malaysia-fake-news/.
- 47.
Cambodian Center for Human Rights, Submission to the Special Rapporteur on the promotion and protection of the rights to freedom of opinion and expression. Available at: https://www.ohchr.org/Documents/Issues/Expression/disinformation/2-Civil-society-organisations/Cambodia-Centre-for-human-rights.pdf.
- 48.
Sugow et al. (2020).
- 49.
Law Library of Congress Reports, “Initiatives to Counter Fake News in Selected Countries”. Available at: https://digitalcommons.unl.edu/scholcom/179/.
- 50.
- 51.
Schulz (2019), p. 17.
- 52.
Qi et al. (2018), pp. 1342–1354, p. 12.
- 53.
Smith (2019), p. 52. As the author highlights, the French Law adopts “a more holistic approach” (p. 58) based on three strands designed to curb foreign state disinformation: to prevent further online transmission of false information prior to elections (i.e., the case of the policy described here); to ensure greater transparency in the operation of online communication platforms; and to stimulate new educational initiatives. Some of these strategies do not encompass content regulation and will be discussed in other sections of this paper.
- 54.
Ibid., p. 60.
- 55.
Belarus, Freedom House. Available at: https://freedomhouse.org/country/belarus/freedom-net/2021.
- 56.
Belarus, Freedom House. Available at: https://freedomhouse.org/country/belarus/freedom-net/2021.
- 57.
Cambodian Center for Human Rights, Submission to the Special Rapporteur on the promotion and protection of the rights to freedom of opinion and expression. Available at: https://www.ohchr.org/Documents/Issues/Expression/disinformation/2-Civil-society-organisations/Cambodia-Centre-for-human-rights.pdf.
- 58.
Cambodian Center for Human Rights, Submission to the Special Rapporteur on the promotion and protection of the rights to freedom of opinion and expression. Available at: https://www.ohchr.org/Documents/Issues/Expression/disinformation/2-Civil-society-organisations/Cambodia-Centre-for-human-rights.pdf.
- 59.
Khan (2021), p. 3.
- 60.
Iglesias Keller (2020).
- 61.
Law Library of Congress Reports, “Initiatives to Counter Fake News in Selected Countries”. Available at: https://digitalcommons.unl.edu/scholcom/179/.
- 62.
Schulz (2019), p. 17.
- 63.
Junior (2017), pp. 274–302, p. 2.
- 64.
Khan (2021), p. 11.
- 65.
Mendes (2014), p. 47.
- 66.
Walker et al. (2019).
- 67.
UK Information Commissioner’s Office, Microtargeting, ICO website. Available at: https://ico.org.uk/your-data-matters/be-data-aware/social-media-privacy-settings/microtargeting/.
- 68.
Zarouali et al. (2020).
- 69.
Jungherr and Schroeder (2021), p. 3.
- 70.
Nenadić (2019), p. 6.
- 71.
Ibid., p. 2.
- 72.
- 73.
- 74.
Cruz (2020), p. 297.
- 75.
Cruz (2020).
- 76.
Ibid., p. 377.
- 77.
- 78.
Cruz (2020), p. 377.
- 79.
Bennet and Oduro-Marfo (2019), p. 6.
- 80.
In the context of digital platform regulation, regulating structure can also refer to antitrust legislation (see Tackling the Information Crisis: A Policy Framework for Media System Resilience, The Report of the LSE Commission on Truth, Trust and Technology. Available at: https://www.lse.ac.uk/media-and-communications/truth-trust-and-technology-commission). Despite this application of the term, this article’s scope does not go so far as to encompass the analysis of antitrust legislation.
- 81.
Tambini (2021).
- 82.
Mansell and Steinmueller (2020), p. 101.
- 83.
Schulz (2019).
- 84.
Brazilian Federal House of Representatives Bill of Law 2.630/2020. Available at: https://www.camara.leg.br/proposicoesWeb/fichadetramitacao?idProposicao=2256735.
- 85.
Frosio (2017), pp. 1–33.
- 86.
Síthigh (2020), pp. 1–21.
- 87.
Kuczerawy (2019).
- 88.
Gasser and Schulz (2015).
- 89.
- 90.
Smith (2019), p. 62.
- 91.
Suzor et al. (2019), pp. 1526–1543.
- 92.
Rieder and Hofmann (2020).
- 93.
Gorwa and Ash (2020), p. 287.
- 94.
Rieder and Hofmann (2020).
- 95.
The 2017 Netzwerkdurchsetzungsgesetz (NetzDG) is not a disinformation-targeted law. It requires social media platforms to implement procedures that allow users to report illegal content, notably the 22 criminal conducts already provided in Germany’s Criminal Code. According to its terms, “manifestly unlawful” content needs to be removed within 24 hours of notification (or possibly after 7 days or more, with terms to be agreed upon with the law enforcement authority). Besides removals, Section 2 requires platforms to periodically publish transparency reports on the number of complaints received and how they were handled by the platform. Heldt (2019).
- 96.
Brazilian Federal House of Representatives Bill of Law 2.630/2020. Available at: https://www.camara.leg.br/proposicoesWeb/fichadetramitacao?idProposicao=2256735.
- 97.
European Commission, Proposal for a Regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC. Available at: https://digital-strategy.ec.europa.eu/en/library/proposal-regulation-european-parliament-and-council-single-market-digital-services-digital-services.
- 98.
- 99.
Smith (2019), p. 62.
- 100.
Nenadić (2019).
- 101.
An example would be the German NetzDG mentioned above.
- 102.
Schulz (2019).
- 103.
Sylvain (2010), p. 209.
- 104.
Helberger (2020), p. 846.
- 105.
Suzor et al. (2018), pp. 391–392.
- 106.
Hofmann (2019).
- 107.
Natali Helberger has argued that recent attempts to “infuse some public value standards into corporations” formalise “the role of platforms as governors of online speech” and reinforce their political power. Helberger (2020), p. 848.
- 108.
Jungherr and Schroeder (2021).
- 109.
Helberger (2020), p. 847.
References
Baldwin R, Cave M, Lodge M (2012) Understanding regulation: theory, strategy, and practice, 2nd edn. Oxford University Press, New York
Barberá P (2020) Social media, echo chambers, and political polarization. In: Persily N, Tucker J (eds) Social media and democracy: the state of the field, prospects for reform. Cambridge University Press, Cambridge
Benkler Y (2019) Cautionary Notes on Disinformation and the Origins of Distrust. MediaWell, Social Science Research Council. https://mediawell.ssrc.org/expert-reflections/cautionary-notes-on-disinformation-benkler/
Bennet C, Lyon D (2019) Data-driven elections: implications and challenges for democratic societies. Inter Policy Rev 8:4. https://policyreview.info/node/1433
Bennet C, Oduro-Marfo S (2019) Privacy, voter surveillance and democratic engagement: challenges for data protection authorities. University of Victoria
Black J (2001) Decentring regulation: understanding the role of regulation and self-regulation in a “post-regulatory” world. Curr Leg Probl 54(1):103–146
Cadwalladr C (2017) The great British Brexit robbery: how our democracy was hijacked. The Guardian. https://www.theguardian.com/technology/2017/may/07/the-great-british-brexit-robbery-hijacked-democracy
Carson A, Fallon L (2021) Fighting fake news: a study of online misinformation regulation in the Asia Pacific. La Trobe University. https://www.latrobe.edu.au/__data/assets/pdf_file/0019/1203553/carson-fake-news.pdf. Accessed 10 Jun 2021
Coglianese C (2010) Engaging business in the regulation of nanotechnology. In: Bosso CJ (ed) Governing uncertainty: environmental regulation in the age of nanotechnology. RFF Press, Washington, DC
Cruz F (2020) Novo jogo, velhas regras: democracia e direito na era da nova propaganda politica e das fake news. Grupo Editorial Letramento, Casa do Direito, Belo Horizonte, MG
Dan V et al (2021) Visual mis- and disinformation, social media, and democracy. J Mass Commun Q 98(3):641–664
Das A, Schroeder R (2020) Online disinformation in the run-up to the Indian 2019 election. Inf Commun Soc, pp 1–17
Dobber T, Ó Fathaigh R, Zuiderveen F (2019) The regulation of online political micro-targeting in Europe. Inter Policy Rev 8:4. https://policyreview.info/node/1440
Durach F, Bârgâoanu A, Nastasiu C (2020) Tackling disinformation: EU regulation of the digital space. Roman J Eur Aff 20:1. http://rjea.ier.gov.ro/wp-content/uploads/2020/05/RJEA_vol.-20_no.1_June-2020_Full-issue.pdf#page=6
Egelhofer J, Lecheler S (2019) Fake news as a two-dimensional phenomenon: a framework and research agenda. Ann Int Commun Assoc 43(2):97–116
Evangelista R, Bruno F (2019) WhatsApp and political instability in Brazil: targeted messages and political radicalisation. Inter Policy Rev 8:4. https://policyreview.info/node/1434
Faris R, Roberts H, Etling B et al (2017) Partisanship, propaganda, and disinformation: online media and the 2016 U.S. presidential election. Berkman Klein Center for Internet & Society at Harvard University, Cambridge, U.S. http://nrs.harvard.edu/urn-3:HUL.InstRepos:33759251
Frosio G (2017) Why keep a dog and bark yourself? From intermediary liability to responsibility. Int J Law Inf Technol, pp 1–33
Gasser U, Schulz W (2015) Governance of online intermediaries: observations from a series of national case studies. SSRN Electronic Journal. http://www.ssrn.com/abstract=2566364
Gorwa R, Ash T (2020) Democratic transparency in the platform society. In: Persily N, Tucker J (eds) Social media and democracy: the state of the field, prospects for reform. Cambridge University Press, Cambridge
Guess A, Lyons B (2020) Misinformation, disinformation and online propaganda. In: Persily N, Tucker J (eds) Social media and democracy: the state of the field, prospects for reform. Cambridge University Press, Cambridge
Habermas J (2003) O Estado Democrático de Direito: uma amarração paradoxal de princípios contraditórios? In: Era das Transições. Editora Tempo Brasileiro, Rio de Janeiro
Helberger N (2020) The political power of platforms: how current attempts to regulate misinformation amplify opinion power. Dig J 8(6):842–854
Heldt A (2019) Reading between the lines and the numbers: an analysis of the first NetzDG reports. Inter Policy Rev 8:2. https://policyreview.info/node/1398
Hofmann J (2019) Mediated democracy – linking digital technology to political agency. Inter Policy Rev 8:2. https://policyreview.info/node/1416
Iglesias Keller C (2020) Policy by judicialisation: the institutional framework for intermediary liability in Brazil. Int Rev Law Comput Technol, pp 1–19
Jungherr A, Schroeder R (2021) Disinformation and the structural transformations of the public arena: addressing the actual challenges to democracy. Soc Media Soc 7:1
Junior R (2017) Freedom of expression: what lessons should we learn from US experience? Revista Direito GV 13(1):274–302
Karpf D (2019) On digital disinformation and democratic myths. MediaWell, Social Science Research Council. https://mediawell.ssrc.org/expert-reflections/on-digital-disinformation-and-democratic-myths/
Khan I (2021) Disinformation and freedom of opinion and expression. United Nations, General Assembly
Kuczerawy A (2019) General Monitoring Obligations: A New Cornerstone of Internet Regulation in the EU?. Available at: https://ssrn.com/abstract=3449170.
Mansell R, Steinmueller W (2020) Advanced introduction to platform economics. Edward Elgar Publishing (Elgar Advanced Introductions), Cheltenham
Marsden C, Meyer T, Brown I (2020) Platform values and democratic elections: how can the law regulate digital disinformation? Comput Law Secur Rev 36:105373
Marwick A et al (2021) Critical disinformation studies - a syllabus. Center for Information, Technology and Public Life. University of North Carolina at Chapel Hill. https://citap.unc.edu/research/critical-disinfo/
Marwick A, Lewis R (2020) Media manipulation and disinformation online. Data & Society Research Institute. https://datasociety.net/wp-content/uploads/2017/05/DataAndSociety_MediaManipulationAndDisinformationOnline-1.pdf
Mendes L (2014) Privacidade, proteção de dados e direito do consumidor: linhas gerais de um novo direito fundamental. Saraiva, São Paulo
Moses L (2013) How to think about law, regulation and technology: problems with ‘technology’ as a regulatory target. Law Innov Technol 5(1):1–20
Nenadić I (2019) Unpacking the “European approach” to tackling challenges of disinformation and political manipulation. Inter Policy Rev 8:4. https://policyreview.info/node/1436
Neo R (2021) The international discourses and governance of fake news. Global Policy 12(2):214–228
Ognyanova K et al (2020) Misinformation in action: fake news exposure is linked to lower trust in media, higher trust in government when your side is in power. Harvard Kennedy School Misinformation Review. https://misinforeview.hks.harvard.edu/?p=1689
Qi A, Shao G, Zheng W (2018) Assessing China’s cybersecurity law. Comput Law Secur Rev 34(6):1342–1354
Rauchfleisch A, Kaiser J (2020) The false positive problem of automatic bot detection in social science research. PLoS ONE 15(10):e0241045
Rieder B, Hofmann J (2020) Towards platform observability. Inter Policy Rev 9:4. https://policyreview.info/articles/analysis/towards-platform-observability
Rossini P et al (2021) Dysfunctional information sharing on WhatsApp and Facebook: the role of political talk, cross-cutting exposure and social corrections. New Media Soc 23(8):2430–2451
Schuldt L (2021) The rebirth of Malaysia’s fake news law – and what the NetzDG has to do with it. https://verfassungsblog.de/malaysia-fake-news/
Síthigh D (2020) The road to responsibilities: new attitudes towards internet intermediaries. Inf Commun Technol Law 29(1):1–21
Smith R (2019) Fake news, French Law and democratic legitimacy: lessons for the United Kingdom? J Media Law 11(1):52–81
Schulz W (2019) Roles and responsibilities of information intermediaries: fighting misinformation as a test case for human-rights respecting governance of social media platforms. Hoover Institution, Stanford University (Aegis Series). https://www.hoover.org/sites/default/files/research/docs/schulz_webreadypdf.pdf
Sugow A, Mungai B, Wanyama J (2020) The regulation of fake news in Kenya under the coronavirus threat. Available at https://cipit.strathmore.edu/the-regulation-of-fake-news-in-kenya-under-the-coronavirus-threat/
Suzor N, Van Geelen T, West S (2018) Evaluating the legitimacy of platform governance: a review of research and a shared research agenda. Int Commun Gazette 80(4):385–400
Suzor N, West S, Quodling A et al (2019) What do we mean when we talk about transparency? Toward meaningful transparency in commercial content moderation. Int J Commun 13:1526–1543
Sylvain O (2010) Internet governance and democratic legitimacy. Federal Commun Law J 62(2):205–274
Tambini D (2021) Rights and responsibilities of internet intermediaries in Europe: the need for policy coordination. CIGI - Centre for International Governance Innovation (blog). https://www.cigionline.org/articles/rights-and-responsibilities-internet-intermediaries-europe-need-policy-coordination
Valente J (2019) Regulando desinformação e fake news: um panorama internacional das respostas ao problema. Comunicação pública 14:27. http://journals.openedition.org/cp/5262
Walker S, Mercea D, Bastos M (2019) The disinformation landscape and the lockdown of social platforms. Inf Commun Soc 22(11):1531–1543
Wardle C, Derakhshan H (2017) Information Disorder: toward an interdisciplinary framework for research and policy making. Council of Europe (Council of Europe Report)
Wiseman J (2020) Rush to pass ‘fake news’ laws during Covid-19 intensifying global media freedom challenges. International Press Institute. https://ipi.media/rush-to-pass-fake-news-laws-during-covid-19-intensifying-global-media-freedom-challenges/
Zarouali B et al (2020) Using a personality-profiling algorithm to investigate political microtargeting: assessing the persuasion effects of personality-tailored ads on social media. Commun Res (published online)
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Iglesias Keller, C. (2022). Don’t Shoot the Message: Regulating Disinformation Beyond Content. In: Blanco de Morais, C., Ferreira Mendes, G., Vesting, T. (eds) The Rule of Law in Cyberspace. Law, Governance and Technology Series, vol 49. Springer, Cham. https://doi.org/10.1007/978-3-031-07377-9_16
DOI: https://doi.org/10.1007/978-3-031-07377-9_16
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-07376-2
Online ISBN: 978-3-031-07377-9
eBook Packages: Law and Criminology (R0)