
Don’t Shoot the Message: Regulating Disinformation Beyond Content

  • Chapter
The Rule of Law in Cyberspace

Part of the book series: Law, Governance and Technology Series (LGTS, volume 49)


Abstract

This paper approaches regulatory strategies against disinformation with two main goals: (i) to explore the policies recently implemented in different legal contexts, providing insight into both the risks they pose to free speech and their potential to address the rationales that motivated them; and (ii) to do so by bridging policy debates with recent findings on disinformation from social and communication studies. An interdisciplinary theoretical framework informs both the paper’s scope (anchored in understandings of regulatory strategies and disinformation) and the analysis of the legitimate motivations for states to establish statutory regulation aimed at disinformation. Building on this analysis, I suggest an organisation of recently implemented and proposed policies into three groups according to their regulatory target: content, data, and structure. Combining the analysis of these three types of policies with the theoretical framework, I argue that, in the realm of statutory regulation, state action is better targeted at data or structure, as aiming at content poses disproportionate risks to freedom of expression. Furthermore, content-targeted regulation shows little potential to address the structural transformations of the public sphere of communications that, among other factors, influence current production practices and the spread of disinformation.

This is an updated version of a journal article previously published as “Don’t Shoot the Message: Regulating Disinformation Beyond Content”, in Revista de Direito Público, Volume 18, 99, pp. 486–515, 2021.

I would like to thank my colleagues at the Digital Disinformation Hub of the Leibniz Institute for Media Research, Stephan Dreyer and Amélie Heldt, for the conversations that inspired my ideas and supported the findings in this paper. Furthermore, I would like to thank Stephan Dreyer and Leonard Kamps for their revisions and suggestions, as well as Lena Hinrichs and Mara Barthelmes for their help with the empirical research.


Notes

  1. Cadwalladr (2017); Evangelista and Bruno (2019); Faris et al. (2017).
  2. Marwick and Lewis (2020).
  3. Rossini et al. (2021), pp. 2430–2451; Dan et al. (2021), pp. 641–664.
  4. Faris et al. (2017); Karpf (2019).
  5. Neo (2021), pp. 214–228; Schulz (2019).
  6. Rauchfleisch and Kaiser (2020), e0241045.
  7. Faris et al. (2017).
  8. This is a research perspective on technological transformation called “mediated democracy”, which will be further explored in Sect. 3 of this paper. In general, see Hofmann (2019).
  9. Black (2001), pp. 103–146.
  10. Baldwin et al. (2012), p. 105.
  11. Marwick et al. (2021).
  12. Wardle and Derakhshan (2017).
  13. For exceptions, see Dan et al. (2021), pp. 641–664.
  14. For all, see Wardle and Derakhshan (2017).
  15. Guess and Lyons (2020).
  16. Wardle and Derakhshan (2017); Guess and Lyons (2020); Faris et al. (2017); Egelhofer and Lecheler (2019).
  17. Habermas (2003).
  18. Jungherr and Schroeder (2021), p. 2.
  19. Karpf (2019).
  20. Barberá (2020), p. 345.
  21. Faris et al. (2017).
  22. For an overview of these disputes, see Barberá (2020).
  23. Jungherr and Schroeder (2021), p. 2.
  24. Ibid.
  25. Jungherr and Schroeder (2021), p. 4.
  26. Jungherr and Schroeder (2021), p. 4.
  27. Jungherr and Schroeder (2021), pp. 5–8.
  28. Hofmann (2019).
  29. Ibid.
  30. Benkler (2019).
  31. Valente (2019).
  32. Karpf (2019).
  33. Ibid. Available at: https://mediawell.ssrc.org/expert-reflections/on-digital-disinformation-and-democratic-myths/.
  34. Ognyanova et al. (2020).
  35. Jungherr and Schroeder (2021), p. 3.
  36. Khan (2021).
  37. Ognyanova et al. (2020).
  38. Ognyanova et al. (2020).
  39. The role of states in disinformation counteraction is not restricted to formal legislation. It also includes other sorts of public policy beyond this paper’s scope, such as police task forces, institutional support, the encouragement of fact-checking and media literacy initiatives (Marsden et al. 2020, p. 3), and even enhanced cybersecurity. Further, as this paper looks exclusively at statutory legislation, it will not approach institutional solutions decentred from the state, such as the negotiation of voluntary measures, for example, the European Code of Conduct (Durach et al. 2020) and the Australian Code of Practice on Disinformation and Misinformation. (The Australian Code was elaborated by digital platform providers represented by the Digital Industry Group Inc. (DIGI) upon a recommendation of the Australian Communications and Media Authority (ACMA). The Code is available at: https://digi.org.au/disinformation-code/. For more on the Australian framework, see Carson and Fallon 2021.)
  40. Wiseman (2020). According to Irene Khan, at least “17 states adopted legislation to address pandemic-related problematic disinformation”. Khan (2021), p. 11.
  41. Valente (2019).
  42. The repositories initially consulted were the Poynter Institute’s “A Guide to Anti-Misinformation Actions Around the World”, available at: https://www.poynter.org/ifcn/anti-misinformation-actions/; the Law Library of Congress reports “Initiatives to Counter Fake News in Selected Countries”, available at: https://digitalcommons.unl.edu/scholcom/179/, and “Government Responses to Disinformation on Social Media Platforms”, available at: https://digitalcommons.unl.edu/cgi/viewcontent.cgi?article=1180&context=scholcom; and Carson and Fallon (2021), available at: https://www.latrobe.edu.au/__data/assets/pdf_file/0019/1203553/carson-fake-news.pdf. Besides these repositories, further academic literature and media reports were used to provide insights and context for regulatory experiences; all these other sources are cited throughout the paper.
  43. This is similar to the explanation used by Moses (2013), pp. 1–20. The literature accounts for different ways of referencing regulatory targets, which can also be understood as “the individual or organization to which a regulatory instrument applies”, Coglianese (2010).
  44. Law 13.834/2019, art. 2º. Available at: http://www.planalto.gov.br/ccivil_03/_ato2019-2022/2019/lei/L13834.htm. Accessed on: 30 Sept. 2021.
  45. Proclamation 1185/2020, as per the translation available at: https://chilot.me/2020/04/05/proclamation-no-1185-2020-hate-speech-and-disinformation-prevention-and-suppression/.
  46. Schuldt (2021). Available at: https://verfassungsblog.de/malaysia-fake-news/.
  47. Cambodian Center for Human Rights, Submission to the Special Rapporteur on the promotion and protection of the rights to freedom of opinion and expression. Available at: https://www.ohchr.org/Documents/Issues/Expression/disinformation/2-Civil-society-organisations/Cambodia-Centre-for-human-rights.pdf.
  48. Sugow et al. (2020).
  49. Law Library of Congress reports, “Initiatives to Counter Fake News in Selected Countries”. Available at: https://digitalcommons.unl.edu/scholcom/179/.
  50. Law Library of Congress, “Government Responses to Disinformation on Social Media Platforms”. Available at: https://digitalcommons.unl.edu/cgi/viewcontent.cgi?article=1180&context=scholcom.
  51. Schulz (2019), p. 17.
  52. Qi et al. (2018), pp. 1342–1354, p. 12.
  53. Smith (2019), p. 52. As the author highlights, the French law adopts “a more holistic approach” (p. 58) based on three strands designed to curb foreign state disinformation: preventing further online transmission of false information prior to elections (i.e., the case of the policy described here); ensuring greater transparency in the operation of online communication platforms; and stimulating new educational initiatives. Some of these strategies do not encompass content regulation and will be discussed in other sections of this paper.
  54. Ibid., p. 60.
  55. Belarus, Freedom House. Available at: https://freedomhouse.org/country/belarus/freedom-net/2021.
  56. Belarus, Freedom House. Available at: https://freedomhouse.org/country/belarus/freedom-net/2021.
  57. Cambodian Center for Human Rights, Submission to the Special Rapporteur on the promotion and protection of the rights to freedom of opinion and expression. Available at: https://www.ohchr.org/Documents/Issues/Expression/disinformation/2-Civil-society-organisations/Cambodia-Centre-for-human-rights.pdf.
  58. Ibid.
  59. Khan (2021), p. 3.
  60. Iglesias Keller (2020).
  61. Law Library of Congress reports, “Initiatives to Counter Fake News in Selected Countries”. Available at: https://digitalcommons.unl.edu/scholcom/179/.
  62. Schulz (2019), p. 17.
  63. Junior (2017), pp. 274–302, p. 2.
  64. Khan (2021), p. 11.
  65. Mendes (2014), p. 47.
  66. Walker et al. (2019).
  67. UK Information Commissioner’s Office, Microtargeting, ICO website. Available at: https://ico.org.uk/your-data-matters/be-data-aware/social-media-privacy-settings/microtargeting/.
  68. Zarouali et al. (2020).
  69. Jungherr and Schroeder (2021), p. 3.
  70. Nenadić (2019), p. 6.
  71. Ibid., p. 2.
  72. Cadwalladr (2017); Evangelista and Bruno (2019).
  73. However, guaranteeing the integrity of data-driven elections encompasses concerns that go beyond disinformation. See Bennet and Oduro-Marfo (2019) and Bennet and Lyon (2019).
  74. Cruz (2020), p. 297.
  75. Cruz (2020).
  76. Ibid., p. 377.
  77. Nenadić (2019), p. 13; Cruz (2020), pp. 376–378.
  78. Cruz (2020), p. 377.
  79. Bennet and Oduro-Marfo (2019), p. 6.
  80. In the context of digital platform regulation, regulating structure can also refer to antitrust legislation (see Tackling the Information Crisis: A Policy Framework for Media System Resilience, the report of the LSE Commission on Truth, Trust and Technology, available at: https://www.lse.ac.uk/media-and-communications/truth-trust-and-technology-commission). Despite this application of the term, this article’s scope does not go so far as to encompass the analysis of antitrust legislation.
  81. Tambini (2021).
  82. Mansell and Steinmueller (2020), p. 101.
  83. Schulz (2019).
  84. Brazilian Federal House of Representatives Bill of Law 2.630/2020. Available at: https://www.camara.leg.br/proposicoesWeb/fichadetramitacao?idProposicao=2256735.
  85. Frosio (2017), pp. 1–33.
  86. Síthigh (2020), pp. 1–21.
  87. Kuczerawy (2019).
  88. Gasser and Schulz (2015).
  89. See, for instance, Tambini (2019) and Frosio (2017).
  90. Smith (2019), p. 62.
  91. Suzor et al. (2019), pp. 1526–1543.
  92. Rieder and Hofmann (2020).
  93. Gorwa and Ash (2020), p. 287.
  94. Rieder and Hofmann (2020).
  95. The 2017 Netzwerkdurchsetzungsgesetz (NetzDG) is not a disinformation-targeted law. It requires social media platforms to implement procedures that allow users to report illegal content, notably the 22 criminal conducts already provided in Germany’s Criminal Code. According to its terms, “manifestly unlawful” content needs to be removed within 24 hours of notification (or possibly after 7 days or more, with terms to be agreed upon with the law enforcement authority). Besides removals, Sect. 2 requires platforms to periodically publish transparency reports on the number of complaints received and how they were handled by the platform. Heldt (2019).
  96. Brazilian Federal House of Representatives Bill of Law 2.630/2020. Available at: https://www.camara.leg.br/proposicoesWeb/fichadetramitacao?idProposicao=2256735.
  97. European Commission, Proposal for a Regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC. Available at: https://digital-strategy.ec.europa.eu/en/library/proposal-regulation-european-parliament-and-council-single-market-digital-services-digital-services.
  98. https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12826-Transparency-of-political-advertising/public-consultation_en.
  99. Smith (2019), p. 62.
  100. Nenadić (2019).
  101. An example would be the German NetzDG mentioned above.
  102. Schulz (2019).
  103. Sylvain (2010), p. 209.
  104. Helberger (2020), p. 846.
  105. Suzor et al. (2018), pp. 391–392.
  106. Hofmann (2019).
  107. Natali Helberger has argued that recent attempts to “infuse some public value standards into corporations” formalise “the role of platforms as governors of online speech” and reinforce their political power. Helberger (2020), p. 848.
  108. Jungherr and Schroeder (2021).
  109. Helberger (2020), p. 847.


Author information


Correspondence to Clara Iglesias Keller.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter


Cite this chapter

Iglesias Keller, C. (2022). Don’t Shoot the Message: Regulating Disinformation Beyond Content. In: Blanco de Morais, C., Ferreira Mendes, G., Vesting, T. (eds) The Rule of Law in Cyberspace. Law, Governance and Technology Series, vol 49. Springer, Cham. https://doi.org/10.1007/978-3-031-07377-9_16


  • DOI: https://doi.org/10.1007/978-3-031-07377-9_16

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-07376-2

  • Online ISBN: 978-3-031-07377-9

  • eBook Packages: Law and Criminology, Law and Criminology (R0)
