Autonomous Systems and Tort Law

  • Conference paper
Legal Aspects of Autonomous Systems (ICASL 2022)

Part of the book series: Data Science, Machine Intelligence, and Law (DSMIL, volume 4)


Abstract

At the present stage of their development, as the European Law Institute has pointed out, algorithms have five main characteristics: complexity, increasing autonomy, opacity, openness and vulnerability. Because they are able to learn through cumulative experience, to take independent decisions and to operate as true ecosystems of connected elements, autonomous systems pose a challenge to the classic tort law remedies. If an autonomous system causes damage or harm, who can be held liable? In this paper, after showing the difficulties faced by the classic remedies, we examine three possible solutions to this problem: the establishment of a fund for the compensation of harm caused by AI; the direct liability of the autonomous systems themselves; and the establishment of a new hypothesis of strict liability.


Notes

  1. ELI (2022), 21 s.

  2. ELI (2022), 23.

  3. Expert Group on Liability and New Technologies (2019), 17 s.

  4. By contrast, see Richards, Neil; Smart, William D. (2013); Hubbard (2016), 25–50.

  5. Nevejans (2016), 6.

  6. Karnow (2016), 51–77.

  7. ELI (2022), 21.

  8. Expert Group on Liability and New Technologies (2019), 17.

  9. Richards, Neil; Smart, William D. (2013), 1–25. In the same sense, see Hubbard (2016), 25–50. On the contrary, see Karnow (2016), 51. See also Nevejans (2016), 6: “civil liability law, for example, might be less easily applied to developments in autonomous robotics, particularly in a scenario where a machine might cause damage that cannot be easily traced back to human error. Whole chapters on civil liability law might, then, need rethinking, including basic civil liability law, accountability for damage, or its social relevance”; and recital AB of the Resolution of the European Parliament of 16 February 2017.

  10. Karnow (2016), 63.

  11. Karnow (2016), 64.

  12. Expert Group on Liability and New Technologies (2019), 19.

  13. See Expert Group on Liability and New Technologies (2019), 19.

  14. See Expert Group on Liability and New Technologies (2019), 20.

  15. See Expert Group on Liability and New Technologies (2019), 21.

  16. See Expert Group on Liability and New Technologies (2019), 22.

  17. ELI (2022), 23.

  18. For more details, see Barbosa (2013). The model we propose has sufficient elasticity to accommodate these complex cases. This, however, does not mean that it is unnecessary to consider the specificities of the new technological reality.

  19. Neves (1996), 38.

  20. See Noorman (2012).

  21. See Noorman (2008), 32, and Johnson, Deborah G.; Noorman, Merel (2014), 143–158.

  22. Noorman (2008), 46.

  23. Sullins (2006), 23–29.

  24. Shoemaker (2011), 603–632. For a critique of Shoemaker, see Smith (2021), 575–589.

  25. See Savigny (1840), 310–313.

  26. See Cordeiro (2007), 469–475, and Cordeiro (2011), 545, 676.

  27. See Cordeiro (2007), 494; Cordeiro (2011), 573–574; Duguit (1901), 1–3; Wolf (1973), 100.

  28. Gierke (2010), 470–475.

  29. Cordeiro (2007), 517; Cordeiro (2011), 594.

  30. Andrade (1997), 50.

  31. See Nevejans (2016), 16.

  32. See Nevejans (2016), 16.

  33. Teubner (2018), 43 s.

  34. Teubner (2018), 36.

  35. The type, dimension and extent of the damage arising from a specific activity, the difficulty of proving fault and the importance of the values at stake are some of the reasons that can be invoked to justify it. Damage arising from the use of AI mechanisms meets these justifications. The use of AI entities may cause damage to which no human activity contributes, since the new AI entities can act autonomously. And although fault can be identified in some cases, for example when the owner of the robot fails to update the software or to comply with the safety duties meant to prevent hackers from invading the system, in many other situations damage arises without any fault.

  36. Expert Group on Liability and New Technologies (2019), 36.

  37. Expert Group on Liability and New Technologies (2019), 44–45 and 48.

  38. In this sense, cf. ELI (2022), 5.

  39. Silva (1999), 612 s.

  40. Some authors have questioned whether software can be qualified as a tangible thing, but this view has been rejected. On this, see Voit, Wolfgang; Geweke, Götz (2001), 362; Bydlinski (1998), 305.

  41. Silva (1999), 613; Bertolini (2013), 214–247; Koch (2019), 105.

  42. ELI (2022), 9 s.

  43. ELI (2022), 10–11.

  44. ELI (2022), 12.

  45. ELI (2022), 12.

  46. ELI (2022), 14.

  47. ELI (2022), 14.

  48. Expert Group on Liability and New Technologies (2019), 43.

  49. Expert Group on Liability and New Technologies (2019), 43.

  50. ELI (2022), 16–17.

  51. Expert Group on Liability and New Technologies (2019), 41.

  52. Expert Group on Liability and New Technologies (2019), 41.

  53. Koch (2005), 105–106.

  54. Expert Group on Liability and New Technologies (2019), 39.

  55. Expert Group on Liability and New Technologies (2019), 45.

  56. Pagallo (2013), 23, 40, 103–105.

  57. See Justo (2017), 31–33; Justo (1997), 27.

  58. Pagallo (2013), 103–104.

References

  • Barbosa MM (2013) Do nexo de causalidade ao nexo de imputação. Contributo para a compreensão da natureza binária e personalística do requisito causal ao nível da responsabilidade civil extracontratual. Princípia, Cascais

  • Bertolini A (2013) Robots as products: the case for a realistic analysis of robotic applications and liability rules. Law Innovation Technol 5(2):214–247. https://ssrn.com/abstract=2410754

  • Bydlinski P (1998) Der Sachbegriff im elektronischen Zeitalter: zeitlos oder anpassungsbedürftig? Archiv für die civilistische Praxis 198:287–328

  • Cordeiro AM (2007) Tratado de Direito Civil Português, I, Parte Geral, III, Pessoas. Almedina, Coimbra

  • Cordeiro AM (2011) Tratado de Direito Civil, IV. Almedina, Coimbra

  • da Silva JC (1999) Responsabilidade civil do produtor. Almedina, Coimbra

  • de Andrade M (1997) Teoria Geral da Relação Jurídica, I. Almedina, Coimbra

  • Duguit L (1901) L’Etat, le Droit objectif et la loi positive. Albert Fontemoing Editeur, Paris

  • ELI (2022) Response to public consultation on civil liability. Available at https://www.europeanlawinstitute.eu. Accessed 1 Nov 2022

  • Expert Group on Liability and New Technologies (2019) Liability for artificial intelligence and other emerging digital technologies. European Union. Available at https://op.europa.eu. Accessed 1 Nov 2022

  • Hubbard FP (2016) Allocating the risk of physical injury from sophisticated robots: efficiency, fairness and innovation. In: Calo R, Froomkin AM, Kerr I (eds) Robot law. Edward Elgar Publishing, Cheltenham, pp 25–50. https://doi.org/10.4337/9781783476732

  • Johnson DG, Noorman M (2014) Artefactual agency and artefactual moral agency. In: Kroes P, Verbeek PP (eds) The moral status of artefacts. Springer, Heidelberg/London/New York, pp 143–158

  • Justo AS (1997) A Escravatura em Roma. Boletim da Faculdade de Direito 73:19–33

  • Justo AS (2017) Direito Privado Romano, II: Direito das Obrigações. Coimbra Editora, Coimbra

  • Karnow CEA (2016) The application of traditional tort theory to embodied machine intelligence. In: Calo R, Froomkin AM, Kerr I (eds) Robot law. Edward Elgar Publishing, Cheltenham, pp 51–77

  • Koch BA (2005) Strict liability. In: Principles of European tort law: text and commentary. Springer, Wien, pp 105–106

  • Koch BA (2019) Product liability 2.0—mere update or new version? In: Lohsse S, Schulze R, Staudenmayer D (eds) Liability for artificial intelligence and the Internet of Things—Münster Colloquia on EU Law and the Digital Economy, vol IV. Nomos, Baden-Baden, pp 99–115

  • Nevejans N (2016) European civil law rules in robotics. European Union, Directorate-General for Internal Policies. Available at https://www.europarl.europa.eu. Accessed 1 Nov 2022

  • Neves AC (1996) Pessoa, Direito e Responsabilidade. Revista Portuguesa de Ciência Criminal 6:3–66

  • Noorman M (2008) Mind the gap: a critique of human/technology analogies in artificial agents discourse. Maastricht Universitaire Press, Maastricht

  • Noorman M (2012) Computing and moral responsibility. In: Stanford encyclopedia of philosophy (revised 16 February 2018). Available at https://plato.stanford.edu. Accessed 1 Nov 2022

  • Pagallo U (2013) The laws of robots. Springer, Heidelberg/London/New York

  • Richards N, Smart WD (2013) How should the law think about robots? SSRN Electron J 1–25. https://doi.org/10.2139/ssrn.2263363

  • Shoemaker D (2011) Attributability, answerability, and accountability: toward a wider theory of moral responsibility. Ethics 121(3):603–632

  • Smith A (2021) Attributability, answerability, and accountability: in defense of a unified account. Ethics 122(3):575–589

  • Sullins JP (2006) When is a robot a moral agent? Int Rev Inf Ethics 6(12):23–29

  • Teubner G (2018) Digitale Rechtssubjekte? Zum privatrechtlichen Status autonomer Softwareagenten / Digital personhood? The status of autonomous software agents in private law. Ancilla Iuris 32–54

  • Voit W, Geweke G (2001) Der praktische Fall—Bürgerliches Recht: Der tückische Computervirus. Juristische Schulung 43:358–374

  • von Savigny F (1840) System des heutigen römischen Rechts, II. Veit, Berlin

  • von Gierke O (2010) Deutsches Privatrecht, I, Allgemeiner Teil und Personenrecht. Duncker & Humblot, Berlin

  • Wolf E (1973) Grundlagen des Gemeinschaftsrechts. Archiv für die civilistische Praxis 173:78–105

Author information

Corresponding author

Correspondence to Mafalda Miranda Barbosa.

Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Barbosa, M.M. (2024). Autonomous Systems and Tort Law. In: Moura Vicente, D., Soares Pereira, R., Alves Leal, A. (eds) Legal Aspects of Autonomous Systems. ICASL 2022. Data Science, Machine Intelligence, and Law, vol 4. Springer, Cham. https://doi.org/10.1007/978-3-031-47946-5_1

  • DOI: https://doi.org/10.1007/978-3-031-47946-5_1

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-47945-8

  • Online ISBN: 978-3-031-47946-5

  • eBook Packages: Law and Criminology, Law and Criminology (R0)
