Abstract
As the European Law Institute has pointed out, at the present stage of their development algorithms have five main characteristics: complexity, increasing autonomy, opacity, openness and vulnerability. Because autonomous systems can learn through cumulative experience, take independent decisions and operate as true ecosystems of connected elements, they pose a challenge for the classic tort law remedies. Thus, if an autonomous system causes damage or harm, who can be held liable? In this paper, after showing the shortcomings of the classic remedies, we examine three possible solutions to this problem: the establishment of a fund for the compensation of AI harm; the direct liability of autonomous systems; and the creation of a new hypothesis of strict liability.
Notes
- 1.
ELI (2022), 21 s.
- 2.
ELI (2022), 23.
- 3.
Expert Group on Liability and New Technologies (2019), 17 s.
- 4.
- 5.
Nevejans (2016), 6.
- 6.
Karnow (2016), 51–77.
- 7.
ELI (2022), 21.
- 8.
Expert Group on Liability and New Technologies (2019), 17.
- 9.
Richards and Smart (2013), 1–25. In the same sense, see Hubbard (2016), 25–50. On the contrary, see Karnow (2016), 51. See also Nevejans (2016), 6: “civil liability law, for example, might be less easily applied to developments in autonomous robotics, particularly in a scenario where a machine might cause damage that cannot be easily traced back to human error. Whole chapters on civil liability law might, then, need rethinking, including basic civil liability law, accountability for damage, or its social relevance”; and recital AB of the Resolution of the European Parliament of 16 February 2017.
- 10.
Karnow (2016), 63.
- 11.
Karnow (2016), 64.
- 12.
Expert Group on Liability and New Technologies (2019), 19.
- 13.
See Expert Group on Liability and New Technologies (2019), 19.
- 14.
See Expert Group on Liability and New Technologies (2019), 20.
- 15.
See Expert Group on Liability and New Technologies (2019), 21.
- 16.
See Expert Group on Liability and New Technologies (2019), 22.
- 17.
ELI (2022), 23.
- 18.
For more details, see Barbosa (2013). The model we propose has sufficient elasticity to accommodate these complex cases. This, however, does not mean that it is unnecessary to consider the specificities of the new technological reality.
- 19.
Neves (1996), 38.
- 20.
See Noorman (2012).
- 21.
- 22.
Noorman (2008), 46.
- 23.
Sullins (2006), 23–29.
- 24.
- 25.
See Savigny (1840), 310–313.
- 26.
- 27.
- 28.
Gierke (2010), 470–475.
- 29.
- 30.
Andrade (1997), 50.
- 31.
See Nevejans (2016), 16.
- 32.
See Nevejans (2016), 16.
- 33.
Teubner (2018), 43 s.
- 34.
Teubner (2018), 36.
- 35.
The type, dimension and extent of the damage arising from a specific activity, the difficulty of proving fault, and the importance of the values at stake are among the reasons that may justify it. Damage arising from the use of AI mechanisms satisfies these justifications. The use of AI entities may cause damage to which no human activity contributes, since the new AI entities can act autonomously. And while fault can be established in some cases, such as where the owner of the robot fails to update the software or to comply with the safety duties intended to prevent hackers from invading the system, in many other situations damage arises without any fault.
- 36.
Expert Group on Liability and New Technologies (2019), 36.
- 37.
Expert Group on Liability and New Technologies (2019), 44–45 and 48.
- 38.
In this sense, cf. ELI (2022), 5.
- 39.
Silva (1999), 612 s.
- 40.
- 41.
- 42.
ELI (2022), 9 s.
- 43.
ELI (2022), 10–11.
- 44.
ELI (2022), 12.
- 45.
ELI (2022), 12.
- 46.
ELI (2022), 14.
- 47.
ELI (2022), 14.
- 48.
Expert Group on Liability and New Technologies (2019), 43.
- 49.
Expert Group on Liability and New Technologies (2019), 43.
- 50.
ELI (2022), 16–17.
- 51.
Expert Group on Liability and New Technologies (2019), 41.
- 52.
Expert Group on Liability and New Technologies (2019), 41.
- 53.
Koch (2005), 105–106.
- 54.
Expert Group on Liability and New Technologies (2019), 39.
- 55.
Expert Group on Liability and New Technologies (2019), 45.
- 56.
Pagallo (2013), 23, 40, 103–105.
- 57.
- 58.
Pagallo (2013), 103–104.
References
Barbosa MM (2013) Do nexo de causalidade ao nexo de imputação. Contributo para a compreensão da natureza binária e personalística do requisito causal ao nível da responsabilidade civil extracontratual. Princípia, Cascais
Bertolini A (2013) Robots as products: the case for a realistic analysis of robotic applications and liability rules. Law Innovation Technol 5(2):214–247. https://ssrn.com/abstract=2410754
Bydlinski P (1998) Der Sachbegriff im elektronischen Zeitalter: zeitlos oder anpassungsbedürftig? Archiv für die civilistische Praxis 198:287–328
Cordeiro AM (2007) Tratado de Direito Civil Português, I, Parte Geral, III, Pessoas. Almedina, Coimbra
Cordeiro AM (2011) Tratado de Direito Civil, IV. Almedina, Coimbra
da Silva JC (1999) Responsabilidade civil do produtor. Almedina, Coimbra
de Andrade M (1997) Teoria Geral da Relação Jurídica, I. Almedina, Coimbra
Duguit L (1901) L’État, le droit objectif et la loi positive. Albert Fontemoing, Paris
ELI (2022) Response to public consultation on civil liability. Available at https://www.europeanlawinstitute.eu. Accessed 1 Nov 2022
Expert Group on Liability and New Technologies (2019) Liability for artificial intelligence and other emerging digital technologies. European Union. Available at https://op.europa.eu. Accessed 1 Nov 2022
Hubbard FP (2016) Allocating the risk of physical injury from sophisticated robots: efficiency, fairness and innovation. In: Calo R, Froomkin AM, Kerr I (eds) Robot law. Edward Elgar Publishing, Cheltenham, pp 25–50. https://doi.org/10.4337/9781783476732
Johnson DG, Noorman M (2014) Artefactual agency and artefactual moral agency. In: Kroes P, Verbeek PP (eds) The moral status of artefacts. Springer, Heidelberg/London/New York, pp 143–158
Justo AS (1997) A Escravatura em Roma. Boletim Da Faculdade De Direito 73:19–33
Justo AS (2017) Direito Privado Romano: II–Direito das Obrigações. Coimbra Editora, Coimbra
Karnow CEA (2016) The application of traditional tort theory to embodied machine intelligence. In: Calo R, Froomkin AM, Kerr I (eds) Robot law. Edward Elgar Publishing, Cheltenham, pp 51–77
Koch B (2005) Strict liability. In: Principles of European Tort Law, text and commentary. Springer, Wien, pp 105–106
Koch BA (2019) Product liability 2.0—Mere update or new version? In: Lohsse S, Schulze R, Staudenmayer D (eds) Liability for artificial intelligence and the Internet of Things—Münster Colloquia on EU Law and the Digital Economy, vol IV. Nomos, Baden-Baden, pp 99–115
Nevejans N (2016) European civil law rules in robotics, European Union: directorate-general for internal policies. Available at https://www.europarl.europa.eu. Accessed 1 Nov 2022
Neves AC (1996) Pessoa, Direito e Responsabilidade. Revista Portuguesa de Ciência Criminal. 6:3–66
Noorman M (2008) Mind the gap: a critique of human/technology analogies in artificial agents discourse. Maastricht Universitaire Press, Maastricht
Noorman M (2012) Computing and moral responsibility. In: Stanford encyclopedia of philosophy (revised 16 February 2018). Available at https://plato.stanford.edu. Accessed 1 Nov 2022
Pagallo U (2013) The law of robots. Springer, Heidelberg, London, New York
Richards N, Smart WD (2013) How should the law think about robots? SSRN Electron J 1–25. https://doi.org/10.2139/ssrn.2263363
Shoemaker D (2011) Attributability, answerability, and accountability: toward a wider theory of moral responsibility. Ethics 121(3):603–632
Smith A (2012) Attributability, answerability, and accountability: in defense of a unified account. Ethics 122(3):575–589
Sullins JP (2006) When is a robot a moral agent? Int Rev Inf Ethics 6(12):23–29
Teubner G (2018) Digitale Rechtssubjekte? Zum privatrechtlichen Status autonomer Softwareagenten / Digital personhood? The status of autonomous software agents in private law. Ancilla Iuris 32–54
Voit W, Geweke G (2001) Der praktische Fall—Bürgerliches Recht: Der tückische Computervirus. Juristische Schulung 43:358–374
von Savigny F (1840) System des heutigen römischen Rechts, II. Veit, Berlin
von Gierke O (2010) Deutsches Privatrecht, I, Allgemeiner Teil und Personenrecht. Duncker & Humblot, Berlin
Wolf E (1973) Grundlagen des Gemeinschaftsrechts. Archiv Für Die Civilistische Praxis 173:78–105
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
Barbosa, M.M. (2024). Autonomous Systems and Tort Law. In: Moura Vicente, D., Soares Pereira, R., Alves Leal, A. (eds) Legal Aspects of Autonomous Systems. ICASL 2022. Data Science, Machine Intelligence, and Law, vol 4. Springer, Cham. https://doi.org/10.1007/978-3-031-47946-5_1
Print ISBN: 978-3-031-47945-8
Online ISBN: 978-3-031-47946-5