1. Introduction
Artificial intelligence («AI»)1 is undoubtedly a popular topic in legal academia, as this technology pushes boundaries in many areas of law. AI is often considered a disruptive technology and – even though much research is yet to be done – is already (claimed to be) used for creating music, works of art, product information, and more.2 While multiple economic, legal and philosophical questions are linked to the use of AI, lawyers have considered how this technology fits within the copyright framework. A number of contributions have already touched upon the issue, primarily with a focus on economic rights; less research, however, has been done to identify challenges in the realm of moral rights. This paper thus intends to outline special issues in connection with moral rights and AI and, as specific rules on AI are (still) the exception at both the national and the international level, it also hopes to contribute thoughts on a possible regime de lege ferenda.
2.1. Moral rights under international and European copyright law
Rules on moral rights at the international and European level have traditionally been sparse.3 However, the Berne Convention4 provides that «the author shall have the right to claim authorship of the work and to object to any distortion, mutilation or other modification of, or other derogatory action in relation to, the said work, which would be prejudicial to his honor or reputation» (Art 6bis).5 It is commonly understood that this language refers, on the one hand, to the right to attribution and, on the other, to the right to integrity of the work.6 Apart from this basic principle, the boundaries of the concept of moral rights are not uniform; other rights that national legislation may grant in addition to the Berne Convention’s minimum standard include the right to disclosure and withdrawal, or to prohibit the destruction of the work.7 European copyright law – which, for the most part, consists of directives that must be transposed into national law – does not, in principle, harmonize moral rights. Rather, several European directives confirm that moral rights are not within their respective scope.8 Accordingly, these directives do not oblige Member States to provide for moral rights, and reform projects to harmonize the European framework have been postponed.9 However, rules that might be called an attribution right can be found in connection with exceptions and limitations to exclusive rights.10
2.2. Protection of AI-generated content
At the outset, one should distinguish between the protection of the AI itself and the protection of the content produced by the AI,11 the latter being of interest for present purposes. There is no general definition of protectable subject-matter in EU copyright law; however, the applicable directives state that databases, computer programs and photographs qualify for copyright protection if they are «the author’s own intellectual creation».12 The ECJ also relies on this concept in more general contexts13 and emphasizes that protectable subject-matter reflects the author’s «personality», the author thereby stamping «the work created with his ‹personal touch›».14 Against this background, there is a strong argument that copyrightable works necessarily require the involvement of a human being,15 thus excluding content produced exclusively by an AI. While Member States are hence not obliged to award protection to AI-produced content, nothing prevents them from doing so. Yet, protection in this field is not widespread, the UK being a notable exception.
2.3. The protection of computer-generated works in the UK
The UK has two regimes insofar as the moral rights of authors are concerned. Traditionally, authors could only rely on rights derived from the common law, such as the tort of defamation or the tort of passing off, to assert their moral interests in their works. With the enactment of the Copyright, Designs and Patents Act 1988 («CDPA»),16 authors – a term defined in s.9 of the said Act – now have moral rights afforded to them by virtue of sections 77-89. The CDPA provides four key rights: (i) the right to be identified as author or director; (ii) the right to object to derogatory treatment of work; (iii) the right to prevent false attribution of work; and (iv) the right to privacy of certain photographs and films. The CDPA also provides for cases where the work is «computer-generated»; «computer-generated» is defined in s.178 as follows: «in relation to a work, means that the work is generated by computer in circumstances such that there is no human author of the work». Where a work is «computer-generated», s.9 provides that «the author shall be taken to be the person by whom the arrangements necessary for the creation of the work are undertaken», although this only applies to literary, dramatic, musical or artistic works. As for moral rights, the CDPA provides for exceptions: pursuant to s.79 and s.81, computer-generated works are afforded neither the right to be identified as author or director nor the right to object to derogatory treatment of work. As regards false attribution of work, the CDPA is silent on whether computer-generated works are excepted.
3. Moral rights in AI-created content
Whether the law should grant rights in AI-generated content has been discussed with respect to economic rights and has been studied on the basis of different dogmatic justifications; this also involves the question of who (or which entity) should be the holder of those rights.17 Usually, the AI itself, the creators of the AI and the users of the AI are mentioned as possible right holders.18 Essentially the same question can be raised in connection with moral rights. However, while economic rights are commonly understood to provide incentives for the production of creative content, it should not be overlooked that moral rights have a special objective, aiming at the protection of the right holder’s personality and her non-economic interests.19
3.1. AI as the owner of moral rights
The core issue in connection with ownership of moral rights by AI is whether the law should acknowledge a particular AI as a «person»20 that enjoys, through the grant of a special set of rights, protection relating to «personality» or «personhood». In fact, the European Parliament’s Committee on Legal Affairs called for a consideration of a specific legal status of «electronic persons».21 However, most of the discussion in this context centres on liability issues, which are, in our view, quite different from the non-economic interests of an author in her creations.22 In our opinion, machines do not strive, e.g., for the social recognition that could be gained from a right to attribution, nor do they have a personal connection to the produced content that could be harmed and that the right to integrity could protect, as the governing moral rights rationale presupposes. In fact, it can be doubted whether the interest- and incentive-based concept of copyright law suits machines in the first place.23 Thus, from today’s perspective, we believe that the concept of moral rights (and related concepts of «personhood») is not suited to AI.
3.2. Creators or users of AI as the owner of moral rights
The next consideration is whether humans – i.e., the creators or the users of the AI – should have moral rights in the content produced by the AI. Where the AI is merely a tool used by humans, this does not pose specific issues. However, the situation is different where the AI produces the content without human intervention and hence does the actual «creative» work. The question here is whether the creator or the user has a special non-economic connection to the content generated by the AI that copyright law should protect through the grant of moral rights. We do not believe that this is the case: moral rights are granted because the work represents «an extension of the author’s personhood»;24 the author’s personal traits are imprinted on the work by the intellectual process of its creation. Thus, in the absence of this creative process, the intimate connection between the creator or the user of the AI and the produced content that moral rights have traditionally sought to protect is lacking.25 This also seems to be the reason why moral rights are expressly excluded from the protection of computer-generated works in the UK.26 Yet, not only is there no special relation to the content that needs to be protected; conversely, it seems to us that the recognition and appreciation of authorship would be unjustified where no creative input is contributed by the creators or users of the AI. Against this background, we believe that, in principle – if AI-produced content should be awarded some form of protection at all – no moral rights should be granted in this content.27 Thus, similar to the solution found in the UK, a regulatory model could28 grant certain types of economic rights to the creators or users of the AI without moral rights; this would make the protection of AI-generated content akin to the field of related rights.29 It might also add flexibility, as ownership by legal persons or the (full) transfer of potential rights in AI-generated content would be more easily compatible with the droit d’auteur tradition.
4. Conclusion
Currently, no moral rights exist in AI-generated content under EU law and many national copyright acts. Whether this should change is connected to the question of whether the law should recognize that AI has some form of «personality» or non-economic interests; we believe that this is neither necessary nor desirable. Likewise, we do not believe that the creators or users of the AI should be afforded moral rights in AI-generated content, due to the lack of the personal relation to the content that moral rights typically protect.
- 1 In this contribution, «AI» refers to the ability of computer software to produce content autonomously.
- 2 See for some examples, e.g., Niebla Zatarain, The role of automated technology in the creation of copyright works: the challenges of artificial intelligence, International Review of Law, Computers and Technology 2017, p. 91 (p. 93-95); Schönberger, Deep Copyright: Up- and Downstream Questions Related to Artificial Intelligence (AI) and Machine Learning (ML), Intellectual Property Journal 2018, p. 35 (p. 40-41). See also Yu, Alibaba e-commerce merchants turn to AI for content creation, https://www.zdnet.com/article/alibaba-e-commerce-merchants-turn-to-ai-for-content-creation/ (accessed on 4 January 2019).
- 3 Rigamonti, Deconstructing Moral Rights, Harvard International Law Journal 2006, p. 353 (p. 357-358).
- 4 Berne Convention for the Protection of Literary and Artistic Works of September 9, 1886, completed at Paris on May 4, 1896, revised at Berlin on November 13, 1908, completed at Berne on March 20, 1914, revised at Rome on June 2, 1928, revised at Brussels on June 26, 1948, and revised at Stockholm on July 14, 1967.
- 5 A modified version of this provision is contained in the WIPO Performances and Phonograms Treaty (WPPT) art. 5(1).
- 6 Rigamonti, supra note 3, p. 358.
- 7 Ibid. at p. 362-367. Arguably, the right of disclosure can be derived from other provisions of the Berne Convention, see Ricketson/Ginsburg, International Copyright and Neighbouring Rights, vol. I, 2nd ed., Oxford University Press, Oxford 2006, p. 614.
- 8 Council Directive 93/83/EEC of 27 September 1993 on the coordination of certain rules concerning copyright and rights related to copyright applicable to satellite broadcasting and cable retransmission, O.J. L 1993/248, 15-21, recital 28; Directive 96/9/EC of the European Parliament and of the Council of 11 March 1996 on the legal protection of databases, O.J. L 1996/77, 20-28, recital 28; Directive 2001/29/EC of the European Parliament and of the Council of 22 May 2001 on the harmonisation of certain aspects of copyright and related rights in the information society, O.J. L 2001/167, 10-19, recital 19; Directive 2006/116/EC of the European Parliament and of the Council of 12 December 2006 on the term of protection of copyright and certain related rights, O.J. L 2006/372, 12-18, recital 20, art. 9.
- 9 Von Lewinski/Walter, Rights of authors. In: Von Lewinski/Walter (eds.), European Copyright Law, Oxford University Press, Oxford 2010, p. 1472 (p. 1473).
- 10 Ibid. at p. 1472; see, e.g., Directive 2001/29/EC art. 5(3)(a), (c), (d), (f); Directive 2012/28/EU of the European Parliament and of the Council of 25 October 2012 on certain permitted uses of orphan works, O.J. L 2012/299, 5-12, art. 6(3). See also art. 10, 10bis of the Berne Convention.
- 11 Hetmank/Lauber-Rönsberg, Künstliche Intelligenz – Herausforderungen für das Immaterialgüterrecht, GRUR 2018, p. 574 (p. 575) (discussing the protection of AI as a computer program and under patent law doctrines).
- 12 Directive 96/9/EC art. 3(1); Directive 2006/116/EC art. 6; Directive 2009/24/EC of the European Parliament and of the Council of 23 April 2009 on the legal protection of computer programs, O.J. L 2009/111, 16-22, art. 1(3).
- 13 ECJ 16 July 2009, C-5/08, at 37 (on the term «work» in Directive 2001/29/EC); see also ECJ 4 October 2011, C-403/08, at 97.
- 14 ECJ 1 December 2011, C-145/10, at 88, 92; see also ECJ 1 March 2012, C-604/10, at 38.
- 15 See, e.g., Ihalainen, Computer creativity: artificial intelligence and copyright, Journal of Intellectual Property Law & Practice 2018, p. 724 (p. 727); Niebla Zatarain, supra note 2, p. 97; see also Handig, The Copyright Term «Work» – European Harmonisation at an Unknown Level, International Review of Intellectual Property and Competition Law 2009, p. 665 (p. 672).
- 16 Copyright, Designs and Patents Act 1988, c. 48.
- 17 See for an insightful discussion Galajdová, Deadlock in Protection of Software Developed by AI. In: Schweighofer/Kummer/Saarenpää/Schafer (eds.), Data Protection/Legal Tech, Editions Weblaw, Bern 2018, p. 601.
- 18 See, e.g., Hristov, Artificial Intelligence and the Copyright Dilemma, IDEA 2017, p. 431 (p. 443) (mentioning programmers, owners and end users as possible human right holders).
- 19 It should be noted, however, that in some European countries economic and moral rights are considered as a unity («monistic theory», prevailing in, e.g., Austria and Germany – as opposed to the «dualistic theory» in, e.g., Belgium, France), see, e.g., Rigamonti, supra note 3, p. 360 («unity of moral and economic rights»); Walter, Österreichisches Urheberrecht I, Medien und Recht, Vienna 2008, p. 263.
- 20 See on this issue, e.g., Solaiman, Legal personality of robots, corporations, idols and chimpanzees: a quest for legitimacy, Artificial Intelligence and Law 2017, p. 155.
- 21 Committee on Legal Affairs, Report with recommendations to the Commission on Civil Law Rules on Robotics (2015/2103(INL)) Point 59(f). As far as can be seen, the Commission does not elaborate on this issue in its 2018 Communication on AI, see European Commission, Communication from the Commission to the European Parliament, The European Council, The Council, The European Economic and Social Committee and the Committee of the Regions, Artificial Intelligence for Europe, COM(2018) 237 final.
- 22 See, however, Committee on Legal Affairs, supra note 21, p. 28 («elaborate criteria for an ‹own intellectual creation› for copyrightable works produced by computers or robots»).
- 23 See, e.g., Samuelson, Allocating Ownership Rights in Computer-Generated Works, University of Pittsburgh Law Review 1986, p. 1185 (p. 1199); furthermore, it can be argued that allocating rights to machines would ultimately diminish the incentive for humans to invest in intellectual undertakings, see Schönberger, supra note 2, p. 46-47.
- 24 Rigamonti, supra note 3, p. 355-356.
- 25 See also Bridy, Coding Creativity: Copyright and the Artificially Intelligent Author, Stanford Technology Law Review 2012, p. 1 (p. 25) («radically mediated relationship to human authorship and creativity»).
- 26 HL Deb vol. 493 col 1305 25 February 1988 («Moral rights are closely concerned with the personal nature of creative effort, and the person by whom the arrangements necessary for the creation of a computer-generated work are undertaken will not himself have made any personal, creative effort.»).
- 27 Unjustified claims to authorship can probably be better dealt with by employing instruments offered by other legal fields, e.g., competition law.
- 28 Please note that we do not, for the present purposes, consider the question of whether or to what extent economic rights should be granted in AI-generated content. We would also like to emphasize that the discussion around attribution in the moral rights context is independent of the question of whether users or creators of AI should be liable for the content generated or the actions carried out by the AI.
- 29 However, moral rights can also be found in this field, see, e.g., WPPT art. 5(1).