Artificial Intelligence is revolutionising the media and entertainment industry, offering unprecedented capabilities in content creation, curation, and personalisation. Among these advancements, Sora stands out as a prime example of AI's transformative power in new media. With such developments, it is crucial to understand the limitations and legal implications that come with AI.
What is Sora?
Sora is an AI model, currently unreleased to the public, developed by OpenAI, which creates videos up to one minute long from text instructions. Sora is capable of generating content, recommending personalised media experiences, and even creating deepfake videos. Sora's capabilities extend beyond simple automation, employing sophisticated algorithms to produce engaging and highly tailored content for users.
AI applications in media are diverse and growing, and these trends underscore the need for a comprehensive understanding of the legal frameworks that govern AI in media.
Intellectual Property Rights of AI
One of the most pressing questions in the realm of AI is the issue of authorship and ownership. When Sora creates a piece of content, who owns it? Current legal frameworks often struggle to address this question, as traditional IP laws are designed with human creators in mind. The debate continues whether the developer, the user, or another entity holds the rights to AI-generated works.
Despite the ambiguities, there are strategies to protect AI-generated content. Registering works under the developer’s or user’s name, and clearly defining ownership in contracts, can provide a level of protection. However, enforcing these rights can be challenging, especially across different jurisdictions.
In the UK, the IPO announced in June 2023 that it was developing a code of practice for copyright and AI. Although the King's Speech in November 2023 contained no plan for legislation directed at AI, in April 2024 the House of Commons Culture, Media, and Sport Committee published a report recommending that content creators' consent be obtained, and that they receive fair compensation, for the use of their work by AI developers.
Under EU law, a work is eligible for copyright if it is original in the sense of being the author's own intellectual creation, which must be capable of being proven. UK case law originally required, in addition, a certain amount of time, skill, and labour. In the UK, the Copyright, Designs and Patents Act 1988 provides for literary, dramatic, musical, and artistic works that are 'computer-generated' and provides that 'the author shall be taken to be the person by whom the arrangements necessary for the creation of the work are undertaken'.
Prompt-based generative AI differs in character from the computer-generated works contemplated by that provision, and no case law has considered how computer-generated work would satisfy the requirement under EU case law that a work be the author's 'own intellectual creation'. Furthermore, the relevant platform's terms of use must be considered. At the time of writing, the terms of service for ChatGPT, for example, purport to assign any copyright in the output to the user, but these terms may vary from platform to platform and may change over time.
Recent Case Studies
Several companies have successfully integrated AI like Sora into their media strategies. These companies often employ legal strategies to protect their AI initiatives and mitigate risks.
In June 2024, Toys "R" Us released the first ad created with Sora, which proved highly controversial among critics. The ad exposes the current limitations of AI-generated video through subtle errors: there are inconsistencies throughout, including in the rendering of the main character, who appears slightly different in each scene and moves in a rather unnatural manner. Furthermore, some argue that the rise of AI in commercial use signals a decline in human creativity.
Amazon also recently announced a new Winnie-the-Pooh series created using AI for efficiency. However, similar errors appear in the images used to promote the release. The use of generative AI to write and illustrate children's books has become increasingly common, accompanied by a rise in backlash against the use of AI in content directed at children and young audiences.
Examining legal disputes involving AI can provide valuable insights. For example, cases involving deepfake technology highlight the need for clear legal standards and robust enforcement mechanisms. Businesses can learn from these disputes to better navigate the legal landscape.
The integration of AI in new media, exemplified by systems like Sora, brings significant legal implications, and understanding and addressing these challenges is crucial. As AI continues to evolve, so too must our legal frameworks and strategies. Staying informed and proactive in navigating these issues will be key to leveraging AI's full potential in the media industry.
Angelina Hong 2024