Actors strike over pay and AI

Ius Laboris

Actors and screenwriters in the US are on strike, seeking better pay and protections around the use of AI in their industry.
 

Actors in the United States began strike action on 14 July in a dispute with studios over pay, residual payments for re-runs and the use of artificial intelligence in the creative industries, joining film and TV writers who have been on strike since 2 May.

In resisting actors’ and writers’ demands, studios cite major changes in consumer media spending habits, driven by the advent of streaming services and social media, which have significantly reduced the profitability of production companies. With intense competition between the many streaming services on the US market, these services have been losing money for their parent companies and look set to continue doing so for the foreseeable future. The effect of these losses on companies’ bottom lines has been reflected in their stock prices, which have lagged significantly behind the major US market indices.

While the unions involved in these strikes (SAG-AFTRA for actors and the WGA for writers) are based in the US, the effects of the walkouts are being felt globally, given the dominant position of the US film and television industries. In addition, SAG-AFTRA’s Global Rule One prohibits members from performing services for any employer in any jurisdiction where the union has a collective bargaining agreement in place. When the strike was announced, the stars of Christopher Nolan’s ‘Oppenheimer’, including Cillian Murphy, Florence Pugh, Matt Damon and Emily Blunt, left the film’s London premiere.

Equity, the UK actors’ and performing artists’ union, has expressed its full solidarity with the strikes. Its members are, however, prevented from taking action in sympathy with their US colleagues under the UK’s highly restrictive trade union laws. These limit lawful industrial action to circumstances connected to a dispute between workers and their own employers, and, controversially, impose stringent balloting requirements. Equity members did, however, hold a rally in Central London this past Saturday, echoing the concerns raised in the US.

The threat of AI to the acting profession

While the actors’ and writers’ demands on pay and conditions have much in common with many industrial disputes, their concerns over the use of artificial intelligence are novel. This reflects the recent development and rise to prominence of generative AI, including large language models (LLMs) that can replicate someone’s writing style and related generative systems that can replicate their voice and likeness. Actors and writers fear not only a loss of work across their industries due to AI, but also that their computer-generated likenesses could be used by studios or their assignees indefinitely, potentially long after they are dead, without further payment and without their further consent. There are also concerns that these likenesses could be used not just in AI-generated entertainment, but also in ways the actor might never have approved of: in advertising, in fraud, or in spreading political or other disinformation.

The particular risks to voice actors

In this article, we consider one subgroup of the acting profession, voice actors, as an example of the risks AI may pose, and we do so within a European context. Voice actors are not on strike in the US, but in many countries around the world they are worried about the impact AI could have on their future careers. Their work is currently used in commercial advertising, in dubbing films, in narrating documentaries, and in all manner of service announcements, for example on public transport.

Voice actors are no strangers to threats to their profession from technology and automation, but until now this has been less of a concern in the context of entertainment and cultural production. Many announcements of the kind made at airports, railway stations and on public transport are assembled from whole recorded phrases cut together with sections that are themselves spliced together from individually recorded words, or even phonemes, the basic sounds that make up words. Creating smooth announcements from fragments in this way can, however, be a labour-intensive task. Simone Hérault, the voice of the French national railways for more than 40 years, told an interviewer in 2021 that she continued to make new recordings for station announcements on a monthly basis.
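
To see why, the following is a minimal Python sketch, using only the standard library, of how individually recorded fragments might be stitched into a single announcement. The file names are purely illustrative, and a real system would also need to smooth the joins so the result does not sound choppy.

```python
import wave

# Hypothetical pre-recorded fragments; the file names are illustrative only.
FRAGMENTS = ["the_train_to.wav", "london_kings_cross.wav", "departs_at.wav", "ten_fifteen.wav"]

def splice(fragment_paths, output_path):
    """Concatenate pre-recorded WAV fragments into one announcement file."""
    params = None
    frames = []
    for path in fragment_paths:
        with wave.open(path, "rb") as clip:
            if params is None:
                # Sample rate, channel count and sample width must match
                # across all fragments for a naive concatenation to work.
                params = clip.getparams()
            frames.append(clip.readframes(clip.getnframes()))
    with wave.open(output_path, "wb") as out:
        out.setparams(params)
        for chunk in frames:
            out.writeframes(chunk)

splice(FRAGMENTS, "announcement.wav")
```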

While voice actors have long had the possibility of recording an extensive list of words, or a whole set of phonemes, and of signing away the rights to these recordings for all purposes and for all time, in the past they were at least likely to know what was being done with them. The original recordings for the American English voice of Apple’s Siri, for example, were made by a voice actor named Susan Bennett, speaking for four hours per day for the whole of July 2005. Today, actors are known to have been asked to record extensive lists of words and phonemes supposedly for ‘research purposes’, only to discover later that the recordings had been sold on to AI companies for use in machine learning and the generation of synthetic voices. AI-powered voice-cloning systems have no need of such extensive and deliberate recordings, and can be trained on relatively short recordings made for another purpose.

Is an actor’s voice protected in Europe by the GDPR?

Associations representing voice actors in Europe, the Americas and Turkey have recently combined to form United Voice Artists (UVA), which has published a series of demands that aim, principally, to influence the final stage of the legislative process expected to lead to the adoption of the EU’s proposed AI Act. UVA say that they are concerned about the ‘inherent risks, both legal and ethical, in the conception, training and marketing of AI generated content [and t]he need to adjust the protection of artists’ rights and GDPR rules, with the development of AI technologies in Europe.’

The idea of capturing the signature of someone’s unique voice, and then using that information to generate further content in a context detached from the one for which the initial recordings were made, raises difficult and untested legal questions. A person’s voice print, a mathematical representation of an individual’s distinctive voice pattern generated by analysing their speech, is only ambiguously protected under European data protection law. While voice information falls within the GDPR definition of biometric personal data, that regulation was drawn up to address issues arising in the context of voice identification of the kind used by telecoms companies, banks or government departments offering services over the phone. The legal issues involved in the use of someone’s voice for generative AI go well beyond those envisaged when the GDPR was adopted.
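
To make the idea of a voice print concrete, the following is a toy Python sketch, assuming the librosa and numpy libraries, in which a ‘voice print’ is nothing more than an averaged acoustic feature vector compared by cosine similarity. This is a deliberately simplistic illustration: real speaker-recognition and voice-cloning systems use far richer, typically neural, representations, and the file names here are hypothetical.

```python
import numpy as np
import librosa

def toy_voice_print(path, sr=16000, n_mfcc=20):
    """A crude 'voice print': the average MFCC feature vector over a recording."""
    audio, sr = librosa.load(path, sr=sr)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=n_mfcc)  # shape: (n_mfcc, frames)
    return mfcc.mean(axis=1)

def similarity(a, b):
    """Cosine similarity between two voice prints (closer to 1.0 = more alike)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical recordings of the same actor made in two different contexts.
score = similarity(toy_voice_print("studio_take.wav"), toy_voice_print("archive_clip.wav"))
print(f"similarity: {score:.2f}")
```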

The voice print created by training a generative AI system on a particular actor’s voice would not be made for the purpose of identifying the actor, but rather in order to create further content in that known actor’s voice. This means that the higher level of protection afforded to ‘special category’ data under the GDPR would not apply, and that the data could be processed under ordinary rules based on contract or consent. Similarly, the proposed AI Act’s prohibitions and restrictions on the use of AI systems to analyse biometric data in order to recognise individuals, or to assess their health, emotional or intentional states, would not restrict the creation of content using AI trained on new material that an actor recorded for a studio under a contract, or on archival material that a studio already owned. As things stand, the use of such recordings will be governed by ordinary contract and intellectual property law, which have not yet had to grapple with the questions at hand.

UVA has stated a number of demands which they hope will protect their occupation and go some way towards safeguarding the quality and value of their contribution to culture and cultural production. They argue for civil liability for the creators of AI systems and for those who provide ‘downstream applications’, broadly reflecting the approach recently approved by the European Parliament in its consideration of the draft AI Act. They also call on the EU to impose more stringent labelling requirements that would clearly and audibly identify materials produced by AI.

Their most radical proposal, however, is for a ‘moratorium on the use of voice synthesisation and cloning techniques with generative AI […] until there is clear regulation protecting the rights of all voice professionals, securing the continuity of their cultural role’. At this late stage in the EU legislative process, however, UVA is unlikely to be able to influence the final AI Act text without the strong support of the government of at least one EU member state.

Why employers need to be aware of what’s happening

Actors and screenwriters, just like the doctors and university lecturers who have recently taken strike action in the UK, may not fit the stereotypes associated with striking workers and labour disputes. For one thing, they are normally self-employed and rarely take action in the kind of unison currently being seen in the US. These disputes nevertheless provide a first set of high-profile examples of what may lie ahead across a range of industries, especially those facing major changes driven by new technology, and employers should therefore take note. Media companies have yet to find a stable and profitable business model given the disruption caused by streaming services, and major changes in this and other sectors are inevitable.

But none of this means that the workers whose creativity and labour are the source of company profits should pay a disproportionate share of the price. Some kind of new settlement will therefore need to be found. No doubt efforts to find the right accommodation for both businesses and the individuals whose personal characteristics are at stake will be the subject of intense negotiation for some time to come.

