How AI protections in new SAG-AFTRA agreement affect Canadian actors

By Emma Chapple

Law360 Canada (May 9, 2024, 10:02 AM EDT) --
“What’s the difference between theft and research?” went the old joke back in university. Answer: volume!

Steal one idea, it’s called theft. Steal a lot, it’s called research.

So far, artificial intelligence (AI) has straddled that same line. In entertainment, AI can generate new content by aggregating precedent and earlier works: what the AI platforms call creating "in the style of ..."

In the shadow of last year’s Screen Actors Guild and the American Federation of Television and Radio Artists (SAG-AFTRA) strike, in which the use of AI-generated likenesses to replace live actors was a sticking point in negotiations with the Alliance of Motion Picture and Television Producers (AMPTP), AI promises to be a hot-button issue as the Alliance of Canadian Cinema, Television and Radio Artists (ACTRA) begins negotiations this year.


The long-term effect of AI on the entertainment industry has yet to be played out, but the red flags are flapping briskly in the wind. It’s a seminal time to be an entertainment lawyer in Toronto, where many of Canada’s creatives produce their content.

When it comes to actors, more high-profile productions are using more sophisticated replicas of performers. The new SAG-AFTRA agreement can give us an idea of how the future of AI might play out for actors, and how the upcoming ACTRA negotiations are affected by this new technology.

There is a lot at stake: How far can production companies push AI to, um, “borrow” someone’s likeness, behaviours, mannerisms, voice and so on? What about artists who are deceased? Do artists own their intellectual property forever? Who must be notified if a production company wants to use AI-generated images of any particular actor? And how will all this translate into fees and residuals? What about inappropriate uses of someone’s likeness or a deepfake? And that’s just a partial list.

Pre-strike use of AI in film

Across the border, actors fought hard for protections governing the use of artificial intelligence in last year’s SAG-AFTRA strike. Prior to the new agreement, the potential future of AI played out in two 2023 Hollywood productions.

Disney’s straight-to-television film Prom Pact made little impact beyond its target audience of tweens and teens. That is, until a savvy social media user pointed out that the extras in a crowd scene were clearly not human. Producers employed generative artificial intelligence to scan human background actors and use their images throughout the film. The actors were reportedly neither consulted nor compensated for the continued use of their images. Aggregated human wallpaper, for free.

Warner Bros.’ The Flash, based on the DC Comics character, featured digital recreations (or “deepfakes”) of several actors from past DC properties, including deceased actors George Reeves, Adam West and Christopher Reeve. Christopher Reeve’s children would later state that they were not consulted about the use of their father’s image in the film.

This use of AI by Disney and Warner Bros. was perhaps cavalier but permissible. AI was a legal grey zone; it wasn’t until SAG-AFTRA ratified its new agreement with the AMPTP that performers had clarity on the use of AI.

New SAG-AFTRA agreement: What now?

The new SAG-AFTRA agreement differentiates between types of “digital replicas.” “Employment-based digital replicas” — that is, deepfakes created in the course of a production with the physical participation of the performer — are generally entitled to compensation and residuals.

An “independently created digital replica” is more akin to Christopher Reeve’s appearance in The Flash: a deepfake created using existing footage to place a performer in a new property. Compensation and residuals for an independently created digital replica are “freely bargained.”

Productions seeking to use “background actor digital replicas” — like the Prom Pact extras — must generally give notice to performers at least 48 hours in advance that a digital replica will be used, with some exceptions. Guidelines state that background actor replicas cannot be used to meet daily background actor counts or to avoid engagement of background actors. Background actors will be paid day rates for time spent to create the replica, and a principal’s rate if the digital replica is used as a principal character.

In all cases, the performer’s consent is required for use of a digital replica. In the case of a deceased performer, consent must come from a representative of the deceased’s estate or, if no representative can be located, from a representative of SAG-AFTRA.

What does this mean for ACTRA?

As ACTRA and the Canadian Media Producers Association head to the bargaining table, we can expect the union to seek protections similar to those won by SAG-AFTRA. ACTRA leadership has stated that the issues facing actors stateside are the same as those faced by actors in Canada.

Of particular note are the rules regarding the use of deepfake background actors. When Hollywood productions head to Canada for on-location filming, it means opportunities for ACTRA background performers. Without clear guidelines governing the use of digital extras in Canada, a large swath of ACTRA performers is left in limbo.

As technology grows ever more sophisticated, we can expect more film and television producers to employ AI and deepfakes. Even with the new SAG-AFTRA agreement in place, the discussion over the use of AI continues. Studios have agreed to meet with union representatives every six months to discuss new developments in technology.

Across the border and here at home, it will take co-operation and negotiation between producers and performers to ensure these tools are used fairly and with adequate protections for performers. It’s a great time to be an entertainment lawyer.

Emma Chapple practises at the confluence of Entertainment Law, Business Law and Civil/Commercial Litigation at Massey LLP.

 The opinions expressed are those of the author(s) and do not necessarily reflect the views of the author’s firm, its clients, LexisNexis Canada, Law360 Canada or any of its or their respective affiliates. This article is for general information purposes and is not intended to be and should not be taken as legal advice.

Interested in writing for us? To learn more about how you can add your voice to Law360 Canada, contact Analysis Editor Richard Skinulis at Richard.Skinulis@lexisnexis.ca or call 437-828-6772.