
Video game actors and acrobats want the “essence” of their work to be protected from AI

For hours, motion-capture sensors attached to Noshir Dalal’s body tracked his movements as he unleashed aerial attacks, overhead slams and one-handed strikes that would later appear in a video game. He swung the sledgehammer he was holding so many times that he snapped a tendon in his forearm. By the end of the day, he couldn’t pull open his car door.

The physical effort this kind of movement work demands, and the hours it takes, are part of the reason he believes all video game performers should be equally protected from unregulated use of artificial intelligence.

Video game performers say they fear AI could reduce or eliminate job opportunities because the technology could be used to replicate one performance into any number of other movements without their consent. That concern led the Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA) to go on strike in late July.

“If motion capture actors, video game actors in general, are just making that day’s money … that can be a really slippery slope,” said Dalal, who played Bode Akuna in “Star Wars Jedi: Survivor.” “Instead of saying, ‘Hey, we’re going to bring you back,’ … They’re just not going to bring me back at all and they’re not going to tell me at all that they’re doing this. That’s why transparency and compensation are so important to us in AI protection.”

Hollywood’s video game performers announced a work stoppage, their second in a decade, after more than 18 months of negotiations over a new interactive media agreement with video game industry giants broke down over artificial intelligence protections. Union members have said they are not against AI. Performers are concerned, however, that the technology could give studios a means to displace them.

Dalal said he took it personally when he heard that video game companies negotiating with SAG-AFTRA over the new contract wanted to treat some motion work as “data” rather than performance.

If players were to tally up the cutscenes they see in a game and compare them to the hours they spend controlling characters and interacting with non-player characters, they would find that they interact with the work of motion actors and stunt artists “a lot more than you interact with my work,” Dalal said.

“They’re the ones that sell the world that these games live in, when you’re doing combos and doing crazy, super-cool moves using Force powers, or you’re playing Master Chief, or you’re Spider-Man swinging through the city,” he said.

Some actors argue that AI could rob less experienced actors of the chance to land smaller roles, such as non-player characters, where they typically build their experience before landing bigger jobs. Unchecked use of AI, performers say, could also lead to ethical issues if their voices or likenesses are used to create content they don’t morally agree with. That kind of ethical dilemma has recently cropped up with game mods, in which fans alter games and create new content. Last year, voice actors spoke out against a mod for the role-playing game “Skyrim” that used AI to generate performances and clone actors’ voices for pornographic content.

In video game motion capture, actors wear special lycra or neoprene suits with markers. In addition to more complex interactions, actors perform basic movements like walking, running, or holding an object. Animators take those motion capture recordings and string them together to respond to what someone using the game is doing.

“What AI is allowing game developers to do, or game studios to do, is generate a lot of those animations automatically from past footage,” said Brian Smith, an assistant professor in the Department of Computer Science at Columbia University. “Studios no longer need to collect new footage for every game and every type of animation they’d like to create. They can also draw on their archive of past animations.”

If a studio has motion capture from a previous game and wants to create a new character, he said, animators could use those stored recordings as training data.

“With generative AI, you can generate new data based on that pattern of previous data,” he said.
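To make Smith’s point concrete, here is a minimal, purely illustrative sketch, in Python, of what “generating new data based on that pattern of previous data” can look like: a toy linear predictor fitted to an archive of old motion-capture frames and rolled forward to synthesize a new clip. The array shapes, the one-step linear model and every variable name below are assumptions made for illustration; real studio pipelines rely on far more sophisticated generative models.

```python
# Illustrative sketch only: "new" animation sampled from the patterns in
# archived motion-capture data, with no new recording session.
import numpy as np

rng = np.random.default_rng(0)

# Pretend archive: 50 clips, each 120 frames of 3 joints x 3 coordinates,
# flattened to 9 numbers per frame (real rigs track dozens of joints).
archive = rng.normal(size=(50, 120, 9)).cumsum(axis=1) * 0.01

# Fit a one-step predictor on the archive: next_frame ~= current_frame @ W.
X = archive[:, :-1, :].reshape(-1, 9)   # frames at time t
Y = archive[:, 1:, :].reshape(-1, 9)    # frames at time t + 1
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Generate a new 120-frame clip by rolling the predictor forward from a
# starting pose taken from the archive, with a little noise each step.
frame = archive[0, 0]
generated = [frame]
for _ in range(119):
    frame = frame @ W + rng.normal(scale=0.005, size=9)
    generated.append(frame)
generated = np.stack(generated)

print(generated.shape)  # (120, 9): a synthetic clip derived only from old recordings
```

Even this toy version shows why performers are uneasy: the “new” clip is produced entirely from old recordings, without a new session, or a new session fee, for the person who originally performed them.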

A spokeswoman for the video game makers, Audrey Cooling, said the studios had offered “significant” AI protections, but the SAG-AFTRA negotiating committee said the studios’ definition of who counts as a “performer” is key to understanding who would actually be protected.

“We have worked hard to put forward proposals with reasonable terms that protect the rights of performers while ensuring we can continue to use the most advanced technology to create a great gaming experience for fans,” Cooling said. “We have proposed terms that provide consent and fair compensation for anyone employed under the (contract) if an AI reproduction or digital replica of their performance is used in the games.”

The game companies offered pay increases, she said, with an initial 7% increase in scale rates and a further 7.64% increase starting in November, a 14.5% raise over the life of the contract. The studios also agreed to increases in travel expenses, pay for overnight travel, and higher overtime rates and bonuses, she added.

“Our goal is to reach an agreement with the union that will end this strike,” Cooling said.

A 2023 report on the global gaming market from industry specialist Newzoo predicted that video games would begin to include more AI-generated voices, similar to the voice acting in Squanch Games’ “High on Life.” Game developers, the Amsterdam-based firm said, will use AI to produce unique voices, bypassing the need to seek out voice actors.

“Voice actors may see fewer opportunities in the future, especially as game developers use AI to reduce costs and development time,” the report said, noting that “big AAA prestige games like ‘The Last of Us’ and ‘God of War’ use motion capture and voice acting in a similar way to Hollywood.”

Other games, like “Cyberpunk 2077,” have cast celebrity actors.

Actor Ben Prendergast said the data points collected for motion capture don’t capture the “essence” of someone’s performance as an actor. The same is true, he said, of AI-generated voices, which can’t deliver the nuanced choices that go into big scenes or the more grueling efforts, like screaming for 20 seconds to portray a character’s death by fire.

“The big problem is that someone, somewhere, has this massive data, and now I have no control over it,” said Prendergast, who voices Fuse in the “Apex Legends” game. “Nefarious or not, someone can collect that data now and say, we need a character who is nine feet tall, sounds like Ben Prendergast and can fight in this battle scene. And I have no idea that that’s going to happen until the game comes out.”

Studios could “get away with it,” he said, unless SAG-AFTRA can secure the AI protections it is fighting for.

“It reminds me a lot of sampling in the 80s, 90s and 2000s, where there were a lot of people who were into sampling classic songs,” he said. “This is an art. If you don’t protect the rights to their image, or their voice or their body and the way they walk now, then you can’t really protect human beings from other endeavors.”
