The UK arts and entertainment union Equity has announced a set of “best practice guidelines” for video game developers who hire voice actors. These include some proposed minimum wages designed to address the issue of “systemic underpayment” for artists. Other measures are designed to improve working conditions for voice actors and prevent companies from using their voices and likenesses as fuel for generative AI tools without their permission. It’s both a commendable effort and an interesting breakdown of the voice actor trade.
“The video games industry is a dynamic place to work, but unethical practices undermine the profession,” Equity explains on its official website. “Despite games being a multi-billion dollar industry with tax breaks of around £200m, UK artist pay has stagnated. Artists do not have the protections they need in the unregulated world of AI, abuse of confidentiality agreements is rife and health and safety is often lacking.”
Best practice guides include requirements on fair compensation, the need to obtain voice actor consent when creating digital replicas or “training” AI software, provisions for safer workplaces (including guidance on intimacy coordinators and vocal strain) and “clear, legal, enforceable, fair, proportionate and targeted” confidentiality agreements.
On the subject of NDAs, Equity argues: “Many of the NDA agreements we see now reflect poorly on the industry, often intimidating and isolating Artists from those who support them. Artists are forced to sign documents with no hope of understanding or remedying them.
“As a result, Artists often assume they will be sued if they say anything to anyone about the production, even if they have been the victim of or witnessed a crime,” the page continues. “This is all the more shocking five years after the Harvey Weinstein scandal highlighted the egregious abuse of confidentiality agreements, rocked the film industry and ignited the #MeToo movement. We would now expect Engagers to follow best practices and not repeat the mistakes of the past.”
Meanwhile, the recommended minimum rates offer some nuggets of insight into how certain types of voiceover and motion capture work are categorised in video game projects. For example, they distinguish between voice lines that are part of the main plot and “atmospheric” or “world-building” sounds — “more than 300 written/recorded lines that do not advance the story.”
There are also “Walla sounds”, defined as “unscripted sounds that are not assigned to a specific character and do not advance the story”. Examples of walla sounds include crowd chatter and creature noises – the term apparently dates back to the early days of radio broadcasting, when some shows found that groups repeating “walla walla” created an effective impression of organic background noise.
I love hearing about these kinds of production subtleties. Since learning the term, I've found myself unexpectedly flooded with memories of walla voiceovers. There's a backstory to the anomalous battlefield hum you hear during loading screens in Total War: Warhammer 3, for example. It's been stuck in my head for a while, partly because my review PC at the time was a sclerotic antique that would sometimes take five minutes to load a battle.
As Eurogamer’s Ed Nightingale reminded us, Equity’s best practice guidelines came amid strike action by US actors’ union SAG-AFTRA. In July, SAG-AFTRA spokespeople noted that “AI protections remain an issue” in the strike, following disagreements between the union and its members over agreements with individual companies that use the technology. “Eighteen months of negotiations have shown us that our employers are clearly interested in exploitation, not fair, reasonable AI protections,” Sarah Elmaleh, chair of the Interactive Media Deal Negotiation Committee, commented at the time.
As Equity notes in its new guidelines, “atmospheric” and “walla” vocalizations are “one of the parts of our industry most threatened by AI,” likely because the strangeness of AI-generated speech is harder to spot when buried in a crowd. While not participating in the SAG-AFTRA strike for legal reasons, Equity has plenty of advice and requests for developers considering using generative AI.
Among other things, they recommend that companies confirm upfront that “data recorded within the specified performance sessions will only be used for the specified project and will not be reused in future titles” and that “an upfront purchase/integration fee will be required if a developer, studio or publisher wishes to retain performance data in their own 'library' for potential reuse in future projects.”
If you find all this interesting, I suggest reading the full text on the Equity site — I've only scratched the surface here. You might also be interested in Ken Levine’s thoughts on “turn-based dialogue”. If you’d like to learn more about the far-reaching implications of generative AI technology, Mike Cook wrote a series of essays on the subject earlier this year.