Valve recently announced that Steam will allow games made with assistance from generative AI tools so long as it’s disclosed. If game developers tell the truth, we ought to be seeing a lot of disclosures in the near future.
In a survey of over 3,000 game developers from the organizers of the Game Developers Conference, 31% of respondents said that they personally use generative AI in their work, and 18% said that they don’t personally use it, but colleagues in their workplace do. That means 49% of respondents work at studios where generative AI is being used in some fashion, although mostly not in ways players will directly perceive, judging by the other survey questions.
Generative AI tools are most controversial when used to generate artwork, writing, and voices that players directly experience. For example, upcoming Square Enix game Foamstars has been heavily criticized for using Midjourney to generate what the developer characterized as “0.01%” of its assets. Free-to-play shooter The Finals also recently angered voice actors and sympathetic gamers by using AI to generate voice lines.
Not all generative AI use is visible or audible, though. AI can be used to generate code snippets with tools like ChatGPT and GitHub Copilot, for instance. “The bulk of respondents were interested in coding assistance and speeding up the content creation process,” according to the survey organizers. “Developers were also intrigued by the idea of using AI to automate repetitive tasks.”
Large language models like ChatGPT are additionally used (at the risk of getting bad or uncredited information) as general research and writing assistants that can, as a few examples, generate marketing copy, summarize a transcription of a meeting, or solve math problems. As it turns out, finance, marketing, PR, production, and management are where most of the games industry’s generative AI use is occurring, with narrative, art, audio, and QA departments the least likely to use it, according to the survey.
(Image credit: GDC)
Generative AI use isn’t always sanctioned by the employer, either. Anecdotally, I’ve heard from one Silicon Valley tech worker that they use ChatGPT to help them code without their supervisor’s knowledge. We’ve also seen instances of companies being surprised to learn that AI systems, like the Generative Fill tool in Photoshop, were used to create artwork they published. Wizards of the Coast recently had to admit that a Magic: The Gathering marketing image it initially said was human-made did in fact include AI-generated elements.
Only 12% of respondents said their company has a policy forbidding generative AI use, while 7% said that some tools are allowed where they work and others aren’t, and 2% said they’re required to use AI. The majority said either that their workplace has no policy or that use of generative AI tools is optional.
Despite the apparently widespread use of generative AI in the games business, developers did say they’re concerned about the ethical problems it presents: 42% of respondents said they are “very concerned” about the ethics of generative AI, another 42% said they are “somewhat concerned,” and just 12% said they aren’t concerned at all.
“I think completely replacing someone’s job is a genuine concern,” said one respondent. “It should be used to enhance capabilities, not reduce the workforce.”
“It’s theft, plain and simple,” said another, “and since it’s ‘fancy and high tech’ no one seems to care about copyright or ethics. Public shaming hasn’t seemed to work, so we need actual regulation.”
Meanwhile, interest in another controversial technology is on the decline: In this year’s GDC survey, 27% fewer developers than last year said they were “somewhat” or “very” interested in NFTs and cryptocurrency.