Big Tech openly wants to manipulate us with AI. That seems bad to me

My complaints about the generative AI bubble, and the tech industry in general, can often be reduced to the question ‘What are we even doing here?’

Case in point: one of the things Google thinks will save the games industry is the ability to predict when someone is about to stop playing a game so that they can be manipulated, in ways invisible to them, into playing more. To which I have to wonder: What are we even doing here?

The concept was mentioned—not quite in those terms—by Google Cloud gaming exec Jack Buser back at the Game Developers Conference in March. His talk broadly promoted generative AI adoption, hopping across everyday game dev problems AI is being used for: tagging assets, debugging, detecting cheaters. I don’t have an especially visceral reaction to those uses for AI, but my stomach churned at the mention of “hyperpersonalization.”

AI is great! Here’s how we’re going to use it against you

AI boosters have been kicking the “hyperpersonalization” term around for a few years now. Microsoft said in a 2024 blog post that the outcome of hyperpersonalized AI marketing should be that “customers’ needs are anticipated before they even ask.”

When it comes to games, Buser is excited by the idea of generative AI models that invisibly adjust a player’s experience so that marketing “feels like it’s part of the gameplay experience.”

“AI is doing this so well now that it can predict churn before it even happens in extremely robust ways,” Buser said, “and it can adjust gameplay as well as offers that you put in front of your players in near real time.”
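Buser doesn't describe the models involved, but the pipeline he's pitching is easy to sketch: score each player's churn risk from their recent behavior, then branch what they see on that score. Here's a minimal illustration in Python; the feature names and hand-picked weights are my own inventions, standing in for whatever Google's models would actually learn from player telemetry:

```python
import math

def churn_score(days_since_last_session, avg_session_minutes, sessions_last_week):
    """Hypothetical churn-risk score in [0, 1] from per-player features.

    Weights are illustrative; a real system would fit them to logged data.
    """
    z = (0.6 * days_since_last_session      # longer absences raise risk
         - 0.03 * avg_session_minutes       # longer sessions lower risk
         - 0.4 * sessions_last_week         # frequent play lowers risk
         - 0.5)                             # bias term
    return 1 / (1 + math.exp(-z))           # logistic squash to a probability-like score

def offer_for(score, threshold=0.5):
    """The 'hyperpersonalization' step: swap the storefront based on predicted churn."""
    return "discounted bundle" if score > threshold else "standard storefront"
```

A player who hasn't logged in for ten days scores high and gets the retention offer; a daily player scores low and sees nothing unusual. The point of the sketch is how little machinery the idea requires: the objectionable part isn't the math, it's that the adjustment happens invisibly.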

If you’re building software that makes Minority Report-style predictions about the likelihood someone will stop playing your videogame so that you can then manipulate them into playing and spending more, I think you should go sit by a body of water, listen to the birds, and ask yourself what the hell you’re even doing anymore.

If it’s truly impossible to make money selling games without using machine learning to build psychological prisons—like in that one Star Trek: TNG episode where everyone got addicted to willing discs into tubes—then there’s no games industry to save. There are just software companies competing to design the best digital nicotine.

We might already describe parts of the industry that way—loot boxes, daily quests, and other tricks all predate AI—but certainly not all of it. Big and small developers are still making games that, whether or not they pay the bills, exist because their creators genuinely want to enrich our lives.

“Many people make games for money, but we make money for games,” one triple-A studio founder recently told us. Balatro, despite being a certified time-devourer and one of the best-selling PC games in recent memory, came from a developer who dislikes gambling and says microtransactions make him “want to put [his] computer in the dishwasher.” (He hates AI art, too.)

But others, especially in Silicon Valley, have clearly decided that making numbers go up is all that matters, and expect us to be thankful that we’ve been deemed suitable targets for their next number extraction endeavor (or propaganda project). They tell us openly that their goal is to make more money while employing fewer people, and then act surprised when we don’t applaud them for it.

Perhaps one day they’ll be able to do away with the abstract idea of ‘making a good videogame’ altogether, designing hits purely by using AI to respond to the dictates of the numbers. They might end up with something like this self-propelled slot machine.
