
To the disappointment of nearly everyone, generative AI is being forced into every aspect of game development. Not satisfied with just letting it infiltrate the creation process, some companies are now pushing to hand over the controller to AI and let it play the games too.
This week, high-end peripheral maker Razer unveiled an “enhanced” version of its Project Ava at the 2026 Consumer Electronics Show. Ava is an AI system designed to analyze gameplay footage and provide real-time tips while you play. The project itself isn’t new, but its latest version is a standalone device that displays a cat-eared anime girl confined in a glass tube, standing idle while Ava mechanically dispenses advice.
Initial reactions to the new Ava model are predictably poor. Reports from various tech publications note that Ava fails to answer basic questions about the games it’s meant to analyze, offers clearly wrong commentary, frequently veers off on irrelevant tangents, and makes writers uncomfortable with its flirtatious demeanor. It also runs on a chatbot model that is primarily being used to generate nonconsensual sexual content of unsuspecting women and girls on X.com.
As with so many other generative AI applications, Project Ava appears to fall short of its promises. But whether it works is less important than a more fundamental question: why does this product exist?
Coinciding with Razer’s AI catgirl reveal, a patent for Sony’s latest AI gaming technology also surfaced. Filed in 2024, the patent reportedly describes a system that would use AI to provide varying levels of assistance to players struggling with difficult game sections. Features would range from a ghostly avatar that guides players toward objectives to the AI taking control and completing challenging parts itself. Like Ava, even if it functioned perfectly, this technology seems to fundamentally misunderstand the purpose of gaming.

The most charitable interpretation of Sony’s patent is that it could serve as an accessibility tool to help players overcome sections they’re physically unable to complete, or prevent difficult areas from blocking progress entirely. While better difficulty options and accessibility features are important, these are best implemented by developers who understand their own games, not by external parties imposing solutions.
Both Razer and Sony are attempting to fix issues that aren’t actually problems. Difficulty can be frustrating, but it isn’t a flaw to be corrected. When a game feels too hard to beat, it usually means you need to adjust your approach or simply practice more. Either way, the answer is to reconsider your strategy, try a different tactic, and attempt the challenge again. This learning process is central to what it means to play a game, and if one title doesn’t suit you, countless alternatives exist.

While a self-playing game sounds ridiculous, Project Ava’s “helpful” hints are actually more problematic, even apart from the AI model’s fundamental issues. Playing an objective-driven game – the kind Razer chose for its CES demonstration – is about trial and error. Understanding why your approach fails is as crucial as eventually succeeding, and jumping straight to the answer reduces games to efficiency problems rather than artistic experiences.
There’s plenty of guidance available if you go looking for it. Numerous players are happy to share strategies online, and that exchange of questions and answers is essential to gaming’s social nature. Project Ava assumes that asking a chatbot is better than pausing to consult other people, but this completely misses the mark. Any aspect of a game that prompts reflection or fosters community is valuable, not an inconvenience to be eliminated by technology.