The Well Is Poisoned
A weekend of AI accusations and generated art chaos has spotlighted exactly how widespread the problem of junk creation is in games - and developers are worried it's going to get worse.
We are in an era of Witch Hunts, one where audiences don’t trust a word or an image coming from any company - and they’re being proven increasingly right to do so.
Generative “AI”, built on large language models and image generators, is the next big technological thing, and like blockchain and crypto before it - every executive is rubbing their hands with glee and looking for ways to make more money from it. The problem now is that the tech is so entrenched that even when its use isn’t intentional - it can’t be avoided.
Today we’re going to take you through the events of the past few weeks - events that cover almost every way this tech can go wrong - show you how stupid this all is - and then detail the unfortunate reality that this is going to be a threat to both the marketing and development of games for a very long time yet.
The Status Quo
- We’re talking about a bunch of specific examples from the last few weeks - but first a tone setter.
- Around the holiday break, Xbox’s Indie division ID@Xbox tweeted the following image:
- It’s got that Holiday card feeling - but then you look at it for more than a second and realise the many inconsistencies.
- It’s not even a good AI generated image.
- But it’s one that’s being used to promote Indie Developers on Xbox - and got called out for it.
- So Xbox deleted the tweet.
- We don’t know the details of how this was made - maybe it’s an internal Xbox creation (remember that Microsoft just announced the first universal keyboard update in 30 years to add an AI assistant key), or maybe it was a contracted company that overstepped the mark.
- Regardless, it’s a mark against the Xbox brand with the very audience and partners they were ostensibly trying to support with this marketing.
- But that’s our starting point.
- Companies are using this tech to sell audiences on games - that’s not up for debate.
The Weekend of AI Hell
- Moving on to our weekend then!
- Let’s start with the most obviously stupid and ridiculous example because at least it’s fun.
Wacom
- Wacom make drawing tablets!
- Can you imagine a company more dependent on the goodwill of artists and creatives?
- So they released a bunch of New Year’s marketing that featured dragons (standard), except that the artwork had clearly been generated by a model and then used for the ad.
- Turns out that an audience almost entirely composed of artists might be able to pick up on that.
- And they did! Including a full, multiple-minute breakdown from illustrators like Megan Rose Ruiz, among others.
- So many artists announced they’d be switching brands - and the tweets were eventually deleted.
- That response will become a theme.
- No other acknowledgement has been made of the move - or even the outrage against it.
- In the Wacom example - everything is very clear.
- It’s one image in one set of tweets that was easy to call out, because it was obviously manipulated by generative tech and looked awful.
- Sometimes it’s not that simple though.
Apex Legends
- Apex Legends just announced its FF7 Rebirth crossover.
- The game’s first Fortnite style merch event
- It’s got some interesting mechanical stuff in it - giving players a Buster Sword and a completely altered limited time mode as well as a bunch of cool skins.
- It’s also got a limited release cosmetic death box skin that you only get after buying $330 worth of event lootboxes.
- But the controversy we’re here for today is around the trailer above.
- At the 1:04 mark, the various ($20) paid skins start being shown off.
- Each has an intro - and then a flourish happens and they’re rendered in a manga/anime styling.
- You can do this manually in Photoshop with any picture - stacking filters (edge tracing, posterising and the like) over a smart object to give a cartoon-style effect (a rough sketch of what that kind of non-AI approach looks like follows at the end of this section).
- But it’s also not dissimilar to the anime filters you’d find on Snapchat or Tiktok.
- Or the infamous Corridor Digital videos where they effectively used an anime-trained image generation model to rotoscope over live footage.
- In the majority of those latter examples - the only way that the model is able to apply the “anime effect” is because it’s been fed lots of examples.
- Usually examples that aren’t owned by the people running the model.
- And what typically happens is that these models create significant artifacting errors.
- Which in this example were very much picked up on by artists across the internet - belt buckles being removed, clothing features that don’t seem to align, fingers seeming rotated.
- Is this an example of an AI-powered filter being applied, with all the messy results that brings?
- Or could it just be a simple case of someone performing that manual method and doing it badly?
- We do have a pretty good idea of who could have made this though - with third party studio Dark Burn Creative being behind pretty much every Apex Legends video since the game was revealed.
- At time of writing, Respawn hasn’t responded to the criticisms - or deleted the video - but this is yet another example of how audiences are now being extra critical in reviewing material put out by a company.
- And are quick to call foul.
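- To make the “manual method” idea concrete, here’s a minimal, purely illustrative sketch of a hand-built cartoon filter in Python with OpenCV. To be clear, this is an assumption on our part about what a non-AI pipeline can look like - it’s not what Respawn or Dark Burn Creative actually used, and the input file name is hypothetical.

```python
# A hand-rolled "cartoon" filter: smooth the colours, flatten the palette,
# then overlay dark edge lines. No trained model involved at any step.
import cv2


def cartoonise(in_path: str, out_path: str = "cartoon.png") -> None:
    img = cv2.imread(in_path)
    if img is None:
        raise FileNotFoundError(in_path)

    # 1. Smooth colour regions while keeping edges sharp.
    smooth = cv2.bilateralFilter(img, d=9, sigmaColor=75, sigmaSpace=75)

    # 2. Flatten the palette for a cel-shaded, posterised look.
    poster = (smooth // 32) * 32 + 16

    # 3. Trace dark outlines from the original luminance.
    grey = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    grey = cv2.medianBlur(grey, 7)
    edges = cv2.adaptiveThreshold(
        grey, 255, cv2.ADAPTIVE_THRESH_MEAN_C, cv2.THRESH_BINARY,
        blockSize=9, C=2)

    # 4. Keep colour only where there is no outline.
    cartoon = cv2.bitwise_and(poster, poster, mask=edges)
    cv2.imwrite(out_path, cartoon)


if __name__ == "__main__":
    # "skin_render.png" is a hypothetical input - any screenshot works.
    cartoonise("skin_render.png")
```

- The relevant difference: a deterministic filter like this can flatten and simplify detail, but it can’t invent or remove whole features on its own - which is part of why vanishing belt buckles and rotated fingers get read as signs of a generative model rather than a filter.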
- But now we hit the biggest example - where an audience primed to assume bad actors meets clear evidence of generative AI work, and then the company denies it, making everyone even madder.
Wizards of the Coast
- Magic The Gathering’s latest card set is framed around a detective story - with the new mechanic being built on the idea of gathering clues.
- The context of that makes everything that happened this weekend much funnier.
- A quick primer on Wizards of the Coast’s 2023: it wasn’t great! Alongside multiple accusations of using AI in creative work - accusations they’ve had to address and create firm policies against - the year included:
- The OGL disaster burned a lot of good faith from D&D players (we covered this and the eventual complete climbdown and reversal)
- Magic players are feeling priced out by an ever-increasing quantity of releases - feedback their creatives are aware of and want management to listen to
- WotC sent the corporate security firm the Pinkertons (yes, those ones) to recover pre-release cards someone had acquired too early - and the agents acted in a threatening manner while recovering them.
- In August, a long-term contributing artist for the upcoming D&D book Bigby Presents: Glory of the Giants was accused of using AI for their work.
- They revealed they’d used it in a polishing stage - WotC stepped in, confirmed this would not be happening going forwards, and publicly stated their (previously assumed) position that no generative art should be used as part of the process.
- In December - another accusation was made - and this time it was debunked: WotC spoke to the artist, checked the WIP images presented, and confirmed everything was kosher. It was simply a case of artistic presentation.
- Also in December - parent company Hasbro laid off over 1,000 people, including Wizards of the Coast staff across the Magic and D&D departments, among them significant senior staff and art team leadership.
- This was in spite of D&D and MTG being among the most profitable parts of the company.
- Then a job listing appeared - one that some on social media suspected was asking for staff skilled at touching up AI-manipulated images.
- It’s debatable whether that was actually what was being asked for, as both freelance illustrators and WotC themselves gave statements describing the work as entirely normal for the workflow - touching up human-created work for release.
- This then prompted the anti-AI policy to be extended from D&D to cover any creative product produced for WotC.
- As you can imagine - people are not inclined to be charitable to Wizards of the Coast right now.
- Every creative product is being tightly scrutinised for evidence that they’ve thrown their lot in with AI generation for profit’s sake.
- Then this promo turned up:
- And much like our snow scene before - everything outside the cards just looked slightly off.
- Then the rest of the marketing campaign was found, looking like a smoking gun of AI generated art.
- And then the Social Media team doubled down.
- They confirmed that the work above was made by humans.
- Which is notable mostly because they had actively acknowledged the use of AI from contributors in the past - so why wouldn’t they admit it here?
- At which point they were roundly mocked by the audience at large, because how could that possibly be true?
- A few days later - we get this.
- Where they attribute this whole thing to:
- A third party vendor that prepares marketing materials.
- Who was using industry standard tools that made use of generative material.
- And that WotC are now changing their policy again to also cover third-party vendors, so that nothing produced for them uses generative AI.
- But WotC still bear the responsibility for not scrutinising the work more closely before publishing it.
- Now - that last part is important.
- In a WotC that wasn’t outsourcing, and maybe hadn’t had staff laid off - would this have happened?
- It’s unclear.
- What’s extra miserable about this situation is that hundreds of Magic The Gathering artists are now alleged to have had their work used by Midjourney to seed their initial image generation model.
- With court entries suggesting Midjourney CEO David Holz actively bragged about this while talking to other staff.
- So you can’t trust third parties to not be using tech trained on stolen work - and audiences can’t trust that companies won’t just take the cheap option when they have to.
- How can it get worse?
The Well is Poisoned
- The Well is Poisoned.
- A phrase that’s been going around for a while relating to AI art in particular.
- Google Searches and Stock images are trustworthy - right?
- Except they aren’t, not any more.
- These libraries have been cluttered with junk made from generative AI prompts, and even things that look like they can be trusted might be a risk.
- This is encapsulated perfectly in this post from Strange Scaffold’s Xalavier Nelson, talking about how he could not make the cult game An Airport For Aliens Currently Run By Dogs in the current ecosystem.
- In short - while prototyping that game, the team added stock images of dogs to mark out the NPC objects.
- At which point they fell in love with the idea, and officially pivoted the whole game into the absurdist comedy that gave it its name.
- They cannot do something like that now.
- Because while they can license stock images of dogs - the work now required on top to verify where all of these images came from and that they are real is too much.
- Especially when the cost might be the game being unpublishable.
- Steam won’t allow you to publish anything unless you can 100% guarantee you own the rights to everything within it.
- So if your game features an image that you thought was stock or licensed - but it turns out to be copyright infringing - you’re breaking the rules and the game can be removed.
- So maybe Steam will at least provide some bulwark against the worst of this on PC, even while the issue makes things more difficult for developers all round.
- Maybe, for the most part, that will just mean more scrutiny on the part of companies - more care given to the work they do.
- Or companies could just own it.
- Indeed, companies aren’t going to pass up this option when the C-Suite sees the chance to save cash.
- Here’s Square Enix President Takashi Kiryu in his annual New Year’s letter:
We also intend to be aggressive in applying AI and other cutting-edge technologies to both our content development and our publishing functions. In the short term, our goal will be to enhance our development productivity and achieve greater sophistication in our marketing efforts
- This is a man actively speaking to investors and audiences and telling them he hopes Square Enix can replicate the processes that produced the disasters we saw above at all those companies.
- This is profitable and desirable - because the money saved is worth the risk of any stolen art or content being fed into your products.
- And even better is the fact you don’t need staff if you do this.
- That’s why Duolingo can lay off their contracted translators and simply review AI generated work instead.
- And the reward for jumping on the technological bandwagon, for having your systems and pipelines corrupted?
- The people building these tools just steal from you anyway!
- Audiences are right to push back on these moves.
- They won’t always be right about the use of AI - that one Wizards of the Coast artist alone proves that.
- There will be false positives, especially when those AI detector tools come into play (like detectors deciding the US Constitution is generated).
- But companies constantly facing pushback when this is adopted at the public-facing stage will give developers in games more ammunition when management turn around and want to integrate things behind the scenes.