Triple-A games exist at a troubled intersection of art and capitalism. But there are hidden secrets in every game that reveal the human behind the machine.
The Last of Us Part II is a deeply personal game for many of its developers. Image: Naughty Dog
The poster in the abandoned pet store takes me by surprise. Naughty Dog’s The Last of Us Part II presents sudden moments of familiarity like these with painful emotional precision: the image of a dog, staring brightly out into space on a clinical green and white background, is jarringly innocent, a vestige of a time gone by.
The Last of Us Part II introduced guard dogs as a new enemy type; with keen senses of smell and hearing, the creatures are perilously difficult to hide from, ruthlessly hunting the player down. In the unforgiving world that Naughty Dog created, it’s details like this fading poster that ram home the fact that, as totalizing as the dystopia may seem, it wasn’t always this way; man’s best friend once had a role other than violence.
As it turns out, the dog on the pet shop wall has a name: Murphy.
Senior QA analyst Wilson revealed the dog’s true identity in a tweet thread started by Disc Room and Minit developer Jan-Willem Nijman, who asked game developers to reveal the secrets they had hidden in their games over the years.
Envisioned by Wilson and implemented by art director Erick Pangilinan and graphic designer Hailey Del Rio, Murphy, Wilson’s blind, beloved ‘old grandma dog’, made it onto the wall as an easter egg.
As the thread progressed, a few other Naughty Dog game devs signalled the oblique references to their own lives that they had hidden in the game as well. Game designer Michael Barclay added the area code for his home town of Dundee to a found document – a detail that was picked up by the local paper.
Former environmental artist Willow Paquette drew on phrases from her favourite podcasts while designing graffiti for the game, including the sign-off from The Last Podcast on the Left.
These playful references spark curiosity, and they’re fun to uncover. But more than that, their presence, immortalised discreetly in pixels and code, is a sign of the humans behind the engine of game dev. Because it’s not just Murphy’s cute, fluffy little face that’s immortalised: it’s Wilson’s love for his dog, his impulse to honour her, that is immortalised, too.
THE HISTORY OF EASTER EGGS
The phenomenon of easter eggs has always had a proprietary underpinning. The first documented easter eggs date to the early 80s, when arcade games were fast becoming a cultural phenomenon – and a profitable one at that. In 1980, it was common practice not to credit game developers for their work, a policy intended to reduce professional headhunting in a limited market. Moreover, by obscuring programmers’ real experience, companies could limit their leverage to negotiate pay and working conditions.
In 1980, the videogame Adventure was being developed for the Atari 2600, and its sole programmer, Warren Robinett, decided to protest this lack of acknowledgement.
Shortly before leaving Atari, Robinett programmed a secret room into Adventure that was empty except for the phrase ‘Created by Warren Robinett.’ The room could only be reached by picking up a single hidden pixel – known to fans as ‘the gray dot’ – and carrying it to a sealed wall.
Upon its discovery, Atari decided that removing the room would be too expensive. Instead, Atari and many other games companies began to embrace easter eggs as a commercial strategy, one that encouraged repeat playthroughs as players hunted for the right combinations of actions to unlock a game’s secrets.
WHO GETS THE CREDIT TODAY?
We are living through the era of the videogame blockbuster. By 2021, budgets, team sizes and development cycles have all swollen – yet the issue of who gets credited remains fraught. In 1996, Naughty Dog’s smash hit Crash Bandicoot cost US$1.7 million and took less than 18 months to develop and ship. By contrast, The Last of Us Part II took seven years and reportedly cost over US$100 million. That figure is no longer considered remarkable for games on this scale; Rockstar Games’ 2018 release Red Dead Redemption 2 cost an eye-watering $250 million.
These giant productions require giant development teams: 2,176 developers across fourteen studios were ultimately credited on The Last of Us Part II, in a development cycle plagued by layoffs and allegations of workplace abuse.
Getting credited is not a simple process at many studios. An investigative report from Kotaku last year found that there are no industry-wide guidelines on how a studio should credit its workers; crediting practices are completely unstandardised, both within and across studios. This has led to developers being undercredited, whether accidentally or in retaliation for failing to toe the party line. Rockstar Games’ policy, for instance, remains that only developers still working on a title when it ships appear in the credits – a shockingly common practice across the industry.
Game designer Ian Schreiber described this phenomenon to Kotaku as the result of a lack of unionisation:
‘Having accurate, verifiable credits isn’t part of the certification process for Apple or Steam or Nintendo or anywhere else, so studios are more or less free to do whatever they want, with no consequences if they choose to ignore the standard.’
This is not to mention that games by large companies are often cancelled at the last minute; years of work under NDA can end without warning, and consequently without credit. The immense labour of making games of this technical complexity, and on this scale, without union support, could not be more alienated from the finished game itself.
SIGNS OF LIFE
On release, The Last of Us Part II drew attention for its heart-wrenching narrative exploration of traumatic cycles of violence and for its remarkable technical accomplishments. Its development cycle also drew fire over allegations of unpaid overtime, layoffs, and workplace abuse. It is a towering monument to the intersection of art and capitalism that triple-A games have arrived at: enormous budgets, untenable working conditions, peerless artistic achievements by uncredited workers.
From brunch selfies to the graffiti on the walls of Pompeii, being seen is a fundamental human need. When making art under capitalism, this need doesn’t change; if anything, it becomes more urgent.
Even the most unassailable piece of technological innovation – created by countless workers, many of whom will never meet one another, some of whom may never see the final product – is undeniably shaped by their personalities. An area code; a picture of a beloved dog; the defiant inclusion of a name. To borrow a phrase from r/showerthoughts: graffiti in videogames expresses a real human impulse to prove our own existence. Even in the dystopia, there are undeniable reminders of life.