The Last of Us hype train seems to have never stopped. For a solid decade and counting, the message is that this fungal post-apocalyptic road trip is the game to end all games. A triumph of interactive entertainment as a narrative medium. As the people behind the HBO adaptation proudly stated, “the greatest story ever told in video games”.
This is, of course, nonsense. Don’t get me wrong – TLOU is good. Great, even. But to be repeatedly told that the tramadol Uncharted game with the ladders represents an entire medium at its peak feels like it tips over the line from marketing into straight-up gaslighting. Still, you’d expect that the entity ultimately responsible for the gaslighting (Sony) would be invested in making sure that the Greatest Game of All Time Ever wasn’t a bit of a dog’s breakfast when the PC port hit storefronts.
Unfortunately, it’s a bit of a dog’s breakfast. But first, the good: when it’s running, and not crashing, it looks lovely, sporting a suite of tweakable graphics options that can push the visuals past those of the PS5 version, which were already pretty lavish. In terms of image fidelity, the PC version is king, with denser foliage and higher-resolution shadows, reflections, and particles all possible, and all running with decent ultrawide support if you have the requisite display.
In terms of performance, though, the situation is pretty dire. The VRAM situation is baffling: the game will almost max out a 12GB GPU no matter how much you tweak the settings. For the avoidance of doubt, the vast majority of users don’t have 12GB of VRAM, and precious few have more than that. Riding the limit of the available resources like this is very likely a contributing factor to the game’s horrendous stability: it will crash regularly. In my case, the game fell over every time a pivotal story moment was about to happen, which makes sense: a rush of new assets pushing the system over the limit, in preparation for a new level or cutscene.
For a game that is full of pivotal story moments (it is, after all, “the greatest story ever told in video games”), this results in regular crashes to desktop, making the experience fairly miserable and anxiety-inducing for precisely the wrong reasons.
As well as the crashing, the stuttering situation is abysmal. The game is plagued by microstuttering on many systems – that is, almost imperceptible inconsistencies in frame timing that you feel more than you see. Moving the camera around feels jerky and granular. It’s like the game is being tapped into your retina with a ball-peen hammer rather than flowing smoothly into it. Depending on your disposition, it is, at best, irritating. At worst, headache-inducing.
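To put a number on why microstutter is so insidious, here’s a minimal sketch (using invented frame-time data, not a capture from the game) of how an average framerate can look perfectly healthy while a handful of long frames wrecks perceived smoothness. This is why benchmarks quote “1% low” figures alongside the average:

```python
# Hypothetical frame-time logs in milliseconds, for illustration only:
# a steady ~60 fps trace vs. one with occasional long-frame spikes.
# Both average out to roughly 60 fps, but the spiky one is what
# "microstutter" feels like in practice.
steady = [16.7] * 100
spiky = [14.0] * 95 + [70.0] * 5  # mostly fast frames, a few big hitches

def avg_fps(frame_times_ms):
    """Framerate implied by the mean frame time."""
    return 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

def one_percent_low_fps(frame_times_ms):
    """Framerate implied by the slowest 1% of frames."""
    worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
    return 1000.0 / (sum(worst) / len(worst))

print(round(avg_fps(steady)), round(one_percent_low_fps(steady)))  # 60 60
print(round(avg_fps(spiky)), round(one_percent_low_fps(spiky)))    # 60 14
```

Both traces average about 60fps, but the second one spends its worst moments at an effective 14fps – and it’s those moments your eye latches onto.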
Generally, despite a shader compilation cycle that lasts an excruciatingly long time (over an hour for some users, around 45 minutes on my system), the game will cough and splutter whenever new levels load in. And, as Sherif points out in the video, this is a game that masterfully hides its transitions from one level to the next, so you never actually know when this spluttering is going to happen.
In fairness to the developers, TLOU Part One is, arguably, a difficult game to port from specialised, standardised hardware onto a platform that is, by nature, non-specialised and non-standard. The original PS5 version was engineered to take advantage of the PS5’s super-fast storage: a proprietary system with hardware-based compression/decompression built around a blisteringly fast 7,000 MB/s SSD. The vast majority of PCs can’t possibly compete with that sort of retrieval speed in terms of raw numbers. But this disadvantage should be offset by the larger pools of RAM and VRAM available to PCs. And yet, this spluttering happens on my system, which has 64GB of system RAM (four times what a PS5 has) and 12GB of VRAM (which, as discussed, the game is almost maxing out as a matter of course).
I’m not a game developer, so my understanding of video game resource management is rudimentary at best. I’m not claiming to be an expert here. But the results are what matter, and the results here are incredibly unsatisfying. A PC release that maxes out a high-end PC like this, while running with all the robustness of a teetering stack of crockery, is frankly not fit for purpose. For the vast majority of users, in a market where the median system spec is vastly less cutting-edge than a 3080 Ti (which hasn’t been cutting-edge for almost two years now), this release is a complete waste of time. And yet the recommended system specs give the impression that you can run this quite nicely on hardware from 2018.
I hope the forthcoming updates, the first of which is scheduled to arrive tomorrow at the time of writing, fix this situation for PC consumers and bring the actual required specs in line with the suggested specs on storefronts. Because at the moment, it’s cruel and downright misleading to sell this game as running, at recommended settings no less, on anything short of the bleeding-edge hardware that most people buying it will simply not have.
The “greatest story ever told in video games” surely deserves better than this.