It doesn’t take much for a GPU to be bad. It just needs to offer poor value, suffer from some kind of crippling drawback, or simply be a disappointment. There are plenty of cards that embody one or two of these characteristics, but every once in a while, we bear witness to magical moments when a graphics card hits all three.
Whether we’re talking about the best GPUs of all time or the worst, it’s pretty hard to narrow down the winners (or losers in this case) to just seven. Thankfully, I’m something of a connoisseur in awful graphics cards and PC hardware in general, so here are my picks for the worst GPUs ever launched.
1 Nvidia GeForce GTX 480: The ‘G’ in GPU stands for grill
Source: Hyins
In the decade following the emergence of the modern graphics card, there actually weren’t too many bad ones, and when there were, they had a minimal impact since the industry was moving so quickly back then. It was pretty common to see next-generation GPUs beat their predecessors by over 50%, and 100% or more wasn’t rare. But by 2010, things were getting more complicated, particularly for Nvidia. It was trying to make a massive GPU on TSMC’s 40nm node, which suffered from serious yield problems. The end result was Nvidia’s worst technological blunder ever: Fermi, which powered the GTX 480.
Because the 400 series had to be delayed due to those difficulties, AMD had taken the performance crown from Nvidia with its Radeon HD 5000 series in late 2009 and held it for six months. The 480 did reclaim the crown with a 10% lead over the Radeon HD 5870, but at a terrible price: $500, compared to the 5870’s $380. And that was just the sticker price; the card got even more expensive to own once you factored in its massive power draw. In AnandTech’s testing, the card consumed well over 200W under load and hit over 90 degrees Celsius. If you’ve ever heard of “the way it’s meant to be grilled,” this is what it was referring to.
The GTX 480 was way too hot, way too loud, and way too expensive.
Suffice it to say, the GTX 480 didn’t cut it despite its performance. It was way too hot, way too loud, and way too expensive. Nvidia rushed out the GTX 500 series just half a year later, taking advantage of the 40nm node’s increased maturity and architecture-level improvements to boost efficiency. A processor is undeniably bad if it has to be replaced after months rather than years, and the GTX 400 series is one of the shortest-lived product lines in GPU history.
2 AMD Radeon R9 390X: Two-year-old leftovers
Source: MSI
A couple of years after the GTX 480 debacle, things got back to normal. Nvidia managed to stay on top with the GTX 500 series, but the market got competitive again starting in 2012, and Nvidia lost the top spot to AMD’s R9 290X in late 2013. So there AMD was, at the top of the world after yet another long struggle to get back into first place. What was next for this underdog with a new lease on life?
Financial turmoil and potential bankruptcy, as it turns out. Although AMD’s GPUs were quite competitive with Nvidia’s most of the time, they weren’t making a profit, and by the time the 290X came out, the company had basically run out of money for R&D. Its next generation of GPUs didn’t come out until 2015, the better part of a year after Nvidia’s legendary GTX 900 series, but the “new” Radeon 300 series that launched in 2015 was anything but. It was essentially a carbon copy of the 200 series with minor tweaks.
The 300 series was not the GPU AMD needed to launch in 2015.
A particularly bad addition was the R9 390X, which was a rebranded 290X. Despite its high power draw of over 300W under full load, reviews were somewhat positive and positioned it as a cheaper alternative to the GTX 980. However, I’m not so sure the testing data supports that conclusion. TechPowerUp found that the 390X could only match the 980 at 4K, where the framerate was usually less than 60 FPS and sometimes less than 30 FPS. Meanwhile, the GTX 970 matched the 390X at 1080p for $100 less, making it a far better deal (not to mention more efficient).
The 300 series was not the GPU AMD needed to launch in 2015. It arrived a year late and offered almost exactly what people had already gotten in 2013. AMD also launched the R9 Fury and R9 Fury X, which were genuinely new flagship cards, but they didn’t quite recreate the 290X moment. AMD would not make a real attempt to recapture the performance crown for years.
3 Nvidia GeForce RTX 2080: The beginning of the end
Source: Nvidia
Technological blunders are often what ultimately doom terrible graphics cards, but sometimes even the most cutting-edge GPUs can mess everything up. The RTX 20 series launched in 2018 at the peak of Nvidia’s six-year reign of dominance. Despite AMD’s lack of competitiveness, Nvidia had reliably delivered a 30-50% improvement in value each generation. With the 20 series, however, Nvidia decided to do something different.
Ultimately, the RTX 20 series has been a major turning point in gaming GPU history and in the worst way possible.
Because the 20 series comprised the world’s first gaming GPUs to support hardware-accelerated real-time ray tracing and AI resolution upscaling, Nvidia felt it was justified in raising prices. After all, with ray tracing and DLSS the 20 series was many, many times faster than the previous 10 series, so it would still be a great value, right? Unfortunately, there were literally no games with either technology until 2019. So on launch day, the RTX 2080 was essentially just another GTX 1080 Ti (with less VRAM). TechSpot was not impressed, saying “we’re getting GTX 1080 Ti-like performance for a 20% price hike.”
Although I’ve singled out the RTX 2080 here, the RTX 2070 and the RTX 2060 were about as bad, offering basically nothing in the way of actual value improvements. Promises about certain games getting ray tracing and DLSS were also regularly broken in 2018 and 2019. Nvidia did eventually rectify the situation by launching the RTX 20 Super series half a year later, much like how Nvidia released the GTX 500 series half a year after the GTX 400 series. But ultimately, the RTX 20 series has been a major turning point in gaming GPU history and in the worst way possible.
4 AMD Radeon VII: The forgotten AMD flagship
Source: AMD
Although the RTX 20 series was by no means a great product line, it still put AMD even further behind, because the new RTX 2080 Ti was a full 30% faster than the GTX 1080 Ti while AMD’s RX Vega 64 only matched the GTX 1080 (sans Ti). Something had to be done, but AMD’s upcoming cards based on the 7nm Navi design wouldn’t be ready until mid-2019. Then someone at AMD realized the company already had a 7nm GPU: a version of Vega made for data centers. It wasn’t going to beat the 2080 Ti, but something is better than nothing, right?
Well, the Radeon VII probably would’ve been better off not existing. Like the 14nm Vega 56 and 64, it was pretty inefficient, and although it roughly matched the RTX 2080, it did so while consuming about 70 more watts. It didn’t support ray tracing or upscaling technology either, and the cherry on top was its $700 price tag, the same as the 2080’s.
AMD probably lost money on the Radeon VII and it wasn’t even a good GPU!
But perhaps the worst thing for AMD was that since this was a data center GPU, it came with 16GB of extremely expensive HBM2 memory. AMD probably lost money on the Radeon VII and it wasn’t even a good GPU!
Because the RX 5700 XT launched just a few months later at $400 with about 90% of the Radeon VII’s performance, this former flagship has been largely forgotten, and for good reason. It didn’t add anything new, and its name was awful. Seriously, Radeon VII is a terrible product name. I guess AMD is pretty embarrassed about the whole thing, because the card’s official product page has since been deleted from the company’s website.
5 Intel Xe LP: Intel’s big integrated GPU that just couldn’t win
Source: Intel
In order to talk about Intel’s first and worst gaming GPU, we need to step back for a moment. Back in 2015, Intel pulled off a major coup by recruiting AMD Vice President Ari Rauch to spearhead the development of a new graphics architecture. This GPU wasn’t supposed to go head-to-head with high-end or even midrange cards but was intended to be paired with a future Intel CPU. As an integrated GPU, it was supposed to take a shot at Nvidia’s MX laptop chips and AMD’s APUs, and if successful would be great for Intel and bad for its rivals.
In late 2017, Intel pulled off another coup in the form of Raja Koduri, who quit his job as leader of AMD’s Radeon division to join Intel as chief architect, and he had ambitions beyond integrated graphics. He expanded the scope of Intel’s graphics plans and began development on discrete gaming cards and data center GPUs in addition to the powerful integrated graphics, creating the Xe lineup with Xe LP at the bottom, Xe HP in the middle, and Xe HPC at the top. Xe LP would debut on Intel’s 10nm node and would also get a discrete version.
It was cruel to bring this poor little GPU into such a hostile world.
The first generation of Intel’s brand-new integrated graphics debuted with Ice Lake U CPUs in 2019, and it actually beat AMD’s Ryzen 3000 APUs. This was just the beginning though, and Tiger Lake U chips with the full Xe LP iGPU launched in late 2020. But despite featuring 50% more cores than the iGPU in Ice Lake, Tiger Lake U integrated graphics were trashed by AMD’s Ryzen 4000 APUs and came nowhere close to making Nvidia’s MX GPUs obsolete. DG1, the company’s first discrete gaming GPU ever, was also quite bad and barely matched the GT 1030.
Rauch only got to oversee the launch of Ice Lake U’s iGPU before he was fired in early 2020, and Intel hasn’t meaningfully updated its integrated graphics since. Xe LP was a failure not just because the design was clearly bad, but probably also because Xe HP and Xe HPC got more attention. With Arc Alchemist being as decent as it is, it’s hard to blame Intel for sacrificing Xe LP. I will say, though, that making DG1 was absolutely unnecessary, and it was cruel to bring this poor little GPU into such a hostile world.
6 AMD Radeon RX 6500 XT: Pairs worst with AMD’s own CPUs
Source: XFX
The GPU shortage that began in 2020 was disastrous. Graphics cards were unbelievably expensive, and some didn’t even launch with MSRPs since the figure didn’t mean anything anymore. People on a budget were hit the hardest, because AMD and Nvidia weren’t launching new low-end cards and the old ones from previous generations were going for up to quadruple the original price. But as the GPU shortage began to subside in early 2022, AMD finally got around to launching some entry-level GPUs, and even though the bar was super low, it somehow failed to clear it.
The RX 6500 XT wasn’t a normal gaming GPU. It was originally designed for laptops and consequently consumed very little power and offered very little performance, roughly equal to the RX 480 from 2016. At its $200-$250 launch price, this wasn’t too terrible, and the 6500 XT has since come down to about $150. However, because the 6500 XT was made to be paired with laptops using Ryzen 6000 APUs with PCIe 4.0, AMD decided to offer only four PCIe lanes, and if you ran the 6500 XT in PCIe 3.0 mode, the performance was awful.
AMD finally got around to launching some entry-level GPUs, and even though the bar was super low, it somehow failed to clear it.
What pushes the 6500 XT over the line into hot garbage is that the company’s budget Ryzen CPUs, which launched at the exact same time, didn’t support PCIe 4.0 since they were recycled APUs that only had PCIe 3.0. AMD penny-pinched so hard that pairing an AMD CPU with a Radeon GPU was a bad idea; the 6500 XT makes much more sense with a low-end Intel 12th- or 13th-generation chip. It would be much funnier if it weren’t literally the only new GPU under $200 actually worth buying, though.
7 Nvidia GeForce RTX 3050: An unaffordable entry-level card
Nvidia has taken a different approach than AMD when it comes to the budget segment, which is to pretend that it simply doesn’t exist. This wasn’t really much of a problem during the GPU shortage since pretty much every GPU was selling for an extra hundred dollars anyway, but once things started to ease up, it became clear that the RTX 3050 was an awful entry-level card for an unaffordable price.
Launching about a week after the 6500 XT in early 2022, the 3050 at first seemed like a decent enough GPU, with bang-for-buck roughly equal to the RX 6600, which was 30% faster but also about 30% more expensive. Over the course of 2022, all GPUs were selling at lower and lower prices as the shortage cleared up, but when RTX 30 GPUs eventually stopped declining in price in mid to late 2022, AMD cards kept getting cheaper. Today, the cheapest RTX 3050s go for just under $300, while the much faster RX 6600 can be found for just over $200.
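To make the value math concrete, here’s a minimal sketch of the performance-per-dollar comparison being made above. The numbers are illustrative assumptions drawn from the figures in this section (performance normalized to the 3050, approximate launch and current street prices), not benchmark data:

```python
# Illustrative perf-per-dollar comparison (assumed numbers, not benchmarks).
# Performance is normalized so the RTX 3050 = 100; the RX 6600 is taken as ~30% faster.
cards = {
    "RTX 3050": {"perf": 100, "launch_price": 250, "street_price": 295},
    "RX 6600":  {"perf": 130, "launch_price": 330, "street_price": 210},
}

for name, card in cards.items():
    at_launch = card["perf"] / card["launch_price"]  # perf per dollar at launch
    today = card["perf"] / card["street_price"]      # perf per dollar at street prices
    print(f"{name}: {at_launch:.2f} perf/$ at launch, {today:.2f} perf/$ today")
```

Under these assumptions, the two cards come out roughly even at launch (about 0.4 performance points per dollar each), but at current street prices the RX 6600 delivers nearly twice the performance per dollar.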
It’s one thing for the RX 6500 XT to be a bad budget GPU, but at least it’s a new GPU that’s available for $150. With the 3050, Nvidia effectively gave up on making entry-level GPUs, and it isn’t even a good card in its own right at current pricing. Of course, Nvidia will say the premium is worth it because you get DLSS, but even with DLSS enabled, the 3050 might only match or barely surpass the RX 6600 for almost $100 more. The 3050 is symptomatic of the direction Nvidia has been going since the RTX 20 series, and it shows no signs of changing course.
The competition for the worst graphics card is going to get even fiercer
Things aren’t quite as doom and gloom with the latest generation of GPUs as I thought they would be. I once said the RTX 4070 might cost as much as the RX 7900 XTX (it’s $400 cheaper) and that the RTX 4060 would be at least $400 (it’s $300). But even though value isn’t getting worse, the fact that it’s not getting better is still pretty terrible. It’s also a serious problem that entry-level GPUs now start at about $300 when they used to start at as little as $100. Desktop PC gaming is quickly becoming unaffordable.
The days of good generation-to-generation improvements in value have long since passed; we’ll have to settle for maybe a 10% improvement in bang for buck every year or two. AMD and Nvidia are in a race to the bottom to see which company can get people to pay the most for the worst graphics card. If you can afford the ever-increasing premiums for good graphics cards, then that’s great. But for everyone else who can’t shell out hundreds of dollars for the same performance we got last generation, it just sucks.