PC gaming is, unfortunately, a relatively expensive hobby. But if there’s one component these days that draws the most ire with regard to cost, it’s the GPU, particularly if you’re looking to buy a high-end card. While many of us lust after the RTX 4080 and RTX 4090 graphics cards of our dreams, the hefty price tags attached to these GPUs seem to have turned them from potential prospects into serious stretch goals.
When a single component goes beyond the $1,000 mark, and even well into the $2,000 realm and above, the performance gains become harder and harder to justify against the cost. It didn’t use to be this way, we think to ourselves. It was cheaper back in the day.
But, as with many things, are we perhaps putting on our rose-tinted glasses and imagining a past that never was? Adjusting for inflation, how much would some of our favourite GPUs of years gone by cost in modern-day money, and are we simply pining for a time that never really existed? I decided to take a closer look.
Let’s take a trip back into the past. Waaaaay back, to a time when this hardware writer, then just a boy of unfortunate height, was starting to figure out that this whole build-your-own PC thing was worth looking into. The late ’90s seems about right, as this was a time when big GPU releases were starting to hit their stride, and much-missed names like 3dfx were releasing some of our favourite GPUs of the day.
3dfx Voodoo 2 12MB SLI: One GPU good, two might be better
(Image credit: Wikimedia Commons)
Back in 1998, I’d argue that the highest of high-end graphics hardware setups would be a pair of 3dfx Voodoo 2 12MB (!) cards running in SLI. Certainly this was the configuration that young Andy was pining over in his collection of PC gaming magazines, in much the same way that many of us look fawningly upon an RTX 4090 today. Even though SLI is now dead in the water, it makes sense to compare the top end to the top end, so how much would those twin Voodoo 2 cards cost if you bought them at the time of release?
According to TechPowerUp, at its launch on February 2, 1998, a single 3dfx Voodoo 2 12MB retailed for $299, so a pair would equate to $598. That figure looks laughably small compared to the seemingly gigantic sums we pay for top-end PC hardware today. Adjusting for inflation, however, courtesy of the US Bureau of Labor Statistics CPI inflation calculator, $598 in February 1998 equates to just over $1,136 in today’s money.
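If you want to sanity-check these conversions yourself, the maths is just a ratio of consumer price index values. Here’s a minimal Python sketch of that calculation, which underpins every figure in this piece; the CPI numbers below are approximate US all-items values I’ve filled in for illustration, so treat the BLS calculator itself as the authority.

```python
# A minimal sketch of the CPI-based inflation adjustment used for every
# conversion in this article. The CPI values are approximate US all-items
# figures (1982-84 = 100), filled in here for illustration only; the BLS
# inflation calculator remains the authoritative source.

CPI = {
    "1998-02": 161.9,    # February 1998
    "1999-10": 168.2,    # October 1999
    "2003-03": 184.2,    # March 2003
    "2022-10": 298.012,  # October 2022
    "2023-10": 307.671,  # October 2023
}

def adjust(price: float, origin: str, target: str = "2023-10") -> float:
    """Scale a nominal price by the ratio of target CPI to origin CPI."""
    return price * CPI[target] / CPI[origin]

# Two Voodoo 2 cards at $299 each, bought in February 1998:
print(f"${adjust(2 * 299, '1998-02'):,.0f}")  # about $1,136 in October 2023 money
```

The same `adjust` call reproduces (give or take a dollar of rounding) every converted price quoted below.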
The RTX 4090, by comparison, launched at an MSRP of $1,599 on October 12, 2022. Given that inflation has risen since then, let’s do an apples-to-apples comparison: $1,599 in October of last year is equivalent to roughly $1,651 in October 2023. And before we get into an argument as to why that’s the case, let me point out that this is an article about PC hardware; when it comes to politics, we won’t even go there.
Fair? Fair. Let’s continue.
Those of you who bought an RTX 4090 at retail may well consider yourselves lucky given the price increases since launch. Given the high demand and supply issues facing the RTX 4090, at the time of writing it’s not uncommon to see those top-end cards retail for just under an astonishing $3,000.
Of course, you could argue that the RTX 4090 is so far ahead of other GPUs in terms of performance that it justifies all that extra expense, and perhaps you might have a point. But be that as it may, comparing our two figures shows a discrepancy of $515. So far, it certainly seems like the theory that high-performance GPUs are a lot pricier than they used to be holds some weight.
The GeForce 256: A leap forward, but not necessarily in price
(Image credit: Hyins via Wikimedia Commons)
Just over a year after the release of the Voodoo 2, a little company you might have heard of called Nvidia announced what it called “the world’s first GPU”, and it caused quite a stir. It was called the GeForce 256, and it featured integrated transform and lighting (T&L) hardware on the chip itself, distinguishing it from competitors that relied on the CPU to perform these functions.
Games written to take advantage of this feature could gain up to a 50% frame rate increase over other cards, and the GeForce 256 could deliver better performance even when paired with a much slower CPU, although many at the time were keen to point out that hardware T&L was only an advantage in games that supported it.
Nevertheless, it’s still regarded as a bit of a watershed moment, and the release version was available at around $179. The canny amongst you may realise that this really wasn’t a lot of money for a powerful GPU, even adjusting for inflation: in today’s money it equates to around $327.
Performance-wise, while the original GeForce 256 may have flown in games with hardware T&L support, it was hamstrung in some benchmarks by undercooked drivers and the limited bandwidth of its SDR memory, both of which were addressed by the slightly later release of the GeForce 256 DDR. So perhaps it’s better to equate the original SDR version to something further down the stack than the legendary RTX 4090, and instead compare it to the RTX 4080s and 4070s of the modern day.
The RTX 4080 is still a monster of a card, but with that comes a monster of a price, and at an MSRP of $1,199 it’s pretty strange to think that it launched at more than triple the equivalent cost of a GeForce 256. The RTX 4070 gets closer of course, with its launch MSRP of $599, and since many manufacturers have cut the price to compete with the Radeon RX 7800 XT, plenty of examples can be found around the $550 mark. But that’s still over $200 more than a GeForce 256 SDR would have cost you at launch, for an equivalent that’s regarded as quick but upper mid-range, not competing at the top.
As for the mighty DDR version, top of its class in most benchmarks when reviewers finally got their hands on one back in late 1999? Well, that retailed at around $300. That’s roughly $548 in today’s money, and if that doesn’t make you hold your head in your hands when compared to the price of a modern-day benchmark crusher, I don’t know what will.
The ATI 9800 Pro: A beast with something of a devilish cost
(Image credit: Future, Anandtech)
To round things off, we’d better take a look at an ATI card, if for no other reason than to ride the wave of nostalgia around a brand name long since retired. While we could stick to GPUs released in the late ’90s or thereabouts, for this comparison I think it’s worth fast-forwarding a little to the heady days of 2003, when ATI released what I would argue was its pièce de résistance, the Radeon 9800 Pro.
Forgive me for getting a little misty-eyed, but the 9800 Pro was, at release, the top-end model for ATI’s R350 series of GPUs, and I remember it well. The R300 and R350 architectures were something of a comeback story for ATI, and I’m a sucker for an underdog getting its time in the sun.
Prior to this, ATI hadn’t released anything particularly special in quite some time, and some were beginning to doubt whether it could keep up with the pace. The 9800 Pro was a heavyweight rebuttal to ATI’s detractors, what with its 117 million transistors, 128MB of DDR memory (256MB of DDR2 in the larger variant) and a scorching clock speed of 380MHz.
Stop sniggering at the back. Back in my day, this thing was fast.
So fast, in fact, that at release it ended up matching or even outright beating the formidable Nvidia GeForce FX 5800 Ultra in many benchmarks, particularly when anti-aliasing was enabled. And at a time when 4K resolution was unheard of, anything that got rid of the dreaded jaggies while keeping up the frame rate was considered a blessing from the heavens.
Three hundred and ninety-nine of your finest 2003-era dollars were required to experience the ATI card for yourself, and running that through our inflation calculator equates to just over a rather chilling $666 in today’s money, a good two decades later. I remember thinking to myself at the time that it was a lot to pay for a GPU, but now, looking at what I’d have to pay for an equivalent card today, I realise that perhaps we never had it so good.
Why are GPUs so much more expensive than they used to be?
Before you think I’m simply pointing my finger at modern GPU pricing and yelling “BAD”, it’s worth drilling down a little into the reasons why this might be the case. Certainly, the pandemic and its associated supply chain issues didn’t help matters, coming in conjunction with a cryptocurrency boom that saw enterprising miners buy up huge amounts of GPU stock to further their own economic ends. Supply and demand is a very real thing, and when supply was strangled at the same time as demand skyrocketed, a big leap in pricing was very much to be expected.
It also bears saying that development costs have risen enormously: modern GPUs pack many billions of transistors and use advanced packaging methods, all built on cutting-edge, and very expensive, production lithographies. More than that, GPU manufacturing costs at source have most definitely risen since the pandemic, and it would be naïve to expect those increased costs not to be passed on to the end consumer.
All of that being said, it’s very difficult to shake the feeling that while GPU manufacturers may still be dealing with increased costs and stock issues post-pandemic, they may have learnt a rather unfortunate lesson from the whole ordeal: even if pricing climbs to silly money, people are still prepared to pay very large amounts for a top-flight GPU. Certainly, as manufacturers continue to make substantial profits and demand for new GPUs remains strong, it seems unlikely that we’re looking at a pricing crash anytime soon.
Worse still, it’s not like salaries have kept pace with inflation since the 1990s. Again, without getting too deep into political arguments beyond the remit of this article, wage stagnation is a very real thing, meaning that while GPUs have certainly become more expensive, the income we earn to buy them has not risen at the same rate.
An unfortunate conclusion, and a toast to the past
So what have we learned? Well, economics is a strange beast indeed. I can already hear the critiques: that you can’t compare apples to apples in this way, that too many factors have to be taken into account to create an accurate equivalency, that times have changed so much in the 20+ years since these GPUs were released that a basic price-adjusted comparison no longer makes any sense.
It’s true, all of this needs to be taken into account to gain a truly accurate perspective on why modern graphics hardware seems so expensive. But as an overview, as a simple comparison of what you and I will ultimately pay for a powerful GPU that delivers fantastic performance in all the latest games, the results are striking.
While PC gaming has always been a somewhat expensive hobby, it really does seem that if you want to buy a high-end GPU in 2023, you’re going to be paying a higher equivalent price than you would have back in the halcyon days of my somewhat nerdy youth.
Call me an old sentimentalist if you will, but thinking back on those times, when I had very little money yet all the vim, verve, and vigour necessary to save enough of my pennies to buy into the top end of a pastime I have since turned into a very rewarding career, I can’t help but feel it’s an era long since passed.
Looking at the way GPU prices seem to be continuing to rise, I wonder how many people today are missing out on the experiences I had, simply because the hardware required for the best performance has risen beyond their reasonable means. It’s not a pleasant thought, but one I think is unfortunately becoming more resonant with each passing generation of PC gamer. Certainly, as things currently stand, it shows no immediate signs of changing.
It’s a pricey old game, this passion of ours. Raise a glass with me, to times long gone. May we one day see the like again.