The motherboard may claim the maternal name, but the graphics card is arguably the heart of any gaming PC these days. It's a computer in and of itself, designed to handle the resource-hungry visual side of modern games, freeing the rest of the machine up to do what it needs to do: pulling information from hard drives, connecting to the internet and passing data around, crunching numbers that aren't related to the visuals, and so on.
These computers inside our computers have their own RAM, their own processors, and their own mainboards, and over the years they have of course grown more and more sophisticated. That pushes development costs ever higher as we squeeze more out of them, and the giants that build these visual-crunching beasts for our PCs keep pushing the prices up in turn.
And to us, it feels like they are testing just where we, the PC gamers, will stop spending. It's big business, and the price of a single GPU is now often more than I used to pay for an entire gaming PC.
The releases of a new GPU series don't drop all at once. They're staggered so that the massively priced top-end cards in a series arrive first. This makes the biggest splash, often smashing aside the existing competition on the market, and it drives the hype machine forward so that early adopters get swept up in the excitement and spend the really big bucks on those flagship units. The most recent big launch was the RTX 2080 Ti, which was announced at a staggering $1,199.
That price tag shocked everyone and, for the first time in many years, it seemed we had pushed past the ceiling that even PC gaming enthusiasts could stomach; Nvidia's share price suffered this year as a result. Nvidia argued that the price hike was justified by its new ray-tracing tech, but most of its customers simply didn't care for it.
As you'll know, there are only two major graphics card manufacturers at the moment. Of the two, Nvidia (up until recently) hasn't been seriously challenged in the top-end GPU arena, while AMD has been quite happy gathering huge contracts with Sony and Microsoft, supplying the graphics solutions for PlayStation and Xbox, while also offering competitive lower-end GPUs that get thrown into cheaper desktops by the likes of PC World and co.
With this in mind, our theory is that Nvidia, and possibly AMD, have multiple future series of graphics cards completed or sitting in their R&D departments, pretty much ready for launch. But they have to sit on them so as not to cannibalise their own sales figures, sticking to a launch roadmap that maximises revenue, or waiting until the competition forces their hands. We see examples of this all the time: take the RTX Super cards being "leaked" and then released just ahead of the AMD RX 5700 XT announcement a few weeks back.
But I digress. Our subject for discussion today is this: how much did you spend on your most recent graphics card (if you bought a whole PC, just count the price tag of the GPU inside it), and how much did you spend on the GPU upgrade before that (assuming you can remember)? From that information we can draw a rough line showing how consumer spending in the graphics card space has risen.
As always, don't forget to let everyone know why in the comments section below!