Everywhere you look the term 4K is being thrown around with gleeful abandon. You can’t go near one of the litany of online PC hardware stores without catching a glimpse of these ultra high-definition beasts.

In truth though, 4K monitors remain elusive: the gaming equivalent of a snow leopard. How many of us have seen a 4K monitor in action? How many know someone who owns one? Not many, I’d hazard a guess, and a quick look at GD members reveals fewer than 0.1% are kitted out with 4K monitors.

It’s a hardcore market, and the switch is definitely taking a whole lot longer than the transition to 1080p did. For a lot of us that difference was obvious: 1080p was a heck of a lot better than the blurriness we were used to. Outside of gaming and hardcore cinephiles, though, the change was a lot slower; many people don’t look for, or even notice, the difference between a DVD and a Blu-ray. It’s the reason Blu-rays still get far less shelf space than DVDs; the benefits of each jump shrink the higher up the fidelity ladder you go.

4K is an even harder sell, but if it’s going to become commonplace anywhere it’s going to begin with gaming. PC gaming in particular is all about pushing hardware as fast as it can go. Image quality is almost always a priority for a PC gamer, and it’s here we’re seeing the most chatter about 4K and 4K gaming.

Getting involved in 4K gaming is a hideously expensive business, though. We’ve just seen that Nvidia’s $1,000 GeForce GTX Titan X is just about capable of decent 4K gaming, but graphics fidelity is being pushed up alongside resolution. The Titan X might be a 4K performer now, but if Crysis 4 launched next year, who’s to say it wouldn’t take a battering?
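To get a sense of why the hardware struggles, a quick back-of-envelope pixel count shows the scale of the jump. This is a rough illustration only; it assumes GPU workload grows roughly in line with the number of pixels shaded per frame, and ignores effects quality, engine, and driver differences:

```python
# Back-of-envelope comparison: pixels a GPU has to shade per frame
# at common resolutions, relative to 1080p. Illustrative only; real
# performance also depends on settings, engine, and drivers.

resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]

for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels ({pixels / base:.1f}x 1080p)")
```

At roughly four times the pixels of 1080p per frame, it’s little wonder that even a Titan X only just clears the bar.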

So then, how important is 4K gaming to you? Are you intent on upgrading until you can hit that elusive number, or are you happy where you are?