From virtual assistants and chatbots to rockets flying into space, technology has progressed in leaps and bounds, far beyond what anyone could have accurately predicted. But when we talk about technology, we often talk about what limits us: what is the actual problem that forces us to upgrade each year?

The culprit is always assumed to be the hardware, because that's exactly how our beloved manufacturers would like us to see it. It's a version of the truth, but only one that feeds their (and our) constant demand for new hardware.

But allow me to vent on why I consider the software we use to be the actual bottleneck, and why it's the software that forces us to upgrade, year in, year out. For this investigation we'll be running a series of articles shedding light on the topics below:

  • Application Programming Interfaces (APIs)
  • Drivers
  • Patches
  • Digital Rights Management (DRM)
  • Game Engines
  • GameWorks/GPUOpen

This part focuses on APIs and drivers.

Application Programming Interface

Let's start off with the Application Programming Interface (API). This is a set of functions and procedures that handles the instructions for a device or peripheral attached to a computer, and all the communication from the software down to the hardware. In very basic terms, an API tells various software and hardware components how they should interact with one another. This is where the DirectX 12 and Vulkan discussions come into play.
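To make that a little more concrete, here's a minimal, purely illustrative C++ sketch. None of the names below belong to a real graphics API; they're hypothetical, and they simply show the idea of a game coding against one stable contract while each vendor's driver supplies its own implementation underneath.

```cpp
#include <cstdio>

// The API: a stable contract the game programs against.
struct GpuDevice {
    virtual void drawTriangles(int count) = 0;
    virtual ~GpuDevice() = default;
};

// Two vendors implement the same contract in their own drivers.
struct RedGpu : GpuDevice {
    void drawTriangles(int count) override {
        std::printf("Red driver: drawing %d triangles\n", count);
    }
};

struct GreenGpu : GpuDevice {
    void drawTriangles(int count) override {
        std::printf("Green driver: drawing %d triangles\n", count);
    }
};

// The game never cares which card is installed; it only knows the API.
void renderFrame(GpuDevice& gpu) {
    gpu.drawTriangles(500000);
}

int main() {
    RedGpu red;
    GreenGpu green;
    renderFrame(red);    // same game code...
    renderFrame(green);  // ...different hardware (and driver) underneath
}
```

The interesting question is how much work hides behind that contract, which is exactly where DirectX 12 and Vulkan come in.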

Let's first go back three years, to a time when AMD ambitiously began to make inroads as a software company. Remember the much-vaunted Mantle API? No? Well, you'll surely remember the exemplary 45% performance boost it promised in Battlefield 4, right?

Mantle was a typical example of how reducing overheads can help increase the frame rates in a particular game. As Katie explained at the time, a high-level API is like a toolbox full of adjustable spanners and swappable bits: you fiddle around and eventually end up with a fairly reasonable tool to fix things with, but it takes time. Mantle, however, already knows which tool it will need for the job, reducing the overhead on the CPU. Using lower-level APIs effectively pushes more of that work into the coder's head, as the API doesn't have to guess as much while processing code.

In other words, there is less “padding” (code, drivers and so on) between the software and the “metal” (the hardware). This helps stop your CPU bottlenecking your GPU, freeing up CPU time and allowing it to issue up to nine times as many draw calls per second.
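For a rough picture of where that padding lives, here's another hypothetical C++ sketch. It is not real DirectX, Vulkan or Mantle code; it just contrasts a high-level-style path, where the driver redoes hidden work on every draw call, with a low-level-style path, where the work is paid once when commands are recorded and submission is a cheap replay.

```cpp
#include <cstdio>
#include <vector>

struct DrawCall { int mesh; int material; };

// High-level style: the driver hides the bookkeeping, so every single draw
// call carries hidden CPU work (state validation, guessing what you meant).
void drawHighLevel(const DrawCall& dc) {
    // ...pretend the driver validates state and resolves bindings here...
    std::printf("high-level draw: mesh %d, material %d\n", dc.mesh, dc.material);
}

// Low-level style: the application records exactly what it wants up front,
// and submission is just a cheap replay of that list.
struct CommandBuffer { std::vector<DrawCall> commands; };

void record(CommandBuffer& cb, DrawCall dc) {
    cb.commands.push_back(dc);  // the "thinking" happens here, once
}

void submit(const CommandBuffer& cb) {
    for (const DrawCall& dc : cb.commands)
        std::printf("low-level draw: mesh %d, material %d\n", dc.mesh, dc.material);
}

int main() {
    // High-level path: overhead is paid on every call, every frame.
    drawHighLevel({1, 7});
    drawHighLevel({2, 7});

    // Low-level path: pay the cost once at record time, then replay cheaply.
    CommandBuffer cb;
    record(cb, {1, 7});
    record(cb, {2, 7});
    submit(cb);
}
```

That shift of responsibility from the driver to the programmer is the whole point of this class of API, and it's why the CPU suddenly has the headroom to push far more draw calls.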

This is exactly what DirectX 12 and Vulkan intend to do; the difference is that AMD did it first, did it successfully, and still ended up failing with it. Go figure.

Technically speaking, the advantages of a low-level API that can communicate instructions effectively are enormous. Before DirectX 12 was even a reality, Intel showcased demos in which a 73 percent performance boost was recorded in its benchmarks, thanks to reduced overheads and increased efficiency at the software level. In effect, you are getting more from less. The other advantage of DirectX 12 was power efficiency: in some cases it could cut power consumption by as much as 50%.

But what use are fancy demos when nothing practical comes out of them? AMD demonstrated its commitment, yet I feel it made a really bad decision in backing out of Mantle. If Mantle had been kept alive, DirectX 12 would probably have had to be a stronger product, because Mantle offered it serious competition. Heck, I would even argue it was Mantle that forced Microsoft to reveal the next iteration of DirectX in the first place.

The same applies to Vulkan, which has only a handful of games to its name. The difference is that Vulkan is a newcomer to the scene (despite springing from OpenGL), and developers haven't had time to implement it across many titles yet. DOOM is the biggest Vulkan game out so far, while Star Citizen is a huge upcoming name. But if developers commit to at least one of these low-level APIs, we might actually start seeing games that run a lot better on the current and older hardware that supports them.

Drivers

So let's move on to drivers. Over the past few years, it's become the job of the hardware manufacturer to optimise its tech to run games smoothly. It's also become a habit for journalists to wait for drivers before they begin benchmarking a specific game (sorry, Jon). But then what is the developer doing? What was the point of the open beta? What was the point of releasing the game at all?

The irony, however, is that people aren't reluctant to update their drivers, yet they create a huge fuss when it comes to updating their operating systems, and the former can actually screw up your PC a lot more than the latter. I find this astonishing. Maybe it's too much to ask, but I'd like a plug-and-play solution.

Nowadays driver updates are released on day one for practically every AAA title, forcing us to wait even longer on top of downloading a 40GB game, just on the off chance that they'll make the game run better. Half the time they break the game more than they fix it, as AMD and Nvidia rush to get their drivers out in time. The onus of optimising a game's performance on the relevant hardware, at least on flagship cards, surely lies with the developers, but that's seldom the case. Instead, Nvidia and AMD release drivers and highlight the percentage improvement in performance their changes deliver, rather than the other way around. It's backwards logic, and it tells us the hardware is plenty powerful enough; it's just not being used in an entirely effective manner.

So that sums up my rant on APIs and drivers. Sound off in the comments with your thoughts on this topic!
