Just how useful are PC benchmark modes really?

Have you ever loaded up a new PC title, run the in-game benchmark and tweaked settings for optimal performance, only to discover that actual gameplay throws up much lower frame-rates, intrusive stutter or worse? It's a particular frustration for us here at Digital Foundry, and it raises a couple of very obvious questions: firstly, if benchmark modes aren't indicative of real-life performance, what use are they? And secondly, if their use is limited, how representative of real-life gaming are the graphics card reviews that rely on them, including ours?

Before we go on, it's fair to point out that not every benchmark mode out there is useless beyond redemption. In fact, there's a range of great examples that set you up reasonably well for tweaking towards optimal performance. And then there are others that actually tax the system more heavily than the game itself - which we'd argue is of more use than a benchmark that inflates its performance figures, since a conservative estimate at least errs on the side of smooth gameplay.

However, there are some particularly striking examples we have to highlight, simply because the delta between benchmark mode and real-world performance is absolutely massive. Perhaps the most notorious example we can muster is Tomb Raider (2013). Its benchmark tests just one scene - the shipwreck vista from the beginning of the game - with the camera panning around the Lara Croft character model. It's a scene that's easy to replicate in-game, where we find that the same hardware running the same scene at the same settings produces anything up to a 21 per cent performance deficit compared to the benchmark's figures.
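As a quick aside on how we express that gap: the deficit is simply the percentage drop from the benchmark's average frame-rate to the frame-rate measured in real gameplay. Here's a minimal sketch of the calculation in Python - note that the frame-rate figures below are hypothetical placeholders chosen to illustrate the maths, not our measured Tomb Raider results.

```python
# Sketch of how a benchmark-vs-gameplay deficit like the 21 per cent
# figure is derived. The frame-rates used here are hypothetical
# placeholders, not actual Tomb Raider (2013) measurements.

def performance_deficit(benchmark_fps: float, gameplay_fps: float) -> float:
    """Percentage drop from the benchmark average to real gameplay."""
    return (benchmark_fps - gameplay_fps) / benchmark_fps * 100.0

if __name__ == "__main__":
    bench = 60.0     # hypothetical in-game benchmark average
    gameplay = 47.4  # hypothetical average replaying the same scene
    print(f"Deficit: {performance_deficit(bench, gameplay):.1f}%")  # 21.0%
```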
