Why Frame Rates Don't Tell Us Anywhere Near Enough About Game Performance

Written by Jon Sutton on Sun, Mar 11, 2018 2:00 PM

When benchmarking, it can be all too easy to get hung up on the metric of frame rates. It is, after all, the lowest hanging fruit. Not only is it the easiest metric to track, but it’s also the simplest to share and compare with others. In that sense, frames per second analysis does work, but as anyone who’s played a game with incessant micro-stuttering will attest, it doesn’t paint anywhere near the full picture.

Instead, a serious look at gaming performance should focus on the triumvirate of frames per second, frame times, and the 99th percentile.

Let’s take a brief look at what each of these methods shows us, and why they’re useful.

Frames per second (FPS)

Universally the most popular method of benchmarking performance, average frames per second is calculated by dividing the total number of frames rendered by the number of seconds the benchmark runs for. This provides the average frame rate over the entire course of a benchmark run. While undeniably useful, reducing a five-minute benchmark to a single figure is extremely reductive. It provides an overall indicator of the performance level, but it has no way of conveying important information such as frame drops or stuttering.

Let’s take a one-minute benchmark as an example. Our average FPS is 60 frames per second. Ideally, this would be exactly that, 60 frames each and every second. But what if the first half of the benchmark is a nearly unplayable 15fps due to a particularly demanding scene, and the second half of the benchmark is a very smooth 105fps? The basic FPS maths here is 30 seconds x 15 (for the first half of the benchmark) plus 30 seconds x 105 (for the second half of the benchmark). This gives us a grand total of 3600 frames rendered for the minute, which is 60 frames per second. On the surface, we’ve got a fantastic frame rate, yet for half the time the game is practically unplayable. It's an extreme example, but you get the point.
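
The arithmetic of that example can be sketched in a few lines (a toy illustration, not a benchmarking tool):

```python
# Toy recreation of the worked example above: a 60-second benchmark
# whose average looks great even though half of it is nearly unplayable.
def average_fps(fps_per_second):
    """Average FPS over a run, given the frame count of each second."""
    return sum(fps_per_second) / len(fps_per_second)

run = [15] * 30 + [105] * 30   # 30s at a choppy 15fps, 30s at a smooth 105fps
total_frames = sum(run)

print(total_frames)       # 3600 frames rendered over the minute
print(average_fps(run))   # 60.0 -- a "fantastic" average hiding the bad half
```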

For a real-world look at FPS data in action, let's take DOOM and Batman: Arkham Knight, benchmarked at 1080p screen resolution with a GeForce GTX 1060, 16GB DDR4 RAM and an Intel Core i7-5820K CPU.

Kind of similar, right?

Additional depth can be provided for FPS results by including the minimum and maximum frame rate, switching from a single data point to three data points.

This is much handier, although there are still two huge gaps in this data set: we have no idea how much time is spent at those high or low points, only that each is hit at least once.

Frame times

The second method is frame times, an incredibly handy metric for identifying uneven performance in a benchmark that otherwise produces solid results. When a game stutters it just doesn’t feel right, and frame times above around 50ms can start to become noticeable. Major hitches can push that to 100ms and beyond, the equivalent of 10fps or less, albeit for just a fraction of a second. Beyond the minimum frame rate, standard FPS benchmarking has no way to convey this data; a hitch merely drags down the average FPS by a tiny margin.
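
Those millisecond thresholds are just the reciprocal of frame rate: frame time in ms equals 1000 divided by fps. A quick converter makes them concrete:

```python
# Frame time and frame rate are reciprocals: frame_time_ms = 1000 / fps.
def fps_to_ms(fps):
    return 1000.0 / fps

def ms_to_fps(frame_time_ms):
    return 1000.0 / frame_time_ms

print(round(fps_to_ms(60), 1))   # 16.7ms -- the per-frame budget for 60fps
print(round(fps_to_ms(30), 1))   # 33.3ms -- the budget for 30fps
print(ms_to_fps(50))             # 20.0fps -- the ~50ms "noticeable" threshold
print(ms_to_fps(100))            # 10.0fps -- a 100ms hitch
```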

Handily, if you’re benchmarking with a tool like FRAPS, frame times can be recorded automatically by selecting the appropriate toggles in the Benchmark Settings. The gathered CSV data can then be viewed in spreadsheet form or, as I often do, dropped into FRAFS Bench Viewer (separate from FRAPS), which does all the hard work for you.
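
If you’d rather crunch the numbers yourself instead of relying on FRAFS, the CSV is easy to process. A minimal sketch, assuming the frametimes file holds a frame number and a cumulative timestamp in milliseconds per row (the exact columns can vary between FRAPS versions, so check your own capture):

```python
# Sketch: turn a cumulative-timestamp frametimes CSV into per-frame times.
import csv

def load_frame_times(path):
    """Return per-frame render times (ms) from a cumulative-timestamp CSV."""
    timestamps = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            timestamps.append(float(row[1]))
    # Each frame's render time is the gap between consecutive timestamps.
    return [later - earlier for earlier, later in zip(timestamps, timestamps[1:])]

# Usage, once you have a capture on disk:
# frame_times = load_frame_times("frametimes.csv")
```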

The overall aim here is to have the yellow line be as tight as possible. I’ve used the example of DOOM because it runs excellently, and that thick yellow line is exactly what we’re aiming for. The dotted line tagged 16.7 ms marks 60fps, and any frames below this dotted line are rendering faster than 60 frames per second. Meanwhile, 33.3 ms represents 30fps, so anything above this is being rendered at less than 30 frames per second.

DOOM Frame Times

There are two spikes where DOOM drops below 30fps, although these are each for a single frame. Over a 3-minute run that’s pretty decent, and rare enough that you’re going to struggle to notice it. These high frame times likely tie in with loading in new areas, such as when I took a lift to a downstairs room in DOOM. Aside from this though, you can see DOOM justifies its 87fps average frame rate, with nice consistent frame times that very rarely rise above 16.7ms.

If we take a notoriously bad performer like Batman: Arkham Knight, which has improved considerably since launch, we can see the frame times, while largely very good, lack consistency, with dozens of stutters every minute.

Batman: Arkham Knight Frame Times

Arkham Knight runs fantastically on the GTX 1060 used, averaging 77fps, but you can really feel the stutters when spinning the camera around, driving the Batmobile, or changing direction quickly. This gives us some valuable data that a simple fps figure can’t replicate.

99th Percentile

Going a step beyond frame times is the ‘99th percentile’. This is a method for identifying the worst 1% of performance achieved, and it can be used in two ways: we can omit performance oddities and get a more accurate figure for general gameplay from the remaining 99%, or, conversely, we can isolate the offending 1% of worst frame times and see just how badly they affect performance.

The benefit of the 99th percentile over raw frame times is that, like fps, it’s easily quantifiable, so results can be shared and compared with other gamers. It takes into account both frame times and frames per second and delivers a millisecond value that captures the biggest dips in performance. Unlike FPS, the figure also aligns much better with how a game actually feels to play, rather than spitting out a generic number.
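
Computing the figure is simple enough to sketch in a few lines: sort every frame time in the run and read off the value that 99% of frames come in under. The trace below is invented purely for illustration:

```python
# The 99th percentile in a few lines: sort every frame time in the run
# and read off the value that 99% of frames come in under.
def percentile_frame_time(frame_times_ms, pct=99.0):
    ordered = sorted(frame_times_ms)
    idx = min(len(ordered) - 1, int(len(ordered) * pct / 100.0))
    return ordered[idx]

# An invented trace: 990 smooth 10ms frames plus 10 hitches at 40ms.
trace = [10.0] * 990 + [40.0] * 10
p99 = percentile_frame_time(trace)
print(p99)            # 40.0 -- for 1% of frames, 40ms or worse
print(1000.0 / p99)   # 25.0 -- i.e. those frames are running at 25fps
```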

The 99th percentile can also be mapped out using a combination of FRAPS and FRAFS. In the Arkham Knight example it encompasses every frame that takes 20.5ms or more to render. There are far more egregious spikes within that worst 1%, but we can already see that despite an average of 77fps, for 1% of the time the frame rate dips below 49fps.

Batman: Arkham Knight 99th Percentile

Compare this with DOOM where the 99th percentile is 15.2ms, still below the 16.7ms frame time required for 60fps.

DOOM 99th Percentile

As with frame times, we want this chart to be as flat as humanly possible. The worse the performance, the higher the peak at the end. If you benchmark a game and the percentile graph starts to look like the curve of a hockey stick, you’re suffering serious performance issues.

If you then want to distil this data into comparison graphs, that can be done with the average frame times, the 99th percentile frame times, and the worst offender of all: the 0.1% lows (99.9th percentile frame times).

DOOM and Batman: Arkham Knight 99th Percentile Performance Comparison

(the lower the better)

How Frame Rates, Frame Times and 99th Percentiles can Combine

Combine all three of these methods of analysing frame rates and we get a much more accurate picture of performance in DOOM and Batman: Arkham Knight than the simple 87fps and 77fps average frame rates provide. We know that the minimum frame rate is 65fps and 53fps respectively. We can also see the frames are much more evenly distributed while playing DOOM, with several harsh spikes occurring while playing Batman: Arkham Knight.

Finally, with the 99th percentile analysis we get a good picture of how often both these games actually drop below their per-second minimum frame rates. It sounds counter-intuitive, but we’re breaking performance down frame by frame rather than second by second. For 1% of the time, Batman is at 49fps or lower, while DOOM is at 66fps or lower. For 0.1% of the time, Batman dips all the way down to 26fps, while DOOM holds steady at 60fps. Despite just a 13% difference in average FPS, at its worst Batman runs around 57% slower than DOOM.
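
The whole pipeline can be condensed into one helper that reduces a list of per-frame times to the three headline figures. The trace here is invented, shaped only to loosely mirror the Batman numbers quoted above:

```python
# One helper that reduces a list of per-frame times (ms) to all three
# headline numbers: average fps, 99th percentile fps and 99.9th (0.1% lows).
def summarise(frame_times_ms):
    ordered = sorted(frame_times_ms)

    def pct(p):
        return ordered[min(len(ordered) - 1, int(len(ordered) * p / 100.0))]

    avg_ms = sum(ordered) / len(ordered)
    return {
        "avg_fps": 1000.0 / avg_ms,
        "p99_fps": 1000.0 / pct(99.0),    # the 1% lows
        "p999_fps": 1000.0 / pct(99.9),   # the 0.1% lows
    }

# An invented trace shaped to mirror the Batman figures quoted above.
trace = [12.5] * 950 + [20.5] * 49 + [38.5]
stats = summarise(trace)
print({k: round(v) for k, v in stats.items()})
# {'avg_fps': 77, 'p99_fps': 49, 'p999_fps': 26}
```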

This is all crucial information that frames per second just cannot provide.

15:52 Mar-14-2018

Consistent fps is my motto.


I won't call a low fps bad if it's fixed in place.


That's why I always look at the minimum fps.

15:34 Mar-12-2018

frame times aka frame pacing is what really matters - it's the consistency of frame sequences displayed on screen, inconsistency results in perceivable stuttering :)

15:55 Mar-12-2018

Which is usually associated with bad pairing of hardware... for example, an i3 dual core running a GTX 1070...

18:40 Mar-12-2018

I'd almost bet i3 paired with GTX1070 is still not a noticeable bottleneck in games which don't effectively utilize more than 2 cores ;)

19:07 Mar-12-2018

Hmmm...I wouldn't test it out lol

16:09 Mar-12-2018

TZZ: I have a good example to my point. As most of you here know, I'm currently using my i5/IGPU HD 4600 due to a GTX 1060 RMA. When I play Diablo III I get all sorts of stuttering that I never saw with my GTX 1060. I get that stupid hitching issue as well...

18:42 Mar-12-2018

that is a good example,
I remember when I sold my i3-2130 (only integrated gpu) rig to a friend, he took his old HD4850 gpu and could play plenty games at very nice framerates :D

18:30 Mar-12-2018

@Tzzsmk Yes I've been repeating that here non-stop, but it seems to be a hit or miss with people.


@Maleficarus Not necessarily. quick change in scenery, sudden effects, poor optimization in the form of overhead and stalling and others(potentialy) can cause frame times to fluctuate leading to bad frame pacing.

21:48 Mar-14-2018

the Subnautica early access was for a time riddled with massive frametime issues; there would be moments when you could hit 75fps but the stutter was so bad that the game was unplayable, and the only solution was playing the game in windowed mode and capping the FPS in Radeon Software to 40FPS, also playing off a RAM disk, it was weirdly read speed bound

08:51 Mar-15-2018

Yup 40fps with consistent frame times over 75+fps with terrible frame times and it's really weird for disk speeds to affect game performance

06:28 Mar-12-2018

well the main info is within fps, like if its 6fps average or 60fps average, there are always going to be abnormalities, since 99% of computers/gaming computers are different,with other software installed.

21:57 Mar-11-2018

I will not lie, I cannot even watch movies at the cinema anymore because I have spoiled myself with 60fps+. I can only watch movies at home, because mine is the only setup in my country configured for 60fps viewing. So games without good graphics settings and with low frame rates (below 60fps) are unacceptable.

07:34 Mar-12-2018

What kind of movies do you watch in 60 fps..?

09:34 Mar-12-2018

all movies

10:40 Mar-12-2018

normally when i turn on Frames counter during any TV show or movies its around 29-30fps

10:51 Mar-12-2018

as it should be - more than 30 fps ruins the natural flow & feeling of the movie. I mean, sure, for CGI action sequences, it's probably better, but for a normal movie, where there is a lot of talking, its kind of annoying

11:16 Mar-12-2018

wut

17:47 Mar-14-2018

nowadays TVs offer some sort of automated frame interpolation, hence why TVs are often advertised as 100Hz, 200Hz or more,
they automatically add frames for more fluid motion feel,
it's a bit strange almost no playback software offers that for PC/Mac/Linux, I can remember only "Splash 2" which can do 60fps realtime frame interpolation during playback,
long time ago I did a test of recalculating a game trailer from 30 to 60fps, result is here on youtube and there should be an original at youtube as well - if you play those at least 720p or 1080p, you should definitely notice a difference of motion "smoothness"

12:10 Mar-12-2018

You know most blockbuster movies are filmed at 24 frames per second, and a very few, like The Hobbit, at 48 frames. So playing a movie with Vsync fixed at 60fps at home is utterly pointless.
Televisions with 100Hz or 200Hz are just a gimmick when speaking of movies and television.
Running a 60Hz output won’t make any difference when the input and source are limited to 24 or 48 frames.

12:24 Mar-12-2018

Well a 100hz TV will typically play TV at 96Hz frequency which provides a better image than 24fps on a 60hz TV because it doesn't have to do a pulldown.


But, I'm hopeful more new TV and movies are broadcast at higher frame rates because I really don't enjoy 24fps during pans, makes me feel a bit sick.

13:58 Mar-12-2018

hahaha that's why even if I could I'd never buy top graphic cards. I used to have a pc with which i played everything maxed out 60fps for years...then came bf4 and you can only imagine how frustrating and disappointed I was and...

14:01 Mar-12-2018

and since I couldn't afford something better, it only got worse and I got angrier angrier and started to oc like madman..so in the end when the graphics card died I felt relief and sold everything. Years later and I'm happy with my lappy :D

21:11 Mar-11-2018

Anyone that claims 30 FPS is "good enough" is obviously in my opinion a long-time console gamer and not a long-time PC gamer. Thirty frames is not fine. Not today, not 10 years ago, not even 20 years ago. 60 FPS has always been the golden number ever since I can remember back in the Quake 2 days 1995/96. And this is the golden PC gaming standard ever since! Now of course with super high refresh rates this has changed but it still holds true that 60 is the least you ever want...

21:17 Mar-11-2018

60fps was the aim back in the day because gameplay time and actions were tied to the FPS. Now that in most games they are tied to a CPU counter, it really isn't needed. I've played both on PC and consoles since I was a toddler 16 years ago, for the past 3-4 years primarily on PC, and until 2017 (not including) I used to play 6-24 hours a day, and believe me it didn't matter as long as the game did NOT bind anything to the FPS.


Consistent Frame Rate and Frame Times is what truly matters, getting a constant 30/60/120fps at a constant 33/16/8ms frame time.

21:28 Mar-11-2018

I have to disagree with you on this one mate. Not counting the obvious games that actually require 60+ FPS to play properly like for example, Quake Champions but other games do too, like, for example Diablo III or how about CSGO?. What you are describing seems to me that you "settle" for 30 because on a console that is all you pretty much got up till recently really....

07:17 Mar-12-2018

CS.GO is precisely a game where input and actions are tied to the fps...

10:21 Mar-12-2018

Absolutely not. Games whose functions are tied to FPS literally break when they're not running at their intended framerate. Games that do that are rare, today and even years back.
It's overall stupid and lazy game design, which almost all developers tend to avoid.

12:19 Mar-12-2018

Just because game time is NOT tied to the FPS, does NOT mean anything else isn't.
CoD, for example, has Input tied to the FPS, everything else is separate.

14:23 Mar-12-2018

Why do you constantly capitalize every 'not' in your comments. There is absolutely no necessity to yell at anyone when you're having a civil argument, it just serves to diminish your point and to infuriate the other person. If you think that argument is a shouting match and those with higher voices usually win, then you're in for a big surprise.

14:23 Mar-12-2018

Everyone can read your comments perfectly well without you capitalizing every single negation in them.
Regardless, COD is primarily a console game, built for consoles first and PCs second. PCs usually get a lousy port of the game lately.
Because the game is built for consoles, they know they can tie input to framerate, since they will cap it at some point.

21:44 Mar-11-2018

I'm a long-time console player: 2005 to 2013 on console, 2013 to 2015 on PC, then PS4 until 2018, and now PC again. 30 was normal because I lived with it, but it's never been fine; 30fps is a bit skippy and unsmooth, and I can see the difference. 60 is the minimum to enjoy a game, 30 to play it.

05:01 Mar-12-2018

This guy does not deserve downvotes, separation of game physics/time and fps is clearly important. Consistent frame times is also obviously important, if you can't see why then you've never had micro stuttering/hitching.

07:18 Mar-12-2018

Thanks bud, finally someone who understands

08:06 Mar-12-2018

Stuttering, yeah, that's why consoles hold 30, so it doesn't swing between 50 and 30, which is noticeable. I don't get your downvotes lol

09:47 Mar-12-2018

I have accumulated haters at this point, people that downvote me for nothing, or just want me to be wrong, or just people that are conformists and want to project conformism on to others and blindly believe YouTubers, just because they are YouTubers...

09:49 Mar-12-2018

Free upvote for u haha.

14:23 Mar-12-2018

@Psychoman


When porting it to PC they disregard the fact that other PC's are going to have variable framerate and they don't incorporate any function that's actually going to deal with it. That's why those problems COD has are usually fixed by capping your framerate up to a point you're always able to reach.
Just as I said above, stupid and lazy game design.

22:17 Mar-11-2018

I agree with you. I believe that 60 FPS should be the bare minimum any game should run at, with anything above that a plus.
30 FPS was usually down to consoles being significantly weaker than PC and unable to hold 60 FPS across their games, but latest console systems are fairly powerful, and should have no problems running games at constant 60 FPS.

22:42 Mar-11-2018

The PS3 had 512MB, half for the GPU and half for the system, and it did well; 30fps was OK for it, though it got a bit weak later on.
The PS4, Pro and Xbox have enough power for 60, but devs put more graphics over 30fps instead (GTA V used FXAA and 2x MSAA, I think), and we should have the option to choose.

07:26 Mar-12-2018

A 60fps average with each frame taking between 16ms and 48ms will be much worse than a consistent 30fps with each frame taking 33ms. In the 60fps scenario you will experience anything from micro-stutters to plain stutters, while at 30fps you will have a smooth gaming experience, unless gameplay and input are tied to the FPS, in which case you are screwed either way.

10:55 Mar-12-2018

But why does 60 FPS have to run with a fluctuating frametime for you? Just as 30 FPS can have a constant frametime, so can 60. Many competent and well-made games run a constant frametime.

18:25 Mar-12-2018

It doesn't; I'm just stating that often, when a game is stressing your hardware to its limit (usually the GPU), it can NOT always produce consistent frame times. That's why capping might be better, so that you achieve constant frame pacing rather than leaving frame times to fluctuate up and down. I'm just giving examples: realistic, often enough true, but examples.
And NOT only do frame times fluctuate on their own; when the fps itself fluctuates, it makes them fluctuate as well. Again, capping at a constant 60fps is much better than the game running between 60 and 120fps non-stop.

18:28 Mar-12-2018

And I write "NOT" like this, or anything in all caps, to put Emphasis on it, NOT to shout it, sadly there is this dumb idea that if it is in caps its shouting and the default HTML font effects and others are garbage, even the bolding is meh. I do this because often time people miss keywords that just blend in, in a sentence and I myself miss them to and I've had stupid arguments just because of those misreadings on the other person or my part due to missing a "not"/"'t" or something like that, as you can see it catches your attention.

00:48 Mar-12-2018

I kinda agree, more on personal preference. 30 fps for me is acceptable BUT I don't consider it good. Any game that runs on my potato below 45 fps (personal taste) isn't good or enjoyable for me. I only settle for a 30 fps minimum if I really like the game and my PC can't run it at a higher fps.

01:53 Mar-12-2018

Once you've constantly gamed at 60fps, it certainly is hard to go back.
But we can't say that 30fps isn't "good enough", since a great majority still play on consoles where a lot still run at 30fps. And I would say a lot of these gamers are perfectly happy with what they have.
The mere fact that folks are of the opinion that it's "good enough" is proof that it is.

07:21 Mar-12-2018

Most of my life I have gamed at well above 60fps at 75Hz, now 60Hz. A lot of my console games run at 60fps unless split-screened, and as long as frame times are consistent and gameplay is NOT tied to the FPS, it's perfectly fine.

06:24 Mar-12-2018

dude you sound like an undercover graphics card seller, and your comment is a freaking advertisement. fps is solely down to preference; an elderly person would consider 120fps uncomfortable and straining on their eyes. Same with res: I'd prefer 4K, but I know others are perfectly comfortable with 1080p.

07:36 Mar-12-2018

there is a scientifically proven equation involving resolution, screen size and viewing distance that determines at what resolution, screen size (ppi, basically) and viewing distance a human with PERFECT eyesight can NOT see the pixels. Apple used to call it retina; that's where the name for their panels comes from.


here:
https://designcompaniesranked.com/resources/is-this-retina/


now give or take 5-10% from it just in case. Anything else is placebo, or you have beyond-perfect eyesight; 99% of the time it's the former.

19:15 Mar-11-2018

consistent frame rate and frame times are my motto, instead of a high average.
I do NOT mind playing at 30fps as long as it is 29-30fps 99.9% of the time. That's much better than a game swinging between 30fps and 60fps, or even 60fps and 120fps, like mad; even on my 60Hz monitor that still produces micro-stutters and even plain stutters...

20:55 Mar-11-2018

After being spoiled with freesync and 75hz, i have such a hard time playing some games at 30fps, they just hurt my brain. Some run perfectly but others are just blagh, Unity at anything below 40fps is just a giant headache.

21:12 Mar-11-2018

With Unity it's most likely poor frame times, some frames taking more or less time than others.


Again, consistent frame rates and frame times. If it runs at a constant 30fps, each frame must take 33ms with a +/-5% difference, so basically 31-35ms per frame in the worst case. Same with 60fps (scaled accordingly, of course) or any FPS amount.

21:14 Mar-11-2018

I went from a 75Hz monitor to a 60Hz monitor and I barely noticed the difference, I play console games which most run at 30fps and it takes me about 2-3 mins to adjust and see it perfectly well.


We are good at adapting. And we think that there is a big difference between 30fps and 60fps(in games that do NOT tie actions to the FPS, but to a CPU counter), because we usually watch them side by side and we are good at seeing differences, but when we adapt to just one or the other we are just fine.

21:22 Mar-11-2018

I agree but i think the reason we adapt between console and pc is the viewing distance. I don't PC game on my tv but when i play uncharted 3 at 30fps i def notice a difference. I also turn off things like trumotion and other effects so could be why some people don't notice a difference and some do. However on my screen 30fps compared to 75fps isn't even close i can def tell. Hell i even have some games that run at 75fps that are too "choppy" for my taste, FarmingSim17 being one of em.

21:25 Mar-11-2018

Again, fps doesn't matter if frame times are NOT consistent.
Even 120fps can look terrible if one frame takes 2ms and the next takes 14ms to render; it is still 120fps, but with micro-stutters, and is choppy as hell.

22:04 Mar-11-2018

Yea but I'm talking with low 5-6ms frametimes here, nothing like 30-40ms.

07:28 Mar-12-2018

At 75fps it's 13ms per frame if it's consistent, so you can NOT be running at 5-6ms per frame. If it has poor frame times, then to show micro-stutters it just needs to pace at 10-16ms, and to show full stutters at 7-19ms, while still averaging 75fps.

08:09 Mar-12-2018

Right but at uncapped fps like where frametimes actually matter, playing Siege, i can get 5-6ms.

18:43 Mar-12-2018

True, but this is situational, some games have just terrible frame pacing due to a constant change in scenery, poor optimization and many other factors.

17:29 Mar-11-2018

When all is said and done the best way to benchmark a game is to just play it and see how it feels.


If the performance doesn't affect the gameplay experience,then it's playable on that GPU


For stats,we can keep taking a GPU worse than this and see how low we can go.

17:55 Mar-11-2018

That's exactly right, it's why i don't care what one benchmark says. Origins for example is supposed to get 65fps on my rig, yea maybe in the desert with no one around, in towns i get 50 ish. Most benchmarks aren't anywhere near accurate, it's just a rough benchmark. Example? Car mechanic simulator says i should be getting 45fps in game benchmark, i get 73 constant with zero dips. Origins says 70 average, i get closer to 55. A benchmark would have to be the whole game to be accurate.

16:24 Mar-11-2018

It's always confused me why the rest of this data is so scarce when it comes to benchmarking.
Even if FPS is a good gauge of how your games might run, it doesn't show everything and it's not the most accurate metric, as explained in the post.
I hope you'll try to incorporate this information in your benchmarks going forward, because it's a lot of helpful information.

16:04 Mar-11-2018

Then I suggest you guys include this data too with every performance analysis! I now know why I still feel stuttering at high framerates. :D

15:54 Mar-11-2018

I think the majority of reviewers and youtubers are idiots when it comes to games and benchmarks and anything PC related. You can't judge a game on release day PERIOD. So many patches come along that a crap game on release day can be an amazing game a couple months later (CMS18 being one that I've had recently). It's the same with hardware reviews and AIOs that's another joke of a review. In-game benchmarks are notoriously poor as well.

16:18 Mar-11-2018

Well yes you can, and you should. 90% of the profit games earn comes from the first month, usually the first or second week.
Released games are supposed to be complete products, not something that achieves optimum playability months after release.

16:18 Mar-11-2018

People want to know how their games are going to run on the day they buy them, not sometime in the future. What's the point in buying a product if you can't really use it for months?
It would maybe be good to run benchmarks on release and again a month or two later to find out whether the game has improved at all, but completely scrapping benchmarks on release day is absurd.

17:49 Mar-11-2018

Wow neg rep heaven for no reason. First of all no game runs great on release day, with the VAST majority of different builds, and most reviewers testing on high/medium end rigs it completely nulls the benchmark and review. Even games that are supposed to be finished that get delayed don't even run right. Guess people forgot about Arkham Knight on release day right? Smooth fps might not always mean smooth gameplay. Example? For me FS17 gets constant 75fps but is far from smooth.

17:51 Mar-11-2018

What people don't realize is that everyone perceives everything differently. I have 15/20 vision, the slight bit of unsmoothness and screen tearing makes me nauseous. Some people tell me they never see screen tearing below or above refresh rate, that's how i know some people are totally blind. It's the same with sound, some fans are loud to some and not to others. This is why gameday benchmarks are pointless. Not everyone is running the same os, same hdd, same bios, firmware etc.

17:52 Mar-11-2018

All of those variables can cause completely different fps then what a reviewer will say, even with identical gpus/cpu combos, some people are even surprised that some mobos score higher in firestrike then others. Because everything in a PC can and will make a difference. Now downvote all you want for someone who actually does benchmarks and reviews and knows what he's talking about lol.

18:54 Mar-11-2018

For someone who claims to play around with benchmarking and reviews games, you don't seem to know what benchmarks actually are. They are a rough approximation of how a game might run on a system similar to yours, not a definitive answer of how it has to run. If a benchmark scores 60 on a build similar to yours, then you can be fairly sure that if you play the same game you'll get a similar framerate.

18:54 Mar-11-2018

There might be some anomalies or deviations in which certain hardware combinations can cause certain games to behave erratically or very differently from what was shown in a benchmarks, but those are usually few and far between. Every piece of silicon is different, some are better and some are worse, but there's rarely any noticeable performance impact between them, except for special scenarios.

18:54 Mar-11-2018

You state that everyone perceives everything differently, and then in the very next sentence you dismiss that claim because people don't see what you do.
Again, point of a benchmark is to give a rough approximation of how a certain product is going to run on similar systems to give people willing to buy it some degree of understanding what they're getting into.

18:54 Mar-11-2018

Also, the statement "no game runs great on release date" is just false. I've played hundreds of different games of almost every genre, made by hundreds of different developers, indie and AAA alike, that ran really well on their release date.
Just because today's AAA devs mostly release their games as glorified QA tests, which they then scramble to fix, doesn't mean that is or always was the case.

5
Rep
116
Offline
18:54 Mar-11-2018

Also, FPS doesn't show how smoothly a game will run, it shows how well it's going to run, which is essentially what the article was about, if you read it.
Smoothness is better measured by looking at frame times and the 99th percentile. It's pretty well explained in the article, and as someone who claims to dabble in benchmarks, you should know that.
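(A minimal sketch of the point being made here, using made-up frame-time numbers: a capture that averages a respectable FPS can still hide stutter spikes that only the 99th-percentile frame time reveals.)

```python
# Hypothetical frame times in milliseconds for a short 100-frame capture:
# mostly ~16.7 ms (60 FPS pacing), with five 50 ms stutter spikes.
frame_times_ms = [16.7] * 95 + [50.0] * 5

# Average FPS = frames rendered / total seconds elapsed.
total_s = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_s

# 99th-percentile frame time: 99% of frames finished at least this fast.
ordered = sorted(frame_times_ms)
idx = int(len(ordered) * 0.99) - 1
p99_ms = ordered[idx]

print(f"Average FPS: {avg_fps:.1f}")               # ~54 FPS, looks fine
print(f"99th percentile frame time: {p99_ms} ms")  # 50.0 ms, exposes the stutter
```

The average comes out around 54 FPS, which looks playable on paper, while the 99th-percentile frame time of 50 ms flags that the worst frames take three times longer than the typical one.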

4
Rep
49
Offline
admin approved badge
20:48 Mar-11-2018

Let's quote you then: running really well doesn't mean perfect. If the hundreds of games you played ran great on release date, then none of them would need patches, and that's just not true. Yes, it's possible a few indie titles come out running perfectly with no bugs or patches needed, but I'm guessing those are mostly very low-end titles. I know exactly how the 99th percentile, 98th percentile and frame times work; for a solid two months I did nothing but benchmark Ryzen on release day with different power configs, RAM speeds,

-2
Rep
49
Offline
admin approved badge
20:50 Mar-11-2018

RAM timings and so on. Most games never had consistent benchmarks, Rise of the Tomb Raider being the worst, which sometimes saw a deviation of 10fps in the minimums just between two runs (which is why when I see that game in a review I ignore it completely). Most people are obsessed with high FPS for no reason at all; unless you're playing CS:GO or Siege, where frame times are as important as ping, playing capped or with Vsync isn't an issue for most games.

0
Rep
116
Offline
21:39 Mar-11-2018

Running really well means running really well. I never said perfect, because true perfection doesn't exist; it's a myth.
Just because a game runs well doesn't mean patching it is unnecessary. Every single program except for the most basic ones has bugs, and continued support of a product by its developers is always a plus.

1
Rep
116
Offline
21:39 Mar-11-2018

Most, if not all, in-game benchmarks have some degree of randomization, which means no two runs are ever quite the same. That is why, if you want the most accurate result, you run the benchmark multiple times and take the median.
Even professional benchmarking tools don't provide 100% consistency.
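(The multiple-runs advice above can be sketched in a few lines; the run results here are hypothetical numbers for illustration.)

```python
import statistics

# Hypothetical average-FPS results from five runs of the same in-game benchmark.
# Run-to-run randomization means the numbers never match exactly.
runs_fps = [58.2, 61.0, 59.7, 60.4, 55.9]

# The median damps the effect of one unusually good or bad run,
# unlike the mean, which an outlier run can drag around.
median_fps = statistics.median(runs_fps)
print(f"Median FPS over {len(runs_fps)} runs: {median_fps}")
```

With these sample numbers the median is 59.7 FPS, and the one low 55.9 run doesn't skew the reported result the way it would with a plain average.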

2
Rep
116
Offline
21:39 Mar-11-2018

Most people are obsessed with FPS and frame times for a very simple reason: they show how smoothly and how well your game is going to run. The higher the FPS and the lower the frame time, the better. I never said Vsync or capped frame rates are an issue. I tend to run most of my games capped at 60 because that's my minimum requirement and the refresh rate of my monitor.

1
Rep
116
Offline
21:39 Mar-11-2018

I don't like anything below that, unless a game doesn't require accuracy or fast reflexes, in which case I can settle for anything above 30. But I'm not going to complain if something I play runs better than I expected it to.

1
Rep
9
Offline
05:17 Mar-12-2018

TheEmperor96, I want to applaud you for these factual and true comments. Despite the other guy flaunting his reviewing and benchmarking endeavors, you stand by logic and rationality. Props to you.

0
Rep
49
Offline
admin approved badge
08:12 Mar-12-2018

Db, why don't you suck up to him some more. I didn't reply because I already know where he's correct, which is why I didn't argue. But he's still wrong about patches and updates. Like I said before, if a game were perfect on release day it wouldn't need patches/updates for any reason other than adding content, and if you read patch notes you'd see there are ALWAYS bug fixes and performance changes. You're telling me you still use outdated Windows/Steam/Origin/Uplay/VLC/CCleaner clients? Doubt it.

0
Rep
49
Offline
admin approved badge
08:13 Mar-12-2018

And by the way, 90% of profits aren't made on release day; you're thinking of consoles. That just doesn't happen on PC, because we know better, though yes, some idiots will still buy games on release day knowing they're going to be unoptimized and buggy. Same reason Titan X Pascal owners went and bought the Titan Xp: there's a sucker born every day.

0
Rep
49
Offline
admin approved badge
09:38 Mar-12-2018

Also, you guys do know that two IDENTICAL systems can have pretty wildly different FPS/frame times, right?

0
Rep
116
Offline
11:11 Mar-12-2018

It feels like you're reading half of what I said and skipping the other half.
Like I said before, no game is perfect, but that doesn't mean it has to run badly on its release date. Bug-fixing patches will always exist because software always has bugs, and a performance improvement is always welcome even if the game already runs really well on release date.

0
Rep
116
Offline
11:11 Mar-12-2018

I also said 90% of the profit is made during the first month, not the first day. There are some anomalies, as there always are, but the overwhelming majority of new releases sell their biggest share during the release month. I would like to live in a world where PC users knew better, but many of them don't. That's mostly the reason I like this site; most people here know what they're talking about.

0
Rep
116
Offline
11:11 Mar-12-2018

Two identical PCs can have some pretty wild differences, but that's very rarely the case. As I mentioned above, anomalies and special cases can happen, but usually if you have the same build as someone else, and you're both running your OSes clean and well, you'll most probably have very similar, sometimes near-identical experiences.

0
Rep
49
Offline
admin approved badge
13:50 Mar-12-2018

Yeah, but not many reviewers run their benchmarks on a clean Windows install; that's what I'm telling you. It's why what someone else gets won't automatically be what you get. I get downvoted for speaking the truth; it's insane. If you really think 90% of profit is made during release day, or even release week, that's just consoles, man. PC owners tend to wait for reviews before buying a game; we know better.

1
Rep
116
Offline
14:04 Mar-12-2018

You'll never get exactly what they show; are you even reading what I'm writing here? It's an approximation of how it could run, not how it will run. If you know how to keep your system healthy, even a year-old system won't deviate that much.
I also think you're giving way too much credit to PC owners; they don't know as much as you claim they do.

0
Rep
49
Offline
admin approved badge
14:39 Mar-12-2018

Yes, I get that, but benchmarks aren't remotely close to the performance of the entire game. It doesn't mean anything; it's just there for s**ts and giggles. Origins is a prime example of that: nowhere do I get remotely near what its benchmark shows.

1
