Monster Hunter World Uncapped Frame Rate on PC Confirmed, GTX 1080 Performance Revealed

Written by Jon Sutton on Wed, Jul 25, 2018 2:34 PM

When the system requirements for Monster Hunter World were revealed a few weeks back, there was equal parts confusion and worry over the recommended specs. A GeForce GTX 1060 was recommended to play MonHun World at 1080p / 30fps, prompting concerns there may even be a frame rate cap in Monster Hunter World.

Fear not though, for ResetEra member FluffyQuack was allowed some hands-on time with the PC release of Monster Hunter World and can confirm that there is no frame rate cap in MonHun World whatsoever.

Monster Hunter World runs on Capcom’s proprietary MT Framework engine. Most games on this engine are capped at 120fps, but Monster Hunter World is uncapped on PC. However, if you want to hit these high frame rates you’re going to need a very beefy system.

“I got permission from a Capcom rep to post impressions of a review build of the game,” wrote FluffyQuack. “The game supports framerates above 60fps (the framerate lock options are 30, 60, and “no limit”). Unlike most MT Framework games, the game can go beyond 120fps, though you’ll need a very beefy PC if you want to get close to 100 with settings around max.”

As for just how beefy a PC is required, it turns out Monster Hunter World is a fairly demanding beast. With a GeForce GTX 1080, Intel Core i7-4790K and 16GB RAM, the following average frame rates were achieved in the starting hub at 1440p resolution:

  • Low: 108fps
  • Medium: 65fps
  • High: 60fps
  • Ultra: 44fps

Unsurprisingly, volumetric lighting is a demanding graphics setting in Monster Hunter World, and dropping this from Ultra to High pushes the average frames per second up to a respectable 50fps. Based on this, it seems reasonable to expect that a GeForce GTX 1070 should be able to push a solid 60fps at 1080p Ultra, although performance on a GTX 1060 or AMD Radeon RX 580 may be a little more concerning.
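A frame-time view puts those averages in perspective (this is just a 1000/fps conversion; the only figures taken from the article are the averages themselves). The jump from High to Ultra costs about 6 ms per frame:

```python
# Frame-time view of the article's 1440p averages on a GTX 1080.
# fps -> ms/frame is simply 1000 / fps.
averages = {"Low": 108, "Medium": 65, "High": 60, "Ultra": 44}

for preset, fps in averages.items():
    print(f"{preset:>6}: {fps:3d} fps ~ {1000 / fps:.1f} ms/frame")
```

Low comes out around 9.3 ms per frame and Ultra around 22.7 ms, which is why the Ultra preset feels like a different performance class despite "only" a 64fps gap from Low.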

A major thing to remember with these results is that we’re still a fortnight out from Monster Hunter World’s August 9th PC release. There could still be optimisation work to be done, and Nvidia has yet to release its Game Ready driver for MonHun World.

Visually, Capcom has said it's aiming for graphical parity with the console release.


Rep
95
Offline
12:08 Aug-02-2018

IGN has a video on MHW running at low/high/max 1080p with a gtx980.
It's crap (as a benchmark) but it is there for anyone interested :)

0
Rep
30
Offline
12:01 Aug-02-2018

So I suppose I should be good, good thing I upgraded

0
Rep
272
Offline
admin approved badge
15:48 Aug-01-2018

GREAT NEWS! Really happy with that :)
If they add SLI functionality so I can use both of my 1080Tis then I'll be able to crank the settings up sky-high :)

-4
Rep
28
Offline
20:20 Aug-01-2018

Should look great in 4K, but don't hold out for 4K at 60fps, as the game is poorly optimised. If you want a locked 60fps, go 1440p, but then that will only match the PS4 Pro's graphics and res, just with a slightly higher frame rate.

0
Rep
28
Offline
20:22 Aug-01-2018

As Capcom have said, they are keeping visual parity with the console versions, so the PC version won't exceed console graphics; you need a 1080 Ti to get a locked 60fps at the same graphics and res as the PS4 Pro. That's how unoptimised the game is.

0
Rep
272
Offline
admin approved badge
21:46 Aug-01-2018

I'm not looking for 60fps, I'm looking for something much higher. 60fps feels like lag on my 165Hz screen :)

-4
Rep
28
Offline
01:35 Jul-30-2018

So the PS4 Pro can run this at a stable 30fps @ 1440p with ultra graphics settings, yet a much more powerful 1080 can only manage 44fps? Must be poorly optimised.

0
Rep
69
Offline
15:10 Jul-30-2018

At what graphical settings does the PS4 Pro run it? I don't own one.

0
Rep
272
Offline
admin approved badge
15:32 Aug-01-2018

If you think the PS4 is running an "ultra" preset then I suggest you get some sleep and come back later :)

0
Rep
28
Offline
20:14 Aug-01-2018

Capcom have said themselves the PC version is going to keep visual parity with the console version, so it won't have better visuals; the highest preset on PC will only match consoles, not exceed them.

-1
Rep
272
Offline
admin approved badge
21:49 Aug-01-2018

There are many nuances, like LODs, draw distances, dynamic resolution rendering, etc, that the PC can render at higher detail/distance/density than the console, yet using the same assets. "Visual parity" does not mean "it will look the same"; it merely means "we didn't remaster the assets". The fact that the framerates are uncapped is alone a huge jump in visual fidelity (to me, anyway) compared to the cancer framerates of the PS4.

1
Rep
106
Offline
admin approved badge
00:20 Jul-30-2018

So what do you guys think: if I run this game at 1440p, can I get high settings at 60fps? I kind of wish for higher fps, but oh well

0
Rep
-3
Offline
06:20 Jul-29-2018

I might skip this one. Honestly, I play games at 1280x960 just because my monitor supports 75Hz at that res and not at 1080p. So if I can't reach over 75fps at that resolution, no thank you

0
Rep
386
Offline
admin approved badge
10:21 Jul-29-2018

Meh, as a person who went from 1680x1050 75Hz to 2560x1080, the 15Hz difference isn't noticeable.

2
Rep
-3
Offline
10:48 Jul-31-2018

I guess it depends on the person then. For me it's day and night difference. 60 FPS is a no no for me now that I experienced 75.

0
Rep
272
Offline
admin approved badge
15:50 Aug-01-2018

I have to disagree. 75Hz is VERY noticeable. If ever my GPU driver at work craps out and the overclock on the screen drops from 75Hz down to the original 60Hz - I notice within seconds.

0
Rep
386
Offline
admin approved badge
18:32 Aug-01-2018

Well, that's the thing, you are switching between them. Try doing this: play on 75Hz one day, switch to 60Hz without playing, and then play at 60Hz the next day. Our eyes (our brains, more accurately) are good at noticing sudden differences.

0
Rep
272
Offline
admin approved badge
21:53 Aug-01-2018

Dunno, man, it can happen overnight and I'd notice in the morning. I also don't like sitting at other people's computers (my friends, my gf, etc) due to the 60Hz screens feeling slow.
Maybe you're not as sensitive to refresh rates, which is ok. If 30fps works for you - that's fine. But to suggest that lower framerates must work for everyone - that's just silly.

0
Rep
386
Offline
admin approved badge
09:58 Jul-28-2018

Well, considering how good the game looks and that it's not even out yet, for a mid-range GPU to run it at 1440p high with 60fps I think is great, except that the mid-range GPU costs 600 euro... -_-

-5
Rep
38
Offline
10:03 Jul-28-2018

gtx 1080 mid range? Yeah right lol

6
Rep
386
Offline
admin approved badge
11:41 Jul-28-2018

It is... It's got a 312mm^2 die; the GTX 560 has a 332mm^2 die... Also, the GTX 1080 is a GP104 (GeForce Pascal 104) chip, and the GTX 560 is a GF104 (GeForce Fermi 104) chip...
The GTX 560 came out costing $200 at launch ($254 in 2016 dollars); the GTX 1080 cost $700 at launch... Not to mention that Pascal had a 2x lower failure rate than Fermi...

-2
Rep
386
Offline
admin approved badge
11:43 Jul-28-2018

Stop looking at naming schemes... And stop looking at performance; neither of those are price and cost factors...


Also Fermi had higher R&D budget than Pascal

-3
Rep
386
Offline
admin approved badge
11:45 Jul-28-2018

Nvidia has been selling the GTX 1000 series GPUs at 100% to 134% profit margins since they launched, and even higher with the Founders Edition bullshiz...


Like people blame Apple for being overpriced, but Apple operates at around 40% profit margins

-3
Rep
93
Offline
21:40 Jul-28-2018

So by your logic are any recent Graphic cards "high end"?

7
Rep
386
Offline
admin approved badge
08:10 Jul-29-2018

No... Any GPU with a big die is a high end GPU...
Performance doesn't determine if a chip is high-end or not

-4
Rep
93
Offline
19:13 Jul-29-2018

But haven't transistors been getting smaller, so they can fit more onto a smaller die? Plus it makes no sense for a high-end GPU to be many times less powerful than a mid-range GPU; honestly, I think die size determining the tier of a card is BS.

4
Rep
38
Offline
20:43 Jul-28-2018

lol the point is that there's literally only 2 cards that are somewhat affordable and meant for gaming that would outperform the 1080 constantly. It being in the top 3 of gaming cards kinda makes it a high range card no matter what.

7
Rep
386
Offline
admin approved badge
20:56 Jul-28-2018

Performance isn't a price nor a tier factor... A GTX 280 is high-end to this day, since it has a 567mm^2 die and would cost Nvidia just as much to produce on the 65nm process node as it did back in 2009


Die size, R&D and failure rate are price and cost factors, along with wafer cost and VRAM cost (8GB of GDDR5X costs $11.50 to AMD and Nvidia)

-6
Rep
38
Offline
00:37 Jul-29-2018

Why would I measure it like that?
Something is high end if it's among the best of a generation and the gtx 1080 is. Everything else is completely unimportant and I see no reason why anyone would care about the die size whatsoever.


What it comes down to is only the performance and where it was placed by nvidia, and as we all know the xx80 (ti) is always on top of the normal gaming cards.

7
Rep
38
Offline
00:39 Jul-29-2018

It's not about what is (was) the highest end technology available to the manufacturer, but about what is the highest end Hardware available for the customer.

5
Rep
93
Offline
00:54 Jul-29-2018

Thank you.

4
Rep
386
Offline
admin approved badge
08:14 Jul-29-2018

I measure it like that because that is how it is measured... It's NOT subjective. And the GTX 1080Ti/GTX Titan Xp are GP102 at 471mm^2, not the best Pascal GPU Nvidia has/had to offer; Nvidia has/had GP100 at 602mm^2 that they didn't release due to lack of competition from AMD, so there could have been an even better GPU than the GTX 1080Ti, a GTX 1090 for example...

-5
Rep
386
Offline
admin approved badge
08:25 Jul-29-2018

And if performance were a price factor, then the gtx 1080ti, being 62400% better than the GeForce 256, should cost $187,200 without adjusting for inflation
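The reductio here checks out arithmetically, assuming the ~$300 GeForce 256 launch price that the commenter's figure implies (that baseline is an assumption, not stated in the thread):

```python
# If price scaled linearly with performance:
# "62400% better" means 624x the performance, so 624x the price.
geforce_256_price = 300       # assumed baseline launch price, USD
perf_multiple = 62400 / 100   # 62400% expressed as a multiplier
print(geforce_256_price * perf_multiple)  # 187200.0
```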

-3
Rep
38
Offline
21:36 Jul-29-2018

You don't get the point at all (answering the other comment I can't reply to). Even if it's not the best they could offer at the time, and thus not the highest tech available, it's still in the high end if it's among the best products of its generation, which it is. That isn't debatable in any way; that's how it is. Not "high tech", still the high-range product.

5
Rep
386
Offline
admin approved badge
22:29 Jul-29-2018

facepalm...
How can I explain to you that performance and being "the best available" are not criteria... even if the gtx 1080 was the best GPU until this day, it is still a mid-range GPU... -_-


Performance is NOT a tier(low, mid, high-end) factor, Die Size is, it's not subjective and opinions don't matter, those are facts... you can accept them or not, but that's the truth -_-

-2
Rep
93
Offline
23:47 Jul-29-2018

So a new GPU with a die size double that of the 1080 but half the performance would be a high-end card? 99.9% of people will say the 1080 is a high-range card. What has die size got to do with anything? They can fit more on a smaller die due to the fact (correct me if I'm wrong) that transistors have gotten a lot smaller over time.

2
Rep
386
Offline
admin approved badge
23:52 Jul-29-2018

Die size is what determines if a GPU is high-end or low-end or in-between.


And 99.9% of people used to say that the earth is flat... they were not correct...


Performance is a byproduct... A chip is high-end based on die size, and the cost factors are die size, R&D costs and failure rate, along with wafer cost and VRAM cost (the last two just go with inflation, and VRAM is super cheap, $11.50 per 8GB GDDR5(X) module).


Die size relative to the biggest possible to produce and cool die is what matters... Again, not opinion, facts... people can say whatever they want, especially ignorant youtubers and reviewers...

-3
Rep
386
Offline
admin approved badge
23:52 Jul-29-2018

Well there are a couple youtubers that actually know that Die size is what matters...

-2
Rep
93
Offline
00:47 Jul-30-2018

Almost no one buys hardware based on die size. Everyone looks at the things that matter: specs, clock speeds, VRAM, performance etc... Since there is "no high end card", that would automatically put the 1080(Ti) as the highest tier card.
The tiers go something like this: XX30 = low end; XX50, 60 = mid-range; XX70, 80 = high end; XX80Ti/Titan = enthusiast. Not: XX30 = low end; XX50, 60, 70, 80, Ti, Titan = mid-range.


I do understand your points, however they don't really apply to anyone except those couple of Youtubers and you.

2
Rep
386
Offline
admin approved badge
01:04 Jul-30-2018

Look, the performance of a card doesn't make it high-end or low-end...
And these points apply to everyone, like it or not...
And the gtx 1050ti and gtx 1060 are low-end... -_-
The gtx 1050 and gt 1030 are entry chips...


And people don't buy them based on that because they are ignorant... that's why they think that if a card's name is "XX80" it is a high-end GPU and should cost appropriately, when the gtx 1080 should have cost $300 MSRP at launch in 2016, when the gtx 560 cost $200 ($254 in 2016 dollars) at launch in 2011 and was a much more expensive card due to higher R&D costs and higher failure rate, along with a bigger die..

-1
Rep
13
Offline
02:38 Jul-30-2018

I OBVIOUSLY bought my 1080 because i wanted a Mid range Card and wanted that die size. Had nothing to do with that it outperformed almost every card out there and i wanted the best....sorry...that came out wrong....wanted mid tier best.

5
Rep
386
Offline
admin approved badge
10:40 Jul-30-2018

why do people care if their GPU is mid-range or high-end?... -_-
Performance and tier (low-end, mid-range, high-end) are separate... There can be a GPU that is 10x better than all the rest of the GPUs for 10 years and it can still be a mid-range or low-end card, or a high-end card; it's all about the die size... -_-


GPU tier doesn't represent performance; it represents die size and thus the cost and the price the chip should be sold at... -_-

0
Rep
386
Offline
admin approved badge
10:42 Jul-30-2018

As I said, performance is a byproduct... stop caring whether your GPU is high-end, low-end, entry, etc... care whether the performance it gives you is good enough for your needs... The only reason you should care if a GPU is high-end, low-end or mid-range is its price...


The gtx 1080 should have come out costing 300-330$ MSRP in 2016 and NOT 700$... At 700$ that was above 100$ profit margin... -_-
Nvidia is technically 3x times more overpriced than apple... and I bet you people think apple is overpriced...

0
Rep
386
Offline
admin approved badge
15:55 Jul-30-2018

100%* not 100$ my bad...

0
Rep
93
Offline
19:59 Jul-30-2018

Let's just agree we both have different definitions of "high end" and leave it at that, as arguing is getting us nowhere. For me, high end is the highest tier card that is AVAILABLE to the public, not what could've been available, as that basically means nothing is high end.


Also, what about CPUs? Are the highest-performance ones still mid-range if their die is a little small?

1
Rep
386
Offline
admin approved badge
20:59 Jul-30-2018

-_- when will you understand that this is NOT an opinion and that it's a FACT...


Thread-ripper 12 and 16 core are in the lower-end of high-end, Ryzen 7 is mid-range, ryzen 5 6 core is low-end, Ryzen 3 and ryzen 5 quad cores are entry level.
i7 8xxx is mid-range, so is the i5 8xxx the i3 8xxx are low-end.


CPUs can't be cooled as easily as GPUs and thus they have smaller die sizes, along with AMD and Intel wanting them to have as small die sizes as possible for the biggest yields as possible, that's why intel's cores have barely increased in transistor count since sandy bridge, but have become smaller and smaller.

-1
Rep
386
Offline
admin approved badge
21:05 Jul-30-2018

Otherwise, a really proper high-end CPU chip would be a 12/14nm 32 core Threadripper(zen) or 12-18 core i9/i7 or whatever.

-1
Rep
386
Offline
admin approved badge
21:07 Jul-30-2018

I never talk about opinions when it comes to technology... only facts... Die size applies to every IC chip, not just CPUs or GPUs or computer hardware... -_-

-1
Rep
93
Offline
21:28 Jul-31-2018

Well, the next time I buy a GPU I will definitely look only for the die size on the box and completely ignore all other specs and how well it performs, because they're not important. Oh, and if it performs badly I'll just increase the die size, which will magically increase its tier from mid-range to high. Thanks.

0
Rep
93
Offline
21:32 Jul-31-2018

Also, there really are no official tiers of GPUs; there isn't a rule book that states what tier a GPU is. It's all down to what the user sees as important: if you measure tiers on performance, fine; if you measure them on specs such as shaders and VRAM, fine; if you measure them by die size, by all means go ahead.

0
Rep
93
Offline
21:36 Jul-31-2018

But I've been meaning to ask the question: what ON the actual die makes a difference? Is it the higher transistor or core count? Because if you're basing it just off the size and completely ignoring what is ON the die itself, then you're just making empty arguments.
So in short: is it just the size of the wafer and nothing else, or how much of a certain thing they can fit on it?

0
Rep
93
Offline
21:36 Jul-31-2018

*silicon not wafer.
I hope you understood; I suck at explaining things :P

0
Rep
386
Offline
admin approved badge
22:34 Jul-31-2018

there are tiers of die sizes, measured in quarters cut down from the biggest possible die and that's how they cut them. The GP100 is 2x(100%) times bigger than the gtx 1080(GP104), 1.5x(50%) bigger than the gtx 1080Ti/Titan Xp(GP102) and 3x times bigger than the gtx 1060(GP106). Though if I started talking about quarter cuts all of a sudden I'd confuse you even more and I'm terrible at explaining too... that's why I can't get my point across...


And you should look at the process node too... if wafer A is 50% smaller than wafer B you ideally expect wafer A to have 50% more performance for the same die size...

0
Rep
386
Offline
admin approved badge
22:38 Jul-31-2018

...the same die size, though at times they have to add more functionalities, which means less of the previous execution units, so that is not always the case.


And again Performance =/= tiers...


Sadly you are looking at tiers the way the internet(ignorant reviewers and then their followers) have assigned based on performance... egh...


You should give no fuks at all if your GPU is the best or worst, high-end or low-end(except for pricing) be it die size tiers or what the internet calls tiers based on performance.

0
Rep
386
Offline
admin approved badge
22:41 Jul-31-2018

Now when it comes to wafers there are three-four factors:



  1. Size

  2. Complexity

  3. R&D cost for the wafers

  4. Whole/partial wafer failure rate.


The bigger the wafer, the more it costs; the more complex the wafer, the more it costs and the higher the R&D, so the more it costs; and with higher complexity comes a higher failure rate.


Now the transistor count is irrelevant, as 1 billion transistors at 40nm cost more than twice as much as 2 billion transistors at 20nm most of the time, unless complexity changes; it's the die size that matters.


And what AMD and Nvidia are paying is for wafers and not transistors.

0
Rep
386
Offline
admin approved badge
22:44 Jul-31-2018

wafers so far have scaled with inflation for the past decade. There were rumors that this year they'd get more expensive on top of inflation, but so far no info has been given if it's true and if so by how much.


A wafer with X amount of 40nm gates would cost the same as a wafer with 2X 20nm gates of the same complexity and size, and the complexity of GPUs and the size of the wafers have remained the same since Nvidia switched to CUDA and AMD from VLIW to SIMD.


So the bigger the chip, the fewer fit per wafer, and the more it costs per chip.
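The "bigger chip, fewer per wafer, costlier per chip" relation can be sketched with the standard dies-per-wafer approximation. Everything here is illustrative: the wafer cost is a made-up placeholder, and the die areas are the ones quoted earlier in this thread, not verified figures:

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    # Standard textbook approximation: gross dies from wafer area,
    # minus a correction for partial dies lost at the wafer edge.
    r = wafer_diameter_mm / 2
    gross = math.pi * r ** 2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return math.floor(gross - edge_loss)

WAFER_COST = 5000  # hypothetical cost of one 300 mm wafer, USD (placeholder)

# Die areas as quoted or implied in this thread for Pascal chips.
for name, area_mm2 in [("GP106", 200), ("GP104", 312), ("GP100", 602)]:
    n = dies_per_wafer(300, area_mm2)
    print(f"{name} ({area_mm2} mm^2): {n} dies/wafer, ${WAFER_COST / n:.0f}/die before yield")
```

Roughly doubling the die area more than doubles the per-die cost, and that is before yield: since the chance of a fatal defect grows with area, big dies get hit twice.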

0
Rep
93
Offline
21:43 Jul-31-2018

Also the Titan V has a die size of 815mm^2 So by your logic that's well in the high range.

0
Rep
386
Offline
admin approved badge
22:29 Jul-31-2018

Yes the Titan V is the only truly high-end GPU

1
Rep
93
Offline
23:10 Jul-31-2018

This is what most people mean when they say "high end": https://imgur.com/vy5QhCD
Most expensive most of the time meaning most powerful in the series.

0
Rep
93
Offline
23:17 Jul-31-2018

It's honestly just a marketing gimmick for consumers, and I think we've gone way too much into it. High end on the customer side and high end on the technical/manufacturing side are 2 different things.
Really though, why are we still arguing about this? It makes no difference what tier you put it in; as long as the performance of the GPU is good enough for you, then who cares if it's high/low/mid/entry/extreme etc....
Your recent explanation is interesting, I've never really looked into stuff like that before, I'll have to research it a little.

0
Rep
386
Offline
admin approved badge
00:15 Aug-01-2018

We should care, because Nvidia is/was selling the gtx 1080 at $700 at launch, which was a 116% profit margin in 2016, whereas the equivalent of the gtx 1080 from 2011, the gtx 560, was sold for $200 at launch ($254 in 2016 dollars) and for its time did much better and was a much more expensive GPU to produce and R&D than the gtx 1080...


The GTX 1080 should NOT have cost more than $300-330 MSRP at launch in 2016, $350 tops, but the ignorance of people, those damn reviewers, and the fact that Nvidia called it an "XX80" GPU fooled people into thinking that $700 (or even $600) is a correct price for such a small chip.
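The margin arithmetic in this comment can be checked directly; a 116% profit margin on a $700 price implies a unit cost in the range the commenter names (this is a sanity check of the thread's own numbers, not audited figures):

```python
# price = cost * (1 + margin)  =>  cost = price / (1 + margin)
price = 700.0
margin = 1.16  # 116%, as claimed in the comment
implied_cost = price / (1 + margin)
print(f"Implied unit cost: ${implied_cost:.0f}")  # ~$324, inside the claimed $300-330 range
```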

2
Rep
272
Offline
admin approved badge
21:56 Aug-01-2018

"Everything is worth what people are willing to pay for it." Andrea Guerra, Luxottica CEO.

2
Rep
386
Offline
admin approved badge
19:42 Aug-02-2018

And that's why the money system is sh!t... and why people are sh!t, but oh well...

0
Rep
272
Offline
admin approved badge
18:13 Aug-03-2018

"People" also includes your mighty self, I must highlight. If you have all the answers to human problems then I do urge you to go into politics and change the world. Otherwise - sit there behind your keyboard and don't sweat it.

1
Rep
386
Offline
admin approved badge
19:45 Aug-03-2018

Oh I have easy answers that people would never like when it comes to politics XD

0
Rep
93
Offline
20:08 Aug-01-2018

Ok, we can agree on something: GPUs are overpriced. And I see what you mean; in terms of what Nvidia can produce, the 1080 Ti is only in the mid-range (I did some shower-thinking), however they never did produce anything higher for that series. But the 1080 is much more powerful than, say, a 980; there was quite a large performance jump between the 2 series, and unless you do a lot of deep research no one's going to take notice of the die sizes or anything else like that. Well, all I can say is I hope the GTX 11XX have a die size to satisfy your needs.

0
Rep
386
Offline
admin approved badge
19:48 Aug-02-2018

Well, that is true, they didn't; just like AMD back in the day of the HD 4000 series didn't produce anything higher than a mid-range GPU -> the HD4870, which was only 20% slower than the gtx 280, which was/is a high-end GPU with around a 600mm^2 die size and cost appropriately at $650 ($775.98 in 2018). But AMD didn't price the HD4870 at $650 or $500; they priced it at $300 ($358 in 2018) and the die size was 256mm^2, and that's considering that at the time wafers were more expensive and so was VRAM, especially VRAM. Nowadays an 8GB GDDR5(X) module is $11.50 to AMD and Nvidia; back then 1GB was $40-50.

0
Rep
386
Offline
admin approved badge
19:49 Aug-02-2018

And why didn't they do it? Because they would have been slaughtered... back then reviewers kept an eye on the die size and everything... and the same amount of people that were aware then are aware now, the difference is now there are a lot more people buying... a lot more ignorant people.

0
Rep
93
Offline
20:10 Aug-01-2018

If Intel does start releasing high end(or mid ranged whatever) GPUs, then maybe we can start to see some changes.

0
Rep
386
Offline
admin approved badge
19:49 Aug-02-2018

Oh I wish they do, but knowing intel... I doubt it...

1
Rep
14
Offline
08:49 Jul-28-2018

I'll be here after 10+ patches

4
Rep
-25
Offline
17:39 Jul-27-2018

A GTX 1080 + i7-4790k is only good for 44fps IN ULTRA AND 1440P????


Damn, what is it gonna be for 4K???


I seriously hope that this game will support multi-gpu.

1
Rep
14
Offline
08:56 Jul-28-2018

Remember those early Kingdom Come: Deliverance benchmarks? I won't be surprised if modest cards like the gtx 1060 can play at a decent 50+ fps @ 1080p within 6 months

1
Rep
33
Offline
17:26 Jul-27-2018

Monster Hunter World article. Picture from Monster Hunter Freedom Unite 2.
Nice.

0
Rep
272
Offline
admin approved badge
15:52 Aug-01-2018

Freedom 2*


Freedom Unite (2nd G) had no part 2 :D

0
Rep
4
Offline
16:17 Jul-26-2018

1060 30 fps? no i dont want a headache

8
Rep
42
Offline
11:21 Jul-27-2018

Found the PC Master Race guy

-5
Rep
2
Offline
13:33 Jul-27-2018

Aaaaaaaaaaaahhahahahaahah

0
Rep
4
Offline
13:54 Jul-27-2018

You guys don't agree? When I was younger I didn't mind playing even at 22fps, but now I just can't; I get tired pretty fast. I want fps and smooth gameplay; that's why I prefer to lower settings to keep high fps

4
Rep
386
Offline
admin approved badge
09:58 Jul-28-2018

Nope consistent 30fps is more than good enough for me.

-2
Rep
272
Offline
admin approved badge
15:53 Aug-01-2018

30 is cancer. Even 60 feels like lag when you're used to a higher refresh rate. My screen caps out at 165Hz.

0
Rep
55
Offline
08:33 Jul-26-2018

No 4K at 60FPS even on a GTX 1080 for a console look-alike?
I am excited for this game but it seems very unoptimized.

10
Rep
106
Offline
admin approved badge
00:33 Jul-27-2018

yah, something is not right with performance. Unless they somehow made it so the PC is rendering objects outside of the viewpoint, I can't believe this performance compared to the new Final Fantasy game

0
Rep
31
Offline
02:43 Jul-26-2018

This is just my speculation, but it seems framerate drops are most likely happening in hub areas. Similar to games like Warframe, which has a buttery smooth framerate all the time except when it has to load 100+ fashion frames in town.

2
Rep
76
Offline
admin approved badge
22:11 Jul-25-2018

Yeah, games are ready to take advantage of stronger cards... except nVidia isn't ready to release newer cards, because they failed to capitalize on mining and have too many series 10 chips in stock. Now if only we had AMD to compete here...

0
Rep
2
Offline
07:31 Jul-26-2018

Not really, game's just not well optimized. Battlefield V will require a 1060 and 480 for max settings 60+ fps and it looks better than this.

0
Rep
105
Offline
21:28 Jul-25-2018

Well, here we go again, gaming at 1080p ultra settings @ 30-40 fps xD.

1
Rep
179
Offline
admin approved badge
23:20 Jul-25-2018

or just bump it down a notch to high for 60FPS... Honestly it's not that surprising... ours are mid-range cards, and the fact that we were able to play on Ultra for as long as we have, and even a little 1440p for a lot of titles, isn't really bad at all... I feel like I've definitely got my money's worth... next gen cards probably aren't going to be nearly as cheap; it remains to be seen what value we get from their likely higher prices.

0
Rep
105
Offline
17:36 Jul-26-2018

Well, you got a big point there: there wasn't any GPU to replace our mid-range cards for like 2 years. I'm afraid requirements will rise with the next gen cards :/

0
Rep
95
Offline
20:04 Jul-25-2018

When I first read this news at another site, I think it said that these were minimum frame rates..
Anyway, really looking forward to this game. While no one would want an unoptimized game, the anticipation just wouldn't be the same without some performance "concerns".

0
Rep
38
Offline
16:41 Jul-25-2018

Seeing the graphics I thought I should be able to get solid 60 at 1440p at least, I'm kinda concerned. Then again performance on consoles was pretty poor as well, I guess it was to be expected. Hope we can get a lot of performance with only small effects on visuals with some settings.

0
Rep
43
Offline
16:26 Jul-25-2018

I really hope that what the article says about the gtx 1070 is true!

1
Rep
26
Offline
15:47 Jul-25-2018

What's up with the 5fps difference between medium and high..? I hope this gets more optimized by day one. Also, what's up with the comments? 30fps gaming is not acceptable in 2018.

11
Rep
80
Offline
admin approved badge
16:46 Jul-25-2018

why are 30fps not acceptable?

-1
Rep
57
Offline
18:36 Jul-25-2018

you got downvotes yet no answers came; I would like to know, or will I get downvoted too?

1
Rep
6
Offline
20:10 Jul-25-2018

30FPS is somewhat acceptable, but only if it's constant, with low input latency and without flicker.

1
Rep
26
Offline
19:33 Jul-25-2018

While still talking specifically about games: Long story short, 30fps is horrible imho. 30fps was 'okay' in more movie-esq games where frames didn't matter and console games, of course. Now that I've used 60fps for so long there's no way to avoid shivers when something drops below 30 all of a sudden. A quick google also show that 30fps is starting to become a thing of the past, except for those who generally enjoy a slower pace or are on a strict budget. I doubt you guys feel much different considering @David988 you have a 1080 bringing frames en masse and @Gerulis20 you're rocking a 144Hz screen to avoid even the 60 mark :D

6
Rep
57
Offline
20:09 Jul-25-2018

well yes, I agree that 60fps is so much better than 30fps, not even talking above 60fps, but for a casual gamer with a controller 30fps is still acceptable, and that's the majority (not talking about controller users, but the casual audience). If you are playing some kind of fast-paced game with mouse and keyboard then yes, 30fps is a stretch and probably not acceptable to most.

1
Rep
43
Offline
07:34 Jul-26-2018

If you are a casual gamer on a controller there are consoles like the PS4 that are made for that; I assume that if I spend much more money on a gaming PC then 30fps is not acceptable

1
Rep
80
Offline
admin approved badge
02:30 Jul-26-2018

i played AC Black Flag at 25-30fps on my old rig and it was fine; as long as the game is not competitive, 30fps is ok

0
Rep
15
Offline
15:41 Jul-25-2018

Now i'm worried if i'll be able to run it at 1080p 30 FPS..... refund?

1
Rep
55
Offline
16:04 Jul-25-2018

relax, this benchmark was performed @ 1440p, you'll be fine

1
Rep
26
Offline
16:47 Jul-25-2018

Benchmarks from January with the same card that you have (MSI 960) show average frames of 50 and dips into the 30s. One would guess that they have only improved optimization since then, so you should be quite good.

2
Rep
15
Offline
22:16 Jul-25-2018

I think you were looking at Monster Hunter Online Benchmark from 2016.

1
Rep
26
Offline
07:29 Jul-26-2018

@GamerzHell9137 Oops, you're right; tired eyes, meet brain. Anyway, quick math shows the Gtx 1080 pulling 79fps at ultra 1080p and 194 on low. I think you should be fine, might even pull 60fps, at least with some tweaking.
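The "quick math" here is presumably simple pixel-count scaling from the article's 1440p averages. A sketch of that estimate (crude by construction: it assumes fps is inversely proportional to pixel count and ignores CPU limits, so treat it as an optimistic upper bound):

```python
# Naive resolution scaling from the article's 1440p figures (GTX 1080).
scale = (2560 * 1440) / (1920 * 1080)  # 1440p -> 1080p pixel ratio, ~1.78

for preset, fps_1440p in [("Ultra", 44), ("Low", 108)]:
    print(f"{preset}: ~{fps_1440p * scale:.0f} fps at 1080p")
```

That gives roughly 78fps at Ultra and 192 at Low, close to (though not exactly) the 79/194 quoted in the comment.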

2
Rep
38
Offline
16:43 Jul-25-2018

On Steam you can refund your preorder at any time, and generally, with under 2h playtime and a purchase within the past 2 weeks, you can refund any game without any problems.
Then again, why would you preorder at all?

2
Rep
15
Offline
22:18 Jul-25-2018

Didn't preorder the game from Steam because Steam doesn't have regional pricing. (60 Euro is too much for me)
And i've preordered because i'm a fan of the series and have waited for years to finally get a grip on it. Normally i don't preorder stuff but i love MonHun so i just bought it. (Capcom had great ports on PC)

0
Rep
105
Offline
22:22 Jul-25-2018

Maybe at medium settings.

0
Rep
15
Offline
00:49 Jul-26-2018

I wish, i'm not asking for much tbh. I always gamed with shadows on low and just high textures and character models. Extra effects disabled.

0
