Far Cry 5 System Requirements and PC 4K Specs Revealed

Written by Jon Sutton on Tue, Jan 23, 2018 9:51 AM

Undoubtedly the biggest release of Q1 2018, Far Cry 5 cranks the chaos-o-meter up to 11 with full co-op support and an open-world Montana that's been overrun by a fanatical cult. Luckily, it's a problem that can be solved with bullets, and lots of them. Pet wolverines are optional. As with anything Ubisoft, though, Far Cry 5 will push graphical boundaries, which means running it won't come easy. Here are the official Far Cry 5 PC system requirements.

UPDATE: I've contacted Ubisoft and they've confirmed the minimum system requirements are for 30 frames per second at 720p screen resolution. 

Far Cry 5 Minimum System Requirements - 720p / Low / 30FPS

  • OS: Windows 7 64-bit
  • CPU: Intel Core i5-2400 3.1GHz or AMD FX-6300 3.5GHz
  • RAM: 8 GB System Memory
  • GPU RAM: 2GB Video Memory
  • GPU: Nvidia GeForce GTX 670 or AMD Radeon R9 270
  • HDD: TBA
  • DX: DirectX 11

Far Cry 5 Recommended System Requirements - 1080p / High / 60FPS

  • OS: Windows 7 64-bit
  • CPU: Intel Core i7-4770 3.4GHz or AMD Ryzen 5 1600 3.2GHz
  • RAM: 8 GB System Memory
  • GPU RAM: 4GB Video Memory
  • GPU: Nvidia GeForce GTX 970 or AMD Radeon R9 290X
  • HDD: TBA
  • DX: DirectX 11

Far Cry 5 System Requirements - 4K / High / 30FPS

  • OS: Windows 10 64-bit
  • CPU: Intel Core i7-6700 3.4GHz or AMD Ryzen 5 1600X 3.6GHz
  • RAM: 16 GB System Memory
  • GPU RAM: 8GB Video Memory
  • GPU: Nvidia GeForce GTX 1070 or AMD Radeon RX Vega 56
  • HDD: TBA
  • DX: DirectX 11

Far Cry 5 System Requirements - 4K / High / 60FPS

  • OS: Windows 10 64-bit
  • CPU: Intel Core i7-6700K 4.0GHz or AMD Ryzen 7 1700 3.4GHz
  • RAM: 16 GB System Memory
  • GPU RAM: 8GB Video Memory
  • GPU: Nvidia GeForce GTX 1080 SLI or AMD Radeon RX Vega 56 CrossFire
  • HDD: TBA
  • DX: DirectX 11

While everything we've seen of Far Cry 5 looks fairly similar to Far Cry 4 in a technical sense, there have clearly been some improvements to the visuals behind the scenes. The Far Cry 5 system requirements are a clear step up, and the once-mighty GeForce GTX 670 and mid-range Radeon R9 270 will now only run Far Cry 5 at 720p on Low graphics. It should hopefully still look decent, but you're going to need more modern hardware if you want to start cranking the graphics settings up.

The recommended specs, on the other hand, are pretty reasonable for a AAA 2018 release. Far Cry 5 goes the safe route with the ever-reliable GeForce GTX 970 or Radeon R9 290X, both of which should be able to exceed the frame rates of the base PS4 and Xbox One consoles. Anything from a GeForce GTX 960 / Radeon R9 280X upwards should provide a superior experience to the console versions of Far Cry 5.

Unsurprisingly, the 4K specs are out in a field of their own. You need the best of the best in PC gaming hardware to play Far Cry 5 at 4K with reliable performance. 60FPS is, of course, the preferred option here, but it demands a top-end Core i7-6700K or Ryzen 7 1700 CPU, paired with GeForce GTX 1080s in SLI or Radeon RX Vega 56s in CrossFire and 16GB of RAM. In today's market, these components come at a huge cost. As ever, if you're looking at the 4K specs then we absolutely recommend targeting 60 frames per second for the best gameplay experience; I think you're better off sacrificing resolution or dropping down to Medium graphics in Far Cry 5 rather than settling for 30 frames per second.

We won't know the best way to achieve all this for sure until Far Cry 5 launches, but it will come with its own benchmark utility and a GPU video memory usage meter to help you dial in your desired settings. There will also be support for FPS locks, resolution scaling, FOV adjustment and variable aspect ratios.
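To put the resolution scaling and video memory options in context, here's a rough back-of-the-envelope sketch (the per-pixel costs and buffer counts are assumptions for illustration, not Ubisoft's figures) of how the main render targets alone grow with output resolution and resolution scale:

```python
# Rough sketch, not Far Cry 5's actual meter: estimate how the main render targets
# alone grow with output resolution and resolution scale. The bytes-per-pixel and
# buffer counts below are assumptions; textures and geometry add a lot on top.

def render_target_mb(width, height, scale=1.0, bytes_per_pixel=16, buffers=3):
    """Approximate VRAM (in MB) for full-resolution render targets.

    scale           -- per-axis resolution scale (1.0 = 100%, 2.0 = 200%)
    bytes_per_pixel -- assumed combined cost of colour/depth/G-buffer targets
    buffers         -- assumed number of full-resolution buffers kept alive
    """
    internal_w = int(width * scale)
    internal_h = int(height * scale)
    return internal_w * internal_h * bytes_per_pixel * buffers / (1024 ** 2)

for label, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    print(f"{label} at 100% scale: ~{render_target_mb(w, h):.0f} MB of render targets")

# 200% resolution scale at 1080p renders internally at 3840x2160, i.e. 4K-sized targets:
print(f"1080p at 200% scale: ~{render_target_mb(1920, 1080, scale=2.0):.0f} MB")
```

Textures and geometry will dominate actual VRAM usage, which is why the in-game memory meter is the number to trust once the game is out.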

Remember, you can always check how well your PC stacks up against the Far Cry 5 system requirements here, where you can also see benchmarks and performance reports from other users. Compare your graphics card to the Far Cry 5 benchmark chart.


Rep
0
Offline
22:00 Mar-24-2018

I knew I shouldn't have bought a 3440x1440 monitor for my GTX 1070. I will need to lower the resolution... :/


But it looks **** if I do though...

0
Rep
2
Offline
17:06 Jan-27-2018

Pretty funny how i7s are now in every game's recommended settings, even though top i5s do just as well in games today. Misleading.

0
Rep
57
Offline
23:31 Jan-27-2018

Developers recommend an i7 to make sure the user experience will be the best possible. While i5s run games fine, they are definitely not the best experience you can get in today's games.

1
Rep
4
Offline
15:14 Jan-27-2018

Could I play the game? And yes I will upgrade to 8GB ram

0
Rep
164
Offline
15:54 Jan-27-2018

Yes, but no one can guarantee it until you run it.

0
Rep
15
Offline
admin approved badge
15:12 Jan-26-2018

The more I look at these system requirements the more I question why you need dual GPUs to run the game at ultra settings at 4K.


I do believe it is likely possible to run the game at a solid 60fps at ultra at 4K with just a single 1080ti.


My theory is that some unnecessary graphics options are likely enabled, or it's just that badly optimized. Most likely the former.


I know if I was running the game at 4K the first setting I would turn off is AA. In this day and age 4K is a high enough resolution that aliasing wouldn't even be visible. AA can be a real (cont.)

0
Rep
164
Offline
15:15 Jan-26-2018

I don't know if that is with or without anti-aliasing.

0
Rep
164
Offline
15:16 Jan-26-2018

I agree with you.
I never liked AA and always turn it off, even if I see a lot of jaggies.

0
Rep
15
Offline
admin approved badge
15:18 Jan-26-2018

performance killer. Especially SSAA.


Next up would be bloom, followed by motion blur. Also, if this game has any Nvidia GameWorks options, I would disable those, especially if you have an AMD GPU. They are unnecessary as they don't really add anything to the game aside from a few fancy effects (such as "realistic looking and moving hair") and are therefore not something one needs to enjoy the game.

0
Rep
272
Offline
admin approved badge
01:36 Jan-27-2018

I enjoy gameworks.
SSAA is not anti-aliasing in the strict sense, it's supersampling or rendering at a higher resolution. 4x SSAA or 200% resolution scaling at 1080p is essentially rendering the game at 4K. That's why the performance tanks.
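For anyone who wants the arithmetic behind that point, a tiny sketch (pure pixel counting, nothing game-specific):

```python
# Pure pixel counting: 200% per-axis resolution scaling (i.e. 4x SSAA) at 1080p
# shades exactly as many pixels as native 4K.

base = (1920, 1080)     # 1080p output resolution
scale = 2.0             # 200% per-axis resolution scale

internal = (int(base[0] * scale), int(base[1] * scale))
native_4k = (3840, 2160)

print(internal)                                                   # (3840, 2160)
print(internal[0] * internal[1] == native_4k[0] * native_4k[1])   # True
print((internal[0] * internal[1]) / (base[0] * base[1]))          # 4.0x the shading work
```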

1
Rep
24
Offline
12:16 Jan-28-2018

gameworks is just pure ****tt

0
Rep
5
Offline
11:16 Jan-29-2018

I like Hairworks in The Witcher 3, although I have to bruteforce it with my poor Vega. Runs just fine at 75-90 fps with Hairworks on High and 4x Hairworks AA.

0
Rep
272
Offline
admin approved badge
19:56 Jan-29-2018

HW isn't even as nice now as it used to be before all the performance patches. Especially visible on bears where the fur is all patchy and disappears as you change camera angles. Used to be a LOT more hair strands everywhere, but people complained about their weak GPUs being unable to run it (despite it being reserved for the top GPU owners anyway...).

-1
Rep
6
Offline
12:37 Jan-26-2018

SLI/CrossFire for 4K 60fps??? Excuse me folks, has SLI/CrossFire risen from the dead?

1
Rep
272
Offline
admin approved badge
12:56 Jan-26-2018

It was never dead. Most new games I played support SLI. I wouldn't be able to achieve 60+ FPS at 5K maxed if not for 1080Ti SLI :)

2
Rep
1,041
Offline
senior admin badge
13:24 Jan-26-2018

Well, graphics manufacturers have practically cancelled multi-GPU solutions for mainstream gaming.
The problem is the effectiveness of multi-GPU setups (so-called "scaling"), so nowadays no more than two GPUs is usually reasonable.
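As a purely illustrative sketch (assumed scaling figures, not benchmarks), this is what different scaling percentages mean for a game that runs at 60fps on one card:

```python
# Illustrative only: what different SLI/CrossFire "scaling" numbers mean for a game
# that runs at 60 fps on a single card. The 40-90% range echoes the replies below.

single_gpu_fps = 60

for scaling in (0.40, 0.60, 0.90):
    two_gpu_fps = single_gpu_fps * (1 + scaling)
    print(f"{scaling:.0%} scaling: {two_gpu_fps:.0f} fps for twice the GPU cost")
```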

0
Rep
164
Offline
15:12 Jan-26-2018

I agree with T: pay twice for SLI and get only a 40-60% increase.

0
Rep
272
Offline
admin approved badge
01:39 Jan-27-2018

Depends on the game. And the CPU. People forget that "scaling" is sometimes limited by how fast the CPU or the engine can process the extra frames. In most cases with SLI, when you know your way around the game, 90%+ scaling is not unusual. Can't pump out the frames? Increase the resolution!

1
Rep
24
Offline
12:20 Jan-28-2018

SLI or CrossFire is never worth the money versus the performance you get in the majority of games. I'm pretty sure a single 1080 Ti can manage 4K 60fps in games by turning some useless graphics options off, and this game is no exception.

0
Rep
272
Offline
admin approved badge
01:40 Jan-29-2018

It can. But if you have the money and don't want to lower the settings or want higher fps than 60 (I have a 165Hz screen!) then SLI is the way to go. And you can get double the performance if you know how to use the tech. Sadly - many people don't. They get CPU-bottlenecked at 1080p or some crap and then say that the second card didn't do anything, while in reality they just need to raise the settings/resolution to get the card doing more, lol

0
Rep
20
Offline
18:02 Jan-25-2018

correct me if I'm wrong but the CPU recommendations on the AMD side make no sense


the R7 1700's performance in games is not that much different from the R5 1600X or R5 1600 unless the game is going to be able to utilize all those extra cores


also at 4K the GPU does most of the heavy lifting

3
Rep
15
Offline
admin approved badge
03:38 Jan-26-2018

Yes, but you still need a more powerful processor anyway to avoid bottlenecking. In any game the CPU and GPU rely on each other, just some more of one over the other.


But don't think you can just get a ridiculously weak CPU, couple it with a high-end GPU and get away with it just because you are playing at 4K.


It may become GPU dependent at that resolution but again even then you still need a strong processor for it to work at max efficiency.

0
Rep
272
Offline
admin approved badge
01:43 Jan-27-2018

The CPU doesn't necessarily do any more extra work at 4k vs 1080p. That's why people can pair GPUs in SLI/CF setups with lower-end i7 chips and enjoy 4K as normal. The CPU starts to matter if:
A) the game requires a lot of resources for the logic (open world, AI, resource management, physics)
B) You have a high refresh rate display and have GPUs capable of delivering those extra frames - you may become CPU-bound due to the amount of draw calls needed to be processed
If the game runs fine at 720p on a particular CPU then it will run just fine at 8K, provided the GPU can handle it.
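A heavily simplified model of that argument, with made-up per-frame costs just to show the shape of it:

```python
# Made-up per-frame costs, purely illustrative: delivered fps is capped by whichever
# of the CPU or GPU takes longer per frame. CPU cost barely changes with resolution;
# GPU cost scales roughly with pixel count.

cpu_ms_per_frame = 8.0        # assumed CPU cost per frame (game logic, AI, draw calls)
gpu_ms_per_megapixel = 2.0    # assumed GPU cost per million shaded pixels

for label, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080),
                      "4K": (3840, 2160), "8K": (7680, 4320)}.items():
    gpu_ms = gpu_ms_per_megapixel * (w * h) / 1e6
    fps = 1000 / max(cpu_ms_per_frame, gpu_ms)
    bound = "CPU-bound" if cpu_ms_per_frame >= gpu_ms else "GPU-bound"
    print(f"{label}: ~{fps:.0f} fps ({bound})")
```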

1
Rep
15
Offline
admin approved badge
16:12 Jan-27-2018

8K? Was that a typo? Just checking.


But my point is that a stronger CPU is needed because if you use a GPU with too weak of a processor the GPU will be bottlenecked, which will be noticeable in more GPU-dependent titles.


This will cause the CPU to run at 100% the whole time causing the fan to also run at full speed. This will in turn kill the CPU eventually.

0
Rep
272
Offline
admin approved badge
22:28 Jan-27-2018

Many misconceptions:
1) If your GPU can't keep up - your CPU will be "resting". Your CPU only does as much as it needs to. If you are not GPU-bound then the CPU will try to spit out as many frames as it can for rendering by the GPUs.
2) 8K is not a typo. On a 4K display you can run 8K DSR/VSR. This will use less CPU power, since it runs at low fps (naturally), than 1080p running at 60+ fps.
3) CPUs are OK running at full load for a long time at 80+ degrees Celsius. I'm talking from experience; I often leave CPUs rendering 3D visualizations for days at 100% load. I don't bother with full fan speed - too noisy. Temperature does not affect CPU speed.
4) Your CPU will outlive the rest of your system

0
Rep
164
Offline
15:19 Jan-26-2018

At higher resolutions you need more GPU power and less CPU power, but CPU usage will still increase at higher resolutions, and RAM usage too.

0
Rep
272
Offline
admin approved badge
01:43 Jan-27-2018

See my reply/explanation above

0
Rep
55
Offline
17:24 Jan-25-2018

Hmmm....the sys reqs for 1080p gaming at 60 fps seem nice but you never know if they have Denuvo sandwiched with VMProtect and some other stuff. But on paper, real nice for now.

1
Rep
43
Offline
18:19 Jan-24-2018

I shall get 60fps on ultra at 1080p right?

0
Rep
216
Offline
admin approved badge
19:13 Jan-24-2018

Definitely.

0
Rep
43
Offline
20:12 Jan-24-2018

Perfect then

0
Rep
164
Offline
20:00 Jan-25-2018

yes baby

0
Rep
13
Offline
15:40 Jan-24-2018

Well, this is just insane. Here I am thinking I would last long enough with my 1080 Ti, but apparently not. However, I wouldn't play it in 4K; instead it would be 3440x1440, so maybe there is hope after all, even for ultra.

0
Rep
49
Offline
admin approved badge
16:08 Jan-24-2018

You should be alright. 4K has almost 4 million more pixels than 1440p ultrawide - 4.9 million compared to 8.3 million. I think you'll be alright; it won't do ultra, but I bet you'll be able to do high @ 50-60fps.

1
Rep
13
Offline
05:07 Jan-25-2018

4K resolution is 3840 pixels × 2160 lines, which in my opinion isn't that big of a difference from 3440 x 1440, or am I thinking about it wrong? Just a side note ;)

0
Rep
49
Offline
admin approved badge
12:39 Jan-25-2018

Multiply 'em both and see the difference you get in pixels lol.

1
Rep
383
Offline
senior admin badge
12:44 Jan-25-2018

Yeah, don't let the similarity of the numbers fool you; you'll need around 67% more GPU performance for 4K 16:9 vs ultrawide 1440p.
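The pixel count behind that 67% figure, for anyone checking the maths:

```python
# 4K 16:9 vs 3440x1440 ultrawide, by raw pixel count.
uw_1440p = 3440 * 1440   # 4,953,600 pixels
uhd_4k   = 3840 * 2160   # 8,294,400 pixels

print(f"{uhd_4k / uw_1440p:.2f}x the pixels")   # ~1.67x, i.e. roughly 67% more
```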

1
Rep
272
Offline
admin approved badge
17:05 Jan-25-2018

Math. Always gets people... And yet despite it being useful literally everywhere in terms of not getting fooled - people still hate it and refuse to learn even the basics xD

0
Rep
13
Offline
05:10 Jan-26-2018

Yeah, I was kind of blurry and tired from working when I asked that question hahaha, don't know what I was thinking. Of course there is a major difference :D

0
Rep
164
Offline
20:02 Jan-25-2018

A GTX 1080 Ti can run 4K ultra settings.

0
Rep
13
Offline
05:07 Jan-26-2018

I think I will be alright, some tweaking and adjustment will solve it

1
Rep
1,041
Offline
senior admin badge
10:29 Jan-24-2018

Clearly laughable to see there is still no single GPU capable of 4K 60fps, yet a quad-core CPU is apparently enough :P

2
Rep
272
Offline
admin approved badge
21:02 Jan-24-2018

But there is not a single iGPU in those CPUs that could actually even run that 4K xD

0
Rep
1,041
Offline
senior admin badge
12:39 Jan-25-2018

true, but my point is 4K 60fps cannot become standard anytime soon

0
Rep
272
Offline
admin approved badge
17:07 Jan-25-2018

Well, technically it can. Just not with the ultra settings and somewhat horrid optimization of some games today. I've been playing at 5K165 (I've got a 165Hz screen, 5K with DSR) for 2 years now, just not in every game :D

0
Rep
164
Offline
20:09 Jan-25-2018

I agree with T.
Even 4K 30fps is no joke.
I mean, games are getting more demanding, and even running at 640x480 is sometimes difficult.

-1
Rep
164
Offline
20:10 Jan-25-2018

Maybe in 2020 4K will be the standard.

0
Rep
6
Offline
13:14 Jan-26-2018

I don't think that is the case; I think it depends on the optimization of the game. My GTX 1080 Ti runs games like COD WWII, Forza Horizon 3, GTA V, DOOM and Titanfall at ultra settings, 4K 60fps, without breaking a sweat, but games like Deus Ex and Hellblade run at around 40-58fps even at High and custom settings.

0
Rep
272
Offline
admin approved badge
13:21 Jan-26-2018

I was playing 4K in some titles on my laptop with GTX 980M SLI years back. It's all down to the game :)

0
Rep
4
Offline
admin approved badge
02:16 Jan-25-2018

The 1080 Ti or the latest Quadros can probably get 60fps on high to ultra, because my laptop 1070 is able to get around 30fps in the latest titles. So a 7700K at 4.6GHz (-5) paired with an overclocked 1080 Ti should definitely push past 60fps, so it depends on how well you've got your setup tuned.

0
Rep
1,041
Offline
senior admin badge
12:40 Jan-25-2018

my 980Ti can handle games at 2560x1440 at 60fps (not always though), I personally don't feel any need to get higher res

0
Rep
272
Offline
admin approved badge
17:08 Jan-25-2018

Try Witcher 3 or Watchdogs or GTA or anything in 5K (using DSR, set smoothness to 20%) and you'll see why people want the resolution. Hell, I want it! Difference is - I can actually have it :)

0
Rep
1,041
Offline
senior admin badge
17:12 Jan-25-2018

To be honest, I was most immersed when I played Witcher 2 on a huge 42-inch TV at 1080p.
I think dense resolution is less important than a huge screen.
I am actually thinking a lot about a 4K screen as my "next" upgrade, but after calculating pixel density, viewing field of view and distance, I came to the conclusion that it makes no sense to buy a 27" 4K screen. A relatively similar pixel density to my current screen would be a 42" 4K screen, which would be too huge to sit in front of at the same distance; and if sitting further away, with pixels scaled accordingly, then to maintain a similar pixel density the 4K screen would need to be even bigger.

1
Rep
15
Offline
admin approved badge
19:11 Jan-25-2018

One thing that annoys me is that people think there is no point in getting 4K if your graphics card can't run the game at ultra settings or they assume ultra settings in any other scenario.


Let me put this bluntly: Yes. There. Is!!! Stop assuming ultra settings. Someone told me that there would be no point in playing at 4K if I get a 980 ti (which I am still considering). But no, I don't need ultra settings to enjoy a game. Do I prefer it? Yes. Do I need it tho? No. As long as the game is playable to me and I enjoy it. That's all that really matters to me at the end of the day.

0
Rep
272
Offline
admin approved badge
19:31 Jan-25-2018

I'm in the camp of "framerate first", at a reasonable resolution, of course. 1440p is the lowest I'm going and 90fps minimum is my ballpark (which can be a bit lower in non-shooter/action games). If I can hit 90-100fps - super, I'll do 4K. Still hitting 100fps? 5K then, for a true 4x FSAA. I agree that 4K and 5K look MILES better than 1080p or 1440p, especially in games with many tiny details and textures (try the forests of Skellige at 5K - jaw-dropping!), but I still want my framerates. And being a special type of a boy, I refuse to play on low/medium settings... Hence my PC, where I don't need to make compromises most of the time xD

1
Rep
272
Offline
admin approved badge
19:33 Jan-25-2018

As for Tazz's point on screen size - if I can observe the massive sharpness increase and the removal of aliasing at 5K DSR (and that's compared to native res, I'm on a mere 27" 1440p display!), and even have a preference to plop FXAA or MSAA on top of all that - I'm sure as hell ready for a larger-res display! Just that nobody makes one at 120-165Hz with G-Sync :D

0
Rep
15
Offline
admin approved badge
19:37 Jan-25-2018

To whom that may concern, you know who you are.


But seriously, people, just stop. There is a point to playing 4K at low settings. Also, simply playing a game at a higher native resolution, by itself, reduces aliasing. Playing a game with VSR produces a similar effect by rendering at a higher resolution and then resampling the image down to the monitor's native resolution; this produces an effect similar to SSAA, as I explained in a recent blog.


(Also I prefer 60fps, That also takes priority over ultra.)

0
Rep
15
Offline
admin approved badge
19:40 Jan-25-2018

(My character limit is too small...)


Another reason why I want to get a 4K monitor is because I am sick of having a low resolution monitor as my primary monitor which I game on.

1
Rep
164
Offline
20:04 Jan-25-2018

I would recommend the Dell P2415Q 4K.

0
Rep
164
Offline
20:06 Jan-25-2018

Yes, the GTX 980 Ti 6GB is a good card.
Buy it.

0
Rep
272
Offline
admin approved badge
23:49 Jan-25-2018

I'd look at something with FreeSync in his case. Unless he wants to go with Nvidia later, then G-Sync for sure. It's a pleasure not to have any tearing and stuttering anymore! Any monitor without variable refresh rate is a pass from me.

0
Rep
15
Offline
admin approved badge
03:44 Jan-26-2018

I am strongly considering switching to Nvidia after being burned twice with AMD GPU purchases. But AMD processors? Those are well worth the investment.


Also Nvidia could use freesync as it is open source but they have yet to even allow it. Assuming they ever will.


So Gsync monitor it is then. :)

0
Rep
1,041
Offline
senior admin badge
08:13 Jan-26-2018

I'm somewhat skeptical about variable refresh rates.
A superior solution would still be a graphics processing unit capable of a steady framerate, so the refresh rate would remain constant - obviously this is by design literally impossible to implement with the tech available nowadays.
I have a strange feeling variable refresh rates may have some negative impact on eyesight and psychics in the long run, but that's just my thought...

0
Rep
272
Offline
admin approved badge
13:02 Jan-26-2018

Eyesight-wise - you're probably confusing ULMB with variable refresh rates. During normal variable refresh operation the backlight stays the same - lit. All you're seeing is an image changing more or less often, much like you would in normal movies on static refresh rates anyway - standing in place is standing in place, moving is moving, lol. It's not like CRT screens where a tiny scanline relied on vision persistence to form a full image, straining eyes really badly. Physics...like in-game physics? No change if the physics engine is tied to the game timer rather than the framerate.
The current VRR implementation makes perfect sense - refresh the screen as soon as a frame arrives.
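As a toy illustration of "refresh as soon as a frame arrives" (simplified: it ignores the panel's minimum/maximum refresh window and assumes vsync on for the fixed-refresh case, and the frame times are made up):

```python
# Compare when a finished frame actually reaches the screen: fixed 60 Hz + vsync
# waits for the next scheduled scanout, VRR refreshes as soon as the frame is ready.

import math

frame_ready_ms = [0.0, 19.0, 36.0, 58.0, 71.0]   # assumed, uneven frame times
fixed_refresh_ms = 1000 / 60                      # 16.67 ms per scanout at 60 Hz

for ready in frame_ready_ms:
    fixed_display = math.ceil(ready / fixed_refresh_ms) * fixed_refresh_ms
    print(f"frame ready at {ready:5.1f} ms -> fixed 60 Hz: {fixed_display:5.1f} ms, VRR: {ready:5.1f} ms")
```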

0
Rep
1,041
Offline
senior admin badge
13:22 Jan-26-2018

I wrote psychics, I meant some (indirect) mental consequences (I'm exaggerating this time, but let's say a lesser form of something like posttraumatic stress disorder could easily happen, it's just too early to judge anything)

0
Rep
15
Offline
admin approved badge
14:18 Jan-26-2018

There's no way something like that can affect someone mentally. All G-sync and Freesync do is something called Adaptive Vertical Synchronization or Vsync for short. All it does is prevent screen tearing which makes the game image not as smooth as it could be, especially motion-wise. Even at just 60fps, everything looks and feels much more natural and fluid. 60fps with Gsync is even smoother than 60fps w/o Gsync. I have seen videos of it in action on Youtube.

0
Rep
58
Offline
admin approved badge
16:38 Jan-26-2018

TZZ: I've got to agree with you, buddy. I sometimes game on my 55 inch 1080p TV and I find the extra real estate makes a game more enjoyable...

0
Rep
15
Offline
admin approved badge
20:19 Jan-26-2018

I'd personally be happy with 24in.


Just saying.

0
Rep
272
Offline
admin approved badge
01:46 Jan-27-2018

My 27" is no more than an arm's length away from my eyes at any given time, sometimes much closer. So I don't need a huge TV - rather have the responsivenes and the extra framerate of a designated monitor :)

0
Rep
24
Offline
09:51 Jan-24-2018

Well, with a little bit of tweaking between high and ultra I'm able to play Far Cry Primal at 4K with a 60fps average. I'm hoping I can manage this one too. This game looks really exciting.

0
Rep
35
Offline
18:09 Feb-04-2018

You have a nice rig... it would be a shame to run it at 1080p. :)

0
Rep
28
Offline
03:19 Jan-24-2018

Just hope it will run well on my machine as it is kind of limited.

0
Rep
6
Offline
15:52 Jan-25-2018

With a Ryzen 7 1700 OC and a GTX 970 you'll have no problem running this game at 1080p :)

0
Rep
10
Offline
21:45 Jan-23-2018

After AC Origins it's surprising to have SLI support, but I'm excited.

0
Rep
21
Offline
21:44 Jan-23-2018

Nvidia GeForce GTX 1080 SLI or AMD Radeon RX Vega 56 CrossFire. Really? Vega 56 is slower than a 1080.

0
Rep
-4
Offline
00:30 Jan-24-2018

Depends on the game, in some games the Vega 56 isn't that much slower.

2
Rep
164
Offline
20:02 Jan-25-2018

Depends on the game.
Some games run well on Nvidia and some on AMD.

0
Rep
35
Offline
20:24 Jan-23-2018

With good optimization this should be just theory... But that rarely happens.

0
Rep
3
Offline
20:12 Jan-23-2018

I don't even think I can run this.

0
Rep
179
Offline
admin approved badge
21:26 Jan-23-2018

I'd put my money on you being able to run it just fine at somewhere between low to medium settings at your resolution of 1600x900

0
Rep
6
Offline
15:57 Jan-25-2018

Why not? I really think you will be able to at 900p. Maybe at a bit lower detail, but with a better GPU and that 2500K OC'd a bit, it should be fine ^^

0
Rep
3
Offline
17:53 Jan-25-2018

I'm just saying... I'll definitely give it a try. But I'm thinking about an upgrade, and I just typed RX 580 into Amazon and instantly regretted it.

0
Rep
272
Offline
admin approved badge
19:26 Jan-23-2018

SCHWEET, SLI support! Guaranteed 4K60 here I come! Though I want like 90fps, the Tis should be able to keep up, I hope.

0
Rep
15
Offline
admin approved badge
00:27 Jan-24-2018

You and your killer rigs, lol.


Godzilla says he feels small compared to your beast.

1
Rep
15
Offline
admin approved badge
13:06 Jan-24-2018

Whoever downvoted XQuatroX needs to go eat a donut, they are obviously feeling sour! XD

1
Rep
49
Offline
admin approved badge
13:40 Jan-24-2018

It's a "rig not validated" downvote lol.

1
Rep
272
Offline
admin approved badge
18:39 Jan-24-2018

Well, those who know me - know that I have the rig. But since I didn't validate I don't blame those who don't know me for thinking it's a fake rig xD

0
Rep
15
Offline
admin approved badge
01:18 Jan-25-2018

Well I know you wouldn't lie about such a thing and even though we may disagree on a certain subject, that hasn't kept me from seeing the good in you. :)


Besides, what is there to gain from lying about such a thing? (Answer: Absolutely nothing.)

1
Rep
272
Offline
admin approved badge
17:09 Jan-25-2018

If people ever want proof I can always supply, not a problem anyway :D
Glad we get along, bro :)

0
Rep
15
Offline
admin approved badge
19:50 Jan-25-2018

You don't need to prove anything to me but it might be a good idea to get it validated anyway.


I haven't validated mine because I have nothing to prove, figuratively speaking.

0
Rep
97
Offline
admin approved badge
18:32 Jan-23-2018

So either Vega 56 scales better in multi-GPU setups, Vega 56 is near the GTX 1080 in performance, or Ubisoft has better AMD optimizations.
Kind of surprised no one else caught that.

2
Rep
49
Offline
admin approved badge
23:04 Jan-23-2018

It's using Vega-specific features like rapid packed math and so on, so it's Vega-optimized if anything.

1
Rep
97
Offline
admin approved badge
01:35 Jan-24-2018

I was actually just listing all the possible reasons.
Rapid packed math is for many small things, like grass, leaves and flowers. It won't help a lower-spec card keep up with higher-spec cards. What it does allow, though, is a farther draw distance for those small things. Basically it's for small things that don't require intense computations and can instead use lower-precision math.

0
Rep
43
Offline
20:22 Jan-24-2018

Well, since Vega 56 performs like a 1070 Ti, it's pretty close to a 1080. I thought at first that CrossFire would unleash the true power of HBM2, since all of that bandwidth would be handled much faster with two chips.

0
Rep
272
Offline
admin approved badge
21:04 Jan-24-2018

How DO HBM GPUs handle high resolutions? On my Nvidia cards I always see a noticeable fps improvement if I overclock the memory, say from 10GHz to 11GHz.
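For reference, peak memory bandwidth is roughly the per-pin data rate times the bus width; a quick sketch with approximate published figures (treat the numbers as ballpark):

```python
# Approximate figures: HBM runs its memory slow but over a very wide bus,
# GDDR runs fast over a narrow one. Bandwidth = rate per pin * bus width / 8.

def bandwidth_gb_s(effective_gbps_per_pin, bus_width_bits):
    """Peak memory bandwidth in GB/s."""
    return effective_gbps_per_pin * bus_width_bits / 8

print(bandwidth_gb_s(10.0, 352))   # 352-bit GDDR5X at 10 Gbps  -> 440 GB/s
print(bandwidth_gb_s(11.0, 352))   # same bus pushed to 11 Gbps -> 484 GB/s
print(bandwidth_gb_s(1.6, 2048))   # Vega 56's 2048-bit HBM2    -> ~410 GB/s
```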

0
Rep
27
Offline
16:28 Jan-23-2018

I am glad I upgraded my GPU. Hope it is well optimized.

3
Rep
27
Offline
16:59 Jan-23-2018

it's ubisoft... don't count on it

1
Rep
57
Offline
18:52 Jan-23-2018

Usually the Far Cry series gets decent optimization.

2
Rep
-10
Offline
20:07 Jan-23-2018

are you high? Ubi**** is famous for crap day one optimization...

-3
Rep
24
Offline
09:54 Jan-24-2018

Yep, most of the Far Cry games I have played in the past were well optimized.

1
Rep
179
Offline
admin approved badge
15:51 Jan-23-2018

Bet a GTX 750 TI runs it just fine.

2
Rep
164
Offline
16:42 Jan-23-2018

I doubt it.
It's a very weak card.

4
Rep
179
Offline
admin approved badge
17:15 Jan-23-2018

For today it's a bit underpowered, it will still run just about every modern game with the settings turned down to a reasonable level though.... and at 1080P in the vast majority of them with frame rates of 45 or better..... comparing it with the minimum spec of a 670 or a 270, it's not at all far behind either of those.... possibly even slightly edging out the 670 because it's a more modern card that has received driver optimizations far greater than the 670 has.

0
Rep
179
Offline
admin approved badge
17:16 Jan-23-2018

And yet people still love and buy the card even today.

0
Rep
57
Offline
18:56 Jan-23-2018

I think a GTX 750 Ti should run this at 720p-1080p, all lowest, 60fps. My old GTX 650 Ti Boost (around the same performance as a 750 Ti) runs Rise of the Tomb Raider at 1080p, all low, 60fps.

1
Rep
179
Offline
admin approved badge
19:46 Jan-23-2018

Yeah, I think so too..... I still have a 750 Ti floating around.... and it's still a very capable card, especially when we're talking about the non-reference models; people who underestimate it almost always get proven wrong.

0
Rep
179
Offline
admin approved badge
19:48 Jan-23-2018

I think 60fps might be hoping for a bit much though.... but possibly doable at 720P with a decent CPU.... I'm sure it could easily pull a locked 30 though, even with settings a bit higher than lowest.

0
Rep
57
Offline
19:53 Jan-23-2018

Yeah, I agree. Usually they see the minimum requirements and don't bother to check performance videos on YouTube.

0
