Let it Die PC System Requirements Recommend a GTX 1080 or Radeon RX Vega 64 GPU

Written by Jon Sutton on Sun, Sep 16, 2018 9:00 AM

Released back in late 2016 exclusively on PS4, Let It Die has led a curious life. Originally announced as Lily Bergamo, it's a free-to-play action game with multiplayer invasion elements that enjoyed a decent reception before all but disappearing off the face of the Earth. Almost two years later, though, Let It Die is making a comeback, this time coming to PC with some of the most heinously demanding system requirements we've ever seen.

Let it Die Minimum System Requirements

  • OS: Windows 7 64-bit
  • CPU: Intel Core i7-4770 3.4 GHz or AMD Ryzen 5 2400G 3.6 GHz
  • RAM: 8 GB System Memory
  • GPU RAM: 2 GB Video Memory
  • GPU: GeForce GTX 680 or Radeon R9 380
  • DX: DirectX 9
  • HDD: 40GB Available Hard Drive Space

Let it Die Recommended System Requirements

  • OS: Windows 10 64-bit
  • CPU: Intel Core i7-7700 3.6GHz or AMD Ryzen 5 1600 3.2GHz
  • RAM: 16 GB System Memory
  • GPU RAM: 8 GB Video Memory
  • GPU: Nvidia GeForce GTX 1080 or AMD Radeon RX Vega 64
  • DX: DirectX 9
  • HDD: 40 GB Available Hard Drive Space
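
For readers who want to sanity-check a rig against the quantitative parts of these two tiers, the comparison boils down to a simple threshold check. Below is a minimal, illustrative Python sketch; the example system and the field names are hypothetical, and actually detecting installed hardware is out of scope:

```python
# Spec values taken from the article's minimum and recommended tiers.
MINIMUM = {"ram_gb": 8, "vram_gb": 2, "hdd_gb": 40}
RECOMMENDED = {"ram_gb": 16, "vram_gb": 8, "hdd_gb": 40}

def meets(spec: dict, system: dict) -> bool:
    """True if the system matches or exceeds every listed value."""
    return all(system.get(key, 0) >= value for key, value in spec.items())

# A hypothetical mid-range system: a 6 GB graphics card with 16 GB of RAM.
system = {"ram_gb": 16, "vram_gb": 6, "hdd_gb": 120}

print(meets(MINIMUM, system))      # True: clears every minimum threshold
print(meets(RECOMMENDED, system))  # False: 6 GB VRAM misses the 8 GB asked for
```

Of course, GPU model comparisons (GTX 680 vs GTX 1050, say) don't reduce to a single number this neatly, which is part of why published requirement lists are so often off the mark.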

I'll just provide a gentle reminder that this game is out in 10 days at the time of writing. These are the sort of grossly miscalculated system specs that are usually farted out about a year before a game launches, so it's certainly worrying to see a DirectX 9 title like Let it Die carrying such hugely demanding system specs just days before release.

It's been a while since I've been specced out of a game, but Let it Die is going to run it damn close. The minimum requirements for Let it Die call for a Core i7-4770 quad-core CPU or equivalent, paired with either a GeForce GTX 680 or a Radeon R9 380. That could spell trouble for those with a GeForce GTX 1050 or Radeon RX 460.

As for Let it Die's recommended requirements, well, they're just insane. Based on the gameplay footage released, we can only presume they're wrong. Let it Die looks like an up-rezzed PS3 game, so it's difficult to understand why a GTX 1080 or Radeon RX Vega 64 would be recommended unless these are 4K specs. If they really are for 1080p, we'd expect far, far weaker systems to run the game without breaking a sweat.

The good news is that Let it Die is free-to-play so there's not too much harm done if it doesn't run very well on your system. There's the not-insignificant matter of a 40GB download, of course, but you shouldn't actually be out of pocket unless you've got a bandwidth cap.


As ever, remember you can always check how well your PC can run Let it Die over on its system requirements page, where you'll find benchmarks and performance reports from other users. You can also compare your graphics card against the Let it Die GPU benchmark chart, and we have a Let it Die frames-per-second system performance chart for you to check as well.


Rep
10
Offline
02:17 Sep-18-2018

well I should be ok then :)

0
Rep
14
Offline
16:47 Sep-17-2018

Maybe it's running over a "PS4 emulator", or trying to emulate something of the PS4 system, and that's the reason for these ridiculous requirements...
Or they are just a**holes that will be releasing an unfinished game

0
Rep
191
Offline
junior admin badge
18:03 Sep-17-2018

I'd say it's the 2nd one.

1
Rep
386
Offline
admin approved badge
17:59 Sep-18-2018

Emulation in the case of the PS4 or Xbox One would be CPU heavy and not GPU, as the GPU architecture doesn't have as many different instructions and variable pipelines to get a heavy performance hit.

0
Rep
55
Offline
09:52 Sep-17-2018

With such bad optimization I think we should take a cue from the name of the game and just 'Let It Die.' :D

1
Rep
8
Offline
07:23 Sep-17-2018

Welcome to another episode of "Why is it important to optimize games".
Subject of today - Let It Die

2
Rep
35
Offline
admin approved badge
23:37 Sep-16-2018

"I was gonna optimize the game...but then I got high!"

3
Rep
58
Offline
19:46 Sep-16-2018

I don't know what the developers are on, but I want some!

2
Rep
3
Offline
18:48 Sep-16-2018

Mmm, that feeling when your optimization is pure garbage.

0
Rep
105
Offline
16:38 Sep-16-2018

Did I read that right? DirectX 9? xD Just did some research, and this game does look like a PS3/Xbox 360 game: colors are dull, and you have to start all over if you die. I'm probably just going to download it to benchmark.

0
Rep
213
Offline
admin badge
13:48 Sep-16-2018

Most devs don't even know what would run it and what wouldn't. Most system requirements, minimum or recommended, are just an educated guess of what should work. That's why some games are unplayable even with minimum hardware and some are perfectly playable. It's almost like this is just a scheme to get attention for the game for marketing, or at least they're assuming that card will be needed to run all these kinds of AA lol

0
Rep
44
Offline
12:51 Sep-16-2018

Lol, if a game with such graphics literally recommends a GTX 1080, then it will officially take the throne of the worst optimized PC port.
After GTA 4.

10
Rep
272
Offline
admin approved badge
11:51 Sep-16-2018

680 as a minimum, 1080 as recommended? Let this game die... xD

15
Rep
191
Offline
junior admin badge
14:34 Sep-16-2018

Good one XD

0
Rep
74
Online
11:27 Sep-16-2018

Optimization, not even once.

3
Rep
14
Offline
10:50 Sep-16-2018

looks to me like such shocking "extreme" hardware requirements are the new hype weapon for these probably mediocre games

3
Rep
386
Offline
admin approved badge
11:17 Sep-16-2018

2.5-year-old mid-range gtx 1080 isn't exactly "extreme", we might not have much better than it even after the RTX 2080(ti) comes out, but still. Only extreme are the prices of the gtx 1080 as that 134% profit margin kicks us in the pants and feeds Nvidia's pockets.


Vega is just stupid... HBM costs 13.64x more (150$ per 8GB) than GDDR5(X)/6 (11$ per 8GB), the interposer costs 25-30$, and GDDR5X/6 does the same job, and that's cost to AMD... dumb move, poor value, but not as overpriced... just bad value due to worthless high production cost... but miners buy them...

-11
Rep
14
Offline
11:20 Sep-16-2018

I've heard about your legendary status. Keep it up

14
Rep
74
Online
11:28 Sep-16-2018

Did you seriously just put 1080 in the mid range? Sure, it still costs 1000$ retail, let's call it mid range. My god....

10
Rep
80
Offline
admin approved badge
12:12 Sep-16-2018

It is mid range; if I am correct, the GTX 1080 Ti, Xp and V are high end,
just as the GTX 680 and 690 are still high end, and it won't change

-17
Rep
386
Offline
admin approved badge
16:24 Sep-16-2018

@David988 gtx 680 has a 298mm^2 die... it was meant to be called gtx 660 back in the day, but since AMD's HD7970 was a 352mm^2 die GPU that performed on par with it and AMD overpriced it at 550$ MSRP, while the hd6970 a single year earlier a 389mm^2 die GPU was 350$... And since Nvidia is actually good at marketing, they knew that they had to name their 550$ product a GTX X80, in this case, gtx 680 to be at the same price as the gtx 580.


So the bottom line is the gtx 680 was meant to be called a gtx 660 and cost 250$, but it ended up as a gtx 680 costing 550$ just like that... -_-

0
Rep
386
Offline
admin approved badge
11:36 Sep-16-2018

user cost and performance don't determine tier... they determine value. Die size is what determines a chip's tier.
Gtx 1080 -> 312mm^2 die size, low priority R&D, less than 20% failure rate from the get-go and 700$ MSRP in 2016.
Gtx 560 -> 332mm^2 die size(10% bigger), huge R&D costs(much bigger than for pascal), 30-40% failure rate(double the failure rate) -> 200$ MSRP in 2011(254$ with inflation in 2016).
Both GPUs had only 2x GPUs stronger than them as I've seen people caring only about that.
The gtx 560 is/was a more expensive GPU to Nvidia, but back then their GPU division operated on a small profit margin; now Nvidia's GPU division operates on a 134% profit margin. For reference, Apple operates at a 35-45% profit margin, so Nvidia is 3x more overpriced than Apple at the very least... -_-

-3
Rep
74
Online
12:06 Sep-16-2018

For as long as I can remember, only thing die size determined was how many transistors you can pack in (a given surface). Never ever have I seen any GPU rating being determined by die size. Companies determine value. My RX 580 dishes out slightly lower performance in most games than my friends 1060 does (-5 fps), while costing 190$ less. I've only ever heard you use die size as a tier chart. Even my professor in Computer architecture said that performance is what determines tier, crossing out the enthusiast GPUs (like titan). All that info you just dished out means squat and is completely illogical.

3
Rep
74
Online
12:07 Sep-16-2018

If anything, you compare the same generation GPU and determine the tier based on that. From 1030 to Titan. I'd even exclude Titan since it's for really rich people (a perverted business). Fabrication process stands by itself and is usually lower as generations go, until the lowest possible is reached before quantum computing. Don't see how entry/mid/perf tier has anything to do with die size. What I meant by illogical is that there can be the same performance for different sizes while you can have different performance for one size, so using die size to compare performance is not really definite.

3
Rep
116
Offline
12:56 Sep-16-2018

I don't tend to agree with Psychoman very often here, but he is right in this case. Chip size is what determines the tier of the GPU.
But due to the lack of competition on the GPU front and Nvidia's damn brilliant marketing department, they have managed to effectively persuade the public that performance is what matters and what determines the tier standing of GPUs.

1
Rep
95
Offline
14:36 Sep-16-2018

Arguing about the GPU tier is as useful for the common person as arguing about whether the earth is flat; in so far as its impact on our day-to-day lives.
As Psychoman has himself said multiple times, when he buys a GPU, he considers cost to performance. Ergo the tier does NOT matter unless you want to be bitter that you have to pay too much money for a mere "mid tier" card.

0
Rep
95
Offline
14:46 Sep-16-2018

Same goes for profitability. Oh this company is selling its product at a loss, it must be a great deal.. I should buy their product. And by this logic, no one should ever buy things like bottled water, air filters, and hdmi/usb cables.
Yes, its a great idea for money to flow to the inefficient companies instead of the ones who manage to keep their costs down.

0
Rep
386
Offline
admin approved badge
16:29 Sep-16-2018

@ProtosAngelus You've never seen a GPU rating based on die size, because they never rated it based on die size. But back in the day AnandTech and many more covered the price for the die size, wafer cost, failure rate and R&D costs and volume expected to be sold(if they were made officially publicly available, which they always were and have been), but they noticed that people just went straight to the benchmarks so they simplified the reviews...


And the tiers are usually an indication of cost and how much better chip there can be. The gtx 1080 is a GP104(GeForce pascal 104) chip at 312mm^2, the gtx 560 is a GF104(GeForce Fermi 104) at 332mm^2, so they haven't changed the chip naming either, great indication of price as always as they name them based on cost and relative size.

0
Rep
386
Offline
admin approved badge
16:32 Sep-16-2018

the gtx 1080ti/titan Xp is a 471mm^2 die size GPU and a GP102 chip, the GP100 is a 601mm^2 die chip that is unreleased.


The gtx 1080 is thus mid-range as it's half of what Nvidia can produce, well 51.9% to be exact. And die size is not about performance as I said many times and tiers aren't about performance... at 40nm a 300mm^2 die size GPU would perform worse than at 14nm 300mm^2 die size GPU if you just added more cores, without optimizing the architecture for it or improving the architecture, which would lead to better performance, but huge overhead if just added more cores.

0
Rep
386
Offline
admin approved badge
16:48 Sep-16-2018

@Penoys absolutely, but it shows how much it costs them and thus how much more they are charging you; it also shows how much better there is and/or can be. The GP100 would have been 80-90% faster than a gtx 1080... but there just wasn't a sane price for them to sell it at for a 100-134% profit margin... I'm sounding like a broken record, but there is ignorance I can pass by and ignorance that I can't... this I'm tired of letting pass by since 2011... -_-

0
Rep
386
Offline
admin approved badge
16:54 Sep-16-2018

Also, money flowing to high profit big and mega business is bad... they rarely use it and that money disappears from the circulation... Companies that sell at low-profit margins usually do due to actually using the extra money to expand their current divisions and/or make new ones... That's why many investors have withdrawn from Apple, even though Apple is set to soon be a trillion dollar company, they just don't have enough sources of income for their huge capital.

0
Rep
74
Online
17:47 Sep-16-2018

But why would you rate GPUs by their die size today? Die size was a major contributor in the early 2000s, but today it's mostly cost to performance ratio, and in Croatia AMD wins on this front, mainly because the 1060 is ~200$ more expensive than the RX 580 while providing a bit more FPS in gaming and decreased single core performance (mining/gaming/processing/editing/solving in Wolfram/...).
So the question still stands: why would you rate them by their die size when you can rate them by their performance in programs/games, from benchmarks, over analytical examinations in physics, to "Can it run Crysis 3?"

1
Rep
74
Online
17:49 Sep-16-2018

When I say cost to performance, I'm looking at my country, not the MSRP.
So, either rate it by cost to performance, absolute performance or don't rate it at all, because rating it by the die size is plain obsolete.

0
Rep
213
Offline
admin badge
17:32 Sep-16-2018

David, the gtx 680 is not high end. It's not awful, but it's clear as day that a gtx 1060 3GB beats it http://gpu.userbenchmark.com/Compare/Nvidia-GTX-1060-3GB-vs-Nvidia-GTX-680/3646vs3148 lol. But somehow, according to you, the gtx 1080 is midrange while the 680 is high end, when the 1080 demolishes it http://gpu.userbenchmark.com/Compare/Nvidia-GTX-1080-vs-Nvidia-GTX-680/3603vs3148 like wut lmao

0
Rep
80
Offline
admin approved badge
19:51 Sep-16-2018

I cannot remember where I read it and cannot find it again, but it was about GPU hierarchy and their classification into entry, mid and high-end levels, and there they said that the 680 is high end. As I understood it, for example, 10 years from now the gtx 1080ti will be useless and people will call it entry level or something, but it will actually still be high end.
There is also a chance that that site was full of crap and that I am completely wrong because of it

0
Rep
213
Offline
admin badge
20:13 Sep-16-2018

I think it's perception more than anything. With current GPUs, most people would see the gt 1030 as low end, gtx 1050 as low-mid, 1050ti and 1060 as mid, 1070 and 1070ti as high end, and 1080 and 1080ti as very high end (that's just an example, people, don't get your balls in a knot). Anyways, if they were talking about that same series when it first launched, like I mentioned, then yes, it makes perfect sense they would say that. But the title doesn't carry over generations: the 290x, for example, still a damn excellent card, is arguably midrange now even though a couple years ago it was flagship for AMD.

1
Rep
386
Offline
admin approved badge
20:16 Sep-16-2018

@David988
and you are right a 471mm^2 die chip is high-end even with the rtx 2080ti being this huge.
the gtx 680 on the other hand was 298mm^2 and was never high-end.

2
Rep
386
Offline
admin approved badge
18:09 Sep-16-2018

I get it: just because the gtx 1080 costs 700$+, people think it's "extreme" requirements; if the gtx 1080 cost 200$ nobody would care, I guess... but nobody cares that there is a chip that would be 80-90% faster that was unreleased, and that the gtx 1080 is sold at the price point that chip was supposed to be at if it was released and NOT overpriced... so sad... 0 principles, 0 care, just ignorance and lack of interest...

0
Rep
8
Offline
07:20 Sep-17-2018

After three comments I gave up on this discussion. So much misinformation, so much nonsense.
I hope you guys don't have a page/blog/YT channel where you brainwash others with your lack of understanding about stuff...

0
Rep
386
Offline
admin approved badge
11:21 Sep-17-2018

I don't and feel free to correct everything you think is wrong. I'm always open for correction if anything I say is misinformation.

0
Rep
8
Offline
09:46 Sep-19-2018

This useless discussion started when you misunderstood the original commenter and responded with the wrong idea of what the person actually said.
These requirements ARE extreme for such a game. Nobody called the 1080 an extreme-range card; it's the game needing one that's extreme. It's like requiring anything above an AMD APU or Intel HD for a Tetris game.
Not gonna go any further, since this comment is at risk of getting misunderstood as well. :)

1
Rep
60
Offline
10:47 Sep-16-2018

And on top of that it recommends dx9 ....

4
Rep
2
Offline
09:53 Sep-16-2018

The devs were like: "Let's not optimize the game properly and put a GTX1080 in the recommended specs."

5
Rep
386
Offline
admin approved badge
11:18 Sep-16-2018

As always they might not have had other setups available... it's not their job to test every possible combination... they probably have their current systems and their older ones and that's it 2-3x configs tops...

1
Rep
18
Offline
09:43 Sep-16-2018

To be honest I wouldn't have clicked on this article, thus wouldn't have read about this game, if it wasn't for the ridiculous requirements. Sooo maybe a genius marketing tactic?

1
Rep
15
Offline
09:31 Sep-16-2018

yep, let it die indeed :v lol

9
Rep
133
Offline
junior admin badge
09:30 Sep-16-2018

I played it on PS4; the graphics look average, so why need all that power?
Either 1) 4K or 2) poor optimization,
but most likely it's for 4K

1
Rep
20
Offline
09:28 Sep-16-2018

I think I get it: this is a torture game that will kill your system, and you're supposed to let it die in order to win :D

0
Rep
133
Offline
junior admin badge
09:31 Sep-16-2018

Let it Go...... I mean Let it Die

2
Rep
39
Offline
09:27 Sep-16-2018

A: The requirements are wrong B: The game is very poorly optimised

1
