MSI Reveals New R9-270X With 4GB Memory

Written by Jon Sutton on Sun, Dec 15, 2013 4:00 PM

MSI has revealed its brand new R9-270X graphics card that doubles the memory specification of the base model card from AMD.

The new Radeon R9-270X Gaming 4G graphics card doubles the card's memory to an R9 290-rivalling 4GB of GDDR5…

The graphics card from MSI features a new custom design incorporating MSI's patented TwinFrozr IV cooler. This should ensure a GPU that is not only much faster than the reference card but also quieter and cooler-running. TwinFrozr IV also features dust removal technology that can optimise cooling performance in otherwise dusty conditions.

The MSI R9-270X is clocked at a base frequency of 1030 MHz, but this can be boosted up to a hefty 1120 MHz. MSI claims that the new graphics card is built with Military Class 4 components that should, in theory, give the card a much longer-than-typical lifespan, as well as greater resistance to high temperatures.

The new card hasn’t had a release date announced yet but expect it in stores relatively soon.

Is this card appealing to you or do you think the extra on-board memory is superfluous?

In what situation would it be worth picking this up over, say, an R9-290?

 


Rep
425
Offline
09:58 Dec-17-2013

4GB eh? Useless for 1080p and a single card.. but it will really come in handy with a CFX setup at 1600p or higher res and Eyefinity.. :D
Certainly worth it if two of these cost less than an R9-290.. :D

0
Rep
1,041
Offline
senior admin badge
10:21 Dec-17-2013

For CrossFire this may come in handy, but I'd still go for at least a 280X...

0
Rep
425
Offline
10:37 Dec-17-2013

Yeah, you would, but many won't, and btw 2x 270X are stronger than a 280X by quite a long shot.. :)

0
Rep
250
Offline
admin approved badge
10:39 Dec-17-2013

I think he meant 2x 280x instead of 2x 270x for 4K

0
Rep
1,041
Offline
senior admin badge
10:51 Dec-17-2013

@Mukul ..yup ^^
aatire that's true ofc :)

0
Rep
354
Offline
admin approved badge
07:54 Dec-17-2013

Now I get why they paint cards red
Da red unz go fasta

0
Rep
18
Offline
20:02 Dec-16-2013

Shall I get this one when it comes out, or the MSI 7950 3GB? I don't want to settle for just 2 gigs for the mods :D

0
Rep
131
Offline
20:07 Dec-16-2013

4GB is useless on this card; get the 2GB version or save some more for a 280X

0
Rep
1,041
Offline
senior admin badge
21:38 Dec-16-2013

I'd save for 280X if I were you ;)

0
Rep
386
Offline
admin approved badge
21:39 Dec-16-2013

At his resolution an R9 280X is just overkill if you ask me :)

0
Rep
1,041
Offline
senior admin badge
21:44 Dec-16-2013

ah true, people say the 280X is as good as a GTX770... and I can run triple 1280×1024 surround :D

0
Rep
386
Offline
admin approved badge
21:48 Dec-16-2013

Well, I think the R9 280X is as good as the GTX 770 because it's as good as an overclocked HD7970 GHz, and the GTX 770 is as good as an overclocked GTX680. From all the benchmarks I've seen, I'd say the GTX 770 is slightly better than the R9 280X, although spec-wise it shouldn't be. Also, I don't think he's going to run triple monitors, so an R9 270X will serve him well, but the non-X R9 270 is a better choice :)

0
Rep
1,041
Offline
senior admin badge
07:59 Dec-17-2013

I checked his rig specs, and a 270X would be the best match :)

0
Rep
1,152
Offline
admin approved badge
02:59 Dec-17-2013

Maybe he can upgrade his GPU right now and his monitor in the future :)

0
Rep
18
Offline
14:02 Dec-18-2013

I just got a brand new 1080p monitor :D And my budget is €280, so I'm not going to go for a 280X, though I'd still go for 3GB as games are getting increasingly demanding in this regard (BF4), and I don't buy GPUs every 2-3 years

0
Rep
386
Offline
admin approved badge
14:03 Dec-18-2013

go for the r9 280x :)

0
Rep
18
Offline
22:32 Dec-20-2013

Aint got moneys :(

0
Rep
386
Offline
admin approved badge
01:46 Dec-21-2013

Then go for the R9 270X; it's perfect for 1080p

0
Rep
1,041
Offline
senior admin badge
09:57 Dec-21-2013

270X is good enough for 1080p :)

0
Rep
131
Offline
11:38 Dec-21-2013

I recommend people not buy the 270X; the 270 is better value. For some reason most people don't know that. Look at the benchmarks online: the 270 is either 5 FPS slower or 2-4 FPS faster.

0
Rep
386
Offline
admin approved badge
11:48 Dec-21-2013

It depends on whether the R9 270 is overclocked or not, because there's no way a stock R9 270 beats the R9 270X; the R9 270 is an underclocked R9 270X. And yes, it's better to buy the 270 and just slightly overclock it to R9 270X levels.

0
Rep
1,041
Offline
senior admin badge
11:55 Dec-21-2013

Truth IS that the 270X is just an OCed 270, but manufacturers tend to put better coolers on the 270X, and a stock 270X is only a little worse than the older HD7950, so it really depends on the price difference :Đ

0
Rep
131
Offline
13:27 Dec-21-2013

I would never buy a 270X; even if I had money for it I would get the 270 and put more money into a CPU or something else, like a better case or PSU

0
Rep
3
Offline
10:20 Dec-16-2013

Oh well, I just bought the sapphire dual-x 270X 4GB yesterday for the same price as the 2GB model. :)

0
Rep
386
Offline
admin approved badge
11:30 Dec-16-2013

then I don't see anything bad about it :)

0
Rep
40
Offline
10:01 Dec-16-2013

Useless gimmick, just buy the 2GB version. If you want high-res-capable gfx, go with higher-end cards, 'nuff said.

0
Rep
1,041
Offline
senior admin badge
10:02 Dec-16-2013

true, this card isn't capable of running 4K anyway :Đ

0
Rep
5
Offline
11:15 Dec-16-2013

But 2x 270X cards can run 4K and Eyefinity, where this 4GB of VRAM will be really helpful. This card is intended for that sort of crowd. Nobody will buy this for single-GPU purposes; it's for those who will CrossFire down the line.. then it will show its worth.

0
Rep
59
Offline
11:37 Dec-16-2013

A better choice would be the R9-290. It's about the same price but should be slightly faster, though most games will run at around 30 FPS or below.

0
Rep
1,041
Offline
senior admin badge
12:01 Dec-16-2013

Two 280Xs or two 290s would be better for 4K; I doubt a single gfx card could manage 4K well enough...

0
Rep
47
Offline
14:11 Dec-16-2013

Yeah, a single strong card is much more reliable than 2 weaker cards in CF/SLI.

0
Rep
131
Offline
07:02 Dec-16-2013

Even the 270X isn't worth the money; for some reason the 270 is better o_O

0
Rep
1,041
Offline
senior admin badge
07:05 Dec-16-2013

280X is worthy :Đ

0
Rep
131
Offline
07:07 Dec-16-2013

Yeah, let's just wait for the 280. Just like the 290X isn't worth it: the 290 is the same thing but downclocked, so the difference is 7-10 FPS max

0
Rep
94
Offline
07:21 Dec-16-2013

Well, we could use that card.

0
Rep
55
Offline
01:52 Dec-16-2013

This won't make any difference. More VRAM doesn't make the card better. People today play in Full HD using a single screen mostly. Still, 270x is a good card.

0
Rep
25
Offline
02:20 Dec-16-2013

For higher res, yes more VRAM does make a difference

0
Rep
53
Offline
admin approved badge
04:36 Dec-16-2013

Yep, I'll have to agree with Spongy.

0
Rep
65
Offline
04:49 Dec-16-2013

Yep, higher res requires more VRAM, but 256-bit isn't enough to utilize that 4GB.

0
Rep
53
Offline
admin approved badge
04:52 Dec-16-2013

Yeah. Buying this new product is practically not a good idea. Better to buy a higher-end GPU or the 2GB version.

0
Rep
1,152
Offline
admin approved badge
02:52 Dec-16-2013

But it's 256-bit, which will hold this GPU back when it comes to high res. :)

0
Rep
65
Offline
04:50 Dec-16-2013

So buddy, at last 1000+ thumbs! Wow, you have 1000+ thumbs :)

0
Rep
1,152
Offline
admin approved badge
06:33 Dec-16-2013

yeah :)

0
Rep
53
Offline
01:19 Dec-16-2013

Let's say you buy this today; most probably a higher version of this will be released in the next few months. I stopped chasing all this new tech after I bought my GTX670. Instead, I'll wait until I cannot play the latest games. Then I will think of a new one.

0
Rep
17
Offline
11:48 Dec-16-2013

Same here, there's nothing I can't play very well at full HD.
I want to upgrade something but there is just no reason.

0
Rep
25
Offline
00:47 Dec-16-2013

Wow, if only there was a color option

0
Rep
0
Offline
21:38 Dec-15-2013

I am having issues with my 270X, can somebody please help me!?
For some reason my PC isn't recognizing it. Under Display Adapters in Device Manager it says "Microsoft Basic Display Adapter" rather than the card name!
I've tried uninstalling and reinstalling the drivers several times as well.

0
Rep
1,041
Offline
senior admin badge
21:44 Dec-15-2013

Did you plug the card in correctly?

0
Rep
0
Offline
21:46 Dec-15-2013

Absolutely, 100% sure!

0
Rep
1,041
Offline
senior admin badge
21:49 Dec-15-2013

Then if you plugged the power cables into it and it's still not recognized, the problem could be your PSU...

0
Rep
0
Offline
21:50 Dec-15-2013

Well, I have a 500W PSU, which is the minimum recommended

0
Rep
1,041
Offline
senior admin badge
21:51 Dec-15-2013

Depends on the amperage delivered on the rails...

0
Rep
0
Offline
21:52 Dec-15-2013

Alright that's a start, thank you

0
Rep
1,471
Offline
admin approved badge
22:20 Dec-15-2013

Hi.
Have you tried using GPU-Z? It seems like the motherboard is not recognizing your card and thus not deactivating its integrated chipset.
Are you really sure you have plugged all the necessary power cables?
Is the card really seated in the slot? Those things are hard to fit :D

0
Rep
1,041
Offline
senior admin badge
22:22 Dec-15-2013

OH, now that Pip has said that: you might check your BIOS settings and disable the iGPU or change the graphics priority order :Đ

0
Rep
1,471
Offline
admin approved badge
22:28 Dec-15-2013

that should be automatic. In fact, all motherboards disable the chipset once the card is plugged in (and powered lol)

0
Rep
0
Offline
00:17 Dec-16-2013

Thank you everyone for your help! I just updated to Windows 8.1 and my PC is now recognizing the card! All is well! Thanks again

0
Rep
15
Offline
04:42 Dec-16-2013

You just updated to Windows 8.1 and now it recognizes your GPU? What on earth is this? So what exactly was the cause before?

0
Rep
53
Offline
admin approved badge
04:49 Dec-16-2013

Lol

0
Rep
1,041
Offline
senior admin badge
06:59 Dec-16-2013

@Pip not all; for example, my P8Z77-I can use both cards (iGPU + PCIe x16) at once and run them with Lucid Virtu (a hybrid nVidia-Optimus-for-desktop), so both can be used for one video output ;)

0
Rep
1,471
Offline
admin approved badge
20:36 Dec-15-2013

Hum...regarding the comments below:
This much VIDEO RAM is only needed for extreme resolutions. Thus, even if the GPU is able to fully utilize its frame buffer, it's not powerful enough to work under those circumstances. In the end, the extra RAM ends up giving no or very little FPS gain compared to the 2GB version.

0
Rep
1,471
Offline
admin approved badge
20:39 Dec-15-2013

And a good example is the GeForce GTX 780 Ti (a 3GB card) performing better than the GeForce GTX Titan (a 6GB card) despite having half as much memory. It's all about shader performance these days, and the 780 Ti has it all.

0
Rep
89
Offline
20:48 Dec-15-2013

Do you think 3GB will be enough in the future, or will this card be too slow by the time 3GB is not much anymore?

0
Rep
1,471
Offline
admin approved badge
20:49 Dec-15-2013

3GB will suffice for 1080p for at least 5 more years now.

0
Rep
1,041
Offline
senior admin badge
20:56 Dec-15-2013

Depends on whether the future will be 4K monitors ;)

0
Rep
1,991
Offline
admin approved badge
21:05 Dec-15-2013

I don't think our technology is ready for that. Rigs like brgamer's tend to struggle maxing out games at 4K.

0
Rep
1,041
Offline
senior admin badge
21:14 Dec-15-2013

yea, that's why I'm going for 1440p :))

0
Rep
382
Offline
21:17 Dec-15-2013

I don't think our wallets are either :P

0
Rep
1,471
Offline
admin approved badge
20:49 Dec-15-2013

And let's not forget what a failure the GeForce GTX 660 Ti 3GB version is. It performs worse than the 2GB version due to its bus-width configuration. Sometimes more memory means less performance.

0
Rep
7
Offline
21:36 Dec-15-2013

Mind if I ask you for some help in the future? I plan on upgrading my PC for The Witcher 3, and it looks like you're one of the few who actually speak some sense xD.

0
Rep
1,471
Offline
admin approved badge
22:22 Dec-15-2013

Hi.
Currently your PC is quite balanced and thus upgrading a single component would cause a bottleneck. Are you upgrading both GFX and CPU?

0
Rep
7
Offline
11:15 Dec-16-2013

Yeah, I had it in mind to upgrade both.
I thought about the FX 8350 and R9 280X, but recently the price of the 280X went up, so I don't know what to choose. The R9 290 isn't a bad choice, but money is always a problem in my case. I also don't know how my motherboard (MSI 990XA-GD55) would take all that, and what PSU would be decent for those components :).

0
Rep
11
Offline
21:40 Dec-15-2013

Another example would be the GeForce GTX 660 2GB or 3GB performing much better than the Zotac GeForce GT 630 4GB card found here: http://www.amazon.com/Zotac-GeForce-Express-Graphics-ZT-60405-10L/dp/B0084IELGE/ref=pd_ybh_4

0
Rep
386
Offline
admin approved badge
21:43 Dec-15-2013

Bro, the amount of VRAM doesn't determine which GPU is better; there are other things like shaders, ROPs, etc. The GTX 660 is better than a GT 630 because it has more CUDA cores, more texture units, more render units and more bandwidth :)

0
Rep
11
Offline
21:44 Dec-15-2013

Yeah, that's exactly what I was explaining...

0
Rep
386
Offline
admin approved badge
21:46 Dec-15-2013

oh.. my bad, I didn't see the context :)

0
Rep
19
Offline
20:43 Dec-15-2013

Thanks, useful information. :)

0
Rep
354
Offline
admin approved badge
20:44 Dec-15-2013

That's right, I can imagine 3 of them in CrossFire in a multi-monitor setup.
4GB will help a lot in that case

0
Rep
1,471
Offline
admin approved badge
20:45 Dec-15-2013

In that case, 2x 3GB cards will offer the same performance!

0
Rep
382
Offline
20:45 Dec-15-2013

So you're saying that the bus width isn't a limiting factor? That the screen resolution and shaders are?
Thanks.
Would you be so kind as to help me figure out a thing in the thread below? Tzz says the card is internally bottlenecked, so I was trying to calculate whether it really is. Too bad I've forgotten how to do that :D

0
Rep
1,471
Offline
admin approved badge
20:47 Dec-15-2013

I can't say for sure that the card cannot fully utilize a 4GB frame buffer with that memory bandwidth, but I can say that that much video memory is only needed at extreme resolutions (5760x1080), with extremely demanding games and image-improving techniques. In that situation, the 4GB card might deliver 20 FPS while the 2GB delivers 19.5. Not worth the difference.
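To put rough numbers on "only needed at extreme resolutions", here is a minimal back-of-the-envelope sketch in Python; the 32-bit colour and triple-buffering figures are illustrative assumptions, and real VRAM use is dominated by textures rather than frame buffers:

```python
# Rough colour-buffer footprint at different resolutions (illustrative only).
BYTES_PER_PIXEL = 4  # assumed 32-bit RGBA colour

def colour_buffers_mb(width, height, msaa=1, buffers=3):
    """Triple-buffered colour targets with optional MSAA samples."""
    return width * height * BYTES_PER_PIXEL * msaa * buffers / 1024**2

print(f"1920x1080:          {colour_buffers_mb(1920, 1080):6.1f} MB")           # ~23.7 MB
print(f"5760x1080 + 4xMSAA: {colour_buffers_mb(5760, 1080, msaa=4):6.1f} MB")   # ~284.8 MB
# Resolution multiplies every render target, which is why VRAM pressure
# only climbs toward 4GB at surround/4K resolutions with heavy settings.
```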

0
Rep
386
Offline
admin approved badge
20:50 Dec-15-2013

http://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/
Is this true, Pip, or is it just that the GTX680 can't utilize more than 2GB of VRAM? :)

0
Rep
1,471
Offline
admin approved badge
20:54 Dec-15-2013

That depends on whether the RAM usage in those games was above 2GB, and how far above.
In the 3D games it appears that there's no difference - this is what I mentioned above.
In the synthetic benchmark, which is quite heavy, there is a significant difference, but both cards show a poor result - also what I mentioned above.

0
Rep
386
Offline
admin approved badge
21:01 Dec-15-2013

thanks for the info :)

0
Rep
386
Offline
admin approved badge
20:49 Dec-15-2013

Thanks for the info, I thought that it was just the bus and its bandwidth. Thanks very much Pip, and sorry for the false info I gave, I got it all wrong :)

0
Rep
1,041
Offline
senior admin badge
19:33 Dec-15-2013

useless, just making it expensive to produce :P

0
Rep
1,991
Offline
admin approved badge
20:40 Dec-15-2013

I saw "10gb" at the end of the link and was just like "what the f*ck?"...

0
Rep
382
Offline
19:44 Dec-15-2013

Wow, and that's coming from an admin who's supposed to know at least a bit about hardware. You should know better, Tzz. :)
You could have at least commented on why you think it's useless, because the card itself CAN utilize ALL of its video memory.
The production cost doesn't actually increase much. It's just the memory, really. There isn't any additional heat or anything for which you would need better cooling, which would make the production costs a lot higher. I'm still waiting for valid sources from anybody who claims the card can't use more than 2GB of video RAM

0
Rep
1,041
Offline
senior admin badge
20:09 Dec-15-2013

ok let me start,
any gfx card will try to use all its memory (before overwriting, same as it works for RAM), so people who say that a gfx card cannot use its full memory capacity are wrong;
now let me explain my "useless" post - it's because the memory capacity is bottlenecked by the bandwidth (which is determined by memory bus width and memory frequency).
I will not provide any sources, but you can try to calculate how long it takes to fill those 4GB with this card's bandwidth, then consider what frames per second you'd want, and you'll clearly see those 4GB cannot be used effectively; it would arguably be faster if the card were only 2GB, because then it could use an additional 2GB of system RAM, which would run at 128-bit if run in dual-channel.
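A minimal sketch of that fill-time calculation, assuming the R9 270X's reference memory spec (256-bit bus at 5.6 Gbps effective, i.e. a 179.2 GB/s theoretical peak; real-world throughput is lower):

```python
# How long does one full pass over the 4GB of VRAM take at peak bandwidth?
VRAM_GB = 4
PEAK_BW_GB_S = 179.2  # assumed R9 270X reference peak; actual throughput is lower

pass_time_ms = VRAM_GB / PEAK_BW_GB_S * 1000
print(f"One full pass over {VRAM_GB} GB: ~{pass_time_ms:.1f} ms")  # ~22.3 ms
print(f"Full-VRAM passes per second: ~{1000 / pass_time_ms:.0f}")  # ~45
# A frame that genuinely touched all 4GB could therefore never exceed
# ~45 FPS, even if the GPU's compute side were infinitely fast.
```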

0
Rep
382
Offline
20:18 Dec-15-2013

Yeah, I see your point. Wouldn't you agree that the higher VRAM is actually a benefit for higher-resolution textures? They can be loaded into the 'spare' video RAM and read straight from there, instead of having to be loaded from SSD/HDD. I'll try and calculate the bandwidth for this particular card. Just lemme remember the formula.
edit: damn, I really can't figure out the formula for this. Perhaps you remember it?
Was it something like peak bandwidth divided by desired FPS, or was it frame times?
Okay, I'm really starting to confuse myself (and probably others too)

0
Rep
1,041
Offline
senior admin badge
20:29 Dec-15-2013

emilsz ...that would be true, but you're forgetting that "loading" is dependent on bandwidth;
what would be the benefit of loading something if the "queue" is already full? The result would be an FPS drop if loading had higher priority...
For example, one 1080p picture is 2,073,600 pixels; 60 FPS means 124,416,000 pixels - in a second(!). If we consider 2×AA, then it would be 497,664,000 pixels in a second...

0
Rep
382
Offline
20:41 Dec-15-2013

With a color depth of 32 bits, one pixel takes 4 bytes, right? Which means that those almost 500 million pixels would need two billion bytes, that is a bit less than 2 gigabytes. So if the peak is 180 GB/s, how come there would be an internal bottleneck with only 2 GB of data being transferred in a second? I think we missed something here.
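Writing out the arithmetic from these two posts (reproducing their figures, including counting 2×AA as four samples per pixel, as the post above does):

```python
# The scan-out arithmetic from the two posts above, step by step.
pixels_per_frame = 1920 * 1080            # 2,073,600 pixels
pixels_per_sec = pixels_per_frame * 60    # 124,416,000 pixels/s at 60 FPS
aa_samples_per_sec = pixels_per_sec * 4   # 497,664,000 (the "2xAA" figure above)
bytes_per_sec = aa_samples_per_sec * 4    # 4 bytes per sample at 32-bit colour

print(f"{bytes_per_sec / 1e9:.2f} GB/s")  # ~1.99 GB/s vs the ~180 GB/s peak
```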

0
Rep
1,041
Offline
senior admin badge
20:48 Dec-15-2013

@emilsz ..yeah, my bad, we didn't account for sync data, both-way transfer and such...

0
Rep
1,041
Offline
senior admin badge
21:00 Dec-15-2013

OH, I know what the trick is, PCIe x16 speeds:
v1.x: 4 GB/s (40 GT/s)
v2.x: 8 GB/s (80 GT/s)
v3.0: 15.75 GB/s (128 GT/s)
v4.0: 31.51 GB/s (256 GT/s)
gfx card speeds are actually counted in GT (giga texels), so it's a manufacturer's gimmick; as you can see above, PCIe x16 3.0 offers up to 15.75 GB/s (so it would be nonsense if a gfx card had 220 GB/s)
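For reference, the 15.75 GB/s figure falls out of the line rate and encoding (GT/s here stands for gigatransfers per second); a quick sketch:

```python
# Deriving PCIe x16 3.0 usable bandwidth from line rate and encoding.
lanes = 16
line_rate_gts = 8.0   # PCIe 3.0: 8 GT/s per lane, one bit per transfer
encoding = 128 / 130  # 128b/130b line-encoding overhead

gb_per_s = lanes * line_rate_gts * encoding / 8  # divide by 8 bits per byte
print(f"{gb_per_s:.2f} GB/s")  # ~15.75 GB/s, matching the list above
# Note: this is the CPU<->GPU link; a card's quoted VRAM bandwidth
# (e.g. ~180 GB/s) is a separate, much wider bus on the card itself.
```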

0
Rep
382
Offline
21:16 Dec-15-2013

Okay, let's not confuse ourselves any more today. I'll try and find myself some literature on this matter and study this thing. But hey, PCIe x16 3.0 is just for marketing; even high-end cards see practically no difference running on either 1.0 or 3.0. Cards nowadays haven't even advanced far enough to use that much bandwidth. So the VRAM of high-end cards isn't really running faster than 4GB/s? Again, confusing.

0
Rep
1,041
Offline
senior admin badge
21:24 Dec-15-2013

*I'm a bit tired today, so if anyone feels I'm writing bull*** feel free to add their opinions
@emilsz basically something like that; for example, I tested my RAM (as a RAMdisk, so I could use the CrystalDiskMark HDD benchmark) and the result, as you can see, is about 10GB/s for my 1600MHz DDR3 RAM

0
Rep
11
Offline
19:32 Dec-15-2013
0
Rep
1,471
Offline
admin approved badge
20:40 Dec-15-2013

Well that GFX is a joke :P

0
Rep
11
Offline
21:29 Dec-15-2013

Yeah, like what is the point of it?? It's a low-end GPU with 4GB of RAM???

0
Rep
89
Offline
21:58 Dec-15-2013

Marketing trick.

0
Rep
354
Offline
admin approved badge
19:05 Dec-15-2013

Why stop at 4GB? Why not use 32GB of DDR3 and sell it as the ULTRA R9 2070X?

0
Rep
382
Offline
19:09 Dec-15-2013

Because you wouldn't really get much out of 32 gigs of VRAM, especially if it's DDR3. There is a point in making a 4GB version for CF and 1600p+ resolutions, but there's not much point going over 4 gigs.
I'm not even sure why I'm replying though.. I know it's sarcasm :D

0
Rep
1,471
Offline
admin approved badge
20:41 Dec-15-2013

he was being sarcastic.
Also, it is not possible to have that many memory modules in a single card, especially DDR3. Unless that's all the card has lol

0
Rep
1,991
Offline
admin approved badge
20:42 Dec-15-2013

I would like to see a card with 32 gigs of RAM though, useless or not.

0
Rep
354
Offline
admin approved badge
20:43 Dec-15-2013

Why not? high density 8GB chip x 4

0
Rep
1,471
Offline
admin approved badge
20:44 Dec-15-2013

I think we will never see that unless people all start to game on 100-inch monitors, or more memory-consuming techniques are invented lol

0
Rep
1,471
Offline
admin approved badge
20:45 Dec-15-2013

And where is this so-called 8GB chip? lol

0
Rep
354
Offline
admin approved badge
20:48 Dec-15-2013

Servers?

0
Rep
9
Offline
18:50 Dec-15-2013

Nope, not interested :D

0
Rep
460
Offline
admin approved badge
18:33 Dec-15-2013

Doesn't this GPU have a 256-bit bus...? So this 4GB is a gimmick, right...?

0
Rep
386
Offline
admin approved badge
18:35 Dec-15-2013

yes, you will get no performance boost over the 2gb version no matter what :)

0
Rep
382
Offline
18:41 Dec-15-2013

Stop repeating that and give me an explanation why you think so, or who told you so. Or I'll be forced to call Pip here :D

0
Rep
386
Offline
admin approved badge
18:49 Dec-15-2013

kk, well, most people don't read the comments below; if they did, they wouldn't repeat the same question for me to answer :)

0
Rep
250
Offline
admin approved badge
18:17 Dec-15-2013

Gimmick. Meh

0
Rep
-6
Offline
18:02 Dec-15-2013

I dunno, I wouldn't get anything less than a 280X, but that's just my opinion

0
Rep
382
Offline
18:02 Dec-15-2013

Probably because you have one :P

0
Rep
-6
Offline
18:03 Dec-15-2013

No, I actually don't... yet... As soon as I re-read my comment I was like, they're gonna think that's why I said that...

0
Rep
382
Offline
18:05 Dec-15-2013

=D
Getting it soon?

0
Rep
-6
Offline
18:13 Dec-15-2013

Yeah, the 3GB Toxic, but I'm still torn because I just want to see what the hell Mantle is going to do for AMD... If it's nothing worthy I might just postpone the build and save more for a 780... I'm just tired of missing out on PC gaming because I'm using this crappy laptop atm lol

0
Rep
382
Offline
18:15 Dec-15-2013

Shouldn't be long now. Just remember that Mantle won't be anything huge at first. It'll take time to develop; it'll only get better with time.

0
Rep
386
Offline
admin approved badge
18:18 Dec-15-2013

The Toxic R9 280X actually is almost as good as the GTX780, and with a slight overclock on your side you can get the same performance as a GTX780, and it will not overheat, don't worry :)

0
Rep
59
Offline
18:33 Dec-15-2013

The 280X Toxic is well over 10% worse according to GD... but if you really want Mantle, get an R9-290 instead. It costs $30 less than a GTX 780 AFAIK, and custom coolers should come out by January.

0
Rep
386
Offline
admin approved badge
18:36 Dec-15-2013

Yes, the R9 290 is better than the GTX780; a stock R9 290 costs only $350-400 and a stock GTX780 costs $450-500 :)

0
Rep
-6
Offline
18:42 Dec-15-2013

I just might do that instead then thx guys... appreciate the feedback

0
Rep
131
Offline
17:58 Dec-15-2013

Maybe 4GB is useful if you run them in CrossFire on a tri-monitor setup, but still, it's 256-bit... at least 384-bit would be needed for enough bandwidth

0
Rep
386
Offline
admin approved badge
18:37 Dec-15-2013

Nope, a 256-bit bus can't use more than 2GB of VRAM; even if more than 2GB of VRAM is filled with data, it will use only 2GB at a time, no matter what :)

0
Rep
5
Offline
18:40 Dec-15-2013

So you are busy spreading your wrong info everywhere.. I am new to this site.. it made a really bad impression, you know..

0
Rep
382
Offline
18:44 Dec-15-2013

I'm sorry that you've had a bad first impression of our site :/ Hopefully this guy above will explain himself better, or I'll have to report him to the authorities. GD supports discussion, but we don't support misleading info :)

0
Rep
59
Offline
18:44 Dec-15-2013

It CAN use more than that, but the card doesn't have the processing power to fill up the 4GB, so the extra VRAM is a waste on this card.

0
Rep
5
Offline
18:49 Dec-15-2013

Ok, thanks emilszz.. looks like you're a senior member, you have a badge and high rep.. so I guess I can rely on you.. cheers.. :)

0
Rep
1,152
Offline
admin approved badge
17:52 Dec-15-2013

So what's the price of this card? :)

0
Rep
386
Offline
admin approved badge
17:55 Dec-15-2013

$20 over the 2GB version, but this GPU is just a gimmick; it will give no performance boost over the 2GB version, 256 bits can handle only up to 2GB of GDDR5 RAM :)

0
Rep
382
Offline
18:00 Dec-15-2013

That's not true. Who told you that?
It can use a terabyte of VRAM, but the raw GPU power just isn't enough at 1080p for the card to really use more than 2 gigs with today's titles.
Two of those in CF would use all four gigs at 4K resolution.

0
Rep
1,152
Offline
admin approved badge
18:04 Dec-15-2013

Yeah, high res like 3840x2160 would benefit from this GPU :)

0
Rep
5
Offline
18:34 Dec-15-2013

@emilsz At least somebody here properly knows about hardware.. I said the same thing below and noobs said 256-bit cannot use 4GB lol

0
