Up For Debate - With the GeForce GTX 1660 Ti, GPU Names Have Become a Confusing Mess

Written by Jon Sutton on Sat, Jan 26, 2019 2:00 PM

With Nvidia’s GeForce GTX 1660 Ti looking increasingly less like an April Fool’s joke and more like the living nightmare of anyone faintly interested in naming conventions, is it about time hardware naming was simplified?

For anyone not intimately familiar with the weird and wonderful world of PC gaming hardware, Nvidia’s naming scheme just became gibberish. Even for the millions who know the ins and outs of graphics cards, it’s incredibly difficult to find a logical reason why Nvidia would settle on the GTX 16 series.

It means we had the beloved GeForce GTX 10 series, then the GeForce RTX 20 series, and then an awkward step backward to the GeForce GTX 16 range. It’s almost like Nvidia is trying to split the difference between the 10 and 20 series, forgetting the GTX 15 series makes far more logical sense.

The end result is that Joe Bloggs walking into PC World is faced with a bewildering array of options, stretching from the budget-priced GT 1030 up to the $1200 GeForce RTX 2080 Ti. Is a GeForce GTX 1070 faster than a GTX 1660? Is a GeForce RTX 2060 weaker than a GeForce GTX 1660 Ti? What’s the difference between a 2060 and a 1060? Is the 1660 Ti a previous-gen graphics card?

If the rumours are true and Nvidia goes ahead with this, Team Green has obfuscated graphics card naming in a bewildering way. What’s most disappointing is how much work they’ve done over the last decade to simplify it all. Not that Nvidia is alone: no one would argue PC hardware naming is free of issues. Whether it’s deliberate or not, nothing is simple. Don’t even get me started on motherboards.

But really, I want to know what Nvidia’s plan is here. For me, the GeForce GTX 1660 Ti name instantly makes me look at it like something I’ve just trodden in. What is this odd-number monstrosity I see before me? It turns me off the entire GTX 16 Series. Performance be damned, it somehow even manages to make the GTX 10 Series more attractive. But is this all part of Nvidia’s warped plans? Could it even be that Nvidia is deliberately confusing its customer base in order to push them towards the safe, the obvious, the (more expensive) RTX range? I don’t really know. I don’t think anyone really does.

What we can probably agree on, though, is that graphics card names and generations could and should be a whole lot simpler. In most markets, confusing naming is the enemy of success. Nintendo had an entire console generation fail because people still thought the Wii U was a controller for the Wii four years later. In PC hardware, though, confusion practically feels welcomed, and it’s enormously intimidating for anyone looking to get into our favourite hobby.

So we’ve got a couple of questions for you here. Firstly, what do you make of the possible GeForce GTX 1660 Ti naming, and why do you think it was picked? And secondly, do you think hardware manufacturers are guilty of trying to deliberately confuse their customer base? Get voting and let us know why below!

Do you like the GeForce GTX 1660 Ti's name?

Are graphics card names too confusing?


Rep
35
Offline
07:35 Jan-31-2019

Wow... What a marketing strategy.
Nvidia, you are drunk. Go home.

1
Rep
164
Offline
20:25 Jan-28-2019

Maybe it's a new confusion strategy by Nvidia.

1
Rep
386
Offline
admin approved badge
20:29 Jan-28-2019

Yeah, be random, confuse the opponent. Or maybe they don't want to be countered by AMD's uncreative marketing of just sticking different letters in front of names that are very similar to the competition's. Especially with that rumour of Navi GPUs being called RX 3080, 3070 and so on, it's obvious why Nvidia would sidestep that naming if the rumour turns out to be true.

1
Rep
12
Offline
07:41 Jan-27-2019

It's a nice idea creating a version of the RTX 2060 that doesn't have ray tracing. Personally I wouldn't use it, as I'm not interested in this technology yet, but I don't know; the RTX 2060 is like a GTX 1070 for $350.

1
Rep
12
Offline
07:44 Jan-27-2019

So if the 1660 Ti comes out it will be around $230 for performance between the 1060 and the 1070.

0
Rep
179
Offline
admin approved badge
09:31 Jan-27-2019

It's not nearly as good as the 2060; the 2060 can actually go head to head with a 1080. The 1660 Ti has been stripped of more than just its RT and Tensor cores; they've also stripped out a ton of CUDA cores, weakening this thing severely. The 1660 (non-Ti) has been cut down so much it seems like it might actually even get outpunched by a 1060. These cards are just stupid and don't really have a reason to exist. Oh, and I wouldn't expect such low pricing; more like $280 for the 1660 Ti.

3
Rep
58
Offline
admin approved badge
10:59 Jan-27-2019

Nvidia claims the reason they exist is to offer more buying options at different price points.

1
Rep
386
Offline
admin approved badge
11:03 Jan-27-2019

20% fewer CUDA cores than the GTX 1660 Ti puts it in GTX 1070 territory performance-wise (slightly higher), but if it gets GDDR6 it won't be bandwidth-starved like the GTX 1070, so it will be better for sure.

0
Rep
58
Offline
admin approved badge
11:53 Jan-27-2019

For me personally more buying options is a good thing. I was so excited to get the RTX 2060 until I was at the till and was told the checkout price...such a let down lol

1
Rep
386
Offline
admin approved badge
11:54 Jan-27-2019

Well, I will recommend to you what I recommend to everybody: wait for the 7nm GPUs. Especially coming from your GTX 1060, an RTX 2060 would be a poor upgrade.

1
Rep
58
Offline
admin approved badge
12:29 Jan-27-2019

I'm kinda glad I didn't buy when I did. The second generation RTX cards will have much better RT performance anyways...

1
Rep
179
Offline
admin approved badge
17:10 Jan-27-2019

The 1660 Ti will likely perform like a 1070... but if the specs are correct, the 1660 from what we're seeing will likely perform more like a 1060.

1
Rep
12
Offline
12:06 Jan-27-2019

Yeah def, it's better to wait for Navi. This way Nvidia will have some competition and they might lower their prices.

0
Rep
386
Offline
admin approved badge
12:09 Jan-27-2019

From what I'm getting, you want Navi to be good so you can get cheaper Nvidia GPUs, no?

0
Rep
80
Offline
admin approved badge
12:24 Jan-27-2019

Same as everyone. AMD is most unlikely to make faster GPUs than Nvidia because they are targeting the budget to mid tier, so team green will still have the strongest ones, but will lower prices due to AMD catching up to them while being cheaper.

0
Rep
386
Offline
admin approved badge
12:39 Jan-27-2019

So people expect AMD to compete without anybody buying their GPUs? Well, that's not how the market works... -_-
If you wonder why they don't care to make a good gaming architecture, that's why. That's why we got Vega and Fury: they really weren't counting on people buying them for gaming, as people didn't before. From 2007-2016 (until Pascal) AMD had superior architectures, sometimes vastly so, yet the best they managed was 50% market share way back in 2009, when Nvidia had no new GPUs until 2010 and first-gen Fermi came out as a melting, power-hungry slug of an architecture...

0
Rep
80
Offline
admin approved badge
12:59 Jan-27-2019

I'm not saying their GPUs are bad or that they're not selling (and that "everyone" I pulled out of thin air), but I think the majority (maybe) are excited when new AMD GPUs come out, because it means Nvidia is going to lower their prices and maybe even respond with new ones of their own, same as Intel does. On the other side there are those who want AMD because it's cheaper, ages slower and keeps up pretty well with Nvidia's cards. I'm neither of those two; quite the opposite, I'm unhappy whenever I hear about a new GPU release, as I know it means mine is falling behind and requirements are going up. Which is normal, to some point.

0
Rep
12
Offline
14:37 Jan-27-2019

Well, it depends. I got a 4K monitor with FreeSync for free, so if Navi has good performance per dollar then I'm going to buy an AMD GPU. The RTX GPUs are really good but they are expensive.

0
Rep
12
Offline
14:39 Jan-27-2019

If AMD makes a GPU that is capable of doing even 1440p for around $200 then I'm going to buy AMD for sure. I believe the 1660 Ti can't handle 1440p really well, so no point getting that.

0
Rep
12
Offline
14:43 Jan-27-2019

AMD makes great cards for 1080p gaming, but if you want more than that you have to go with Vega, and the problem is that they are power hungry. They are great, but still too much power consumption.

0
Rep
386
Offline
admin approved badge
15:24 Jan-27-2019

@Narkady
Vega 56 has really nice power consumption, about equal to a GTX 1080, so it's not much higher than its Nvidia counterpart, especially considering its compute power (which is irrelevant for gaming, but still) is quite brutal. Vega 64 is at GTX 1080 Ti power consumption.


And in general they should focus on value: 80-85%+ of people have GPUs that cost $200 or less (maybe a little over $200 too), while less than 5% of people have GPUs over $400. It's absolutely pointless for them to focus on high-end GPUs, especially with Nvidia's reputation and people at that price point basically buying the brand over everything else... Nvidia's branding and marketing are much better than AMD's.

0
Rep
386
Offline
admin approved badge
15:27 Jan-27-2019

AMD has been focusing on the $200-and-less market for a while now, while releasing compute GPUs that can game in the form of Vega and Fury.


Both Nvidia and AMD should focus on sub-$400 GPUs and wait until their GPUs in those price ranges become good enough for 4K or 8K or whatever future resolution people buy into (I'm sticking to 21:9 1080p personally). For those who want more, Crossfire and SLI are always an option, with modern engines supporting them quite well after the 2013-2016 period when support was quite poor.
But of course there is little profit to be had like that, so they mark up the prices (especially Nvidia, but AMD too, and AMD started it) to make higher profit margins at the expense of the consumer...

0
Rep
386
Offline
admin approved badge
15:30 Jan-27-2019

@David988
People should buy AMD GPUs when they are good and a better value than Nvidia, and vice versa (if that ever happens outside peak mining craze times): buy Nvidia when it's the better value than AMD. Instead, people just use AMD to lower Nvidia's prices and then buy Nvidia GPUs, and after a while, when AMD has sat at 25-30% market share, they wonder why AMD has a hard time competing with Nvidia and blame AMD for Nvidia's high prices, instead of blaming the people who bought $1200 Titans and $700 GTX 1080s...

0
Rep
80
Offline
admin approved badge
16:02 Jan-27-2019

Uhh... I am the one that paid $700 for a 1080. Anyway, I get what you want to say, but the RX 400/500 was selling well, I think. Then why did we get $1000 GPUs from Nvidia anyway?

0
Rep
386
Offline
admin approved badge
16:11 Jan-27-2019

We got $1000 GPUs from Nvidia because they know the limited quantity they produce of them will sell, and they know that since people bought the $700 GTX 1080. Keep in mind that the GTX 1080 has a 312mm^2 die and the RTX 2080 has a 545mm^2 die, along with Pascal being a low-priority R&D architecture. In reality the GTX 1080 would have been very profitable even at $300-350...


They are basically doing a really neat(well neat for them) tactic of raising the prices until demand stalls or starts to slightly fall down.
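The die sizes quoted above translate into cost fairly directly. A rough sketch using the standard first-order dies-per-wafer approximation (the 312mm^2 and 545mm^2 figures come from the comment above; the 300mm wafer and the formula are assumptions, and real defect yields would widen the gap further):

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """First-order approximation of gross dies on a circular wafer:
    wafer area / die area, minus an edge-loss term.
    Ignores defect yield and scribe lines."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

print(dies_per_wafer(312))  # GTX 1080-sized die -> 188 dies per wafer
print(dies_per_wafer(545))  # RTX 2080-sized die -> 101 dies per wafer
```

So a 2080-sized die yields roughly half as many chips per wafer before defects are even counted, which is part of why the two cards' prices aren't directly comparable.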

0
Rep
386
Offline
admin approved badge
16:13 Jan-27-2019

And it's normal for most people, who are not tech geeks like me, not to know that the GTX 1080 was essentially a mid-range chip sold at a high-end price in 2016. Nvidia is betting exactly on that: people just looking at the name and buying based on it.


The GTX x80 series GPUs have not been high-end since the GTX 600 series (except the 700 series). Now the RTX 2080 is a high-end GPU, but tons of the die space is taken up by Tensor and RT cores, so its gaming value (so far) is very low, just like Vega's die space is full of FP64 and FP16 cores which are (so far) useless for gaming.

0
Rep
76
Offline
admin approved badge
21:03 Jan-26-2019

1660 as a name makes no sense. Sure, for tech enthusiasts there is no problem and we will remember it, but for anyone else it is just confusing. I believe GTX 1160 or maybe GTX 2050 would make more sense; 2050 because it is not the same as the RTX 2060, to avoid mainstream confusion between an RTX 2060 and a GTX 2060, since they are not the same.

3
Rep
76
Offline
admin approved badge
21:05 Jan-26-2019

At least not the same in terms of rasterization performance. Which makes sense, since nVidia doesn't want potential RTX 2060 buyers to skip the RTX 2060, plus they don't want to put an RTX price on it. But still, 1660 feels a bit too random. Also, if it was 2050/2050 Ti, then the 1650 could be a GTX 2040, if that is also planned.

1
Rep
76
Offline
admin approved badge
21:08 Jan-26-2019

But I guess it is all to sell more RTX 20 cards, since for the mainstream 20 is higher than 16 and will give the feeling you are not getting the latest thing. Then again, it is not the first time nVidia has done this: there are two 1030s, two 1050s and like 5 or 6 1060s, plus in-between cards that weren't necessary, like the 1070 Ti.

1
Rep
80
Offline
admin approved badge
04:48 Jan-27-2019

I agree, they should have just made a GTX/RTX 2050, then a GTX 2040 and GT 2030.

0
Rep
386
Offline
admin approved badge
17:48 Jan-26-2019

Marketing names are of no importance. All that matters is die size, and die size relative to the biggest possible die for the process node (the optical limit). Maybe the chip designations as well, as they don't seem to change regardless of anything.

0
Rep
179
Offline
admin approved badge
18:20 Jan-26-2019

To me, performance gains matter far more than die size. If the incremental gain isn't there, the GPU isn't worth bothering with, and the 1660 and 1660 Ti don't seem to have a single reason to exist in my mind. If the 1660 Ti were simply a 2060 with the Tensor and RT cores stripped away for around $250, then and only then would it have a reason to exist. As it stands now, it should barely trade punches with a 1070, and the price point will likely be near $300; the 1660 will fare even worse.

0
Rep
386
Offline
admin approved badge
20:05 Jan-26-2019

I'm not talking performance, just naming.
Price/performance is the most important thing, but if they both start asking too much for small, low-end GPUs it'd be stupid.

2
Rep
179
Offline
admin approved badge
18:28 Jan-26-2019

And so it doesn't seem like I'm picking on Nvidia: the RX 590 and Radeon VII don't have any reason to exist at their price points either. Both AMD and Nvidia right now haven't given us anything to be happy about; Nvidia with their beta hardware features and minimal incremental performance improvements, and AMD who can't even seem to squeeze a good incremental improvement out of a die shrink. Both of them are extreme disappointments right now.

2
Rep
93
Offline
17:30 Jan-26-2019

Also I'm confused with CPU and GPU names now; there are so many name changes.

0
Rep
93
Offline
17:29 Jan-26-2019

I'll wait for the GTZ 2547 Ti 73GB GDDR6X Edition.

9
Rep
11
Offline
20:31 Jan-26-2019

Sorry, those are OEM partner cards only.

5
Rep
34
Online
15:08 Jan-26-2019

I barely got used to the R9/R7 and FX terminology of the yesteryear generation. Now it's all about Ryzen, RX, Vega... Nvidia is a tad more stable on this subject, even with the 1660...

4
Rep
58
Offline
admin approved badge
16:03 Jan-26-2019

Nothing is more confusing than Intel's model names lol

14
Rep
34
Online
16:51 Jan-26-2019

True, their names are weird too (too many Lakes), but the model numbers are still much the same as always.

1
Rep
6
Offline
15:00 Jan-26-2019

gtx 10idontbuy

8
Rep
58
Offline
admin approved badge
14:32 Jan-26-2019

It is a strange model name but whatever really...as long as it is a decent card who cares...?

1
Rep
58
Offline
admin approved badge
14:20 Jan-26-2019

I know people who would go for the Intel UHD 630 over the RTX 2080 Ti because it has Intel and UHD in its name, so...

4
Rep
58
Offline
admin approved badge
14:21 Jan-26-2019

and shops selling FX-4300 & GT 730 combos as gamer PCs...

5
Rep
80
Offline
admin approved badge
17:03 Jan-26-2019

and saying it's ultra-fast graphics

6
Rep
58
Offline
admin approved badge
17:30 Jan-26-2019

The GTX 1050M sold as a "4K ultra gaming laptop" is my favourite (sold with a full HD display).

5
Rep
80
Offline
admin approved badge
17:58 Jan-26-2019

Pff, that one is even good compared to those "ultra fast" HD and UHD integrated GPUs that can supposedly run all modern games in high definition.

4
Rep
58
Offline
admin approved badge
11:56 Jan-27-2019

You know, the GTX 1050 is a decent card for 1080p at high quality settings. A friend of mine has one and is quite happy with it.

0
Rep
386
Offline
admin approved badge
18:11 Jan-26-2019

90% of people, IRL or even on some sites, who ask me for a PC usually go:
"I want a good gaming GPU with 4GB VRAM from Nvidia, I don't like AMD, AMD is bad... and one of those i5s, I've heard that's the perfect gaming PC". And every time I think to myself, I should build them an old i5 650 with a GT 710/730 4GB DDR3 just to troll them, and let me pick their PC for their budget myself...


It's especially annoying when, with the 1000 series, the GTX 1060 comes as 3GB or 6GB (never mind the CUDA cores) and I have to explain to them that VRAM is one of the less important performance characteristics, and that there is no 4GB version and why that is so (bus width)...
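That bus-width point boils down to simple arithmetic. A hedged sketch (assuming one memory chip per 32-bit channel and the common GDDR5 chip densities of the time; other configurations existed, but this is the usual case):

```python
# Why a 192-bit GTX 1060 comes in 3GB and 6GB but never 4GB:
# each memory chip sits on its own 32-bit slice of the bus, so the
# chip count is fixed by bus width, and total VRAM must be a
# multiple of (chip count) x (per-chip capacity).
BUS_BITS = 192
CHIP_BITS = 32                   # one GDDR5 chip per 32-bit channel
CHIP_CAPACITIES_GB = [0.5, 1.0]  # common GDDR5 densities at the time

chips = BUS_BITS // CHIP_BITS            # -> 6 chips
valid_sizes = [chips * c for c in CHIP_CAPACITIES_GB]
print(valid_sizes)  # [3.0, 6.0] -- no clean way to reach 4GB
```

With a 6-chip bus there is no uniform chip density that lands on 4GB, which is exactly the explanation the commenter keeps having to give.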

6
Rep
58
Offline
admin approved badge
12:00 Jan-27-2019

I simply tell people the amount of memory on a video card is only really relevant for resolution while gaming: the higher the resolution, the more VRAM is needed. I don't even go into AA settings; it's just too confusing for new PC gamers to understand. From my experience most new PC gamers are former console gamers, and they are used to no AA anyway...

0
Rep
386
Offline
admin approved badge
12:07 Jan-27-2019

And then you need to explain to them that not every GPU can make use of all that VRAM, due to limitations in the memory, memory controllers and bandwidth. Most double-VRAM GPUs are meant for SLI/Crossfire: for example, the RX 470/570 8GB, RX 480/580 8GB, GTX 1060 6GB (though in this case it also has more CUDA cores and TMUs, so it's a valid choice), GTX 1070 8GB (not that it has a lower-than-8GB version), GTX 960 4GB, R9 380 4GB and so on are all GPUs with double the VRAM their memory controller can effectively use. And since VRAM does NOT stack in SLI/Crossfire, it's absolutely pointless to spend the extra $30-50 unless they are the only option available. But by the time I explain all that, they have basically tuned me out... -_-
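The bandwidth side of the argument is easy to show too. A rough sketch of peak theoretical memory bandwidth (the GDDR5 data rates are the commonly quoted 8 GT/s figures; the 1660 Ti's 12 GT/s GDDR6 was still a rumour at the time of this thread):

```python
def memory_bandwidth_gbps(bus_width_bits, data_rate_gtps):
    """Theoretical peak bandwidth in GB/s:
    (bus width in bits / 8 bits per byte) * effective transfer rate."""
    return bus_width_bits / 8 * data_rate_gtps

print(memory_bandwidth_gbps(192, 8))   # GTX 1060 6GB, GDDR5 -> 192.0 GB/s
print(memory_bandwidth_gbps(256, 8))   # GTX 1070, GDDR5     -> 256.0 GB/s
print(memory_bandwidth_gbps(192, 12))  # GTX 1660 Ti, rumoured GDDR6 -> 288.0 GB/s
```

So a GDDR6 card on the same 192-bit bus would out-bandwidth a GTX 1070, which is the "won't be bandwidth-starved" point made earlier in the thread.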

2
Rep
28
Offline
14:12 Jan-26-2019

Yeah, the GT 1030 GDDR5 and GT 1030 DDR4 are not confusing at all, and the DDR4 one performs a lot worse. And all five versions of the GTX 1060...

0
