# MSI GeForce GTX 1660 Ti Gaming X 6 GB



## W1zzard (Feb 22, 2019)

The MSI GeForce GTX 1660 Ti is the company's most premium offering based on the new NVIDIA Turing chip that lacks ray-tracing features, designed to woo gamers under the $300 mark. Twin Frozr 7 and other premium MSI exclusives dress up this factory-overclocked card, which stays quiet at idle.



----------



## darkangel0504 (Feb 22, 2019)

Why not use DirectX 12 in Civilization 6?


----------



## Nima (Feb 22, 2019)

Great GPU at a great price. Thank god for Nvidia, or AMD would milk us with its expensive, ultra power-hungry GPUs.


----------



## Joss (Feb 22, 2019)

Nima said:


> Thank god for Nvidia, or AMD would milk us


You're being sarcastic, right?


----------



## W1zzard (Feb 22, 2019)

darkangel0504 said:


> Why not use DirectX 12 in Civilization 6?


Wow, somehow I missed that Civ6 supports DX12. Was that added in a later patch? I'll use DX12 starting with the next rebench.


----------



## Nima (Feb 22, 2019)

Joss said:


> You're being sarcastic, right?


Not at all. For proof, check out that power-hungry, overpriced RX 590 that AMD released just a few months ago.


----------



## dirtyferret (Feb 22, 2019)

Hey W1zzard, you have the memory clocks at 1500 MHz for all these 1660 Tis, including stock, but Amazon and several other sites are listing the memory at 1200 MHz.

https://www.amazon.com/ZOTAC-GeForc...d=1550846105&sr=8-9&keywords=1660+ti+6gb&th=1

https://www.amazon.com/GIGABYTE-GeF...8&qid=1550846105&sr=8-14&keywords=1660+ti+6gb


----------



## newtekie1 (Feb 22, 2019)

Nima said:


> Not at all. For proof, check out that power-hungry, overpriced RX 590 that AMD released just a few months ago.



You don't even have to look at the RX 590, just look at the Radeon VII.  The Radeon VII doesn't even compete with the RTX 2080, but AMD priced it the same.  I guarantee you that if the RTX 2080 didn't exist, the Radeon VII would have been $1,000.  When AMD can get away with overpricing their cards, they do, just like nVidia does.



dirtyferret said:


> Hey W1zzard, you have the memory clocks at 1500 MHz for all these 1660 Tis, including stock, but Amazon and several other sites are listing the memory at 1200 MHz.
> 
> https://www.amazon.com/ZOTAC-GeForce-192-bit-Graphics-Compact/dp/B07NMWQXLR/ref=sr_1_9?ie=UTF8&qid=1550846105&sr=8-9&keywords=1660+ti+6gb&th=1
> 
> https://www.amazon.com/GIGABYTE-GeF...8&qid=1550846105&sr=8-14&keywords=1660+ti+6gb



The Zotac listing on Amazon says 12 Gbps, which is 1500 MHz GDDR6.  I'm pretty sure the Gigabyte listing also means 12 Gbps even though they put 1200 MHz; that is likely just a typo on Amazon's/Gigabyte's part.

1500 MHz × 8 bits per clock per pin = 12 Gbps
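That math checks out; as a quick sketch (the 192-bit bus width comes from the 1660 Ti's published specs, and the resulting 288 GB/s matches NVIDIA's bandwidth figure):

```python
# GDDR6 effectively moves 8 bits per pin per memory-clock cycle,
# so a 1500 MHz memory clock yields a 12 Gbps per-pin data rate.
memory_clock_mhz = 1500
data_rate_gbps = memory_clock_mhz * 8 / 1000   # 12.0 Gbps per pin

# Total bandwidth across the GTX 1660 Ti's 192-bit bus:
bus_width_bits = 192
bandwidth_gbs = data_rate_gbps * bus_width_bits / 8   # 288.0 GB/s

print(data_rate_gbps, bandwidth_gbs)
```

A "1200 MHz" listing fed through the same formula would give only 9.6 Gbps, which no 1660 Ti ships with, so the typo explanation fits.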


----------



## W1zzard (Feb 22, 2019)

dirtyferret said:


> Hey Wizzard, you have the memory clocks at 1500 MHz for all these 1660ti including stock but Amazon and several other sites are listing memory at 1200 mhz.
> 
> https://www.amazon.com/ZOTAC-GeForce-192-bit-Graphics-Compact/dp/B07NMWQXLR/ref=sr_1_9?ie=UTF8&qid=1550846105&sr=8-9&keywords=1660+ti+6gb&th=1
> 
> https://www.amazon.com/GIGABYTE-GeF...8&qid=1550846105&sr=8-14&keywords=1660+ti+6gb


GPU-Z and NVIDIA's reviewer's guide say 1500 MHz.

Edit: Maybe it's a typo at Amazon and they wanted to write "12000 Mbps" but forgot a "0"?


----------



## dirtyferret (Feb 22, 2019)

W1zzard said:


> GPU-Z and NVIDIA's reviewers guide say 1500 MHz


Thanks!


----------



## Nima (Feb 22, 2019)

newtekie1 said:


> You don't even have to look at the RX 590, just look at the Radeon VII.  The Radeon VII doesn't even compete with the RTX 2080, but AMD priced it the same.  I guarantee you if the RTX 2080 didn't exist, the Radeon VII would have been $1,000.  When AMD can get away with over pricing their cards, they do, just like nVidia does.


Yeah, you are right. I mentioned the RX 590 because it does not have any new technology like HBM2 or GDDR6, or even a new architecture, so fanboys could not come up with excuses for the high price.


----------



## Robcostyle (Feb 22, 2019)

newtekie1 said:


> You don't even have to look at the RX 590, just look at the Radeon VII.  The Radeon VII doesn't even compete with the RTX 2080, but AMD priced it the same.  I guarantee you if the RTX 2080 didn't exist, the Radeon VII would have been $1,000.  When AMD can get away with over pricing their cards, they do, just like nVidia does.


LMAO, it is! The exchange rate here is 27 local squids per $1.


----------



## GloryToYou (Feb 22, 2019)

Pretty good value.  I will definitely be recommending it to people as a solid 1080p card.  I feel 1440p really needs power closer to an RTX 2070 for comfortable fps.


----------



## wrathchild_67 (Feb 22, 2019)

Nima said:


> Great GPU at a great price. Thank god for Nvidia, or AMD would milk us with its expensive, ultra power-hungry GPUs.



That's literally the opposite of what is happening right now. Project much?


----------



## unikin (Feb 22, 2019)

Nice one. Finally some price/performance improvements. Let the GPU price war begin. I'll wait until July 7th, when Navi comes out, before pulling the trigger. AMD will have to offer even lower prices now. Vega 56 has started to sell for only €269 (blower edition) today. I'm expecting Navi to be priced south of €250.  Maybe we can see the RX 570 finally lose its price/performance crown by the end of the year. 2019 might turn out to be an OK year for GPU buyers after all.


----------



## FreedomEclipse (Feb 22, 2019)

People might call me a troll, but the lack of RTX and DLSS is hardly a negative at this current juncture of all junctures.

The real negative is when you turn RTX/DLSS on and it eats more than half your frames while looking almost identical to ultra quality settings at 1440p. All that extra money spent on an RTX card when all you needed was to run at 1440p or 4K.


----------



## illli (Feb 22, 2019)

I was trying to figure out why it was so much hotter at idle, at 50°C vs. the other cards (which were in the 30s).  I somehow missed the part about the idle-stop feature.


----------



## GloryToYou (Feb 22, 2019)

illli said:


> I was trying to figure out why at idle it was so much hotter, at 50c vs the other cards (that were in the 30s).  I somehow missed the part about the idle-stop feature.


I count idle-stop as a pretty important feature, and it is one of the reasons why I definitely prefer MSI's Gaming X and Armor lines to their Ventus line.  

I stupidly got a 2080 Ti FE last year, and the damn fans were annoyingly loud, even at idle.  I am certainly not making that mistake again.


----------



## Pumper (Feb 22, 2019)

illli said:


> I was trying to figure out why at idle it was so much hotter, at 50c vs the other cards (that were in the 30s).  I somehow missed the part about the idle-stop feature.



50°C idle with the fans off is still too high. My 970 Strix also turns off its fans at idle but sits at ~36-38°C, and it is a higher-TDP card.


----------



## Robcostyle (Feb 22, 2019)

Ugh. Yeap, ofc, great value!

P.S. It's just another poorly priced card at over $300. I think people here forget about IRL pricing, just as always. The 2080 Ti should've been $999 as well...
The 1660 Ti is already $450 here in local stores.


----------



## B-Real (Feb 22, 2019)

Nima said:


> Not at all. for the proof check out that power hungry, overpriced rx 590 that AMD released just a few months ago.





Nima said:


> Great GPU at great price. thank god for nvidia or AMD would milk us with it's expensive, ultra power hungry GPUs.


So you say that, compared to the GTX 1060, which outperformed the GTX 980 at a $250 price, a GTX 1660 Ti which gets GTX 1070 performance for $280 without increasing VRAM is good. ROFLMAO.  "Power hungry" is the only argument you can use now, and you use it for a 30-50 W difference, which means nothing IRL. How pathetic you are. 


newtekie1 said:


> You don't even have to look at the RX 590, just look at the Radeon VII.  The Radeon VII doesn't even compete with the RTX 2080, but AMD priced it the same.  I guarantee you if the RTX 2080 didn't exist, the Radeon VII would have been $1,000.  When AMD can get away with over pricing their cards, they do, just like nVidia does.


Who said that AMD is a charity? Doesn't compete with the RTX 2080? Have you checked reviews? Techspot measured a 7% difference between them based on a 33-game average. And it has double the VRAM, in expensive HBM2 at that, compared to the RTX 2080's 8 GB; nearly half of the cost of the Radeon VII comes from the VRAM. At 7:32:

Just like NV does? When they announced a near-50% drop in the gaming GPU market, they didn't lower the RTX prices - but they're selling in low numbers.


----------



## John Naylor (Feb 22, 2019)

Looking comparatively.

Performance (1440p):  Vega 56 has a 3% edge in the "out of the box" performance test, but the 1660 Ti gains 9.6% and Vega doesn't do well here. Edge => 1660 Ti... at 1080p even more so.
Power usage:  141 W under peak gaming for the Ti, 237 W for the Vega 56. Edge (59%) => 1660 Ti
Noise @ idle:  0 dBA for the 1660 Ti, 25 for the Vega 56. Edge (infinity) => 1660 Ti
Noise @ load:  32 dBA for the 1660 Ti, 42 for the Vega 56, making it twice as loud. Edge (50%) => 1660 Ti
Temps @ load:  68°C for the 1660 Ti, 75°C for the Vega 56. Edge (91%) => 1660 Ti
Price (Newegg, MSI Gaming X):  $310 for the 1660 Ti, $400 for the Vega 56. Edge (77%) => 1660 Ti

The Vega 56 just ceased to be relevant in any way.  To sell, it needs a 35% price drop, to about $259.
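For what it's worth, those "Edge" percentages are mostly straight ratios of the two numbers, and the "twice as loud" call rests on the common rule of thumb that +10 dBA reads as roughly double the perceived loudness (an approximation, not an exact psychoacoustic law):

```python
# Ratios behind the comparison above (the post's own review numbers)
power_ratio = 141 / 237      # ~0.59 -> the "59%" power edge
temp_ratio = 68 / 75         # ~0.91 -> the "91%" temperature edge
price_ratio = 310 / 400      # ~0.78 -> the "77%" price edge

# Perceived loudness roughly doubles per +10 dBA, so 42 dBA vs 32 dBA:
loudness_ratio = 2 ** ((42 - 32) / 10)   # 2.0 -> "twice as loud"

print(round(power_ratio, 2), round(temp_ratio, 2),
      round(price_ratio, 2), loudness_ratio)
```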




Joss said:


> You're being sarcastic, right?





wrathchild_67 said:


> That's literally the opposite of what is happening right now. Project much?



The above data leaves little room for argument: in this price/performance niche, AMD doesn't have a horse in the race.


----------



## Xaled (Feb 22, 2019)

Wut? No RTX? 
.
.

The only thing that makes this card look good is AMD's horrible situation and Nvidia's overpriced products. 

I still insist that offering three-year-old technology/performance for $50 less is really, really worthless.


----------



## kings (Feb 22, 2019)

The power efficiency of this card is tremendous. Just imagine what Turing on 7 nm could be...

The price needs to be a little lower, but overall it's a good card. It will probably be the new sales champion in the mid-range, like the GTX 1060 was!


----------



## ArbitraryAffection (Feb 22, 2019)

All Nvidia needs to do now is bring out a 1650 for £150 with 1024 CUDA cores and 1060+ performance; then I won't be able to say my 570 is still the best-value card available. Lol.


----------



## Mistral (Feb 22, 2019)

W1zzard said:


> Wow, somehow I missed that Civ6 supports DX12. Was that added in a later patch? Will use DX12 starting next rebench


Hey W1zzard, would it be feasible to have a separate performance graph for DX12 games only? Might I also suggest some kind of per-game indication (a small icon or something) to show if a given title is AMD- or nVidia-sponsored?


----------



## danbert2000 (Feb 22, 2019)

This card is incredible, we are now getting 980 Ti performance at 960 prices. I can't believe how quickly the mid range cards became capable at 1440p and even 4k. You could easily use this card to match an Xbox One X.



Mistral said:


> Hey W1zzard, would it be feasible to have a separate performance graph for DX12 games only? Might I also suggest some kind of per-game indication (a small icon or something) to show if a given title is AMD- or nVidia-sponsored?



What difference does it make if a game is Nvidia- or AMD-sponsored? It's not like people are going to pick what games to play based on whose logo shows up on the splash screen. And no fudging of the numbers will save the Vega 56 or the 590; this card is better in every way.

EDIT: Yes, I meant Xbox One X.


----------



## ArbitraryAffection (Feb 22, 2019)

danbert2000 said:


> You could easily use this card to match an Xbox One S.


Eeh, when the PC runs the _exact_ settings the console runs, even my 570 with an OC is basically Xbox One X performance. The 1660 Ti just annihilates it. 



danbert2000 said:


> And no fudging the numbers will save the Vega 56 or the 590, this card is better in every way.


Except VRAM.


----------



## Fluffmeister (Feb 22, 2019)

danbert2000 said:


> This card is incredible, we are now getting 980 Ti performance at 960 prices. I can't believe how quickly the mid range cards became capable at 1440p and even 4k. You could easily use this card to match an Xbox One S.
> 
> What difference is there if a game is Nvidia or AMD sponsored. It's not like people are going to pick what games to play based on whose logo shows up in the splash screen. And no fudging the numbers will save the Vega 56 or the 590, this card is better in every way.



Yeah, it's pretty brutal: 2.5 times more efficient than the RX 590, whilst being way faster. If you'd just landed from another planet you'd think the 590 was released three years ago... not 3 months ago. Ouch.


----------



## notb (Feb 22, 2019)

FreedomEclipse said:


> People might call me a troll but lack of RTX and DLSS is hardly a negative at this current juncture of all junctures.
> 
> The real negative is when you turn RTX/DLSS on and it eats more than half your frames and looks almost identical to ultra quality settings at 1440p. All that extra money spent on an RTX card when all you needed was to run 1440p or 4k resolutions


You do understand RTX is not meant to make the picture look better, right? It makes the picture look more realistic.
It does seem like people don't get what RTRT is doing, even after all the reviews and articles we've gotten since the RTX launch.


----------



## Vayra86 (Feb 22, 2019)

I'm not sure why people are so excited.

This is a 1070 with 2 GB of VRAM removed, over 2 years past its release, at a minor price cut. Also, Turing is showing itself to be very inconsistent compared to Pascal: even the 2060 is jumping all over the place, and this 1660 Ti lands anywhere between a 1060 and a 1070 Ti... I'd grab a 1070 over this any day of the week.

The 1070 launched at about $350... not sure why we are all excited here. I see 1660 Tis at the very same price even today.



notb said:


> You do understand RTX is not meant to make the picture look better, right? It makes the picture look more realistic.
> It does seem like people don't get what RTRT is doing after all these reviews and articles we got since RTX launch.



People don't get it because the implementation is different in every game. BFV did reflections; Metro mostly just does a global illumination pass. And that's pretty much all she wrote as of today.


----------



## John Naylor (Feb 22, 2019)

FreedomEclipse said:


> People might call me a troll but lack of RTX and DLSS is hardly a negative at this current juncture of all junctures.
> 
> The real negative is when you turn RTX/DLSS on and it eats more than half your frames and looks almost identical to ultra quality settings at 1440p. All that extra money spent on an RTX card when all you needed was to run 1440p or 4k resolutions



I think we should look at this objectively and without exaggerating the performance hit (30-50%, with the biggest hit at the highest resolutions).

At 4K we are talking about 1.5% of the market, 95% of which is at 60 Hz.  While I can understand the position "I'd rather not lose 30-40% of my performance to RT", half the games in TPU's test suite are doing 90 fps or better at 4K... so imagine two brothers with the same system except for the graphics card.

The nVidia 2080 Ti user will have 90% of his games capped at 60 fps due to monitor limitations, but for 10 of those games he can turn on RT with little or no penalty, still staying above his monitor's limit.  The AMD Radeon VII user will have slightly fewer of his games capped at 60 fps, because the Ti is 40% faster.  So the point is: we shouldn't decide what card to buy based upon what someone else can take advantage of but many can't.

At 1440p, adding up all the fps for games in TPU's test suite, the $700 RTX 2080 is 13.3% faster than the $700 Radeon VII.  With both cards overclocked, that grows to 21.8%.  So what's the downside of RT?  Let's look at the options here: $700 Radeon VII OC'd versus $700 RTX 2080 OC'd.

Now, at this point in time there aren't a lot of games that support it, so this is purely conjectural: we must assume that at some point a percentage of games (say 20-30%, for the sake of argument) will support it.  From the Metro article's conclusions, Wiz puts the expected hit at 30-40% once it's out a bit and tweaked; I'll use 35%.  So let's assume, for example, that some developers release updated versions of their games, and I'll pick 25% of the games on the list. Numbers are at 1440p with both cards overclocked.

Divinity OS2 could be played at 119.0 fps on an OC'd VII, 153.1 on a 2080 or 99.5 on a 2080 w/ RT enabled.
F1 could be played at 133.3 fps on an OC'd VII, 157.9 on a 2080 or 102.7 on a 2080 w/ RT enabled.
GTAV could be played at 150.9 fps on an OC'd VII, 184.3 on a 2080 or 119.8 on a 2080 w/ RT enabled.
Witcher 3 could be played at 103.2 fps on an OC'd VII, 127.0 on a 2080 or 82.5 on a 2080 w/ RT enabled.
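Those "w/ RT enabled" numbers are just the OC'd 2080 figures with the assumed 35% hit applied (the fps figures are the post's own; the 35% is the assumption stated above, not a measured result):

```python
rt_hit = 0.35  # assumed RT performance hit, per the reasoning above
fps_2080_oc = {"Divinity OS2": 153.1, "F1": 157.9,
               "GTAV": 184.3, "Witcher 3": 127.0}

for game, fps in fps_2080_oc.items():
    # e.g. Divinity OS2: 153.1 * 0.65 ~= 99.5 fps with RT on
    print(f"{game}: {fps * (1 - rt_hit):.1f} fps with RT")
```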

Now think of that from the perspective of folks playing on 65, 100, or 120 Hz monitors.  It's an option, and it doesn't cost you a dime.  Now, if ya built your computer so you can brag about how many fps you get, fine.  But if you are looking at it from the perspective of the gaming experience, frankly I don't think I'd care whether it was on or off for three of those games; I'd take the extra 20-30 fps and enjoy ULMB.  However, on Witcher 3, if it came down to playing on a Radeon VII at 103.2 versus having the choice to play at 127.0 with ULMB or 82.5 with RT and ULMB, I'd like to experience the latter.

So again, let's look at the options here .... $700 Radeon VII OC'd versus $700 RTX 2080 OC'd

1.  Of the 21 games in the test suite, only 1 game is under 80 fps with the 2080 (3 for the Radeon VII), which means turning any kind of Sync off and using Motion Blur Reduction is an option ONLY on the 2080.
2.  Of the 21 games in the test suite, with both cards overclocked, the 2080 is faster in 19 of them, the Radeon VII in 2 of them.
3.  Overall, the 2080 is 22% faster with both cards OC'd.
4.  So far... is RT even a factor in the decision here?
5.  No one is mandating that you use it... what is the downside?

It's like going down to buy a new SUV, and the salesman says, "Hey, ya know what... I can sell ya the 2WD model you came here for... but for the same price I can give ya the RT model, and it comes with 4WD and a larger, more efficient engine, so it accelerates faster and uses less gas, comes standard with AC, and runs cooler"... and you turn it down because the carpeting in the trunk is red instead of green.

It was not so long ago that I was saying, "The 780 OC'd is faster than AMD's offering OC'd, but below that, weigh your options."  Then it was, "Well, from the 970 price point on up, nVidia has the edge, but below that, look at both cards in each price niche...".  And then it was, "Well, from the xx60 price point on up...".  Saying it isn't so doesn't change the numbers.

In short, at the $700 price point, like the $300 price point, AMD doesn't really have a horse in the race.  Not having RT at this price point is not a deal killer, because no one else has it either.  But the 2080 does support ULMB, which is certainly a significant incentive.  And at the upper price tiers, providing the RT option is an incentive too; not as much as ULMB, but any tech that gives the user more options to enhance the experience is a good thing.


----------



## M2B (Feb 22, 2019)

danbert2000 said:


> You could easily use this card to match an Xbox One S.



You mean the Xbox One X?
The Xbox One S is so damn weak that a crappy GTX 750 Ti can match it.


----------



## GloryToYou (Feb 22, 2019)

Robcostyle said:


> Ugh. yeap, ofc, great value!
> 
> P.S. It's just another poorly priced card for over 300$ - think people here forget about IRL pricing, just as always. 2080 Ti should've been 999$ aswell....
> 1660 Ti Already for 450$ here, in local stores


The cards are on Newegg right now and listed as in stock.  This isn't some paper launch.  Most of them are at the $280 list price.
https://www.newegg.com/GraphicsCardsPromoStore/EventSaleStore/ID-506

Newegg is listing this Gaming X at $10 over its $300 MSRP, but the rest are pretty much in line.


----------



## kings (Feb 22, 2019)

Fluffmeister said:


> Yeah, it's pretty brutal: 2.5 times more efficient than the RX 590, whilst being way faster. If you'd just landed from another planet you'd think the 590 was released three years ago... not 3 months ago. Ouch.



From Anandtech review:



> Performance has gone up versus the GTX 1060 6GB, but card power consumption hasn’t. Thanks to this, the GTX 1660 Ti is not just 36% faster, it’s 36% percent more efficient as well.
> The other Turing cards have seen their own efficiency gains as well, but with their TDPs all drifting up, this is the largest (and purest) efficiency gain we’ve seen to date, and probably the best metric thus far for evaluating Turing’s power efficiency against Pascal’s.



https://www.anandtech.com/show/13973/nvidia-gtx-1660-ti-review-feat-evga-xc-gaming/16

These are quite impressive numbers, considering that we are talking about GPUs on basically the same node.
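The matching 36% figures in that quote follow directly from the definition: efficiency here is performance per watt, so with power draw unchanged, the efficiency gain equals the performance gain. A toy check (made-up baseline numbers; only the ratios matter):

```python
# If performance rises 36% at the same board power,
# performance-per-watt rises by exactly the same 36%.
power_w = 120.0            # same power for both cards (illustrative)
fps_1060 = 100.0           # arbitrary baseline
fps_1660ti = fps_1060 * 1.36

eff_1060 = fps_1060 / power_w
eff_1660ti = fps_1660ti / power_w

gain = eff_1660ti / eff_1060 - 1
print(f"{gain:.0%}")   # 36%
```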


----------



## John Naylor (Feb 22, 2019)

ArbitraryAffection said:


> Except VRAM.



Anyone who buys this card to play at 2160p, where more than 6 GB might be needed, is making a mistake.

https://www.techpowerup.com/reviews/MSI/GTX_1060_Gaming_X_3_GB/26.html

The 1060 6 GB is faster than the 3 GB because it has 11% more shaders.  If VRAM were in play here in any conceivable way, as you imply, we should see a massive hit on performance going from 1080p to 1440p.  Instead, what do we see?

The 1060 6 GB is 6% faster than the 1060 3 GB at 1080p.  Obviously, if your presumption were correct, this should be what, 10%, 12% at 1440p?  Nope, just the same 6% as at 1080p.  The extra 3 GB had no impact on performance.

What else do we see?  The 3 GB 1060 is as fast as the 8 GB 480.




Vayra86 said:


> Im not sure why people are so excited.  The 1070 launched at about $350... not sure why we are all excited here. I see 1660ti's at the very same price even today.



The MSI Gaming X 1070 was $449 MSRP at launch here in the US... it took a while for prices to settle down to that level, though. I bought one for my son; the receipt says $429 (5 months after release). The 970 was about $350.  The MSI 1660 Ti Gaming X is $310 today, the Ventus is $279.  So $449 to $310 is significant... and that $310 includes tariff and first-day pricing.  The MSI Gaming X cards are usually the first ones out of stock...

The 2080 Gaming X cards have been running about $100 over the least expensive cards, and the 2080 Ti Gaming X's are $250 or more higher than competitive offerings.


----------



## xkm1948 (Feb 22, 2019)

Hey mighty @W1zzard, does this GPU also support the new NVIDIA OC Scanner overclocking?


----------



## Robcostyle (Feb 22, 2019)

GloryToYou said:


> The cards are on newegg right now, and listed in stock.  This isn't some paper launch.  Most of them are the $280 list price.
> https://www.newegg.com/GraphicsCardsPromoStore/EventSaleStore/ID-506
> 
> Newegg is listing this gaming X at $10 over its $300 MSRP, but the rest are pretty much in line.



You think everyone here is a US resident?
https://hotline.ua/computer/videokarty/605520/
Divide by 27.


----------



## moob (Feb 22, 2019)

John Naylor said:


> Looking comparatively.
> 
> Performance (1440p):   Vega 56 has a 3% edge in the " out of the box" performance test, but 1660 Ti gains 9.6%, vega doesn't do well here, Edge =>1660 Ti .. at 1080P even more so.
> Power Usage:  141 under peak gaming for the Ti; 237 for Vega 56, Edge (59%) =>1660 Ti
> ...


No, it didn't. For whatever reason, TPU seems to insist on using reference cards as the benchmark, which is asinine when doing a comparison like this. This isn't a reference card, so why would you compare it to other reference cards? Your conclusion, and TPU's conclusion, about Vega 56 running louder and hotter is, quite frankly, incorrect when doing a more apples-to-apples comparison. Noise at idle for my Red Dragon Vega 56 is also 0. Noise at load is extremely quiet (inaudible from where I sit). Temps at load are virtually identical to the 1660 Ti's. As for performance, you can see that in GN's review (starting with the F1 benchmarks, as Steve also uses a Red Dragon).

The only real advantage of the 1660 Ti over Vega 56 is power usage. But honestly, and this has been said many times over, most of us aren't gaming 24 hours a day. As far as I'm concerned, for real-world usage the difference is inconsequential. Of course, that's assuming Vega 56 does indeed drop in price across the board. For the same price, I'd opt for my card over any 1660 Ti all day, every day. But if Vega 56 stays $400+ then there's no way to recommend it (I got mine for just over $300 about 6 months ago). I certainly wouldn't recommend a reference Vega 56. Ever.


----------



## ArbitraryAffection (Feb 22, 2019)

John Naylor said:


> Anyone who buys this card to play at 2160p where more than 6GB is needed is making a mistake.
> 
> https://www.techpowerup.com/reviews/MSI/GTX_1060_Gaming_X_3_GB/26.html
> 
> ...



High-res texture packs. My 570 has no trouble running them without stutter, something the 1660 Ti will struggle with even now. I'm so sick of people lapping up getting less memory, whatever. I ain't buying a card with less than 8 GB of VRAM, because I'm using almost all of that VRAM _right now_.


----------



## xkm1948 (Feb 22, 2019)

ArbitraryAffection said:


> High-res texture packs. My 570 has no trouble running them without stutter, something the 1660 Ti will struggle with even now. I'm so sick of people lapping up getting less memory, whatever. I ain't buying a card with less than 8 GB of VRAM, because I'm using almost all of that VRAM _right now_.



Nvidia has traditionally had better VRAM compression technology. I would say that the 2 GB difference is negligible if it is just HD textures at 1080p.


----------



## Fluffmeister (Feb 22, 2019)

moob said:


> The only real advantage of the 1660Ti over Vega 56 is power usage. But honestly, and this has been said many times over, most of us aren't gaming 24 hours a day. As far as I'm concerned, for real-world usage, the difference is inconsequential. Of course, that's assuming Vega 56 does indeed drop in price across the board. For the same price, I'd opt for my card over any 1660Ti all day, every day. But if Vega 56 stays $400+ then there's no way to recommend it (I got mine for just over $300 about 6 months ago). I certainly wouldn't recommend a reference Vega 56. Ever.



To think, there was a time when AMD would actually waste time and money to mock Nvidia when it was their "only real advantage", oh how times have changed.


----------



## rtwjunkie (Feb 22, 2019)

moob said:


> For whatever reason, TPU seems to insist on using reference cards as the benchmark, which is asinine when doing a comparison like this.


Let me school you, since your "asinine" comment about the testing here shows you have not been frequenting TPU reviews very long. 

As with most reviewers, most cards are not W1zzard's to keep.  Since he has one of the most extensive testing suites in the industry, including both games and multiple previously released cards, he keeps a baseline for future comparisons using reference cards. 

It’s up to you to use a little common sense and review knowledge of where cards lie on the stack to extrapolate where a particular card you are interested in would lie. 

Please be aware that W1zzard already retests every card he keeps in the stable, on each and every game in the testing suite, each time he tests a new card for review.  He already gives literally months of his life to the testing lab each year.

For you to think he should keep a high count of additional cards in stock and test those as well is rude, self-centered, and quite frankly, to use your term, asinine.


----------



## ArbitraryAffection (Feb 22, 2019)

xkm1948 said:


> Nvidia has traditionally better VRAM compression technology. I would say that 2GB difference is negligible if it is just HD textures at 1080p.


you're probably right, maybe i should just get a 1660 ti


----------



## dirtyferret (Feb 22, 2019)

I'm going to get one to see if it can play Crysis....


----------



## moob (Feb 22, 2019)

rtwjunkie said:


> Let me school you, since your “assenine” comment about the testing here shows you have not been frequenting TPU reviews very long.
> 
> As with most reviewers, most cards are not W1zzard’s to keep.  Since he has one of the most extensive testing suites in the industry, including both games and multiple previously released cards, he keeps a baseline for comparisons in the future using reference cards.
> 
> ...


Actually, I've been reading TPU for years; I simply started commenting recently.

I know how cards are tested. The problem is, if you're comparing a reference card to non-reference models, then you're going to come up with, at best, inaccurate or incomplete conclusions (such as _"Vega 56 runs much hotter and noisier than the GTX 1660 Ti"_, which is both true and untrue depending on what you're comparing). I have nothing against Wizz including reference cards in the benchmark suite (I think it is quite helpful, actually), but there should also be a non-reference card included for comparison if you're going to include those observations in the conclusion. Without it, you're basing your conclusion on incomplete data, which does a disservice to the reader. I know that this has been discussed here before and that, most likely, nothing is going to change, so I am going to stand by my "asinine" assertion.


----------



## xkm1948 (Feb 22, 2019)

ArbitraryAffection said:


> you're probably right, maybe i should just get a 1660 ti



Get whatever makes _you_ happy. Be it Vega or Polaris or this.


----------



## Fluffmeister (Feb 22, 2019)

xkm1948 said:


> Get whatever makes _you_ happy. Be it Vega or Polaris or this.



Indeed. I mean, according to this very review the MSI GTX 1660 Ti is only 42% faster than an RX 580 8GB @ 1080p whilst being twice as efficient too; I guess I can see the appeal of an 8GB Polaris card over it...


----------



## moproblems99 (Feb 22, 2019)

kings said:


> The power efficiency of this card is tremendous. Just imagine what Turing on 7nm could be.



I would want them to crank the clocks up.



Fluffmeister said:


> If you'd just landed from another planet you'd think the 590 was released three years ago.



In all honesty, it was probably released longer ago than that.  When was the 480 released?


----------



## xkm1948 (Feb 22, 2019)

Fluffmeister said:


> Indeed, I mean according to this very review the MSI GTX 1660 Ti is only 42% faster than a RX 580 8GB @1080P whilst being twice as efficient too, I guess I can see the appeal of an 8GB Polaris card over it...



Dude, you know all too well some people buy GPUs with their hearts and emotions, right? For some folks it is a statement of "no" to "evil business practices". Or out of true love for a specific brand. We gotta respect that, man!



moproblems99 said:


> I would want them to crank the clocks up.
> 
> 
> 
> In all honesty, it was probably released longer ago than that.  When was the 480 released?



And they probably will crank the clocks even higher.


----------



## Fluffmeister (Feb 22, 2019)

moproblems99 said:


> In all honesty, it was probably released longer ago than that.  When was the 480 released?



https://www.techpowerup.com/reviews/AMD/RX_480/

Three years this June, one rebrand, one die shrink and it's already in a world of pain.


----------



## moproblems99 (Feb 22, 2019)

Fluffmeister said:


> https://www.techpowerup.com/reviews/AMD/RX_480/
> 
> Three years this June, one rebrand, one die shrink and it's already in a world of pain.



Third time's a charm?


----------



## Fluffmeister (Feb 22, 2019)

moproblems99 said:


> Third time's a charm?



Heh, maybe. Polaris needs 7nm to get even remotely close in terms of efficiency. Maybe it's best they take both Polaris and Vega out back and shoot them dead.

Hence Navi, coming some time later this year... who knows when.


----------



## xkm1948 (Feb 22, 2019)

Fluffmeister said:


> Heh, maybe. Polaris needs 7nm to get even remotely close in terms of efficiency. Maybe it's best they take both Polaris and Vega out back and shoot them dead.
> 
> Hence Navi, coming some time later this year... who knows when.




What if Navi is just Polaris 7nm with minor tweaks?


----------



## Robcostyle (Feb 23, 2019)

xkm1948 said:


> What if Navi is just Polaris 7nm with minor tweaks?


Then we are screwed for the next two years. And knowing Intel, I would say we are screwed for a long time...


----------



## jmcosta (Feb 23, 2019)

B-Real said:


> Who said that AMD is a charity company? Doesn't compete with RTX 2080? Have you checked reviews? Techspot measured 7% difference between them based on a 33 game average. And it has double the expensive HBM compared to the RTX 2080's 8 GB. Nearly half of the cost of the Radeon VII comes from the VRAM. At 7:32:



The Radeon VII isn't a bad card, but it's a bit overpriced, power-hungry, noisy, and will possibly continue having driver issues in these first months after release. There are simply better options out there.
For pure gaming the RTX 2080 is a better card overall.


----------



## Mescalamba (Feb 23, 2019)

jmcosta said:


> The Radeon VII isn't a bad card, but it's a bit overpriced, power-hungry, noisy, and will possibly continue having driver issues in these first months after release. There are simply better options out there.
> For pure gaming the RTX 2080 is a better card overall.



The Radeon VII has the small advantage of being the best compute card for the money. The only thing that beats it (in FP64) is the Titan V, which is considerably more expensive.
Sadly that doesn't translate into gaming somehow (I don't know why, actually; on paper it's great). The Titan V is also a compute card (a pro card turned gaming, much like the Radeon VII), but it's a great gaming card too.


----------



## xkm1948 (Feb 23, 2019)

Mescalamba said:


> The Radeon VII has the small advantage of being the best compute card for the money. The only thing that beats it (in FP64) is the Titan V, which is considerably more expensive.
> Sadly that doesn't translate into gaming somehow (I don't know why, actually; on paper it's great). The Titan V is also a compute card (a pro card turned gaming, much like the Radeon VII), but it's a great gaming card too.


That compute part means nothing except to miners. In scientific computing, CUDA and TensorFlow dominate OpenCL, and the Radeon VII has bad support for both.


----------



## XiGMAKiD (Feb 23, 2019)

I like this card. Now I just need Nvidia to overstock this GPU and wait until market demand slows down so I can get one at a lower price.


----------



## dicktracy (Feb 23, 2019)

xkm1948 said:


> That compute part means nothing except to miners. In scientific computing, CUDA and TensorFlow dominate OpenCL, and the Radeon VII has bad support for both.


They gotta find a “win” somehow for AMD lol.


----------



## notb (Feb 23, 2019)

Vayra86 said:


> I'm not sure why people are so excited.
> 
> This is a 1070 with 2GB VRAM removed, over 2 years past its release, at a minor price cut. Also, Turing is showing itself to be very inconsistent compared to Pascal. Even the 2060 is jumping all over the place and this 1660 Ti is anywhere between a 1060 and a 1070 Ti... I'd grab a 1070 over this any day of the week...
> 
> The 1070 launched at about $350... not sure why we are all excited here. I see 1660ti's at the very same price even today.


Simple: because Nvidia managed to improve efficiency even further.
At the same node it's slightly more powerful and slightly less power hungry. So the emitted heat had to drop significantly. And this is the most important gain here.
This means the 1660 Ti is a chip that can be paired with a small cooler. In fact, most companies that announced their lineup have a compact (~18 cm) variant. And the resulting cards are cool and quiet.
Some companies tried their luck with compact 1070 cards, but with mixed success (and none was as quiet as this one).
Think about what this implies for mobile variants. 

As for the price: you can't expect Nvidia to adjust their pricing to the deals we get from stores for the older product. It can't work that way.
The 1660 Ti launched at $280, so based on MSRP it's actually way closer to the 1060 6GB ($250) than to the 1070 ($380). A theoretical non-Ti 1660 would cost as much as the 1060 did. So yes, the 1660 is replacing the 1060. And yes, it's faster and more efficient than the 1070, a card from a higher segment.


> People don't get it because the implementation is different in every game. BFV did reflections, Metro mostly just does a global illumination pass. And that is pretty much all she wrote up until today.


As I said: the current RTX implementation and utilization are far from what is possible. But that's not a reason to criticize the technology. We'll slowly get there, but we need time and chip performance.

Remember RTRT is not a gimmick. It's not something Nvidia created as just another feature. It's not something we could backtrack from because we don't like current results.
Ray Tracing is how photorealistic renders are made. And since we got into 3D gaming, we knew this is how games are going to be rendered in the future.
Now, we could stay on the curve of GPGPU potential and get gaming cards able to do RTRT in 10 years. Or we can utilize purpose-built RT ASIC and get this tech now.

And once again: RTRT doesn't mean games will look more pleasing. The exact opposite is more likely.


----------



## FreedomEclipse (Feb 23, 2019)

Hmmm, this card and the Ventus are almost identical, which makes the Gaming X look a little over-engineered; but then again the much higher power limit probably results in better overclocking. On the Maximum Overclock Comparison chart, though, the differences were small... The cheaper Ventus does look to be the better buy here. I'm not too fussed about the fan not stopping at idle anyway, as I always run a custom fan curve and set the fans to 30%, where they're inaudible. I did this with my 1070 Gaming X and my current 1080 Ti.

I think the main deal breaker here is whether one prefers the plastic or metal backplate. Lacking RTX/DLSS is also a thing, but then again, why the hell are you looking to buy a 1660 Ti if you want RTX/DLSS support?

Set a custom fan curve on the ventus and save yourself $20


----------



## Vayra86 (Feb 23, 2019)

notb said:


> Simple: because Nvidia managed to improve efficiency even further.
> At the same node it's slightly more powerful and slightly less power hungry. So the emitted heat had to drop significantly. And this is the most important gain here.
> This means the 1660 Ti is a chip that can be paired with a small cooler. In fact, most companies that announced their lineup have a compact (~18 cm) variant. And the resulting cards are cool and quiet.
> Some companies tried their luck with compact 1070 cards, but with mixed success (and none was as quiet as this one).
> ...



Great points, thanks for opening my eyes there. As a desktop user my focus is always price/perf, but these are real pros for this card indeed.


----------



## newtekie1 (Feb 23, 2019)

ArbitraryAffection said:


> High res texture packs. my 570 has no trouble running them without stutter. Something the 1660 Ti will struggle with even now. I'm so sick of people lapping up getting less memory, whatever. I ain't buying a card with les than 8GB of vram because I'm using almost all of that vram _right now. _



I believe GamersNexus did a good video on the fallacy of VRAM usage, and explained how we really have no tool to measure how much VRAM a game is actually using.  All the tools today simply show how much VRAM is allocated to the game, how much the game is requesting, but not how much it is actually using.  That is why we have some games that will use 10GB+ of VRAM if you have it, but still run just fine on cards with 6GB.  It is requesting all that VRAM, but not actually using it.


----------



## notb (Feb 23, 2019)

FreedomEclipse said:


> Hmmm, this card and the Ventus are almost identical, which makes the Gaming X look a little over-engineered; but then again the much higher power limit probably results in better overclocking. On the Maximum Overclock Comparison chart, though, the differences were small... The cheaper Ventus does look to be the better buy here. I'm not too fussed about the fan not stopping at idle anyway, as I always run a custom fan curve and set the fans to 30%, where they're inaudible. I did this with my 1070 Gaming X and my current 1080 Ti.
> 
> I think the main deal breaker here is if one prefers the plastic or metal backplate. Lacking RTX/DLSS is also a thing but then again why the hell are you looking to buy an 1660Ti if you want RTX/DLSS support?
> 
> Set a custom fan curve on the ventus and save yourself $20


I can't agree with 30% fans being inaudible. I mean, it's acceptable when you're using the PC (you accept some noise coming from the fans, the keyboard, etc.).
But there is always a hum, and you start to notice it when you're not using the PC: studying, reading a book, sleeping nearby, watching movies and so on.
My PC is in my bedroom. It is on almost all the time, but I play for maybe 2 hours a week. Idle GPU cooling was a must-have.

And I'm sure the Gaming X cooler is the perfect companion for this chip. It emits even less noise than the Ventus, and the fans stay idle at much higher loads.
It's also the version of choice for people who may want to OC.


----------



## rtwjunkie (Feb 23, 2019)

FreedomEclipse said:


> The cheaper Ventus does look to be the better buy here. I'm not too fussed about the fan not stopping at idle anyway, as I always run a custom fan curve and set the fans to 30%, where they're inaudible. I did this with my 1070 Gaming X and my current 1080 Ti.


I totally agree.  I have mine on a custom curve starting at 28% and don't hear it when idling.  I'd rather that than let it get up to 50°C before any fans start running.  Heck, I can't hear it unless I concentrate when it climbs to 55%.


----------



## FreedomEclipse (Feb 23, 2019)

notb said:


> I can't agree with the 30% fans being inaudible. I mean: it's acceptable when you're using the PC (you accept some noise coming from the fans, the keyboard etc).
> But there is always a hum and you start to notice it when you're not using the PC: studying, reading a book, sleeping nearby, watching movies etc.
> My PC is in my bedroom. It is on almost all the time, but I play for maybe 2h a week. Idle GPU cooling was a must have.
> 
> ...



Each to their own. Sound in general, or how loud something is, is purely subjective, so something that is inaudible to me is probably a jet engine to you.

I don't constantly have my ear glued to my PC, and my PC is also in my bedroom, so I'm happy with how it sounds. It could be dead silent, but then again I'm not running a passive machine.

In any case, I use an aftermarket cooler with my 1080 Ti and it's inaudible till I start gaming.


My 30% was just an example. You can set it as low as 5% if that's what you wish.

You're free to disagree all you like, but it won't change people's hearing.


----------



## Kissamies (Feb 23, 2019)

Fluffmeister said:


> https://www.techpowerup.com/reviews/AMD/RX_480/
> 
> Three years this June, one rebrand, one die shrink and it's already in a world of pain.


More like a "die shrink", since the die size and transistor count are still the same. I call the RX 590 just an RX 480 rev. 3.



newtekie1 said:


> I believe GamersNexus did a good video on the fallacy of VRAM usage, and explained how we really have no tool to measure how much VRAM a game is actually using.  All the tools today simply show how much VRAM is allocated to the game, how much the game is requesting, but not how much it is actually using.  That is why we have some games that will use 10GB+ of VRAM if you have it, but still run just fine on cards with 6GB.  It is requesting all that VRAM, but not actually using it.


IIRC Mirror's Edge Catalyst "needs" 8GB VRAM on Hyper settings, but I had no problems running it with a 3GB (or "3.5GB") or 4GB card (780, 780 Ti, 970 SLI or 980) at a smooth 60 fps at all times.


----------



## FreedomEclipse (Feb 23, 2019)

rtwjunkie said:


> I totally agree.  I have mine on a custom curve starting at 28% and don't hear it when idling.  I'd rather that than let it get up to 50°C before any fans start running.  Heck, I can't hear it unless I concentrate when it climbs to 55%.



At 50°C my GPU fans are already hitting 45%, where they stay till 60°C. I could probably run them at a slower 40%, but I want the card to hit its maximum boost clocks when gaming. I don't think I've seen it come anywhere close to 60°C so far, so maybe my fan profile is a little aggressive. I just don't want the VRMs to overheat and blow up, as I don't quite trust the heatsinks I glued on there with thermal adhesive.


----------



## John Naylor (Feb 23, 2019)

moob said:


> No. It didn't. For whatever reason, TPU seems to insist on using reference cards as the benchmark, which is asinine when doing a comparison like this. This isn't a reference card, so why would you compare it to other reference cards? Your conclusion, and TPU's conclusion about Vega 56 running louder and hotter is, quite frankly, incorrect when doing a more apples-to-apples comparison. Noise at idle for my Red Dragon Vega 56 is also 0. Noise at load is extremely quiet (inaudible from where I sit). Temps at load are virtually identical to the 1660Ti. As for performance, you see that in GN's review (starting with the F1 benchmarks as Steve also uses a Red Dragon)
> 
> 
> 
> ...





moob said:


> I know how cards are tested. The problem is, if you're comparing a reference card to non-reference models, then you're going to come up with, at best, inaccurate or incomplete conclusions (such as, _"Vega 56 runs much hotter and noisier than GTX 1660 Ti_" which is both true and untrue based on what you're comparing) . I have nothing against Wizz including reference cards in the benchmark suite (and I think it is quite helpful actually), but there should also be a non-reference card included for comparison if you're going to include those observations in the conclusion. Without it, you're basing your conclusion on incomplete data which does a disservice to the reader. I know that this has been discussed here before as well and that, mostly likely, nothing is going to change so I am going to stand by my "asinine" assertion.



1.  TPU uses what they are sent.
2.  The Red Dragon is $190 more than the MSI Gaming X 1660 Ti. What was it you said about apples and apples? $310 versus $500???
3.  I'd be happy to read a review of the Red Dragon on a reputable site... a category that doesn't include Jay2Cents, GamersNexus and most other YouTubers. Unless done by the same reviewer, you can't compare.
4.  Power is not the only thing, at least in and of itself. Add the cost of a 100-watt-bigger PSU and an extra case fan, and that $190 cost difference just went to $225 or so.

So let's look at other AIB Vega 56......

https://www.hardocp.com/article/2018/03/05/asus_rog_strix_rx_vega_56_o8g_gaming_review/15
+127 watts over the reference card when overclocked, +96 watts out of the box
+4°C over reference, 80°C out-of-the-box temps

https://www.tomshardware.com/reviews/gigabyte-radeon-rx-vega-56-gaming-oc-8g-review,5413-5.html
THG also got 76°C with the Gigabyte AIB Vega 56.
THG also noted that this model was heavily tuned for noise reduction and still banged out 40.8 dB(A).

In short, without a published review from a reputable site, it's hard to accept subjective observations, and other reviews of AIB Vega 56s contradict what you are observing. PowerColor is going to have to cut their price on the Red Dragon in half to be relevant in today's market.

If you'd like, I'm sure if you sent your $500 card to Wiz, he'd test it for you... he can only test what the postman brings him, so I don't see how he's to blame here. Also look around and ask yourself this question: "Why are AIB Vega reviews so hard to find?" The ones we can find do not support your conclusions, actually contradicting them. You can stand by your "conclusions" if you want, but the available data in no way supports the position you have taken.



ArbitraryAffection said:


> you're probably right, maybe i should just get a 1660 ti



If you look at TPU's review of the 3GB and 6GB 1060s, you will see that VRAM has no impact on performance at 1080p. The 6GB model has more shaders (11% extra), so it does have a performance edge of about 6%... but if VRAM were in play in any conceivable way whatsoever, then that gap would have to widen at 1440p... and it does not. I have seen a game or two show lower performance that might be related to VRAM, but not conclusively as yet, where other factors could be ruled out.

1080p = 3 GB Minimum / 4 GB recommended
1440p = 5 GB Minimum / 7 GB recommended
2160p = 12 GB Minimum / 16 GB recommended
2880p = 20 GB Minimum / 29 GB recommended


----------



## M2B (Feb 23, 2019)

John Naylor said:


> 1080p = 3 GB Minimum / 4 GB recommended
> 1440p = 5 GB Minimum / 7 GB recommended
> 2160p = 12 GB Minimum / 16 GB recommended



It doesn't work like that.
Having 6GB of VRAM for 1440p gaming is actually better than having 4GB for 1080p gaming.
It doesn't scale linearly: going from 1080p to 1440p adds around 400-700MB to VRAM usage, and going from 1440p to 4K adds another 800MB-1GB, which means 6GB at 4K is roughly as good as 4GB at 1080p, maybe a little better.
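M2B's arithmetic can be sketched in a few lines. The per-step deltas below are just the midpoints of the ranges quoted in this thread, and the 3.5 GB baseline is a hypothetical game, not a benchmark result:

```python
# Rough VRAM-vs-resolution sketch. Deltas are midpoints of the ranges
# quoted above (400-700 MB for 1080p->1440p, 800-1000 MB for 1440p->4K).
DELTAS_MB = {"1080p": 0, "1440p": 550, "2160p": 550 + 900}

def estimated_vram_mb(base_1080p_mb, resolution):
    """Estimate VRAM use at a resolution given usage at 1080p (in MB)."""
    return base_1080p_mb + DELTAS_MB[resolution]

for res in ("1080p", "1440p", "2160p"):
    print(res, estimated_vram_mb(3500, res), "MB")
```

The point is that demand grows by a flat few hundred MB per step rather than with pixel count (1440p has 1.78x the pixels of 1080p, but nowhere near 1.78x the VRAM use).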


----------



## John Naylor (Feb 23, 2019)

xkm1948 said:


> Dude you know all too well some people buy GPUs with hearts and emotions right? For some folks it is a statement of “no” to “evil business practice”. Or out of true love to a specific brand. We gotta respect that man!



As to the respect, I have no respect for myself... I've long been a "hardware whore".  I'm like the girl who goes to see her BF race, and after he loses his car, I go home with the guy who beat him.

As to the evil business practices, welcome to capitalism.  It's not as if any corporation would do differently in their position.  Winning has its advantages.




jmcosta said:


> The Radeon VII  isn't a bad card but its a bit overpriced, power hungry-noisy and possibily will continue having driver issues in these first months of release. There's simply better options out there.  For pure gaming the gtx 2080 is a better card overall.



Well said. There are other cards with similar performance; they are just priced well above them.  Pricing the VII a bit less than the upcoming 2070 Ti might work, but I think to sell it will need to be closer to $550.

Unfortunately AMD's market position is a ball and chain... with their small market share, and burdened by console-market commitments, it's hard to catch up.  The smartest thing nVidia ever did was leave that market.  It's getting to the point where even if they were head to head on price/performance, it wouldn't matter.  Power, temps and noise affect buying choices these days, so even if they can deliver the same performance per dollar, they will have to sell cheaper to offset those.





M2B said:


> It doesn't work like that.
> Having 6GB of VRAM for 1440p gaming is actually better than having 4GB for 1080p gaming.
> It doesn't scale like that. Going from 1080p to 1440p adds around 400-700MB to the VRAM usage and going from 1440p to 4K adds another 800MB-1GB to the VRAM usage which means 6GB at 4k is roughly as good as 4GB at 1080p, maybe a little better.



I have not seen it. Looking at hundreds of sites and comparisons with 5xx, 6xx, 7xx and 9xx cards, outside of a poor console port or other anomaly, I have never seen any instance where 3 GB was not enough for 1080p.  Like the old claim that more than 1.50 volts for DDR3 (or 1.35 for DDR4) will void your warranty, the supposition persists despite Intel's published statements to the contrary.  No matter how often it is repeated, the published test results never support the need for more than 3 GB of VRAM @ 1080p.

Video Card Performance: 2GB vs 4GB Memory - Puget Custom Computers 
Is 4GB of VRAM enough? AMD’s Fury X faces off with Nvidia’s GTX 980 Ti, Titan X | ExtremeTech 
GTX 770 2GB vs 4GB 

Back in the day, IIRC, the formula was resolution × color depth / 8.  Nowadays it doesn't scale directly; however, the level of conservatism should go up in proportion to the investment.

The simple fact is TPU's test results here clearly show there is no difference at all in performance between the 3GB and 6GB 1060 at 1080p or 1440p.  We see no difference in performance from increasing resolution until we get to 2160p.  So if VRAM is not a factor at 1440p, then it's certainly not a factor at 1080p.
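For reference, the old back-of-the-envelope formula mentioned above only sizes a single framebuffer, which is exactly why it wildly underestimates what modern games allocate (textures and geometry dominate):

```python
# Old formula: bytes per frame = width * height * color_depth_bits / 8.
# This counts one framebuffer only, not textures or geometry.
def framebuffer_bytes(width, height, bits_per_pixel=32):
    return width * height * bits_per_pixel // 8

print(framebuffer_bytes(1920, 1080) / 1024**2)  # ~7.9 MiB per 1080p frame
```

Even triple-buffered 4K at 32-bit color is well under 100 MB by this formula, so everything beyond that in a VRAM counter is assets, not the display surface.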


----------



## notb (Feb 23, 2019)

FreedomEclipse said:


> Each to their own. Sound in general or how loud something is purely subjective so something that is inaudible to me is probably a jet engine to you.
> (...)
> You're free to disagree all you like but it wont change people's hearing.


Exactly!
That's why fans turning off is such a terrific feature of graphics cards. They become objectively inaudible.
Some people won't hear silent fans, some will. You never know. And your perception may also change with time.

Not so long ago, when custom PCs were louder in general, there were major PC sites devoted to building silent (passively cooled, if possible) computers. It was both a popular hobby and a necessity.

Since then, components have certainly gotten quieter: fans became larger, and watercooling stopped being something extreme that you use gardening materials for and pray for no leaks.
On the other hand, we never really got to the passive or almost-passive quality so many hoped for. Obviously a lot of people moved to notebooks, but surely not everyone.

The situation today is that unless one spends a fortune on cooling, a custom PC will be louder than typical OEM Office PCs and Macs. It wasn't true in the 90s and IMO it shouldn't be now. I think we've settled for less than we should have...


----------



## jabbadap (Feb 23, 2019)

All this VRAM talk made me wonder if there is a way to restrict a card's VRAM size with a registry trick or some other method. At least on Linux with Wine there was that old wined3d registry key for VRAM size. The only thing I could find was this Nvidia devtalk thread about limiting VRAM through CUDA:

https://devtalk.nvidia.com/default/topic/726765/need-a-little-tool-to-adjust-the-vram-size/


----------



## notb (Feb 23, 2019)

jabbadap said:


> All this VRAM talk made me wonder if there is a way to restrict a card's VRAM size with a registry trick or some other method. At least on Linux with Wine there was that old wined3d registry key for VRAM size. The only thing I could find was this Nvidia devtalk thread about limiting VRAM through CUDA:
> 
> https://devtalk.nvidia.com/default/topic/726765/need-a-little-tool-to-adjust-the-vram-size/


You can simply run a program on the GPU and allocate memory. The game process won't kill the other program and will use only what is left.
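A sketch of that "ballast" idea: a second process pins part of the VRAM so the game can only grab what's left. The actual pinning would be done with a vendor API (e.g. cudaMalloc, as in the devtalk thread linked above); this hypothetical helper only sizes the allocation, and the GiB figures are illustrative:

```python
# Size the ballast a second process must hold so a card with
# card_vram_gib of VRAM behaves like one with target_vram_gib.
def ballast_bytes(card_vram_gib, target_vram_gib):
    """Bytes to pin so usable VRAM is capped at the target size."""
    if target_vram_gib > card_vram_gib:
        raise ValueError("target exceeds physical VRAM")
    return int((card_vram_gib - target_vram_gib) * 1024**3)

print(ballast_bytes(8, 6))  # 2147483648 bytes of ballast -> card acts like 6 GiB
```

One caveat with this approach: drivers can evict or page allocations under pressure, so it approximates a smaller card rather than perfectly emulating one.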


----------



## Shatun_Bear (Feb 24, 2019)

Ridiculous that anyone should be excited about this release from Nvidia. Same performance as a custom 1070, two years later, for the same price; you could have picked one of those up for over a year now. What do you get? Better efficiency but 2GB less VRAM. No thanks.


----------



## Fluffmeister (Feb 24, 2019)

Yeah, on the other hand we get exciting releases like the RX 590 and Radeon VII: both late, slow, and overpriced.


----------



## Fourstaff (Feb 24, 2019)

I bought my 660Ti more than 5 years ago, performance has improved quite a bit since then with no increase in cost or power draw. Progress is still alive and healthy.


----------



## Robcostyle (Feb 24, 2019)

Fourstaff said:


> I bought my 660Ti more than 5 years ago, performance has improved quite a bit since then with no increase in cost or power draw. Progress is still alive and healthy.


Well, seems like you and your fanboy friend are a little bit outdated, 'cause the 660 Ti came out 7 years ago. But don't worry, you'll get the idea that people like me are trying to deliver. In 2-3 years or so...


----------



## happy medium (Feb 24, 2019)

Seems like a good buy


----------



## Tatty_One (Feb 24, 2019)

Shatun_Bear said:


> Ridiculous that anyone should be excited about this release from Nvidia. Same performance as a custom 1070, two years later, for the same price; you could have picked one of those up for over a year now. What do you get? Better efficiency but 2GB less VRAM. No thanks.



I agree. However, I think their focus here is that it offers a "new" solution with 10-20% more performance than the opposition's (read: the 590) at $20-$40 more, so clearly they think there will be significant market share. Whether that holds with AMD allegedly reducing the price of the Vega 56 is another matter; you could argue that reducing the price of the Vega 56 suggests AMD agrees with them.


----------



## ZeroFM (Feb 24, 2019)

MSI armor review coming ?


----------



## dirtyferret (Feb 24, 2019)

Shatun_Bear said:


> Ridiculous that anyone should be excited about this release from Nvidia. Same performance as a custom 1070, two years later, for the same price; you could have picked one of those up for over a year now. What do you get? Better efficiency but 2GB less VRAM. No thanks.


I understand why people would be upset with the price, but performance? Nobody screams bloody murder when Nvidia/AMD release entry-level cards that can't offer ultra performance at 4K.


----------



## Rivage (Feb 25, 2019)

Why? We already have the RTX 2060 on the market for basically the same price (well, $40 more expensive), and it's much faster than the GTX 1660 Ti.


----------



## ppn (Feb 25, 2019)

Because the 2060 is just the 2070 die, incapable of running all 2304 cores, and supply may be too low depending on the yields of quality chips.
The 1660 Ti is cheaper to produce, with a 36% smaller die.

And still, why is it so big? The 2070 is 445 mm²; this is 284 mm². Almost as if they left the RT/tensor cores intact. Imagine a 1536 CUDA / 160-bit cut-down version of the 2070; that would be 296 mm². And this is 192-bit and 284 mm², so there is definitely something fishy about the 1660 Ti. They wasted good wafer just to add useless RT cores instead of refreshing the 1080/Ti with 16 Gbps GDDR6 on board and calling it a day.
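The die-area percentage quoted above checks out, taking the areas from the post itself (2070 die = 445 mm², 1660 Ti die = 284 mm²):

```python
# Quick check of the die-size claim, using the areas quoted in the post.
def smaller_by_pct(a_mm2, b_mm2):
    """How much smaller b is than a, in percent."""
    return (a_mm2 - b_mm2) / a_mm2 * 100

print(round(smaller_by_pct(445, 284)))  # ~36, matching the "36% smaller" figure
```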


----------



## dirtyferret (Feb 25, 2019)

Rivage said:


> Why? We already have the RTX 2060 on the market for basically the same price (well, $40 more expensive), and it's much faster than the GTX 1660 Ti.



I think you need more fingers and toes to count on. The GTX 1660 Ti is $279 while the RTX 2060 is $349.  That's a $70 difference in price, with TPU having the GTX 1660 Ti win at each resolution by a slight edge over the RTX 2060.  I'm not knocking the RTX 2060 other than its ray tracing performance, where it really looks handicapped compared to the RTX 2070 & 2080.


----------



## mac007 (Feb 26, 2019)

Wasn't this card replacing the GTX 1060? The price should have been lower.


----------



## notb (Feb 26, 2019)

mac007 said:


> Wasn't this card replacing the GTX 1060? The price should have been lower.


Because new phones are cheaper than the previous models? Because cars are? Or maybe sausages or Coca-Cola bottles?

Why are some people making up new laws of economics all the time? More importantly: why are they surprised that these laws don't work?


Rivage said:


> Why? We already have the RTX 2060 on the market for basically the same price (well, $40 more expensive), and it's much faster than the GTX 1660 Ti.


Because why not? Why is having a more dense product lineup such a problem? Shouldn't we be happy because of the choice Nvidia gives us?



Shatun_Bear said:


> Ridiculous that anyone should be excited about this release from Nvidia. Same performance as a custom 1070, two years later, for the same price; you could have picked one of those up for over a year now. What do you get? Better efficiency but 2GB less VRAM. No thanks.


Why would anyone care about efficiency? Leave it to the engineers at Nvidia.
You get smaller coolers and less noise. That's it.


----------



## Rivage (Feb 27, 2019)

dirtyferret said:


> I think you need more fingers and toes to count on. The GTX 1660 Ti is $279 while the RTX 2060 is $349. That's a $70 difference in price, with TPU having the GTX 1660 Ti win at each resolution by a slight edge over the RTX 2060. I'm not knocking the RTX 2060 other than its ray tracing performance, where it really looks handicapped compared to the RTX 2070 & 2080.


Who cares about MSRP? Real prices here in Europe are different.


----------



## notb (Feb 27, 2019)

Rivage said:


> Who cares about MSRP? Real prices here in Europe are different.


Because MSRP is the price a card should be sold for.
It's the best estimator we have for average pricing globally.
Hence, it's the price we reference all the time.

You can't say a card costs X < MSRP because there exists a particular seller that offers it for X.


----------



## dirtyferret (Feb 27, 2019)

Rivage said:


> Who cares about MSRP? Real prices here in Europe are different.


I'm in the States and the cards are being sold at MSRP, a $70 difference.


----------



## sergionography (Mar 1, 2019)

kings said:


> The power efficiency of this card is tremendous. Just imagine what Turing on 7nm could be...
> 
> The price needs to be a little lower, but overall, its a good card. It will probably be the new sales champion in the mid-end, like GTX 1060 was!


I am actually not too impressed by Turing. Remember, this chip has a die size almost as big as that of the GTX 1080: at 284 mm² it's about 10% smaller than the 1080's 314 mm², but it is on the smaller 12nm process and is 15% slower than the 1080. That doesn't seem like much improvement; progress would mean at least matching 1080 performance at that die size, and it didn't. I think it would've been way better for the consumer had Nvidia simply rebranded the GTX 1080 rather than spend money on a new chip and tape-out that we the consumers end up paying for.

TL;DR: don't just look at core count and compare; look at die size.


----------



## notb (Mar 1, 2019)

sergionography said:


> TL;DR: don't just look at core count and compare; look at die size.


Why do you care about the die size? It doesn't fit on the PCB or what?

You don't like the card, fine. You're not forced to give a reason, so why make up something so weird?

It's fast, it's efficient, it's frugal, it's pretty cheap.
If it was small as well, what would you point out? You don't like the colour?


----------



## sergionography (Mar 2, 2019)

notb said:


> Why do you care about the die size? It doesn't fit on the PCB or what?
> 
> You don't like the card, fine. You're not forced to give a reason, so why make up something so weird?
> 
> ...



I was specifically replying to someone who was impressed by Turing's efficiency as something "tremendous". It's good and all, but hardly anything impressive, especially compared to a GTX 1080. Also note I was taking a deeper look at the architecture in general. I used die size for the argument because it determines how much performance Nvidia can get out of each mm², which ultimately trickles down to how much it costs to produce; and on that front, Turing doesn't seem to improve much on Pascal (unless I am missing something). Up until now it was hard to have an apples-to-apples comparison because the RTX Turing cards have all the extra ray tracing cores that take up space and add functionality.


----------



## notb (Mar 3, 2019)

sergionography said:


> I was specifically replying to someone who was impressed by Turing's efficiency as something "tremendous".


He is right. Efficiency is excellent.


> It's good and all, but hardly impressive when compared to a GTX 1080.


How can it not be impressive compared to the 1080? It more or less matches it in performance per watt despite belonging to a much lower segment.


> Also note I was taking a deeper look at the architecture in general


That's exactly what I was talking about. On this forum we're consumers, not tech auditors. We should evaluate products, not their technology; we should look at "external" properties: how it performs, how much energy it uses, how much noise and heat it emits. That's what matters.
You think a dense architecture is the sign of technological advancement? Given the choice, would you buy a card that performs worse and is less efficient, just because it has a more dense architecture?

Remember, Nvidia is not selling architecture. Their product is the GPU.
So yes, their die size may not be as small as we could hope for in 2019, but that's their problem, not ours. It's their job to make an attractive product out of what they have available, and they clearly succeeded. We should actually praise them for beating AMD despite not using the more modern 7 nm node.

When you go to a restaurant, you either like the meal or not. And even if it's winter and you've ordered a gazpacho, you're happy when the soup you get is tasty, right? They somehow managed it, bravo!
You don't complain that they should've made you wait until summer, because making gazpacho with fresh tomatoes is easier.


> I used die size for the argument because it determines how much performance NVIDIA can get out of each mm², which ultimately trickles down to how much the chip costs to produce


But the 1660 Ti costs a lot less than the 1070 did at launch, and even slightly less than a 1070 costs today, almost three years after launch. And it's faster. So the price is good, right?
Maybe a smaller die would make it even cheaper, but they managed anyway. So, just like with the soup example, why complain instead of praising Nvidia?


----------



## sergionography (Mar 4, 2019)

notb said:


> He is right. Efficiency is excellent.
> 
> How can it not be impressive compared to the 1080? It more or less matches it in performance per watt despite belonging to a much lower segment.
> 
> ...




Matching the GTX 1080 in performance per watt, yes, but "belonging to a lower segment" is completely superficial. It belongs to a lower segment because NVIDIA wants us to think they're doing more for less. But in reality the production cost of the two chips is very similar, and actually lower for the GTX 1080 if you count the cost of the tape-out etc. Comparing launch prices is meaningless because the GTX 1080 came out in May 2016 on what was then a brand-new process node; producing that same card costs peanuts now compared to back then.

Turing has excellent efficiency only because Pascal has excellent efficiency too; they are practically on the same level. We perceive them as excellent because they are better than anything the competition has, but the reason I pay attention to the architectural side of things is that it gives us a clear indication of how progress is taking place, and the conclusion is that this generation NVIDIA only added new features and didn't improve efficiency much, if at all. My argument isn't a wish that the chip were smaller; it's disappointment at how much GPU progress has slowed down.
We desperately need competition in GPUs, that's for sure.

On another note: perhaps the one justification I can think of for this new chip is streamlining driver work and optimization by basing the whole generation on Turing rather than supporting two architectures, more to do with the end of life of cards/generations down the road. This approach is partly why NVIDIA cards often age pretty badly, but I won't complain here, since Pascal is already three years old and GPU generations have become few and far between.


----------



## notb (Mar 4, 2019)

sergionography said:


> Matching the GTX 1080 in performance per watt, yes, but "belonging to a lower segment" is completely superficial. It belongs to a lower segment because NVIDIA wants us to think they're doing more for less. But in reality the production cost of the two chips is very similar, and actually lower for the GTX 1080 if you count the cost of the tape-out etc. Comparing launch prices is meaningless because the GTX 1080 came out in May 2016 on what was then a brand-new process node; producing that same card costs peanuts now compared to back then.


Why do you talk about production costs so much? Why do you care? You buy a graphics card. It's characterized by price, performance, size and unwanted emissions.
Are you so worried about production costs when you buy cars? Ovens? Tomatoes?


> Turing has excellent efficiency only because Pascal has excellent efficiency too; they are practically on the same level. We perceive them as excellent because they are better than anything the competition has


All correct. Pascal was great. Turing is slightly better, and I'm happy about slightly better. Slightly better beats slightly worse (e.g. RX 580 vs. RX 480).


> but the reason I pay attention to the architectural side of things is because it gives us a clear indication of how progress is taking place


No, it doesn't. We know far too few details. It's a GPU; there's quantum magic going on inside. Don't think about it too much.
GPUs have properties, which we can measure and compare. Some have actual meaning in real life: price, performance, heat etc. Some have little to no meaning for consumers: size, core count, FP32:FP64 ratio.

You've chosen a property that has absolutely no impact on how people use graphics cards. Most people don't even know what the card looks like under the cooler.
And you're trying to argue that there's no progress despite all really important properties suggesting otherwise.


> and the conclusion is that this generation NVIDIA only added new features and didn't improve efficiency much, if at all


The growth on the feature side is possibly the biggest in a decade. RTRT, as a technology (since performance is still lacking), is a huge step in GPU evolution.
And the efficiency gain may not be monstrous, but it's certainly visible. Not in abstract numbers on graphs, but in the coolers: 1660 Ti coolers are much smaller or much quieter than what we got with the 1070.


----------

