
NVIDIA GTX 1660 Ti to Perform Roughly On-par with GTX 1070: Leaked Benchmarks

I'd quite enjoy it being closer to 1070 in price.
But with sales diving (and blaming the "mining craze" is apparently BS in this context), there is only so much money you can squeeze out of customers.
Mining craze is over for now... the value plummeted, so now people are mining in the hope it goes up.
 
Nvidia is definitely profiteering due to lack of competition, their profit must be going through the roof - the best time to invest in their stock! Oh, wait...
 
This card may be one of their better bang for the buck models in this generation (right, not saying much).

1660ti is said to have 190 tensor cores
That thing you posted? Said 192, I thought... but these don't have either, from what I have seen.

Would be great if they release cards without the tensor cores for less, say a couple of hundred dollars less.
So, you want a $79 GPU that performs like a GTX 1070? lol!


Excuse me, but why is every post being downvoted here?
Systemic abuse of a doomed system from inception?

(surely this post and yours will be deemed low quality... keep treating the symptoms and not the real issue, TPU... yawn)
 
DLSS is pretty much useless for this card atm; it'll perform well enough at 1080p/1440p not to bother with enabling this new and questionable technology. Nvidia said themselves that DLSS is rather meant for high-end cards that can't deal with super heavy IQ settings like RTX at high resolutions. Frankly, I'd rather have them optimize DLSS for the 2080 Ti and 2080 at 4K only, for the absolute best result, rather than push it to the RTX 2070/1440p and lower-tier cards/lower resolutions.
 
You’re right, it should be given away for free, because that is how businesses pay their workers. :rolleyes:
I'm simply going after all those who think it should cost substantially less, it's getting really boring to read week after week.

Maybe you did miss my "/s"? :)
 
Mining craze is over for now... the value plummeted, so now people are mining in the hope it goes up.
There is no way an apparent "mining craze is over" would not have been obvious when certain folks did the sales forecast.
They overestimated things by a whopping 25%, and nearly all of it is from customers not bending over backwards en masse anymore.

2008 was the height of the financial crisis
The end of 2008 was when the financial crisis hit; for most of 2008 there was crazy growth, in many places double-digit.

I feel offended for some weird reason, so let me call your arguments "whining"
That is fine.

The main reason is that manufacturing got more and more difficult, and the chips that we have today are bigger than ever.
You could not be any "wronger". 1080 at 330mm^2 was ridiculously small.
We can't discuss "why" of things becoming expensive here, without many getting hysterical over it, so let's just keep it at what is not deniable: it got much more expensive and inflation is a laughable excuse for it.
 
There is no way an apparent "mining craze is over" would not have been obvious when certain folks did the sales forecast.
They overestimated things by a whopping 25%, and nearly all of it is from customers not bending over backwards en masse anymore.
You must have been living under a rock not to have noticed how mining was last year, and then, when the bubble looked like it was about to burst, how people started selling off coins and equipment, which just helped drop the value even more, which made more people sell.
Right now you're looking at ASICs if you want to even come close to mining at a rate that may be profitable.
Most of the hobby miners have figured this out, and the big farms aren't using GPUs.

Anyone could have foreseen this happening, and many did, which didn't help the value of cryptocurrency when they dumped their coins.
When some GPU-only cryptos had ASICs designed for them, it was the beginning of the end, "for the time being", of GPU mining.

I think this may actually be along the lines of what you're saying anyway.
 
I'm waiting for the 1660 non ti benchmarks and price point.
 
Remember when a new series that might offer a 20% performance uplift usually didn't also march up 20% in price?

From the MSRP of the last card that occupied this performance pedestal, a card that provides 2 GB less memory and will almost undoubtedly use a smaller chip is a 15% price reduction... This is a yawn, as those with a good GTX 1060 6 GB aren't seeing an upgrade, and GTX 1070 owners are still in the same position.
 
Anyway, a 1080p card (which is what this is) doesn't need more than 6GB... not even close, most of the time. Would you rather 16GB of HBM2 go on it to raise the price more for no reason? Or 8GB instead of 6GB... also for no reason? No point for a 1080p card, honestly. NVIDIA, IMO, got this one right for its performance. At default Ultra settings at 2560x1440, it isn't a 60 FPS card in most titles (assuming it's close to a 1070 as rumored). Settings will need to be lowered to reach that in most titles, which means lower vRAM requirements. ;)
 
and DLSS (deep-learning supersampling).

Are you sure about that? 1660ti is said to have 190 tensor cores

Why would you even want DLSS when it's already proven that running the game at a lower resolution and upscaling it gives you the same performance with better image quality?
 
Not really sure what to make of this. 1070 performance that launched at $349, right? Or was it $399? In either case, I can't really say this is some large step forward.

Anyone could have foreseen this happening, and many did

Except for NV apparently judging by their overstock. Anywho, that is for another thread.
 
$379 for GTX 1070 MSRP. This is $100 less for similar performance.

I would be more inclined to call it acceptable if the 1070 had launched last year instead of 2 years and 9 months ago. I get that times are harder, but I just don't know if I can call it a 'good deal.' It is what it is, though, and it just highlights how exciting it could be with a better competitive stack. Not sure if it will be Intel or not. I don't expect them to exceed NV, but I do give them a slight chance at exceeding AMD.
 
How that information is sliced is not my concern. It was a statement of fact. :)

We'll see here soon enough its price to performance ratio and where it lands!
 
How that is sliced is not my concern. It was a statement of fact. :)

And mine was merely an opinion based off your fact. Agree with your edit!
 
Wow! Some think that in 2019, and at $280, we should enjoy the idea of this being considered only a 1080p card. :wtf:

If I'm paying $280 and not getting usable 1440p, it's an even worse deal. Sad that this passes as really "new" with more hype than an RX 590, while Nvidia held strong to a supposed $379 MSRP that in many people's minds was really supposed to be in line with the 1070 Ti, and just never got cut to the $330 it really settled into. Just because Nvidia kept a stiff upper lip after mining went bust, we aren't drinking the diluted Kool-Aid to think that has any reality after two years.
 
This card looks like it might be a gem. With AMD pushing up the price of the mediocre RX 590 just three months ago, they have sadly once again left the door wide open for Nvidia to outperform them and undercut them too, because Nvidia doesn't need three dodgy games to sell a superior product.
 
DLSS is pretty much useless for this card atm; it'll perform well enough at 1080p/1440p not to bother with enabling this new and questionable technology. Nvidia said themselves that DLSS is rather meant for high-end cards that can't deal with super heavy IQ settings like RTX at high resolutions. Frankly, I'd rather have them optimize DLSS for the 2080 Ti and 2080 at 4K only, for the absolute best result, rather than push it to the RTX 2070/1440p and lower-tier cards/lower resolutions.

RTX is dependent on tensors, hence the opportunity to try DLSS. It's also clear that current RTX/tensor performance isn't optimal below perhaps 16.7ms frame time for performance/IQ tradeoff. However, general MLAA techniques might prove very useful for even low/mid tier GPUs in future. Existing temporal AA techniques take many dev hours to implement, whereas the DLSS toolchain can be up & running in a couple of days with little developer time. That's a big win & should scale well.
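For reference, the "16.7ms frame time" figure above is just the per-frame budget at 60 FPS. A minimal sketch of that arithmetic (the function name is illustrative, not from any real API):

```python
# Frame-time budget behind the "16.7 ms" figure: at 60 FPS each frame
# has 1000/60 ≈ 16.7 ms, and any RTX/tensor work (ray tracing + DLSS)
# must fit inside that budget alongside normal rasterization.
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available per frame at a given frame rate."""
    return 1000.0 / fps

print(round(frame_budget_ms(60), 1))   # 16.7 ms at 60 FPS
print(round(frame_budget_ms(144), 1))  # 6.9 ms at 144 FPS
```

So the claim amounts to: below roughly a 16.7 ms budget (i.e. above 60 FPS targets), the fixed cost of the RTX/tensor passes eats too large a share of the frame for the performance/IQ tradeoff to pay off.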

This card looks like it might be a gem. With AMD pushing up the price of the mediocre RX 590 just three months ago, they have sadly once again left the door wide open for Nvidia to outperform them and undercut them too, because Nvidia doesn't need three dodgy games to sell a superior product.
Perhaps because NV left a hole in their price structure. A bit like NV with the 2080 allowing AMD to price the MI50 @ $700. Besides, Nvidia has their own dodgy game selection to plug their RTX series...
 
Why would you even want DLSS when it's already proven that running the game at a lower resolution and upscaling it gives you the same performance with better image quality?

This won't be true forever. In 2 years, when 80% of AAA games use DLSS and Nvidia gets the kinks out, I might enjoy a 25% boost.
 
This won't be true forever. In 2 years, when 80% of AAA games use DLSS and Nvidia gets the kinks out, I might enjoy a 25% boost.

That's a MIGHTY big dream...What you been smokin'?
 
That's a MIGHTY big dream...What you been smokin'?
In 2 years, PS5, Xbox 2, AMD, Intel graphics, and Microsoft DX12 will all be supporting RT and some sort of DLSS.
I guess they are all smoking what I'm smoking.

As for today, Nvidia owns the GPU market, is unmatched in innovation and performance, and what they do, game developers will follow; and where game developers go, everyone follows. Wanna hit? You seem to need some of this reality weed.
 