Friday, March 8th 2019
Details on GeForce GTX 1660 Revealed Courtesy of MSI - 1408 CUDA Cores, GDDR5 Memory
Details on NVIDIA's upcoming mainstream GTX 1660 graphics card have been revealed, which will help put its graphics-crunching prowess up to scrutiny. The new graphics card from NVIDIA slots in below the recently released GTX 1660 Ti (which provides roughly 5% better performance than NVIDIA's previous GTX 1070 graphics card) and above the yet-to-be-released GTX 1650.
The 1408 CUDA cores in the design amount to a roughly 9% reduction in computing cores compared to the GTX 1660 Ti, but most of the savings (and the performance impact) likely come from the 6 GB of 8 Gbps GDDR5 memory this card is outfitted with, compared to the 1660 Ti's GDDR6 implementation. The amount of GPU resources NVIDIA has cut is so low that we imagine these chips won't come from harvesting defective dies so much as from actually fusing off CUDA cores present in the TU116 chip. Using GDDR5 is still cheaper than the GDDR6 alternative (for now), and it also avoids straining the GDDR6 supply (if that was ever a concern for NVIDIA).

The reference clock speeds of the GTX 1660 non-Ti are expected to be 1530/1785 MHz (base/boost). Custom models, such as the pictured GAMING X, will boost up to 1860 MHz. All graphics cards feature a single 8-pin power connector for increased power delivery. The GTX 1660 is expected to launch on March 14th at the $219 mark.
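For scale, here is a quick back-of-the-envelope sketch of where the cuts land. It assumes the GTX 1660 Ti's widely reported 1536 CUDA cores and 12 Gbps GDDR6, and a 192-bit memory bus on both TU116 cards; none of these figures come from MSI's leak itself.

```python
# Back-of-the-envelope math on the GTX 1660's cuts vs. the GTX 1660 Ti.
# Assumptions (not from the article): 1536 CUDA cores and 12 Gbps GDDR6
# on the 1660 Ti, and a 192-bit memory bus on both TU116 cards.

def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate times bus width, over 8."""
    return data_rate_gbps * bus_width_bits / 8

cores_1660, cores_1660_ti = 1408, 1536
bw_1660 = bandwidth_gb_s(8, 192)      # 8 Gbps GDDR5  -> 192 GB/s
bw_1660_ti = bandwidth_gb_s(12, 192)  # 12 Gbps GDDR6 -> 288 GB/s

print(f"CUDA core cut: {1 - cores_1660 / cores_1660_ti:.1%}")  # ~8.3%, i.e. the quoted ~9%
print(f"Bandwidth: {bw_1660:.0f} vs {bw_1660_ti:.0f} GB/s "
      f"({1 - bw_1660 / bw_1660_ti:.0%} less)")                # a third less
```

If those assumptions hold, the memory bandwidth deficit (roughly a third) dwarfs the single-digit core cut, consistent with the memory downgrade being where most of the performance impact comes from.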
Source: Videocardz
55 Comments on Details on GeForce GTX 1660 Revealed Courtesy of MSI - 1408 CUDA Cores, GDDR5 Memory
At last?
Or no?
The long life of AMD's GPUs is a result of their slower development cycles compared to Nvidia's, and their ability to continuously bring more performance over the entire life of a GPU through driver optimizations, until its replacement is finally released (and beyond).
Pitcairn, I believe, was around for longer than Polaris has been. I could be wrong, though. I know it started in the HD 7000 series and lasted all the way to what, the R7 360 or 370, IIRC? That is a pretty darn long life for a GPU, lol.
Polaris lacks DX12_1 support and cannot accelerate lower precisions. It needs to be replaced. But honestly, with Nvidia's market share being so much higher and Turing having an excellent feature set, those features may be adopted now :)
So, other than Fury/Vega, AMD has been stuck at the same performance level for 6 years, having only released 4 GPUs that ended up faster.
Another way to look at it: if you are gaming at 1080p, AMD STILL doesn't really have a replacement for the ancient R9 290X unless you're willing to pay a hefty price. Otherwise, there are no real performance gains to be had from the red team.
The card was totally fine - it just couldn't stand Frostbite.
So thanks for that at least, that your 3-5 year-old card can still be relevant. Say hello to the GTX 980 or R9 290X.
These days you can easily get away with using cards from 5-6 years ago and still have a decent gaming experience.
Nvidia: Hold my beer.
A slightly cut die with the same amount of memory, but it's GDDR5 this time; it'll end up about 10% slower than the 1660 Ti. Sorta like the 1070 Ti was to the 1080. Meanwhile, a $60 price difference is pretty noticeable for people buying in this segment.
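For the sake of argument, here is a rough perf-per-dollar sketch built on that comment's premises: a ~10% performance deficit (speculation, not a benchmark) and the article's $219 price, which with the $60 gap implies $279 for the 1660 Ti.

```python
# Hypothetical performance-per-dollar comparison. Both inputs are assumptions:
# the ~10% performance deficit speculated above and $219 vs. $279 list pricing.
perf_1660, price_1660 = 0.90, 219        # relative performance, USD
perf_1660_ti, price_1660_ti = 1.00, 279

ratio = (perf_1660 / price_1660) / (perf_1660_ti / price_1660_ti)
print(f"GTX 1660 perf per dollar: {ratio - 1:+.0%} vs. the Ti")  # ~+15%
```

On those numbers, the non-Ti would come out roughly 15% ahead in performance per dollar, which is exactly why the $60 gap matters in this segment.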
As for Nvidia 7nm, it will be one full cycle after 12nm Turing, I think. 12nm will be a short-lived process. That doesn't mean 12nm Turing is bad, per se; it just means that the next generation is hopefully going to be a lot more compelling. I expect a 2080 replacement card around the middle of next year, with 2080 Ti performance but potentially 2060 power use and 2080 pricing. Nvidia have pulled off some mental perf/watt gains in the past, so I think this is reasonable given the full node shrink. Turing itself is ripe to be shrunk down, so I think the gains will come from similar or more cores on smaller dies, plus clock speed/bandwidth increases (like Maxwell -> Pascal).
Hopefully, if Navi is good and with Intel coming into the dGPU market too, 2020 should be a very interesting year for PC graphics ^^
Given a simple compute task, GCN is pretty damn good. It scales well across the entire compute engine, but 3D graphics are a bit different, I think. "Workload distribution" was a major problem area that Vega was supposed to address, along with primitive rate / geometry (remember Primitive Shaders?). That never came to fruition due to issues with the design, the silicon and/or the driver - that's what I heard. Vega is unfinished and missed the performance target AMD was aiming for (the 1080 Ti). But I digress. Ironic that it's called "Graphics" Core Next; it should be called CCN: "Compute Core Next" XD