Tuesday, November 13th 2018
NVIDIA Deploys GP104 GPU for GDDR5X version of GeForce 1060
NVIDIA has just shown us one of the most ingenious ways of creating new, competitive SKUs for the midrange market without spending any additional money on R&D, wiring, or memory controller work: reuse chips where that work is already done. This is the case with NVIDIA's new GTX 1060 GDDR5X graphics card, which the company has "designed" to further fill the gaps in its midrange offerings against a revamped Radeon RX 590.

Instead of reworking a purpose-built memory controller solution for the GP106 GPU, the company has carved the SKU from its existing GP104 silicon, which already supports the GDDR5X memory subsystem thanks to its implementation on the GTX 1080 (GP104) and 1080 Ti (GP102) graphics cards. This is a smart use of GP104 inventories - which have been superseded by NVIDIA's new RTX 20-series at the high end - or of heavily defective dies (remember, the GTX 1060 has half the shaders, at 1280, compared to the GTX 1080's 2560). This decision by NVIDIA could also go some way toward explaining dwindling inventories and rising prices of GTX 1080 graphics cards, as chips that could have been used for that SKU are (possibly) being diverted to the new GTX 1060.

The discovery came courtesy of a teardown, posted on Taobao, of iGame's GTX 1060 U-TOP V2, which features a triple-fan cooling solution, 2x 8-pin power connectors, and an 8+2-phase power delivery design. Apparently, even the SLI fingers remain on the card - which, if you'll remember, were never supported on NVIDIA's GTX 1060 - a result of iGame's decision to simply reuse its PCB design for the usually much more powerful, GP104-based GTX 1080.
Sources:
Taobao, via Videocardz
65 Comments on NVIDIA Deploys GP104 GPU for GDDR5X version of GeForce 1060
This is like an entirely different card or revision. It's more like a GTX 1060 2.0.
Though, being English, I honestly read most of it as a sarcastic jab - who knows.
Maybe - no, no shader unlock possible, since their BIOS is a hard nut to crack these days.
Can't figure out how to quote Raevenlord: "This decision by NVIDIA could also go some way in explaining dwindling inventories and increasing pricing of GTX 1080 graphics cards, as chips that could have been used for that SKU could be used for the new GTX 1060."
How is gelding perfectly good GP104s that could be sold as GTX 1080s (or 1070 Tis) a logical use? Other than that they want to stop selling them to drive folks to buy into RTX... And between accomplishing that goal and unloading what is probably a Bandini Mountain of chips that could not be used for a lowly GTX 1070, they give us this?
Now, had they made a card with 9 GB of GDDR5X, it would've offered that cool WOW factor, but it wouldn't have been any more of a benefit than this.
videocardz.com/79007/amd-radeon-rx-590-final-specs-pricing-and-performance
I wouldn't be surprised if they do the same later with the RTX 2070: just use defective TU104 chips and it's good to go.
The level of performance relative to the full lineup makes a GPU low-end, mid-range or high-end, not the fucking die size.
Based on your silly argument, the RTX 2070 is a higher-end card than the GTX 1080, which isn't the case.
Hence why all of you are baffled as to why Nvidia chooses to sell a product based on a "high-end" part in a lower-end segment, when in fact this makes perfect sense within their product stack and its hierarchy.
www.techpowerup.com/reviews/MSI/GTX_560_Ti_448_Cores_Twin_Frozr_III/
Are we really supposed to believe that after two years of producing Pascal, TSMC's yields have gone down, not up, to the point where they can only produce half-working GP104 chips? Nope.
So the only other option is that NVIDIA has killed off GTX 1070/1070 Ti/1080 production in favour of seriously neutering GP104 and putting it into this "GTX 1060 GDDR5X". But that doesn't make sense either: why take a chip that in its lowest-end configuration (GTX 1070) was paired with cheaper GDDR5 memory, cut it down by half, and pair it with more expensive GDDR5X? Why not just chop GP104 down a little less - say, to 1440 shaders - and leave it coupled with GDDR5, creating a GTX 1070 "lite" that could convincingly beat the RX 590?
The only possibility I can imagine is that there is some f**kery with GP104's eight memory controllers that means they can't work with non-multiple-of-4 GB GDDR5 capacities but aren't limited that way with GDDR5X - although that seems unlikely, since those controllers pair into 4 sets of 2 (4x 64-bit) for GDDR5 mode and run as 8 sets of 1 (8x 32-bit) for GDDR5X, so cutting the memory down to 6 GB would imply 3x2 for GDDR5, which should work just fine.
More information is needed here.
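The controller arithmetic in the post above can be sketched out quickly. Note that the grouping claim (GDDR5 pairing controllers into 64-bit sets, GDDR5X driving them individually) is the commenter's assumption, not something NVIDIA has confirmed; the 1 GB-per-channel figure is likewise an assumption based on common 8 Gb memory chips.

```python
# Sketch of the memory-controller arithmetic for GP104, per the
# commenter's (unconfirmed) assumptions about controller grouping.

CHANNEL_WIDTH_BITS = 32   # each GP104 controller is 32-bit
CHIP_CAPACITY_GB = 1      # assumed 8 Gb (1 GB) chip per active channel

def config(active_controllers):
    """Return (bus width in bits, capacity in GB) for N active controllers."""
    bus_bits = active_controllers * CHANNEL_WIDTH_BITS
    capacity_gb = active_controllers * CHIP_CAPACITY_GB
    return bus_bits, capacity_gb

# Full GP104 (GTX 1080): all 8 controllers active
print(config(8))  # (256, 8) -> 256-bit bus, 8 GB

# GTX 1060 GDDR5X: 6 of 8 controllers active
print(config(6))  # (192, 6) -> 192-bit bus, 6 GB
```

Either way the math lands on the same 192-bit/6 GB result, whether the six active channels run as 3x2 pairs (GDDR5) or 6x1 (GDDR5X), which is why the grouping alone doesn't obviously explain the memory-type choice.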
nGreedia never fails to amaze with its callousness, greed and unscrupulous business...