Wednesday, March 4th 2015
NVIDIA Unveils the GeForce GTX TITAN-X
NVIDIA surprised everyone at its GDC 2015 event by unveiling the GeForce GTX TITAN-X, its flagship graphics card based on the "Maxwell" architecture. The unveiling was no formal product launch and didn't come with a disclosure of specs: just a look at the card itself, and a claim by no less than NVIDIA CEO Jen-Hsun Huang that the card will be faster than the current-gen dual-GPU GTX TITAN-Z. Even so, some highly plausible rumors about its specs are doing the rounds.
The GTX TITAN-X is a single-GPU graphics card, expected to be based on the company's GM200 silicon. This chip is rumored to feature 3,072 CUDA cores based on the "Maxwell" architecture, and a 384-bit wide GDDR5 memory interface holding 12 GB of memory. NVIDIA is likely taking advantage of new 8 Gb GDDR5 chips, though even without them, achieving 12 GB using 4 Gb chips isn't impossible. The card itself looks nearly identical to the GTX TITAN Black, with its nickel alloy cooler shroud, save for two differences: the "TITAN" marking towards the front of the card glows white, while the fan is decked with green lights, in addition to the green glowing "GeForce GTX" logo on the top. You get to control the lighting via GeForce Experience. NVIDIA plans to run more demos of the card throughout the week.
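The 12 GB claim can be sanity-checked with some quick arithmetic. The sketch below assumes only basic GDDR5 layout conventions (a 384-bit bus is 12 independent 32-bit channels, one chip per channel per side), not anything NVIDIA has confirmed:

```python
# Back-of-the-envelope check of the 12 GB figure.
# Assumption: a 384-bit bus = 12 x 32-bit channels, so 12 chips per PCB side.
def chips_needed(total_gb, chip_gbit):
    # Each chip stores chip_gbit gigabits, i.e. chip_gbit / 8 gigabytes.
    return int(total_gb / (chip_gbit / 8))

print(chips_needed(12, 8))  # 12 chips of 8 Gb: one per channel, one PCB side
print(chips_needed(12, 4))  # 24 chips of 4 Gb: clamshell, both PCB sides
```

Either configuration works on a 384-bit bus; the 4 Gb route simply requires populating both sides of the board.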
Source:
PC World
71 Comments on NVIDIA Unveils the GeForce GTX TITAN-X
Anyway, the card is being marketed as a gaming card again. I wonder what the real performance will be like; the rumors, based on comments made by the upper brass, peg it as better than the TITAN-Z, which would be fantastic.
A few extra shots, including what seems to be the standard fit-out for I/O (I wonder if they added DP 1.2a/1.3 support?)
Since the pictures of the TITAN X seem to show 12 memory ICs on the back of the PCB (and presumably 12 on the other side), the card is likely using 4 Gbit ICs. A second (salvage) part could simply omit the 12 chips on the back of the PCB. 300W nominal, yes.
75W via the PCI-E slot, 75W via the 6-pin, 150W via the 8-pin.
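Those three sources tally neatly with the quoted 300W figure. A quick sum, using the standard PCI-E specification ceilings rather than anything measured on the card:

```python
# PCI-E specification power ceilings in watts (spec limits, not measured draw)
power_sources = {"PCI-E slot": 75, "6-pin connector": 75, "8-pin connector": 150}
total_watts = sum(power_sources.values())
print(total_watts)  # 300
```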
And I think you completely misunderstood; I meant no offense to either camp. It's just that I think AMD needs to focus on actual products that will make them money in their current state, because their budget is super low, and their products now are so few, with many of them old and outdated (the FX series and most of their Radeon lineup). AMD keeps investing in HSA, Mantle, virtual reality tools, TressFX and other software perks, when in reality these things will not get them any of the fast cash they need now. I have no problem with any of these technologies, but I do have a problem with the timing. And this is the mistake AMD keeps making: they come up with excellent futuristic tools and technologies that end up being the way of the future, but because they have no money they fail to push such standards, or someone else ends up capitalizing on them.
Look at Mantle, for example: they recently announced that it will step aside for DX12 and the new OpenGL. So my question is, was it worth it for AMD? Yes, they pushed gaming forward, and yes, we all benefited thanks to AMD, but in return AMD only lost market share to NVIDIA, because NVIDIA has a newer architecture and newer products almost top to bottom now.
:D
What do we think is the prospect of a 28 nm process providing perfect GM200 parts? We know that GM204 had segments disabled, so the possibility that the TITAN X also ships with deactivated SPs, texture units and/or memory crossbars is highly probable.
GM200 has minimal FP64 support; GK210 is a development of GK110.
GK110 has a 1:3 FP64 rate. GK210 improves that to 1:2.5 (along with doubled cache/registers over GK110), and is expected - at least initially, until GP100/200 (and ultimately GV100/200) arrives - to power the DoE's Summit supercomputer. No process provides 100% yield, but how do you arrive at "highly probable"? Of all the Maxwell cards in existence, only ONE, the 970, is affected by memory segmentation, and that looks to be a strategic decision rather than purely architectural. GM204 has five different SKUs associated with it, and only one is affected; GM107 has four SKUs associated with it and none are affected; GM108 has three SKUs, none affected; nor is GM206.
So you assert that the issue is "highly probable" based upon a one-in-thirteen occurrence. :slap: Although, bearing in mind your prognostication record, I'm very glad to see that you expect a neutered part with memory segmentation; it augurs well for the TITAN X being a fully enabled part ;)
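For what it's worth, the one-in-thirteen figure works out as follows. The SKU counts come from the post above; the GM206 count of one is inferred from the poster's total, and treating every SKU as equally likely is a simplification, not the poster's claim:

```python
# SKU counts per Maxwell GPU as listed in the post above
# (GM206 count inferred from the poster's one-in-thirteen total)
sku_counts = {"GM204": 5, "GM107": 4, "GM108": 3, "GM206": 1}
total_skus = sum(sku_counts.values())  # 13 SKUs across all Maxwell GPUs
affected = 1                           # only the GTX 970 has memory segmentation
print(f"{affected}/{total_skus} = {affected / total_skus:.1%}")  # 1/13 = 7.7%
```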
~Two weeks out from a launch, and no reliable benchmarks have yet surfaced. That might be a record.
You are probably too young to remember the 3dfx vs. {the rest of the world} times. ;) They were mostly producing video cards with 2, 3, and even 4 GPUs, competing with ATI, NVIDIA, Matrox, etc., and winning. And nobody cared whether it was multi-GPU or not. Just the performance.
In any case, it is up to the individual user what they deem an appropriate trade-off. Dual-GPU cards, especially the 295X2 at its current pricing, are appealing for some, but every time a new game arrives the onus is on the game dev and driver team to make sure CFX/SLI works as advertised, or the experience takes a nosedive. The 295X2 may also have longevity issues from what I've seen: while the GPUs remain nice and cool, the same can't be said for the voltage regulation and the heat radiating through the PCB across the whole card. Localized temperatures of 107°C can't be conducive to board integrity.
The only instance in which I rely on dual-GPU setups is when existing single-GPU performance has been exhausted; e.g. I bought one Titan and it wasn't enough for what I was doing, so I bought another, but only because there wasn't a faster single GPU than the Titan that time around.
If you had as much experience as me with multi-GPU setups, you'd change your mind, trust me.
AFR needs to be completely forgotten.
Sure, not all 970s have a bad L2, but they had to fuse one (of 14) off on every one to achieve parity across the volume they need to sell. Yes, a "strategic decision" that was arrived at because of the sheer volume of 970s they intended to market, which I could see outselling the other derivatives by a wide margin. I might go as far as to say that for every four 970s sold, they probably sell one 980, and then perhaps one across the three other associated SKUs.
So, as I understand it now, the GM200 (as the TITAN X) is standing on its own as the top part... so it's "highly probable" it won't be cut down, while the idea that lower derivatives might still need memory crossbars disabled retains some veracity.