Wednesday, March 4th 2015

NVIDIA Unveils the GeForce GTX TITAN-X

NVIDIA surprised everyone at its GDC 2015 event by unveiling its flagship graphics card based on the "Maxwell" architecture, the GeForce GTX TITAN-X. The unveiling was no formal product launch and came without a disclosure of specs, offering only a look at the card itself and a claim by no less than NVIDIA CEO Jen-Hsun Huang that the card will be faster than the current-gen dual-GPU GTX TITAN-Z. Even so, some highly plausible rumors about its specs are doing the rounds.

The GTX TITAN-X is a single-GPU graphics card, expected to be based on the company's GM200 silicon. This chip is rumored to feature 3,072 CUDA cores based on the "Maxwell" architecture, and a 384-bit wide GDDR5 memory interface holding 12 GB of memory. NVIDIA is likely taking advantage of new 8 Gb GDDR5 chips, although achieving 12 GB with 4 Gb chips isn't impossible either. The card itself looks nearly identical to the GTX TITAN Black, with its nickel-alloy cooler shroud, save for two differences: the "TITAN" marking towards the front of the card glows white, and the fan is decked with green lights, in addition to the green-glowing "GeForce GTX" logo on the top. The lighting is controlled via GeForce Experience. NVIDIA plans to run more demos of the card throughout the week.
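As a rough sanity check on the memory rumor, here is a minimal sketch of the arithmetic for a 384-bit GDDR5 bus reaching 12 GB with either 8 Gb or 4 Gb chips; the per-chip 32-bit interface and the chip counts are generic GDDR5 assumptions, not confirmed specs for this card.

```python
# Rough sanity check of the rumored 12 GB / 384-bit GDDR5 configuration.
# Assumes standard GDDR5 packages with a 32-bit interface each; densities
# are in gigabits (Gb). Illustrative only, not confirmed specs.

BUS_WIDTH_BITS = 384
CHIP_IO_BITS = 32  # each GDDR5 package presents a 32-bit interface

def capacity_gb(chip_density_gbit: int, chips_per_channel: int = 1) -> float:
    """Total capacity in GB for a given chip density and chips per 32-bit channel."""
    channels = BUS_WIDTH_BITS // CHIP_IO_BITS  # 12 channels on a 384-bit bus
    return channels * chips_per_channel * chip_density_gbit / 8  # 8 Gb = 1 GB

print(capacity_gb(8, chips_per_channel=1))  # 12 x 8 Gb chips -> 12.0 GB
print(capacity_gb(4, chips_per_channel=2))  # 24 x 4 Gb chips (clamshell) -> 12.0 GB
```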
Source: PC World

71 Comments on NVIDIA Unveils the GeForce GTX TITAN-X

#51
GhostRyder
Casecutter:

Interestingly, there's no news here that AMD officially (as in, on the record) said they are demoing Showdown running on the Oculus Rift Crescent Bay, and that they are doing it with an unannounced Radeon R9 flagship ultra-enthusiast product.
www.tomshardware.com/news/amd-radeon-r9-oculus-rift,27298.html#amd-radeon-r9-oculus-rift,27298.html?&_suid=142558478372105216857995277943

Here's something I thought: Jen-Hsun said "the card will be faster than the current-gen dual-GPU GTX TITAN-Z". Well, can't that mean it could be a dual GM204, or does the "12GB framebuffer, 8 billion transistors" claim make that improbable?
Well, if it turns out to be a dual-GPU design, they are going back to the era of the GTX 295, with a blower at the end of the card pushing air across two GPUs (though not the dual-PCB variants).
heydan83: Yeah, I notice the same as you; the first thing that was weird to me was no back plate, and then the "Titan" name instead of Titan X. Man, these guys are going to continue with lies because they saw fanboys are gonna support them even if they give shit to them... by the way, can you put the link of the video?
Yea, I would have thought there would be an X illuminated somewhere on it. Would be a cool design choice and something I was actually expecting.

Anyway, the card is being marketed as a gaming card again. I wonder what the real performance will be like; the rumors, based on comments made by the upper brass, peg it as better than the TITAN-Z, which would be fantastic.
#52
HumanSmoke
Casecutter: Wait, this wasn't a reveal or preview… Didn't Jen-Hsun pull a card out of a box marked "Titan X" and present it to Tim Sweeney, head of Epic Games? The cooler just says "Titan"; why not the X on it? Almost like they didn't have time to even ante up for a faux 3D stereolithography cover that had "Titan X" on it. It's more like a cheap PR stunt, to get everyone to go "squirrel" and to put his "apology" in the rearview mirror.
The card appears to me to be a Titan with some paint and lights… I just see that Spanish dude from that video, you know that supposed “engineer” ranting about the 970 fiasco. He’s back in a video laughing his ass off as he’s saying... “he pulls out this gussied-up old Titan with paint and lights from a box and holds it up as the second coming, and they drop to their knees!... Like remember the wood screws…”
Other than "12GB framebuffer, 8 billion transistors", I don't see that Nvidia gave out any technical specs, correct? So more Barnum & Bailey than news, all hinged on words like expected, rumored, likely, plans, etc.
Casecutter: The cooler just says "Titan"; why not the X on it?
Custom designed to stir up conspiracy theorists, those prone to pathological hysteria, and get the product talked about on the interwebz ahead of AMD's 390X reveal. Seems to be working so far.
Casecutter: The card appears to me to be a Titan with some paint and lights…
Or recycling the shroud from the Titan Black, which also had lights. Fine by me, I quite like the look of the reference cooler, and if they continue to use it, it should attain iconic design status.
Casecutter: I just see that Spanish dude from that video... Like remember the wood screws
Different products, and in this instance a different era as well. It's not like they are directly connected....like they are rebrands or anything.
Casecutter: Other than "12GB framebuffer, 8 billion transistors", I don't see that Nvidia gave out any technical specs, correct?
Nvidia has reserved launches for GTC, not GDC. This was more of a "get in before the competition on the reveal" play. I'll grant that it isn't as classy as the month-long, excitement-building "launch an already launched card in a secondary market" marketing exercise of their competitors, but not everyone has talent of the calibre of Roy Taylor to call upon.

A few extra shots, including what seems to be the standard fit-out for I/O (I wonder if they added DP 1.2a/1.3 support?)
#53
xorbe
Parn: Doubt the 980 Ti will be a full GM200 chip. Most likely it will be a cut-down version, something like 2560 SPs, 320-bit, 5 GB of memory.
That's what I was thinking, 320-bit 5GB. Or 320-bit 6GB with 1GB of that "a bit slower" ...
#54
HumanSmoke
xorbe: That's what I was thinking, 320-bit 5GB. Or 320-bit 6GB with 1GB of that "a bit slower" ...
I doubt they will be volume sellers in any event, so those who want the flagship for the sake of having the fastest card will pay the premium. A salvage part using 6GB of 4Gbit/7Gbps (rather than 4-8Gbit/8Gbps) memory would seem the obvious choice. Retaining the same number of memory ICs would help with keeping the reference PCB standardized across a few different SKUs.
Since the pictures of the Titan X seem to show 12 memory ICs on the back of the PCB (and presumably 12 on the other side), the card is likely using 4Gbit ICs. A second (salvage) part could just omit the 12 chips on the back side of the PCB.
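For what it's worth, here is a back-of-the-envelope sketch of those configurations; the IC counts, densities, and bus widths are just the speculation from the posts above, not confirmed specs.

```python
# Speculative memory configurations discussed above; counts and densities
# are the posters' guesses, not confirmed specs. 8 Gbit = 1 GB.

def total_gb(num_ics: int, density_gbit: int) -> float:
    """Total memory capacity in GB for a given IC count and per-IC density."""
    return num_ics * density_gbit / 8

print(total_gb(24, 4))  # Titan X as pictured: 24 x 4 Gbit on 384-bit -> 12.0 GB
print(total_gb(12, 4))  # hypothetical salvage part, back-side ICs omitted -> 6.0 GB
print(total_gb(10, 4))  # alternative 320-bit cut-down guess -> 5.0 GB
```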
the54thvoid: And in other news: 6-pin and 8-pin power connectors... What does that make the power draw? 300 watts?
300W nominal, yes.
75W via the PCI-E slot, 75W via the 6-pin, 150W via the 8-pin.
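Summed up, that is where the 300 W nominal figure comes from; these are the PCI-E specification ceilings for each power source, not a measured TDP for the card.

```python
# Nominal power budget from the PCI-E specification limits quoted above (watts).
# This is the ceiling the connectors allow, not the card's actual TDP.
power_limits_w = {
    "PCI-E slot": 75,
    "6-pin connector": 75,
    "8-pin connector": 150,
}
print(sum(power_limits_w.values()))  # 300
```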
#56
sergionography
buggalugs: They never make much money from these cards at $1000-$1500; very few people buy them. I don't think they will make much money on this at all after consumers got burnt last time. They charge a fortune for the Titan, then 8 weeks later release mainstream cards with better performance for half the price!!

AMD have the 3xx series coming very soon too... it's a shame you can't see how you sound like a moron making comments like that. It's not funny or correct, just petty...
Only the 390X is a new chip; the rest are rebrands of Hawaii and Tonga. And Fiji has been ready for a while, but AMD mentioned they are just doing "final touches" (though I don't know why they are taking their sweet time).
And I think you completely misunderstood; I meant no offense to either camp. It's just that I think AMD needs to focus on actual products that will make them money in their current state, because their budget is super low, and their products now are so few, with many of them old and outdated (the FX series and most of their Radeon lineup). AMD keeps investing in HSA, Mantle, virtual reality tools, TressFX, and other software perks, when in reality these things will not get them any of the fast cash they need now. I have no problem with any of these technologies, but I do have a problem with the timing. And this is the problem AMD keeps making: they come up with excellent futuristic tools and technologies that end up being the way of the future, but because they have no money they fail to push such standards, or someone else ends up capitalizing on them.

Look at Mantle, for example: they recently announced that it will step aside for DX12 and the new OpenGL, so my question is, was it worth it for AMD? Yes, they pushed gaming forward, and yes, we all benefitted thanks to AMD, but in return AMD only lost market share to NVIDIA, because NVIDIA has a newer architecture and newer products almost top to bottom now.
#57
arbiter
HumanSmoke: 300W nominal, yes.
75W via the PCI-E slot, 75W via the 6-pin, 150W via the 8-pin.
I would say it puts it more around 225-250 watts of draw. NVIDIA tends not to draw full power out of the PCI-E slot and leaves room for boosting, so I would say the expected TDP is 250 W or lower. Still much lower than the 295X2, and, if the report is right, only 10% slower at 4K.
sergionography: Look at Mantle, for example: they recently announced that it will step aside for DX12 and the new OpenGL, so my question is, was it worth it for AMD? Yes, they pushed gaming forward, and yes, we all benefitted thanks to AMD, but in return AMD only lost market share to NVIDIA, because NVIDIA has a newer architecture and newer products almost top to bottom now.
How did Mantle push anything forward? About the only thing it did for sure was force MS to announce that DX12 had been in the works. As much as people want to say it made DX12 slim down closer to the metal, I doubt that; given how much time MS has to put into DirectX development, they couldn't have put it together and had it working so fast.
#58
HumanSmoke
arbiter: I would say it puts it more around 225-250 watts of draw. NVIDIA tends not to draw full power out of the PCI-E slot and leaves room for boosting, so I would say the expected TDP is 250 W or lower. Still much lower than the 295X2, and, if the report is right, only 10% slower at 4K.
You're likely right for the card. The previous Titans were 250W TDP, which allows some wriggle room for overclocking. I was actually answering the54thvoid's query about the maximum nominal wattage for 8-pin + 6-pin PCI-E power connections.
arbiter: How did Mantle push anything forward? About the only thing it did for sure was force MS to announce that DX12 had been in the works. As much as people want to say it made DX12 slim down closer to the metal, I doubt that; given how much time MS has to put into DirectX development, they couldn't have put it together and had it working so fast.
Well, it probably heightened the buzz for DX12 if nothing else. Mantle, as DICE originally envisaged it, borrows from console APIs (DX 11.X, GNM, GNMX); DirectX 11.X in all likelihood predates Mantle (the early names for the Xbone included "11 X"); and AMD's own "Gaming Scientist" Richard Huddy acknowledged last year that DX12 development started before Mantle:
[Huddy] repeated the contention that Mantle shaped DirectX 12's development. We expressed some doubts about that contention when we addressed it earlier this year, but Huddy was adamant. Development on DirectX 12's new features may have begun before Mantle, he said, but the "real impetus" for DX12's high-throughput layer came from the AMD API.
Which is supported by Nvidia's and AMD's own efforts on OpenGL extensions during the same earlier timeframe. It is extremely unlikely that all this cross-vendor work happened in isolation and spontaneously arrived at three APIs more or less simultaneously.
#59
radrok
I'd love to see some custom PCBs but I think it'll be a dream like it was with the original Titan...
#60
sergionography
arbiter: I would say it puts it more around 225-250 watts of draw. NVIDIA tends not to draw full power out of the PCI-E slot and leaves room for boosting, so I would say the expected TDP is 250 W or lower. Still much lower than the 295X2, and, if the report is right, only 10% slower at 4K.

How did Mantle push anything forward? About the only thing it did for sure was force MS to announce that DX12 had been in the works. As much as people want to say it made DX12 slim down closer to the metal, I doubt that; given how much time MS has to put into DirectX development, they couldn't have put it together and had it working so fast.
No, it did; I think that part has been proven already. DX12 being in the works doesn't mean it was geared for what it turned out to be after Mantle. And it's not just DX12 but also Vulkan, for which one of the developers tweeted "thanks AMD", because Mantle was largely adopted and used as the foundation of the API. AMD basically made Mantle and showed Microsoft and the OpenGL team how it's done, so let us not be ungrateful; we should at least give them credit where it's due, especially when they made no money out of it and it was completely for charity lol.
#61
DeadSkull
So does this card come with 10.5 GB of video game RAM + 1.5 GB for Windows cache?

:D
#62
Casecutter
The question is how many of these GM200 chips Nvidia is working with to fill corporate contracts, and when such an offering looks to be ready for HPC projects... GM200 isn't here just for gamers.

What do we think is the prospect of a 28nm process providing perfect GM200 parts? We know that GM204 had segments disabled, so the possibility that the Titan X also uses deactivated SPs, texture units, and/or memory crossbars is highly probable.
#63
radrok
Casecutter: The question is how many of these GM200 chips Nvidia is working with to fill corporate contracts, and when such an offering looks to be ready for HPC projects... GM200 isn't here just for gamers.

What do we think is the prospect of a 28nm process providing perfect GM200 parts? We know that GM204 had segments disabled, so the possibility that the Titan X also uses deactivated SPs, texture units, and/or memory crossbars is highly probable.
If it's 28nm, and I am almost 99% sure it is, the manufacturing node is very mature so it's possible that Nv has stellar yields.
#64
Casecutter
radrok: If it's 28nm, and I am almost 99% sure it is, the manufacturing node is very mature so it's possible that Nv has stellar yields.
That doesn't stand up to the conventional wisdom, though... GM204 had an issue within the L2 that caused them to open the "crossbar" to scavenge that usable memory controller. Each chip will present its own complications, no matter if the manufacturing process is up to snuff.
#65
HumanSmoke
Casecutter: The question is how many of these GM200 chips Nvidia is working with to fill corporate contracts, and when such an offering looks to be ready for HPC projects... GM200 isn't here just for gamers.
GM200 isn't for HPC; that has been stated repeatedly by Nvidia themselves. It is the reason that GK210 was run as a parallel development of GM200.
GM200 has minimal FP64 support; GK210 is a development of GK110.
GK110 has 1:3 rate FP64. GK210 improves that to 1:2.5 (along with doubled cache/registers over GK110), and is expected, at least initially until GP100/200 (and ultimately GV100/200) arrives, to power the DoE's Summit supercomputer.
Casecutter: What do we think is the prospect of a 28nm process providing perfect GM200 parts? We know that GM204 had segments disabled, so the possibility that the Titan X also uses deactivated SPs, texture units, and/or memory crossbars is highly probable.
No process provides 100% yield, but how do you arrive at "highly probable"? Of all the Maxwell cards in existence, only ONE, the 970, is affected by memory segmentation, and that looks to be a strategic decision rather than purely architectural. GM204 has five different SKUs associated with it and only one part is affected; GM107 has four SKUs associated with it and none are affected; GM108 has three SKUs, none affected; nor is GM206.

So you ascertain that the issue is "highly probable" based upon a one-in-thirteen occurrence. :slap: Although, bearing in mind your prognostication record, I'm very glad to see that you expect a neutered part with memory segmentation; it augurs well for the Titan X being a fully enabled part ;)
#66
Crap Daddy
Since Jen-Hsun has himself leaked the existence of the card, can we expect some leaked benchmarks from Epic's Tim Sweeney?
#67
HumanSmoke
Crap Daddy: Since Jen-Hsun has himself leaked the existence of the card, can we expect some leaked benchmarks from Epic's Tim Sweeney?
Turnabout is fair play, right? If Jen-Hsun can crash Epic's day, surely they can crash the Titan X's launch.

~Two weeks out from a launch, and no reliable benchmarks have yet surfaced. That might be a record.
#68
Prima.Vera
radrok: Apples-to-oranges comparison; this is a single-GPU solution, the 295X2 is dual-GPU.
It's irrelevant. We compare video cards, not GPUs. :)

You are probably too young to remember the 3dfx vs. {the rest of the world} times. ;) They were mostly producing video cards with 2, 3, and even 4 GPUs, competing with ATI, NVIDIA, Matrox, etc., and winning. And nobody cared whether it was multi-GPU or not. Just the performance.
#69
HumanSmoke
Prima.Vera: It's irrelevant. We compare video cards, not GPUs. :)
You are probably too young to remember the 3dfx vs. {the rest of the world} times. ;) They were mostly producing video cards with 2, 3, and even 4 GPUs, competing with ATI, NVIDIA, Matrox, etc., and winning. And nobody cared whether it was multi-GPU or not. Just the performance.
Not necessarily. Firstly, performance is a moving target with any multi-GPU setup: for every game that scales well, there are as many (if not more) that scale badly, or not at all, either through game coding or driver issues. Also, if you're old enough to remember the era (and it is a little condescending to imply radrok doesn't have knowledge of the era), the multi-GPU parts were stratospheric in price compared to any single GPU; even the relatively cheap Obsidian2 X-24 was roughly 2-3 times the price of any other flagship card, and the 4440V was $1900 when any other flagship card of the day (Matrox G200, Number Nine Revolution 3D, ATI Rage Fury/Xpert, Voodoo/Voodoo2) was around $300-350.



In any case, it is up to the individual user what they deem an appropriate trade-off. Dual-GPU cards, especially at the 295X2's current pricing, are appealing for some, but every time a new game arrives the onus is on the game dev and driver team to make sure CFX/SLI works as advertised, or the experience takes a nosedive. The 295X2 may also have longevity issues from what I've seen: while the GPUs remain nice and cool, the same can't be said for the voltage regulation, with heat radiating through the PCB across the whole card. Localized temperatures of 107°C can't be conducive to board integrity.
#70
radrok
Prima.Vera: It's irrelevant. We compare video cards, not GPUs. :)

You are probably too young to remember the 3dfx vs. {the rest of the world} times. ;) They were mostly producing video cards with 2, 3, and even 4 GPUs, competing with ATI, NVIDIA, Matrox, etc., and winning. And nobody cared whether it was multi-GPU or not. Just the performance.
How can it be irrelevant when, most of the time, the said "performance" tanks below single-GPU cards? I've witnessed that across many generations of dual GPUs; they work wonders for a few titles that get support and the rest are just left hanging.

The only instance in which I rely on dual-GPU setups is when existing single-GPU performance has been exhausted, e.g. I bought one Titan and it wasn't enough for what I was doing, so I bought another, but only because there wasn't a single GPU faster than the Titan that time around.

If you had as much experience as me with multi-GPU setups you'd change your mind, trust me.

AFR needs to be completely forgotten.
#71
Casecutter
HumanSmoke: So you ascertain that the issue is "highly probable" based upon a one-in-thirteen occurrence.
Well, it's the first I've heard that they intended the GM200 basically for gaming only... Yes, that changes things.

Sure, not all 970s have a bad L2, but they had to fuse one (of 14) off on every one to achieve parity across the volume they need to sell. Yes, a "strategic decision" that was arrived at because of the sheer volume of 970s they intended to market, which I could see outselling the other derivatives by a wide margin. I might go as far as to say that for every four (4) 970s sold, they probably sell one 980, and then perhaps one across the 3 other associated SKUs.

So, as I understand it now, the GM200 (as the Titan X) is standing on its own as the top part... so it's "highly probable" it won't be cut down, while the idea that lower derivatives might need to have memory crossbars enabled still has some veracity.