Tuesday, March 17th 2015

NVIDIA Launches the GeForce GTX TITAN X

NVIDIA formally launched the GeForce GTX TITAN X, its flagship graphics card based on the "Maxwell" architecture, following its GDC 2015 unveiling. Based on the new 28 nm GM200 silicon, the GTX TITAN X packs 3,072 CUDA cores, 192 TMUs, 96 ROPs, and a 384-bit wide GDDR5 memory interface holding 12 GB of memory. With 50% more graphics processing muscle than its previous-generation flagship (7.1 TFLOP/s single-precision), the card retains the 250 W TDP rating of its predecessor. The GTX TITAN X will launch in reference design only, priced at US $999.
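For context, the headline TFLOP/s figure falls straight out of the shader count and clock speed: each CUDA core can retire two single-precision FLOPs per cycle (one fused multiply-add). A quick sketch, assuming NVIDIA's published reference clocks of 1,000 MHz base and 1,075 MHz boost:

```python
# Theoretical SP throughput = cores x 2 FLOPs (one FMA) x clock.
# Clock values below are the published reference specs, not measured.
cuda_cores = 3072
for label, clock_ghz in [("base", 1.000), ("boost", 1.075)]:
    tflops = cuda_cores * 2 * clock_ghz / 1000
    print(f"{label}: {tflops:.2f} TFLOP/s")  # base: 6.14, boost: 6.60
# The quoted 7.1 TFLOP/s figure implies a peak clock of roughly 1.15 GHz.
```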

56 Comments on NVIDIA Launches the GeForce GTX TITAN X

#26
rtwjunkie
PC Gaming Enthusiast
renz496: So no DP. For what reason is this thing priced at $1k?
Because they can. It is now their premier card. They will charge an iconic price for what they view as their iconic product.
Ultimately, they are charging what their research tells them people are willing to pay, and that is the true price of a product.
#27
the54thvoid
Super Intoxicated Moderator
64K: I hope that doesn't turn out to be true, but if it is then it may mean that Pascal isn't too far off, in which case I will wait on Pascal. $1,000 for a single gaming GPU isn't for me.
The vague roadmap looks like Pascal is 2016 onwards. That's an easy slip into 2017 if the process node isn't on track. I'd say a year to go for Pascal at least, but that's not even an educated guess.

EDIT: Techspot review is out... Disappointing.
#29
Rahmat Sofyan
The TechSpot review is already out:

"The Titan X was also 47% faster than the R9 290X on average at 2560x1600 while it was just 36% faster at 3840x2160"

&

"While the Titan X was just 8% slower than the R9 295X2 at 2560x1600, it was 22% slower at 3840x2160"

Mehhh card, then, for $1000...

Can't wait for W1zz's review...
#30
radrok
Day one buy for me, call me a sucker :p

This thing is gonna do work under water :toast:
#31
erocker
*
ironwolf: From the last page it sounds like the price is what is getting poo-poo'd pretty hard.
As it should be. Even if money was no object for me, I'd still wait for a 980ti that will most likely deliver the same, if not better performance for less cash in what... 3 months maybe? They're just going after the "gotta have it now, spend without actually thinking" crowd. There's plenty of those kind of folks to buy them... I've been there before.
#32
64K
erocker: As it should be. Even if money was no object for me, I'd still wait for a 980ti that will most likely deliver the same, if not better performance for less cash in what... 3 months maybe? They're just going after the "gotta have it now, spend without actually thinking" crowd. There's plenty of those kind of folks to buy them.
Plus it will give Nvidia some time to improve the drivers for the GM200.
#33
radrok
64K: Plus it will give Nvidia some time to improve the drivers for the GM200.
Same architecture as GM204; there isn't gonna be THAT much improvement. Kepler was an example of this.

GM200 is just a bigger chip built on an architecture that's been out for a while.
erocker: As it should be. Even if money was no object for me, I'd still wait for a 980ti that will most likely deliver the same, if not better performance for less cash in what... 3 months maybe? They're just going after the "gotta have it now, spend without actually thinking" crowd. There's plenty of those kind of folks to buy them.
I partially agree with you, but I'm here to advocate for the crowd that uses this card for things that aren't strictly gaming.

When I load up a scene in Octane to render with CUDA, I'm limited by my GPU's VRAM.

This 12 GB IS gonna be a BOON for people like me who work with GPU renderers.

The price is steep, but it's gonna pay for itself in a month for me.
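(For anyone in the same boat: a quick way to see how much VRAM a scene is actually consuming is to query the driver while a render is running. A minimal sketch using the standard nvidia-smi tool; the CSV output format can vary by driver version:)

```python
# Query total/used VRAM through nvidia-smi during a render.
# Assumes nvidia-smi is on PATH and a single GPU (one CSV line per GPU otherwise).
import subprocess

out = subprocess.check_output(
    ["nvidia-smi", "--query-gpu=memory.total,memory.used",
     "--format=csv,noheader,nounits"], text=True)
total_mib, used_mib = map(int, out.strip().split(", "))
print(f"VRAM: {used_mib} / {total_mib} MiB used")
```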
#34
the54thvoid
Super Intoxicated Moderator
erocker: As it should be. Even if money was no object for me, I'd still wait for a 980ti that will most likely deliver the same, if not better performance for less cash in what... 3 months maybe? They're just going after the "gotta have it now, spend without actually thinking" crowd. There's plenty of those kind of folks to buy them.
Yeah, not gonna be me this time.
#35
Ikaruga
This is a monster, and I want one!
#36
64K
Review is up here.
#37
xorbe
Looking at the perf @ the typical 2560x1440 ... 100/77 -> +29.87% increase. Somewhere here on TPU my WAG was +30%! :cool: aha!
Meh, probably 10-50% faster depending on the game, with an average of 30%.
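(That's just the normalized review scores inverted; a one-line sanity check, assuming 100 is the Titan X and 77 the card it's measured against:)

```python
# Relative speedup from normalized performance scores (Titan X = 100, baseline = 77).
titan_x, baseline = 100, 77
print(f"+{(titan_x / baseline - 1) * 100:.2f}%")  # prints +29.87%
```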
#38
minorisprit
荷兰大母猪: For this price I will wait for the 980 Ti (maybe) or AMD 390X. I would never buy a GTX Titan series card at this crazy price with its useless memory.
Why can I see you here too?
#39
qubit
Overclocked quantum bit
64K: I think W1zzard posted a few days ago that he was already benching one but had to wait on the NDA to lift.
Thanks. I googled for a review and only Anandtech's came up, but alas, it had been pulled. :/

When I asked my question I thought it was an official launch, but it seems it wasn't.
#40
NC37
And AMD still has months to go before the 300 series. Wonderful timing on those "leaks," AMD. Right before the Titan launches, juicy details of the 390 emerge.
#41
Xzibit
Titan brand gets gimped...
PCPerspective: During the keynote at GTC, NVIDIA's CEO quoted the single precision compute performance as 7.0 TFLOPS, which differs from the table above. The 6.14 TFLOPS rating above is based on the base clock of the GPU while the 7.0 TFLOPS number is based on "peak" clock rate. Also, just for reference, at the rated Boost clock the Titan X is rated at 6.60 TFLOPS.

A unique characteristic of this TITAN X card is that it does not have an accelerated performance configuration for double precision computing, which is something that both the TITAN and the TITAN Black had before it. The double precision performance is still a 1/32nd ratio (relative to single precision). That gives the TITAN X DP compute capability of just 192 GFLOPS. For reference, the TITAN Black has DP performance rated at 1707 GFLOPS with a 1/3rd ratio of the GPU's 5.12 TFLOPS single precision capability. It appears that NVIDIA is not simply disabling the double precision compute capability on the GM200 GPU, hiding it and saving it for another implementation. Based on the die size, shader count and transistor count, it looks like GM200 just doesn't have it.
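(Those ratios check out; a quick sketch of the arithmetic, using the published base-clock SP figures:)

```python
# DP throughput derived from the quoted DP:SP ratios (published base-clock SP figures).
cards = {"TITAN X (GM200)":     (6144, 1 / 32),  # SP GFLOPS, DP:SP ratio
         "TITAN Black (GK110)": (5121, 1 / 3)}
for name, (sp_gflops, ratio) in cards.items():
    print(f"{name}: {sp_gflops * ratio:.0f} GFLOPS DP")
# TITAN X: 192 GFLOPS DP; TITAN Black: 1707 GFLOPS DP
```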
TechPowerUp: Idle temperatures are fine; idle-fan-off would certainly have been a possibility. Under load the card reaches its thermal limit of 84°C after a minute of gaming or so, which will cause it to lower Boost clocks, in effect reducing performance.
Reference coolers aren't enough now. If it acts like that under DX9-11, imagine under DX12 when the GPU gets taxed more. Throttle and microstutter galore.

This might be Nvidia's version of AMD's 290X Hawaii reference cooler.
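The behaviour TPU describes is GPU Boost holding its top clock until the die hits the temperature target, then stepping clocks down to stay there. A toy model of that control loop, purely illustrative (NVIDIA's actual algorithm, step size, and bins are more sophisticated):

```python
# Toy model of temperature-target throttling (illustrative only; not
# NVIDIA's actual GPU Boost control loop, and the step size is made up).
def boost_clock_mhz(temp_c, base=1000, boost=1075, target_c=84, mhz_per_deg=13):
    if temp_c < target_c:
        return boost                              # below target: full boost
    over = temp_c - target_c + 1
    return max(base, boost - over * mhz_per_deg)  # step down, floor at base clock

for t in (75, 84, 88, 92):
    print(f"{t}°C -> {boost_clock_mhz(t)} MHz")
```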
#42
Captain_Tom
Mathragh: Well, it certainly doesn't bode that well if they lift the NDA substantially later than their official launch. Scarily in line with the stuff major game publishers are pulling on us with game launches, where the reviews are also NDA'd for quite a while after the launch, trying to hide all the flaws still in the game from wannabe early adopters.

I was watching the live feed of the presentation, and I believe in the second sentence after he showed the Titan X he was already kind of apologizing for its weak DP math (only 0.7 TFLOPS). I can certainly understand that if they really meant this as a gaming card (even though the name Titan in that case is more than a little misleading). However, they then continued the rest of their presentation with only a very small graphics demo, filling the rest with cool stories about all the scientific and productivity math this thing can do. I'm kinda missing their true focus there (apart from marketing this card as hard as possible, ofc).
Titans were always gaming cards. Now they're just dropping the BS and doubling down on their real purpose.
#43
the54thvoid
Super Intoxicated Moderator
Xzibit: Titan brand gets gimped... Reference coolers aren't enough now. If it acts like that under DX9-11, imagine under DX12 when the GPU gets taxed more. Throttle and microstutter galore. This might be Nvidia's version of AMD's 290X Hawaii reference cooler.
Pretty much. Anand has this:

[Anandtech noise chart]

and says this:
The 55dB noise levels that result, though not extreme, also mean that GTX Titan X is drifting farther away from being a quiet card. Ultimately it's a pretty straightforward tradeoff for a further 16%+ increase in performance, but a tradeoff nonetheless.
You get this though:

[Anandtech performance chart]

Almost double the performance of a 290X for less noise, but at the same time 980 SLI draws the same power as the OC'd Titan X and gives better performance.

[Anandtech power chart]

Pretty sure a 6GB version will do better for less.

Apologies for using Anandtech charts - they fit the bill for what I'm discussing.
#44
TheHunter
Xzibit: Titan brand gets gimped... Reference coolers aren't enough now. If it acts like that under DX9-11, imagine under DX12 when the GPU gets taxed more. Throttle and microstutter galore. This might be Nvidia's version of AMD's 290X Hawaii reference cooler.
Yep, I knew they weren't bluffing at that Japan press conference back on December 31, 2014:

no DP for GM200 - it will continue to be GK110-exclusive; Pascal will have it again.
www.kitguru.net/components/graphic-cards/anton-shilov/nvidia-to-speed-up-development-of-graphics-processing-architectures/

Makes you wonder how they can get away with it and charge a BS premium for useless 12 GB of VRAM.
#45
AsRock
TPU addict
the54thvoid: Pretty much. Anand has this... Almost double the performance of a 290X for less noise, but at the same time 980 SLI draws the same power as the OC'd Titan X and gives better performance. Pretty sure a 6GB version will do better for less. Apologies for using Anandtech charts - they fit the bill for what I'm discussing.
And still, after a year of having my 290X, it's never used anywhere near that much power in a game, even with vsync off. People take power consumption a little too seriously.

And don't forget a new 290X costs about 2.5 times less and most likely has a much better third-party cooler to boot.

I do like the look of the card itself, but maybe there are enough fools to buy this at $1k in the hope that it will lower prices on the lower end. But then, we're talking about NVIDIA... one can hope.
#46
Captain_Tom
Thing should have been 1200 MHz+ at stock.
#47
qubit
Overclocked quantum bit
W1zzard: NDA in ~1 hour... patience people
You do know that my hands were trembling uncontrollably waiting for the review, don't you? :p

Ok, maybe not quite, but I was a bit on tenterhooks, lol.
#48
荷兰大母猪
minorisprit: Why can I see you here too?
What's the matter...
#49
Mindweaver
Moderato®™
说英语的家伙!<--- Translation = Speak English guys!
#50
AsRock
TPU addict
Mindweaver: 说英语的家伙! <--- Translation = Speak English guys!
Google renders the first one as "This is all so why you can see" and the response as "What the".

Well, that's what Google says, HAHAHA.

Yes, it would be nice, as people don't like to be taken out of their element.