NVIDIA Launches the GeForce GTX TITAN X

I hope that doesn't turn out to be true, but if it is, it may mean Pascal isn't too far off, in which case I'll wait for Pascal. $1,000 for a single gaming GPU isn't for me.

0.2 TFLOPS DP, according to NVIDIA. There's room for a 6 GB version of this card (GTX 980 Ti), but I think NVIDIA will wait for AMD's R9 390X first (milking mode on, no point in cannibalizing its own sales).
 
So no DP. Then why is this thing priced at $1K?

Because they can. It is now their premier card. They will charge an iconic price for what they view as their iconic product.
Finally, they are charging what their research tells them people are willing to pay, and that is the true price of any product.
 
I hope that doesn't turn out to be true, but if it is, it may mean Pascal isn't too far off, in which case I'll wait for Pascal. $1,000 for a single gaming GPU isn't for me.

The vague roadmap puts Pascal at 2016 onwards, and that could easily slip into 2017 if the process node isn't on track. I'd say at least a year to go for Pascal, but that's not even an educated guess.

EDIT: Techspot review is out... Disappointing.
 
TechSpot's review is already out:

"The Titan X was also 47% faster than the R9 290X on average at 2560x1600 while it was just 36% faster at 3840x2160"

&

"While the Titan X was just 8% slower than the R9 295X2 at 2560x1600, it was 22% slower at 3840x2160"

Mehhh card then for $1000...

Can't wait for W1zz's review..
 
Day-one buy for me, call me a sucker :p

This thing is gonna do work under water :toast:
 
From the last page it sounds like the price is what is getting poo-poo'd pretty hard.
As it should be. Even if money was no object for me, I'd still wait for a 980ti that will most likely deliver the same, if not better performance for less cash in what... 3 months maybe? They're just going after the "gotta have it now, spend without actually thinking" crowd. There's plenty of those kind of folks to buy them... I've been there before.
 
As it should be. Even if money was no object for me, I'd still wait for a 980ti that will most likely deliver the same, if not better performance for less cash in what... 3 months maybe? They're just going after the "gotta have it now, spend without actually thinking" crowd. There's plenty of those kind of folks to buy them.

Plus it will give Nvidia some time to improve the drivers for the GM200.
 
Plus it will give Nvidia some time to improve the drivers for the GM200.

Same architecture as GM204, so there isn't gonna be THAT much improvement; Kepler was an example of that.

GM200 is just a bigger chip built on an architecture that's been out for a while.

As it should be. Even if money was no object for me, I'd still wait for a 980ti that will most likely deliver the same, if not better performance for less cash in what... 3 months maybe? They're just going after the "gotta have it now, spend without actually thinking" crowd. There's plenty of those kind of folks to buy them.

I partially agree with you, but I'm here to advocate for the crowd that uses this thing for more than just gaming.

When I load up a scene in Octane to render with CUDA, I am limited by my GPU's VRAM.

This 12 GB IS gonna be a BOON for people like me who work with GPU renderers.

The price is steep, but it's gonna pay for itself in a month for me.
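
For anyone wondering why VRAM is the hard limit here: GPU renderers like Octane keep the whole scene (geometry, textures, acceleration structures) resident in device memory, so the card's memory size directly caps how much you can render. Here's a minimal sketch of how a CUDA program can check that budget before uploading anything; it's purely illustrative, using the generic CUDA runtime API rather than anything from Octane itself:

#include <stdio.h>
#include <cuda_runtime.h>

int main(void)
{
    size_t free_bytes = 0, total_bytes = 0;

    /* Ask the CUDA runtime how much device memory is currently available. */
    if (cudaMemGetInfo(&free_bytes, &total_bytes) != cudaSuccess) {
        fprintf(stderr, "No usable CUDA device found\n");
        return 1;
    }

    printf("VRAM: %.2f GB free of %.2f GB total\n",
           free_bytes / 1e9, total_bytes / 1e9);

    /* A GPU renderer would compare its scene footprint against free_bytes
       before uploading; on a 12 GB Titan X that budget is simply much larger. */
    return 0;
}

Build with something like "nvcc vram_check.cu -o vram_check" (the file name is made up; cudaMemGetInfo is the real runtime call).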
 
As it should be. Even if money was no object for me, I'd still wait for a 980ti that will most likely deliver the same, if not better performance for less cash in what... 3 months maybe? They're just going after the "gotta have it now, spend without actually thinking" crowd. There's plenty of those kind of folks to buy them.

Yeah, not gonna be me this time.
 
Review is up here.
 
Looking at the perf @ typical 2560x1400 ... 100/77 -> +29.87% increase. Somewhere here on TPU my WAG was +30%! :cool: aha!

Meh, probably 10-50% faster with an average of 30%.
 
I think W1zzard posted a few days ago that he was already benching one but had to wait on the NDA to lift.
Thanks. I googled for a review and the only one that came up was AnandTech's, but alas it had been pulled. :/

When I asked my question I thought it was an official launch, but it seems it wasn't.
 
And AMD still has months to go before the 300 series. Wonderful timing on those "leaks", AMD. Right before Titan launches, juicy details of the 390 emerge.
 
Titan brand gets gimped...

PCPerspective said:
During the keynote at GTC, NVIDIA's CEO quoted the single precision compute performance as 7.0 TFLOPS, which differs from the table above. The 6.14 TFLOPS rating above is based on the base clock of the GPU while the 7.0 TFLOPS number is based on "peak" clock rate. Also, just for reference, at the rated Boost clock the Titan X is rated at 6.60 TFLOPS.

A unique characteristic of this TITAN X card is that it does not have an accelerated performance configuration for double precision computing, which is something that both the TITAN and the TITAN Black had before it. The double precision performance is still a 1/32nd ratio (relative to single precision). That gives the TITAN X DP compute capability of just 192 GFLOPS. For reference, the TITAN Black has DP performance rated at 1707 GFLOPS with a 1/3rd ratio of the GPU's 5.12 TFLOPS single precision capability. It appears that NVIDIA is not simply disabling the double precision compute capability on the GM200 GPU, hiding it and saving it for another implementation. Based on the die size, shader count and transistor count, it looks like GM200 just doesn't have it.
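
Those numbers are easy to sanity-check: single-precision throughput is shader count × 2 FLOPs per clock (one FMA) × clock speed, and on GM200 double precision is 1/32 of that. A quick back-of-the-envelope sketch, assuming the card's published 3072 shaders and 1000/1075 MHz base/boost clocks:

#include <stdio.h>

int main(void)
{
    const double cores        = 3072.0;  /* GM200 CUDA cores (published spec)    */
    const double flops_per_ck = 2.0;     /* one FMA = 2 FLOPs per core per clock */
    const double base_hz      = 1000e6;  /* 1000 MHz base clock                  */
    const double boost_hz     = 1075e6;  /* 1075 MHz rated boost clock           */

    double sp_base  = cores * flops_per_ck * base_hz;   /* ~6.14 TFLOPS */
    double sp_boost = cores * flops_per_ck * boost_hz;  /* ~6.60 TFLOPS */
    double dp_base  = sp_base / 32.0;                   /* 1/32 ratio -> ~192 GFLOPS */

    /* The keynote's 7.0 TFLOPS figure implies a "peak" clock of roughly 1140 MHz. */
    double peak_hz  = 7.0e12 / (cores * flops_per_ck);

    printf("SP @ base : %.2f TFLOPS\n", sp_base  / 1e12);
    printf("SP @ boost: %.2f TFLOPS\n", sp_boost / 1e12);
    printf("DP (1/32) : %.0f GFLOPS\n", dp_base  / 1e9);
    printf("Implied peak clock for 7.0 TFLOPS: %.0f MHz\n", peak_hz / 1e6);
    return 0;
}

The same math for the Titan Black (2880 shaders at an 889 MHz base clock) gives ~5.12 TFLOPS single precision, and the 1/3 ratio yields the ~1707 GFLOPS DP quoted above.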

TechPowerUp said:
Idle temperatures are fine; idle-fan-off would have certainly been a possibility. Under load the card reaches its thermal limit of 84°C after a minute of gaming or so, which will cause it to lower boost clocks, in effect reducing performance.

Reference coolers aren't enough now. If it acts like that under DX9-11, imagine DX12 when the GPU gets taxed more. Throttling and microstutter galore.

This might be Nvidia's version of AMD's 290X Hawaii reference cooler.
 
Well, it certainly doesn't bode well if they lift the NDA substantially later than the official launch.
It's scarily in line with what major game publishers are pulling on us with game launches, where reviews are also under NDA for quite a while after launch, trying to hide all the flaws still in the game from wannabe early adopters.

I was watching the live feed of the presentation, and I believe in the second sentence after he showed the Titan X he was already kind of apologizing for its weak DP math (only 0.7 TFLOPS). I can certainly understand that if they really meant this as a gaming card (even though the name Titan in that case is more than a little misleading). However, they then continued the rest of their presentation with only a very small graphics demo, filling the rest with cool stories about all the scientific and productivity math this thing can do.
I'm kinda missing their true focus there (apart from marketing this card as hard as possible, of course).

Titans were always gaming cards. Now they are just dropping the BS and doubling down on their real purpose.
 
Titan brand gets gimped...

Reference coolers aren't enough now. If it acts like that under DX9-11, imagine DX12 when the GPU gets taxed more. Throttling and microstutter galore.

This might be Nvidia's version of AMD's 290X Hawaii reference cooler.

Pretty much. Anand has this:

[AnandTech chart: 72561.png]

and says this:

The 55dB noise levels that result, though not extreme, also mean that GTX Titan X is drifting farther away from being a quiet card. Ultimately it’s a pretty straightforward tradeoff for a further 16%+ increase in performance, but a tradeoff nonetheless.

You get this though:

[AnandTech chart: 72553.png]


Almost double the performance of a 290X for less noise, but at the same time 980 SLI draws the same power as the OC Titan X and gives better performance.

[AnandTech chart: 72557.png]


Pretty sure a 6GB version will do better for less.

Apologies for using AnandTech charts; they fit the bill for what I'm discussing.
 
Titan brand gets gimped...

Reference coolers aren't enough now. If it acts like that under DX9-11, imagine DX12 when the GPU gets taxed more. Throttling and microstutter galore.

This might be Nvidia's version of AMD's 290X Hawaii reference cooler.
Yep, I knew they weren't bluffing at that Japan press conference back on 31 December 2014..

No DP for GM200; it will continue to be GK110-exclusive, and Pascal will have it again.
http://www.kitguru.net/components/g...lopment-of-graphics-processing-architectures/


Makes you wonder how they can get away with it and charge a BS premium for a useless 12 GB of VRAM.
 
Almost double the performance of a 290X for less noise, but at the same time 980 SLI draws the same power as the OC Titan X and gives better performance. Pretty sure a 6GB version will do better for less.

And still, after a year of having my 290X, it's never used anywhere near that much power in a game, even with vsync off. People take power consumption a little too seriously.

And don't forget a new 290X costs about 2.5 times less and most likely has a much better third-party cooler to boot.

I do like the look of the card itself, but maybe there are enough fools to buy this at $1K in the hope that it will lower prices on the lower end. Then again, we're talking about NVIDIA... one can hope.
 
NDA in ~1 hour... patience people
You do know that my hands were trembling uncontrollably waiting for the review don't you? :p

Ok, maybe not quite, but I was a bit on tenterhooks, lol.
 
说英语的家伙!<--- Translation = Speak English guys!
 