
NVIDIA GeForce Titan X 12 GB

It's all of us but we're wrong (in an abstract way).

Though we're forgetting how much of a non-jump the Titan was over GK104 (the GTX 680).

[Attachment: performance summary chart for the original GTX Titan]


GTX 680 had 70% the perf of Titan

[Attachment: performance summary chart for the Titan X]


GTX 980 has 77% the perf of Titan X.

It's undoubtedly not the jump over the 980 that the Titan was over the 680. And it lacks the DP, so it has much more limited appeal. And it's not doing too well with the cooler.

All in all, it's a floppity flop to me. I'm pretty sure a 980 Ti with the GM200 core, TDP-saving 6 GB of memory, and partner freedom will make for a 10-15% faster variant, which will be a good card to have.

I genuinely don't see how AMD can't top this in June. The 390X (by the above charts) only has to be 40% better than the 290X to be a better option than the Titan. Go team Red; happy to see what they can do.


EDIT:

Motherfucker. Cheapest is £870 at OcUK. Average is £900. EVGA HC is £1200

:roll::roll::roll::twitch:
If it's all hype and nothing more, they could. But the limited voltage on the card, the lower starting clocks than at least I expected, and the cap on cooling performance all suggest Nvidia is leaving room to release something much higher in the future. I believe the 390X will come out and beat it by a decent enough margin, everything blows up, and then a month or so later Nvidia releases the GTX 1080/980 Ti/whatever you want to call it, with much higher starting clocks to make up the deficit and a higher voltage allowed for those going for gold. Since the core is actually full-fledged (unless I have missed something), the only way up is clock speed and RAM speed, which to me they are holding back on purpose so as not to show their hand until they know what to expect.

Feel free to do the same to me, Sony, but from now on I'm 'ignoring' you. I cannot see your posts; you are dust to me.

Let the logical discussions continue.



For gaming I will upgrade once I see the 390X's performance. I want to go to 4K when Win10 is out (better desktop scaling, I believe?), so I kinda need >3 GB of VRAM. I just hope the 390X has a 980 Ti dueling partner so most of us have a good choice.
Join the club, I have been a part of it for a while :). It's much easier to ignore people and save the forums from constant arguments than to acknowledge it and turn a discussion off topic.
But yeah, RAM requirements have gone through the roof. We need two cards to duel it out, otherwise the prices will not change, though I am probably sitting this war out.
I would bet that the R9 390X will beat this Titan X. Nvidia didn't set the bar high enough this time, imo. Possibly the gaming-only version of GM200 with a non-reference cooler will slip past the 390X, but probably not by much if it does.
Man they stick it to you guys in the UK. £870 to £900 is $1,280 to $1,325 US.
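For reference, a trivial conversion sketch of those prices (the ~1.47 GBP/USD rate is back-calculated from the figures quoted above, not a live quote):

```python
# GBP -> USD at the rate implied by the post ($1,280 / £870 ~ 1.47).
GBP_USD = 1.47  # assumed from the quoted figures, not a live rate

for gbp in (870, 900, 1200):
    print(f"£{gbp} ~ ${gbp * GBP_USD:,.0f}")
# £870 ~ $1,279 ; £900 ~ $1,323 ; £1200 ~ $1,764
```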
Yeah, I am actually quite shocked. After really reading up on the overall consensus in the reviews, it seems it's a great card but not quite as strong as we were all predicting, including the limited overclocking headroom (considering the GTX 980 and its kind are such fantastic overclockers).

I like custom watercooling; it's AIOs on GPUs I dislike. It's a cop-out solution for a problem that shouldn't be there if a GPU is designed correctly and not just brute-forced. Plus, GPU AIOs are ugly as hell.
Meh, everyone likes different looks, and some AIOs for GPUs can look pretty slick, including the NZXT cooler (at least to me). Either way, at the thermal levels we are already at, fan-cooling systems are getting harder and harder to design. Even the Titan X hits its thermal limits on the Titan cooler that has been revered for so long. The problem is: how do we revise the cooler enough to handle these loads without causing issues for another card or other components? Blowing all (or at least most) GPU heat out of the case is still the way preferred by most, as we have seen what happens when that design is messed with (the HD 7990, for instance, even though that's a dual-GPU card).

Either way, I think we all agree this has become a waiting game to see what comes out and how the market changes with prices. I think patience is a virtue in this case.
 
Looking at it another way, what makes it a "Titan"... Nvidia's de-contenting of double-precision FP64 compute is what enabled the 26% advance in perf/watt over the original Titan while holding the die size to a moderate 50 mm² (9%) increase over the GK110; all respectable.

Interestingly, working from the GM204 (same DP de-contenting), the GM200 is right at 50% more GPU/die than the GM204, so $550 + 50% = $825. I reckon that moving up from a GM204 to a 50% bigger die, offering 23% more FPS and a lot of useless memory while only 10% less efficient, is still commendable.

I can see it being a fine enthusiast gaming card once it's at 6 GB and ~$700. When it shows up as a 980 Ti with a custom cooler that lets boost clocks run unfettered, we could see 10% more (roughly a 35% improvement over the 980). Then it would be a consumer offering. As the Titan X it is unequivocally not the prosumer card the original GTX Titan at least had going for it; today it's more of a collector's edition.

Edit: Looking at a couple of reviews today, I might want to rethink W1zzard's experience. It seems others aren't finding it hot, loud, or power hungry, while some are seeing +30% over a 980 and even good overclocking. Was this a runt of the litter?
 
I would bet that the R9 390X will beat this Titan X. Nvidia didn't set the bar high enough this time, imo. Possibly the gaming-only version of GM200 with a non-reference cooler will slip past the 390X, but probably not by much if it does.

Man they stick it to you guys in the UK. £870 to £900 is $1,280 to $1,325 US.

I thought this was the gaming-only version :confused:
I guess it's the 980 Ti; overclocked versions of it could hypothetically be faster than a stock 390X.
 
As the Titan X it is unequivocally not the prosumer card the original GTX Titan at least had going for it; today it's more of a collector's edition.
It will probably fare about the same. CG rendering is now aimed squarely at 4K and up, and most GPU rendering that actually works is CUDA-based. 6 GB of framebuffer seems like the minimum entry fee for these higher resolutions. I'd also note that, lost amongst the hue and cry of gaming forums, the card is front and centre in a new and burgeoning area of development: the deep-learning neural network that was intro'd to a wider audience at CES. Judging by the non-gaming parallel-computing forums, there is a fair bit of interest in the board from developers eager to get in on the ground floor of the technology. Much like GPGPU co-processing when it first arrived eight years ago, it will probably be dismissed as superfluous, until it becomes ubiquitous and a substantial area of revenue.
Looking at it another way, what makes it a "Titan"... Nvidia's de-contenting of double-precision FP64 compute is what enabled the 26% advance in perf/watt over the original Titan while holding the die size to a moderate 50 mm² (9%) increase over the GK110; all respectable.
Making full use of FP64, had it been included, would likely have required a GK200 GPU on the very cusp of manufacturability (if that), at around 650 mm². GK110 required another 50 mm² added to it (a doubling of register and cache resources), making the GK210 the same size as GM200 just to tailor it to double-precision workloads, and that is without adding anything in the way of improvements for gaming scenarios; this is what Tom Petersen was alluding to in the earlier video links.
Nvidia had to sacrifice FP64 for gaming performance and a manufacturable GPU, because power demand and process node are against shoehorning everything into a single die. It is also why the GK210 was tailored for FP64 work and won't be offered as a consumer GPU, since it adds little or nothing for gaming over GK110.
Interestingly, working from the GM204 (same DP de-contenting), the GM200 is right at 50% more GPU/die than the GM204, so $550 + 50% = $825.
Except that doesn't take into account wafer yield and the effects of silicon defects. Scaling a GM204-sized die up to a GM200-sized one loses roughly one third of the die candidates, and that's before accounting for the fact that the larger the die, the greater the loss of parts to defects.
[Attached image: wafer die-candidate illustration]


The die calculation is a quick approximation: 19.95 mm × 19.95 mm for 398 mm² (GM204), and 24.52 mm × 24.52 mm for 601 mm² (GM200).
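To put rough numbers on that point, here is a minimal sketch using the classic dies-per-wafer approximation plus a Poisson yield model. The 300 mm wafer is standard for this node, but the 0.1 defects/cm² density is a guessed illustrative figure, not real fab data:

```python
# Rough dies-per-wafer and defect-yield sketch for GM204 vs. GM200.
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic approximation: gross wafer area over die area, minus edge loss."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_cm2=0.1):
    """Poisson model: probability a die lands with zero defects."""
    return math.exp(-(die_area_mm2 / 100) * defects_per_cm2)

for name, area in (("GM204", 398), ("GM200", 601)):
    candidates = dies_per_wafer(area)
    good = candidates * poisson_yield(area)
    print(f"{name}: {candidates} candidates/wafer, ~{good:.0f} defect-free")
# GM204: 144 candidates/wafer, ~97 defect-free
# GM200: 90 candidates/wafer, ~49 defect-free
```

The candidate count alone drops by about a third, and folding in defects roughly halves the number of good GM200 dies per wafer, which is exactly the cost pressure being described.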
 


This is truly a tribute to excess. I want it now, but will probably never shell out for it either.

BTW: good review.
 
Very, very disappointing card overall.

No shet! If the GTX Titan X Maxwell has 50% more CUDA cores than a GTX 980 and it doesn't pull around 50% or more FPS over a GTX 980 at all resolutions, it's not worth it... The 12 GB framebuffer is the only thing worth paying for in the GTX Titan X Maxwell. It's overkill, but if you're into Surround or 4K, even though the GPU is going to struggle, you might be able to crank up the AA a little with the 12 GB of VRAM.
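A quick back-of-envelope sketch of why 50% more CUDA cores doesn't translate to 50% more FPS: raw throughput scales roughly with cores × clock, and the Titan X boosts lower than a GTX 980 (reference boost clocks below; real-world boost varies per sample):

```python
# Raw shader throughput ~ CUDA cores x clock. Reference boost clocks
# are used here as an approximation; actual boost varies card to card.
cards = {
    "GTX 980": (2048, 1216),   # cores, boost MHz
    "Titan X": (3072, 1075),
}

base = cards["GTX 980"][0] * cards["GTX 980"][1]
for name, (cores, mhz) in cards.items():
    print(f"{name}: {cores * mhz / base:.2f}x GTX 980 throughput")
# GTX 980: 1.00x
# Titan X: 1.33x  -- close to the ~30% gap the reviews measure
```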

In addition, a lot of people don't notice it, but the FPS performance of the R9 295X2 is basically what the R9 390X is going to pull, or better, at the TDP of an R9 290X, for half the price. What would really surprise me, though I doubt it will happen: I've seen some images claiming the R9 390X base clock is 1.4 GHz. If that's true, I'd be impressed...

So ty W1zzard for putting things into perspective... Wish you could provide benches of the GTX Titan X Maxwell in SLI, and some frame-time variance curves to see how she dips with G-Sync and without.
 
Looks like Inno3D is outfitting the Titan X with the hybrid cooler used on their GTX 980/970 models.

[Image: Inno3D hybrid cooler]
 
Looking at it another way, what makes it a "Titan"... Nvidia's de-contenting of double-precision FP64 compute is what enabled the 26% advance in perf/watt over the original Titan while holding the die size to a moderate 50 mm² (9%) increase over the GK110; all respectable.

Interestingly, working from the GM204 (same DP de-contenting), the GM200 is right at 50% more GPU/die than the GM204, so $550 + 50% = $825. I reckon that moving up from a GM204 to a 50% bigger die, offering 23% more FPS and a lot of useless memory while only 10% less efficient, is still commendable.

I can see it being a fine enthusiast gaming card once it's at 6 GB and ~$700. When it shows up as a 980 Ti with a custom cooler that lets boost clocks run unfettered, we could see 10% more (roughly a 35% improvement over the 980). Then it would be a consumer offering. As the Titan X it is unequivocally not the prosumer card the original GTX Titan at least had going for it; today it's more of a collector's edition.

Edit: Looking at a couple of reviews today, I might want to rethink W1zzard's experience. It seems others aren't finding it hot, loud, or power hungry, while some are seeing +30% over a 980 and even good overclocking. Was this a runt of the litter?
Did you mistype or am I missing something?
From the performance summary, we can see Titan X performance is considered 100% and GTX 980 performance is 77% at 1440p,
--> GTX 980 performance = 0.77 Titan X performance
--> Titan X performance = (1/0.77) GTX 980 performance ~ 1.2987 GTX 980 performance
That means Titan X offers ~29.87% more FPS at 1440p. So how the heck did you arrive at the 23% figure, dude?

In summary: at 1440p the GTX 980 is 23% slower than the Titan X if we take Titan X performance as the baseline, and the Titan X is ~29.87% faster than the GTX 980 if GTX 980 performance is the baseline. Why did you conflate the two baselines into Titan X performance only? Did you do that intentionally?
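The same arithmetic in a trivial sketch, using the review's 1440p summary numbers:

```python
# Two baselines, same data: Titan X normalized to 100%, GTX 980 at 77%.
titan_x, gtx980 = 100.0, 77.0

slower = (titan_x - gtx980) / titan_x   # baseline: Titan X
faster = (titan_x - gtx980) / gtx980    # baseline: GTX 980

print(f"GTX 980 is {slower:.2%} slower than Titan X")   # 23.00% slower
print(f"Titan X is {faster:.2%} faster than GTX 980")   # 29.87% faster
```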
 
I've just had that discussion and I agree as well: to be correct, the Titan X should be stated as approximately 29.87% faster than the 980. As for "intentionally" messing them up, I highly doubt that's the case.
 
Here's a deal for everyone:

If the R9 390X is 50% faster than the 290X and beats any GTX 980 Ti 6 GB variant by >10% (at 4K), I will give both my watercooled 780 Ti Classified cards away on the TPU forums.

Some of you will know I'm serious. :toast:

They would go nicely with my Core 2 Duo.
 
12 GB is good! I mean, Nvidia certainly won't be lowering the price of the Titan X even if they cut the VRAM down to 6 or 8 GB, so the extra VRAM translates directly into more value for the end user.
 
What in the world are you doing calling someone else a fanboy? Why are you even reading and posting on an Nvidia GPU thread? Silly. :rolleyes:

Because AMD hasn't had anything noteworthy in so long, they've got nothing better to do but troll Nvidia release comments.

The card will also launch with third-party air cooling (it uses slightly less power than the 290X, after all). Also, third-party coolers like the Vapor-X are FAR quieter than the Titan X, and in fact this very review points out that the Titan X isn't even much quieter than a reference 290X.

Can't say it uses less power than a 290X; there have been zero numbers on its TDP. Can't go by power connectors with AMD anymore, as they used 2x 8-pin to power a card that can pull over 600 watts. The fact that they did an AIO water cooler tells me the card is 300+ watts, maybe even in the 350-400 range if they had to go down the road of that kind of cooler. It's all speculation on power draw at this point. I doubt it will use less power than a 290X, though, given the increase in GCN core count.
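For context on the connector point, a minimal sketch of the PCIe spec power budget (spec figures only; as the 295X2 shows, actual draw can blow well past them):

```python
# PCIe power budget by spec: 75 W from the slot, 75 W per 6-pin plug,
# 150 W per 8-pin plug. The R9 295X2 draws around 500 W through a
# 2x 8-pin layout rated for 375 W, which is why connectors prove nothing.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def spec_budget(six_pin=0, eight_pin=0):
    return SLOT_W + six_pin * SIX_PIN_W + eight_pin * EIGHT_PIN_W

print(spec_budget(eight_pin=2))   # 375 -- the nominal 2x 8-pin limit
```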
 
Anyone who buys this card for gaming is still pretty dumb. This card is clearly meant for other purposes, but Nvidia is branding it so that gamers with too much money might still buy it. Just shows that GTX 970 SLI is still probably the ideal setup for 4K gaming at the moment.
 
GTX 680 had 70% the perf of Titan
GTX 980 has 77% the perf of Titan X.

It is worth remembering, though, that by the time the Titan launched, AMD had sorted the HD 7970's drivers and released the revised GHz BIOS. So the GTX 680 was dueling its main rival on performance while having less VRAM (something that has affected its lifespan), whereas the GTX 980 is beating its closest rival (the 290X) and has equal VRAM.

My point is that the Titan X may not be as impressive compared to the GTX 980 as the Titan was compared to the GTX 680, but that's mostly due to the GTX 980 being more impressive than the GTX 680 was.
 
Anybody get theirs yet? A guy I know bought one on release day and received it yesterday. He bought direct from Nvidia.
 
Did you mistype or am I missing something?
So how the heck did you arrive at the 23% figure, dude?
Why did you conflate the two baselines into Titan X performance only? Did you do that intentionally?
Thank you, Sunfire Cape. Sincerely :rockout:
Regrettably, I failed... in a flurry of typing the original post (#45) I just clicked back to see W1zzard's summary, and then failed to roll it up to the correct percentage. I later carried that same faux pas over to post #127, as I had referred back to post #45 for reference... by then it was locked in my head as the number. No excuse, no malice, and the corrected figure is more in line with other review data.

As to other reviews, it seems to me (as said above in the edit) that other reviewers aren't reporting the heat, noise, or even power numbers W1zzard's sample is producing. From the way W1zzard indicated these were "passed out", no single reviewer was destined for a given sample; it appears to have been luck of the draw. That said, I would still believe they were run through their paces just to ensure they sit at least within the normal distribution of the bell curve.


Except that doesn't take into account wafer yield and the effects of silicon defects. Scaling a GM204-sized die up to a GM200-sized one loses roughly one third of the die candidates, and that's before accounting for the fact that the larger the die, the greater the loss of parts to defects.
Excellent information, and you are wholeheartedly correct: the simplistic figure I included doesn't tell the entire story about how many candidates can be harvested, or the extra risk defects pose when physical size leaves a die with fewer candidates.
 
Jeepers, that's considerably worse. You blokes have a little more GST on goods over there too, don't you?
15%, although sometimes it feels like they charge GST on GST!
Wise man say buy from overseas vendor who hones their creative writing skills on customs declarations.
As to other reviews, it seems to me (as said above in the edit) that other reviewers aren't reporting the heat, noise, or even power numbers W1zzard's sample is producing.
It all depends upon the testing application, and if it is a game, what workload the card and CPU are under. Any throttling and/or CPU utilization will skew the result, as will the choice of game.
[Chart: cross-review benchmark comparison] [Source]
 
Aww yes, the cheapest Titan X here is €1,249.

Could have been OK with a €999 price; anything more and it's gonna stay where it is.
 
Aww yes, the cheapest Titan X here is €1,249.

Could have been OK with a €999 price; anything more and it's gonna stay where it is.
Same here: a TITAN X is 32,500 CZK or more, so that's about $1,275.
 
@Sony Xperia S

Yeah, Nvidia really knows how to milk the market and keep prices high. This is what you're calling "evil", I presume?

No!

Money is not the whole of it; it has never been only that. It's their many actions and, in general, their imperial attitude. It isn't so simple to explain.

Actually, they do not do anything special; it is just some strange symbiosis between them and people. Given that many people in general are bad, then.......

The difference with Nvidia is that they're pretty shrewd and have the best products on the market by far.

Which one would you prefer: the R9 295X2 for $624, or this new Titan X for double the price and still lower performance?

http://www.newegg.com/Product/Produ...&cm_re=radeon_r9_295x2-_-14-131-584-_-Product

Remember, the R9 295X2 is still the fastest graphics card, and the Titan X did nothing but close the gap.
 