# NVIDIA GeForce GTX Titan X 12 GB



## W1zzard (Mar 13, 2015)

Today, NVIDIA launches the GeForce GTX Titan X, their fastest single-GPU card yet. It is built on the brand-new, fully unlocked GM200 silicon with 3072 shaders and a massive 12 GB of VRAM. Thanks to the energy-efficient Maxwell architecture, this $999 card only consumes around 225 watts during typical gaming.

*Show full review*


----------



## Ikaruga (Mar 17, 2015)

Many thanks dear good sir


----------



## radrok (Mar 17, 2015)

Thank you W1zzard, thoroughly reading right now.


----------



## BiggieShady (Mar 17, 2015)

I don't always use memes, but when I do, it's because I have to.


----------



## natr0n (Mar 17, 2015)

Seems like NVIDIA "downloaded" too much RAM for that GPU.


----------



## RCoon (Mar 17, 2015)

Hmm, still not quite 4K ready. ~40 FPS isn't bad, but not quite there.


----------



## Tsukiyomi91 (Mar 17, 2015)

Killer $1000 single-GPU pixel pusher. It blew through all resolutions from 1080p up; superb. 12 GB is a little too much for today's games, but it handles them just fine. Could last a good 6 years. All in all it's good, but to save money, a pair of GTX 970s is decent and saves you roughly $300+, which you could put toward a large-capacity, super-fast SSD or a higher-rated 80+ Gold PSU. 4K handling, on the other hand, is OK, but better with AA turned off and settings at High if you want a decent FPS count.


----------



## radrok (Mar 17, 2015)

Alright, it's a bit on the hot side, but it's going under a waterblock on day one. Power delivery is still a bit scarcer than what I would have liked to see from a $999 GPU.

Performance is there, and it's gonna be even better when overclocked; let's also hope for some voltage-increase allowance.

As you said, RCoon, the GPU is not quite 4K ready, but it's gonna be enough for single-GPU gaming at 3440x1440 with an overclock.


----------



## Ikaruga (Mar 17, 2015)

RCoon said:


> Hmm, still not quite 4K ready. ~40 FPS isn't bad, but not quite there.


Indeed, you still can't even play most games at 120 or 144 FPS at 1080p if you max out everything, let alone at higher resolutions.


----------



## the54thvoid (Mar 17, 2015)

Well, as much as my cards pull way more power from the wall, the performance drop is considerable at too high a price.
I'll wait till June for sure. Was hoping for 50% above 980 performance so I'm actually disappointed. Well, my wallet stays healthy for at least 3 more months.


----------



## 64K (Mar 17, 2015)

At my resolution (1440p) it performs very well. Still going to wait on the gaming only card with 6 GB and non-reference coolers. I don't like that it throttles after 1 minute of gaming because it hits 84 degrees. From a value perspective at my resolution a pair of GTX 970s ($700) or an R9 295X2 ($700) beats it by a little so $300 more is overpriced. Two of the gaming only version of GM200 should handle 4K very well. It's one more step to 4K on a single GPU.


----------



## Luka KLLP (Mar 17, 2015)

That power consumption though... 

Btw, has anyone else noticed that W1zzard has a 980 Ti in his system specs?


----------



## W1zzard (Mar 17, 2015)

Luka KLLP said:


> Btw, has anyone else noticed that W1zzard has a 980 Ti in his system specs?


lol fail .. I was playing with the System Specs feature (to add mouse and keyboard) and forgot to change it back


----------



## jabbadap (Mar 17, 2015)

Great review as always and thank you for this:


Spoiler: *(graph)*
Quite a hilarious graph. Do you have a pair of them? Would be nice to see an SLI review at 4K.


----------



## Sasqui (Mar 17, 2015)

Surprisingly mediocre for the money, but it takes the crown for the time being.


----------



## W1zzard (Mar 17, 2015)

jabbadap said:


> Do you have a pair of them


I wish


----------



## GhostRyder (Mar 17, 2015)

Excellent review @W1zzard, glad to see this card lives up to its reputation, especially with the performance gains it shows!

I am a tad disappointed by the overclocking though. I mean, the RAM overclocks like a dream (that memory controller is great!), but the core clocks are not as good as I was hoping/expecting.

Overall though, nice card!


----------



## radrok (Mar 17, 2015)

^In another review, from a site which I will not mention (don't know if I'm allowed to here), the GPU boosted up to 1450 MHz.

It's hugely down to silicon variance, as always...


----------



## W1zzard (Mar 17, 2015)

radrok said:


> ^On another review, which I will not mention the site (don't know if I'm allowed here) the GPU boosted up to 1450 MHz.
> 
> It's hugely down to silicon variance, as always...


It's alright if you mention the site, as long as you don't promote something for personal gain.

Absolutely no way my card could boost up to 1450 MHz at stock voltage. It seems the variation is bigger with this release. EU samples were randomized; I saw it with my own eyes, everybody just got a random card off the pile. With the original Titan, everybody got "their" sample, which had special markings to identify leaks.


----------



## hardcore_gamer (Mar 17, 2015)

Still can't play Crysis at 4K, even without AA. So we can keep trolling forums with "Can it play Crysis?" till Pascal comes out.


----------



## mroofie (Mar 17, 2015)

Seems to be a crappy card for the price


----------



## DeNeDe (Mar 17, 2015)

For a price of ~$1000, it falls awfully short of 970 SLI configs :|


----------



## HumanSmoke (Mar 17, 2015)

the54thvoid said:


> Well, as much as my cards pull way more power from the wall, the performance drop is considerable at too high a price.
> I'll wait till June for sure. Was hoping for 50% above 980 performance so I'm actually disappointed. Well, my wallet stays healthy for at least 3 more months.


Seconded. Judging by the power consumption, the card is on a very tight leash. I think I'm with you: 390X or 980 Ti Classified, and hopefully the vendors allow both cards and the user to be unwrapped from their cotton wool.


----------



## jabbadap (Mar 17, 2015)

hardcore_gamer said:


> Still can't play Crysis at 4K, even without AA. So we can keep trolling forums with "Can it play Crysis?" till Pascal comes out.



Forget Crysis, look at this:
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_X/9.html

It almost runs it at full HD.


----------



## Rahmat Sofyan (Mar 17, 2015)

Still... Whyyyyyyyyyyyyyyyyyyyyyyy no backplate like the GTX 980? Does it run too hot, W1zz?


----------



## 64K (Mar 17, 2015)

jabbadap said:


> Forget the crysis, look at this:
> http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_X/9.html
> 
> Almost runs it on full hd



That game is an Ubislop mess of a PC port.


----------



## the54thvoid (Mar 17, 2015)

This is interesting.

Hexus gets a 20% OC boost in performance.... No voltage and similar cocks to W1zzard.

Not the best game but an idea of performance.

*(chart)*

then there's this:

*(chart)*
----------



## hardcore_gamer (Mar 17, 2015)

This looks like a gaming card as double precision is not fully enabled.  $1000 is way too much for a gaming card.


----------



## btarunr (Mar 17, 2015)

W1zzard said:


> lol fail .. I was playing with the System Specs feature (to add mouse and keyboard) and forgot to change it back



Yup, that's totally what happened. What he said.


----------



## GreiverBlade (Mar 17, 2015)

Oh well, not the card that will make me ditch my 290... same as the 980: the price is presumptuous for what it gives over that one. I still get 80+ FPS in most of my games (on high/ultra). Let's wait for the 390X to see what the next move will be (or even the 380X; I suspect AMD will just refine the 290X and turn it into the 380X, because it can still compete).


----------



## newtekie1 (Mar 17, 2015)

I'm so pleased you included 970 SLI in the performance charts.  I'm also so glad I went with SLI 970s...


----------



## radrok (Mar 17, 2015)

newtekie1 said:


> I'm so pleased you included 970 SLI in the performance charts.  I'm also so glad I went with SLI 970s...



Yep, that's a solid alternative.


----------



## jabbadap (Mar 17, 2015)

Now this is something:
http://www.evga.com/Products/Specs/GPU.aspx?pn=4FE6AF60-D2CD-45BE-B32A-5E2907099334

Btw. any word if there will be other AIB custom versions of this?


----------



## xorbe (Mar 17, 2015)

jabbadap said:


> Now this is something:
> http://www.evga.com/Products/Specs/GPU.aspx?pn=4FE6AF60-D2CD-45BE-B32A-5E2907099334
> 
> Btw. any word if there will be other AIB custom versions of this?



Wow, non-stock Titan X cards this round.


----------



## radrok (Mar 17, 2015)

xorbe said:


> Wow, non-stock Titan X cards this round.



I wouldn't get my hopes up; EVGA was allowed to use the REFERENCE PCB to make a Hydro Copper version with the original Titan too.

It will probably be the only instance of a non-reference-cooled Titan X.

I HOPE I'm proven wrong though.


----------



## jabbadap (Mar 17, 2015)

64K said:


> That game is an Ubislop mess of a PC port.



Yeah, I agree, and it's equally messy on consoles too. I wonder if that one will ever be fixed. I certainly hope so, because the graphics fidelity in that game is quite gorgeous.


----------



## alwayssts (Mar 17, 2015)

RCoon said:


> Hmm, still not quite 4K ready. ~40 FPS isn't bad, but not quite there.



Like I keep saying...this gen is 1440p single card, 4k dual card...this gen at most was going to get us 75% of the way there (3200x1800 or slightly less?)  Single chip 4k will be 14nm/16nm...conceivably even 2nd gen (first gen might be shrinks/mem consolidation through 2nd gen HBM etc).

It's also why AMD doesn't need to be as fast, just consistent across different titles. They literally just need to stay above 60 at 1440p in most worst-case scenarios and 30+ at 4K (whatever scaling generally turns out to be on average, correspondingly 60 in a dual config), even if through overclocking, because in practice that is all that matters. The question most people (as most don't use adaptive sync) ask themselves is whether it will generally stay above 30/60 FPS (or, correspondingly, below 33.33/16.66 ms) and at what price. The question is not whether it will do 38 vs 35 FPS. This is NVIDIA's GPU that mostly can (and, through overclocking, it seems will almost always) accomplish that.

Also,

Big thanks to W1zzard for this:

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_X/33.html

It pretty much proves my point:  8GB is generally the sweet spot (conceivably for the life of the current consoles).  That said, I respectfully disagree with his '640k 4GB is enough for everybody' conclusion.  I think more and more games will push up against that wall, or at least 6GB as Mordor does (scaling from a 900p XB1/1080p PS4 game), as it is clearly shown which titles consume the most memory:  *AC, Mordor, COD (even if busted), Rise, W_D, DR3.  What do these games have in common?  *If you said they were developed primarily towards the memory architecture of the ps4/xb1...You'd be right.  That higher memory usage is not going to change, and only become more apparent as creators target not only more tricky uses for the console memory arch to squeeze performance from those otherwise-limited machines, but also start targeting lower-resolutions on consoles that will require greater texture scaling on PC.

For example, Mordor is currently 1080p on PS4, 900p on XB1.  What if next time they targeted 720p on xbox and 900p on ps4 (which I think will end up being a very common occurrence for multiplat as time passes, as it allows xb1 to still remain HD)?  You end up with 8GB making a lot more sense,  if not the most sense.  While that may occur (worst case scenario), the compute power of the consoles never changes and performance in that respect for scaling is a known...hence why cards are aimed at certain metrics outlined in my top-most comment.

At any rate, I was primarily eyeing the upcoming baby brother to this because I assumed (and rightfully so, it appears) that Mordor would consume just under 5.5GB at a higher resolution (given memory use for me before the game starts doing that weird hiccup thing), and that it would make sense for many more games with a similar mentality on console (900-1080p). I think it also verifies that 3.5GB is indeed an issue for that game at some resolutions where 4GB would not be, but also that 4GB could be an overall limiting factor where core performance may not be (4GB 390X?), even if more consistent because there are no (weird switching to a separate memory controller) issues.

TLDR:  If you could continue to add memory usage as more titles are released, it would be most appreciated.  I would be willing to bet people would LOVE to know what Witcher will demand at various resolutions...

As always, I appreciate the thorough review with nice graphs. The only thing I would mention is to please not forget us over-volters (at least for one SKU based on a chip), although I can tell it's become less of a priority. The clock-scaling/voltage/leakage graphs you used to do on new parts were REALLY awesome/helpful. I hope that doesn't get left behind completely.


----------



## W1zzard (Mar 17, 2015)

alwayssts said:


> please not forget us over-volters


there is no software voltage control because the voltage controller has no i2c interface. so i guess i'd have to bios mod or solder mod. and then try to not blow up my card


----------



## Animalpak (Mar 17, 2015)

4K users, here is your war horse.

There could be a 980 Ti at this point with 8 GB of RAM.

I think the RAM is what makes the price so high, and of course the TITAN name.


----------



## radrok (Mar 17, 2015)

W1zzard said:


> there is no software voltage control because the voltage controller has no i2c interface. so i guess i'd have to bios mod or solder mod. and then try to not blow up my card



Would it be possible to impose a forced voltage to the controller like it was for the OG Titan?


----------



## W1zzard (Mar 17, 2015)

radrok said:


> Would it be possible to impose a forced voltage to the controller like it was for the OG Titan?


No, that controller had I2C, so you could talk to it with software. This one is dumb and just has VID lines going to it, which form a bit pattern that defines the voltage.
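To illustrate the idea of VID lines, here's a minimal sketch of how a parallel bit pattern on a controller's VID pins maps to a requested voltage. The base voltage and step size below are hypothetical placeholders, not values from any specific controller's datasheet:

```python
def vid_to_voltage(vid_bits, v_min=0.25, step=0.00625):
    """Decode a parallel VID bit pattern into a target voltage.

    vid_bits: iterable of 0/1, MSB first (the logic levels on the VID pins).
    v_min/step: hypothetical base voltage and step size; a real controller
    defines these in the VID table of its datasheet.
    """
    code = 0
    for bit in vid_bits:
        code = (code << 1) | bit  # pack the pin levels into an integer code
    return v_min + code * step

# pattern 10100000 -> code 160 -> 0.25 V + 160 * 6.25 mV = 1.25 V
print(round(vid_to_voltage([1, 0, 1, 0, 0, 0, 0, 0]), 4))
```

The point of the "dumb" scheme W1zzard describes is that the mapping is fixed in hardware: with no I2C bus, software has nothing to talk to, so the only way to change the voltage is to change the logic levels (or the feedback network) physically.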


----------



## jabbadap (Mar 17, 2015)

Oh, I forgot. What is the video decode/encode support on this card? Same as GM206, or GM204?


----------



## RCoon (Mar 17, 2015)

W1zzard said:


> No, that controller had I2C, so you could talk to it with software. This one is dumb and just has VID lines going to it, which form a bit pattern that defines the voltage.



I'm assuming that's because NVIDIA are now marching towards no overvolting on any cards, so they can get away with making the cheapest possible VRM section. It'll all have to be done by AIBs, if they get a chance to touch the Titan X (which I doubt).



jabbadap said:


> Oh, I forget. What is video decode/encode support with this card? Same as gm206 or gm204.



NVENC encode, including HEVC. For decode, I'm assuming PureVideo VP6 or VP7 (likely VP7, like the 960), with the ability to decode HEVC.


----------



## bogami (Mar 17, 2015)

Costly; I would even say far too expensive given the results and production costs. The results are good, but I would not recommend these cards for 4K gameplay. 4K will therefore require SLI, and of course one more GPU is needed for physics, because we know that if one of the two is set for physics, we need 3 GPUs. This could have been solved in software long ago. Something of a current fad, I would say, since 20nm exists.
So: too expensive, below the desired performance, expected results, badly cooled, little OC headroom. I would say it is more profitable to buy a GTX 980 and max the OC. For the rich and stupid. I would not buy one unless it was necessary and I was very rich.


----------



## Casecutter (Mar 17, 2015)

Hmm, Maxwell's mojo didn't seem to scale... It's odd; the cooling solution should have been plenty to let it run free, but it seems that to curtail power consumption (aka heat) they rein in the clocks. Isn't that what the thermal camera suggests? Then the "matches the Radeon R9 290X noise"... ouch!

As W1zzard said, _"*So what's all the fuss about?* Roughly 50% more number-crunching muscle than the GTX 980, a 50% wider memory bus, and three times more memory."_ And from all that they gain a 23% FPS increase @ 1440p (not earth-shattering), while the average/peak power numbers are up around 42%/32% respectively. The impression I'm left with is that the excess memory contributes to heat and power it doesn't make use of.

Is it me, or is it interesting that the Titan is made with Hynix modules, while the reference GTX 980 came out with Samsung? Wonder if that could be contributing to the memory running hot under W1zzard's thermal camera?

Almost like too much of a good thing?
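As a quick sanity check on those numbers (the shader counts are the two cards' published specs; the 23% figure is the review's 1440p summary result), the scaling works out like this:

```python
# Published shader counts: GTX 980 = 2048, Titan X = 3072
shader_gain = (3072 - 2048) / 2048          # 0.50 -> 50% more shaders
fps_gain = 0.23                              # relative 1440p result from the review
scaling_efficiency = fps_gain / shader_gain  # fraction of extra units showing up as FPS

print(f"{shader_gain:.0%} more shaders, {fps_gain:.0%} more FPS "
      f"-> {scaling_efficiency:.0%} scaling efficiency")
```

Under half of the extra execution resources show up as frame rate at 1440p, which is consistent with the card being held back by its power and thermal limits rather than by raw shader count.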


----------



## Ferrum Master (Mar 17, 2015)

I don't agree with the criticism about VRAM. 12 GB is a good amount for the targeted audience (cheap Quadro, raytracing), and as for the criticism of COD... that it never touches part of the data again... hell, free RAM = wasted RAM.

Overall, the card is a disappointment, as expected... this should be the current flagship at the 980's price... not a cut-down, second-tier, historical MX-like card... everything has gone bonkers.


----------



## mx500torid (Mar 17, 2015)

@the54thvoid 
This is interesting.

Hexus get's a 20% OC boost in performance.... No voltage and similar cocks to W1zzard.

Not the best game but an idea of performance.

O man, soda out my nose.


----------



## RCoon (Mar 17, 2015)

mx500torid said:


> similar cocks to W1zzard.



Somebody sig it quick.


----------



## buildzoid (Mar 17, 2015)

@W1zzard you made a mistake with the controller's name. There is no NCP8114; it's an NCP81174. I think the same controller is used on the MSI Gaming GTX 980 and GTX 970, and I know one of those has a Vmod guide using the FB pin. The good news is that this VRM is the same oddity the GTX TITAN BE and GTX 780 Ti had, so it should be mostly OK up to 1.35V.

Here's the datasheet

Looks simple enough to mod. Cut the ILIM and put a VR on the VFB pin. If you aren't that hardcore, you can short the shunt resistors near the PCI-e power connectors using CLU (because you can remove it if you want), and you'll also disable the power limit.


----------



## REAYTH (Mar 17, 2015)

newtekie1 said:


> I'm so pleased you included 970 SLI in the performance charts.  I'm also so glad I went with SLI 970s...


Hell I'm still running two 670's in SLI at 1080p and have yet to see a reason to upgrade. Every game maxed out at above 60 FPS. If I ever upgrade monitors MAYBE I might need a new GPU but this is yet another generation I can skip and still max out everything.


----------



## W1zzard (Mar 17, 2015)

buildzoid said:


> @W1zzard you made a mistake with the controller's name. There is no NCP8114 it's an NCP81174. I think the same controller is used for MSI gaming GTX 980 and GTX 970 and I know that one of those has a Vmod guide using the FB pin. The good news is that this VRM is the same oddity that the GTX TITAN BE and GTX 780 Ti had so it should be mostly OK up to 1.35V.
> 
> Here datasheet


Whoops .. I need to l2read off photos 
Vmod using FB is always possible, but soldering required. some decent software control would be nicer


----------



## buildzoid (Mar 17, 2015)

W1zzard said:


> Whoops .. I need to l2read off photos
> Vmod using FB is always possible, but soldering required. some decent software control would be nicer


TBH after I learned how to Vmod using my R7 260X I'm really fond of having hardware level voltage control. Though for 90% of people soldering is a pain or not even an option.


----------



## FreedomEclipse (Mar 17, 2015)

W1zzard said:


> lol fail .. I was playing with the System Specs feature (to add mouse and keyboard) and forgot to change it back



I remember asking for a small section to add what 'audio solutions' we were running, e.g. what headphones, AV receivers, headphone amps or speakers we were using.

I remember asking before this place got migrated over to the new XenForo system (I remember asking for the website to be updated, as it had been the same since I joined, and even though not much feedback was given and people were saying don't fix what ain't broke, the site still got a facelift anyway).

I know it's probably not the right thread to ask, but I see you here, so I might as well put it across while you're here and since 'changes' have been mentioned.


----------



## the54thvoid (Mar 17, 2015)

mx500torid said:


> @the54thvoid
> This is interesting.
> 
> Hexus get's a 20% OC boost in performance.... No voltage and similar cocks to W1zzard.
> ...



I'm not even going to correct that typo, it's worth keeping.


----------



## ShurikN (Mar 17, 2015)

The more I look at the 4K charts, the better that 295 looks. Man, what a beast...
1000 bucks is just too much for the fastest-single-GPU title.


----------



## R00kie (Mar 17, 2015)

My 970's were well worth the investment, it seems.


----------



## nunomoreira10 (Mar 17, 2015)

There will definitely be a 6 GB 980 Ti, if that's what they call it;
after all, they need to sell the partially functional dies, and it sure won't be on the professional market.
Great GPU, but greedy NVIDIA, as always.


----------



## Xzibit (Mar 17, 2015)

The Titan brand tried to provide the best of both worlds: SP/DP compute and gaming.

Now with the Titan X, it seems it's just aimed at SP and gaming, with further hardware restrictions on what the Titan brand was known for. Less for the same price. I wouldn't be surprised if the drivers started being more restrictive as well, to force the people who were looking at the Titan brand as a cheap alternative to a Quadro to buy a Quadro.

WTF?
DP TFLOPS
Titan X = 0.2
Titan Z = 2.6
Titan Black = 1.7

Tom Petersen said it was a full chip, then, when asked about DP, said they would need more space to add the units, during *PCPer Live! GeForce GTX TITAN X Live Stream*.

So is it a full chip or not? What's in the M6000?

This looks more like a very restricted 980 Ti / 1080 than a Titan.


----------



## m6tzg6r (Mar 17, 2015)

Blower-fan stock cooler? Yawn. Give me an MSI Lightning version and then we can talk, but seeing as no Titan gets aftermarket versions, except that Windforce cooler Gigabyte packaged with the Titan Black, the Titan X will just be a card that is too hot and too noisy for my taste. But if there is ever a 6GB 980 Ti Lightning, then oh baby, it's time to SLI and get my 4K on.


----------



## Jetster (Mar 17, 2015)

Interesting how GPUs and CPUs are always $1000 when released


----------



## v12dock (Mar 17, 2015)

I assume World of Warcraft was not in this review because of the changes in patch 6.1?


----------



## Fluffmeister (Mar 17, 2015)

Impressive performance and efficiency, with the token $1000 price tag.

Its glory days are obviously still to come, in the form of lovely custom 6 GB 980 Tis with the usual love from MSI, ASUS and the like.


----------



## W1zzard (Mar 17, 2015)

Xzibit said:


> What's in the M6000?


GM200 (I have a GPU-Z submission to confirm that)



v12dock said:


> I assume World of Warcraft was not in this review because of the changes in patch 6.1?


That's correct, didn't have enough time to rebench all cards on 6.1


----------



## Xzibit (Mar 17, 2015)

W1zzard said:


> GM200 (I have a GPU-Z submission to confirm that)



Which begs the question: is this already a cut-down chip, since it's missing the DP? Or is the M6000 that bad at DP as well?

The K6000 had DP of 1.4 TFLOPS.

The original Titan was a cut-down version of the K6000/Titan Black. Curious whether we aren't seeing a repeat of a gimped Titan before a full version later. We'll have to wait and see if the M6000 is different once its specs are revealed.


----------



## qubit (Mar 17, 2015)

So, finally Big Maxwell is here and... I'm gonna sit this one out. I'm not even gonna wait for the GTX version with a silent cooler and a more reasonable price point.

Why?

It only offers 30-50% better performance than my 780 Ti, which I don't think is enough to warrant replacing a £500 card that still works fabulously well today. I want to see doubled performance, or close to it, before I replace it. Also, it's not fully DX12 compliant in hardware, since DX12 hasn't been fully defined yet.

No, I'll keep that £500 investment a while longer and see what the next generation offers, especially with DX12. If I didn't already have the 780 Ti I would have bought the GTX version of this, though.

I also see that the stock cooler isn't sufficient to stop it throttling under normal use, and it's got too much fan noise and even coil whine, when all these issues could be solved quite easily and without increasing the price. I don't think that's good enough in a $1000 card.

Yes, I'm a little disappointed with this card.


----------



## Caring1 (Mar 17, 2015)

the54thvoid said:


> This is interesting.
> 
> ..........  No voltage and similar cocks to W1zzard.


Someone has been peeking through W1zzard's window again..


----------



## arbiter (Mar 18, 2015)

Rahmat Sofyan said:


> Still...Whyyyyyyyyyyyyyyyyyyyyyyy no backplate like GTX 980, is run too hot W1zz?



Tom from NVIDIA was on a podcast with Ryan from PCPer; he said the reason was that it would cause heat issues, as there would be less space in a system if you have multiple Titan cards next to each other. The backplate would limit the airflow and cause throttling issues.


----------



## Ebo (Mar 18, 2015)

Am I the only one getting a flashback reading the excellent review from W1zzard, what with the high temps, throttling, and fan noise? To me, it sounds just like when the R9 290X reference came out, uuufff.

Will NVIDIA allow AIBs to put another cooler on the card? I don't think so, since the Titan line is in a class of its own.

Is the card worth a thousand dollars? Well, to me that's both a yes and a no; if you look at performance it's tricky, and the price seems too high. But if you want the top of the pops, we have always had to pay a premium price; even though it's hard to justify, that doesn't matter.
What I don't like is that 30% of the RAM is never used, no matter how demanding a game you set it to run, which in my eyes means you pay for 4 GB of RAM that you will never even come close to using.


----------



## 15th Warlock (Mar 18, 2015)

Thanks for the review.

Just a question: is this a hard launch? Can't seem to find the card for sale anywhere yet :/


----------



## Xzibit (Mar 18, 2015)

15th Warlock said:


> Thanks for the review.
> 
> Just a question, is this a hard launch? can't seem to find the card for sale anywhere yet :/



Here you go

*PNY - Titan X $999.99*


----------



## xorbe (Mar 18, 2015)

> peaks our interest



 Piques?


----------



## ZoneDymo (Mar 18, 2015)

very very disappointing overall card


----------



## qubit (Mar 18, 2015)

15th Warlock said:


> Thanks for the review.
> 
> Just a question, is this a hard launch? can't seem to find the card for sale anywhere yet :/


It happens sometimes that the cards are listed a couple of days later. I can't find it at the usual UK retailers either, so just give it a bit.


----------



## 15th Warlock (Mar 18, 2015)

Xzibit said:


> Here you go
> 
> *PNY - Titan X $999.99*



Unfortunately, I was able to add the card to my cart, but on checkout it says the card is available for preorder only, with no estimate of the delivery date; it also limits your preorder to one card only.

It looks like you can order the card directly from the NVIDIA store, but the card is currently out of stock.


----------



## qubit (Mar 18, 2015)

@Xzibit @15th Warlock This is looking a bit like a paper launch, isn't it?


----------



## 15th Warlock (Mar 18, 2015)

qubit said:


> @Xzibit @15th Warlock This is looking a bit like a paper launch, isn't it?



Unfortunately, yes, it looks like it is a paper launch so far.

Nvidia must have had very limited quantities available for sale at their own store, to justify the "hard launch" title for people reviewing the card.

EDIT: From the Anandtech review:

_Finally, for launch availability this will be a hard launch with a slight twist. Rather than starting with retail and etail partners such as Newegg, NVIDIA is going to kick things off by selling cards directly, while partners will start to sell cards in a few weeks. For a card like GTX Titan X, NVIDIA selling cards directly is not a huge stretch; with all cards being identical reference cards, partners largely serve as distributors and technical support for buyers._

So no EVGA, Gigabyte, Asus or other cards available for the time being, only "Nvidia"-branded reference cards, and in very limited quantities it seems. What disappoints me the most is the _while partners will start to sell cards in a few weeks_ statement above.


----------



## HumanSmoke (Mar 18, 2015)

Xzibit said:


> Which begs the question is this a cut down chip already since its missing the DP? or is the M6000 that bad at DP as well?
> The K6000 had DP of 1.4TFLOPS
> Original Titan was a cut down version of the K6000/Titan Black.  Curious if we aren't seeing a repeat of a gimped Titan for a full version later.  Have to wait and see if the M6000 is different once its specs are revealed.


I think it has been stated ad nauseam that Maxwell isn't designed for FP64 workloads. It is the reason why Nvidia developed GK210 alongside Maxwell, and why Nvidia is on record as saying that the next Tesla parts won't arrive until Pascal.

BTW: The K6000's theoretical FP64 throughput is 1.73 TFLOPS, not 1.4: 901.5 MHz core * 2880 cores * 2 ops/clock = 5192.64 GFLOPS (FP32) / 3 = 1730.88 GFLOPS (FP64).
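That calculation can be reproduced directly; the clock and core count are the K6000's published specs, and the 1/3 FP64 rate is the GK110 ratio cited above:

```python
clock_ghz = 0.9015      # K6000 core clock (901.5 MHz) in GHz
cores = 2880            # CUDA cores
ops_per_clock = 2       # one fused multiply-add = 2 FLOPs per core per clock

fp32_gflops = clock_ghz * cores * ops_per_clock  # peak single-precision rate
fp64_gflops = fp32_gflops / 3                    # GK110 runs FP64 at 1/3 the FP32 rate

print(f"FP32: {fp32_gflops:.2f} GFLOPS, FP64: {fp64_gflops:.2f} GFLOPS")
```

The same formula with a 1/32 ratio instead of 1/3 is what produces the Titan X's roughly 0.2 DP TFLOPS figure quoted earlier in the thread.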


----------



## qubit (Mar 18, 2015)

@15th Warlock Yup, nvidia have always been adept at milking their customers and partners for money, really squeezing every last drop. That's not something to appreciate, really, since trade works best when both sides feel they've gotten a fair deal.

Think about my post above where I said I'm staying with my current card. It's mainly because I paid such an extortionately high price for it (£500 in January 2014) that I want to make sure I get my money's worth out of it. This means holding off my next purchase until performance improves even further, which means skipping this generation altogether. However, if they'd priced it at a much more reasonable £300-£350, I would definitely have bought its replacement (the cheaper GTX version, not the Titan), and nvidia would actually have made more money out of me. Their greed has actually _lost_ them money when it comes to purchases from me, and both sides lose out. I guess they must feel that overall this strategy makes them more money, so good for them.


----------



## nickbaldwin86 (Mar 18, 2015)

Ya, so $1000 total, or $400 more for 10 FPS? WOW

My two 980s are doing great for the same price.

Can I see an SLI 980 comparison in those charts? Why the 970... the 970 is a joke with 3.5 GB, LAWL


----------



## HalfAHertz (Mar 18, 2015)

Disclaimer: please ignore this message if you're tired of people complaining about the price.

Ok fine, they cut down the DP - we don't mind. But why did they forget to cut down the price as well? They are offering a less capable product this time around, right?

The only "excuse" that Nvidia gave for the price of the original Titan series was that it was a so-called "prosumer" card with heavy DP, mainly aimed at the professionals. Then all the "professionals" rushed in and got the new shiny toy like spoiled little brats... Now everybody has to bend over backwards and accept the fact that the high end starts from 1000? Or do we all of a sudden have a new "special" extra high-end?

Let's have a quick car analogy: Imagine you're at the car dealership, looking for a new car and every year the cars have to get faster, otherwise what's the point of buying a new one, right? You've got your tiny budget cars, you've got some sedans, you have last year's dusty old models and at the very front you've got the shiny new top of the line model.

Client: "I need power!"
Dealer: "We can hook you up! Have a look at our new "special" top model. Remember how much you liked our special model last year? It's faster than last year's, of course, and it's in the same "special" price range. However, it doesn't have seats this time around because you probably don't need them."
C: "Umm..."

All I'm saying is, it's a fine product but it's not worth it for me.


----------



## BUFDUP (Mar 18, 2015)

Pfft.... kinda disappointed by the performance compared to a 980.

And why 12GB? I doubt it can even saturate 8GB.
Only going SLI with these cards would use most of that memory.

In any case, I'm gonna wait till next gen. I'm happy with my 980's

I always upgrade every 2 gens (GTX 590 to GTX 770 Lightning SLI to GTX 980 SLI)


----------



## Xzibit (Mar 18, 2015)

HalfAHertz said:


> Ok fine, they cut down the DP - we don't mind. But why did they forget to cut down the price as well? They are offering a less capable product this time around, right?
> 
> The only "excuse" that Nvidia gave for the price of the original Titan series was that it was a so-called "prosumer" card with heavy DP, mainly aimed at the professionals. Then all the "professionals" rushed in and got the new shiny toy like spoiled little brats... Now everybody has to bend over backwards and accept the fact that the high end starts from 1000? Or do we all of a sudden have a new "special" extra high-end?



Tom Petersen said the FP64 units would have cost die area they needed for other things. You can hear Nvidia's explanation in this video

*@ 3:50+ mark in this video*

Ryan from PC Perspective asks him about the difference:



Nvidia's Tom Petersen said:


> The way to think about it is that putting the double-precision floating point units on the die costs area, and that area could otherwise have been spent on other things



With the upcoming Quadro M6000 release featuring a GM200 as well, they probably didn't want to hurt its sales. So instead of getting the same chip as before, they're getting a neutered variant.


----------



## Folterknecht (Mar 18, 2015)

qubit said:


> ...Also, it's not fully DX12 compliant in hardware since DX12 hasn't been fully defined yet.
> 
> No, I'll keep that £500 investment a while longer and see what the next generation offers, especially with DX12. ...



http://www.computerbase.de/2015-03/nvidia-geforce-gtx-titan-x-im-test/

"Nvidia hat mittlerweile bekannt gegeben, dass der GM200 DirectX 12 mit dem Feature-Level 12.1 in Hardware unterstützt und damit alle neuen Funktionen der API zum Start unterstützen wird."

My rough translation: NV announced that GM200 will fully support DX12, not only performance optimizations (12.0) but also all the eye candy (12.1).


----------



## Sony Xperia S (Mar 18, 2015)

If it doesn't even have a semi-professional use, then why is it called Titan in the first place? What makes it different from the other products in the 900 series except the raw performance gains and that stupid memory buffer of 12!!! GB?


----------



## HumanSmoke (Mar 18, 2015)

Sony Xperia S said:


> If it doesn't even have a semi-professional use


I'd say that a few people would pair the card with a Quadro for drivers/Viewport for 3D rendering - virtually all of which uses single precision. 6GB seems to be entry level for 4K (and up) rendering.


Sony Xperia S said:


> then why is it called Titan, in the first place?


1. Catchy Name
2. Can reuse the tooling from Titan/Titan Black reference cooler shrouds
3. So that it gives people something else to foam at the mouth about.


Sony Xperia S said:


> What makes it different from the other products in the 900 series except the raw performance gains and that stupid memory buffer of 12!!! GB?


Larger bus width? More cores? Higher price tag? Leaves a bigger trail of breadcrumbs for trolls to follow? It's a GTX 980 scaled up by 50%, what were you expecting, HAL 9000 ?


----------



## Sony Xperia S (Mar 18, 2015)

HumanSmoke said:


> 6GB seems to be entry level for 4K (and up) rendering



Really, so you think that because of these 12 GB, this card will do fine with future 4K gaming? Well, actually, it won't.

The GPU will prove slow long before you need such a frame buffer.


----------



## qubit (Mar 18, 2015)

Folterknecht said:


> http://www.computerbase.de/2015-03/nvidia-geforce-gtx-titan-x-im-test/
> 
> "Nvidia hat mittlerweile bekannt gegeben, dass der GM200 DirectX 12 mit dem Feature-Level 12.1 in Hardware unterstützt und damit alle neuen Funktionen der API zum Start unterstützen wird."
> 
> My rough translation: NV announced that GM200 will fully support DX12, not only performance optimizations (12.0) but also all the eye candy (12.1).


I'd take that claim with a large pinch of salt to the point where I wouldn't believe it. I'd wait for the spec to be finalized and cards that say DX12 in their spec sheet. You'll see, today's GPUs will suddenly be out of date and we'll all have to spend lots of money on shiny new ones to get the full DX12 featureset.


----------



## Sony Xperia S (Mar 18, 2015)

qubit said:


> I'd take that claim with a large pinch of salt to the point where I wouldn't believe it. I'd wait for the spec to be finalized and cards that say DX12 in their spec sheet. You'll see, today's GPUs will suddenly be out of date and we'll all have to spend lots of money on shiny new ones to get the full DX12 featureset.



I DO NOT understand this. Why do they always release products which lag behind the software and can't run it properly?
If the card still can't run Crysis 3 at 4K, then why the hell would I need it?

If the card doesn't support the full DX12 Tier 3, then why would I need anything from nvidia again? They have always provided products with inferior DX support!


----------



## RCoon (Mar 18, 2015)

Sony Xperia S said:


> If the card doesn't support the full DX12_Tier 3, then why would I need anything



DX12 is irrelevant. No games are being created with DX12, and won't be for at least a year if not two. By that point there will be new GPUs which will support DX13, and then you'll complain that they only support DX13.1 instead of DX13.3, despite the fact that no games will utilize DX13.3 by that point either.

As long as the card supports DX12 in the first place, that's just fine. The lack of an additional .2 or .3 isn't much to cry over.



Sony Xperia S said:


> If the card is still not able to run Crysis 3 at 4K, then why the hell would I need it?


Well, no card can run Crysis 3 at 4K, so I guess there's no point in you buying any GPU ever until the 590X/Pascal range.



Sony Xperia S said:


> why is it called Titan


Because the Cray Titan supercomputer used the original Titan's Kepler architecture in its K20X cards, because it related to awesome supercomputing, and, unsurprisingly, naming your card after a supercomputer sells units.



Sony Xperia S said:


> What makes it different from the other products in the 900 series


Well, it can at least access more than 3.5GB (I assume)? *Hyuck Hyuck etc etc*



Sony Xperia S said:


> it doesn't even have a semi-professional use


It kinda does though. The lack of double precision means relatively little in terms of workloads. Unless you're a multi-billion-dollar oil company (which is going to buy a compute card anyway), you don't need double precision. Single precision works just fine for all petty human-style workloads.
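For a concrete sense of what the single/double distinction actually costs you (a standard-library sketch, not anything from the review: plain Python floats are IEEE-754 doubles, and `struct` can round-trip one through 32-bit storage):

```python
import struct

# Round-trip a double-precision value through IEEE-754 single precision
# to see how much accuracy FP32 gives up.
x = 1.0 / 3.0                                      # 64-bit double
x32 = struct.unpack('f', struct.pack('f', x))[0]   # squeezed into 32 bits

print(f"double: {x:.17f}")
print(f"single: {x32:.17f}")
print(f"error:  {abs(x - x32):.1e}")               # on the order of 1e-8
```

Single precision keeps roughly 7 significant digits, which is plenty for pixels; it's long iterative compute (the oil-company kind) where that ~1e-8 error compounds into a real problem.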

EDIT: That said, I find the Titan X totally irrelevant in today's market. Super lacklustre and uninteresting this time around. Expected more.


----------



## Sony Xperia S (Mar 18, 2015)

You can play Crysis 3 at 4K with the card. Just lower the settings to medium/high and it will fly. But is it worth paying $1,000 for this? No extras at all.



RCoon said:


> Well, it can at least access more than 3.5GB (I assume)?



No? Because the 980 does the same as well?



RCoon said:


> EDIT: That said, I find the Titan X totally irrelevant in today's market. Super lacklustre and uninteresting this time around. Expected more.



The R9 390X is the card for you, I guess. With its HBM 8 GB.


----------



## HumanSmoke (Mar 18, 2015)

Sony Xperia S said:


> Really, so you think that because of these 12 GB, this card will do fine with future 4K gaming. Well, actually, it won't.
> 
> 
> 
> ...


The link and the post said 4K *rendering*. Well done for 1) not bothering to check the link, and 2) not bothering to read the post you quoted.


----------



## RCoon (Mar 18, 2015)

Sony Xperia S said:


> The R9 390X is the card for you, I guess



No confirmed benchmarks. If the 390X really beat the Titan, AMD would have claimed it already to make Jen-Hsun cry at his own launch. If it does beat it, it's probably going to do so in an extremely hot and flustered fashion. I'm also not keen on GPUs requiring an AIO for stock clocks.

My priority has always been silence and power consumption. AMD have never fit that profile.


----------



## Rahmat Sofyan (Mar 18, 2015)

arbiter said:


> Tom from nvidia was on a podcast with Ryan from PCper, he said reason was would cause heat issues, would be less space in a system if you have multi titan cards in a system next to each other. The backplate would limit the air flow and cause throttling issues.



I see, maybe it's because this is the full Maxwell at 100% performance.


----------



## DinaAngel (Mar 18, 2015)

Rahmat Sofyan said:


> I See, maybe because this is the full Maxwell at 100% performance.


Full Maxwell isn't released yet; it's a Quadro card, I'm pretty sure,

as the Titan X GPU doesn't have more than around 90 double-precision shaders.


----------



## Ubersonic (Mar 18, 2015)

Where's the World of Warcraft bench? 

(Before some fool who played the game on their 6800 GT a decade ago rants about how it would be pointless: on max settings it can bring a GTX 980 to its knees. I was wondering if the TX can max the game at 4K.)


----------



## DinaAngel (Mar 18, 2015)

Ubersonic said:


> Where's the World of Warcraft bench?
> 
> (Before some fool who played the game on their GT6800 a decade ago rants about how it would be pointless, On max settings it can bring a GTX980 to it's knees, I was wondering if the TX can max the game at 4K).


I'm pretty sure a 980 would do 200 fps in World of Warcraft. I might be mistaken, but considering how optimised it is now and how fast the 980 is...


----------



## buggalugs (Mar 18, 2015)

People have short memories though. I don't have a problem with paying $1,000 for a card if it holds its value, but what happened last time? The Titan came out, then like 6 weeks later the 780 Ti came out with the same kind of performance for $300-$400 less.

Make no mistake, Nvidia are already building a 980 Ti that will perform like the Titan X for $300-$400 less money. With the 390X coming soon, Nvidia aren't going to rely on a $1,000 Titan X for the rest of 2015.

It seems people have short memories and are too quick to bend over for an ass reaming. I just don't see this type of card as being a good investment.


----------



## Aretak (Mar 18, 2015)

DinaAngel said:


> im pretty sure a 980 would do 200 fps in world of warcraft, i might be mistaken but considering how optimised its now and how fast 980 is


You'd be surprised. I'm running a 4790K at 4.4 GHz and a 980 at 1501/7908 and I get drops to 45fps in the busiest areas. That's with just CMAA too, without even considering the MSAA and SSAA options they added in the last patch.

It's an extremely scalable game though. I was running a 570 with the same CPU before I got my 980 in December, and with a few settings dialled back a bit I got about the same performance for almost no noticeable visual difference.


----------



## HumanSmoke (Mar 18, 2015)

DinaAngel said:


> full maxwell isnt released yet, its a quadro card im pretty sure.


Quadro M6000 is the exact same GM200 that powers the Titan X.
I'm not sure how many times it needs stating, but GM200 is the full die. PNY (one of the two main Quadro suppliers, along with Leadtek) has already stated that the M6000 has the same 1:32 double-precision rate.
GM200 was pared down for double precision because Nvidia reworked GK110 into GK210 (adding another ~50mm² to the die in the process) specifically to tackle double-precision workloads.
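The practical effect of that 1:32 rate can be sketched with the published Titan X numbers (3072 shaders, ~1000 MHz base clock; a back-of-the-envelope estimate, not a benchmark):

```python
# Theoretical FP32/FP64 throughput for GM200 from its published specs.
shaders = 3072         # CUDA cores (FP32 lanes)
clock_ghz = 1.0        # ~1000 MHz base clock
fp64_rate = 1 / 32     # GM200's DP:SP ratio

fp32_gflops = shaders * clock_ghz * 2   # 2 FLOPs/cycle via fused multiply-add
fp64_gflops = fp32_gflops * fp64_rate

print(f"FP32: {fp32_gflops:.0f} GFLOPS")  # 6144
print(f"FP64: {fp64_gflops:.0f} GFLOPS")  # 192
```

Note that 3072 / 32 also works out to 96 DP units, in line with the "around 90 double precision shaders" figure mentioned earlier in the thread.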


----------



## Sony Xperia S (Mar 18, 2015)

RCoon said:


> No confirmed benchmarks. If the 390X really beat the Titan, AMD would have claimed it already to make Jen cry at his own launch. If it does beat it, it's probably going to do so in an extremely hot and flustered fashion. I'm also not keen on GPU's requiring an AIO for stock clocks.
> 
> My priority has always been silence and power consumption. AMD have never fit that profile.



I smell an nvidia fanboy here.

Water cooling already means that your setup will be quieter.
As for power consumption, the Titan also draws quite a large amount of power, so I have no idea what energy savings you are dreaming about.

The 390X is and will be as fast as needed and will offer several industry-first features.

We know that HumanSmoke works either at nvidia or for nvidia. No need to prove it with every post.


----------



## Rahmat Sofyan (Mar 18, 2015)

DinaAngel said:


> full maxwell isnt released yet, its a quadro card im pretty sure.
> 
> as the titan x gpu doesnt have more than around 90 double presicion shaders



Yeah, but Quadro is another level; what I meant was cards for most users...

But from the leaked picture, the Quadro M6000 looks like it uses a backplate, which is odd. Wait and see.


----------



## 64K (Mar 18, 2015)

Sony Xperia S said:


> I smell an nvidia fanboy here.
> 
> Water cooling already means that your setup will be quieter.
> About the power consumption, titanic also takes quite large portion of energy, so I have no idea what energy savings you are dreaming about.
> ...



What in the world are you doing calling someone else a fanboy? Why are you even reading and posting on an Nvidia GPU thread? Silly.


----------



## mroofie (Mar 18, 2015)

64K said:


> What in the world are you doing calling someone else a fanboy? Why are you even reading and posting on an Nvidia GPU thread? Silly.


RCoon's point on the use of AIOs is correct. I despise any form of liquid cooling due to its EXTREME damage capability :0
And also that's not innovation! What amd is doing is the exact opposite :0


----------



## RCoon (Mar 18, 2015)

mroofie said:


> Rcoon's point on the use of AIO's are correct I despise any form of liquid cooling due to its EXTREME damage capability :0
> And also that's not innovation! What amd is doing is the exact opposite :0



I like custom watercooling; it's AIOs on GPUs I dislike. It's a cop-out solution for a problem that shouldn't be there if a GPU is designed correctly rather than by brute force. Plus GPU AIOs are ugly as hell.


----------



## Ubersonic (Mar 18, 2015)

RCoon said:


> I like custom watercooling, it's AIO's on GPU's I dislike. It's a copout solution for a problem that shouldnt be there if a GPU is designed correctly and not just brute force. Plus GPU AIO's are ugly as hell.



You could say the same about any AIO though; I remember complaints because AMD's 100MHz 486 required a fan on its heatsink :O


----------



## Ferrum Master (Mar 18, 2015)

Ubersonic said:


> You could say the same about any AIO though, I remember complaints because AMD's 100MHz 486 required a fan on it's heatsink :O



What dinosaur cave did you come out of?  

Btw, it doesn't look that bad if the heatsink is at the bottom of the PC beside the PSU... looks pretty neutral. The only thing I hate... is pump noise.


----------



## nunomoreira10 (Mar 18, 2015)

RCoon said:


> I like custom watercooling, it's AIO's on GPU's I dislike. It's a copout solution for a problem that shouldnt be there if a GPU is designed correctly and not just brute force. Plus GPU AIO's are ugly as hell.



A power-hungry, hot GPU is still better than no GPU. The AIO was the solution to the heat problem; I wouldn't say the GPU wasn't designed correctly, just that efficiency wasn't the priority.


----------



## tpapas (Mar 18, 2015)

I consider myself not a fanboy of any company.

Currently one of my PCs is running an Nvidia GTX 690 and the other an AMD 295X2.

I use the cards for scientific computation as well as for gaming.

While I do like CUDA, I prefer the open architecture of OpenCL, and in both cases double precision is needed.

So with the Titan X, NVIDIA shot itself in the foot. After a year of AMD having the 295X2 on the store shelves (which I bought for ~$670), NVIDIA now has a comparable product for $1000.

Well, why the @#$ would someone invest $1000 in a GAMING-ONLY card that underperforms and does not provide any other use?

The only reason I see is marketing and keeping up the hype.

And btw, for those people who do not like the liquid cooling solution, I should say: test first, write after. The noise level of the 690 (which is still amazing, btw) is higher than the liquid cooling of the 295X2. 

Gaming-wise I would still prefer the 295X2 even if it is dual-core. 

When Oculus or Steam decide to sell their products AND there is software with good support for VR, we will discuss again (1-2 years, I presume). Seriously though, SLI/CF works marvelously even for VR. Don't even bother reading comments telling you otherwise about the lack of fast communication between cards and lag. All of this is overhyped, and even if there is a speck of truth, the possible problems will have been long resolved before the CV1 comes along. All companies want you to have the best possible experience with 2 or 3 cards (overselling, yes).


----------



## Bjorn_Of_Iceland (Mar 18, 2015)

Animalpak said:


> 4K users here is your war horse.
> 
> There can be a 980 Ti at this point with 8 GB of RAM.
> 
> I think the RAM is the thing that makes the price so high and of course the name TITAN.


.. and of course... the brand nVidia slapped on it.


----------



## pokazene_maslo (Mar 18, 2015)

I wish for identical card with half the memory and half the price! 

@W1zzard great review as always! Thanks!


----------



## the54thvoid (Mar 18, 2015)

buggalugs said:


> People have short memories though. I don't have a problem with paying $1,000 for a card if it holds its value, but what happened last time? The Titan came out, then like 6 weeks later the 780TI came out with the same kind of performance for $300-$400 less.
> 
> Make no mistake, Nvidia are already building a 980TI that will perform like the TitanX for $300-$400 less money. With the 390X coming soon, Nvidia aren't going to rely on a $1,000 TitanX for the rest of 2015.
> 
> It seems people have short memories and are too quick to bend over for an ass reaming, I just don't see this type of card as being a good investment.



It was actually worse than what you're saying.

Feb/March 2013 = Titan
May 2013 = GTX 780
November 2013 = GTX 780 Ti
Feb 2014 = Titan Black
Mar 2014 = Titan Z

Five flagship products in one year. All were top gaming GPUs except the very much FAIL that was the Titan Z (except for compute users etc).



Sony Xperia S said:


> I smell an nvidia fanboy here.
> 
> Water cooling already means that your setup will be quieter.
> About the power consumption, titanic also takes quite large portion of energy, so I have no idea what energy savings you are dreaming about.
> ...



Dude, you need to go back to Xzibit's "pot - kettle - black" post and have a long hard read.  You are the most fervent AMD fanatic on this site right now.  _Obviously_ HumanSmoke is an Nvidia fanboy as he said this:



HumanSmoke said:


> Seconded. The card judging by the power consumption is on a very tight leash.* I think I'm with you - 390X* or 980 Ti Classified, and hopefully the vendors allow both cards and user to be unpacked from their cotton wool.



Your posts are a complete waste of space and the level of trolling from you is insurmountable.  And anyone can say that to me as well, but in my defence my post-to-thanks ratio is 2:1 (not shoddy, so I must be saying something right).

Sorry to say it how it is, but every logical thing put your way is ignored and you always come back with "Nvidia product whatever is moronic" or "Nvidia is evil".

We all hope the 390X is as good as the leaks.  I'm disappointed in the Titan X performance; it will need to be unlocked and flashed to perform properly, but at more power consumption than 980 SLI with less performance.  If you don't need the 12GB for 3D rendering, it's a bit of a pish card for its price, frankly.

But please, Sony, stop with the constant fanboy crap.


----------



## Captain_Tom (Mar 18, 2015)

RCoon said:


> No confirmed benchmarks. If the 390X really beat the Titan, AMD would have claimed it already to make Jen cry at his own launch. If it does beat it, it's probably going to do so in an extremely hot and flustered fashion. I'm also not keen on GPU's requiring an AIO for stock clocks.
> 
> My priority has always been silence and power consumption. AMD have never fit that profile.



The card will also launch with third party air cooling (It uses slightly less power than the 290X after all).  Also third party coolers like the Vapor-X are FAR quieter than the Titan X, and in fact this very review points out that the Titan X isn't even much quieter than a reference 290X.


----------



## chiboy04 (Mar 18, 2015)

that face! 660Ti owner


----------



## Casecutter (Mar 18, 2015)

Interestingly, if you go back and look at the 290X release data... average/peak power between the 7970 GHz and the 290X jumped 13/15%, while FPS performance (@2560x1600) bumped up 20-25% (quiet/uber). In perf/watt it was up between 5-20% (quiet/uber).

This GM200 shows a 23% FPS gain over the GTX 980 @1440p, while the average/peak power numbers are up around 42/32% respectively. The Titan X sees a perf/watt that actually drops ~10%!

Both appear maxed out against their respective thermal limitations, and both produce the same level of noise.  Both seem saddled with a cooler that appears "recycled" and isn't doing the job.  Nvidia seems to have delivered their Hawaii!  Though not nearly as efficient when taking the perspective of each in its respective era (the 290X was 1.5 years ago).

The difference is AMD always stated they'd offer it in "AIB custom models", in the end minimizing shortcomings and maximizing performance.  Will Nvidia do the same for the Titan X?  I think we need to apply at least the same (or a worse) conclusion as W1zzard did (8.8 vs 9.3!) as everyone did with the reference R9 290X… while AMD at least had the intention from the start that AIB customs would permit the card to be un-tethered.  Nvidia intends to keep the Titan X on a very tight leash... or until Nvidia's vendors can offer it as 980 Ti customs.

The 970's 3.5+0.5GB, the GTX 960 being ho-hum… and this. Is it me, or is this the proverbial "third swing and a miss"?
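The perf/watt drop follows directly from the two deltas in the post above (a quick check using the rounded review numbers quoted there):

```python
# Perf-per-watt change implied by the Titan X vs GTX 980 deltas.
fps_gain = 1.23      # +23% FPS @1440p
power_avg = 1.42     # +42% average gaming power
power_peak = 1.32    # +32% peak power

ppw_avg = fps_gain / power_avg - 1
ppw_peak = fps_gain / power_peak - 1

print(f"perf/watt vs 980 (average power): {ppw_avg:+.0%}")  # about -13%
print(f"perf/watt vs 980 (peak power):    {ppw_peak:+.0%}") # about -7%
```

The ~10% drop cited above lands between these two, depending on which power number you divide by.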


----------



## the54thvoid (Mar 18, 2015)

Casecutter said:


> Is it me, or is this the proverbial "third swing and miss"?



It's all of us but we're wrong (in an abstract way).

Though we're forgetting how much of a _non-jump_ Titan was over the GK104 (GTX680)

GTX 680 had 70% the perf of Titan

GTX 980 has 77% the perf of Titan X.

It's undoubtedly not as good a jump as Titan was over the 680.  And it lacks the DP.  It's got a much more limited appeal.  And it's not doing too well with the cooler.

All in all, it's a floppity flop to me.  I'm pretty sure a 980 Ti with the GM200 core, TDP-saving 6GB of memory and partner freedom will make a 10-15% faster variant which will be a good card to have.

I genuinely don't see how AMD can't top this in June.  The 390X (by the above charts) only has to be 40% better than the 290X to be a better option than Titan.  Go team Red; happy to see what they can do.
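Those two chart ratios convert into generational uplift like this (just arithmetic on the 70% and 77% figures above):

```python
# "Smaller card has X% of the big card's performance" -> big card's uplift.
def uplift(relative_perf):
    return 1 / relative_perf - 1

print(f"Titan   over GTX 680: {uplift(0.70):+.0%}")  # about +43%
print(f"Titan X over GTX 980: {uplift(0.77):+.0%}")  # about +30%
```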


EDIT:

Motherfucker.  Cheapest is £870 at OcUK.  Average is £900.  EVGA HC is £1200


----------



## 64K (Mar 18, 2015)

the54thvoid said:


> It's all of us but we're wrong (in an abstract way).
> 
> Though we're forgetting how much of a _non-jump_ Titan was over the GK104 (GTX680)
> 
> ...



I would bet that the R9 390X will beat this Titan X. Nvidia didn't set the bar high enough this time, imo. Possibly the gaming-only version of GM200 with a non-reference cooler will slip past the 390X, but probably not by much if it does.

Man they stick it to you guys in the UK. £870 to £900 is $1,280 to $1,325 US.


----------



## pioneer (Mar 18, 2015)

Sony Xperia S said:


> I smell an nvidia fanboy here
> 
> Water cooling already means that your setup will be quieter.
> About the power consumption, titanic also takes quite large portion of energy, so I have no idea what energy savings you are dreaming about.
> ...



And I'm sure AMD hired you and many like you before (except AMD isn't paying you, just pulling your leg  ). For what? For supporting AMD since the HD 2900 XT released, AND EVEN BEFORE:
1) supporting the HD 2900 XT despite its heat and lackluster performance against the mighty 8800 GTX of its generation
2) supporting the HD 5870 despite its extremely low geometry throughput (tessellation), and even the HD 5970 with its ultra-high stutter (the GTX 400 series now supports DX12; where is the HD 5000 series???????)
3) supporting the HD 6970 despite its very slow geometry throughput (tessellation) (the GTX 400 and GTX 500 series cards now support DX12; where are the HD 5000 and HD 6000 series with their crappy VLIW architecture???????????)
and likewise for the R9 290X's ultra-high temperatures and consumption, and so on for the R9 390X.

I love the R9 390X (the world's first nuclear GPU architecture: generates heat like a nuclear reactor core, consumes power like the CVN-77 Nimitz-class nuclear aircraft carrier, and is cooled by enthusiast liquid cooling )


----------



## Bob_Busfahrer (Mar 18, 2015)

First of all a big THX for the great and comprehensive review!!
I have a question regarding the performance of the R9 295X2 in Watch Dogs, Far Cry 4 and Ryse: Son of Rome: how did you get CrossFire to work???
I have 2 R9 290s in CF and there are no working CF profiles for those games available; there are many threads on the internet discussing this topic.
Especially those 3 games don't work properly with mGPU: there are many artifacts, negative performance scaling or other bugs.

So it would be really useful for many users to know how to get CrossFire to work. Would you please share the way you got it working?

Thanks in advance and keep up the great work!!


----------



## Casecutter (Mar 18, 2015)

the54thvoid said:


> All in it's a floppity flop to me.  I'm pretty sure a 980ti with GM200 core, a TDP saving 6GB memory and partner freedom will make a 10-15% faster variant which will be a good card to have.EDIT:
> 
> Motherfucker.  Cheapest is £870 at OcUK.  Average is £900.  EVGA HC is £1200


 
Exactly, I see it the same way and said so in the post on page two.


Casecutter said:


> Hmm, Maxwell's mojo didn't seem to scale... It's odd; the cooling solution should have been plenty to let it run free, but it seems like to curtail power consumption (aka heat) they rein in the clocks.  Isn't that what the thermal camera suggests? Then the "matches the Radeon R9 290X noise"... ouch!
> 
> As W1zzard said _"*So what's all the fuss about?* Roughly 50% more number-crunching muscle as the GTX 980, a 50% wider memory bus, and three times more memory."_ And from all that they gain a 23% FPS increase @1440p (not earth-shattering), while the average/peak power numbers are up around 42/32% respectively.  The impression I'm left with is that the excess memory contributes to both heat and power it doesn't make use of.
> 
> ...


 
It's not impressive; more like working from the motto... More Money than Brains!  I almost feel Nvidia brought out the information on Pascal as an attempt to steer the conversation. Create a diversion... Squirrel!


----------



## Sony Xperia S (Mar 18, 2015)

the54thvoid said:


> We all hope the 390X is as good as the leaks.



Oh, come on. Everyone knows how to spread empty words with no meaning. If I say that nvidia is evil, it's for a reason.

If AMD offers very good products but you still buy nvidia, which is reflected in the market share, then that is somewhat irrational.

The same happened with the Athlon 64: when it was the world's top processor, people still invested in Pentiums. What else can I say?

The same shit with Apple: a company worth $750B offers $1,500 iPhones which people embrace, having no clue that many other companies will give you the same for several times less money. It's nothing but evil!


----------



## the54thvoid (Mar 18, 2015)

Feel free to do the same to me Sony but from now on I'm 'ignoring' you.  I cannot see your posts - you are dust to me.

Let the logical discussions continue.



Casecutter said:


> Exactly, I see it the same and said it in the post on page two.
> 
> 
> It's not impressive; more working from the motto... More Money than Brains!  I almost feel with Nvidia has brought out the information on Pascal, as an attempt to steer the conversation. Create diversion... Squirrel!



For gaming I will upgrade once I see the 390X performance.  I want to go to 4K when Win10 is out (better desktop scaling, I believe?) so I kinda need >3GB of VRAM.  I just hope the 390X has a 980 Ti dueling partner so most of us have a good choice.


----------



## qubit (Mar 18, 2015)

@Sony Xperia S

Yeah, nvidia really knows how to milk the market and keep prices high. This is what you're calling "evil", I presume?

No. Perhaps you could call them greedy, which is my feeling on it and it pisses me off, but it's not evil. Every single company out there is out to make as much money as possible in any (generally legal) way they can, and they'll exploit every opportunity and loophole to get it. That's a fact and we all know it.

The difference with nvidia is that they're pretty shrewd and have the best products on the market by far, which allows them to play these games and get away with it. C'mon, if you were in the same position you'd do the same. That's a statement, not a question.


----------



## the54thvoid (Mar 18, 2015)

Here's a deal for everyone:

If the R390X is 50% faster than the 290X and beats any GTX 980ti 6GB variant by >10% (@ 4K res) I will give both my watercooled 780ti Classified cards away on the TPU forums.

Some of you will know I'm serious.


----------



## v12dock (Mar 18, 2015)

Ubersonic said:


> Where's the World of Warcraft bench?
> 
> (Before some fool who played the game on their GT6800 a decade ago rants about how it would be pointless, On max settings it can bring a GTX980 to it's knees, I was wondering if the TX can max the game at 4K).





v12dock said:


> I assume World of Warcraft was not in this review because of the changes in patch 6.1?
> 
> 
> W1zzard said:
> ...


----------



## qubit (Mar 18, 2015)

the54thvoid said:


> Here's a deal for everyone:
> 
> If the R390X is 50% faster than the 290X and beats any GTX 980ti 6GB variant by >10% (@ 4K res) I will give both my watercooled 780ti Classified cards away on the TPU forums.
> 
> Some of you will know I'm serious.


Wicked. I'll arrange a nice, rigged test and snaffle yer gear!


----------



## GhostRyder (Mar 18, 2015)

the54thvoid said:


> It's all of us but we're wrong (in an abstract way).
> 
> Though we're forgetting how much of a _non-jump_ Titan was over the GK104 (GTX680)
> 
> ...


If it's all hype and nothing more, they could, but the facts (limiting the voltage on the card, lower starting clocks than at least I expected, and a limit on the cooling performance) leave room for them to release something much better in the future.  I believe the 390X will come out and beat it by a decent enough margin, everything blows up, and then about a month later Nvidia will release the GTX 1080/980 Ti/whatever you want to call it, with clocks starting much higher to make up for the deficit while allowing a higher voltage for those going for the gold.  Since the core is actually full-fledged (unless I have missed something), the only way to go up is with clock and RAM speeds, which to me they are holding back on purpose so as not to show their hand until they know what to expect.



the54thvoid said:


> Feel free to do the same to me Sony but from now on I'm 'ignoring' you.  I cannot see your posts - you are dust to me.
> 
> Let the logical discussions continue.
> 
> ...


Join the club, I have been a part of it for a while .  It's much easier to ignore people and save the forums from constant arguments than to acknowledge it and turn a discussion off topic.
But yeah, RAM requirements have gone through the roof.  We need two cards to duel it out, otherwise the prices will not change, though for me, I am probably sitting this war out.


64K said:


> I would bet that the R9 390x will beat this Titan X. Nvidia didn't set the bar high enough this time imo. Possibly the gaming only version of GM200 with non-reference cooler will slip past the 390x but probably not by much if it does.
> Man they stick it to you guys in the UK. £870 to £900 is $1,280 to $1,325 US.


Yeah, I am actually quite shocked; after really reading up on the overall consensus in the reviews, it seems it's a great card but not quite as fast as we were all predicting, including the limited overclocking headroom (considering the GTX 980 and the like are such fantastic overclockers).



RCoon said:


> I like custom watercooling, it's AIO's on GPU's I dislike. It's a copout solution for a problem that shouldnt be there if a GPU is designed correctly and not just brute force. Plus GPU AIO's are ugly as hell.


Meh, everyone likes different looks, and some AIOs for GPUs can look pretty slick, including the NZXT cooler (least to me).  But either way, at the thermals we are already dealing with, fan cooling systems are getting harder and harder to design.  Even the Titan X hits its thermal limits on the Titan cooler that has been revered for so long.  The problem is how to revise the cooler enough to handle these loads without causing issues for another card or other components.  Blowing all GPU heat out of the case (or at least most of it) is still the way preferred by most, as we have seen what happens when that design is messed with (similar to the HD 7990 for instance, even though that's a dual-GPU card).

Either way, I think we all agree this has become a waiting game to see what comes out and how the market changes with prices.  I think patience is a virtue in this case.


----------



## Casecutter (Mar 18, 2015)

Looking at it another way, what makes it a *"Titan"*... Nvidia's de-contenting of double precision FP64 compute is what enabled an advance in perf/watt over the original Titan (26%), while holding the die size to a moderate 50 mm² (9%) increase over GK110, all respectable.

Interestingly, working from GM204 (same DP de-contenting), GM200 is right at 50% more GPU/die than GM204, so $550 + 50% = $825. I reckon that for moving up from a GM204, a 50% bigger die offering 23% more FPS,_ a lot of useless memory_, while only 10% less efficient, is still commendable.

I can see it being a fine enthusiast gaming card when it has 6GB and is ~$700.  When it shows up as a 980 Ti with a custom cooler that lets boost clocks run unfettered, we could see 10% more (or like a 35% improvement over the 980).  Then it would be a consumer offering.  As the Titan X it is unequivocally not the prosumer card the original GTX Titan at least was; today it's more of a collector's edition.

Edit: Looking at a couple of other reviews today, I might want to rethink W1zzard's experience.  Seems others don't find it hot, loud, or power hungry, while some are finding +30% over a 980 and even good OC'ing.   Was this a runt of the litter?
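As a sanity check, the die-scaling arithmetic above works out like this. All inputs are the post's own figures (the post rounds the scale to an even 50%, hence $825 rather than the exact result):

```python
# Inputs are the post's own figures: GTX 980 (GM204) launch price and die areas.
gm204_price = 550.0                    # USD
gm204_die, gm200_die = 398.0, 601.0    # mm^2

die_scale = gm200_die / gm204_die      # how much bigger GM200 is (~1.51x)
linear_price = gm204_price * die_scale # naive price if cost scaled with area

print(f"die scale {die_scale:.2f}x -> linear price ${linear_price:.0f}")
```

Rounding the scale to 1.5x gives the post's $825; the exact areas give slightly more, and as noted further down the thread, real costs scale worse than linearly because yield falls with die size.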


----------



## N3M3515 (Mar 18, 2015)

64K said:


> I would bet that the R9 390x will beat this Titan X. Nvidia didn't set the bar high enough this time imo. Possibly the *gaming only version of GM200* with non-reference cooler will slip past the 390x but probably not by much if it does.
> 
> Man they stick it to you guys in the UK. £870 to £900 is $1,280 to $1,325 US.



I thought this was the gaming only version 
I guess it's the 980 Ti; the overclocked versions of it could hypothetically be faster than a stock 390X.


----------



## HumanSmoke (Mar 18, 2015)

Casecutter said:


> As the TitanX it is unequivocally not the prosumer card the original GTX Titan at least had going for it, more today just a collector edition.


It will probably fare about the same. CG rendering is now aimed squarely at 4K and up - most GPU rendering that actually works is CUDA based. 6GB of framebuffer seems like the minimum entry fee for these higher resolutions. I'd also note that, lost amongst the hue and cry of gaming forums, the card is front and centre in a new and burgeoning area of development - the deep learning neural network that was intro'd to a wider audience at CES. Judging by the non-gaming graphics parallel computing forums, there is a fair bit of interest in the board by developers eager to get in on the ground floor of the technology. Much like GPGPU co-processing when it first arrived eight years ago it will probably be dismissed as superfluous, until it becomes ubiquitous and a substantial area of revenue.


Casecutter said:


> Looking at another way what makes it a *"Titan"*... Nvidias' de-contenting of double precision FP64 compute, is what enabled an advance in Perf/watt over original Titan (26%), while holding the die size at a moderate 50mm2 (9%) increase over the GK110  all respectable.


To make full use of the FP64 if it had been included would likely have required a GK200 GPU on the very cusp of manufacturability (if that), at around 650mm². GK110 required another 50mm² added to it (doubling of register and cache resources) making the GK210 the same size as GM200 to tailor it to double precision workloads - and that is without adding anything in the way of improvements for gaming usage scenarios - what Tom Petersen was alluding to in the earlier video links.
Nvidia had to sacrifice FP64 for gaming performance and a manufacturable GPU, because power demand and process node are against shoehorning everything into a single die. It is also why the GK210 was tailored for FP64 work and won't be offered as a consumer GPU, since it adds little or nothing for gaming over GK110.


Casecutter said:


> Interestingly work from GM204 (same DP de-contenting), the GM200 is right at 50% more GPU/Die than the GM204, so $550 +50%= $825.


Except that doesn't take into account wafer yield and the effects of silicon defects. Scaling a GM204-sized die up to a GM200-sized one loses roughly one third of the die candidates on geometry alone, and on top of that, the larger the die, the larger the loss of parts to defects.





The die calculation is a quick approximation: 19.95 mm × 19.95 mm for 398 mm² (GM204), and 24.52 mm × 24.52 mm for 601 mm² (GM200).
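A rough sketch of that yield point, purely illustrative: assuming a 300 mm wafer and a made-up defect density of 0.1 defects/cm² (neither figure comes from the thread), a simple Poisson defect model shows both effects at once:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Approximate die candidates on a round wafer, with a standard edge-loss correction."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defect_density_per_cm2=0.1):
    """Fraction of candidates free of killer defects (simple Poisson model)."""
    return math.exp(-defect_density_per_cm2 * die_area_mm2 / 100)

for name, area in [("GM204", 398), ("GM200", 601)]:
    candidates = dies_per_wafer(area)
    good = candidates * poisson_yield(area)
    print(f"{name}: {candidates} candidates, ~{good:.0f} good dice per wafer")
```

With these assumptions GM200 gets roughly a third fewer candidates than GM204 on geometry alone, and the exponential yield term then discards a larger share of those candidates to defects, which is the double penalty described above.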


----------



## xorbe (Mar 18, 2015)

This is supposed to be a Titan X BIOS ROM from OCN, if anyone wants to load it up and have a look.
Courtesy of Fallendreams
http://www.overclock.net/t/1546747/nvidia-geforce-gtx-titan-x-owners-club/320#post_23684122


----------



## Dia01 (Mar 19, 2015)

Recently listed for $1499.00 here in Australia, wow.  Not that I need it, but SLI 970s definitely seems the better option.
http://www.pccasegear.com/index.php?main_page=product_info&products_id=31311


----------



## HumanSmoke (Mar 19, 2015)

Dia01 said:


> Recently listed for $1499.00 here in Australia, wow.  Not that I need it, but SLI 970s definitely seems the better option.
> http://www.pccasegear.com/index.php?main_page=product_info&products_id=31311


Bargain! $2K across the ditch ( $1931 in your dollars)


----------



## RealNeil (Mar 19, 2015)

This is truly a tribute to excess. I want one now, but I'll probably never shell out for it.

BTW: good review.


----------



## Serpent of Darkness (Mar 19, 2015)

ZoneDymo said:


> very very disappointing overall card



No shet!  If the GTX Titan X Maxwell has 50% more CUDA cores than a GTX 980 and doesn't pull around 50% or more FPS over a GTX 980 at all resolutions, it's not worth it...  The 12 GB framebuffer is the only thing worth paying for in the GTX Titan X Maxwell.  It's overkill, but if you're into Surround or 4K, even though the GPU is going to struggle, you might be able to crank up the AA a little with the 12 GB of VRAM. 

In addition, a lot of people don't notice it, but the FPS performance of the R9 295X is basically what the R9 390X is going to pull, or better, at the TDP of an R9 290X, for half the price.  What would really surprise me, and I doubt it will happen: I've seen some images suggesting the R9 390X base clock is at 1.4 GHz.  If that's true, I'd be impressed...

So ty W1zzard for putting things into perspective...  Wish you could provide benches of the GTX Titan X Maxwell in SLI, and some frame-time variance curves to see how she dips with G-Sync and without.
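For what it's worth, the frame-time variance curves requested here boil down to simple statistics over a frame-time log; a minimal sketch with made-up numbers (the log values are purely illustrative, not measurements):

```python
# Hypothetical frame-time log in milliseconds; values are made up for illustration.
frame_times = [16.7, 16.9, 33.4, 16.8, 17.1, 16.6, 45.0, 16.7, 16.9, 17.0]

mean_ms = sum(frame_times) / len(frame_times)
avg_fps = 1000.0 / mean_ms

# High variance means visible stutter even when the average FPS looks fine.
variance = sum((t - mean_ms) ** 2 for t in frame_times) / len(frame_times)

# The worst ~1% of frames is what G-Sync is meant to smooth over.
p99_ms = sorted(frame_times)[int(0.99 * len(frame_times))]

print(f"avg {avg_fps:.1f} FPS, variance {variance:.1f} ms^2, 99th pct {p99_ms:.1f} ms")
```

Two cards with the same average FPS can have very different variance and 99th-percentile numbers, which is why reviewers plot the whole curve rather than a single figure.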


----------



## HumanSmoke (Mar 19, 2015)

Looks like Inno3D is outfitting Titan X with the hybrid cooler used on their GTX980/970 models.


----------



## Sunfire Cape (Mar 19, 2015)

Casecutter said:


> Looking at another way what makes it a *"Titan"*... Nvidias' de-contenting of double precision FP64 compute, is what enabled an advance in Perf/watt over original Titan (26%), while holding the die size at a moderate 50mm2 (9%) increase over the GK110  all respectable.
> 
> Interestingly work from GM204 (same DP de-contenting), the GM200 is right at 50% more GPU/Die than the GM204, so $550 +50%= $825. I reckon for moving up from a GM204; 50% bigger die, *offering 23% more FpS*,_ a lot of useless memory_, while only 10% less efficient is still commendable.
> 
> ...


Did you mistype or am I missing something?
From the performance summary, we can see Titan X performance is considered 100% and GTX 980 performance is 77% at 1440p,
--> GTX 980 performance = 0.77 Titan X performance
--> Titan X performance = (1/0.77) GTX 980 performance ~ 1.2987 GTX 980 performance
That means the Titan X offers ~29.87% more FPS at 1440p. So how the heck did you calculate the number 23%, dude?

In summary, we can say that at 1440p the GTX 980 is 23% slower than the Titan X if we take Titan X performance as the baseline, and the Titan X is ~29.87% faster than the GTX 980 if GTX 980 performance is the baseline. Why did you collapse these two baselines into Titan X performance only? Did you do that intentionally?
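The two baselines are easy to mix up; made explicit with the review's 1440p summary numbers:

```python
titan_x, gtx_980 = 100.0, 77.0   # relative performance at 1440p

# Baseline = Titan X: how much slower is the GTX 980?
slower_pct = (titan_x - gtx_980) / titan_x * 100   # 23%

# Baseline = GTX 980: how much faster is the Titan X?
faster_pct = (titan_x - gtx_980) / gtx_980 * 100   # ~29.87%

print(f"GTX 980 is {slower_pct:.0f}% slower; Titan X is {faster_pct:.2f}% faster")
```

Both statements describe the same gap; only the baseline differs, which is why quoting "23%" as the Titan X's lead understates it.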


----------



## Dia01 (Mar 19, 2015)

I've just had that discussion and I agree as well; to be correct, the Titan X should be stated as approximately 29.87% faster than the 980.  As far as "intentionally" messing them up, I highly doubt that would be the case.


----------



## Frick (Mar 19, 2015)

the54thvoid said:


> Here's a deal for everyone:
> 
> If the R390X is 50% faster than the 290X and beats any GTX 980ti 6GB variant by >10% (@ 4K res) I will give both my watercooled 780ti Classified cards away on the TPU forums.
> 
> Some of you will know I'm serious.



They would go nicely with my Core 2 Duo.


----------



## jihadjoe (Mar 19, 2015)

12GB is good! I mean Nvidia certainly won't be lowering the price of the Titan X even if they cut the VRAM down to 6, or 8GB so the extra VRAM translates directly into more value for the end-user.


----------



## arbiter (Mar 19, 2015)

64K said:


> What in the world are you doing calling someone else a fanboy? Why are you even reading and posting on an Nvidia GPU thread? Silly.



Cause AMD hasn't had anything noteworthy in so long, they've got nothing better to do but troll Nvidia release comments.



Captain_Tom said:


> The card will also launch with third party air cooling (It uses slightly less power than the 290X after all).  Also third party coolers like the Vapor-X are FAR quieter than the Titan X, and in fact this very review points out that the Titan X isn't even much quieter than a reference 290X.



Can't say it uses less power than a 290X; there have been zero numbers on its TDP. Can't go by power connectors with AMD anymore, as they used 2x 8-pin to power a card that can draw over 600 watts. The fact they went with an AIO water cooler tells me the card is 300+ watts, could even be in the 350-400 range if they had to go down the road of that kind of cooler. It's all speculation at this point as to power draw. I doubt it will use less power than a 290X, though, given the increase in the number of GCN cores.


----------



## Yellow&Nerdy? (Mar 19, 2015)

Anyone who buys this card for gaming is still pretty dumb. This card is clearly meant for other purposes, but Nvidia is branding it in a way so that gamers with too much money might still buy it. Just shows that GTX 970 SLI is still probably the ideal setup for 4K gaming at the moment.


----------



## Dia01 (Mar 19, 2015)

HumanSmoke said:


> Bargain! $2K across the ditch ( $1931 in your dollars)



Jeepers, that's considerably worse.  You blokes have a little more GST on goods over there too don't you?


----------



## Ubersonic (Mar 19, 2015)

the54thvoid said:


> GTX 680 had 70% the perf of Titan
> GTX 980 has 77% the perf of Titan X.



It is kind of worth remembering though that by the time the Titan launched, AMD had sorted the HD 7970's drivers and released the revised GHz BIOS.  So the GTX 680 was dueling its main rival on performance while having less VRAM (something that has affected its lifespan), whereas the GTX 980 is beating its closest rival (290X) and has equal VRAM.

My point is that the Titan X may not be as impressive compared to the GTX 980 as the Titan was compared to the GTX 680, but that's mostly due to the GTX 980 being more impressive than the GTX 680 was.


----------



## Slizzo (Mar 19, 2015)

Anybody get theirs yet? A guy I know bought one on release day and received it yesterday. He bought direct from nVidia.


----------



## Casecutter (Mar 19, 2015)

Sunfire Cape said:


> Did you mistype or am I missing something?
> So, how the heck did you calculate to get the number 23%, dude?
> Why did you mess up these two baselines into Titan X performance only? Did you do that intentionally?


Thank you Sunfire Cape, sincerely 
Regrettably *I failed...* In a flurry of typing the original post (#45), I just clicked back to W1zzard's summary and then _failed_ to roll the figure up to the right percentage.  I later carried that same "faux pas" over to post #127, as I had referred back to #45; by then the number was locked in my head.  No excuse, no malice, and the corrected figure is more in line with other review data.

As to other reviews (as said above in the edit), it seems to me that other reviewers aren't reporting the heat, noise, or even power numbers W1zzard's sample is producing.  The way W1zzard described it, the cards were "passed out" from a pool with no one reviewer destined for a given sample; it appears to be luck of the draw.  That said, I would still believe they were run through their paces just to ensure they fall within the normal distribution of the bell curve.




HumanSmoke said:


> Except that doesn't take into account the yield of the wafer and the effects of silicon defects. Scaling a GM 204 sized die up to a GM 200 sized one nets the approximate loss of one third of the die candidates, but it doesn't take into account that the larger the die the larger the loss of parts from defect.


Excellent information, and you are wholeheartedly correct; the simplistic figure I included doesn't tell the entire story about how many candidates can be harvested, or the added risk that defects pose to a die that yields fewer candidates because of its physical size.


----------



## HumanSmoke (Mar 19, 2015)

Dia01 said:


> Jeepers, that's considerably worse.  You blokes have a little more GST on goods over there too don't you?


15%, although sometimes it feels like they charge GST on GST!
Wise man say buy from overseas vendor who hones their creative writing skills on customs declarations.


Casecutter said:


> As to other reviews it (as said above in the Edit) seem to me that other reviewer aren't reporting the heat, noise, or even power number W1zzards sample is providing.


All depends upon the testing application, and if it is a game, what workload the card and CPU are under. Any throttling and/or CPU utilization will skew the result, as will the choice of game.




[Source]


----------



## radrok (Mar 19, 2015)

Aww yes, the cheapest Titan X here is 1249 Eur.

I could have been OK with a 999 Eur price; anything more and it's gonna stay where it is.


----------



## xorbe (Mar 19, 2015)

radrok said:


> Aww yes, cheapest Titan X here is 1249 Eur.



Ouch.


----------



## buildzoid (Mar 20, 2015)

radrok said:


> Aww yes, cheapest Titan X here is 1249 Eur.
> 
> Could have been ok with 999 Eur price, anything more it's gonna stay where it is.


Same here, a TITAN X is 32,500 CZK or more, so that's about $1,275.


----------



## Sony Xperia S (Mar 20, 2015)

qubit said:


> @Sony Xperia S
> 
> Yeah, nvidia really know how to milk the market of money and keep the prices high. This is what you're calling "evil" I presume?



No!

Money is not the only issue; it has never been only that. Multiple actions and, in general, their imperial attitude. It isn't so simple to explain.

Actually, they do not do anything special; it is just some strange symbiosis between them and people. Given that many people in general are bad, then.......



qubit said:


> The difference with nvidia is that they're pretty shrewd and have the best products on the market by far



Which one would you prefer: the R9 295X2 for $624, or this new Titan X for double the price and still lower performance?

http://www.newegg.com/Product/Produ...&cm_re=radeon_r9_295x2-_-14-131-584-_-Product

Remember, the R9 295X2 is still the fastest graphics card, and the Titan did nothing but close the gap.


----------



## qubit (Mar 20, 2015)

It sounds like you're talking about nvidia's ethics then? Yeah, there's a question mark over that too. The recent 970 scandal is a good example.

I wouldn't take either card and as I've said elsewhere in this thread, I'm disappointed with the Titan X in a number of ways. Also, you're comparing a dual GPU card with a single GPU one, but it takes an incredibly powerful single GPU to overcome a dual one, which normally takes a couple of generations from either company. Also, there's significant value in having a single GPU card of either brand, because it doesn't have issues such as microstuttering (usually), multi-GPU scaling, plus lower power consumption and heat output and the frame buffer isn't effectively halved in size. Hence, your example isn't valid.


----------



## Ruby Rabbit (Mar 20, 2015)

Great review!! Thank you, @W1zzard! You are the only site that offers such a wide variety of GPU benchmarks. For me, I wanted to see the GTX 690 benchmark comparison. Thanks.

If I can make a suggestion: I cannot find a site that does benchmarks @ 3440x1440. The ultrawide is a popular res now. I had a 4K monitor and liked it for photos, but it was terrible for gaming (the GPU chokes). So I went with a Dell U3415W. The immersion for gaming is frankly the best I have experienced: 34" @ 3440x1440. Personally I think it is the sweet spot, with less of a GPU hit than 4K and only a few scaling issues. In fact it's just amazing all round. (First time I've watched videos on my PC, and now I am loving it.)

So if you could do a Titan X benchmark @ 3440x1440 you would be the first!!! Hopefully it brings some new fans. Your great review and GPU spectrum got me to sign up!

Now decision time: will the Titan X get me back to full ultra glory @ 3440x1440? I think it will (minus the stuttering I get on the GTX 690 due to the 2GB VRAM).


----------



## the54thvoid (Mar 20, 2015)

Ruby Rabbit said:


> If I can make a suggestion. I cannot find a site that does benchmarks @ 3440x1440. *The ultra wide is a popular res now*.



Without sounding too cheeky, no it isn't.  It's quite a rare res to use, and it shows in the sales figures and in how many 3440x1440 monitors are out there.   A tremendous number of users are still on 1080p, not even 1440p.

I'm sure someone who isn't about to go to work can find the stats!


----------



## Sony Xperia S (Mar 20, 2015)

This Titan X is not going to sell, I think not at all.

Look at what Nvidia is causing in the market:

*Global graphics card shipments in 1Q15 to fall 20-25%, say Taiwan vendors*

http://digitimes.com/news/a20150319PD219.html

Can you even imagine how serious this is?!

As someone correctly said: _they would prefer to keep the inventories rotting in landfills rather than selling them for margins lower than 50%._


----------



## Animalpak (Mar 20, 2015)

The 999 dollar price is a lie.

Here in Switzerland it's priced at 1350 dollars, and the Gigabyte one is 1690 dollars.

And for 1890 dollars you get the Titan Z.


----------



## qubit (Mar 20, 2015)

Sony Xperia S said:


> This Titan X is not going to sell, I think not at all.


I think you might be right there. Think about it: the previous versions had full-power FP64 capability to differentiate them from the GTX version, while this doesn't. Slapping a ridiculous 12GB of RAM on it to justify the inflated price just won't cut it.

UK prices range from about £870 to £950. Got mine on pre-order now.


----------



## haswrong (Mar 20, 2015)

the positive side is the 12 gigs of vram. i welcome that attitude. this removes any current and near-future limitations in the area and makes it ready for recursive raytracing techniques. but the lack of dp can't justify 999 bucks.. in my opinion. if it's a gamer's card, give it a gamer's price tag.. or maybe i'm becoming too old for this gpu world, heh.. seems it's all about money, but i'm not. cheers.


----------



## bpgt64 (Mar 20, 2015)

Am I insane to want to drop two GTX 980s for this?  I am sick of SLI not working in games like DayZ, but I want to keep G-Sync/4K.  I am willing to give up Ultra for High settings with no AA to get this.


----------



## qubit (Mar 20, 2015)

bpgt64 said:


> Am I insane to want to drop two GTX 980s for this?  I am sick of SLi not working on games like DayZ, but I want to keep Gsync/4k.  I am willing to give up Ultra for High settings no AA to get this.


Yes, you are! 

Wait for the GTX version with a much more reasonable price and get that.


----------



## 64K (Mar 20, 2015)

bpgt64 said:


> Am I insane to want to drop two GTX 980s for this?  I am sick of SLi not working on games like DayZ, but I want to keep Gsync/4k.  I am willing to give up Ultra for High settings no AA to get this.



Well, most people, including W1zzard, say AA at 4K isn't necessary.
I agree with qubit. Wait a few months for the gaming-only version of GM200; it will almost certainly be faster. Then when Big Pascal GP200 comes out, grab one of those and sell the Maxwell, and you should be set for single-GPU 4K.


----------



## bpgt64 (Mar 20, 2015)

64K said:


> Well, most people including W1zzard say AA on 4K isn't necessary.
> I agree with qubit. Wait a few months for the gaming only version of GM200. It will almost certainly be faster. Then when Big Pascal GP200 comes out grab one of those and sell the Maxwell and you should be set for single GPU at 4K.



See, the thing is that this is the gaming-only version... It doesn't have the same double precision as the original Titan, so its applicability as a Tesla replacement is limited, other than the massive framebuffer.


----------



## 64K (Mar 20, 2015)

bpgt64 said:


> See the thing is that is the gaming only version...It doesn't have the same Double Prec point as the original titan.  So it's applicability to being a Tesla card replacement is limited, other than the massive frame buffer.



I know what you're saying, and this very well could turn out to be the high-end Maxwell gaming card. I don't think so, and here's why. There could be another card design with 6 GB VRAM. It would cost less to manufacture and so could be sold for less. It would use fewer watts, and those watts could be used to bump up the clocks within the same 250 watts. Allowing its board partners to use non-reference coolers would keep the GPU from throttling after 1 minute of gameplay and offer greater overclocking potential. And finally, it will be ~1 year before Pascal, and I don't see Nvidia waiting that long before another high-end release besides the probable dual-GPU Titan, but all of this is just speculation on my part.


----------



## Sony Xperia S (Mar 20, 2015)

qubit said:


> Wait for the GTX version with a much more reasonable price and get that.





64K said:


> Wait a few months for the gaming only version of GM200.



What do you think that GTX would be called? GTX 980 Ti?

Well, I think NOO!



64K said:


> Then when Big Pascal GP200 comes out grab one of those and sell the Maxwell and you should be set for single GPU at 4K.



First, GP100 and then possibly GP200.


----------



## bpgt64 (Mar 20, 2015)

64K said:


> I know what you're saying and this very well could turn out to be the high end Maxwell gaming card. I don't think so and here's why. There could be another card design with 6 GB VRAM. It would cost less to manufacture and so it could be sold for less. It would use less watts and those watts could be used to bump up the clocks for the same 250 watts. Allowing it's board partners to use non-reference coolers would keep the GPU from throttling after 1 minute of game play and offer greater overclocking potential. And finally, it will be ~1 year before Pascal and I don't see Nvidia waiting that long before another high end release besides the probable dual GPU Titan but all of this is just speculation on my part.



There's always going to be a better card in the future; that's... kind of inevitable.  There likely will be a 980 Ti, but my bet is it won't have the same CUDA core count as the Titan. It might have, if this Titan X was going to wind up being like the first one, a cheap Tesla version, but the change in DP value signifies to me that they won't easily cannibalize this card like they did with the 780 Ti.  A 980 Ti to me is much more likely to be a 256-bit bus with 8GB VRAM, a midway point between the Titan X and 980 in terms of core/ROP count.


----------



## qubit (Mar 20, 2015)

Sony Xperia S said:


> How do you think that GTX would be called? GTX 980 Ti?
> 
> Well, I think NOO!


I have no idea what they will call it, other than it will likely start with "GTX". What do you think they will call it?


----------



## Sony Xperia S (Mar 20, 2015)

qubit said:


> I have no idea what they will call it, other than it will likely start with "GTX". What do you think they will call it?



I think there should be no other GTX.

When the R9 390X appears, its competition performance-wise will be the Titan X itself.
If you unveil a castrated 980 Ti to fill the gap between the Titan X and the 980, it will not be enough for anything.


----------



## 64K (Mar 20, 2015)

Sony Xperia S said:


> How do you think that GTX would be called? GTX 980 Ti?
> 
> Well, I think NOO!
> 
> ...



I would be surprised if Nvidia called the gaming version of GM200 a GTX 980 Ti. That could cause people to think of it as a faster 980, which it's not. The 980 is an upper-mid-range GM204 Maxwell GPU, not a high-end GM200 Maxwell GPU. That's probably why Nvidia didn't call the 780 a 680 Ti. 
Yes, GP100 will probably come first, but for bpgt64 to run a single GPU at ultra ~60 FPS at 4K in almost all games, the GP200 would be the best choice.



bpgt64 said:


> There's always going to be a better card in the future, that's..kinda of inevitable.  There likely will be a 980 Ti, but my bet is it's not the same Cuda core count as Titan, It might have been, if this Titan X was going to wind up being like the first one, where it was a Cheap Tesla version, but the change in DP value to me signify they won't easily cannibalize this card like that did with 780 Ti.  A 980 Ti to me is much more likely to be a 256mb bus with 8gb vram, but a mid way point between the Titan X and 980 in terms of core/rop count.



Well, the gaming version of Kepler GK110, the GTX 780 Ti, had more cores than the original Titan and was faster. We'll see in a few months, but I'm thinking it will have the same number of cores with higher clocks.
I know there's always going to be something better on the horizon. What I was getting at is that GP200 Pascal will probably be the first single GPU that can handle 4K at Ultra settings ~60 FPS in almost all games.


----------



## Sony Xperia S (Mar 20, 2015)

64K said:


> I would be surprised if Nvidia called the gaming version of GM200 a GTX 980 Ti. That could cause people to think of it as a faster 980 which it's not. The 980 is an upper mid range GM204 Mawell GPU and not a high end GM200 Maxwell GPU. That's probably why Nvidia didn't call the 780 a 680 Ti.
> Yes GP100 will probably come first but for bpgt64 to run a single GPU at ultra ~60 FPS on 4K in almost all games then the GP200 would be the best choice.
> 
> 
> ...




Sorry. The gaming version of GM200 is already the Titan. It's useless for professional and compute tasks.


----------



## bpgt64 (Mar 20, 2015)

Sony Xperia S said:


> I think there should be no other GTX.
> 
> When R9 390X appears, its competition performance wise will be Titan X itself.
> If you unveil a castrated 980 Ti to fill the gap between Titan X and 980, it will not be enough for anything.



I am all-in with Nvidia at the moment: an Acer XBO268HK 28-inch 4K monitor with G-Sync.  So that's not an option, even if it's faster/cheaper.  G-Sync is way too sexy, and I'd have to buy another monitor.


----------



## xorbe (Mar 20, 2015)

Sony Xperia S said:


> This Titan X is not going to sell, I think not at all.



I do note that there is a lot less Titan X forum chit chat (various forums) in general than when Titan debuted. OTOH, it's a known thing this time around.


----------



## haswrong (Mar 20, 2015)

qubit said:


> I have no idea what they will call it, other than it will likely start with "GTX". What do you think they will call it?


here's an idea: BFG Titan XXX http://www.overclockers.co.uk/showproduct.php?prodid=GX-098-BG&groupid=701&catid=1914&subcat=1576


----------



## Sony Xperia S (Mar 20, 2015)

haswrong said:


> heres some idea: BFG Titan XXX http://www.overclockers.co.uk/showproduct.php?prodid=GX-098-BG&groupid=701&catid=1914&subcat=1576



April 1st? 

They should have called it XXL with 16.9 GB.


----------



## the54thvoid (Mar 20, 2015)

xorbe said:


> I do note that there is a lot less Titan X forum chit chat (various forums) in general than when Titan debuted. OTOH, it's a known thing this time around.



Well, this is third time around, Titan, Black (I suppose Z too) and now X.

Also, 390X leaks have gone too.  I genuinely would like to see more.

As for bpgt64, I'd stick with the 980s till summer and make a better choice then.  I've learned not to buy new top-line NV stuff, but it's taken years to avoid the bug.


----------



## HumanSmoke (Mar 20, 2015)

Sony Xperia S said:


> This Titan X is not going to sell, I think not at all.
> Look at what nvidia does cause to the market:
> *Global graphics card shipments in 1Q15 to fall 20-25%, say Taiwan vendors*
> http://digitimes.com/news/a20150319PD219.html
> Can you even imagine how serious this is?!


More serious for some than others. Nvidia's present market share is 76% of the discrete market. That is tipped to rise to 80% in the present quarter. A big part of the decrease in shipments is because AMD have stopped shipping many SKUs to AIB/AIC/OEMs
*AMD stops shipping chips as bloated channel begs 'Please, no more'*

The previous two quarters were buoyed by sales of the then new GTX 960, 970, and 980. That sales push is now past, so it should be expected that with no new volume parts being launched, shipments should fall back.


----------



## bpgt64 (Mar 20, 2015)

the54thvoid said:


> Well, this is third time around, Titan, Black (I suppose Z too) and now X.
> 
> Also, 390X leaks have gone too.  I genuinely would like to see more.
> 
> As for BPGT64, I'd stick with 980's till Summer, make a better choice then.  I've learned to not buy new top line Nv stuff but it's taken years to avoid the bug.



I know very well it could be a big mistake, but that's what buying tech always entails. My thing is, I never use AA at 4K. I almost didn't buy a second 980, and was willing to use one at medium settings. This proves to me that I could get away with one card for a very long time at 4K, as DX12 will need years to mature.


----------



## Slizzo (Mar 20, 2015)

Sony Xperia S said:


> No!
> 
> Money is not a value. But it has never been only this. Multiple actions and in general their imperial attitude. It isn't so simple to explain.
> 
> ...



The fact that the Titan X gets close with just ONE GPU vs. the 295X2 with two is telling.


----------



## 64K (Mar 20, 2015)

Slizzo said:


> The fact that Titan X gets close with just ONE GPU vs. 295X2 with two is telling.



Not a fair comparison. The R9 295X2 is last-generation architecture and the Titan X is new-generation architecture. Compare a single-GPU 390X (when released) to a single-GPU Titan X, or a dual-GPU 390X setup to a dual-GPU Titan X setup, for a fair comparison.


----------



## HumanSmoke (Mar 20, 2015)

64K said:


> Not a fair comparison. The R9 295x2 is last generation architecture and the Titan X is new generation architecture.


GCN 3rd generation (PDF of the GCN3 ISA) doesn't look much different from the previous two generations. If you're expecting radical changes you'll be disappointed: HSA optimization and colour compression (as per Tonga) seem to be about it.


64K said:


> Compare a single-GPU 390X (when released) to a single-GPU Titan X, or a dual-GPU 390X setup to a dual-GPU Titan X setup, for a fair comparison.


From a consumer standpoint it's a case of "run what you brung". You can neither compare nor buy unreleased parts, as a general rule. When AMD dropped the 295X2, reviews (including TPU's) were only too happy to benchmark the card against single-GPU competition from a previous generation; what other option is there when AMD and Nvidia dovetail graphics releases so that each has a bite of the cherry? This particular case isn't much different from the HD 7970 launch. People knew that the GTX 690/680 were launching a couple of months later, but the card was ranked on and compared to what was available at the time (GTX 580/590, HD 6970/6990).


----------



## 64K (Mar 20, 2015)

I made those statements on a tech site, not for general public perusal.

We all understand the game between Nvidia and AMD for dominance. You know you can't expect each side to play its hand at the same time, so we have to allow a window of a few months to have a chance of making fair comparisons of new architectures.


----------



## mroofie (Mar 21, 2015)

the54thvoid said:


> Here's a deal for everyone:
> 
> If the R390X is 50% faster than the 290X and beats any GTX 980ti 6GB variant by >10% (@ 4K res) I will give both my watercooled 780ti Classified cards away on the TPU forums.
> 
> Some of you will know I'm serious.


Would you kindly give them to me 
lol


----------



## Sony Xperia S (Mar 21, 2015)

HumanSmoke said:


>



From Q4 2010 to Q4 2014 AMD has lost ~60%. Wow. Very strange graph with all those spikes... What caused the increase in sales for Nvidia from Q2 to Q3 2012? What caused the rapid decrease in their sales from Q1 to Q2 2014? 

AMD has been in constant decline, with some stable periods from Q4 2011 to Q3 2012.

This trend is very alarming. If AMD continues to lose so many sales, they will inevitably disappear at some point.

My hope is that the 20-25% decline in sales falls entirely on Nvidia's side. 



mroofie said:


> Would you kindly give them to me
> lol



Egoism. One can be for you and the other for another lucky person.


----------



## the54thvoid (Mar 21, 2015)

mroofie said:


> Would you kindly give them to me
> lol



That system will need to be worked out, but really, if the 390X is awesomely good, as in my stated 50% above the 290X (I've now got doubts creeping in though), I WILL give them away (minus postage).

The only certain criterion I can envisage is that you're* not a dick 

*group noun, not as in 'you'


----------



## Sony Xperia S (Mar 21, 2015)

the54thvoid said:


> 390X.... 50% above 290X



Someone should tell this guy to pack up his cards for giving away.

AMD works hard on the drivers for these cards, and the expected performance gains are in the range of +65%.


----------



## HumanSmoke (Mar 21, 2015)

Sony Xperia S said:


> What did cause the increase in sales for nvidia from Q2 to Q3 2012?


A corresponding lack of sales from AMD


> AMD’s quarter-to-quarter total shipments of desktop heterogeneous GPU/CPUs, i.e., APUs dropped 30% from Q2 and 4.7% in notebooks. The company’s overall PC graphics shipments slipped 10.7%.
> 
> Nvidia’s quarter-to-quarter desktop discrete shipments jumped 28.3% from last quarter; and, the company’s mobile discrete shipments were up 12%, which is impressive in a down market. The company’s overall PC graphics shipments increased 19.6%.





Sony Xperia S said:


> What caused the rapid decrease in their sales from Q1 to Q2 2014?


Probably channel partners and OEM contracts running down inventory of EOL'ed Kepler cards in preparation for Maxwell.


Sony Xperia S said:


> This trend is very alarming. If AMD continues to lose so much sales, they will inevitably disappear at some point.


Unlikely, although AMD's APUs are eating a disproportionate share of their low/mainstream discrete market. Nvidia card buyers are more likely to pair the card with an Intel system, and with Intel's IGP being less competitive, Nvidia's lower end cards survive in greater numbers.


Sony Xperia S said:


> Hope is that those 20-25% decline in sales are all nvidia's part.


Again, unlikely. Nvidia's volume markets are favoured by OEMs. When the consumer market falters, these entry/mainstream level pre-builts underpin a certain continuity of sales. For that to change, AMD would need to vulture some significant OEM discrete contracts.
As has been proven many times in the past, even having a dominant architecture with no significant competition ( such as AMD's Evergreen series...DX11, GDDR5, excellent power/perf. vs. old G92/G94 based Nvidia products), doesn't guarantee market share or sales.


----------



## 1d10t (Mar 22, 2015)

So the Titan X, or as I would prefer to name it, the Titan milX edition, is out 
Hilarious performance for a $999 card: beaten by their own scandalous GTX 970 SLI, and by the year-old, power-hungry R9 295X2.


----------



## Slizzo (Mar 22, 2015)

1d10t said:


> So the Titan X, or as I would prefer to name it, the Titan milX edition, is out
> Hilarious performance for a $999 card: beaten by their own scandalous GTX 970 SLI, and by the year-old, power-hungry R9 295X2.



I'm confused. You're comparing two GTX 970s, which come to more than 50% of a single Titan, and are saying that that is scandalous?


----------



## Caring1 (Mar 22, 2015)

Based on retail price it seems fair to compare them.


----------



## 1d10t (Mar 22, 2015)

Slizzo said:


> I'm confused. You're comparing two GTX970s, which are more than 50% of a single titan, and are saying that that is scandalous?



The 3.5 GB memory fiasco is still going on as we speak, or is it already fixed?
Months before, many bashed AMD for releasing the 8 GB version of the 290X, saying it didn't change anything except in very few games. Now nVidia releases a behemoth carrying 12 gigs of VRAM hoping to change the tide. Looks like someone never learned.

What intrigues me is...



> Roughly 50% more number-crunching muscle than the GTX 980, a 50% wider memory bus, and three times the memory...
> --snip--



So 50% more cores, a 50% wider memory bus, three times the VRAM, and 50% more ROPs only translate to 24% more performance. What a great achievement.
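As a back-of-the-envelope sketch of why those numbers aren't as contradictory as they look: raw shader throughput scales with unit count times clock, and the bigger chip sustains lower clocks under its TDP cap. The clock figures below are illustrative assumptions, not measured values.

```python
# Back-of-the-envelope: why 50% more shaders doesn't mean 50% more FPS.
# Clock figures are illustrative assumptions, not measured values.
def relative_throughput(shaders, clock_mhz):
    # Raw shader throughput scales with unit count times clock speed.
    return shaders * clock_mhz

gtx980 = relative_throughput(2048, 1180)   # assumed typical boost clock
titanx = relative_throughput(3072, 1000)   # assumed lower sustained clock under the TDP cap

ratio = titanx / gtx980
print(f"theoretical throughput ratio: {ratio:.2f}x")   # ~1.27x, before memory/ROP effects
```

With those assumed clocks the theoretical gap is already down to roughly +27%, close to the measured +24%, before memory bandwidth or driver overhead even enter the picture.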


----------



## radrok (Mar 22, 2015)

Slizzo said:


> I'm confused. You're comparing two GTX970s, which are more than 50% of a single titan, and are saying that that is scandalous?



Just ignore them.

Most of the people that compare two cards to one are those who never tried SLI or CFX and can't grasp the difference between the setups.

No offense meant


----------



## Derico (Mar 22, 2015)

I understand how some people tend to be rather disappointed in the Titan X. When it comes to 4K gaming, this card seems to be barely scratching the surface in Crysis 3. 

Using two Titan Xs in SLI is where it gets interesting for me. Playing games at 1440p with maxed-out settings (maybe even some supersampling) on a 144 Hz monitor should go really well with two of these cards. What do you guys think?


----------



## Ikaruga (Mar 22, 2015)

Derico said:


> I understand how some people tend to be rather disappointed in the titan x. When it comes to 4k gaming, this card seems to be barely scratching the surface in crysis 3.
> 
> Using 2 titan x's in sli is where it gets interesting for me.  Playing games on 1440p with maxed out settings (maybe even some supersampling) on a 144hz monitor should go really well with two of these cards. What do you guys think?


I saw some SLI tests and it scaled around 60-80% in games, so I don't know how rich you are, but I think this card is only worth it if you are going for 4K. If money is not a concern for you, then go for it ofc, but I'm sure Nvidia will soon start laser-cutting and selling the "bad" Titan X chips as a 980 Ti or something for a lower price, so I would wait if I were you.
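To put those scaling figures in perspective, here's a minimal sketch; the 40 fps baseline is an assumed single-card figure echoing the ~40 fps 4K numbers mentioned earlier in the thread, not a benchmark result.

```python
def sli_fps(single_fps, scaling):
    """Effective frame rate of a two-card setup, given a scaling factor (0..1)."""
    return single_fps * (1 + scaling)

# Assumed single-card baseline of 40 fps at 4K:
for s in (0.6, 0.8):
    print(f"{s:.0%} scaling: 40 fps -> {sli_fps(40, s):.0f} fps")
```

So at 60-80% scaling, an assumed 40 fps single-card result lands somewhere in the 64-72 fps range, which is why two cards start to look attractive for 4K even though they never double the frame rate.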


----------



## Sony Xperia S (Mar 22, 2015)

Ikaruga said:


> I saw some SLI tests and it scaled around 60-80% in games, so I don't know how rich you are, but I think this card is only worth it if you are going for 4K. If money is not a concern for you, then go for it ofc, but I'm sure Nvidia will soon start *laser-cutting* and selling the "bad" Titan X chips as a 980 Ti or something for a lower price, so I would wait if I were you.



Binning happens because yields aren't optimal; if some dies on the wafer have defects, there will probably be no need for laser-cutting at all.



1d10t said:


> What intrigues me is...
> So 50% more cores, *a 50% wider memory bus, three times the VRAM*, and 50% more ROPs only translate to 24% more performance. What a great achievement.



It doesn't work like that. You cannot expect the underlined things to gain much performance.

Do you remember how people laughed when some super-slow card was equipped with an inappropriate amount of memory?
For instance, a 750 Ti with 4 GB of memory. Super stupid and useless.

Titan X is the same piece of crap.


----------



## radrok (Mar 22, 2015)

1d10t said:


> So 50% more cores, a 50% wider memory bus, three times the VRAM, and 50% more ROPs only translate to 24% more performance. What a great achievement.



This card is constrained by its TDP.

If you put this thing on custom water with an unlocked power BIOS and run it at frequencies close to a GTX 980's, you'd get 50% more performance.

VRAM has never had an impact on performance unless the limit is reached, which isn't the case for today's games at 4 GB (rare exceptions aside).

Check out the OCN Titan X thread in due time and you'll see what this card is capable of when it doesn't have to compromise.

I agree, though, that this time around it shouldn't have cost more than $650; the 12 GB of VRAM doesn't justify the $1k price tag.


----------



## HumanSmoke (Mar 22, 2015)

radrok said:


> This card is constrained by TDP.
> If you put this thing on custom water with an unlocked power BIOS and run it at frequencies close to a GTX 980 you'd get 50% more performance.


PCGH are getting some good numbers (1550MHz core/ 8000MHz effective memory) with just an Arctic Accelero and an unlocked power BIOS


----------



## Ikaruga (Mar 23, 2015)

Sony Xperia S said:


> The binning because of not optimal yield, if some dies on the wafer have defects, probably there will be no need at all for lasercutting.


Don't they laser-cut the disabled parts anyway?


----------



## HumanSmoke (Mar 23, 2015)

Ikaruga said:


> Don't they laser-cut the disabled parts anyway?


AFAIK, the silicon is much more likely to have fuses (laid down in the metal layers of the IC) blown to isolate defective parts of the die.


----------



## Derico (Mar 23, 2015)

Ikaruga said:


> I saw some SLI tests and it scaled around 60-80% in games, so I don't know how rich you are, but I think this card is only worth it if you are going for 4K. If money is not a concern for you, then go for it ofc, but I'm sure Nvidia will soon start laser-cutting and selling the "bad" Titan X chips as a 980 Ti or something for a lower price, so I would wait if I were you.



Agreed. A 980 Ti might be a more suitable option for a lot of people. 

I'm not so sure the Titan X is only good for 4K though. Right now, I don't see a good alternative for people who want to experience 1440p gaming with maxed-out details on 144 Hz G-Sync monitors. Two GTX 980s in SLI will get you about 90-100 fps according to some benchmarks. Not sure how much more we can expect from two 980 Tis in SLI.


----------



## Ikaruga (Mar 23, 2015)

Derico said:


> Agreed. A 980 Ti might be a more suitable option for a lot of people.
> 
> I'm not so sure the Titan X is only good for 4K though. Right now, I don't see a good alternative for people who want to experience 1440p gaming with maxed-out details on 144 Hz G-Sync monitors. Two GTX 980s in SLI will get you about 90-100 fps according to some benchmarks. Not sure how much more we can expect from two 980 Tis in SLI.


I did not say it's only good for 4K; I said it's only worth the price if you're going for 4K. That was an objective observation, but if you ask me, I personally have a much better opinion of this card than the majority here. I think it's a real beast and Nvidia has given gamers the best in the world again. Just look how awesome it performs if you push it a bit harder: http://www.overclockersclub.com/reviews/nvidia_geforce_gtx_titan_x/


----------



## Derico (Mar 23, 2015)

Ikaruga said:


> I did not say it's only good for 4K; I said it's only worth the price if you're going for 4K. That was an objective observation, but if you ask me, I personally have a much better opinion of this card than the majority here. I think it's a real beast and Nvidia has given gamers the best in the world again. Just look how awesome it performs if you push it a bit harder:  http://www.overclockersclub.com/reviews/nvidia_geforce_gtx_titan_x/


Great review with tempting results... now imagine running two of these in SLI...


----------



## 1d10t (Mar 23, 2015)

Sony Xperia S said:


> It doesn't work like that. You cannot expect the underlined things to gain much performance.
> Do you remember how people laughed when some super-slow card was equipped with an inappropriate amount of memory?
> For instance, a 750 Ti with 4 GB of memory. Super stupid and useless.
> Titan X is the same piece of crap.



So Maxwell is already at MAX then? 



radrok said:


> This card is constrained by TDP.
> If you put this thing on custom water with an unlocked power BIOS and run it at frequencies close to a GTX 980 you'd get 50% more performance.
> Vram has never had an impact on performance unless the limit is reached, which isn't the case for today games at 4GB (rare exceptions aside).
> Check out in due time OCN Titan X thread and you'll see what this card is capable when it doesn't have to compromise.
> I agree though that this time around it shouldn't have costed more than 650$, that 12GB Vram doesn't justify the 1k$ price tag.



Good point. Would you take the risk of flashing a $1,000 card and voiding its warranty? And as far as I remember, nVidia prohibited overclocking, or was there a policy change in their terms of use?
VRAM doesn't give an advantage, yet. So why does nVidia launch a 12 GB card? Because they can... yup... hoping that every average Joe jumps in to buy a 12 GB card for their 4K monitor(s).
Yes, an overclock will bump performance, but that's extra, not the true out-of-the-box experience.


----------



## Ikaruga (Mar 23, 2015)

1d10t said:


> So why does nVidia launch a 12 GB card? Because they can... yup... hoping that every average Joe jumps in to buy a 12 GB card for their 4K monitor(s).


This is almost enough to achieve troll level 2/10. *This card is made for enthusiasts and high-end users* who want the best and want it now, and *definitely not for average Joes*.
It's the fastest GPU on the planet, it's quiet, and it doesn't need a nuclear reactor under the bed. People who work with software like deep learning or other single-precision CUDA applications, or people who do video editing or CG and need to keep large images/textures in VRAM, etc... there are a shit ton of people out there who need as much VRAM as possible.
This card has only two cons: it lacks DP (double-precision) performance and it's expensive. The rest is all as good as it can be in March 2015 AD.



1d10t said:


> Yes, an overclock will bump performance, but that's extra, not the true out-of-the-box experience


Just like your FX-8350 running at 5 GHz has nothing to do with the out-of-the-box experience, right?


----------



## 64K (Mar 23, 2015)

Ikaruga said:


> This is almost enough to achieve troll level 2/10. *This card is made for enthusiasts and high-end users* who want the best and want it now, and *definitely not for average Joes*.
> It's the fastest GPU on the planet, it's quiet, and it doesn't need a nuclear reactor under the bed. People who work with software like deep learning or other single-precision CUDA applications, or people who do video editing or CG and need to keep large images/textures in VRAM, etc... there are a shit ton of people out there who need as much VRAM as possible.
> This card has only two cons: it lacks DP (double-precision) performance and it's expensive. The rest is all as good as it can be in March 2015 AD.
> 
> ...



I'm glad it does have some uses for professionals. That shores up my hopes that Nvidia will release a GM200 with 6 GB VRAM for gamers only, hopefully at ~$700 on release like the Kepler GTX 780 Ti. That is the card I want, with a non-reference cooler so that it won't throttle from getting too hot. 12 GB of VRAM seems like a lot now, but rumor has it that a high-end Pascal in the next year or two will have 32 GB of VRAM.


----------



## 1d10t (Mar 23, 2015)

Ikaruga said:


> This is almost enough to achieve troll level 2/10. *This card is made for enthusiasts and high-end users* who want the best and want it now, and *definitely not for average Joes*.
> It's the fastest GPU on the planet, it's quiet, and it doesn't need a nuclear reactor under the bed. People who work with software like deep learning or other single-precision CUDA applications, or people who do video editing or CG and need to keep large images/textures in VRAM, etc... there are a shit ton of people out there who need as much VRAM as possible.
> This card has only two cons: it lacks DP (double-precision) performance and it's expensive. The rest is all as good as it can be in March 2015 AD.
> Just like your FX-8350 running at 5 GHz has nothing to do with the out-of-the-box experience, right?



So an average Joe with $1,000 isn't allowed to buy this card?
Yep, this is the fastest single GPU, and the fastest GPU configuration if you SLI them. Just wow.
And for purposes other than gaming, I think professionals go for a Quadro for real-time rendering, then back to CPU and RAM for finalizing.
As for my CPU, did I mention anything outside the GPU?


----------



## Ikaruga (Mar 23, 2015)

1d10t said:


> So an average Joe with $1,000 isn't allowed to buy this card?
> Yep, this is the fastest single GPU, and the fastest GPU configuration if you SLI them. Just wow.
> And for purposes other than gaming, I think professionals go for a Quadro for real-time rendering, then back to CPU and RAM for finalizing.
> As for my CPU, did I mention anything outside the GPU?


I was talking about enthusiasts and high-end users: people who might also work as professionals, but need the best card in their home PC for gaming, developing, experimenting with stuffz, finishing work at home now and then... etc. Individuals who want the best are the target audience, and this card delivers.


----------



## radrok (Mar 23, 2015)

1d10t said:


> so maxwell is already MAX then?
> 
> 
> 
> ...



Did that multiple times on my GPUs; flashing is completely safe if you know what to do.

The REAL problem with this graphics card is that Nvidia markets this thing at enthusiasts and it doesn't provide the tools an enthusiast card should come with.

This should have been a reference version equipped like an EVGA Classified GPU, no more, no effin' less.

We get a poop-phase VRM and a crippled voltage regulator. That's the problem, not the 12 GB or the price, because last time I checked Nvidia has GPUs in all price segments, so there's no reason to complain that they are selling a $999 GPU.

NO one is complaining that Intel sells a $999 CPU. Get over it, guys: Nvidia has the upper hand and it's in the position to create a halo product.

That's what it is.


----------



## Ikaruga (Mar 24, 2015)

radrok said:


> We get a poop-phase VRM and a crippled voltage regulator.


They used a "6+2" phase design. Would you kindly tell me what "numbers" should be there instead? Do you want Nvidia to put stuff there for you on a consumer product to achieve things like this?:


----------



## radrok (Mar 24, 2015)

Ikaruga said:


> They used a "6+2" phase design. Would you kindly tell me what "numbers" should be there instead? Do you want Nvidia to put stuff there for you on a consumer product to achieve things like this?:



Something that isn't comparable to a $500 GPU, for starters. Their profit is at a maximum with the Titan X PCB.


----------



## Vlada011 (Mar 30, 2015)

This review is excellent; TPU reviews usually help me decide between NVIDIA graphics card versions.
For example, the 780 Ti Classified review confirmed for me that the upgrade from a 780 to a 780 Ti was worth it.
But this confirms one more thing: if you pay €1,200, pay a little more for a 130 MHz factory OC, because this sample, and the similar ones 50-60% of buyers will get, will crash with the EVGA Superclocked BIOS, for example. That's not nice when you invest so much in a card. Most people won't dream of 1500 MHz and similar numbers on a reference PCB with 8+6 pin power, but a warranty on 130-150 MHz over the reference clock for €50-60 is welcome. That way the customer avoids the 20-30% worse chips from the start.
And if the same manufacturer offers a nice backplate to cover the naked memory chips on a €1,200 card, that's even better.


----------



## Vlada011 (Mar 30, 2015)

If someone has a TITAN X, could they explain how high the temps go in the latest games with the fan set manually to 80%?
For me that's not so loud, except on AMD cards.


----------



## Ikaruga (Mar 30, 2015)

Vlada011 said:


> If someone has a TITAN X, could they explain how high the temps go in the latest games with the fan set manually to 80%?
> For me that's not so loud, except on AMD cards.


Why would you set the fans to manual?


----------



## Vlada011 (Mar 31, 2015)

Because the fan will run at some speed anyway, and it's better for it to spin constantly at one speed than to change 20% up and down every two minutes.
First I check how fast the fan spins in a given game, and if it needs 60-70% I immediately set 70% and that's it.

People say the TITAN X runs at 85 °C, and if they measure on Auto with the NVIDIA profile, that's the expected temp for a premium graphics chip.
But if temps are 75 °C at 70-75% fan speed, that's not bad.
In theory a TITAN X with 6 GB, i.e. a GM200 6GB, could even run at a 1200 MHz base clock.
At least some better models could. The TITAN Black had 900 MHz, a little less, and some GK110 models launched with 120-170 MHz more. That can happen again.


----------



## xorbe (Mar 31, 2015)

I used MSI Afterburner to ramp the fan from min to max between 50 °C and 100 °C -- that lets the GPU temp top out around 76 °C. A bit more fan than the stock default.
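That kind of ramp is just linear interpolation between two temperature endpoints. A hypothetical sketch of the idea (the 30% floor is an assumed minimum fan speed, not Afterburner's actual default):

```python
def fan_speed(temp_c, t_min=50.0, t_max=100.0, fan_min=30.0, fan_max=100.0):
    """Linear fan ramp: fan % rises from fan_min at t_min to fan_max at t_max."""
    if temp_c <= t_min:
        return fan_min
    if temp_c >= t_max:
        return fan_max
    # Interpolate linearly between the two endpoints.
    frac = (temp_c - t_min) / (t_max - t_min)
    return fan_min + frac * (fan_max - fan_min)
```

With these assumed endpoints, a GPU topping out at 76 °C would sit around 66% fan speed, which lines up with the "a bit more fan than stock" observation.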


----------



## ZeroFM (May 5, 2015)

What about memory usage at 1080p with no AA?


----------



## tpapas (May 5, 2015)

It would also be nice to see the 3x1080p game comparison for Eyefinity users. You were one of the few (if not the only) sites that included it, and now you've stopped?


----------



## xorbe (May 5, 2015)

Ikaruga said:


> Why would you set the fans to manual?



Some people game with closed-back headphones and just pump up the fan for the lowest temps. *shrug*


----------



## Ikaruga (May 7, 2015)

xorbe said:


> Some people game with closed-back earphones, and just pump up the fan for lowest temps. *shrug*


Wouldn't it still be easier to set a custom profile with high fan speeds, then? I still don't get why anyone would tinker with the fans manually every time they start or stop a game.


----------



## xorbe (May 7, 2015)

Ikaruga said:


> Wouldn't it still be easier to set a custom profile with high fan speeds, then? I still don't get why anyone would tinker with the fans manually every time they start or stop a game.



Because they think 13 MHz more on the GPU is earth-shattering, hah. Oh, they probably do just use an aggressive profile. I guess I missed some detail of the discussion here. Max manual is for benchmarking.


----------



## Arabianknight (May 23, 2015)

Hello,

Can you please update this test using three 1440p or 1600p screens at Ultra settings?

That is: 7680x1440 or 7680x1600.

People have been ignoring triple-screen tests since the release of 4K monitors.


----------



## Zero3606 (Jun 16, 2015)

Vlada011 said:


> Because the fan will run at some speed anyway, and it's better for it to spin constantly at one speed than to change 20% up and down every two minutes.
> First I check how fast the fan spins in a given game, and if it needs 60-70% I immediately set 70% and that's it.
> 
> People say the TITAN X runs at 85 °C, and if they measure on Auto with the NVIDIA profile, that's the expected temp for a premium graphics chip.
> ...



I have a Titan X and my temps do not get above 67 degrees in any game.


----------

