
NVIDIA GeForce Titan X 12 GB

@Xzibit @15th Warlock This is looking a bit like a paper launch, isn't it?

Unfortunately, yes, it looks like it is a paper launch so far :(

Nvidia must have had very limited quantities available for sale at their own store, to justify the "hard launch" title for people reviewing the card :shadedshu:

EDIT: From the Anandtech review:

Finally, for launch availability this will be a hard launch with a slight twist. Rather than starting with retail and etail partners such as Newegg, NVIDIA is going to kick things off by selling cards directly, while partners will start to sell cards in a few weeks. For a card like GTX Titan X, NVIDIA selling cards directly is not a huge stretch; with all cards being identical reference cards, partners largely serve as distributors and technical support for buyers.

So no EVGA, Gigabyte, Asus or other cards available for the time being, only "Nvidia" branded reference cards, and in very limited quantities it seems. What disappoints me the most is the "while partners will start to sell cards in a few weeks" statement above :(
 
Which begs the question: is this a cut-down chip already, since it's missing the DP? Or is the M6000 that bad at DP as well?
The K6000 had DP of 1.4 TFLOPS.
The original Titan was a cut-down version of the K6000/Titan Black. Curious if we aren't seeing a repeat of a gimped Titan, with a full version later. Have to wait and see if the M6000 is different once its specs are revealed.
I think it has been stated ad nauseam that Maxwell isn't designed for FP64 workloads. It is the reason why Nvidia developed GK210 alongside Maxwell, and why Nvidia are on record as saying that the next Tesla parts won't arrive until Pascal.

BTW: The K6000's theoretical FP64 throughput is 1.73 TFLOPS, not 1.4 (901.5 MHz core * 2880 cores * 2 ops/clock = 5192.64 GFLOPS FP32, / 3 = 1730.88 GFLOPS FP64).
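For anyone who wants to sanity-check that arithmetic, here's the same calculation in a few lines of Python (the 1:3 FP64 rate and the K6000 clock/core figures are taken straight from the numbers above):

```python
# Theoretical throughput of the Quadro K6000 (Kepler GK110, 1:3 FP64 rate).
core_clock_mhz = 901.5   # K6000 core clock
cuda_cores     = 2880
ops_per_clock  = 2       # one fused multiply-add counts as 2 floating-point ops

fp32_gflops = core_clock_mhz / 1000 * cuda_cores * ops_per_clock
fp64_gflops = fp32_gflops / 3  # GK110 runs FP64 at 1/3 the FP32 rate

print(f"FP32: {fp32_gflops:.2f} GFLOPS")  # 5192.64
print(f"FP64: {fp64_gflops:.2f} GFLOPS")  # ~1730.88
```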
 
@15th Warlock Yup, nvidia have always been adept at milking their customers and partners for money, really squeezing every last drop. Not something to appreciate really, since trade works best when both sides feel they've gotten a fair deal.

Think about my post above where I said I'm staying with my current card. It's mainly because I paid such an extortionately high price for it, £500 in January 2014, that I wanna make sure I get my money's worth out of it. This means holding off on my next purchase until performance improves even further, which means skipping this generation altogether. However, if they'd priced it at a much more reasonable £300-£350 I would have definitely bought its replacement (the cheaper GTX version, not the Titan) and Nvidia would have actually made more money out of me. Their greed has actually lost them money when it comes to purchases from me, and both sides lose out. I guess they must feel that overall this strategy makes them more money, so good for them.
 
Ya, so $1000 total, or $400 more for 10 FPS? WOW

My two 980s are doing great for the same price.

Can I see an SLI 980 comparison in those charts? Why the 970... the 970 is a joke with its 3.5GB LAWL
 
Disclaimer: please ignore this message if you're tired of people complaining about the price.

Ok fine, they cut down the DP - we don't mind. But why did they forget to cut down the price as well? They are offering a less capable product this time around, right?

The only "excuse" that Nvidia gave for the price of the original Titan series was that it was a so-called "prosumer" card with heavy DP, mainly aimed at professionals. Then all the "professionals" rushed in and got the new shiny toy like spoiled little brats... Now everybody has to bend over backwards and accept the fact that the high end starts from $1000? Or do we all of a sudden have a new "special" extra high-end?

Let's have a quick car analogy: Imagine you're at the car dealership, looking for a new car and every year the cars have to get faster, otherwise what's the point of buying a new one, right? You've got your tiny budget cars, you've got some sedans, you have last year's dusty old models and at the very front you've got the shiny new top of the line model.

Client: "I need power!"
Dealer: "We can hook you up! Have a look at our new "special" top model. Remember how much you liked our special model last year? It's faster than last year's, of course, and it's in the same "special" price range. However, it doesn't have seats this time around because you probably don't need them."
C: "Umm..."

All I'm saying is, it's a fine product but it's not worth it for me.
 
Pfft.... kinda disappointed about the performance compared to a 980.

And why 12GB? I doubt it can even saturate 8GB.
Going SLI with these cards would be the only way to use most of that memory.

In any case, I'm gonna wait till next gen. I'm happy with my 980s.

I always upgrade every 2 gens (GTX 590 to GTX 770 Lightning SLI to GTX 980 SLI)
 
Ok fine, they cut down the DP - we don't mind. But why did they forget to cut down the price as well? They are offering a less capable product this time around, right?

The only "excuse" that Nvidia gave for the price of the original Titan series was that it was a so-called "prosumer" card with heavy DP, mainly aimed at professionals. Then all the "professionals" rushed in and got the new shiny toy like spoiled little brats... Now everybody has to bend over backwards and accept the fact that the high end starts from $1000? Or do we all of a sudden have a new "special" extra high-end?

Tom Petersen said the FP64 units would have needed more die area. You can hear Nvidia's explanation in this video

@ 3:50+ mark in this video

Ryan from PC Perspective asks him about the difference

Nvidia's Tom Petersen said:
The way to think about it is that putting the double-precision floating point units on the die costs area, and that area could otherwise have been spent on other things

With the upcoming Quadro M6000 release featuring a GM200 as well, they probably didn't want to hurt sales. So instead of getting the same chip as before, they're getting a neutered variant.
 
...Also, it's not fully DX12 compliant in hardware since DX12 hasn't been fully defined yet.

No, I'll keep that £500 investment a while longer and see what the next generation offers, especially with DX12. ...

http://www.computerbase.de/2015-03/nvidia-geforce-gtx-titan-x-im-test/

"Nvidia has meanwhile announced that the GM200 supports DirectX 12 at feature level 12.1 in hardware, and will therefore support all of the API's new functions at launch." (translated from the German)

In short: NV announced that GM200 will fully support DX12, not only the performance optimizations (12.0) but also all the eye candy (12.1).
 
If it doesn't have even a semi-professional use, then why is it called Titan in the first place? What makes it different from the other products in the 900 series, except the raw performance gains and that stupid memory buffer of 12!!! GB?
 
If it doesn't have even a semi-professional use
I'd say that a few people would pair the card with a Quadro for drivers/viewport for 3D rendering, virtually all of which uses single precision. 6GB seems to be entry level for 4K (and up) rendering.
then why is it called Titan in the first place?
1. Catchy name
2. Can reuse the tooling from the Titan/Titan Black reference cooler shrouds
3. So that it gives people something else to foam at the mouth about.
What makes it different from the other products in the 900 series except the raw performance gains and that stupid memory buffer of 12!!! GB?
Larger bus width? More cores? Higher price tag? Leaves a bigger trail of breadcrumbs for trolls to follow? It's a GTX 980 scaled up by 50%, what were you expecting, HAL 9000?
 
http://www.computerbase.de/2015-03/nvidia-geforce-gtx-titan-x-im-test/

"Nvidia has meanwhile announced that the GM200 supports DirectX 12 at feature level 12.1 in hardware, and will therefore support all of the API's new functions at launch." (translated from the German)

In short: NV announced that GM200 will fully support DX12, not only the performance optimizations (12.0) but also all the eye candy (12.1).
I'd take that claim with a large pinch of salt to the point where I wouldn't believe it. I'd wait for the spec to be finalized and cards that say DX12 in their spec sheet. You'll see, today's GPUs will suddenly be out of date and we'll all have to spend lots of money on shiny new ones to get the full DX12 featureset.
 
I'd take that claim with a large pinch of salt to the point where I wouldn't believe it. I'd wait for the spec to be finalized and cards that say DX12 in their spec sheet. You'll see, today's GPUs will suddenly be out of date and we'll all have to spend lots of money on shiny new ones to get the full DX12 featureset.

I DO NOT understand this thing. Why do they always release products which lag behind the software and are not able to run the software on them properly?
If the card is still not able to run Crysis 3 at 4K, then why the hell would I need it?

If the card doesn't support the full DX12_Tier 3, then why would I need anything from nvidia again? They have always provided products with inferior DX support! :rolleyes:
 
If the card doesn't support the full DX12_Tier 3, then why would I need anything

DX12 is irrelevant. No games are being created with DX12, and won't be for at least a year, if not two. By that point there will be new GPUs which will support DX13, and then you'll complain that they only support DX13.1 instead of DX13.3, despite the fact that no games will utilize DX13.3 by that point either.

As long as said card supports DX12 in the first place, that's just fine. The lack of additional .2 or .3 isn't much to cry over.

If the card is still not able to run Crysis 3 at 4K, then why the hell would I need it?
Well, no card can run Crysis 3 at 4K, so I guess there's no point in you buying any GPU ever until the 590X/Pascal range.

why is it called Titan
Because the Cray Titan used the original Titan's Kepler architecture in its K20X cards, and because it related to awesome supercomputing; unsurprisingly, naming your card after a supercomputer makes units sell.

What does it make different from the other products from 900 series
Well, it can at least access more than 3.5GB (I assume)? *Hyuck Hyuck etc etc*

it has no even a semi-professional use
It kinda does though. The lack of double precision rules out relatively few workloads. Unless you're a multi-billion-dollar oil company (who's going to buy a compute card anyway), you don't need double precision. Single precision works just fine for all petty human-style workloads.

EDIT: That said, I find the Titan X totally irrelevant in today's market. Super lacklustre and uninteresting this time around. Expected more.
 
You can play Crysis 3 at 4K with the card. Just lower the settings to medium/high and it will fly. But is it worth paying $1000 for this? :D No extras at all. :(

Well, it can at least access more than 3.5GB (I assume)?

No? Because the 980 does the same as well?

EDIT: That said, I find the Titan X totally irrelevant in today's market. Super lacklustre and uninteresting this time around. Expected more.

The R9 390X is the card for you, I guess. With its HBM 8 GB. :)
 
The R9 390X is the card for you, I guess

No confirmed benchmarks. If the 390X really beat the Titan, AMD would have claimed it already to make Jen cry at his own launch. If it does beat it, it's probably going to do so in an extremely hot and flustered fashion. I'm also not keen on GPUs requiring an AIO for stock clocks.

My priority has always been silence and power consumption. AMD have never fit that profile.
 
Tom from Nvidia was on a podcast with Ryan from PCPer; he said the reason was that a backplate would cause heat issues, since there would be less space if you have multiple Titan cards next to each other in a system. The backplate would limit the airflow and cause throttling issues.

I see, maybe because this is the full Maxwell at 100% performance.
 
I see, maybe because this is the full Maxwell at 100% performance.
Full Maxwell isn't released yet; it's a Quadro card, I'm pretty sure,

as the Titan X GPU doesn't have more than around 90 double-precision shaders
 
Where's the World of Warcraft bench? :(

(Before some fool who played the game on their 6800 GT a decade ago rants about how it would be pointless: on max settings it can bring a GTX 980 to its knees. I was wondering if the TX can max the game at 4K.)
 
Where's the World of Warcraft bench? :(

(Before some fool who played the game on their 6800 GT a decade ago rants about how it would be pointless: on max settings it can bring a GTX 980 to its knees. I was wondering if the TX can max the game at 4K.)
I'm pretty sure a 980 would do 200 fps in World of Warcraft. I might be mistaken, but considering how optimised it is now and how fast the 980 is...
 
People have short memories though. I don't have a problem with paying $1,000 for a card if it holds its value, but what happened last time? The Titan came out, then like 6 weeks later the 780TI came out with the same kind of performance for $300-$400 less.

Make no mistake, Nvidia are already building a 980TI that will perform like the TitanX for $300-$400 less money. With the 390X coming soon, Nvidia aren't going to rely on a $1,000 TitanX for the rest of 2015.

It seems people have short memories and are too quick to bend over for an ass reaming, I just don't see this type of card as being a good investment.
 
I'm pretty sure a 980 would do 200 fps in World of Warcraft. I might be mistaken, but considering how optimised it is now and how fast the 980 is...
You'd be surprised. I'm running a 4790K at 4.4 and a 980 at 1501/7908 and I get drops to 45fps in the busiest areas. That's with just CMAA too, without even considering the MSAA and SSAA options they added in the last patch.

It's an extremely scalable game though. I was running a 570 with the same CPU before I got my 980 in December, and with a few settings dialled back a bit I got about the same performance for almost no noticeable visual difference.
 
Full Maxwell isn't released yet; it's a Quadro card, I'm pretty sure.
Quadro M6000 is the exact same GM200 that powers the Titan X.
I'm not sure how many times it needs stating, but the GM200 is the full die. PNY (one of the two main Quadro suppliers with Leadtek) has already stated that the M6000 has the same 1:32 double precision rate.
GM200 was pared down for double precision because Nvidia reworked GK110 into the GK210 (adding another ~50mm² to the die in the process) to specifically tackle double precision workloads.
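For comparison, here's a quick sketch of what that 1:32 rate implies for GM200's FP64 throughput. The 1000 MHz base clock and 3072 CUDA cores are the Titan X's commonly published reference specs, not figures from this thread, so treat them as my assumptions:

```python
# Back-of-envelope FP64 estimate for GM200 at the stated 1:32 FP64 rate.
# Assumed reference specs (not from the thread): 1000 MHz base clock, 3072 CUDA cores.
base_clock_ghz = 1.000
cuda_cores     = 3072
ops_per_clock  = 2  # one fused multiply-add counts as 2 floating-point ops

fp32_gflops = base_clock_ghz * cuda_cores * ops_per_clock  # 6144 GFLOPS FP32
fp64_gflops = fp32_gflops / 32                             # 192 GFLOPS at 1:32

print(f"FP32: {fp32_gflops:.0f} GFLOPS, FP64: {fp64_gflops:.0f} GFLOPS")
```

Set that ~192 GFLOPS next to the K6000's ~1.73 TFLOPS mentioned earlier in the thread and the gap between a 1:3 and a 1:32 part is obvious.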
 
No confirmed benchmarks. If the 390X really beat the Titan, AMD would have claimed it already to make Jen cry at his own launch. If it does beat it, it's probably going to do so in an extremely hot and flustered fashion. I'm also not keen on GPUs requiring an AIO for stock clocks.

My priority has always been silence and power consumption. AMD have never fit that profile.

I smell an Nvidia fanboy here.

Water cooling already means that your setup will be quieter.
As for power consumption, the Titan also draws quite a large amount of energy, so I have no idea what energy savings you are dreaming about.

The 390X is and will be as fast as needed and will offer several industry-first features.

We know that humansmoke works either at Nvidia or for Nvidia. No need to prove it with every post.
 