Thursday, May 1st 2014

New GTX TITAN-Z Launch Details Emerge

NVIDIA's GeForce GTX TITAN-Z missed its earlier launch date of 29th April, 2014, a delay several retailers confirmed to the press, forcing some AIC partners to contend with paper launches of cards bearing their brand. It turns out the delay amounts to just over a week: the GeForce GTX TITAN-Z is now expected to be available on the 8th of May, 2014. That is when you'll be able to buy the US $3,000 graphics card off the shelf.

A dual-GPU graphics card based on a pair of 28 nm GK110 GPUs, the GTX TITAN-Z features a total of 5,760 CUDA cores (2,880 per GPU), 480 TMUs (240 per GPU), 96 ROPs (48 per GPU), and 12 GB of GDDR5 memory, spread across two 384-bit wide memory interfaces. Although each of the two GPUs is configured identically to a GTX TITAN Black, the TITAN-Z runs at lower clock speeds. The core is clocked at 705 MHz (889 MHz on the GTX TITAN Black), with GPU Boost frequencies of up to 876 MHz (up to 980 MHz on the GTX TITAN Black), while the memory stays at 7.00 GHz. The card draws power from a pair of 8-pin PCIe power connectors, and its maximum power draw is rated at 375 W. It will be interesting to see how it stacks up against AMD's Radeon R9 295X2, which costs half as much, at $1,500.
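Those specs translate into theoretical compute throughput with simple arithmetic. A minimal sketch, using the standard shaders × clock × 2 FLOPs-per-cycle formula (one fused multiply-add per shader per clock); the core counts and clocks are the figures quoted above:

```python
def fp32_tflops(shaders: int, clock_mhz: float) -> float:
    """Peak single-precision throughput in TFLOPS: shaders x clock x 2 FLOPs/cycle."""
    return shaders * clock_mhz * 2 / 1e6

# GTX TITAN-Z: 5,760 CUDA cores total across both GPUs
titan_z_base  = fp32_tflops(5760, 705)   # at the 705 MHz base clock
titan_z_boost = fp32_tflops(5760, 876)   # at the 876 MHz maximum GPU Boost

print(f"TITAN-Z base:  {titan_z_base:.1f} TFLOPS")   # ~8.1 TFLOPS
print(f"TITAN-Z boost: {titan_z_boost:.1f} TFLOPS")  # ~10.1 TFLOPS
```

For comparison, two TITAN Blacks at their 889 MHz base clock work out to roughly 10.2 TFLOPS by the same formula, which is the gap the comments below keep returning to.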
Source: ComputerBase.de

105 Comments on New GTX TITAN-Z Launch Details Emerge

#1
matar
I am an nVidia fan, but not this time: $3000 for 3 slots, so basically $1000 per slot.
Posted on Reply
#2
johnspack
Here For Good!
Again, why... I know we are a performance-minded community... why not just 2 Tis? I know, no high-end DP performance for CAD work.......
Posted on Reply
#3
MxPhenom 216
ASIC Engineer
I would have much rather seen Nvidia release a GTX790 to take on the 295x2 rather than this card. Unless Nvidia is working on one too. In gaming I expect this card to be killed by the 295x2, but anyone who buys this card for strictly gaming might want to reevaluate their life.
Posted on Reply
#4
GreiverBlade
MxPhenom 216I would have much rather seen Nvidia release a GTX790 to take on the 295x2 rather than this card. Unless Nvidia is working on one too. In gaming I expect this card to be killed by the 295x2, but anyone who buys this card for strictly gaming might want to reevaluate their life.
well if you expect that card to be killed by the 295x2 then the 790 would have been the same ... afaik a Titan Black is equal to a 780 Ti plus DP compute (and the TBlack has a higher clock, but the Z will have a lower one), so buying 2 or 3 780 Tis is more efficient (price and frequencies) even on air cooling (for gaming ofc), except if the 790 were priced under 1500$.

but for the last part i am totally following you
matarI am a nVidia Fan but not this time $3000 for 3 slots so basically $1000 for each slot.
exactly what i thought when they released the info on the price :roll:
Posted on Reply
#5
MxPhenom 216
ASIC Engineer
GreiverBladewell if you expect that card to be killed by the 295x2 then the 790 would have been same ... afaik a Titan Black is equal to a 780 Ti but plus DP compute (and the Tblack has a higher clock but the Z will have a lower tho) rather buying 2 or 3 780 Ti is more efficient (price and frequencies) even on air cooling (for gaming ofc) except if the 790 would be priced under 1500$.

but for the last part i am totally following you



exactly what i thought when they released the info on the price :roll:
Maybe, but the clock speeds are butchered on this Titan Z. I was hoping they would change some things during the delay to get higher clock speeds.
Posted on Reply
#6
GreiverBlade
MxPhenom 216Maybe, but the clock speeds are butchered on this Titan Z. I was hoping they would change some things during the delay to get high clock speeds.
indeed, that's what i said in my post, but a 790 would also be on 2x 8-pin, and if they went for 3-slot air cooling on a hypothetical 790, the clocks of the 780 Ti would also be butchered.

i guess lower clocks are what you get with an air cooling solution and a TDP of 375W, maybe with 3x 8-pin ... but again the price point of the 295x2 is the main force, and if you take an OC'd 295x2 like the Sapphire, even in Gflops the Z is behind... i would love to see a 790 (Ti), but for the moment the 295x2 is my favorite, even if the hybrid cooler is a little hindrance due to the space needed for the radiator, and that's only for users who want two of them; me, with only one, i would be happy :D

500W versus 375W, well ... for power efficiency the Z holds the lead, but at what price ...
Posted on Reply
#7
Suka
MxPhenom 216Maybe, but the clock speeds are butchered on this Titan Z. I was hoping they would change some things during the delay to get high clock speeds.
Apparently they aren't going after the 295's performance crown with the Titan Z. Maybe that would be a 790, but then, if they are to cool two GK110 chips, wouldn't they need water cooling as well, or would they go with a dual- or triple-fan cooler? I'm assuming they would clock the chips at ~1GHz boost.
Posted on Reply
#8
dj-electric
So guys... What happened to the ASUS Launch PR?
Posted on Reply
#9
HumanSmoke
GreiverBlade... i would love to see a 790(Ti)
Not me.
If the whole idea is a PR stunt for the halo, then Nvidia should just go nuts and pay no heed to the PCI-SIG. Supply enough power (3x 8-pin), add in some binned chips to guarantee 1150-1200MHz boost, and be done with it. Maybe take a leaf out of AMD's book and bundle the card with a couple of 740QC's and a 240mm rad just for sh*ts and giggles.
Personally, I find dual cards a waste of time and resources. Benchmark results (and subsequent conclusions) are heavily dependent upon a working driver profile for SLI/CrossFireX in the chosen games, and when it doesn't work, disabling one GPU is a limited fix for an expensive cash outlay. If you're looking at productivity/content creation applications that don't leverage SLI/CFX, then that's all good, but that still represents a minuscule number of potential users requiring a precise feature set.

I'm guessing overclocking two 6GB 780's will net a better experience, more so when you factor in the $1,860 saving over the Titan Z.
Posted on Reply
#10
GreiverBlade
HumanSmokeNot me.
If the whole idea is a PR stunt for the halo, then Nvidia should just go nuts at pay no heed to the PCI-SIG. Supply enough power (3x8pin), add in some binned chips to guarantee 1150-1200MHz boost and be done with it. Maybe take a leaf out of AMD's book and bundle the card with a couple of 740QC's and a 240mm rad just for sh*ts and giggles.
Personally, I find dual cards a waste of time and resources. Benchmarks results (and subsequent conclusions) are heavily dependant upon a working driver profile for SLI/CrossfireX in the chosen games, and when it doesn't work, disabling one GPU is a limited option fix for an expensive cash outlay. If you're looking at productivity/content creation applications that don't leverage SLI/CFX, then that's all good- but it still represents a miniscule number of potential users that require a precise feature set.

I'm guessing, overclocking two 6GB 780's will net a better experience, moreso when you factor in the $1860 saving over the Titan Z.
i rephrase ... i would love to see a 790 (Ti) under 1500$, otherwise it's a no-go!

i can't help but agree that dual GPU cards are a waste... even if i like to see them, ofc i would be happy with one 295x2 in my SG09B, but a R9 290/290X would be more than enough already.
Posted on Reply
#11
LAN_deRf_HA
Between the clock speed, deflated cooler, and price, this has turned out to be a pretty shitty product. Hopefully it does poorly enough that they don't try this again any time soon. Of course, people had hoped that would be the case for the Titan, but everybody bought it up, so now we're stuck with $1,000 cards being considered acceptable.
Posted on Reply
#12
64K
GreiverBladei rephrase ... i would love to see a 790(Ti) under 1500$ otherwise : it's a no go!

i can't but agree on the dual gpu card are a wast... even if i like to see them, ofc i would be happy with one 295x2 in my SG09B but a R9 290/290X would be more than enough already.
You probably will see a GTX 790 but I doubt it will be under $1,500.
Posted on Reply
#13
PLAfiller
Come on people :) can't you just enjoy it already :P. It's not like it's your investment to release the card. Judging by the number of comments the Titan Black is generating on the Net, it has already paid for itself.
Posted on Reply
#14
Sihastru
So much hate.

I understand the arguments.

CON #1: It creates a precedent. A really expensive card, like we never had before. It creates a new (new-new) price bracket for very (very-very) high end cards. We don't want that.

REPLY to CON #1: For every Koenigsegg, there are millions of affordable cars of all sizes and for all purposes. You're not required to buy the Koenigsegg. You can buy the half price Nissan GTR and still go as fast as the speed limit.

CON #2. It's stupid. It makes no sense. I can get INSERT-NAME-HERE card for half the price, or I can get two of the INSERT-ANOTHER-NAME-HERE for even less.

REPLY to CON #2: So what? If we won't push the limits, how will we ever get ahead? You're not required to buy into the crazy-bonkers ragged edge products.

CON #3: It's not as great for mining as the INSERT-NAME-HERE! AMD FTW!

REPLY to CON #3: The world is not just about mining. NVIDIA created an ecosystem in the professional market and they can now get a nice return on that investment. For the professional market, it's a bargain card. If AMD wants in, they must work at it. It's irrelevant that an AMD card is good at certain compute tasks if the companies that write the software for the professional market do not care. And there are good reasons for them not to care, one of the most important ones is that AMD always offloads almost everything to INSERT-COMPANY-NAME-HERE.

The most recent example to this is Mantle. AMD created a lot of buzz, but in reality it offloaded the actual work to the game developers. The same with 3D display technology. AMD offloaded work to some company and then it buzzed it up with the words "free" and "open source". And there are many other examples...

CON #4: Nobody will buy the card. If you buy the card, you're stupid! NVIDIA is stupid! Titan-Z is stupid!

REPLY to CON #4: I hear this argument a lot. People who can afford expensive things are for some reason all stupid. Well... maybe some of them are, but most of them are smarter than most people. And a lot of people will buy this card. They will.

Crazy and stupid are not the same thing.

CON #5: Why would NVIDIA create such a product? The same was asked when Titan came along.

REPLY to CON#5: Because people will buy it. This kind of purchase can be justified in many ways. If you have the money, 'because I can' is enough. NVIDIA thanks you.

CON #6: Whatever.

REPLY to CON #6: Whatever.
Posted on Reply
#15
Sony Xperia S
Oh, come on, don't get silly. Nvidia will NEVER thank you; they are the big evil laughing in the corner at anyone who is not smart enough to value their money and throws it away instead. Nvidia DOESN'T need your money, it has enough!

The car comparison is nonsense too. I would call the R9 295X2 a Bugatti Veyron, and this one would be a cheaper Nissan in real value.

That said, I will buy the card only if it is fairly priced at around 1000 bucks. If you are greedy and want more, no deal. :laugh:
Posted on Reply
#16
64K
SihastruSo much hate.


REPLY to CON #3: The world is not just about mining. NVIDIA created an ecosystem in the professional market and they can now get a nice return on that investment. For the professional market, it's a bargain card. If AMD wants in, they must work at it. It's irrelevant that an AMD card is good at certain compute tasks if the companies that write the software for the professional market do not care. And there are good reasons for them not to care, one of the most important ones is that AMD always offloads almost everything to INSERT-COMPANY-NAME-HERE.
Nvidia is actively causing a lot of this uproar on the net concerning the Titan Z. This is a GeForce card (a gaming card) and look at what they are saying on their GeForce website.....

www.geforce.com/whats-new/articles/announcing-the-geforce-gtx-titan-z

If this card makes more sense than 2x Titan Black to professionals, then spend the extra money; but as long as Nvidia aims this card at gamers for $3,000, you will continue to see comments like the above posts all over the net from gamers.
Posted on Reply
#17
HM_Actua1
The dumbest, most nonsensical ploy by Nvidia I've ever seen.......

This card and its price tag make ZERO SENSE!

What people should really understand is that the "12GB" of VRAM is really 6GB

they're doing what they did with the 690, splitting the advertised memory between the two GPUs

So in the Z's case it's 6GB to each GPU, which means you really only get 6GB of usable VRAM.

I'm a staunch Nvidia/Intel user, but it saddens me to see such stupidity.
Posted on Reply
#18
MxPhenom 216
ASIC Engineer
Hitman_ActualThe dumbest, most nonsensical ploy by Nvidia I've ever seen.......

This card and its price tag make ZERO SENSE!

What people should really understand is that the "12GB" of VRAM is really 6GB

they're doing what they did with the 690, splitting the advertised memory between the two GPUs

So in the Z's case it's 6GB to each GPU, which means you really only get 6GB of usable VRAM.

I'm a staunch Nvidia/Intel user, but it saddens me to see such stupidity.
I think everyone here, and everyone even remotely interested in the card, knows that. It's SLI on the same PCB. Each GPU gets its own 6GB pool of memory and 384-bit bus.

They are doing what they HAVE done with every dual-GPU card they have made, and AMD does the same thing.

Found a screenshot of the PCB. One crowded PCB; it scares me like the GTX 590 PCB.

Posted on Reply
#19
HM_Actua1
MxPhenom 216I think everyone here, and everyone even remotely interested in the card knows that. Its SLI on same PCB. Each GPU gets their own 6GB pool of memory, and 384 bit bus.

They are doing what they HAVE done with every dual GPU they have made, AMD does the same thing.

Found a screen shot of the PCB. One crowded PCB, scares me like the GTX590 PCB.

I wouldn't go so far as to say everyone knows that, but who knows.

Yeah, dual-GPU cards have WAY too much heat on a single PCB.
Posted on Reply
#20
Scrizz
maybe they should've gone the 9800 GX2 route?
LOL

Posted on Reply
#21
xorbe
2x Titan Black has a 26.1% faster base clock, but the Titan Z is 50% more expensive. Does not compute for a gamer.
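Those figures check out against the clocks in the article, with quick arithmetic (the ~$1,000-per-TITAN-Black street price is an assumption):

```python
# Base clocks from the article: TITAN Black 889 MHz, TITAN-Z 705 MHz.
clock_advantage = 889 / 705 - 1
print(f"2x TITAN Black base clock advantage: {clock_advantage:.1%}")  # 26.1%

# Prices: $3,000 TITAN-Z vs two TITAN Blacks at an assumed ~$1,000 each.
price_premium = 3000 / (2 * 1000) - 1
print(f"TITAN-Z price premium: {price_premium:.0%}")  # 50%
```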
Posted on Reply
#22
Fluffmeister
If people choose to buy two Titan Blacks over a single Titan Z, then I really don't think Nvidia cares.
Posted on Reply
#23
radrok
MxPhenom 216I think everyone here, and everyone even remotely interested in the card knows that. Its SLI on same PCB. Each GPU gets their own 6GB pool of memory, and 384 bit bus.

They are doing what they HAVE done with every dual GPU they have made, AMD does the same thing.

Found a screen shot of the PCB. One crowded PCB, scares me like the GTX590 PCB.

Oh wow, that power delivery is almost worse than the original Titan's, ha ha ha, sorry, can't stop laughing.

For that price one would have hoped they'd equip it with a decent PCB. I'm sorry, but ATI has always been superior; look at the "overkillness" they slapped onto the 295x2.
Posted on Reply
#24
MxPhenom 216
ASIC Engineer
radrokOh wow that power delivery is almost even worse than the original Titan, ha ha ha sorry can't stop laughing.

For that price one would have hoped they'd equip it with a decent PCB, I'm sorry but ATI has always been superior, look at the "overkillness" they slapped onto the 295x2.
It's not entirely about the number of phases, but also the capacity each phase is rated for.

I expect that with one phase per GPU dropped, the rest are rated a bit higher to compensate, but who knows.
Posted on Reply
#25
Ferrum Master
radrok, I'm sorry but ATI has always been superior, look at the "overkillness" they slapped onto the 295x2.
Be more mature...

Nevertheless, the design looks underpowered. Both the Titan "too many Zeroes" and the 295X2 "Celsius" are failures design-wise... they are utterly useless given the price, R&D cost, and everything else... it is just a box ticked, like we've had before...
Posted on Reply