# NVIDIA Announces the GeForce GTX TITAN-Z



## btarunr (Mar 25, 2014)

Here it is, folks: the fabled monster dual-GPU graphics card from NVIDIA, based on its GK110 silicon, the GeForce GTX TITAN-Z (sounds like "Titans"). The first reference-design graphics card to span three expansion slots, the GTX TITAN-Z features a cooler design that's a scaled-up version of the GTX 690's, with a pair of meaty heat-pipe-fed heatsinks ventilated by a centrally located fan. The card features a pair of GK110 chips, each with all 2,880 CUDA cores enabled. That works out to a total core count of 5,760!

That's not all: the two chips have 480 TMUs and 96 ROPs between them, and each is wired to 6 GB of memory, totaling a stunning 12 GB on the card. At this point it's not clear whether the GPUs feature full double-precision (DPFP) rates, but their single-precision compute totals 8 TFLOP/s. Display outputs include two dual-link DVI, a DisplayPort, and an HDMI. According to its makers, the GTX TITAN-Z is the first graphics card that's truly ready for 5K-resolution (5120 x 2700 pixels) gaming on a single display head. At US $2,999, the card costs thrice as much as a GTX TITAN Black, for twice its performance.
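For what it's worth, the quoted 8 TFLOP/s figure lines up with the core count at a fairly conservative clock. A quick sanity-check sketch (the ~0.7 GHz clock here is our assumption; NVIDIA had not published clocks at announcement):

```python
# Sanity check of the quoted 8 TFLOP/s single-precision number.
# SP throughput = CUDA cores x clock (GHz) x 2 (one FMA = 2 FLOPs), in TFLOP/s.
def sp_tflops(cuda_cores, clock_ghz):
    return cuda_cores * clock_ghz * 2 / 1000.0

total_cores = 2 * 2880  # two fully enabled GK110 GPUs

# A clock around 0.7 GHz (assumed, not a published spec) matches the claim:
print(round(sp_tflops(total_cores, 0.7), 2))   # 8.06 TFLOP/s
# The single-GPU GTX TITAN Black at its ~0.889 GHz boost, for comparison:
print(round(sp_tflops(2880, 0.889), 2))        # 5.12 TFLOP/s
```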





*View at TechPowerUp Main Site*


----------



## damric (Mar 25, 2014)




----------



## iO (Mar 25, 2014)

Just a $1000 price premium over 2 Titan BEs?
Sounds like a good deal...


----------



## dj-electric (Mar 25, 2014)

I'm aroused.


----------



## hckngrtfakt (Mar 25, 2014)

$3k ??? 

oh well, i only needed one kidney anyways ...


----------



## Patriot (Mar 25, 2014)

iO said:


> Just a $1000 price premium over 2 Titan BEs?
> Sounds like a good deal...



Lol... Yeah that was my thought.   I think I will take 3x 780ti's instead...


----------



## HumanSmoke (Mar 25, 2014)

iO said:


> Just a $1000 price premium over 2 Titan BEs?
> Sounds like a good deal...


Standard practice for a halo part that Nvidia only needs for PR (a dual-Hawaii counter) rather than for any meaningful customer-base numbers*. Remember the HD 7990, priced at 2-3 times the price of a single HD 7970 for the same reason, along with the fact that precious few were initially available.

* I'm going to guess that the number of people that drop $3K on a graphics card but only have a motherboard with a single PCIe x16 slot borders on the infinitesimal.


----------



## Nordic (Mar 25, 2014)

I think this is a cool, although overpriced, card. They moved to a triple-slot cooler, which looks pretty darn great, but still one fan. What does NVIDIA have against two fans?


----------



## MxPhenom 216 (Mar 25, 2014)

HumanSmoke said:


> Standard practice for a halo part that Nvidia only needs for PR ( a dual Hawaii counter) rather than any meaningful customer base numbers*. Remember the HD 7990 priced at 2-3 times the price of a single HD 7970 for the same reason, along with the fact that there were precious few initially available.
> 
> * I'm going to guess that the number of people that drop $3K on a graphics card but only have a motherboard with a single PCIe x16 slot borders on the infinitesimal.


Throwing this card into an ITX build would be absolutely insane!


----------



## Fluffmeister (Mar 25, 2014)

james888 said:


> I think this is a cool, although overpriced, card. They moved to a triple-slot cooler, which looks pretty darn great, but still one fan. *What does NVIDIA have against two fans?*



Good engineering.


----------



## dj-electric (Mar 25, 2014)

If you have the money for this, you probably have the money for a water-cooling block.
Not overclocking a GK110 is a crime. Not overclocking two of them will put you on death row.


----------



## buildzoid (Mar 25, 2014)

Dj-ElectriC said:


> If you have the money for this, you probably have the money for a water-cooling block.
> Not overclocking a GK110 is a crime. Not overclocking two of them - and you'll be in death row.


If the GTX 590 is anything to go by, the VRM on this thing will go boom the moment the card pulls more than 105% of its TDP.


----------



## Disparia (Mar 25, 2014)

Nice, after I buy one I'll have enough money to get an... AM1 board and APU. Hope that's a good pairing.


----------



## DaJMasta (Mar 25, 2014)

It better be bitchin' fast for 3k.




The first things I thought when I read it:

Wat.


Whyyyyyyyyyyyy??????



Then I realized the pace of new architectures and performance bumps has been slowing somewhat in the past couple years... it's probably a result of that.


----------



## Hilux SSRG (Mar 25, 2014)

Sweet obsolete in 5...4...3...2...1...


----------



## progste (Mar 25, 2014)

Dafuk are they smoking?!
oh, money? ok...


----------



## R-T-B (Mar 25, 2014)

Fluffmeister said:


> Good engineering.



Don't some of the best third party nvidia boards feature 2 fans for cooling?


----------



## TheHunter (Mar 25, 2014)

^
Probably for acoustic reasons; a 3-slot radiator should take care of the rest.


But the price is just, :yawn:


----------



## nunomoreira10 (Mar 25, 2014)

8 TFLOPS single precision is 4 TFLOPS each;
since the Titan Black has around 5 TFLOPS, that means a substantially lower clock and potentially good perf/watt.
It may still require three 8-pins to power it.
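The implied clock drop can be backed out directly. A sketch using the standard SP formula (FLOPS = cores x clock x 2 for FMA-capable GPUs); the Titan Black's ~889 MHz boost is the comparison point:

```python
# Back out the per-GPU clock implied by the quoted 8 TFLOP/s total.
def implied_clock_mhz(total_tflops, total_cores):
    # total_tflops * 1e12 FLOP/s / (cores * 2 FLOPs per cycle) -> Hz -> MHz
    return total_tflops * 1e12 / (total_cores * 2) / 1e6

print(round(implied_clock_mhz(8.0, 2 * 2880)))  # ~694 MHz, well below the
                                                # Titan Black's ~889 MHz boost
```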


----------



## Assimilator (Mar 25, 2014)

buildzoid said:


> If the GTX 590 is anything to go by, the VRM on this thing will go boom the moment the card pulls more than 105% of its TDP.



The only nVIDIA dual-GPU card that had a problem with popping its VRMs is also the only one the AMD fanboys trot out every time a new nVIDIA dual-GPU card is announced. How unsurprising.


----------



## BorisDG (Mar 25, 2014)

My next card.


----------



## Ferrum Master (Mar 25, 2014)

This clearly defines AMD as the poor man's choice now, lulz...


----------



## Animalpak (Mar 25, 2014)

Bitchin' fast, so fast that your human eyes cannot see any difference


----------



## zinfinion (Mar 25, 2014)

Going to need a driver update for 5K, seeing as current drivers won't allow anything wider than 3840.

Should be good news for downsampling if they do enable it; 5120x2160 on a 29" 21:9 will be sweet.


----------



## Serpent of Darkness (Mar 25, 2014)

GTX Titan-Z is literally 2x GTX Titan Blacks on a single PCB.  GTX 790 is 2x GTX 780s.  The difference between the two will tell you what premiums you're paying.

GTX 790 = 2304 CC x2 + 850 Core Clocks x2 + 3GB Frame Buffers x2 + Dx11.1 + Premiums is roughly $1,099.99.

GTX Titan-Z = 2880 CC x2 + 800 Core Clocks x2 + 6GBs Frame Buffers x2 + DX11.2 + 1/3(32-bit FPP) for 64bit FPP + Premiums is roughly $2,999.99

Price difference = a factor of ~2.73.
CUDA core difference = +25.0%.
Core clocks = ~6% lower.
Frame buffer size = 2.0x.
DX11.1 versus DX11.2...
Single precision versus double precision...
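Those ratios can be reproduced from the post's own figures (a sketch only; the GTX 790 specs quoted above were rumors and never made official):

```python
# Reproduce the comparison ratios using the post's own (rumored) figures.
titan_z = {"price": 2999.99, "cores": 2880, "clock_mhz": 800, "vram_gb": 6}
gtx_790 = {"price": 1099.99, "cores": 2304, "clock_mhz": 850, "vram_gb": 3}

print(round(titan_z["price"] / gtx_790["price"], 2))              # 2.73x the price
print(titan_z["cores"] / gtx_790["cores"] - 1)                    # 0.25 -> +25% cores
print(round(1 - titan_z["clock_mhz"] / gtx_790["clock_mhz"], 2))  # 0.06 -> ~6% lower clock
print(titan_z["vram_gb"] / gtx_790["vram_gb"])                    # 2.0x frame buffer
```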

Side Notes:

1. Basically it's a GTX Titan Black Card x2.  Nothing more, nothing less.

2. Unless you're using 3x 4K TV Screens for Surround, you'll probably never utilize the full Frame Buffer amount.

3. This doesn't really cannibalize Tesla K40 sales, because what you pay for with the K40 is mainly a single 12 GB frame buffer wired to one GPU, versus 2x 6 GB frame buffers wired to two GPUs, plus all the features you'd find on a workstation graphics card.

4. The card would be more ideal for ZBrush or other animation software that uses a lot of subdivisions, at a cheaper price than the Tesla K40.

5. My own personal point of view: a lot of people are saying that NVIDIA GTX 780 Ti sales are high, that people, on a macro level, will pay the extra $100 to $200 for 1 to 10 more frames in PC games. I disagree that the GTX 780 Ti is selling well. Instead, I believe NVIDIA is pushing more dual-GPU variants to produce more sales because GTX 780 Ti sales are stagnant compared to the competition and the recent price increases caused by the crypto-mining situation. If sales of the GTX 780 Ti were high, prices would be going up; instead they are static. In addition, when a company doesn't push out new tech fast enough, it's because it is still riding its cash cow. There's no point in spending more money when the end-game of the business is to generate more money from consumers while spending less on cost. Therefore, NVIDIA is pushing the GTX 790 and GTX Titan-Z, as investments, to produce more money from its consumer base.

6. Since ROG is only coming out with a dual-GPU AMD variant, it's safe to say the "King of the Year" is the AMD ARES III (2x R9 290X: 2x 2816 SPs, 2x 4 GB frame buffers, full DX11.2 support, 2x 1000 MHz turbo cores). This could change with a MARS IV or V based on the Titan-Z in the near future... The only issue I see with the King of the Year is frame-time variance, and how that's going to be handled on a single graphics card.

7. The GTX 780 Ti and GTX Titan Black have probably set the bar, as far as FPS performance goes, for the GTX Titan-Z. I don't expect 2x scaling, unlike the dual R9 290X competition. Furthermore, Titan-Z FPS performance will probably be less than 2x GTX 780 Ti or 2x GTX Titan Black in SLI.


----------



## Slomo4shO (Mar 25, 2014)

Comparatively, the GTX 690 was exactly 2x the price of a single GTX 680 when released... Talk about progress...


----------



## Sasqui (Mar 25, 2014)

Now I know they unlocked some ROPs or CUDA cores or whatever those things are, so this chart is not quite accurate:







But for $3000, the scaling has to be better than that!


----------



## Casecutter (Mar 25, 2014)

james888 said:


> I think this is a cool, although overpriced card. They move to a triple slot cooler, which looks pretty darn great, but still one fan. What does nvidea have against two fans?


It appears they did a decent job cloaking the fact that it's 3-slot... From that picture it's hard to tell. If it is 3-slot, it provides that much extra thermal-dissipation area, lessening the need to pack the fins so tightly and create backpressure. The fan doesn't look like their normal radial (blower) fan, but more like a traditional blade fan. That's surprising, as blade fans don't deliver the flow and pressure one would think is still needed, and the radial fan Nvidia ships with the Titan has proven to be a very good proprietary design.


----------



## buildzoid (Mar 25, 2014)

Assimilator said:


> The only nVIDIA dual-GPU card that had a problem with popping its VRMs is also the only one the AMD fanboys trot out every time a new nVIDIA dual-GPU card is announced. How unsurprising.


I'm unfortunate enough to have one (GTX 590) and am in the process of acquiring EPower boards to fix it. I was listing the most extreme example, but GTX 680s, GTX 780s, and GTX TITANs (not BE) have all been known to blow their VRMs once you push 1.45+ V through them, and they degrade rapidly at voltages over 1.3 V. Compare that to an R9 290X, which can survive short periods of 1.6 V on water cooling.


----------



## eidairaman1 (Mar 25, 2014)

ill take 2 R9 290 Gamers thanks lol


----------



## beardofnails (Mar 25, 2014)

I am so tired of seeing dual-GPU cards. This trick only works on people who do not understand how computers work. The PCIe 3.0 bandwidth is damn near saturated by a single-GPU 680. I'm no engineer, but I'm pretty sure a tractor trailer can't fit through the eye of a sewing needle...


----------



## beardofnails (Mar 25, 2014)

oh and /thread


----------



## ensabrenoir (Mar 25, 2014)

..........................................................................(waiting for head to regenerate after it exploded)...........................................


Nvidia....taking* WIN* to Obscene levels........


----------



## HumanSmoke (Mar 25, 2014)

beardofnails said:


> I am so tired of seeing dual GPU cards. This trick only works on people that do not understand how computers work.









beardofnails said:


> The pci 3.0 bandwidth is damn near saturated with a single gpu 680.


Whatever


beardofnails said:


> I'm no engineer.


I believe you.


buildzoid said:


> I was listing the most extreme example because GTX 680s, GTX 780s, and GTX TITANs (not BE) have all been known to blow their VRMs once you push 1.45+V through them and degrade rapidly at voltages over 1.3V. Compare that to a R9 290X which *can survive short periods* of 1.6V on water cooling.


Because killing a board in seconds is soooooo much worse than killing a board in minutes.

The other side of that particular coin is that pushing voltage becomes a case of diminishing returns. Would there be much to gain by allowing 1.6 V through a GK110? I can understand the interest for sub-zero (suicide) runs, but for 24/7 operation?


----------



## bogami (Mar 25, 2014)

An abnormal price, and there is no excuse for it: 4x Titan Black Edition = $4,000, so 2x TITAN-Z will cost you $6,000, and no one will be stupid enough to throw away the extra $2,000.
Where do they find the logic behind this price? There is nothing here you wouldn't already have: 2x GK110, 2x 6 GB RAM, a PLX chip, 2x power circuitry. The boost frequency hasn't appeared, and whether it's unlocked for further overclocking we don't know! 5K (5120 x 2700 pixels)!!! No fucking way; there are no such monitors, I haven't seen one offered anywhere, not to mention the outdated outputs! This price is for rich men who don't know what they're buying. Yes, I'd take at least two, but at this price you can only dream about them; rather wait for Maxwell and get in a single processor what two offer here, which will also be supported with the desired outputs (though that depends more on Apple's rights, I've observed). AMD will issue a 2x R9 290X for $1,000, max $1,100, and NVIDIA won't sell 10 pieces!


----------



## Chetkigaming (Mar 25, 2014)

Such WoW! Nvidia Very OP!


----------



## 20mmrain (Mar 25, 2014)

So Nvidia releases this card for the people who want bragging rights. (Or to prove they are idiots, justify anyway you want) 
Later (Not much later hopefully) I bet Nvidia releases "the true gamer dual card" The GTX 790. 
My guess is the Titan-Z will have a really low production rate and only an extreme (Retarded) few will buy it. 

Nvidia (speaking to them as if they can hear me)... with releases like the Titan-Z, AMD is looking better and better every day... and this is coming from someone who is not afraid to drop thousands on tri- and quad-GPU setups. 
Dropping $6k+ on a dual-chip-card quad-SLI setup that will not scale as well as a single-chip quad-SLI setup is just plain price gouging, and dumb.


----------



## Chetkigaming (Mar 25, 2014)







20mmrain said:


> So Nvidia releases this card for the people who want bragging rights. (Or to prove they are idiots, justify anyway you want)
> Later (Not much later hopefully) I bet Nvidia releases "the true gamer dual card" The GTX 790.
> My guess is the Titan-Z will have a really low production rate and only an extreme (Retarded) few will buy it.
> 
> ...





AMD homeless, can you answer one question: how can guys who have a lot of money not afford something like that? Do you really think they are not smarter than you? Anyway, the people who buy this demonic GPU are supporting Nvidia, and new upcoming technologies too, so AMD will look at the results and create future technologies in order not to lose market share. It's normal competition.


----------



## HumanSmoke (Mar 25, 2014)

bogami said:


> Abnormal price because there is no excuse for it either 4x Titan Black Edition = $ 4000.


Kind of depends upon workload.
For gaming, it's obviously a play for the halo effect, a counter to the dual-Hawaii card, and makes no sense; hence the pricing. There's a good chance that Nvidia will have to bin low-vCore GK110s for the Titan-Z, so they will be drawing from the same bin as the Tesla K40/Quadro K6000 (runtime precision excepted).
For CG rendering and the like, it might depend more on how much power you can jam into a form factor. People are obviously shoehorning as many Titans into a box as possible, so why fill sixteen slots with 8 GPUs when you can fill fifteen with 10? If rendering time is money, the added initial expense might not enter into the equation.
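The slot arithmetic behind that point, as a trivial sketch (dual-slot single-GPU Titans versus the triple-slot dual-GPU TITAN-Z):

```python
# GPU density per chassis: dual-slot single-GPU cards vs triple-slot dual-GPU cards.
dual_slot_cards = 16 // 2           # 8 dual-slot GTX Titan cards in 16 slots
gpus_single = dual_slot_cards * 1   # -> 8 GPUs

triple_slot_cards = 15 // 3         # 5 triple-slot TITAN-Z cards in 15 slots
gpus_dual = triple_slot_cards * 2   # -> 10 GPUs

print(gpus_single, gpus_dual)       # 8 10
```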


----------



## buildzoid (Mar 26, 2014)

HumanSmoke said:


> Because killing a board in seconds is soooooo much worse than killing a board in minutes.
> 
> The other side to that particular coin is that, pushing voltage becomes a case of diminishing returns. Would there be much to be gained by allowing 1.6V through a GK 110 ? I can understand the interest for sub-zero (suicide) runs, but for 24/7 operation ?



I said short periods of time because I only found one guy who did it, and he didn't kill the card, but I suspect that after some hours the core (not the VRM) will burn out. Also, the frequency at 1.6 V was 1300 MHz on water cooling, so for benching it's worth it. And 1.45 V is a pretty safe (for the core) voltage for benching and shouldn't BBQ a well-designed VRM.


----------



## erocker (Mar 26, 2014)

So... two years ago, Nvidia's high-end dual GPU set you back $1,000 with the GTX 690. Now their high-end dual GPU costs $3,000. WTF happened in two years? I sincerely hope this turns around and bites them in the ass somehow.


----------



## Nihilus (Mar 26, 2014)

April Fools is NEXT week, right?


----------



## lanceknightnight (Mar 26, 2014)

This card is for enterprise use and is not a bad price for that market.


----------



## Xzibit (Mar 26, 2014)

lanceknightnight said:


> This card is for enterprise use and is not a bad price for that market.



Nvidia disagrees with you



> Two GPUs, One Insane Graphics Card: Introducing the GeForce GTX TITAN Z.
> 
> GeForce GTX TITAN Z is a gaming monster, built to power the most extreme gaming rigs on the planet. With a massive 5760 cores and 12 GB of 7 Gbps GDDR5 memory, TITAN Z gives you truly amazing performance—easily making it the fastest graphics card we’ve ever made.
> 
> This is a serious card built for serious gamers. TITAN Z is designed with the highest-grade components to deliver the best experience – incredible speed and cool, quiet performance—all in a stunningly crafted aluminum case.


----------



## SIGSEGV (Mar 26, 2014)

lanceknightnight said:


> This card is for enterprise use and is not a bad price for that market.



geforce brand tagged as an enterprise segment? holy crap... wtf are you talking about?
---


yeah, the price is normal for extreme (idiot) rich people or die hard fanboy (willing to die for JHH)


----------



## Fluffmeister (Mar 26, 2014)

Xzibit said:


> Nvidia disagrees with you



Indeed, based on their initial Titan experiment, who can argue?


----------



## Xzibit (Mar 26, 2014)

Fluffmeister said:


> Indeed, based on their initial Titan experiment who can argue.



You mean that was the first time people tried to use a gaming GPU to avoid paying the price of a workstation card.

With this revelation, I don't see why you aren't working at Nvidia yet.


----------



## Fluffmeister (Mar 26, 2014)

Xzibit said:


> You mean that was the first time that people tried to use a gaming GPU in order to not pay the price of a workstation card.
> 
> With this revelation I dont see why you aren't working at Nvidia yet.



Indeed, I'm surprised too.

But I try to educate through the tears and tantrums.


----------



## Chetkigaming (Mar 26, 2014)

2 Titan Blacks at just SUCH a premium!! This price is because of bad GPUs from AMD that cannot compete with the Titan.


----------



## HumanSmoke (Mar 26, 2014)

Fluffmeister said:


> But I try to educate through the tears and tantrums.


Trust you to unearth Rory's secret weapon




"RELEASE THE KLEENEX!!!"

Nice to see so many of the usual anti-Nv trolls finally join the party. Nap time must be over.


----------



## erocker (Mar 26, 2014)

HumanSmoke said:


> Trust you to unearth Rory's secret weapon
> 
> 
> 
> ...


Where are the trolls? I see people making legitimate comments and voicing their concerns over ridiculous pricing.


----------



## Am* (Mar 26, 2014)

So Nvidia went from trolling (with $1000 Titan) to just plain retarded ($3000) in a year...






But in all seriousness, can someone please explain to me what they are charging an extra $1000 for over two already stupidly overpriced Titan Blacks? For that awful cooler that's guaranteed to be incapable of cooling either of the GPUs, or the worse than shitty VRMs that are going to whine under 3D load?

P.S. In before AMD come out with an $1100 dual 290x card that's going to kick Nvidia in their groin (causing an instant $1000 price drop). I'd hate to be the owner of this card knowing how quickly it's going to de-value and by how much...and here I was worried about the 780 Ti getting price cuts sometime soon. Looks like there's no chance of that happening for a while.

I dread to think how much Maxwell is going to be overpriced by...will probably just wait for a slight price drop on the 780 TIs before buying one up and calling it a day on upgrading for a good 3-5 years, especially at these moronic prices thanks to Nvidia.


----------



## lanceknightnight (Mar 26, 2014)

The Titan moniker was used by the lowest-end workstation card. This card was given drivers for video gaming, which had it outperform other cards on the market while maintaining a price that was for enterprise. The card was a hit in the gaming community and entered a price range few thought would be viable: enterprise prices for a home card. The marketing now reflects that the market can bear these prices. Outlandish prices that are silly in the home make sense in a business environment that can utilize this type of horsepower. These sales taught a lesson: the gaming market is now willing to drop thousands on cards.

This X2 card is a good price for an existing enterprise system with limited slots. In a home unit it is silly. But I reiterate: they are going to market it to this segment because the Titan sold like gangbusters to the gaming community. The value edition of an enterprise card, with what was thought to be an outlandish price, sold out. Process that. The card sold too well. Thus, this time they are selling an enterprise-type card and marketing it to gamers. The enterprises that would want this card will know who they are (they have Titans now); it is the gamers who must be convinced. Most of these cards will need riser cables, but what is $10 when you have already dropped $3k?

Good play Nvid. 

On a side note, this play is also amazing for the rest of the market, as it keeps the one-chip versions viable and sets a new high bar (think limbo) on price for AMD when they release their x2 cards. Nvidia has ensured that the price war will go higher, in favor of manufacturers. This is even shown in the pricing of the 290X, which at retail is going for $100 over MSRP. The market is showing that it is willing to bear higher prices, and resellers are demanding more in kind. Nvidia is helping distributors make a better profit on a specialized card. It is also attempting to tap out of the price war with AMD.

I think this is a way to recoup the losses from the console business and a few other poor choices on their part. The market Nvidia has is the Titan cards and the new sync tech in these cards for monitors. These premium items allow for overlap and profit. However, Nvidia can see that Intel, the chip monster, is on its way, as it cannot crack the mobile market, and AMD is also working on better on-die GPU tech, which makes the only viable markets the high-end consumer and the enterprise client. This leads me to think of Nvidia storing up for winter like a bear, knowing that it needs to get fat while it can, as winter is coming. This is a great play for Nvidia: they make money to recoup losses and attempt a different market entrance in a year in order to stay relevant on a five-year plan.



Am* said:


> So Nvidia went from trolling (with $1000 Titan) to just plain retarded ($3000) in a year...
> 
> 
> 
> ...








AMD will come out with the super-hand-dryer GPU soon, and it will be big, I am sure. I have 2 290X cards, but I respect the temps and workmanship of the Titan line; cost-wise, though, AMD is king. But the winner of the race makes much more money. I hold that Nvidia is encouraging AMD to push prices higher to sustain both businesses. Now, AMD could undercut, and is expected to do so, but may go triple-price as well. I do not care for this idea, but I can see that if AMD does not do it, the retailers will. This Titan-Z is a market-viability test put on an enterprise solution. As it is an enterprise solution, there is a built-in market that will buy the product if this test fails. If the test is successful, you will see the first batch of cards sell out very fast. If there is no sell-out, the chips will go to enterprise use at no loss. If there is a sell-out, Nvidia will know the market will bear even more.


----------



## SIGSEGV (Mar 26, 2014)

HumanSmoke said:


> Trust you to unearth Rory's secret weapon
> 
> 
> 
> ...



i haven't seen any trolls, have you? 
i do see that you're the true troll here


----------



## HumanSmoke (Mar 26, 2014)

erocker said:


> Where are the trolls? I see people making legitimate comments and voicing their concerns over ridiculous pricing.


Well, I thought the pricing as a gaming card had already been done to death as exorbitant. What I'm also seeing is people deriding the card as a budget pro alternative... or, in Xzibit's case, deriding someone who pointed out that the card might have more than one buying segment. I'd call that trolling; you call it legitimate. I think we can agree to disagree on that.


Xzibit said:


> Nvidia disagrees with you


(_BTW: this poster's thoughts on the HD 7990, which was 2.5-3x more expensive than its single-GPU card, were "__People are gonna eat these up if OC is good just to have the best scores__"... wouldn't the same apply to this card?)_
...When I had already posted a link to both the gaming and pro/dev sites, and it's apparent that Nvidia has priced the card for max PR: enough supply to keep it relevant in gaming benchmark charts, whilst adopting a top-down pricing hierarchy based on the cards above it, not below it. I think Ryan Smith at Anandtech summed up the same sentiments:


> Now for compute users this will still be an expensive card, but potentially very captivating. Per FLOP GTX Titan Black is still a better deal, but with compute users there is a far greater emphasis on density*. Meanwhile the GTX Titan brand has by all accounts been a success for NVIDIA, selling more cards to compute users than they had ever expected, so a product like GTX Titan Z is more directly targeted at those users. I have no doubt that there are compute users who will be happy with it – like the original GTX Titan it’s far cheaper per FP64 FLOP than any Tesla card, maintaining its “budget compute” status – but I do wonder if part of the $3000 pricing is in reaction to GTX Titan undercutting Tesla sales.


But hey, everyone should keep thrashing the gaming-only aspect. Awesome.
* See post #39 in this thread


SIGSEGV said:


> i haven't seen any trolls, have you?
> i do see that you're the true troll here


You know what they say, If you can't beat them...hold the mirror up to their faces - or somesuch.


----------



## Fluffmeister (Mar 26, 2014)

SIGSEGV said:


> i haven't seen any trolls, have you?
> i do see that you're the true troll here



I dunno; if you aren't in the market for the $3000 Titan Z*, then it kinda falls into trolling too.

*Nvidia offers cards to fit everyone's budget.


----------



## lanceknightnight (Mar 26, 2014)

"Huang compared the new Titan Z to Google Brain, which features 1,000 servers packing 2,000 CPUs (16,000 cores), 600 kWatts and a hefty price tag of $5,000,000 USD. A solution using Titan Z would only need three GPU-accelerated servers with 12 Nvidia GPUs total."

Source:http://www.tomshardware.com/news/geforce-titan-z-nvidia-gk110-cuda,26391.html


The unit is obviously directed at enterprise, but the marketing will be for rich gamers, as the cost benefit to a server farm is simple math and needs little convincing: $5k per server times 3 is $15k; add the $36k for the cards and you have a $51k cost replacing $5 million. The cost of a day of server time is now more than buying your own. Also, the water test was used due to its difficulty, but also because industries like auto use hydrodynamics; this shows real-time testing ability to server-time purchasers. Those individuals may just buy a server brain now.
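Laying that arithmetic out (a sketch using the post's own figures; the $5k-per-server and $3k-per-GPU pricing are the post's assumptions, not published numbers):

```python
# Cost comparison from the Tom's Hardware quote, figures as given in the post.
google_brain = 5_000_000            # 1,000 servers / 2,000 CPUs, as quoted

servers = 3 * 5_000                 # three GPU-accelerated servers at ~$5k each
cards = 12 * 3_000                  # the post prices 12 GPUs at $3k apiece
titan_z_build = servers + cards

print(titan_z_build)                        # 51000
print(round(google_brain / titan_z_build))  # ~98x cheaper on paper
```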


----------



## SIGSEGV (Mar 26, 2014)

Fluffmeister said:


> I dunno, if you aren't in the market for the $3000 Titan Z* then it kinda falls into trolling too..



what a brilliant conclusion..


----------



## awesomesauce (Mar 26, 2014)

This card is really overpowered.

Man, if someone buys this today, he can hoooooooooooooooolllddddddd onto it a f**king long time.
Look at this monsta... back to me,
who does not want it??

Wait 1-2 years and this card will drop in price (hope so). For now, admire.


----------



## Xzibit (Mar 26, 2014)

HumanSmoke said:


> You know what they say, If you can't beat them...hold the mirror up to their faces - or somesuch.



Okay.




HumanSmoke said:


> *I don't like any dual GPU card on principle - not just this one.*
> *Duallies are usually more problematic, suffer in ability (OC) to two single cards, have more issues with drivers, lower resale, and generally aren't a saving over two separate cards*


^I agree this Titan Z is terrible.


----------



## sweet (Mar 26, 2014)

$3,000??? Uhm, it's nVidia after all.
EDIT: Seems like I misread a bit :|


----------



## 15th Warlock (Mar 26, 2014)

$3K?

Thanks, but no thanks, and this is coming from a guy who invested over $2K in Titans a year ago. This is a halo product, a placeholder to deter any damage to the Nvidia brand now that the dual 290X is about to be released.

I'll wait for big Maxwell, thank you very much 

EDIT:



sweet said:


> 3000$??? Uhm, it's nVidia after all.
> But for just 3 GB of VRAM effective? A trash card :|



I think you misread the release; this card features 6 GB of VRAM per GPU, 12 GB total.


----------



## HumanSmoke (Mar 26, 2014)

Xzibit said:


> ^I agree this Titan Z is terrible.


I don't know why you felt you had to bold the whole thing. I don't like dual-GPU gaming cards, and the Titan Z would be no exception; in fact, given its pricing, it makes no sense at all, as I've already stated.
I believe what I pointed out is that the card is just as much aimed at non-gaming workloads. Most of the talk is about gaming (isn't it always); most of the loud talkers have a gaming-only mentality, and marketing targets gaming since 1. it is a large market, 2. it is easily swayed by a halo product, 3. it generates endless PR via forum debate, and 4. pro users buy based on fact/workload efficiency and generally aren't swayed by marketing or anonymous posting drivel.
I really don't see much difference between my stance in this regard and the one I took with the original Titan (note the troll post immediately after mine):


HumanSmoke said:


> I'd say the company are walking a fine line between keeping the card relevant enough to maintain its inclusion in review suites, whilst not underselling the board causing a complete sell out/no stock situation and cannibalizing Tesla sales for people who want compute/FP64 but have little use for ECC (somewhat overrated in GDDR5 anyhow).



Screaming that Nvidia won't sell cards based on their gaming ability alone doesn't alter the fact that the boards will find users with other workloads in mind...and their priorities probably aren't the same as someone who sits around playing Titanfall.


----------



## matar (Mar 26, 2014)

WOW, super impressed that NVIDIA was able to put 2 full GK110s on one PCB with all ROPs and 6 GB per GPU.
I am an NVIDIA fan, but not this time. $3,000 is insane even for an enthusiast gamer, because you will always get better results from real 2-way SLI rather than single-card SLI; it's much more useful in many scenarios, one of them being better overclocking, and if you run out of cash you can sell one card. The only benefit of a single PCB is if you have one PCIe slot and want the best; then that's the way to go, but not by paying a triple fee. OK, I understand 1+1=2, so $2,000; or, since NVIDIA worked a lot to accomplish this dual GPU, $2,200 if they offer the same clock speeds on one PCB at lower power consumption. And I still think any GPU over $1K is insane.


----------



## tjmagneto (Mar 26, 2014)

Is the GTX Titan-Z made with Super Alloy Z?


----------



## xenocide (Mar 26, 2014)

matar said:


> $3000 Is insane even for enthusiast gamer


 
Yeah, it's almost like they aren't targeting gamers with this card.  If only these cards with insane FP64 performance, usable in enterprise environments, had a market of their own, maybe Nvidia could get away with such a "costly" solution.  But since nobody on earth does anything with GPUs but play video games, clearly they are just corporate assholes trying to squeeze money out of the public.


----------



## leeb2013 (Mar 26, 2014)

It's a crazy price for an unnecessary graphics card. Of course there's much better value out there, but some people will need the Titan-Z (devs?) and some people have lots of money. If there were no demand at all, they wouldn't make it. Is a $1M hypercar 200 times better than my Ford Falcon? Nope, but it's 200 times more expensive, and some people can and will buy it.


----------



## dados8756 (Mar 26, 2014)

3k for GPU??? LoL


----------



## LeonVolcove (Mar 26, 2014)

Chill out guys.

I think the conclusion is quite simple

1. For those who want to play at ULTRA-HIGH-END settings, at 4K or higher resolution, lag-free with G-Sync on, who are ABLE to afford such a price without a problem and want to SHOW OFF their rig, or who are die-hard "green" fans: feel free to buy. No one is going to stop you; it's your money after all.

2. For those who are already satisfied with their current rig and cannot afford such a luxury: don't buy it. That's all.

And last but not least, the Red Team:
Don't be sad. Let Nvidia have their fun for now; I am sure AMD has something up their sleeve to counter this, and Nvidia will have to turn this "BEAST's" price down.


----------



## SetsunaFZero (Mar 26, 2014)

now wait for the Titan-X and Titan-Y, and you can fuse them together into the XYZ-Titan 


Spoiler


----------



## LeonVolcove (Mar 26, 2014)

Good 


SetsunaFZero said:


> now wait for Titan-X and Titan-Y and u can fuse them together to XYZ-Titan
> 
> 
> Spoiler


+1 Good Idea


----------



## xvi (Mar 26, 2014)

HumanSmoke said:


> * I'm going to guess that the number of people that drop $3K on a graphics card but only have a motherboard with a single PCIe x16 slot borders on the infinitesimal.


..unless they want three Titan-Zs.


----------



## RCoon (Mar 26, 2014)

Why the outcry? This is not for 99% of the world's gamers. This is for the criminally rich oil barons and such, who run intel extreme chips and probably don't build the systems for themselves.

Everybody is also missing the entire point of the current Titan. It's a compute card. Even at $3000, this is an *incredibly cheap compute card*. It sold like hotcakes in the first place for the compute power. Or has nobody seen how expensive the K series compute cards are for NVidia?


----------



## Tatty_One (Mar 26, 2014)

erocker said:


> So... two years ago Nvidia's high end dual GPU set you back $1000 bucks with the GTX 690. Now, their high-end dual GPU costs $3000. WTF happened in two years? I sincerely hope this turns around and bites them in the ass somehow.


Monopoly, sadly. However, most speculation that I have read on other sites suggests it's just a gimmick card to win back some e-peen territory, and in reality there may be only about 10,000 sold or even produced worldwide. Guess we will just have to wait and see. To be honest I quite like the look of it; if over the coming months NVidia reduces the price to $199 I might just get one!


----------



## cmberry20 (Mar 26, 2014)

Takes 3 Slots.

Is $3k

Obvious really: Half Life 3 confirmed.


----------



## Xzibit (Mar 26, 2014)

tjmagneto said:


> Is the GTX Titan-Z made with Super Alloy Z?



Might be same material they made these Titan Z out of.












> This is a serious card built for serious gamers. TITAN Z is designed with the highest-grade components to deliver the best experience – incredible speed and cool, quiet performance—all in a stunningly crafted* aluminum case*.


----------



## jigar2speed (Mar 26, 2014)

Nvidia price trolling AMD - like they don't care ?


----------



## Recus (Mar 26, 2014)

...better than R9 295x2.



Spoiler


----------



## Prima.Vera (Mar 26, 2014)

beardofnails said:


> The pci *3.0* bandwidth is damn near saturated with a single gpu 680.


You mean PCIe 2.0...  PCIe 3.0 is not even saturated by a dual-GPU card. Just check and compare the specs if you don't believe me. Google is your friend.


----------



## Chetkigaming (Mar 26, 2014)

cmberry20 said:


> Takes 3 Slots.
> 
> Is $3k
> 
> Obvious really: Half Life 3 confirmed.


lol  Exactly


----------



## rtwjunkie (Mar 26, 2014)

beardofnails said:


> I am so tired of seeing dual GPU cards. This trick only works on people that do not understand how computers work. The pci 3.0 bandwidth is damn near saturated with a single gpu 680. I'm no engineer, but I'm pretty sure that a tractor trailer can't fit through the eye of a sewing needle...


 
I was under the impression that a GTX780 still doesn't saturate the bandwidth of PCIe 2.0?  If so, we're pretty safe with dual video cards on PCIe 3.0.
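For what it's worth, the theoretical numbers support that. Here's a quick sketch (assuming the published PCIe transfer rates and encoding overheads: 5 GT/s with 8b/10b for gen 2.0, 8 GT/s with 128b/130b for gen 3.0) of per-direction x16 bandwidth:

```python
# Theoretical per-direction bandwidth of a PCIe x16 slot, by generation.
# gen -> (transfer rate in GT/s, encoding efficiency)
GENS = {
    "2.0": (5.0, 8 / 10),     # 8b/10b encoding: 20% overhead
    "3.0": (8.0, 128 / 130),  # 128b/130b encoding: ~1.5% overhead
}

def x16_bandwidth_gbs(gen: str, lanes: int = 16) -> float:
    rate_gt, eff = GENS[gen]
    # GT/s * efficiency = usable Gbit/s per lane; /8 for GB/s; * lane count
    return rate_gt * eff * lanes / 8

for gen in GENS:
    print(f"PCIe {gen} x16: {x16_bandwidth_gbs(gen):.2f} GB/s per direction")
# PCIe 2.0 x16: 8.00 GB/s per direction
# PCIe 3.0 x16: 15.75 GB/s per direction
```

So even a card that genuinely filled a 2.0 x16 link would have roughly half of a 3.0 slot left over.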


----------



## TheHunter (Mar 26, 2014)

I'm still disappointed in Jen-Hsun.

First he shows Pascal, which is irrelevant for another 2-3 years; a few moments later he shows this meh Titan-Z GK110 GPU...


----------



## LeonVolcove (Mar 26, 2014)

I know this sounds stupid, but I don't see anything about GPU clocks or memory clocks.


----------



## Yorgos (Mar 26, 2014)

only $2999, not even 3000


----------



## LeonVolcove (Mar 26, 2014)

Yorgos said:


> only $2999, not even 3000



$1 doesn't make any difference, my friend.


----------



## TheHunter (Mar 26, 2014)

LeonVolcove said:


> I know this sound stupid but i dont see anything about GPU Clocks or Memory Clocks



288 GB/s means 6 GHz (effective) memory, just like the old Titan; 8 TFLOPS means ~700 MHz per GPU.
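A back-of-the-envelope sketch bears those estimates out (assuming the announced figures of 5,760 total CUDA cores and 8 TFLOP/s single precision, plus 2 FLOPs per core per clock for FMA and a 384-bit memory bus per GPU, as on the original Titan):

```python
# Back-solve the implied clocks from the announced Titan-Z specs.
CORES_TOTAL = 5760          # 2 x 2880 (two fully enabled GK110s)
FLOPS_TARGET = 8.0e12       # 8 TFLOP/s single precision, per the announcement

# SP throughput = cores * 2 (FMA) * clock  ->  solve for clock
core_clock_hz = FLOPS_TARGET / (CORES_TOTAL * 2)
print(f"implied core clock: {core_clock_hz / 1e6:.0f} MHz")   # ~694 MHz

# Memory bandwidth = bus width (bytes/transfer) * effective data rate
bus_bytes = 384 // 8        # 48 bytes per transfer, per GPU
effective_rate_hz = 6.0e9   # 6 Gbps GDDR5 ("6 GHz effective")
bandwidth = bus_bytes * effective_rate_hz
print(f"bandwidth per GPU: {bandwidth / 1e9:.0f} GB/s")       # 288 GB/s
```

Both numbers line up: roughly 700 MHz per GPU, well below the Titan Black's clocks, which is what you'd expect from squeezing two GK110s into one power budget.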


----------



## Fairlady-z (Mar 26, 2014)

I'd be lying if I said I wouldn't want that, lol. I'd take one in a heartbeat if I were financially capable of spending this much on something I use for gaming, but to me this card seems to be focused on pros. I would love to have one; or why stick with one when two of them is better, lol. 

Now if someone could show me how I can cram this into my iMac, that would be great, lol.


----------



## LeonVolcove (Mar 26, 2014)

TheHunter said:


> 288gb/s means 6ghz ram, just like on old titan, 8tflops means ~ 700mhz per gpu.



Well, this thing is half gaming graphics card and half pro graphics card, but just out of curiosity: that so-called "5K", is it true? AMD announced that the R9 290 series is ready for 4K, and now Nvidia says this card is ready for 5K.


----------



## Vario (Mar 26, 2014)

Xzibit said:


> Might be same material they made these Titan Z out of.


Can you pimp that ride, Xzibit?


----------



## TheHunter (Mar 26, 2014)

Vario said:


> Can you pimp that ride, Xzibit?


lol'd


----------



## 64K (Mar 26, 2014)

I don't know very much about compute cards, but does the Titan Z even make sense for someone who needs a card for compute and gaming? Wouldn't 2 Titans for $1,000 less be a better deal? I don't think Nvidia is going to release the Titan Z with anywhere close to a 500-watt TDP, so it won't be as powerful as 2 Titans.


----------



## Xzibit (Mar 26, 2014)

Vario said:


> Can you pimp that ride, Xzibit?



No problem




Before





Pimp'd out





Here some more Titan Zoolander






2x8-pin




3-slot, so it's very limited for SFF builds like ITX, where there are only 2 slots and depth can be an issue.


----------



## Am* (Mar 26, 2014)

Xzibit said:


> No problem



Hah, you legend...


----------



## SimpleTECH (Mar 26, 2014)

For $3000, this card better come with sharks with frickin' laser beams attached to their heads.


----------



## Casecutter (Mar 26, 2014)

After listening to all this, I have an atypical interpretation (or, as "Serpent of Darkness" covers in point #5): Nvidia enjoys (nay, almost demands) such PR to keep selling GK110s as gaming offerings. Hear me out.

Nvidia knows the tipping point at which they can recoup the engineering, tooling, and manufacturing costs of delivering such a card, and they have a good idea of the number they can expect to sell.  Even if those units go mainly to enterprise purchasers, the PR frenzy it whips up among "gamers" just adds to the business case for releasing it.  It's a perfectly good plan, and it returns revenue, actually better profit than selling "X" amount of chips individually (at lower margins) as offerings that counteract AMD's Hawaii products.  Nvidia gets to elevate the brand even higher, use up chips on extreme product offerings, and that actually adds "cred" to them as not directly vying with AMD.

We know 20 nm Maxwell is some time off; Nvidia needs to maintain GK110 production, but can't hold margins selling a bunch in some price war, especially the good full-compute parts.  I think Nvidia realized the dual-chip board lends itself more to low-end enterprise, as the package provides substantial punch in non-traditional chassis arrangements.  Two of them give you 4x compute without the need for risers on a more traditional (cost-effective) motherboard, though honestly I don't know how/what is entailed in that today.   So they release this, as it devours two full chips, pays the overhead, and returns profit, all the while shoring up the legitimacy of the pricing on the GTX 780/780 Ti.  Best of all, it keeps the gamers discussing it on the sidelines.  It's a very shrewd move, to the extent that they use 2x the product (GK110) in an exceptionally high-margin offering.

It will find buyers mostly among enterprise/compute users who hadn't justified Tesla/Quadro pricing.  I see it as a product that 85% of even "bleeding-edge gamers" won't step up to purchase... that's fine.  It's a card that folks can use for experimentation, pushing huge resolutions and multiple-monitor configurations, and if some deep-pocketed gamer finds it worthwhile... all the merrier.


----------



## Vario (Mar 26, 2014)

Xzibit said:


> No problem
> 
> 
> 
> ...


Disappointed in the lack of a bubble gum dispenser, bowling ball shiner, and a fish tank


----------



## johnspack (Mar 26, 2014)

Silly card....  gaming? workstation card with half the ram.....   just silly....


----------



## EarthDog (Mar 26, 2014)

1. +1
2. Depending on in-game settings and the title, 6GB will NOT be enough for 3x 4K... FFS BF4 at default Ultra uses almost 3GB at 2560x1440...
3.  +1
4.  +1
5.  Prices do not always have to go up when sales are good... basic business principle there...*blows dust off degree* LOL!
6.  Who knows...
7. I had no idea the (vaporware?) 2x r9 290x was scaling 2x consistently. Like all dual card setups, it depends on several factors to determine scaling... drivers, the title, resolution, in-game settings, etc.


----------



## xorbe (Mar 27, 2014)

I can't imagine even the hardest core gamer buying one of these.  Maybe some rich sheik addicted to WoW.  A handful of people with the biggest and baddest flight sim setups that are already megabuck cockpits.

The sale to compute users isn't entirely clear either: a Titan Z may use 3/4 the space of two Titan Blacks, but 2x Titan Black is 2/3 the cost, probably faster, and far less costly to replace if one fails.

(Disclaimer: I have an original firmware-flashed Titan, which has been quite good.)


----------



## HumanSmoke (Mar 27, 2014)

xorbe said:


> I can't imagine even the hardest core gamer buying one of these.  Maybe some rich sheik addicted to WoW


They'd have to be a rich sheik with brain damage.
By all accounts, EVGA will sell the  6GB 780 inc Kingpin and GTX 780 Ti for around the same money as the current 3GB versions. A couple of 780Ti's for around half the money of a Titan Z, and I'm pretty sure which will overclock better.


xorbe said:


> The sale to compute users isn't entirely clear either: Titan Z may use 3/4 the room of Titan Black, but 2xTitan Black is 2/3 the cost, and probably faster, and far less costly to replace if one fails


Not necessarily. Tyan sells custom barebones especially for rendering (the first board below is actually by Trenton, the second a Supermicro); there is a reason why they are fully stocked with boards. If the 4U (in this case) can accommodate 10 GPUs (5x Titan Z), why would they stick with 8 Titans/Titan Blacks? You are potentially losing 20% of the possible performance per unit. It isn't really that much different from server CPU economics: a higher initial cost may still be less than needing extra hardware to achieve the same throughput. There's a reason that there are multi-PCI-E-slot boards available, and they generally revolve around putting as much processing power as possible into a single unit.


----------



## The Von Matrices (Mar 27, 2014)

HumanSmoke said:


> Not necessarily. Tyan sells a custom barebones especially for rendering (the first board below is actually by Trenton, the second a Supermicro), there is a reason why they are fully stocked with boards. If the 4U (in this case) can accommodate 10 GPUs (5 x Titan Z), why would they stick with 8 Titan/Titan Black ? You are potentially losing 20% of the possible performance per unit. It isn't really that much different from server CPU economics- initial cost may be somewhat less than needing extra hardware to achieve the same throughput - there's a reason that there are multi PCI-E slot boards available, and they generally revolve around putting as much processing power as possible into a single unit.



I still don't see how a Titan Z could work in a server case.  Those cases are designed for a torrent of front to back air flow.  But the Titan Z (and other modern consumer dual GPU cards) exhaust from both the front and rear of the card.  In such a server case the Titan Z's front GPU will be starved for cooling since it exhausts against the case's flow of air.

As far as I know you have to go to the Tesla range with NVidia (like the K10) and Firepro with AMD (like the S10000) to get dual GPU cards that will work with front to back airflow.


----------



## Xzibit (Mar 27, 2014)

Making it seem like everyone's doing it just because someone has a blog about it doesn't make it the norm.

You can always just read the guys blog.  Echelon Blog



> *I can however not attest to how stable they are in a 24/7 setting. Our systems run 24/7 but computation on GPUs is only done in small bursts throughout the day. There are however some computing centers that run multi-GPU systems (not 8 but 2 or 3 per node) with consumer grade cards quite successfully. There are probably more problems than with professional cards but they are also a lot more expensive.*


----------



## HumanSmoke (Mar 27, 2014)

The Von Matrices said:


> I still don't see how a Titan Z could work in a server case.  Those cases are designed for a torrent of front to back air flow.  But the Titan Z (and other modern consumer dual GPU cards) exhaust from both the front and rear of the card


You'd be best to direct the question at someone who runs one (or more). There are a few blogs and sites for multi-GPU CG workload machines.
FWIW, the same could be said of the GTX 690, a card that I've seen more often as an Octane renderer than as a gaming card. I'm not saying the central radial fan and front exhaust are ideal, since they work against the natural airflow, but it doesn't seem to deter everyone.








Tyan barebones rackmount product page.


Xzibit said:


> Making it seam like every ones doing just because someone has a blog about it doesn't make it  the norm.


Didn't say everyone was doing it, or that it's the norm. If it were, then Nvidia would be selling boatloads of them, wouldn't they?




I can keep putting up the render rigs if you'd like just to prove that Echelon's isn't the only render rig in existence.


----------



## Disparia (Mar 27, 2014)

The Von Matrices said:


> I still don't see how a Titan Z could work in a server case.  Those cases are designed for a torrent of front to back air flow.  But the Titan Z (and other modern consumer dual GPU cards) exhaust from both the front and rear of the card.  In such a server case the Titan Z's front GPU will be starved for cooling since it exhausts against the case's flow of air.
> 
> As far as I know you have to go to the Tesla range with NVidia (like the K10) and Firepro with AMD (like the S10000) to get dual GPU cards that will work with front to back airflow.



You just remove the fan unit and shroud. Even if you kept them on, it's not like those fans will actually disrupt airflow in the cases by Tyan and Supermicro. Like a squirrel standing up to a semi.


----------



## TheHunter (Mar 27, 2014)

HumanSmoke and your point is?

butthurt much that you need to prove something?


----------



## xorbe (Mar 27, 2014)

Obviously Titan Z was aimed at AMD's W9100 today, which is probably more than $2999.


----------



## FX-GMC (Mar 27, 2014)

TheHunter said:


> HumanSmoke and your point is?
> 
> butthurt much that you need to prove something?



Seems like the norm for his posts..

Might buy one of these if I win the powerball.


----------



## radrok (Mar 27, 2014)

Yawn, Nvidia. I've been enjoying the same kind of performance, if not more, for over a year now, and a thousand dollars cheaper. Old tech and overpriced; give us a new architecture please, and stop milking the dead horse that is Kepler.


----------



## xorbe (Mar 27, 2014)

HumanSmoke said:


> . If the 4U (in this case) can accommodate 10 GPUs (5 x Titan Z), why would they stick with 8 Titan/Titan Black ? You are potentially losing 20% of the possible performance per unit.



You would save $7000.  Buy two more Titan Blacks, and use the $5000 left to buy whatever additional server room is needed.  I don't see how it comes out ahead.
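Spelled out with the thread's list prices (assuming $2,999 per Titan Z and $999 per Titan Black, and ignoring chassis/host costs), the arithmetic looks like this:

```python
# Sanity check of the per-node cost comparison: 5 dual-GPU Titan Zs
# (10 GPUs in one 4U) versus 8 single-GPU Titan Blacks.
TITAN_Z, TITAN_BLACK = 2999, 999    # list prices in USD, per the thread

dense_node = 5 * TITAN_Z            # 10 GPUs via 5 dual-GPU cards
single_node = 8 * TITAN_BLACK       # 8 GPUs via 8 single-GPU cards
savings = dense_node - single_node
print(f"saved by going single-GPU: ${savings}")     # ~$7,000

# Spend part of the savings on two more Blacks to match the GPU count,
# leaving the remainder for extra rack space or another host.
leftover = savings - 2 * TITAN_BLACK
print(f"left over at equal GPU count: ${leftover}") # ~$5,000
```

Whether that leftover actually covers another populated host is the real question; it only favors the Titan Z when rack space or slot count is the binding constraint.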


----------



## 15th Warlock (Mar 27, 2014)

xorbe said:


> Obviously Titan Z was aimed at AMD's W9100 today, which is probably more than $2999.



Bingo!

What boggles my mind is why call it a Geforce and not Tesla... Seems like Nvidia is trying to appeal to a much wider audience by marketing this card to Compute users as a cheap alternative to Tesla and gamers alike.


----------



## Xzibit (Mar 27, 2014)

15th Warlock said:


> Bingo!
> 
> What boggles my mind is why call it a Geforce and not Tesla... Seems like Nvidia is trying to appeal to a much wider audience by marketing this card to Compute users as a cheap alternative to Tesla and gamers alike.



Doesn't Nvidia still lock cards out of that software compatibility, with BIOS + PCB hardlocks and softlocks, so you just can't use it as, or trick it into being, a Tesla/Quadro? At $3,000 it loses its appeal as a cheap alternative to anything serious.

It might also be feeling pressure from Intel MICs and AMD FirePro, which have started providing competitive product stacks at much lower prices in that segment.

I believe you're right, though, that the aim is more the student/gamer/novice.


----------



## HumanSmoke (Mar 27, 2014)

Xzibit said:


> Doesn't Nvidia still locks cards out from benefiting compatibility with software, Bios + PCB, hardlocks and softlocks so you just cant use or trick it into being a Tesla/Quadro.  At $3000 it looses its appeal as a cheap alternative to anything serious.


Not really, since you can use Quadro and Forceware drivers concurrently. If you need Viewport or pro driver support, it is a simple matter of connecting display out to a cheap Quadro NVS - which typically start at ~$100 (or any other Quadro as the mixed GPU machine picture shows in post #103).


TheHunter said:


> HumanSmoke and your point is? butthurt much that you need to prove something?


Providing a possible usage scenario for the subject of the hardware being discussed.
Providing examples to substantiate said possible usage scenarios.

Maybe I should just take a leaf out of your book and go with the resource-light, no-thinking-required ad hominem attack that has zero content regarding the actual subject of the thread.


----------



## FX-GMC (Mar 27, 2014)

HumanSmoke said:


> Not really, since you can use Quadro and Forceware drivers concurrently. If you need Viewport or pro driver support, it is a simple matter of connecting display out to a cheap Quadro NVS - which typically start at ~$100 (or any other Quadro as the mixed GPU machine picture shows in post #103).
> 
> Providing a possible usage scenario for the subject of the hardware being discussed.
> Providing examples to substantiate said possible usage scenarios.
> ...



butthurt is an adjective (which means it isn't a noun).  Therefore it cannot be considered name calling.  Now if he called you a hurt butt........


----------



## HumanSmoke (Mar 27, 2014)

FX-GMC said:


> butthurt is an adjective (which means it isn't a noun).  Therefore it cannot be considered name calling.  Now if he called you a hurt butt........


The image I pulled off a Google search as an illustration. What I _actually said_ was


HumanSmoke said:


> Maybe I should just take a leaf out of your book and go with the resource-light,-no-thinking-required *ad hominem* attack that has zero content regarding the actual subject of thread.


----------



## FX-GMC (Mar 27, 2014)

HumanSmoke said:


> The image I pulled off a Google search as an illustrative. What I _actually said_ was



Sorry, ignored the text in favor of the colorful image with a large arrow pointing to NAME CALLING.


----------



## Disparia (Mar 27, 2014)

xorbe said:


> You would save $7000.  Buy two more Titan Blacks, and use the $5000 left to buy whatever additional server room is needed.  I don't see how it comes out ahead.



It's like when you need to buy an 8-way Xeon instead of a pair of quads. The quads have a huge cost advantage, but they don't provide the necessary performance or aren't applicable to your situation.

Similarly, it's going to be far more common to see Titans in use, but there are going to be those occasions where it's advantageous to use the Z.


----------



## xorbe (Mar 27, 2014)

Xzibit said:


> Doesn't Nvidia still locks cards out from benefiting compatibility with software, Bios + PCB, hardlocks and softlocks so you just cant use or trick it into being a Tesla/Quadro.



Isn't the only difference ECC support?  Though I'm guessing some OpenGL operations are "driver nerfed" for the non-Quadro Titan.


----------



## Xzibit (Mar 27, 2014)

xorbe said:


> Isn't the only difference ECC support?  Though I'm guessing some OpenGL operations are "driver nerfed" for the non-Quadro Titan.



Yes. It's compatibility through software.  It reverts the workload back to the CPU or to software mode.

There are a few sites that try software tricks and PCB switching.  You can get the software to make it show up as a Quadro, and get 1 or 2 features working on certain commercial products, but it's not the same as running a Quadro versus a GeForce. Nvidia made sure of that.

Short list
* A lot of memory + ECC support.
* 64x antialiasing with 4×4 supersampling, 128x with Quadro SLI. The Geforce is limited to 32x, but supersampling is used only in certain 8x and 16x modes.
* Display synchronization across GPUs and across computers with the optional Quadro Sync card
* Support for SDI video interface, for broadcasting applications
* GPU affinity so that multiple GPUs can be accessed individually in OpenGL. This feature is available on AMD Radeon but not on Geforce.
* No artificial limits on rendering performance with very large meshes or computation with double precision
* Support for quad-buffered stereo in OpenGL
* Accelerated read-back with OpenGL. There are also dual copy engines so that 2 memory copy operations can run at the same time as rendering/computation. However, this is tricky to use.
* Accelerated memory copies between GPUs
* Very robust Mosaic mode where all the monitors connected to the computer are abstracted as a single large desktop.


----------



## HumanSmoke (Mar 28, 2014)

xorbe said:


> Isn't the only difference ECC support?  Though I'm guessing some OpenGL operations are "driver nerfed" for the non-Quadro Titan.


Yup. GeForce boards are firmware (and driver) crippled for OpenGL /OpenCL performance, so if the workload is primarily OGL based then no software mods will transform a GeForce's performance. CUDA performance isn't affected in the same way (hardware based).
The other primary differences are runtime validation (fewer errors in calculations) with Quadro, and better Viewport performance. The same applies to the difference between FirePro and Radeon. The video below compares a FirePro W5000 (basically a castrated HD 7850 - 25% fewer cores and texture address units, 50% slower memory) running rings around a Radeon.









Even if/when you can flash a gaming card into a pro card, you still can't get around the better runtime binning of the Tesla/Quadro/FirePro, so whilst you may save cash, it could come at the expense of visual artifacts in 3D renders or similar issues in other workloads.


----------



## Fluffmeister (Mar 28, 2014)

15th Warlock said:


> Bingo!
> 
> What boggles my mind is why call it a Geforce and not Tesla... Seems like Nvidia is trying to appeal to a much wider audience by marketing this card to Compute users as a cheap alternative to Tesla and gamers alike.



Because the Titan brand falls under GeForce, and has since the original launched; there is really nothing mind-boggling about it.

But you may well have nailed it with your assumption, since 3K for higher end Tesla/Quadro cards is really not a lot of money.

You see, when you create a high level brand and expect companies in turn to pay top dollar for it, it actually makes sense to protect that brand.

A concept peeps here clearly struggle with.


----------



## sweet (Mar 28, 2014)

Meanwhile, AMD released their Hawaii-based FirePro and TPU just ignored it. LOL, "unbiased" media.


----------



## HumanSmoke (Mar 28, 2014)

sweet said:


> Meanwhile, AMD released their Hawaii-base FirePro and TPU just ignored it. LOL no bias media


Technically, AMD just announced the W9100; it hasn't been released. Launch is slated for April.


----------



## lanceknightnight (Mar 28, 2014)

Casecutter said:


> After listening to all this, I have an atypical interpretation (or as "Serpent of Darkness covers in point #5) .  Nvidia enjoys (nye almost demands) such PR to keep selling GK110's as a gaming offerings, hear me out.
> 
> Nvidia knows the tipping point they can recoup engineering, tooling, manufacturing costs to deliver such a card, and they have a good idea the number they can expect to sell.  Even if that number of units mainly to enterprise purchasers, the PR frenzy it whips-up within "Gamers" just adds to the business plan for releasing it.  It's a perfectly good plan, and it returns venue, actually better profit that selling "X" amount chips individually (at lower margins) as offerings that counteracts AMD Hawaii product.  Nvidia gets to elevate the brand even higher, use up chips on extreme products offerings, and that actual adds "cred" to themselves as not directly vying with AMD.
> 
> ...


This is what I was saying too.


----------

