# NVIDIA GeForce GTX TITAN 6 GB



## W1zzard (Feb 17, 2013)

Announced earlier this week, NVIDIA's $1000 GeForce GTX TITAN set out to claim the single-GPU performance throne. The new card offers not only performance improvements but also comes with a new GPU Boost 2.0 algorithm that helps keep temperature and noise levels down.



----------



## dj-electric (Feb 21, 2013)

INB4 bashing: a reminder that these are results using an early driver for this card.


----------



## W1zzard (Feb 21, 2013)

Dj-ElectriC said:


> INB4 bashing: a reminder that these are results using an early driver for this card.



i doubt the driver is significantly different from that of other kepler cards, but we'll see


----------



## dj-electric (Feb 21, 2013)

You may never be completely sure... At least you can hope, eh?


----------



## Fluffmeister (Feb 21, 2013)

Those power consumption figures are really impressive.

Great review thanks and a fair conclusion.


----------



## BarbaricSoul (Feb 21, 2013)

Great review as always Wizz.

Only a 24% boost over my 7970? Guess I won't be buying one of these after all, and my next upgrade will be crossfired 7970s. And I had very high hopes for GK110.


----------



## DarkOCean (Feb 21, 2013)

Nice GPU, but the worst possible performance per dollar, lol.


----------



## buggalugs (Feb 21, 2013)

I don't get it, 25% more performance for 100% more cost? The way it was being hyped, I was expecting at least 50% better performance than current cards.

The 7970 had a much bigger performance jump over the 6970 and 580, and it was nowhere near this expensive.


----------



## Kaynar (Feb 21, 2013)

I'm pretty sure the card will have 10-20% better performance in 2-3 months with newer drivers, and the price will drop by 10-20% as usual after a few months. That is... if you can actually find one in stores.

Other than that, I really enjoyed the CrossFire/SLI review with the 7970, 680, and Titan, as it clearly showed me it is not worth the money due to poor performance in most games.

The only good thing about Titan is that you get a 30% performance gain for the same TDP.


----------



## D4S4 (Feb 21, 2013)

decent oc for a fresh card


----------



## Fourstaff (Feb 21, 2013)

Smoking powerful for "this gen". I wonder how PS4 is going to react to its own relative impotency when it goes on sale at the end of the year.


----------



## progste (Feb 21, 2013)

buggalugs said:


> I don't get it, 25% more performance for 100% more cost? The way it was being hyped, I was expecting at least 50% better performance than current cards.
> 
> The 7970 had a much bigger performance jump over the 6970 and 580, and it was nowhere near this expensive.



To be fair, it scores better at higher resolutions; it is clearly geared toward future high-res screens, but still, that price... -.-


*It would be interesting to see it against three HD 7950s, since the combined price is around the same!*



> Smoking powerful for "this gen". I wonder how PS4 is going to react to its own relative impotency when it goes on sale at the end of the year.



Just as usual: claiming to be the most powerful thing ever, and probably ending up convincing the Sony fanboys.


----------



## Kaynar (Feb 21, 2013)

progste said:


> To be fair, it scores better at higher resolutions; it is clearly geared toward future high-res screens, but still, that price... -.-
> 
> 
> *It would be interesting to see it against three HD 7950s, since the combined price is around the same!*



Well, there is this simultaneous review that shows 1-, 2-, and 3-card setups for the GTX 680, 7970, and Titan. The CrossFire setup results are quite underwhelming for the majority of the games, so 7950s in tri-fire would naturally be a bit worse than the results you can see with the 7970s.


----------



## hardcore_gamer (Feb 21, 2013)

Finally nvidia launched the "real GTX 680". Not quite twice as powerful as the GTX 580, but at twice the price.


Maybe we can expect the GTX 780 to be twice as fast as this card and cost 2000 bucks.


----------



## Fourstaff (Feb 21, 2013)

progste said:


> *It would be interesting to see it against three HD 7950s, since the combined price is around the same!*


I think it would be a pretty unfair comparison; firstly, a 7950 tri-fire setup will only work in certain configurations. The motherboard you need for 7950 tri-fire will probably skew the cost against the Titan, although you could argue that cost is no longer much of an issue at this price.



hardcore_gamer said:


> Finally nvidia launched the "real GTX 680". Not quite twice as powerful as the GTX 580, but at twice the price.
> 
> Maybe we can expect the GTX 780 to be twice as fast as this card and cost 2000 bucks.



I don't think we will be seeing the 780 anytime soon, so it's too early to speculate.


----------



## BigMack70 (Feb 21, 2013)

Meh. Stupidly overpriced card. Only for people who want to buy multiples of it or who REALLY don't want to go multi-GPU.

Any pair of already-available cards from the 7950/670 up looks significantly faster and cheaper at the same time. I understand there are multi-GPU issues that single GPU doesn't have, but I can't see it being worth both a substantial price hike and a performance loss at the same time.


----------



## hardcore_gamer (Feb 21, 2013)

Fourstaff said:


> I don't think we will be seeing the 780 anytime soon, so it's too early to speculate.



That was a sarcastic comment.

Gone are the days when the new flagship single GPU cards were twice as fast as the previous generation while costing the same.

GPU manufacturers are greedy these days. No wonder PCs are dying.


----------



## VulkanBros (Feb 21, 2013)

It can only be because NVIDIA wants the performance crown....
I don't get it - who the h... will pay for this........

Like having a toothbrush - in pure gold


----------



## Aquinus (Feb 21, 2013)

Fourstaff said:


> Smoking powerful for "this gen". I wonder how PS4 is going to react to its own relative impotency when it goes on sale at the end of the year.



Pretty sure that most people who buy a gaming console don't ask themselves, "well, before I buy it, what are the specs?"

I think Sony is going to make a lot of money and AMD is going to get a share of that. The Titan is a nice product, don't get me wrong, but it isn't where the consumers are. They're pandering to enthusiasts and the money to be made is in mainstream and mobile devices.

NVIDIA might have the fastest single-GPU card now, but it's the worst performance for the money, which puts it in the same class as Intel Extreme Edition CPUs. If NVIDIA were serious about competing with AMD, they would need to drop the price point on their cards across the board. AMD's weakened financial state is what is driving their prices down, and if NVIDIA could undercut AMD for just one line-up, I bet it could be a life-threatening blow to AMD, imho. I doubt that will happen, though; I don't think NVIDIA has the margin to undercut AMD.


----------



## Sasqui (Feb 21, 2013)

Impressive for a single-GPU card for sure, but the 690 beats it and isn't too far behind in power consumption, unless I'm missing something.

$1000? Wonder what these will sell for on eBay in 4 years, lol


----------



## Aquinus (Feb 21, 2013)

Sasqui said:


> Impressive for a single-GPU card for sure, but the 690 beats it and isn't too far behind in power consumption, unless I'm missing something.
> 
> $1000? Wonder what these will sell for on eBay in 4 years, lol



...but you wouldn't play a game at 1080p with this thing, so you need a $1,000 USD monitor (or three 1080p displays) to make it worthwhile. Otherwise it's wasted money, because there are a lot of other video cards that can play games great on 1080p displays.

So all in all, the video card might cost $1,000 USD, but imagine what the platform it's going to be put into will look like, even more so with two of them... then imagine the total cost.


----------



## brandonwh64 (Feb 21, 2013)

The highlight of this card: a 21% increase over the 7970, at a cost of $1K.


----------



## Fourstaff (Feb 21, 2013)

hardcore_gamer said:


> Gone are the days when the new flagship single GPU cards were twice as fast as the previous generation while costing the same.



Last gen fastest: GTX580
This gen fastest: Titan

According to Wiz's charts, GTX580 is about 54% of a Titan in the most relevant resolution (2560x1600), so it is almost "twice as fast"

As for the cost, well, Nvidia has some explaining to do
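That "almost twice as fast" arithmetic is easy to check (54% here is just the share read off the relative-performance chart):

```python
# If the GTX 580 delivers 54% of a Titan's performance at 2560x1600,
# the Titan's speedup over the GTX 580 is the reciprocal of that share.
gtx580_share = 0.54
speedup = 1 / gtx580_share
print(f"Titan is {speedup:.2f}x a GTX 580")  # ~1.85x, i.e. almost "twice as fast"
```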



hardcore_gamer said:


> GPU manufacturers are greedy these days. No wonder PCs are dying.



My pc is still healthy, I have not caught this pc killing disease yet. 

If you are referring to the idea that PC gaming is dying, I think you are horribly mistaken; PC gaming is at least as healthy as ever, if not healthier, as last-gen consoles slowly wither.



brandonwh64 said:


> The highlight of this card: a 21% increase over the 7970, at a cost of $1K.
> 
> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_Titan/images/perfrel.gif








or 30% when you are looking at the relevant resolution. However, this product is truly peerless and as such a completely outrageous price can be attached to it. Why would humans spend so much money buying blocks of shiny carbon?


----------



## Jack1n (Feb 21, 2013)

I would only consider this card with a boundless budget.


----------



## Random Murderer (Feb 21, 2013)

Impressive card, but not worth the price premium. For the same amount you could get a GTX 690, which, judging by this review, performs better at the cost of higher power consumption.

As always, great review W1zz!


----------



## fwix (Feb 21, 2013)

Euuuuuh, the price, to be honest, is so stupid -_-'. +24% over the HD 7970 GHz for +$600? I can buy an HD 7970 GHz for 340 euro in Europe, so why is this supposed to be 1000 euro for only 24% more? Wth is going on? Are NVIDIA fanboys rich to the point of buying anything at any price?
Or perhaps it's out only so they can say "we have the biggest flagship", 7 billion transistors.....
I love NVIDIA, but this... sorry, it just makes no sense for the normal buyer to continue buying their products.


----------



## brandonwh64 (Feb 21, 2013)

Fourstaff said:


> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_Titan/images/perfrel_2560.gif
> or 30% when you are looking at the relevant resolution. However, this product is truly peerless and as such a completely outrageous price can be attached to it. Why would humans spend so much money buying blocks of shiny carbon?



But most of us here do not run that res yet, Four. If you do, then great, buy a Titan, but most here, I would venture, are still at 1080 or 1200.


----------



## Kovoet (Feb 21, 2013)

Max I would spend now would be 500 quid.


----------



## jihadjoe (Feb 21, 2013)

Holy crap it uses LESS power than a 7970!
Did not expect that.


----------



## Fourstaff (Feb 21, 2013)

brandonwh64 said:


> But most of us here do not run that res yet, Four. If you do, then great, buy a Titan, but most here, I would venture, are still at 1080 or 1200.



Remind me, what is the point of getting a Titan for a 1200 screen?


----------



## Shihab (Feb 21, 2013)

Compute benches?


----------



## Mathragh (Feb 21, 2013)

Thanks for this great review! I like the direction your writing style is going.


----------



## brandonwh64 (Feb 21, 2013)

Fourstaff said:


> Remind me, what is the point of getting a Titan for a 1200 screen?



Never said there was one. I just mentioned the performance increase over the 7970; if you game at 1080/1200, then stick to the 7970/680.


----------



## tastegw (Feb 21, 2013)

I'm most impressed by how good the SLI/CrossFire scaling was for both AMD and NVIDIA in 3DMark 11; wish all the games were like that, and not just a bench.

The 680 saw slightly more than double the FPS with two cards vs. one.


----------



## chinmi (Feb 21, 2013)

So the price is more or less 2x that of the 680, but the performance is only 1.3x?
Wow... what a total rip-off move, NVIDIA... lol...


----------



## TheMailMan78 (Feb 21, 2013)

1000 bones to get 100 FPS in console ports............looking at this card makes me want to be involved in a Harlem Shuffle.


----------



## Lionheart (Feb 21, 2013)

Really nice card, Nvidia, but $1000 USD? No thanks, I will stick with my $299 AUD HD 7950.


----------



## Fluffmeister (Feb 21, 2013)

jihadjoe said:


> Holy crap it uses LESS power than a 7970!
> Did not expect that.



Yeah, pretty surprising; for such a large, complex chip with that level of performance, I thought it would peak much higher.


----------



## Frick (Feb 21, 2013)

Like the power consumption very VERY much, and the card is impressive.


----------



## bogami (Feb 21, 2013)

Yeah here's Titan - a lot of frustration especially on looks as much as it is put on the appearance is taken in functionality. at this price I would expect moudeng options and choice of thermal results, but not card should look sexy. I still do not take into account that those who will add Liquide block provided with a slot in SLI and would want at least another layout slot they have acquired to use for upgrading (PCE-E SSD opt., Audio .....) must physically remove output instead that this could be resolved or which consist of smaller output than it did AMD. I think that people are not so stupid that I'd be able to connect multi monitors, it's not as expensive as you can afford AMD. Cards Why do manufacturers do not realize this is very sad! (3 slot card, 4slot card, card 5slot giwe me nightmares .... Sat stupid). If you raise the price so drastically since then to get some of it, not deaf ears designers. Preformens Titan GPU is of course a disappointment, again locking pounega potential of the processor as it was in the GTX 480, all complite neglect adaptation drivers for SLI option because it often turns out that one card works better than 2 or 3. It is clear to see that this product last year for which we have long expected a Invidia has worked rather to made profit with (K104). Results of the first tests are far below expectations, according to FPS in games tested here.because according to the tests that are shown here, we can ask if they have been all the necessary software loaded because often works SLI worse than single card.


----------



## TheMailMan78 (Feb 21, 2013)

bogami said:


> Yeah here's Titan - a lot of frustration especially on looks as much as it is put on the appearance is taken in functionality. at this price I would expect moudeng options and choice of thermal results, but not card should look sexy. I still do not take into account that those who will add Liquide block provided with a slot in SLI and would want at least another layout slot they have acquired to use for upgrading (PCE-E SSD opt., Audio .....) must physically remove output instead that this could be resolved or which consist of smaller output than it did AMD. I think that people are not so stupid that I'd be able to connect multi monitors, it's not as expensive as you can afford AMD. Cards Why do manufacturers do not realize this is very sad! (3 slot card, 4slot card, card 5slot giwe me nightmares .... Sat stupid). If you raise the price so drastically since then to get some of it, not deaf ears designers. Preformens Titan GPU is of course a disappointment, again locking pounega potential of the processor as it was in the GTX 480, all complite neglect adaptation drivers for SLI option because it often turns out that one card works better than 2 or 3. It is clear to see that this product last year for which we have long expected a Invidia has worked rather to made profit with (K104). Results of the first tests are far below expectations, according to FPS in games tested here.because according to the tests that are shown here, we can ask if they have been all the necessary software loaded because often works SLI worse than single card.



I don't mean to be rude but I have no F#$King clue what you just said.


----------



## SIGSEGV (Feb 21, 2013)

Thanks, nice review.

$1000 for around 20-30% more is beyond a joke.


----------



## nickbaldwin86 (Feb 21, 2013)

Not impressed... buy a 690 and have a better card for the same money.


----------



## TheMailMan78 (Feb 21, 2013)

nickbaldwin86 said:


> Not impressed... buy a 690 and have a better card for the same money.



This isn't about saving money or budget. This is about having the best card no matter the price. For me it's crazy; for Bill Gates it's already overnighted to his house.


----------



## nickbaldwin86 (Feb 21, 2013)

To each their own, TheMailMan... the Titan is a way for NV to fill their bank account.

Personally I just wouldn't consider buying a Titan, but I have been thinking about getting a 690, and now I want one even more. Maybe in a few weeks the price will drop?


----------



## nleksan (Feb 21, 2013)

Impressive engineering, very innovative, and awesome card...

But the $2k I had planned for a pair of these in SLI will be going towards my 3-Way GTX670FTW SLI and more cooling and storage...  The performance increase of these is simply not worth the cost of admission...


----------



## Animalpak (Feb 21, 2013)

Excellent review!


In terms of the pure performance battle, I notice that the GTX 590 defended itself with fists and kicks and is glued to the GTX Titan's back. What a wonderful die-hard graphics card!!

This shows that the GTX 590 is a successful card; I think I will start looking for a second-hand GTX 590!!


----------



## Casecutter (Feb 21, 2013)

Thoughts: a nice halo product, but perhaps not as impressive as the GTX 690, given the price.

Power consumption is amazing! If there was a problem in the first GK100, it wasn't power, or they found it and fixed it twice over.

New drivers aren't going to make any real improvement, but for the extra money you hand them, I hope Nvidia will have a team that's only supporting Titan and the GTX 690 for the next 4 years.

Forget about any substantial retail price reductions (like $800) until it's irrelevant, say mid-2014.

Lastly, consider that Nvidia has a fairly extensive number of GK110 dies given Tesla production. Is this the K20X, or some form of gelding from that 14-SMX bin, and might that be the reason Nvidia doesn't have more room on price? Is yield for the full-fledged Tesla that good, or was cutting off any more SMX modules (like the K20's 13 SMX, or down to 12) exceptionally detrimental to performance? Or might there be a further reduced-spec chip in the wings for $700? I ask: would folks pay for something in the middle and fork out, say, $700? Something tells me this is the best Nvidia could manage while holding decent perf/price (and face), and all they could release as a gaming GPU. They probably have a fair amount of GK110s that make no sense to build into anything, so Titan pricing is intended to subsidize those duds.


----------



## Rahmat Sofyan (Feb 21, 2013)

So far the 7970 GE is still the best bet with the Never Settle bundle, and if you buy two, amazing, isn't it?

For me this is more important.

Just buy one 7970 GE and get free games (or a GTX 680), and with the rest of the money I'd buy a 2560x1600 LCD.


----------



## LDNL (Feb 21, 2013)

Priced artificially away from common sense and out of most of our reach.


----------



## Animalpak (Feb 21, 2013)

Yes, everything is amazing and top-notch about this card, but the price... yes, you know the answer.


----------



## Max Mojo (Feb 21, 2013)

Awesome, comprehensive analysis. Yes, like I've posted already, the missing backplate is a bad habit, but only marginally. It would be nice to add the 3DMark 11 scores in performance mode.
Funny, this tautology of 'if it's too expensive, it's not for you'. It should be possible to discuss the pricing and the pros and cons without getting educational suggestions.
In my personal view it's a mixed bag. I was already on preorder for the EVGA Titan but could stop myself from hitting the preorder button.
It might be interesting to imagine I had that card already built into my rig.
Would I be happy?
One of my 680s scores about P11450 in 3DMark 11, the Titan about P12150 in the Guru3D review, my SLI P18000+. As for power draw and loudness, that is not important to me. If it were on eye level with 680 SLI, the decision would be much easier.
The design is a piece of art. Unfortunately it is only viewable from the bottom up; in my case only the naked back is in sight. OK, a backplate is available later from EVGA. It should be constructed upside-down, so one sees all the beauty.
So until now there is not enough power in that horse for my taste.
What I have been waiting for since a year ago is the GTX 780, GK110 fully unleashed.
So let's see the overclocking potential. As the card is a paper launch, only listed for preorder, and probably sold out before it's on the shelves, this card has a touch of illusiveness. Let's see...

Holy moly: Titan is listed
http://geizhals.at/eu/?fs=gtx+titan+&x=0&y=0&in=


Would like to see the numbers with a 3930K CPU.

Already out of stock. Just the notify button at EVGA:

http://eu.evga.com/products/moreInfo.asp?pn=06G-P4-2790-KR&family=GeForce TITAN Series Family&uc=EUR

EVGA
Product Warranty
This product comes with a 3 year limited warranty.

The warranty is surely extendable for extra $s, but a 3-year standard warranty from EVGA is, relative to the pricing, a bit Scrooge-like. EVGA had a 10-year standard warranty until early 2012, which was fantastic.


----------



## Feänor (Feb 21, 2013)

Another professional yet easy-to-understand review, W1zzard. When it comes to GPUs, you ARE the man!

Now tell me how that GK110 core scores when folding. Eager to see that DP power put to PPD production!


----------



## HammerON (Feb 21, 2013)

I have to say that it is a pretty amazing GPU. Would be curious to see how it crunches/folds...
Thanks for the review W1zzard


----------



## Xzibit (Feb 21, 2013)

Compute performance is a mixed bag across the reviews so far. Depending on what you're running, it performs like it should (a K20X); in other cases it's a WTH.

Open CL - Computer Base

SGEMM & DGEMM / Synthetic + In-Game - Anandtech


----------



## Shihab (Feb 21, 2013)

Animalpak said:


> What a wonderful die hard graphics card !!



So wonderful, it entered the stage with a _bang_!




HammerON said:


> I have to say that it is a pretty amazing GPU. Would be curious to see how it crunches/folds...



^This.
It's based on a compute chip after all, eh?
I wonder if it'll prove to be a _cheaper_ replacement for CUDA accelerated applications' users.


----------



## d1nky (Feb 21, 2013)

VulkanBros said:


> It can only be because NVIDIA wants the performance crown....
> I don't get it - who the h... will pay for this........
> 
> Like having a toothbrush - in pure gold



or spend that money on hookers lol


----------



## d1nky (Feb 21, 2013)

Rahmat Sofyan said:


> Just buy one 7970 GE and get free games (or a GTX 680), and with the rest of the money I'd buy a 2560x1600 LCD.



Or be clever and apply for both Never Settle bundles and get five games??? Like me, lol.


----------



## librin.so.1 (Feb 21, 2013)

>Looks at the performance graphs
>Notices this

and this

>Looks at 7970's FPS
>Looks at the resolutions

*WHAT THE BUTT!?*


----------



## BigMack70 (Feb 21, 2013)

^ Could just be a driver issue


----------



## librin.so.1 (Feb 21, 2013)

BigMack70 said:


> ^ Could just be a driver issue



Everything scales as expected for the 7970 everywhere else, framerates as expected. Then BAM! It runs much faster than expected on tri-monitor, on two of the games, i.e. it doesn't look like a driver issue.
Did W1zzard misplace some of the numbers or something?


----------



## qubit (Feb 21, 2013)

Lovely card, but I'm disappointed by the performance, since it gets beaten quite handily by the GTX 690. The TITAN name would have been justified if it had beaten it. This way, it's a little embarrassing.

It's obvious that the low clock speed really hamstrings it; otherwise it could have easily beaten the 690, or at least equalled it. Shame about that.


----------



## Animalpak (Feb 21, 2013)

Grumpy cat have something to say about TITAN.


----------



## jihadjoe (Feb 21, 2013)

The way I see it, Titan is a bargain if you want the compute. 1/3 FP64 for less than 1/4 the price of a Tesla is money well spent.

For everyone else it's just epeen.
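That value claim can be put into rough numbers. The prices below are assumed launch street prices for illustration only, and the FP64 throughput figures are approximate (both chips run FP64 at 1/3 of their single-precision rate):

```python
# Rough cost-per-FP64-throughput comparison; prices and TFLOPS are
# illustrative assumptions, not official figures.
cards = {
    "GTX Titan":  {"price_usd": 999,  "fp64_tflops": 1.3},
    "Tesla K20X": {"price_usd": 4200, "fp64_tflops": 1.3},
}

for name, c in cards.items():
    usd_per_gflop = c["price_usd"] / (c["fp64_tflops"] * 1000)
    print(f"{name}: ${usd_per_gflop:.2f} per DP GFLOP")

ratio = cards["GTX Titan"]["price_usd"] / cards["Tesla K20X"]["price_usd"]
print(f"Titan costs {ratio:.0%} of a K20X")  # under 1/4 the price
```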


----------



## MxPhenom 216 (Feb 21, 2013)

Performs as expected: 25-30% faster than the 680/7970, and about 10-15% slower than the 690 (I never expected it to beat the 690). Power consumption lower than the 7970, which is pretty sweet for being that much faster than it. The card would be perfect if it were priced at $699, as I would get one, but I guess the magnesium alloy and vapor chamber cooler assembly comes with a price.


----------



## Bruno747 (Feb 21, 2013)

MxPhenom 216 said:


> Performs as expected: 25-30% faster than the 680/7970, and about 10-15% slower than the 690 (I never expected it to beat the 690). Power consumption lower than the 7970, which is pretty sweet for being that much faster than it. The card would be perfect if it were priced at $699, as I would get one, but I guess the magnesium alloy and vapor chamber cooler assembly comes with a price.




I'll have to go find the quote, but it was said that the cooler on Titan is not the alloy. Nvidia said it was too expensive, so the Titan's is made of aluminum.

Cheaper cooler, cheaper card to make, cheaper cheaper cheaper, and not only are they gouging people on the price, they can't even include a backplate with it. :shadedshu

Get a sweet-looking card, and in most towers with a window on the case you get to see... an ugly, fugly PCB. WOOO!

Anyone know anything about when AMD will be looking into a 7970 refresh, or next gen? I am washing my hands of Nvidia after the Titan release, so AMD is the only other place to go.


----------



## MxPhenom 216 (Feb 21, 2013)

I really don't understand people's deal with backplates. I don't find them special at all. As long as the PCB is black, I'm good to go.



> Visually, the GeForce GTX Titan resembles NVIDIA's GTX 690. It uses the same sexy unibody design with a magnesium alloy shell and Plexiglas window. Unlike the GTX 690, there is only a single window, and the fan has been moved further to the back, which makes sense as there is only one GPU to cool now. The length of the card is 27 cm, a bit shorter than the GTX 690.



And yes the shroud is magnesium alloy, the cooler heatsink itself is vapor chamber.

EDIT: okay apparently not.


----------



## W1zzard (Feb 21, 2013)

Bruno747 said:


> I'll have to go find the quote, but it was said that the cooler on Titan is not the alloy. Nvidia said it was too expensive, so the Titan's is made of aluminum.



from the nvidia reviewer's guide:


> This heat from the vapor chamber is then dissipated by a large, dual-slot aluminum heatsink.
> 
> GeForce GTX TITAN also has an aluminum baseplate, which provides additional cooling for the PCB and board components.
> 
> Given the high-end nature of this board, NVIDIA engineers decided to use an aluminum casing for TITAN’s cover. At the center of the cover is a clear polycarbonate window, allowing you to see the vapor chamber and dual-slot heatsink that’s used to cool TITAN’s GK110 GPU.



i think you might be right. the material feels slightly different and has a rougher surface texture. it still feels extremely high quality, not any better or worse than the gtx 690


----------



## Bruno747 (Feb 21, 2013)

MxPhenom 216 said:


> I really don't understand peoples deal with backplates. I don't find them special at all. As long as the PCB is black im good to go.



To me it's about like buying a new car. You buy the car and they make a mess of cash even after you talk them down a whole load on the price. Now imagine that same scenario, except the car company couldn't even be bothered to include nice glossy paint; they instead put on matte paint that scuffs the first time you breathe on it.

Or better yet, buying a truck: same theory, but instead they make you go and buy a bed liner (something that should be included at no charge) from another party.

The point is, given the ungodly amounts of cash they are making on each one of these, you would think they could throw a person a bone and spend the extra 50 cents it costs them to complete the package with a clean-looking card from every angle, rather than making the person who just tossed $1000+ down on a card spend another $20 at a third party to complete the finish.


----------



## Xzibit (Feb 21, 2013)

The cover is not the heatsink.

Maybe they should sell the cover separately; that should lower the card by $200.


----------



## MxPhenom 216 (Feb 21, 2013)

W1zzard said:


> from the nvidia reviewer's guide:
> 
> 
> i think you might be right. the material feels slightly different and has a rougher surface texture. it still feels extremely high quality, not any better or worse than the gtx 690



You might want to change that in your review, W1zzard. You said it has a magnesium alloy shroud/casing.


----------



## yogurt_21 (Feb 21, 2013)

qubit said:


> Lovely card, but I'm disappointed by the performance, since it gets beaten quite handily by the GTX 690. The TITAN name would have been justified if it had beaten it. This way, it's a little embarrassing.
> 
> It's obvious that the low clock speed really hamstrings it; otherwise it could have easily beaten the 690, or at least equalled it. Shame about that.



The Titans got beaten and replaced by the Greek gods, the Titanic sank, etc.

The name fits. Personally, though, I'd be more interested in what cut-down variants will do against the 7970 GHz/680/7970/670/7950/660 Ti etc.; the $1000 price point doesn't represent most of the market.
Hopefully it's a GPU-compute monster and will fulfill the role there. Dropping $4k on a GPU-compute solution is nothing; dropping $1k on a single component in a PC build? That's not nothing.


----------



## Delta6326 (Feb 21, 2013)

Amazing review, W1zz!! Looks like I was right about it being 10-20% behind the 690. I do think this should have been around $800.


----------



## Yellow&Nerdy? (Feb 21, 2013)

Amazing performance... but the price is too high. I mean, it is clearly slower than the GTX 690 but priced the same. Even though it does use less power and overclocks better, it's not worth it IMO. If I sold my kidney and had a grand to burn on a graphics card, I'd probably grab a GTX 690. It should have been priced somewhere between 700 and 850 bucks.


----------



## LAN_deRf_HA (Feb 21, 2013)

I'm confused, why does setting a higher temp target increase fan noise? Shouldn't it do the opposite? Why does running the fan faster make it run hotter?


----------



## phanbuey (Feb 21, 2013)

the 690 looks like a better buy...

Hopefully they come out with like, a GeForce 'Demigod' or something that is the *70 version.


----------



## TheMailMan78 (Feb 21, 2013)

W1zzard said:


> from the nvidia reviewer's guide:
> 
> 
> i think you might be right. the material feels slightly different and has a rougher surface texture. it still feels extremely high quality, not any better or worse than the gtx 690



This is why I always come back to TPU. The reviewers have integrity and get the job done without crying about missing parts or someone pointing out a "mistake" in the review. They just fix the situation and keep rolling like a BOSS.


----------



## Rahmat Sofyan (Feb 21, 2013)

This is really what I miss from NVIDIA too, ever since my GTX 260. Why did NVIDIA not include a backplate?



> What I find really disappointing though is that NVIDIA did not install a backplate on the card. While handling the cards I always worried I might break something, a backplate would have alleviated those fears, and added to the design. For its price NVIDIA should have really included a backplate.


----------



## Aquinus (Feb 21, 2013)

LAN_deRf_HA said:


> I'm confused, why does setting a higher temp target increase fan noise? Shouldn't it do the opposite? Why does running the fan faster make it run hotter?



Because the rate it has to ramp up at is that much higher: it will have the fan running at 100% well before 100°C, but the fan speed will stay really low until it hits its target. With a lower target it has more time for the fan to ramp up before hitting 100%. That's a guess anyways. It's how I would do it. I bet you if the temperature is under 80°C with an 80°C target, it will be next to silent, and it's loud because it's getting to 80°C no problem at low fan speeds, because the thing is a beast.


----------



## W1zzard (Feb 21, 2013)

LAN_deRf_HA said:


> I'm confused, why does setting a higher temp target increase fan noise? Shouldn't it do the opposite? Why does running the fan faster make it run hotter?



very good question.

the fan speed always follows a certain curve that determines fan speed % for certain temperatures.

now when you tell boost that you are happy with 90°C, it will run higher clocks, higher voltage, for longer.

obviously this increases temperatures.

you are correct that fan speed increases now, due to the increased temperature, which provides more cooling potential.

but now boost boosts even higher, producing even higher temperatures.

this will go on until boost reaches the highest clock available, or temperature reaches the target and boost will not boost any higher.

does that make sense?
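That feedback loop can be sketched as a toy simulation. Every number below (fan curve, clock step, thermal response) is invented purely for illustration; this is not NVIDIA's actual algorithm, just the qualitative behavior described above:

```python
# Toy model of the GPU Boost 2.0 feedback loop described above.
# All numbers (fan curve, clock step, thermal response) are invented
# for illustration -- this is NOT NVIDIA's actual algorithm.

def fan_speed(temp_c):
    """Fixed fan curve: fan % depends only on the current temperature."""
    return min(100.0, max(20.0, temp_c - 10.0))

def simulate(temp_target, base_clock=837, max_clock=1200, step=13):
    clock, temp = base_clock, 30.0  # start at base clock, near-idle temp
    for _ in range(100):
        # Boost raises clocks while below the temp target, and backs
        # off once the target is exceeded.
        if temp < temp_target:
            clock = min(clock + step, max_clock)
        elif temp > temp_target:
            clock = max(clock - step, base_clock)
        heat = 30 + clock / 15                               # more clock -> more heat
        temp += 0.5 * (heat - temp) - 0.1 * fan_speed(temp)  # fan pulls temp back down
    return clock, round(temp), round(fan_speed(temp))

# A higher temp target lets boost sustain higher clocks; the higher
# equilibrium temperature then drags fan speed (and noise) up with it.
print(simulate(80))
print(simulate(90))
```

The second call settles at higher clocks, a higher temperature, and a louder fan even though the fan curve itself never changed, which matches the behavior described above.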


----------



## cadaveca (Feb 21, 2013)

So temp target has nothing to do with fan, and is more for clocks....?


----------



## buildzoid (Feb 21, 2013)

So with this, Nvidia messed up overclocking even more. I thought the whole boost thing was stupid on the 600 series, but Boost 2.0 is even more messed up. I like it when my card goes to 1160 MHz and stays there, because I know what performance I can expect (about 16%), whereas with Boost 2.0 it's maybe more, maybe less, depending on whether the temps are right and on the miserable 6% power target.
I hate dynamic clocking, as I've had cases where my CPU was running at 1.2 GHz in a Cinebench run because the damn thing decided that I didn't need more power. So now GPUs will do the same, but without letting you turn it off. I've also had my GPU clock to 500 MHz while playing Saints Row: The Third because YouTube was running in the background; slideshows are the best way to view mass destruction by satchel charge (managed to fix that).
I'm really interested in Titan SLI going against something like HD 7970 Toxic CF running on the 1.2 GHz BIOS, and then both setups at max OCs.


----------



## Protagonist (Feb 21, 2013)

They should have just released it as the GTX 680 in the first place. I expected Titan to match the GTX 690, but oh well; as I stated, it was always meant to be the GTX 680.

Not worth $1000; they should have just sold it at $500 as it was initially meant to be. I guess that's why there are little to no games with Nvidia's "the way it's meant to be played" any more, because Nvidia doesn't play like that any more.

I'll wait for the next round; this is nothing to waste money on. I'll be buying a PS4 at the end of the year to replace my aging PS3 that has served me well. We always get console ports, so a new console, e.g. a PS4, is worth much more to me than a GTX Titan.


----------



## Phusius (Feb 21, 2013)

Only 5 fps faster than a non-OC 7970 GHz Edition in Sleeping Dogs at 2560x1600... lulz. $1000.

Have fun, Nvidia fanboys. I'll enjoy my $200 7950 OC'd to 1200 core (sold the 3 free games it came with, which made it $200).


----------



## W1zzard (Feb 21, 2013)

cadaveca said:


> So temp target has nothing to do with fan, and is more for clocks....?



that's correct. temp target controls boost. fan control is completely decoupled from boost. the fan speed just changes to adjust to current gpu temp


----------



## Crap Daddy (Feb 21, 2013)

It seems the world is going insane. Google asks 1200 dollars for a Chromebook, Nvidia wants 1000 dollars for a video card and Sony presents the PS4 without actually showing anything.


----------



## LAN_deRf_HA (Feb 21, 2013)

Yeah, that makes sense: a way to control boost more directly than setting a fan profile. But it's surprising to me that Nvidia would allow that. They seem to be taking protecting Kepler cores pretty seriously, as evidenced by the voltage cap in recent drivers and the Greenlight program.


----------



## Rahmat Sofyan (Feb 21, 2013)

Crap Daddy said:


> It seems the world is going insane. Google asks 1200 dollars for a Chromebook, Nvidia wants 1000 dollars for a video card and Sony presents the PS4 without actually showing anything.



I'll post this on my FB ...


----------



## radrok (Feb 21, 2013)

I want to disagree with some of the people here. Sure, $1000 for a GPU is a lot (especially because it does not dethrone the GTX 690), and it could have gone for less.
Still, I don't think the GTX 690 is a better buy: you'd have to deal with SLI profiles, and a single card, even if it performs a bit less, is still better than an SLI/CFX solution.
Having toyed with a lot of multi-GPU setups (3+ cards), I can tell you that a SINGLE GPU powerhouse like this is worth gold to me.

I am actually glad I preordered Titan


----------



## HammerON (Feb 21, 2013)

If I didn't already have three (only use two for gaming) 7970's, I too would most likely be looking at buying this card as a single card solution for gaming at 2560x1600. It would be nice not to have to worry about SLI/Crossfire issues.


----------



## HumanSmoke (Feb 21, 2013)

brandonwh64 said:


> Highlights of this card are 21% increase from 7970 at the cost of 1K


Flawed math. Titan is a 26.6% increase over the 7970GE.
You're also probably missing the point. AMD at this point in time have obviously noted GK104's (and lower) lack of compute. Without Titan in the mix, what are the chances that AMD's Gaming Evolved program continues (or increases) code optimization for compute shader operations, with overkill looping between compute shader, pixel shader, and memory buffer? Titan probably ensures that in-game compute doesn't spiral too far away from non-GCN (GK104 and lower, VLIW4 and 5) architectures.
As for Titan's cost...if no one buys it then you win the argument, but I'm guessing that the pricing is to keep the card from selling out and then pressuring Nvidia to divert GPUs from more lucrative Tesla and Quadro lines. I'd say the company are walking a fine line between keeping the card relevant enough to maintain its inclusion in review suites, whilst not underselling the board causing a complete sell out/no stock situation and cannibalizing Tesla sales for people who want compute/FP64 but have little use for ECC (somewhat overrated in GDDR5 anyhow).


Fourstaff said:


> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_Titan/images/perfrel_2560.gif
> or 30% when you are looking at the relevant resolution. However, this product is truly peerless and as such a completely outrageous price can be attached to it. Why would humans spend so much money buying blocks of shiny carbon?


If you're using the 7970GE as a baseline as Brandon was, the actual percentage increase from that card is 42.9% for that resolution.


----------



## Xzibit (Feb 21, 2013)

The reason why TITan cost $1k.

Nvidia building new HQ

Local news with a street level view

Give us your money!!!!


----------



## erocker (Feb 21, 2013)

HammerON said:


> If I didn't already have three (only use two for gaming) 7970's, I too would most likely be looking at buying this card as a single card solution for gaming at 2560x1600. It would be nice not to have to worry about SLI/Crossfire issues.



That's one too many! I need one!


----------



## xXxBREAKERxXx (Feb 21, 2013)

1k € price and not even beating 680 SLI. Sad if you ask me.


----------



## HammerON (Feb 21, 2013)

erocker said:


> That's one too many! I need one!




I saw your WTB thread
You buy me a Titan and I will give you a 7970 for free. Deal???


----------



## Max Mojo (Feb 21, 2013)

Xzibit said:


> The reason why TITan cost $1k.
> 
> Nvidia building new HQ
> 
> Give us your money!!!!



Yes, read about this.  That's the way Nvidia is taking over US. Titan Computer is the mastermind and the GTX Titans are remote mini robots. Like in South Park. Episode Pocemon. So come on,  Nvidiologists, let's donate $$$ to our Master.   
Damned, passed San Jose last year, could have taken some pics.


----------



## johnspack (Feb 21, 2013)

Will they do 4 way sli?  No way I'm buying these unless I can run 4....


----------



## Fourstaff (Feb 21, 2013)

HumanSmoke said:


> If you're using the 7970GE as a baseline as Brandon was, the actual percentage increase from that card is 42.9% for that resolution.



Ah yes, it depends on how you look at it: it's a 42.9% increase from the 7970 GE, or the 7970 GE is 30% slower.
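Both figures describe the same gap; only the baseline changes. A quick sanity check (the 70:100 ratio below is simply chosen to match the review's relative-performance chart at 2560x1600):

```python
# Percentage differences depend on which card is the baseline.
# Assume the 7970 GHz Edition scores ~70% of Titan's performance
# at 2560x1600, per the review's relative-performance chart.
titan, ghz_edition = 100.0, 70.0

faster = (titan - ghz_edition) / ghz_edition * 100  # Titan, 7970 GE as baseline
slower = (titan - ghz_edition) / titan * 100        # 7970 GE, Titan as baseline

print(f"Titan is {faster:.1f}% faster")    # -> 42.9% faster
print(f"7970 GE is {slower:.1f}% slower")  # -> 30.0% slower
```

Same data, two correct-sounding numbers, which is why the thread's figures seemed to disagree.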


----------



## KainXS (Feb 21, 2013)

With water this card would probably clock to the sky, but at this level of performance,

1G = a walk away for most.

The power consumption is amazing though; it's kinda obvious this was the card meant to be the 680. Thx for that, Nvidia (greed).


----------



## Mindweaver (Feb 21, 2013)

Titan is up at Newegg.. with a Release Date of 02/28/2013, but they are taking pre-orders.

[yt]dnCd-0qJts8[/yt]


----------



## Max Mojo (Feb 21, 2013)

Most convincing argument: 

3dMark11:

GTX Titan:         P12155     

versus     

GTX680SLI:       P14463

@stock clocks


http://www.guru3d.com/articles_pages/geforce_gtx_titan_review,21.html 



3dMark11:

GTX Titan:         P13248     

versus     

GTX680SLI:       P18040  - my modest overclock dated May 2012
@occed clocks


http://www.guru3d.com/articles_pages/geforce_gtx_titan_review,25.html


----------



## erixx (Feb 21, 2013)

I am not reading the whole thread but I did read the whole review, thank you W1zzard!

This card makes my mouth water... nVidia doesn't fail, and improves over time.


----------



## qubit (Feb 21, 2013)

I've just read the review in full and I can see that even when TITAN is clocked faster than the GTX 690, the GTX 690 still handily beats it, as you can see below. I was somewhat surprised and disappointed by that.

I guess the reason is that the dual GK104 effectively makes for a "bigger" GPU since it has an effective bus width of 512-bit and 64 ROPs when one considers the two GPUs as one chip.

Overall, I'd love to have one, but the sky-high price and the fact it doesn't beat the GTX 690 in straight-line performance even when overclocked are disappointments. I reckon the extra RAM could perhaps allow it to beat the 690 in multi-monitor scenarios, where the 690's memory would max out and its performance would drop significantly.


----------



## twicksisted (Feb 21, 2013)

TheMailMan78 said:


> 1000 bones to get 100 FPS in console ports............looking at this card makes me want to be involved in a Harlem Shuffle.



Or a cleveland steamer.


----------



## cadaveca (Feb 21, 2013)

uh...wow...


----------



## Hayder_Master (Feb 21, 2013)

it's worth at max $700.


----------



## Max Mojo (Feb 21, 2013)

Did Vince Lucido have any choice to state something different? Isn't he working for Nvidia?


----------



## cadaveca (Feb 21, 2013)

Max Mojo said:


> Did Vince Lucido had any choice to state something different? Isn't he working for  Nvidia?



EVGA, I think.

Heck, he could work just for himself, for all I know.


The whole "Titan = GTX680 @ 1800 MHz" is what gets me. I don't really care about the rest.


----------



## TheMailMan78 (Feb 21, 2013)

cadaveca said:


> uh...wow...



I saw "Kingpin" and was expecting a new Kingpin game. (Leaves disappointed)


----------



## Max Mojo (Feb 21, 2013)

cadaveca said:


> EVGA, I think.
> 
> Heck, he could work just for himself, for all I know.
> 
> ...




Now I remember: I phoned EVGA a year ago, and they said something like he's their home clocker.

To be precise, it was just: Titan < 680 SLI @ 1260 MHz.


______________________________________________________________________________________

If not posted already by someone else, this sums it up:

http://www.tomshardware.com/reviews/geforce-gtx-titan-performance-review,3442-13.html


"...by Chris Angelini

I gave you a handful of conclusions in Tuesday's story, and promised today’s data would back me up. Between then and now, I’ve run and re-run a bunch of data. How well do my first impressions carry through? Here’s what I said:

1) Pay the same $1,000 for a GeForce GTX 690 if you only want one dual-slot card and your case accommodates the long board. It remains the fastest graphics solution we’ve ever tested, so there's no real reason not to favor it over Titan.

We can stand by this one. Although it’s technically true that the GeForce GTX 690’s 2 GB per GPU potentially limits performance at high detail settings and resolutions, our tests at 5760x1200 didn’t turn up any troublesome numbers. Far Cry 3 was the one title that felt choppy—and that was the case even as far down as 2560x1600. I don’t think on-board memory is the issue.

2) The Titan isn’t worth $600 more than a Radeon HD 7970 GHz Edition. Two of AMD’s cards are going to be faster and cost less. Of course, they’re also distractingly loud when you apply a demanding load. Make sure you have room for two dual-slot cards with one vacant space between them. Typically, I frown on such inelegance, but more speed for $200 less could be worth the trade-off in a roomy case.

This proved to be a little controversial. If you judge solely on performance per dollar, two Radeon HD 7970 GHz Edition boards absolutely cost less and go faster than a GeForce GTX Titan. But there’s the case against poor acoustics. There’s also a discussion to be had about micro-stuttering. Our single-GPU frame latency numbers show that AMD has already made inroads into minimizing frame latency in some games, and that other titles remain problematic. But we can’t compare multi-GPU configs using the same tools. Fortunately, we have something coming soon that’ll address micro-stuttering more definitively. In the meantime, those Radeon cards are compelling, so long as you’re able to cope with their noise.

Given a number of driver updates, one 7970 GHz Edition is quicker than GeForce GTX 680. As long as Nvidia sells the 680 for more than AMD’s flagship, the Tahiti-based boards are going to continue turning heads.

3) Buy a GeForce GTX Titan when you want the fastest gaming experience possible from a mini-ITX machine like Falcon Northwest’s Tiki or iBuyPower’s Revolt. A 690 isn’t practical due to its length, power requirements, and axial-flow fan.

This is unequivocal. There’s no way to get anything faster than a GeForce GTX Titan into a Tiki, Revolt, Bolt, and so on. Why would you spend $1,000 on a card that tends to be slower than the GeForce GTX 690? This is why.

4) Buy a GeForce GTX Titan if you have a trio of 1920x1080/2560x1440/2560x1600 screens and fully intend to use two or three cards in SLI. In the most demanding titles, two GK110s scale much more linearly than four GK104s (dual GeForce GTX 690s). Three Titan cards are just Ludicrous Gibs!

Gaming at 5760x1200 is sort of the jumping-off point where one GeForce GTX 690 starts to look questionable. Of course, then you’re talking about $2,000 worth of graphics hardware to either go four-way GK104s or two-way GK110s. To me, the choice is easy: a pair of GeForce GTX Titans is more elegant, lower-power, and set up to accept a third card down the road, should you hit the lottery.

With all of that said, the benchmarks also reveal that OpenCL support isn’t fully-baked yet in the GeForce GTX Titan driver. A number of issues in synthetics and real-world apps make it clear that bugs still need to be stomped out, and that’s never a pleasant revelation about such a pricey piece of kit.

At least we know that GK110 does have the compute chops GK104 lacks. Developers who would have loved a Tesla K20X but couldn’t afford its almost-$8,000 price tag may consider $1,000 for a Titan true value. If, on the other hand, you’re a bitcoin miner—well, AMD’s GCN architecture still has the lock on hashing performance.

At the end of the day, we maintain that a $1,000, GeForce GTX Titan is for two very specific gamers, both of which we explicitly called out on Tuesday. Everyone else will still consider this a very fast, very well-built piece of hardware. However, it runs into serious competition in Nvidia’s stack, and from rapidly-improving Tahiti-based cards that got beaten up on pricing early on in their life."


----------



## the54thvoid (Feb 21, 2013)

Umm...

Erm...

Can we all drop the pretense about value, and all the arguments about how if people want it they'll buy it, and if they can afford it they will buy it. I have a PC saving fund currently at £4000. Obviously my PC fund went OTT, so I'll put some toward my wedding....

I can afford 2, easily.  Under water even.

But can I just get this out the way?

Far Cry 3 @ 2560x1600 = 32fps.

There are a couple of other shmoozies out there (Metro 2033 - I know it's badly coded). But honestly? Also, the Hexus review has Dirt Showdown going to the 7970 GHz when all FX are used at maximum (Nvidia grumbled it's poorly optimised and pro-AMD).






And again for Sleeping Dogs.






If I spend >£800 on a single card, I'd expect to get good frame rates across the board and *never* lose to a card costing £500 less - no matter how it was coded.  You can't even buy a card for £500 less than a non 6GB 7970.

W1zzard argues crossfire fails too much - true.  But Titan can't overcome a game that happens to be coded to favour the AMD card, even though Titan has mammoth grunt behind it.

Dirt Showdown and Sleeping Dogs should be taken as a warning here.  No matter how good people think Titan may be it's got problems.  If AMD go down the path of  helping develop more games (as it is now actively doing) that £800+ monster is going to look pretty f*cking stupid.

Why do I sound pissed? *Because I really wanted one*, but there's no chance I'm buying a card that _half_ my PC's GPU grunt can beat if AMD help code the game.

I am not an AMD fanboy. My last post today was talking about selling my 7970s for a Titan under an EK water block (http://www.techpowerup.com/forums/showpost.php?p=2850861&postcount=4). But seeing some of the fps results - I'd be a fool. Truly a fool. If AMD do as they say they're doing - focusing on drivers and development - I'll miss Titan out.

And of course, Nvidia can develop drivers too, but that's a cat and mouse argument. Maybe if JSH mails me a card I'll sing its praises, but damn... so disappointed.


----------



## LAN_deRf_HA (Feb 21, 2013)

qubit said:


> I've just read the review in full and I can see that even when TITAN is clocked to even faster than the GTX 690, the GTX 690 still handily beats it, as you can see below. I was somewhat surprised and disappointed by that.



For comparison, I got my two 3 GB 660 Tis for $500 total. Their overclocked performance easily matches and exceeds a 690. There are just way better ways to get next year's single-card performance than buying a Titan.


----------



## ThunderStorm (Feb 21, 2013)

And people scream at the $1600 ARES II...
Titan is worth $700 at max. And how can you call Titan a flop?? Superb and unrivaled performance, great power consumption compared to the HD 7970, name, crown.

Titan is a great card imho.


----------



## HumanSmoke (Feb 21, 2013)

KainXS said:


> with water this card would probably clock to the sky but with this level of performance



How about 1685 MHz on LN2 ?


----------



## Xzibit (Feb 21, 2013)

TheMailMan78 said:


> I saw "Kingpin" and was expecting a new Kingpin game. (Leaves disappointed)



I was laughing at the end..

An Nvidia employee wishing he could have one. You'd think that would set off an alarm in the marketing department, but I'm sure the accounting department said no, it has to be $1K to meet margins.

Not to mention where he said, "We know you guys like to test the performance of the GPU in a totally different way,"

then refers to an air OC. WTF.


----------



## Rowsol (Feb 22, 2013)

This is the biggest rip off ever.


----------



## EarthDog (Feb 22, 2013)

cadaveca said:


> EVGA, I think.
> 
> Heck, he could work just for himself, for all I know.
> 
> ...


You are right on both accounts: EVGA, and for himself (Kingpin Cooling, for sub-zero pots and such).


----------



## Bjorn_Of_Iceland (Feb 22, 2013)

Impressive card. Unimpressive price.


----------



## Delta6326 (Feb 22, 2013)

the54thvoid said:


> Umm...
> 
> Erm...
> 
> ...



Don't worry, I think with more mature drivers they will fix those few games that have horrible results. And I hope that you buy 2 and water them, so I can look at your pics in the WC thread.


----------



## Fourstaff (Feb 22, 2013)

Delta6326 said:


> Don't worry, I think with more mature drivers they will fix those few games that have horrible results. And I hope that you buy 2 and water them, so I can look at your pics in the WC thread.



They had some time to fix the results for the GTX680, but they still haven't. Why?


----------



## Crowned Clown (Feb 22, 2013)

Disappointed... （￣へ￣） It would be worth it at a price range of $550-$600, though.


----------



## qubit (Feb 22, 2013)

Fourstaff said:


> They had some time to fix the results for the GTX680, but they still haven't. Why?



Cost-benefit, perhaps?

The company will be most driven to fix driver bugs and glitches in games that are more likely to sell their cards, which are inevitably blockbusters such as Call of Duty and Far Cry 3, or the latest free-to-play games, which actually dwarf those blockbusters in revenue to the tune of 14 billion dollars.


----------



## HumanSmoke (Feb 22, 2013)

Fourstaff said:


> They had some time to fix the results for the GTX680, but they still haven't. Why?


Do tell.
As far as I'm aware, the GTX 680's shortcomings are more architectural: late-pipeline/post-process compute functions such as DoF, motion blur, global illumination, ambient occlusion, and hardened shadows.

No amount of driver revision is going to make the GK104 competitive in DiRT Showdown and Sleeping Dogs, precisely because the coding is compute heavy and tailored for GCN.


----------



## PatoRodrigues (Feb 22, 2013)

I'm getting sick already... "over-thinking" about CFX, SLI and Titan. The price is mind-boggling. I would really love the computational power of this card, but the price just isn't right and is far from my reach.

I'll stick to OC'ed GTX 670s... but man, nVidia knows how to produce premium cards.
Looking forward to seeing lottery winners from TPU making rigs with this card.

Time for AMD to step up and compete with GK110.


----------



## ThunderStorm (Feb 22, 2013)

PatoRodrigues said:


> I'm getting sick already... "over thinking" about CFX, SLI and the Titan. The price is mind-boggling. I would really love the computational power on this card but the price just isn't right and far from my reach.
> 
> I'll stick to OC'ed GTX670's... but man, nVidia knows how to produce premium cards.
> Looking forward to see lottery winners from TPU making rigs with this card.
> ...



Haha, exactly what I'm thinking. Not likely to happen, but it would be nice to see another HD 4890 again.


----------



## btarunr (Feb 22, 2013)

ThunderStorm said:


> Haha, Exactly what I'm thinking. Not likely to happen but it's nice to see another HD 4890 again.



They already did a "HD 4890", it's called HD 7970 GHz Edition.


----------



## v12dock (Feb 22, 2013)

Thank you AMD for not wasting your time with 'Titan'


----------



## librin.so.1 (Feb 22, 2013)

v12dock said:


> Thank you AMD for not wasting your time with 'Titan'



Is that sarcasm?


----------



## v12dock (Feb 22, 2013)

Vinska said:


> Is that sarcasm?



75% more cores than the 680 for a 23% performance increase. The drivers should be optimized for the Kepler architecture already.
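As a back-of-the-envelope check of that scaling complaint (core counts and base clocks are from the public spec sheets; the 23% performance figure is the one quoted above, and real-world boost clocks and memory bandwidth would muddy this further):

```python
# Rough scaling math for "75% more cores, only 23% more performance".
# Titan also runs a much lower base clock than the GTX 680, which
# accounts for part of the apparent inefficiency.
cores_680, cores_titan = 1536, 2688   # CUDA cores (spec sheets)
clock_680, clock_titan = 1006, 837    # base clocks in MHz (spec sheets)
perf_ratio = 1.23                     # performance delta quoted above

core_ratio = cores_titan / cores_680                     # 1.75x shaders
throughput_ratio = core_ratio * clock_titan / clock_680  # ~1.46x shader-ops/s

print(f"{perf_ratio / core_ratio:.0%} scaling vs. raw core count")
print(f"{perf_ratio / throughput_ratio:.0%} scaling once base clocks are factored in")
```

So Titan delivers roughly 70% of the raw core-count scaling, but closer to 85% once its lower clocks are accounted for.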


----------



## Protagonist (Feb 22, 2013)

the54thvoid said:


> *If I spend >£800 on a single card, I'd expect to get good frame rates across the board and never lose to a card costing £500 less - no matter how it was coded.  You can't even buy a card for £500 less than a non 6GB 7970.*



I'd expect the same thing too, but oh well. I have been saying that game developers are the root of all bad performance with whichever card, and I'll still say game developers should stop being lazy and code games appropriately. If a console can play a game like Metal Gear Solid 4: Guns of the Patriots in all its glory @ 1080p, and the console has inferior hardware to most gaming PCs, while the gaming PC can't play a game like Borderlands 2 fluidly, then it's clear that the developers don't code for each GPU and ensure it runs the game very well.



PatoRodrigues said:


> Time for AMD to take a step up and compete with GK110.



They don't need to compete with Titan by producing another GPU. I'd say AMD is on the right path with the game developers; if they get more developers to optimize games to run very well on their GPUs, then that's the way to go. The hardware people have had since, e.g., 2010 is well more than enough to run any game out there fully maxed at whichever res, e.g. @ 2560x1600; the problem has always been the developers, not the hardware.

If developers can do so much with so little in a console and make a game glorious, they should be able to make the same game run very well even on a low-end PC GPU.

Not even Titan can overcome badly coded games; to this day, poorly coded games like Crysis and Metro still punish GPUs :shadedshu


----------



## Melvis (Feb 22, 2013)

Excellent card, excellent performance but it fails where it matters the most and that's price/performance.


----------



## Slizzo (Feb 22, 2013)

v12dock said:


> 75% more cores than the 680 for a 23% performance increase. The drivers should be optimized for the Kepler architecture already.



That's unfair; a better metric is performance per watt, which Titan kills on.

However, as the horse has been beaten; cost is still an overriding factor.


----------



## Protagonist (Feb 22, 2013)

Family Lives Without Money—By Choice—and Thrives
http://shine.yahoo.com/financially-...money--by-choice--and-thrives--190436599.html

Ah corporations like Nvidia, AMD & Intel should take NOTE


----------



## Phusius (Feb 22, 2013)

Protagonist said:


> Family Lives Without Money—By Choice—and Thrives
> http://shine.yahoo.com/financially-...money--by-choice--and-thrives--190436599.html
> 
> Ah corporations like Nvidia, AMD & Intel should take NOTE



Still living off the backs of others; I read the article... so you think we should live off the backs of others? Interesting way to progress society.


----------



## Protagonist (Feb 22, 2013)

Phusius said:


> Still living off the backs of others, I read the article... so you think we should live off the backs of others?  Interesting way to progress society.



No, I don't think it's right, unless someone agrees to help you. But isn't that what Nvidia is doing, living off the backs of others, by simply overpricing as they please? We all know that Titan was meant to be a $500 card in the first place, so the extra $500 that makes Titan $1000 is, to me, living off the backs of others.

I don't agree to help Nvidia with an extra $500; that's why a Titan is a no-go for me.

At the same time, the article talks about wasteful spending, and to me $1000 on Titan is wasteful spending on a $500 card that's overpriced.


----------



## HumanSmoke (Feb 22, 2013)

Protagonist said:


> No i don't think its right, *unless some one agrees to help you*. But isn't that what Nvidia is doing living of backs of others I don't agree to help Nvidia with an extra $500, that's why a titan is a no go for me...


Unfortunately, you seem to have torpedoed your own argument. There are people quite willing to buy the card, which is a tacit agreement between the buyer and the company. Nvidia holds no monopoly, and certainly isn't strong-arming people into buying the card. You draw the line at $500; there are a huge number of people who would balk at spending even half that amount on a card. To them (and some of them post here), buying any card over $250 is akin to burning money.



Protagonist said:


> At the same time the article talks about wasteful spending and to me $1000 on Titan is wasteful spending on a $500 card that's over priced.


And a third and fourth card of any description for SLI or CrossfireX falls into the same category. How much extra performance do you get for that third or fourth card for the expenditure ? For that matter, what about bespoke water cooling ? Using SSD's for storage or in RAID 0 ?

For *you*, any of these things would likely represent "wasteful spending", as probably would a luxury car, an expensive hobby, a Raymond Weil or Rolex watch, or any other number of supposedly unsound fiscal purchases...but what might be prohibitively expensive for you, might represent a drop in the bucket for others. I might spend five figures building an engine that gets 8 miles per gallon (if I'm lucky) and gets used in twelve second increments- but the enjoyment far outweighs the fiscal _irresponsibility_.

Pretty much any enthusiast tech purchase is a case of diminishing returns. But what's the alternative? Buy a bang-for-buck ultra safe OEM box knowing you're losing less on depreciation, and put the saved cash into T-bills ?  

Where's the fun in living to be 110 years old if you have to live off bran flakes every day to get there ?


----------



## Aquinus (Feb 22, 2013)

HumanSmoke said:


> Unfortunately, you seem to have torpedoed your own argument. There are people quite willing to buy the card -which is tacit agreement between the buyer and the company. Nvidia holds no monopoly, and certainly isn't strong arming people to buy the card. You draw the line at $500- there are a huge amount of people who would balk at spending even half that amount on a card. To them (and some of them post here), buying any card over $250 is akin to burning money.
> 
> 
> And a third and fourth card of any description for SLI or CrossfireX falls into the same category. How much extra performance do you get for that third or fourth card for the expenditure ? For that matter, what about bespoke water cooling ? Using SSD's for storage or in RAID 0 ?
> ...



The problem with your argument is that the only way I could see someone justifying getting a Titan is someone who is planning on running multi-monitor with games. If you're getting it for anything else it is a waste because a 680 or a 7970 will do just as well and for such an exorbitant amount of money, it's simply not worth 100% more price for 20% more power.


----------



## Protagonist (Feb 22, 2013)

Aquinus said:


> The problem with your argument is that the only way I could see someone justifying getting a Titan is someone who is planning on running multi-monitor with games. If you're getting it for anything else it is a waste because a 680 or a 7970 will do just as well and for such an exorbitant amount of money, it's simply not worth 100% more price for 20% more power.



^
This. Just to add: there are very many luxury things that people/I buy because they make perfect sense, but Titan is not one of those, considering its initial conception; it was initially conceived to be a $500 card, as far as I know. So I would love me a Titan, but it's a total rip-off. I already donated enough to Nvidia this gen by buying a GTX 670, which could easily have been a GTX 660 for around $250.


----------



## Aquinus (Feb 22, 2013)

I also might add that I don't buy luxury items that aren't going to benefit me. I wouldn't buy a Ford Excursion because I want a V12, but because I want room, and there are plenty of roomy vehicles like the Excursion with just as much power from a smaller engine that uses less fuel. Simple as that.

The Titan is a beast, but it's not worth $1k USD. I think that is way off the mark.

What's the point of having the fastest GPU if hardly anyone can or is willing to afford it? At least the 7970 is relatively affordable.


----------



## symmetrical (Feb 22, 2013)

If this was around $700, I would have sold my GTX 680 and bought this. But $1000, I'd rather just get the 690. Heck, at this point it looks like I'm just gonna have to go 680 SLI.


----------



## symmetrical (Feb 22, 2013)

Protagonist said:


> Family Lives Without Money—By Choice—and Thrives
> http://shine.yahoo.com/financially-...money--by-choice--and-thrives--190436599.html
> 
> Ah corporations like Nvidia, AMD & Intel should take NOTE



A little off topic.

But that guy worked low-paying jobs most of his life. Then he decides to go to college. He graduates first, and then decides to embark on a journey in which he will live without money with his wife and child, by taking resources from others who give them to him?

Story of his life.

Anyway, how 'bout that Titan 'eh?


----------



## Ikaruga (Feb 22, 2013)

Well done review, thank you very much for the great work. 

The good numbers didn't surprise me, though; I expected the card to perform better at 1440p and above. It was well known that previous Keplers were bandwidth-starved at high resolutions due to their 256-bit bus.


----------



## Aquinus (Feb 22, 2013)

...but for what it is, I think the numbers are rather lackluster. This is a beast of a GPU with a ton of compute hardware. I mean look at it. The number of shaders were increased by 75% over the 680 and you're lucky to find half of that in performance. It might be fast but it looks poorly optimized in comparison to the 680 and 7970 even if it is faster overall.


----------



## Ikaruga (Feb 22, 2013)

Aquinus said:


> ...but for what it is, I think the numbers are rather lackluster. This is a beast of a GPU with a ton of compute hardware. I mean look at it. The number of shaders were increased by 75% over the 680 and you're lucky to find half of that in performance. It might be fast but it looks poorly optimized in comparison to the 680 and 7970 even if it is faster overall.



I think the numbers probably will be much better 4-5 driver revisions later (as usual).


----------



## Aquinus (Feb 22, 2013)

Ikaruga said:


> I think the numbers probably will be much better 4-5 driver revisions later (as usual).



For nVidia's sake, I certainly hope so. This is Kepler though, it's not like they just released a new kind of video card. They have experience with the platform on the 600-series cards already. I'm skeptical that we will see vastly better numbers from driver updates.


----------



## Casecutter (Feb 22, 2013)

Aquinus said:


> For nVidia's sake, I certainly hope so. This is Kepler though, it's not like they just released a new kind of video card. They have experience with the platform on the 600-series cards already. I'm skeptical that we will see vastly better numbers from driver updates.


I tend to agree... I didn't hear of Nvidia having any real learning curve between Fermi and Kepler. Other than changing the CUDA cores to run at full clock speed (instead of half), I didn't hear that there's much difference in the architecture overall. It's not like AMD with GCN. Here's the thing with these: when you buy professional-grade Tesla/Quadro or FirePro graphics cards, a big part of their lofty admission price is a dedicated driver team making certain you get the utmost compatibility with professional computing software. With Titan or a GTX 690 you get nothing near that; you're just lumped in with the GTX 650 Ti crowd, and if they find 2%, that's what you'll get.


----------



## MxPhenom 216 (Feb 22, 2013)

I hope at some point nvidia realizes they are castrating themselves. No one will buy this card for gaming when the GTX 690 costs the same and performs quite a bit better, up to 15%.

Tip for Nvidia:

Drop the price of this bitch, and I for one will buy it. $699 please? I'm sure a lot of others would as well.


----------



## cadaveca (Feb 22, 2013)

MxPhenom 216 said:


> I hope at some point that nvidia realizes they are castrating themselves. No one will buy this card for gaming when the GTX690 costs the same and performance quite a bit better. up to 15%.
> 
> Tip for Nvidia:
> 
> Drop the price of this bitch, and I for one will buy it. $699 please? Im sure a lot of others will as well.



Well, I've got to say...

Every card at launch, for years and years...the original launch cards clock like mad. AMD, NVidia doesn't matter.


Then, prices drop...and cards stop clocking.


Both GTX680 and HD 7970 seem to follow this too, so, if you want Titan, and you want to OC it, you will buy it now, or in the future, you'll pay less, but you'll also get less. Part of what you pay for is warranty service, support, and other things, and the cost of providing such services, when a card is in limited numbers in the wild, is higher than when there are thousands upon thousands.

I hate this price, and I won't pay it myself, but I think NVidia is more than justified asking for it. It's not just performance out of the box you are buying.


----------



## Fourstaff (Feb 22, 2013)

Aquinus said:


> ...but for what it is, I think the numbers are rather lackluster. This is a beast of a GPU with a ton of compute hardware. I mean look at it. The number of shaders were increased by 75% over the 680 and you're lucky to find half of that in performance. It might be fast but it looks poorly optimized in comparison to the 680 and 7970 even if it is faster overall.



Or some games are CPU-bound with the Titan; worth checking. There are plenty of places for the Titan to go wrong. Who's willing to do some detective work?


----------



## qubit (Feb 22, 2013)

Wow, it didn't take long for pent-up excitement to turn into a good slagging off, lol.

Yeah, I wish it had beaten the 690, and that the price were more reasonable too. It's interesting to see that the performance sits right where the predictions placed it 6 to 9 months ago.

Still a fantastic card though. Just price it reasonably and it will be the one to have.


----------



## buildzoid (Feb 22, 2013)

I'm pretty sure the +20% performance over the 7970 GHz ed. is caused by that low clock. The 680 runs at 1008 MHz and the Titan at 836 MHz; that's 83% of the 680's clock, so the effect of the 2688 shaders is scaled down to 83%, meaning they perform like roughly 2230 shaders at 1008 MHz. The 680 was also bandwidth-starved, and this card is too: true shader performance went up by ~45% (over the 680) while bandwidth is up only 50%. Add the fact that these are the first drivers for this card, and you get why it's so slow.

However, no amount of driver optimization can make up for the low clock, so the most I can see this card pushing is ~45% more performance than an equally optimized 680, while still carrying that stupidly high price tag.
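The arithmetic in that argument can be sketched in a few lines (figures are the ones cited in the post; real-world game scaling is nowhere near this linear):

```python
# Back-of-envelope "effective shader" arithmetic from the post above.
# Clocks and shader counts are the thread's figures, not fresh measurements.
titan_shaders, titan_clock = 2688, 836      # MHz
gtx680_shaders, gtx680_clock = 1536, 1008   # MHz

clock_ratio = titan_clock / gtx680_clock                      # ~0.83
effective_shaders = titan_shaders * clock_ratio               # ~2229 "680-clock" shaders
theoretical_uplift = effective_shaders / gtx680_shaders - 1   # ~45%

print(f"{clock_ratio:.2f}, {effective_shaders:.0f}, {theoretical_uplift:.0%}")
```

That ~45% ceiling is exactly the number the post lands on for an equally optimized 680.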


----------



## HumanSmoke (Feb 22, 2013)

Aquinus said:


> The problem with your argument is that the only way I could see someone justifying getting a Titan is someone who is planning on running multi-monitor with games


You think that is why Nvidia have broken their own convention of castrating FP64 performance on GeForce cards by allowing full 1:3 rate double precision on a consumer GTX Titan ?

It would seem rather obvious (at least to me) that Nvidia is casting a wider net than just Surround gaming. GK110 was developed primarily for compute, yet aside from disabling MPI and ECC memory, the Titan retains not only the full spec of the $3200-4500 Tesla (including the 6GB memory component) but also allows the option of a 35% clock increase if the user's workload is FP32-based.

Hey, but what the fuck do I know? Maybe Nvidia just forgot to disable double precision, and selling a workstation card at consumer prices is just the beginning of the end.


Aquinus said:


> it's simply not worth 100% more price for 20% more power


Strange math. You work on a different numbering system where you live?
@ 2560 by W1ZZ's charts the Titan shows an increase of 31.57% over the 7970GE and 42.86% increase over the GTX 680
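For anyone checking those figures, they fall straight out of a relative-performance chart with Titan normalized to 100%. The 76/70 chart positions below are the values implied by the quoted percentages, not numbers re-read from the review:

```python
# How "31.57% / 42.86% faster" falls out of a relative-performance chart
# normalized to the Titan. Chart positions are implied, not re-measured.
titan = 100.0
hd7970_ge = 76.0    # implied position on the 2560x1600 chart
gtx_680 = 70.0      # implied position on the 2560x1600 chart

uplift_vs_7970ge = titan / hd7970_ge - 1   # ~31.6%
uplift_vs_680 = titan / gtx_680 - 1        # ~42.9%

print(f"{uplift_vs_7970ge:.2%}, {uplift_vs_680:.2%}")
```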


Aquinus said:


> It might be fast but it looks poorly optimized in comparison to the 680 and 7970 even if it is faster overall.


Depends how you look at it. 
1. GK110 wasn't developed as a gaming chip - the GK104 and Tahiti were.
2. The Titan uses equal power to the 7970GE yet offers 31.57% more gaming performance at 2560x1600.
The only real argument is price - which nobody is disputing, and which is largely irrelevant, since the pricing is (1) deliberately set high so Nvidia need not keep the consumer channel supplied with GPUs that would return better margins as Quadro and Tesla, and (2) set so as not to undermine the professional cards above it in the product stack.

What's the point of Nvidia pricing Titan at $499? It means Nvidia would then have to sell the GTX 680 for around $299-329, with the rest of the product stack realigned. The same people who would buy a Titan at $499 would then buy a 680 for $299... or a 7970 at whatever price AMD needed to be competitive... assuming people didn't apply the same logic/performance-per-$ metric and buy a couple of bargain-basement GTX 660 Tis or 7950s instead.


----------



## happita (Feb 22, 2013)

I guess Nvidia had a good reason for naming it the "Titan"


----------



## Xzibit (Feb 22, 2013)

HumanSmoke said:


> Depends how you look at it.
> 1. GK 110 wasn't developed as a gaming chip- the GK104 and Tahiti were.
> 2. The Titan uses equal power to the 7970GE yet offers 31.57% more gaming performance at 2560x1600
> The only real argument is price- which nobody is disputing, and is largely irrelevant since the pricing is 1. Deliberately set high to ensure Nvidia need not keep the consumer channel supplied with GPUs that would return better margins as Quadro and Tesla, and 2. Not to undermine the professional cards above it in the product stack.



How can it not be deliberately set high when there is no Quadro or Tesla option to compete?
If you buy a Tesla you need to buy a Quadro for video out; that's $6K.

Something that a $3.5K W10000 will do with full support. That's half the money and half the slots, and two-thirds the power saved right there, unless it's CUDA you're after.

That same premise can be made for every big Nvidia chip that went into a Tesla variant. Maybe you have more insight, but I haven't heard how selling those chips in GeForce variants hurt HPC sales in the past.


----------



## HumanSmoke (Feb 22, 2013)

Xzibit said:


> How can it not be deliberately set high when there is no Quadro and Tesla option to compete.
> If you buy a Tesla you need to buy a Quadro for video out thats $6K


Glad to see that we agree, although I'd hate to have you shopping for me:
Tesla K20 $3259 + Quadro NVS 510 $365 = $3624


Xzibit said:


> Something that a $3.5K W10000 will do with full support


Cool. Where can I buy this imaginary card ? You can't even buy the $3599 S10000 yet.


Xzibit said:


> Thats half the money


Well, it's actually 99.3%... or 74.7% with the same store's K20X. You use the same numerical system as Aquinus?
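The two percentages check out against the prices quoted in this exchange:

```python
# The "99.3% / 74.7%" figures, using the store prices cited in the thread.
s10000 = 3599                 # AMD FirePro S10000
k20_combo = 3259 + 365        # Tesla K20 + Quadro NVS 510 = $3624
k20x_combo = 4450 + 365       # Tesla K20X + Quadro NVS 510 = $4815

ratio_k20 = s10000 / k20_combo     # ~99.3% - barely "half the money"
ratio_k20x = s10000 / k20x_combo   # ~74.7%

print(f"{ratio_k20:.1%}, {ratio_k20x:.1%}")
```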


Xzibit said:


> and half the slots


3 slots versus 2 slots = two thirds


Xzibit said:


> and 2/3rd the power saved right there


Unlikely. The S10000 is board-rated at 375 watts (in comparison, the W9000 is rated at 274 watts and reaches that consumption). The K20/K20X is rated at 225/235W, and the Quadro NVS 510 at 35 watts.
If you think that two Tahiti GPUs use less power than one GK110 plus one GK107, then I'd suggest you do some more fact checking.
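Using the board TDPs cited here, the power comparison works out as:

```python
# Board-power comparison using the TDPs cited above (rated, not measured).
s10000_tdp = 375                  # dual-Tahiti FirePro S10000
k20x_tdp, nvs510_tdp = 235, 35    # Tesla K20X + Quadro NVS 510

nvidia_combo = k20x_tdp + nvs510_tdp   # 270 W
print(s10000_tdp, nvidia_combo, s10000_tdp - nvidia_combo)
```

So the dual-GPU AMD board is rated roughly 100 W higher than the two-card Nvidia setup, not two-thirds lower.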


Xzibit said:


> unless its CUDA your after.


Considering AMD's pro drivers are basically non-existent, I'd say that the Quadro drivers and apps also come into that equation.


Xzibit said:


> That same premise can be made for every big chip Nvidia has released that went into a Tesla variant.  Maybe you have more insight but I havent heard how selling those chips in GeForce variants hurt HPC sales in the past.


Maybe you couldn't understand my previous post. I will reiterate:
Tesla and Quadro cards retain full compute ability; GeForce cards, with the exception of the Titan, have had their compute features artificially limited to protect the Quadro and Tesla brands.


----------



## Xzibit (Feb 22, 2013)

I like how money is suddenly an issue for a professional setup.

What happened to the "it's not targeted at you" TITAN argument?



HumanSmoke said:


> Glad to see that we agree, although I'd hate to have you shopping for me:
> Tesla K20 $3259 + Quadro NVS 510 $365 = $3624



If you're gonna try to correct someone, at least reference the proper card yourself:
K20X $4,450



HumanSmoke said:


> Maybe you couldn't understand my previous post. I will re-iterate:
> Tesla and Quadro cards retain full compute ability. GeForce cards with the exception of the Titan have had their compute features artificially limited to protect the Quadro and Tesla brands.



Can you confirm this? Is firmware no longer a limiting factor?
Links, please.




HumanSmoke said:


> You can't even buy the $3599 S10000 yet.


AMD FirePro S10000
TigerDirect
SabrePC - the same site you referenced


----------



## Freedom4556 (Feb 23, 2013)

You know, I love that you guys do price/performance charts, especially broken down by resolution. I do have one suggestion that would make them absolutely perfect, though. You do the price/performance calculation for us and then order by best "value", which can be useful for some, but a lot of us are looking for maximum performance without crossing a harsh "diminishing returns" wall (kind of like Tom's "Best ___ for the Money" columns). What I'd like to see is price on one axis (so we could mentally adjust for price changes later) and performance on the other, ordered by performance and broken down by resolution as it is now. Personally I'm picturing a line chart, or even the current bar chart rotated 90 degrees, but ordered by _performance_ instead of value.

I guess at the end of the day, the question I really want that section to answer is: "At a given resolution, at what point do I hit 60 fps average [overkill] or start getting ripped off [diminishing returns]?" I know a GeForce 660 is a great value, but it's not going to drive the FPS I want at 2560x1440 with high details, you know?


----------



## HumanSmoke (Feb 23, 2013)

Xzibit said:


> I like how suddenly money is an issue for a professional setup.


Stop trolling. You know full well that I was correcting your faulty $6K figure.


Xzibit said:


> If your gonna try and correct someone atleast reference the proper card yourself
> K20X $4,450


You mean the card I already referenced? If you can't comprehend my posts, why bother trying to answer them?


HumanSmoke said:


> Well, its actually 99.3%....or 74.7% with the same stores K20X. You use the same numerical system as Aquinas ?





Xzibit said:


> You can confirm this? Firmware is not limiting factor anymore?





> the biggest factor is that for the first time on any consumer-level NVIDIA card, double precision (FP64) performance is uncapped. That means 1/3 FP32 performance, or roughly 1.3TFLOPS theoretical FP64 performance. NVIDIA has taken other liberties to keep from this being treated as a cheap Tesla K20, but for lighter workloads it should fit the bill.
> 
> As compared to the server and high-end workstation market that Tesla carves out, NVIDIA will be targeting the compute side of Titan towards researchers, engineers, developers, and others who need access to (relatively) cheap FP64 performance, and don’t need the scalability or reliability that Tesla brings.


[source 1], [source 2], [and source 3 on the first page of this thread]


----------



## Xzibit (Feb 23, 2013)

HumanSmoke said:


> [source 1], [source 2], [and source 3 on the first page of this thread]



Really? Linking to more benchmarks?

That doesn't establish whether the TITAN is firmware-limited like previous GeForce *80s.


----------



## HumanSmoke (Feb 23, 2013)

Xzibit said:


> That doesnt establish if the TITAN is not Firmware limited like previous GeForce *80s.


What part of "double precision (FP64) performance is uncapped. That means 1/3 FP32 performance, or roughly 1.3TFLOPS theoretical FP64 performance" don't you understand?


----------



## cadaveca (Feb 23, 2013)

Xzibit said:


> Really linking to more benchmarks
> 
> That doesnt establish if the TITAN is not Firmware limited like previous GeForce *80s.



But running it under LN2 does. Cards are VRM-limited for LN2; firmware doesn't even need to be thought about. Find K1ngP1n's rig pics and your answer is there.


----------



## HumanSmoke (Feb 23, 2013)

cadaveca said:


> But, running it under LN2, does. Cards are VRM limited for LN2


I'm sure someone will get around to hard modding/adding a daughterboard to the card at some stage...probably about 5 minutes after the HWBot leaderboard becomes congested with unmodded Titans filling the single, 2, 3, and 4 card benchmarks.

/Looking forward to a succession of Titan OC'ers *shattering* the 3DM Fire Strike record...by 5 points...every few days


----------



## cadaveca (Feb 23, 2013)

HumanSmoke said:


> I'm sure someone will get around to hard modding/adding a daughterboard to the card at some stage.



K1ngP1n already did. Which to me says the cards are VRM limited already. 1 day after launch. 

You just can't compete with these guys who work at the OEMs and have open access to parts. Anything anyone else would try has already been done. Now it's just a matter of binning cards for the best one, and at $1000 a pop, that's not gonna happen too quickly. 1750 MHz, more than double stock, is already posted on HWBOT.


----------



## HumanSmoke (Feb 23, 2013)

cadaveca said:


> K1ngP1n already did. Which to me says the cards are VRM limited already. 1 day after launch.


Understandable from a vendor's point of view. The nature of the enthusiast is to keep pushing until something breaks... and components breaking, regardless of the circumstances, tends to reflect badly on the manufacturer. I'm pretty sure that if thermal throttling were removed from modern CPUs, or the average Joe Blow could switch off PowerTune, the blowback would more than negate any gain from HWBot competition.
As far as Nvidia is concerned, you could probably see the writing on the wall when GTX 590s started producing fireworks when overvolted. In an era when a single YouTube video can negate a whole marketing campaign, it's easy to see why they wouldn't take the chance.


cadaveca said:


> You just can't compete with these guys that work at the OEMs and have open access to parts. Anything anyone else would try has already been done.


Pretty much. With competitive overclocking now being a valid PR and marketing tool, every vendor seems eager to jump on the bandwagon, which means the traditional enthusiast-orientated powerhouses need to up the ante.


cadaveca said:


> Now it's just a matter of binning cards for the best one, and @ $1000 a pop, that's not gonna happen too quickly.  1750 MHz,  more than double stock, already posted on HWBOT.


I'd be surprised if the top vendors weren't already binning for factory-OC'd "specials" like the Asus Matrix/DCII, MSI Lightning, EVGA SSC/HC, and Gigabyte WF3 - in which case they will certainly be putting aside golden samples for the extreme crowd.


----------



## radrok (Feb 23, 2013)

Meh, they don't cherry-pick chips; this has been proven at least for ASUS. I mean, look at the 7970 Platinum: some clock 100 MHz worse than reference GPUs...


----------



## BigMack70 (Feb 23, 2013)

radrok said:


> Meh, they don't cherry pick chips, this has been proven atleast for ASUS, I mean look at the 7970 Platinum, some clock 100 Mhz worse than reference GPUs...



A lot of the time the cherry-picking on cards like that is for LN2, not air. I can tell from the ASIC quality on my chips (in addition to the results of others) that Lightning 7970s are absolutely binned for LN2.


----------



## johnspack (Feb 23, 2013)

Isn't this kind of like the 7950 GT, which was released about 3 months before the 8-series just to pacify the enthusiasts? Just a quick market grab. Money well spent!


----------



## Aquinus (Feb 23, 2013)

HumanSmoke said:


> You think that is why Nvidia have broken their own convention of castrating FP64 performance on GeForce cards by allowing full 1:3 rate double precision on a consumer GTX Titan ?



You make it sound like you can enable full-power DP math on non-Titan GeForce chips. Let's get something perfectly clear: how many shaders and how much performance did this card have to dedicate to get that 1:3 DP math? You also gimp SP math when you enable full-speed DP math. At least the 7970 just does compute well, regardless of whether it's DP or SP.


HumanSmoke said:


> It would seem rather obvious (at least to me) that Nvidia is casting a wider net than just Surround gaming. GK110 was developed primarily for compute, yet aside from disabling MPI and ECC memory, the Titan retains not only the full spec of the $3200-4500 Tesla (inc the 6GB memory component) but also allows for the option of a 35% clock increase if the users workload is FP32 based.



You know, ECC memory is pretty important when you're doing compute applications. If you start overclocking the GPU, your results can't be guaranteed; ECC at least eliminates the concern about corrupted memory, to a point. Most gamers won't ever need full DP math, and as for the professionals who use Tesla cards, I think they might be interested in spending the extra money knowing that their compute can scale using MPI and that memory is reliable (ECC).


HumanSmoke said:


> Strange math. You work on a different numbering system where you live?
> @ 2560 by W1ZZ's charts the Titan shows an increase of 31.57% over the 7970GE and 42.86% increase over the GTX 680



I was considering all resolutions, which isn't the best gauge. 30% more performance at 100% more price is still a bit steep; a roughly 1:3 performance-to-price gain relative to the 680 still isn't exactly great.


HumanSmoke said:


> GK 110 wasn't developed as a gaming chip- the GK104 and Tahiti were.


You're going to have to prove that, I think. Just because it was in Tesla chips first does not mean it was designed for compute. The truth is we don't know why it came out late, and I doubt it was because it wasn't ready. I'm willing to bet that if the 7970 had been vastly faster and they had needed GK110, they would have released it. They didn't feel they had to, so they waited. The timing was pretty bad, IMHO, but I don't agree that GK110 was developed strictly with compute in mind.



HumanSmoke said:


> The only real argument is price- which nobody is disputing, and is largely irrelevant since the pricing is 1. Deliberately set high to ensure Nvidia need not keep the consumer channel supplied with GPUs that would return better margins as Quadro and Tesla, and 2. Not to undermine the professional cards above it in the product stack.


You know, people who actually invest in Tesla and use its features would think that not having MPI sucks, because now it's that much harder to get more than one of them to work together. If Titan is designed for compute, it's designed to do it on its own, because anything that would allow it to scale or be truly reliable for compute has been gimped. Also, once again, most data centers won't want a Titan to crunch; they'll want something more reliable that has the features they need.

With all of that said, GK110 is a GPU that does DP math well when you enable it. I wouldn't go so far as to say it was designed for compute; Tesla has the extra hardware to do that the right way.


HumanSmoke said:


> Whats the point of Nvidia pricing Titan at $499 ?


That is what everyone else is saying, not me. I've been saying $700-750 USD would have been the sweet spot: $500-550 USD is too low and $1000 USD is too high. $750 feels like an acceptable medium that would get more buyers.
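A quick sanity check on that range, assuming the ~43% uplift over the GTX 680 at 2560x1600 quoted earlier in the thread and the 680's $500 launch price:

```python
# Price at which the Titan would merely match a GTX 680's performance-per-dollar,
# assuming the ~43% uplift at 2560x1600 cited earlier in the thread.
gtx680_price = 500.0   # GTX 680 launch MSRP
titan_uplift = 0.43    # relative performance vs the GTX 680

value_parity_price = gtx680_price * (1 + titan_uplift)   # ~$715
print(f"${value_parity_price:.0f}")
```

Anything above that figure and the Titan is strictly worse value than a 680, which is why the $700-750 number keeps coming up.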


----------



## HumanSmoke (Feb 23, 2013)

Aquinus said:


> You make it sound like you can enable full power DP math on non-Titan GeForce chips. Let's get something perfectly clear. How many shaders and performance did this card have to dedicate to get that 1:3 DP math?


Precisely? Zero. Of the 2688 shaders on the chip, 1792 are FP32-only and 896 are FP32/64. There are *no* dedicated FP64 shaders on the chip.


Aquinus said:


> You also gimp SP math when you enable full speed DP math. At least the 7970 just does compute well regardless if its DP or SP.


Yes...looks very gimped.
Single precision: [benchmark chart]
Double precision: [benchmark chart]


Aquinus said:


> You know, ECC memory is pretty important when you're doing compute applications.


The case for ECC with GDDR5: ECC is generally the province of pro graphics/math co-processors where error detection is critical. Obviously, Titan is not being aimed at those markets - that is what Quadro and Tesla are for.


Aquinus said:


> You know, for people who actually invest in Tesla and use it's features would think that not having MPI would suck because now it's that much harder to get more than one of them to work together. If Titan is designed for compute, it's designed to do it on its own because anything to allow it to scale or be truly reliable for compute has been gimped....





> What I’m trying to say is that for the last week I’ve been having to fend off our CS guys, who upon hearing I had a GK110 card wanted one of their own. If you’ve ever wanted proof of just how big a deal GK110 is – and by extension Titan – you really don’t have to look too much farther than that....Titan, its compute performance, and the possibilities it unlocks is a very big deal for researchers and other professionals that need every last drop of compute performance that they can get, for as cheap as they can get it. This is why on the compute front Titan stands alone; in NVIDIA’s consumer product lineup there’s nothing like it, and even AMD’s Tahiti based cards (7970, etc), while potent, are very different from GK110/Kepler in a number of ways. Titan essentially writes its own ticket here....As compared to the server and high-end workstation market that Tesla carves out, NVIDIA will be targeting the compute side of Titan towards researchers, engineers, developers, and others who need access to (relatively) cheap FP64 performance, and don’t need the scalability or reliability that Tesla brings - Anandtech





Aquinus said:


> If you start overclocking the GPU your results can't be guaranteed.


Titan doesn't allow overclocking when full rate FP64 is enabled for that precise reason:


> The penalty for enabling full speed FP64 mode is that NVIDIA has to reduce clockspeeds to keep everything within spec. For our sample card this manifests itself as GPU Boost being disabled, forcing our card to run at 837MHz (or lower) at all times-Anandtech


FP64 calculation is obviously slower than FP32 and requires much more power to run. Just as well the laws of physics are in play - a 90% improvement over Tahiti while using less power is probably not something AMD would like to see extended.


Aquinus said:


> ECC at least eliminates the concern for corrupted memory to a point.and as far as professionals who use the Tesla cards, I think they might be intested in spending the extra money knowing that they can have their compute scale using MPI and that memory is reliable (ECC)


Obviously, there are enough people who require just EDC rather than full ECC (I'll take the word of the Anandtech guys over a random here, I think)... after all, EDC was good enough for every AMD GPU (ECC was implemented only with the Southern Islands FirePros).


Aquinus said:


> Also once again, most data centers won't be wanted a Titan to crunch, they will be wanting something that's more reliable and has the features they need


I don't think anyone is suggesting Titan will be used in this manner.


Aquinus said:


> With all of that said, GK110 is a GPU that does DP math well when you enable it. I wouldn't go so far to say that it was designed for compute


If FP64 isn't compute (GPGPU), then what is it?
If you could, please list some applications that require double precision that *aren't* considered compute.
I think you'll find that most commercial applications (compute-orientated Maya and AutoCAD, for instance) use a combination of single and double precision.

So what you are trying to convey is that enthusiast gamers won't buy the card because it is too expensive, and GPGPU users won't buy the card because it lacks features... so no one will buy the card! (There aren't a whole lot of options left.) Your analysis differs from - and is, you'd have me believe, superior to - that of Anandtech's staff and Nvidia's strategic marketing planners. Well, hopefully you're right and the price craters a few weeks from now.


----------



## radrok (Feb 23, 2013)

If I can add to your debate: Vray (a renderer I use in 3ds Max instead of the default one) uses both DP and SP code, as far as I know.

I will use the heck out of my Titan's CUDA cores with V-Ray's CUDA acceleration; this GPU is a bloody good entry-level compute monster.

I'll give you more details as soon as my order arrives.


----------



## Cortex (Feb 23, 2013)

HumanSmoke said:


> Originally posted by Aquinus: "You make it sound like you can enable full power DP math on non-Titan GeForce chips. Let's get something perfectly clear. How many shaders and performance did this card have to dedicate to get that 1:3 DP math?"
>
> *Precisely? Zero.* Of the 2688 shaders on the chip, 1792 are FP32 capable, 896 are FP32/64. There are no dedicated FP64 shaders on the chip.


No: 2688 FP32-only and 896 DP (16*12 FP32 SPs and 16*4 FP64 SPs per SMX).

http://www.anandtech.com/show/6446/nvidia-launches-tesla-k20-k20x-gk110-arrives-at-last/3
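For context, the per-SMX bookkeeping behind those totals (per the Anandtech GK110 article linked above; Titan ships with 14 of GK110's 15 SMXes enabled):

```python
# GK110 SMX arithmetic behind the 2688 / 896 totals cited in this exchange.
fp32_per_smx = 192     # FP32 CUDA cores per SMX
fp64_per_smx = 64      # FP64 units per SMX
active_smx = 14        # Titan enables 14 of GK110's 15 SMXes

fp32_cores = fp32_per_smx * active_smx   # 2688
fp64_units = fp64_per_smx * active_smx   # 896
print(fp32_cores, fp64_units, fp64_units / fp32_cores)  # the 1:3 DP rate
```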


----------



## HumanSmoke (Feb 23, 2013)

Cortex said:


> Quote:
> Originally Posted by Aquinus View Post
> You make it sound like you can enable full power DP math on non-Titan GeForce chips. Let's get something perfectly clear. How many *shaders* and performance did this card have to dedicate to get that 1:3 DP math?
> *Precisely? Zero.* Of the 2688 shaders on the chip,1792 are FP32 capable, 896 are FP32/64. There are no dedicated FP64 shaders on the chip.
> ...


The question, I believe, was about *dedicated* FP64 *cores/shaders*.
The 896 double-precision units are linked to FP32 shaders. As far as my understanding goes, a conventional core/shader encompasses the whole graphics pipeline (Input Assembler > Vertex > Hull > Tessellation > Domain > Geometry > Raster > Pixel), while the FP64 unit is largely a separate entity - which is why the literature differentiates it as a _unit_ rather than a _shader_ or _core_. Is this not correct?




I wouldn't argue that the units take up die real estate (as they do in any architecture), just that the units aren't shaders by definition - I have never heard the GK110 die described as a 3840-core GPU, for instance. The number is usually given as 2880.


----------



## Prima.Vera (Feb 23, 2013)

Where is the SLI/3-way SLI review? The link is not working...


----------



## radrok (Feb 23, 2013)

You mean this one? 

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_SLI/


----------



## syeef (Feb 25, 2013)

Just curious: could GPUs with Boost 1.0 be updated to Boost 2.0 with a BIOS update in the future?


----------



## Am* (Feb 28, 2013)

This card is just a joke...I can see the promotional campaigns already:

GTX Turd of the Titan... one year late, overpriced at $1000, doesn't perform like a $1000 GPU, doesn't use the quality materials of a $1000 GPU (it skips the 690's magnesium alloy because that was "too expensive"), and can't even handle future console ports with 6GB of VRAM, since the PS4 will use 8GB of GDDR5. A gigantic fucking failure is what this card is. Judging by this release, I will most likely skip the Kepler GPUs entirely.


----------



## qubit (Feb 28, 2013)

Am* said:


> This card is just a joke...I can see the promotional campaigns already:
> 
> GTX Turd of the Titan...one year late, overpriced at $1000, does not perform like a $1000 GPU, does not use the quality materials of a $1000 GPU (this doesn't use magnesium alloy of the 690 because it was "too expensive") and cannot even run future console ports with 6GB VRAM, since PS4 will use 8GB GDDR5. A gigantic fucking failure is what this card is. Judging by the release of this card, I will most likely skip the Kepler GPUs entirely at this rate.



That's just too good a rant. 

I think the card is really good, but it _is_ very late and ridiculously overpriced. If I were in the market for a card in this price range, I'd buy the 690, as it handily outperforms the Titan. SLI issues be damned.

As it is, I just got myself an overclocked GTX 590 for the same price as a 680, which outperforms it and isn't all that far behind a Titan.


----------



## Ikaruga (Feb 28, 2013)

Am* said:


> This card is just a joke...I can see the promotional campaigns already:
> 
> GTX Turd of the Titan...one year late, overpriced at $1000, does not perform like a $1000 GPU, does not use the quality materials of a $1000 GPU (this doesn't use magnesium alloy of the 690 because it was "too expensive") and cannot even run future console ports with 6GB VRAM, since PS4 will use 8GB GDDR5. A gigantic fucking failure is what this card is. Judging by the release of this card, I will most likely skip the Kepler GPUs entirely at this rate.



not sure if serious.....


----------



## EarthDog (Mar 1, 2013)

Ikaruga said:


> not sure if serious.....



Me either. Especially with the ps4 comment... I was on board until that line!


----------



## Am* (Mar 3, 2013)

Ikaruga said:


> not sure if serious.....




I'm dead serious. Last-gen consoles had what, 256MB (PS3) and 512MB (360) of VRAM? Now show me a graphics card that came out before the 360 and the PS3, with less VRAM than the consoles, that can RUN any of the recent console ports, let alone PLAY them at anywhere near the same level of visual quality...oh that's right, you can't, because console ports are insanely bloated by the time they hit the PC, and those cards are physically incapable of it - they have neither enough VRAM nor an API as capable as the consoles' (DX10 cards did not hit the market until long after the consoles were released). I know it's usually a silly idea to play the waiting game with hardware releases, but with Nvidia's current GPU pricing, waiting has never made more sense than now.

P.S. W1zzard, you may want to change the review a bit in regards to the magnesium alloy, as this card doesn't use it (see here. Even the guy from Nvidia confirmed it in this live launch video (skip to 07:00)).


----------



## librin.so.1 (Mar 3, 2013)

inb4
please rewrite Your post in at least a bit more structured manner. Because I have a very hard time grasping the point You are trying to make due to Your post being a complete mess.
Thank You.


----------



## LAN_deRf_HA (Mar 3, 2013)

Hey how come the Titan and 690 don't have bumpers like the 480 did?



Am* said:


> I'm dead serious. Last gen consoles had what, 256MB (PS3) and 512MB (360) VRAM? Now show me a graphic card that came out before the 360 and the PS3 with less than 256MB VRAM that can run any of the recent console ports, let alone at anywhere near the same level of visual quality...oh that's right, you can't, because console ports are insanely bloated by the time they hit the PC. And because they are physically incapable of doing so due to not having DX10 (as these cards did not hit the market until long after the consoles were released).



That RAM isn't dedicated to the GPU. It may even end up being hard-partitioned like the PS3's. If the new wave of consoles were just game-dedicated, they would have been in the 2-3 GB range. The RAM quantity is there to support heavy, PC-like multitasking and bloat. Even the richest next-gen games will fall well within the Titan's VRAM limits.


----------



## Am* (Mar 3, 2013)

LAN_deRf_HA said:


> Hey how come the Titan and 690 don't have bumpers like the 480 did?
> 
> 
> 
> That ram isn't dedicated to the gpu. It may even end up being hard partitioned like the PS3s. If the new wave of consoles were just game dedicated they would have been in the 2-3 gb range. The ram quantity is to support heavy PC-like multitasking and bloat. Even the richest next gen games will fall well within the Titan's vram limits.



While that is debatable at this time, I can almost guarantee you that it will pretty much be entirely dedicated to the GPU, because the only reasons system RAM existed in game consoles were to

A. Run the operating system/graphical user interface OR
B. Serve as cache/shared VRAM when the GPU ran out of dedicated VRAM OR
C. Pass data from the hard/optical drive to the VRAM of the GPU.

By using a tonne of fast VRAM and sharing it with both the processor and the graphics chip, they've removed the need for normal system RAM. That eliminates options B and C above, which means the only thing the memory will serve besides graphics is the graphical user interface and the OS; the rest will be entirely free for the graphics chip. The operating systems in consoles use about 1/20th of the memory resources of a desktop PC, if not much less (I read somewhere that the PS3 uses about 8MB of RAM for the OS when gaming - about 1/100th of the memory usage of Windows 7 at idle). Therefore you will need a graphics card with at least the same amount of VRAM to even stand a chance, regardless of the Titan's GPU performance.
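The disagreement in this thread is really about how a unified 8 GB pool gets split. A purely illustrative sketch - every figure below is a hypothetical assumption, not a published spec - shows how such a budget might look:

```python
# Hypothetical split of a unified 8 GB console memory pool.
# All figures are illustrative assumptions for the sake of argument.
TOTAL_GB = 8.0

budget = {
    "OS / UI reserve":          1.0,  # console OSes reserve far less than desktops
    "game logic, AI, audio":    1.5,  # assets with no reason to sit in VRAM
    "textures, buffers, etc.":  5.5,  # what a PC port would need as VRAM
}

# Sanity check: the split must account for the whole pool.
assert abs(sum(budget.values()) - TOTAL_GB) < 1e-9

for use, gb in budget.items():
    print(f"{use:<24} {gb:.1f} GB ({gb / TOTAL_GB:.0%})")
```

Under this particular (debatable) split, the graphics-side share stays below the Titan's 6 GB; shifting that last row up or down is exactly the point both sides are arguing.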


----------



## librin.so.1 (Mar 3, 2013)

Games use a lot of memory for other assets - sounds, AI, the physics engine - and plenty of other stuff in a game uses up memory that is NOT graphics related. Thus saying all the usage is graphics is false.
So when speaking of console ports, You have to keep in mind that the memory a game uses is going to be distributed between vRAM and system RAM. Definitely not a 1:1 split, but really, _at least_ a gigabyte of a game's assets would have absolutely 0 reason to be put in vRAM.

So stop implying a GPU needs to have as much dedicated vRAM as a console has in total just to be able to run them.


----------



## HumanSmoke (Mar 3, 2013)

Am* said:


> Therefore you will need a graphics card with at least the same amount of VRAM to even stand a chance, regardless of the Titan's GPU performance.


Sounds unlikely in the extreme. A console's spec needs to carry it through its entire life cycle - you either over-engineer or don't create enough separation from the previous consoles.
If you are under the impression that 6GB of GDDR5 will fall short of the requirements for the next gen of games, then every single gaming card built thus far becomes obsolete overnight...and do you really think that either vendor or the game developers are going to welcome that particular user-base backlash?


----------



## Aquinus (Mar 3, 2013)

I think people are forgetting that the PS4 has 8GB of *shared* GDDR5. I bet you that in most cases the PS4 will initially use significantly less of that memory for VRAM, and that will shift as the platform ages and more games come out for it.


----------



## librin.so.1 (Mar 3, 2013)

Aquinus said:


> I think people are forgetting that the PS4 has 8GB of *shared* GDDR5. I bet you that in most cases the PS4 will initially use significantly less of that memory for VRAM, and that will shift as the platform ages and more games come out for it.



That was exactly my point!


----------



## Am* (Mar 3, 2013)

Vinska said:


> Games use a lot of memory for other assets - sounds, AI, physics engine, and a lot of other stuff in a game uses up a lot of memory which are NOT graphics related.



Nonsense. That "lot" of memory you're talking about is nothing compared to the amount taken up by the textures, models, objects etc.



Vinska said:


> Thus saying all the usage is from the graphics is false.



I never said it was going to be all of it; what I said was that it is going to be the vast majority (90%-95% of it). Even with your supposed 1GB of the pool dedicated to physics, sound and whatever else, that still leaves 1GB more than what you get with the Titan.



Vinska said:


> So stop implying a GPU needs to have as much dedicated vRAM as a console has in total just to be able to run them.



That's because it DOES - maybe not to run the game, but definitely to get anywhere near the same image quality and performance; every previous console release proves this. Stop trying to imply otherwise.




HumanSmoke said:


> Sounds unlikely in the extreme. A console's spec needs to carry it through its entire life cycle - you either over-engineer or don't create enough separation from the previous consoles.
> If you are under the impression that 6GB of GDDR5 will fall short of the requirements for the next gen of games, then every single gaming card built thus far becomes obsolete overnight...and do you really think that either vendor or the game developers are going to welcome that particular user-base backlash?



What user backlash? And what do you mean by "obsolete"? Yes, a console's spec needs to carry it through its life cycle of 5-10 years...and remember, a PC graphics card's specs only need to carry it for about 1 year or less until the next generation hits.

Remember the same point in the last console generation, when the 7800GTX dropped and Far Cry was the ultimate benchmark, for which it was overkill? Think of Far Cry (or Doom 3 - both games were THE benchmarks back then, IIRC) as today's Crysis 3.

We're on the 600 series now, which is the equivalent of last console generation's 6000-series Nvidia GPUs, and the Titan is the equivalent of the 7800GTX. All it means is that your currently proclaimed "overkill" cards, running current-gen games across 1-3 massive monitors, will become merely "good enough" cards for next-gen ports at 1080p, with some graphical settings tweaked (up or down, depending on how demanding the game will initially be on the consoles). There will be no "backlash", the same way there wasn't one when they released a 512MB version of the 7800GTX a few months after the 256MB version hit (which I'm betting is what will happen with the GTX 780, or whatever they call their fully enabled GK110 follow-up to the Titan with more VRAM), and the same way the 8000 series eventually butchered every 7000-series GPU.


----------



## Aquinus (Mar 3, 2013)

Am* said:


> Nonsense. That "lot" of memory you're talking about is nothing compared to the amount taken up by the textures, models, objects etc.



Only recently have games started to become 64-bit. A lot of games are still 32-bit and have the 32-bit memory limitation, so your "lot" is usually still constrained to 2GB.
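The 32-bit ceiling mentioned above is a hard address-space limit, and the 2GB figure is the default user-mode share of it on Windows. The arithmetic, as a quick sketch:

```python
# Address-space ceiling of a 32-bit process.
POINTER_BITS = 32
addressable_bytes = 2 ** POINTER_BITS           # every address a 32-bit pointer can name
addressable_gib = addressable_bytes / 2 ** 30

print(f"max addressable: {addressable_gib:.0f} GiB")            # 4 GiB total
# Windows reserves half of the address space for the kernel by default,
# so a 32-bit game process gets roughly 2 GiB unless it is built
# large-address-aware (which raises the limit toward 3-4 GiB).
print(f"default user-mode share: {addressable_gib / 2:.0f} GiB")  # 2 GiB
```

So regardless of how much RAM the machine has, a 32-bit game engine cannot budget more than a couple of gigabytes for all of its assets combined.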


Am* said:


> I never said it was going to be all of it, what I said was it is going to be the vast majority (90%-95% of it). Even with your supposed 1GB of VRAM dedicated for physics, sound and whatever else, it still leaves you with 1GB more than what you get in the Titan.



This statement is confusing. Could you elaborate on this?


Am* said:


> That's because it DOES, maybe not to run the game, but definitely to get anywhere near the same image quality and performance; every previous console release will only prove this. Stop trying to imply otherwise.



No, it doesn't. You actually have it backwards: if anything, the PC needs more resources because of the added overhead from everything else the computer is running. I'm willing to bet your installation of Windows is many times larger than the PS3's OS, because of what each is designed for. VRAM also doesn't determine image quality. There are cases where you drop the quality to get a game to run better, but that's not strictly a function of memory, and implying that it is, quite frankly, is wrong - you're spreading false information.



Am* said:


> What user backlash? And what do you mean by "obsolete"? Yes, a console's spec needs to carry it through its life cycle of 5-10 years...and remember, a PC graphics card's specs only need to carry it for about 1 year or less until the next generation hits.



More or less. I've had my first 6870 for several years now and only upgraded to CrossFire because I could. Most games (even Far Cry 3) run pretty well on it. So I wouldn't be so quick to say that video cards last only a year, because I had my first 6870 for a couple of years before I got a second one.


Am* said:


> All it means is that your currently proclaimed "overkill" cards that are running current gen games across 1-3 massive monitors will change to being "good enough" graphics cards at running next gen ports at 1080p, with some graphical settings tweaked (up or down, depending on how demanding the game will initially be on the consoles).



That's not the problem. The problem is that for the added shaders and components in the Titan, its performance is underwhelming for a price tag of $1000 USD. The Titan is not overkill, because in reality it's not dramatically faster than the 7970 or the 680. The only thing that is overkill about the Titan is the price tag. Otherwise it looks like a damn fine piece of hardware.


----------



## Am* (Mar 3, 2013)

Aquinus said:


> Only recently have games been starting to become 64-bit. A lot of games are still 32-bit and have the 32-bit memory limitation so your "lot" is usually still constrained to 2Gb.



Of which about 1980MB is used for graphics. Vinska was trying to suggest that physics and sound engines have suddenly taken over our RAM, when he/she couldn't be further from the truth. The progression of in-game physics and sound has been baby steps compared to the graphical leaps in games - sound design has been at a standstill since around 2005, or whenever the last hurrah for EAX happened (which they don't use on consoles anyway), and physics has barely advanced in games, since almost all games use pre-calculated physics effects for the most impressive/demanding parts to ease porting between platforms.



Aquinus said:


> This statement is confusing. Could you elaborate on this?



See above.



Aquinus said:


> No it doesn't. You actually have it backwards. If anything the PC needs more resources because there is added overhead from everything else the computer is running. I'm willing to bet that your installation of Windows is many times larger than the OS for the PS3 will be because of what it's designed for. VRAM also doesn't determine image quality. There are cases where you drop the quality to get it to run better but that's not strictly a function of memory and implying that it is, quite frankly is wrong and you're spreading false information.



I have no idea what you're arguing about here. I specifically stated that current GPUs need at LEAST the same amount of VRAM to run next-gen console ports to a good standard, or at least at a level comparable to the consoles. I think you read my comment backwards or something.

And YOU'RE the one seemingly spreading false information - consoles, as you and every other PC enthusiast believe, provide a "good enough" baseline of performance compared to desktops (and run at inferior image quality compared to the PC), which, together with the driver overheads and so on that you mentioned earlier, proves my comment 100% right.



Aquinus said:


> More or less. I've had my first 6870 for several years now and only upgraded to crossfire because I could. Most games (even Farcry 3,) run pretty well on it. So I wouldn't be so specific to say that video cards last for a year because I had my first 6870 for a couple years before I got a second one.



That has nothing to do with what I said. What I said was that a PC graphics card's performance lifespan is only relevant for the games released around its time, or before its replacement arrives, unless the developer is specifically aiming at lower-tier PC hardware. Normally, nobody rages because their top-of-the-line GPU has been superseded by something faster and better - it comes with the hobby.



Aquinus said:


> That's not the problem. The problem is that for the added shaders and components in the Titan, it's performance is underwhelming for a price tag of $1000 USD. The Titan is not overkill because in reality it's not incredibly faster than the 7970 or the 680. The only thing that is overkill about the Titan is the price tag. Otherwise it looks like a damn fine piece of hardware.



I never disputed that. All I said was even though it is quite a bit more powerful than what the consoles are packing, the 6GB VRAM will limit its performance in next gen titles, judging by how badly most mainstream games are being ported nowadays.


----------



## librin.so.1 (Mar 3, 2013)

Am* said:


> Of which, about 1980MB is used for graphics. Vinska was trying to suggest that suddenly physics and sound engines have taken over our RAM when he/she couldn't be further from the truth.



Should I show You a dump of a typical game's memory map, with annotations of what data goes where?
There is A LOT of data a game needs that is not directly related to graphics.


----------

