# NVIDIA GeForce GTX Titan Graphics Card Pictured in Full



## btarunr (Feb 18, 2013)

Here it is, folks; the first pictures of NVIDIA's newest pixel-crunching dreadnought, the GeForce GTX Titan. Pictures leaked by various sources east of the Greenwich Meridian reveal a reference board design that's similar in many ways to that of the GeForce GTX 690, thanks to the magnesium alloy cooler shroud, a clear acrylic window letting you peep into the aluminum fin stack, and a large lateral blower. The card features a glowy "GeForce GTX" logo much like the GTX 690, draws power from a combination of 6-pin and 8-pin PCIe power connectors, and features two SLI bridge fingers letting you pair four of them to run 3DMark Fire Strike as if it were a console port from last decade.



 

 




The GeForce GTX Titan PCB reveals that NVIDIA isn't using a full-coverage IHS on the GK110 ASIC, but rather just a support brace. This allows enthusiasts to apply TIM directly to the chip's die. The GPU is wired to a total of twenty-four 2 Gbit GDDR5 memory chips, twelve on each side of the PCB. The card's VRM appears to be a 6+2 phase design that uses tantalum capacitors, slimline chokes, and driver-MOSFETs. The PCB features a 4-pin PWM fan power output, and a 2-pin LED logo power output that's software controllable.

Given the rumored specifications of the GTX Titan, the card could be overkill for even 2560 x 1600, and as such could be designed for 3DVision Surround (3 display) setups. Display outputs include two dual-link DVI, an HDMI, and a DisplayPort.

According to most sources, the card's specifications look something like this:
- 28 nm GK110-based ASIC
- 2,688 CUDA cores ("Kepler" micro-architecture)
- 224 TMUs, 48 ROPs
- 384-bit GDDR5 memory interface
- 6 GB memory
- Clocks:
  - 837 MHz core
  - 878 MHz maximum GPU Boost
  - 6008 MHz (effective) memory
- 250 W board power
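As a sanity check, the 6 GB memory size and the implied memory bandwidth follow directly from the rumored chip count and bus figures. A quick sketch of the arithmetic; all inputs are the unconfirmed leaked specs listed above:

```python
# Derive memory size and bandwidth from the leaked (unconfirmed) figures.

# 24 GDDR5 chips at 2 Gbit each:
chips = 24
gbit_per_chip = 2
total_gb = chips * gbit_per_chip / 8   # gigabits -> gigabytes
print(total_gb)                        # 6.0 -> matches the 6 GB figure

# 384-bit bus at a 6008 MHz effective data rate:
bus_bytes = 384 / 8                    # bus width in bytes
bandwidth_gbs = bus_bytes * 6008 / 1000
print(round(bandwidth_gbs, 1))         # 288.4 GB/s
```

If the leaked numbers hold, that ~288 GB/s is a healthy step up from the GTX 680's 256-bit bus.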

*View at TechPowerUp Main Site*


----------



## KainXS (Feb 18, 2013)

review eta wizz?


----------



## okidna (Feb 18, 2013)

btarunr said:


> and features two SLI bridge fingers letting you pair four of them to run 3DMark Fire Strike as if it were a console port from last decade.





Neat design just like GTX 690, love it


----------



## Rei86 (Feb 18, 2013)

Review WIZZ COMEON !!@!@!

NDA should be lifted


----------



## Crowned Clown (Feb 18, 2013)

At last! Only thing left to start my new build is Ivy B-E!


----------



## Flibolito (Feb 18, 2013)

That is beautiful. Wow, I wish 4K monitors were available; I would hop on one of these.


----------



## NutZInTheHead (Feb 18, 2013)

That's an awesome looking card. I hope the price is reasonable.


----------



## Enmitynz (Feb 18, 2013)

The single most sexy card ive ever seen. Where is the insert hole for peen? I wanna fuck this burrito!


----------



## Delta6326 (Feb 18, 2013)

Now to have this in the Case Labs S3 with custom water...


----------



## Animalpak (Feb 18, 2013)

Ladies and gentlemen... THIS IS OUR METEOR !! 


Pure show of strength !


----------



## renz496 (Feb 18, 2013)

The look alone looks expensive lol. So I suppose this thing should compete head to head with the 7970 (which is also rated at 250 W)?


----------



## xenocide (Feb 18, 2013)

renz496 said:


> The look alone looks expensive lol. So I suppose this thing should compete head to head with the 7970 (which is also rated at 250 W)?



It should crush the HD7970.


----------



## Delta6326 (Feb 18, 2013)

more like 7970 cfx...


----------



## The Von Matrices (Feb 18, 2013)

KainXS said:


> review eta wizz?



If he even acknowledged that the card existed he would be breaking the NDA.


----------



## zolizoli (Feb 18, 2013)

I am really worried about the price... NVIDIA is my favorite forever, but lately they've become GIGA GREEDY. They do the same as Apple: nice design, low-cost hardware, gigantic marketing = astronomical price.
I suspect it will cost the same as a 690 but be slower, and if that's so, there is no point in buying it. I don't care about a bit of micro-stuttering or 2 GB less RAM. I'm not a molecular scientist, I'm just an average gamer who loves tech...

That's how it goes nowadays in NVIDIA's lab:
TECHNICIAN: Hey boss, we have a large amount of GK110 in storage from the Tesla program. Should I throw them in the container? We need the space for the next project.
BOSS: No way. There are lots of stupid gamers out there. Call the design department to make a fancy name and design for this garbage, and let's make some cash. If you pull it off, I'll make you my personal coffee maker!
TECHNICIAN: You got it, BOSS.


----------



## Mathragh (Feb 18, 2013)

The VRMs don't look *that* beefy. Does anyone know how this compares to a 7970?


----------



## btarunr (Feb 18, 2013)

Mathragh said:


> The VRMs don't look *that* beefy. Does anyone know how this compares to a 7970?



It looks sufficient for 250W.


----------



## SIGSEGV (Feb 18, 2013)

Let NVIDIA roll this card out in the market and suck money from people who always want the fastest thing in their life. I'd like to save my money and buy a PS4 console in the future. I'm done with NVIDIA.

Anyway, its design looks cool and sweet. It will surely attract many ants to this sugar candy.


----------



## Mathragh (Feb 18, 2013)

btarunr said:


> It looks sufficient for 250W.



Not trying to bash the card or anything, just curious: if this is to be such a monster card, shouldn't it also have a monster VRM, or at least quite a beefy one? If I were Nvidia bringing out "the card of cards," I'd make sure that when people go all crazy with this card, the problem won't be the VRMs.

I'm not saying it has bad VRMs, but I'm wondering if anyone with knowledge of these things has anything to say about the VRMs we can see in the pictures.
The only thing that really got my attention is that apparently they didn't feel the need to fill all the space reserved for the VRMs, as there appear to be 2 (or 1.5) empty spots.
I could be totally wrong, however.


----------



## Nordic (Feb 18, 2013)

Aren't 680s voltage-locked/limited in some way? It would be funny if they did that here.


----------



## okidna (Feb 18, 2013)

Mathragh said:


> Not trying to bash the card or anything, just curious: if this is to be such a monster card, shouldn't it also have a monster VRM, or at least quite a beefy one? If I were Nvidia bringing out "the card of cards," I'd make sure that when people go all crazy with this card, the problem won't be the VRMs.
> 
> I'm not saying it has bad VRMs, but I'm wondering if anyone with knowledge of these things has anything to say about the VRMs we can see in the pictures.
> The only thing that really got my attention is that apparently they didn't feel the need to fill all the space reserved for the VRMs, as there appear to be 2 (or 1.5) empty spots.
> I could be totally wrong, however.



If I'm not wrong (I'm no "VRM expert" at all), Titan is using a 6+2 VRM config. 
More or less the same as the reference design of the 7970 (note that AMD also did not fill all the space reserved for the VRMs).

These two cards also use the same power connector layout, 6+8 pin. So IMHO, if this kind of VRM setup is sufficient for the HD 7970, it should be sufficient for Titan.

But just like you said, I could be totally wrong.


----------



## Lionheart (Feb 18, 2013)

I must admit, that is one nice looking card  

Time to play the waiting game for *Wizzard's* review


----------



## Mathragh (Feb 18, 2013)

okidna said:


> If I'm not wrong (I'm no "VRM expert" at all), Titan is using a 6+2 VRM config.
> More or less the same as the reference design of the 7970 (note that AMD also did not fill all the space reserved for the VRMs).
> 
> These two cards also use the same power connector layout, 6+8 pin. So IMHO, if this kind of VRM setup is sufficient for the HD 7970, it should be sufficient for Titan.
> ...



Makes sense


----------



## buggalugs (Feb 18, 2013)

zolizoli said:


> I am really worried about the price... NVIDIA is my favorite forever, but lately they've become GIGA GREEDY.



Whadaya mean lately? Nvidia have been that way for 15 years.


----------



## Kovoet (Feb 18, 2013)

Think I'll wait till next year and maybe just go crossfire again.


----------



## zolizoli (Feb 18, 2013)

buggalugs said:


> Whadaya mean lately? Nvidia have been that way for 15 years.



Not really. Even in the time of the GTX 280 it was affordable at release, and it was the fastest single chip for a while.
I think they turned the GREED ENGINE on with the GTX 500 series, which was just a refreshed 400 series.
The 680 is insanely priced; the GK104 wasn't even designed to be high end. But it performed better than expected, so why not fool the customers and rip them off as long as they can, keep yesterday's tech on a shelf, and when the customers recover financially, sell it as the future's wonder tech?
These greedy corporations are holding back our technological evolution.


----------



## KashunatoR (Feb 18, 2013)

They should have only given it 4 GB of VRAM and shaved off 100 bucks...


----------



## Nordic (Feb 18, 2013)

zolizoli said:


> Not really. Even in the time of the GTX 280 it was affordable at release, and it was the fastest single chip for a while.
> I think they turned the GREED ENGINE on with the GTX 500 series, which was just a refreshed 400 series.
> The 680 is insanely priced; the GK104 wasn't even designed to be high end. But it performed better than expected, so why not fool the customers and rip them off as long as they can, keep yesterday's tech on a shelf, and when the customers recover financially, sell it as the future's wonder tech?
> These greedy corporations are holding back our technological evolution.



Funding our future tech evolution.


----------



## renz496 (Feb 18, 2013)

^ AFAIK even 3 GB is plenty for triple-monitor gaming. Maybe they (Nvidia) want to brag about their performance with 4K monitors....


----------



## micropage7 (Feb 18, 2013)

Somehow reminds me of a RoboCop theme, with the silver and black.


----------



## Fluffmeister (Feb 18, 2013)

zolizoli said:


> Not really. Even in the time of the GTX 280 it was affordable at release, and it was the fastest single chip for a while.
> I think they turned the GREED ENGINE on with the GTX 500 series, which was just a refreshed 400 series.
> The 680 is insanely priced; the GK104 wasn't even designed to be high end. But it performed better than expected, so why not fool the customers and rip them off as long as they can, keep yesterday's tech on a shelf, and when the customers recover financially, sell it as the future's wonder tech?
> These greedy corporations are holding back our technological evolution.



Cool story bro!


----------



## 1c3d0g (Feb 18, 2013)

EGGcellent.  This should be an amazing number-crunching GPU for us BOINC'ers/Folding@Home users. Too bad the price will be too high (~$900 last I heard), but at least by the end of the year the GTX 780 and crew will be available for a more reasonable amount.


----------



## T4C Fantasy (Feb 18, 2013)

http://www.techpowerup.com/gpudb/1996/NVIDIA_GeForce_GTX_Titan.html


----------



## Prima.Vera (Feb 18, 2013)

T4C Fantasy said:


> http://www.techpowerup.com/gpudb/1996/NVIDIA_GeForce_GTX_Titan.html



Next the review??


----------



## D4S4 (Feb 18, 2013)

that's one big ass die. 700cid bbc big.


----------



## lastcalaveras (Feb 18, 2013)

Should be a nice card, and the cut-down version for the 760 Ti should be even better for the price. However, there are two dreadful words that make me think twice: "Nvidia Greenlight".


----------



## Naito (Feb 18, 2013)

zolizoli said:


> The 680 is insanely priced; the GK104 wasn't even designed to be high end. But it performed better than expected, so why not fool the customers and rip them off as long as they can, keep yesterday's tech on a shelf, and when the customers recover financially, sell it as the future's wonder tech?



:shadedshu I'd say it would be highly unlikely that it was Nvidia's intention to 'fool' their customers. Whether the reason was yields on a new fabrication node or simply that AMD's card performed worse than Nvidia had anticipated, it doesn't matter; it allowed AMD to stay very competitive. As for the 680, the card well and truly performs where one would expect a high-end card to perform, regardless of whether or not it was originally designed as such.


----------



## Kaynar (Feb 18, 2013)

renz496 said:


> ^ AFAIK even 3GB is plenty for triple monitor gaming. maybe they (nvidia) want to brag their performance with 4k resolution monitor....



If the performance of this card is 100% faster than the current top end, then 6 GB makes more sense, because this card will be plenty for at least 4-5 years of gaming, so they have to account for future needs... several titles already need nearly 3 GB of VRAM on one screen today.


----------



## the54thvoid (Feb 18, 2013)

lastcalaveras said:


> Should be a nice card, and the cut-down version for the 760 Ti should be even better for the price. However, there are two dreadful words that make me think twice: "Nvidia Greenlight".



Cut-down version? This is very much standalone.  I have no idea of Nvidia's refresh schedule, but there's a chance they'll release the 7xx series as the 6xx refresh.  This card is intended to stand out and not even be considered a 6- or 7-series model.


----------



## hardcore_gamer (Feb 18, 2013)

This card will make me buy..


..an XBOX 720 and a PS4.


----------



## T4C Fantasy (Feb 18, 2013)

D4S4 said:


> that's one big ass die. 700cid bbc big.



I made an HD version of the GK110 chip; this is what it will look like, more or less. 

http://www.techpowerup.com/gpudb/1996/NVIDIA_GeForce_GTX_Titan.html

If the pic still looks like the color die, refresh the page.


----------



## BigMack70 (Feb 18, 2013)

Looks like the paper launch got delayed till tomorrow...


----------



## the54thvoid (Feb 18, 2013)

BigMack70 said:


> Looks like the paper launch got delayed till tomorrow...



Unless the NDA lifts at 8am Pacific Time.  That means 4pm here in the UK.


----------



## symmetrical (Feb 18, 2013)

Holy crap, it looks like a graphics card!


----------



## Prima.Vera (Feb 18, 2013)

Kaynar said:


> ... several titles already need nearly 3 GB of VRAM on one screen today.



Which titles? For 1080p or 1440p? And please don't tell me about the latest mods for Skyrim; I can run it just fine with only 1 GB of VRAM.


----------



## symmetrical (Feb 18, 2013)

zolizoli said:


> Not really. Even in the time of gtx280 it was affordable even at release and it was the fastes single chip for a while.
> I think they turned the GREED ENGINE on since the gtx500 serie and it was just a refreshed 400 serie.
> The 680 is so insanely priced,the GK104 not even designed to be high end. But it had a better performance than expected so why not fool the customers and rip them long as they can and keep yesterdays tech in a shelf and when the customers recover financialy they sell it as futures wonder tech.
> This greedy corporates holding back our technological evolution.



The thing is though, regardless of what something was "meant" to be, if the price is right and people are willing to pay, people are going to pay, and people did pay.

Although I admit I copped my GTX 680 second hand for $400 three months after launch because I didn't want to pay $550+ like I did with my GTX 580.

But at the time, the performance of the GTX 680 compared to a 580 was 1.5 to 1.8x. And like it or not, it was the fastest single-GPU Nvidia card.

I mean, if we lived in a fair, wonderful utopia, then the Titan card would "only" be $500. But we live in reality, and in reality there is a thing called business. And Nvidia's business dictates the card's price will be in the neighborhood of $800+.


----------



## the54thvoid (Feb 18, 2013)

Slides....

http://wccftech.com/nvidia-official...x-titan-gk110-gpu-decimates-single-chip-gpus/






I make that 37 fps for the 680 and 48 for Titan = an 11 fps difference = only ~30% faster than a 680 in this specific game.


----------



## symmetrical (Feb 18, 2013)

Prima.Vera said:


> Which titles? For 1080p or 1440p? And please don't tell me about latest mods for Skyrim. I can run it just fine with 1 GB vRAM only.



The only game that ever allocated the full 2GB on my GTX 680 was the Crysis 3 Beta @1080p which is a rarity in itself.


----------



## Prima.Vera (Feb 18, 2013)

symmetrical said:


> The only game that ever allocated the full 2GB on my GTX 680 was the Crysis 3 Beta @1080p which is a rarity in itself.



Interesting. Haven't played that yet. What AA were you using?


----------



## symmetrical (Feb 18, 2013)

the54thvoid said:


> Slides....
> 
> http://wccftech.com/nvidia-official...x-titan-gk110-gpu-decimates-single-chip-gpus/
> 
> ...



I looked at the other slide and they are comparing 2 GTX 690s (quad SLI) vs GTX Titan's in 3 Way SLI. What a strange comparison.


----------



## symmetrical (Feb 18, 2013)

Prima.Vera said:


> Interesting. Haven't played that yet. What AA were you using?



FXAA and TXAA 4X

Although there was a bug in the beta in which TXAA made foliage look like crap.

But yeah most other games are barely hitting 1GB of VRAM, so I don't know what the other guy is talking about either.


----------



## Prima.Vera (Feb 18, 2013)

symmetrical said:


> FXAA and TXAA 4X
> 
> Although there was a bug in the beta in which TXAA made foliage look like crap.
> 
> But yeah most other games are barely hitting 1GB of VRAM, so I don't know what the other guy is talking about either.



Hmm... why use both? As far as I've read, TXAA is really heavy on performance. Also, isn't there an option to use just _*SMAA*_? I use it on Dead Space 3 and it is by far better than FXAA or MLAA in both performance and quality.
Cheers.


----------



## tastegw (Feb 18, 2013)

the54thvoid said:


> Slides....
> 
> http://wccftech.com/nvidia-official...x-titan-gk110-gpu-decimates-single-chip-gpus/
> 
> ...




"New Features such as GPU Boost 2.0 lets users manually overvolt and overclock their GPU without any restrictions hence achieving better clock speeds through Boost tech while users would be delighted with the new 80 Hz Vsync technology that allows better framrates in the latest gaming titles.

Read more: http://wccftech.com/nvidia-officially-unleashes-geforce-gtx-titan-gk110-gpu-decimates-single-chip-gpus/#ixzz2LGZM014A"

Nice, unlocked voltage


----------



## badtaylorx (Feb 18, 2013)

so... basically this thing is 670 SLI on one chip...

and you can use four of them...

mother of god, this is one fast card

this is as much a war between Titan and the HD 7970 as a war between a man and a mosquito...


----------



## eventide (Feb 18, 2013)

*Video ram usage*

If you max out all the graphics options in Serious Sam 3 at a mere 1920x1080, you will find yourself needing up to 3 GB of VRAM. Proof in the screenshots:





----------



## the54thvoid (Feb 18, 2013)

badtaylorx said:


> so... basically this thing is 670 SLI on one chip...
> 
> and you can use four of them...
> 
> ...



It's not a comparison.  The 7970 released in December 2011.  This is February 2013.  I don't for a second think AMD will be batting an eyelid over this.  I'm quite sure they'd happily concede Titan is the far more powerful card.  

It's going to hurt 690 sales if anything (or boost them).  Given Nvidia's excellent track record at SLI scaling and given the pricing rumours, I'd rather pay less for a higher-performing 690 than a Titan.

This is Nvidia's phallic symbol.  They don't even need to sell it - just make it to say: look what we can do - up yours, AMD and Intel.

If it costs more than a 690, it is not a good proposition.  Not at all.  The 690 is one of the best cards ever made (IMO).  The Titan is the biggest die ever made (especially considering process size).  Titan stands alone as a tech achievement.

Also, referring to your man-versus-mosquito quote - bad call; mosquitos 'carry' malaria, which kills around three-quarters of a million people a year.


----------



## Am* (Feb 18, 2013)

Oh lord, what a turd. Not only is this card over a year late and $400 overpriced, but Nshittia could not even give us an extreme end card that is fully functional (14/15 SMX units? Get the fuck out of here Nvidia, you might as well have called it "GK110 Fail Edition"). The lazy bastards could not even be bothered to make a custom logo for it at the top, (they must have had several crates full of GTX 690 salvaged parts, which they decided to recycle on this card).

If shit stays this overpriced on the green side, I will be going back to AMD next year, even though I can easily afford 2 of these (and was actually considering buying 1 if it had all 2880 fully functional cores, for best consumer end compute performance if nothing else). I'm just waiting for the 8970 to repeat the days of the HD 4870 and rape this turd with 85%-90% performance at less than half the price (can't be that hard, judging by the image above, even with just a die shrink).

What's even MORE hilarious were the rumours that there are "only 10,000" of these broken turds for sale... yeah, and 100,000+ more to come. Good one, Nshittia -- already trying to drum up demand for a card nobody wants or gives a flying shit about.



zolizoli said:


> They do the same as Apple: nice design, low-cost hardware, gigantic marketing = astronomical price.



I couldn't agree more.


----------



## BigMack70 (Feb 18, 2013)

the54thvoid said:


> It's not a comparison.  The 7970 released in December 2011.  This is February 2013.  I don't for a second think AMD will be batting an eyelid over this.  I'm quite sure they'd happily concede Titan is the far more powerful card.
> 
> It's going to hurt 690 sales if anything (or boost them).  Given Nvidia's excellent track record at SLI scaling and given the pricing rumours, I'd rather pay less for a higher-performing 690 than a Titan.
> 
> ...



More or less this. The Titan isn't set to really change anything in the GPU market other than to upset AMD fanboys. It won't compete with any single GPU card because it's way too expensive, and it's not likely to be faster than the dual-GPU options currently available.

As such, this card is for:
1) Someone who really doesn't like the idea of multiple GPUs but who needs that type of performance.
2) Someone wanting to spend ~$2k on a pair of them in SLI for a setup way more awesome than anything currently available.


----------



## OneCool (Feb 18, 2013)

Maybe once the AIBs get it, drop it to 3 GB of VRAM, add custom coolers, etc., the price can come down to, say, $600? Then it would be worth it.

As it stands, I can build an entire gaming rig plus monitor etc. for the price of this Frankenstein :shadedshu



TBH, for that kind of money it doesn't look good to me at all!!


----------



## Prima.Vera (Feb 18, 2013)

eventide said:


> If you max up all graphic options in Serious Sam 3 in a mere 1920x1080, you will find yourself in need of up to 3GB vram.



I see that, and it's kinda strange. On the same game with full details it showed ~650 MB on mine, if I recall... The most I had was in Crysis and BF3 (~925 MB).


----------



## 15th Warlock (Feb 18, 2013)

the54thvoid said:


> It's not a comparison.  The 7970 released in December 2011.  This is February 2013.  I don't for a second think AMD will be batting an eyelid over this.  I'm quite sure they'd happily concede Titan is the far more powerful card.
> 
> It's going to hurt 690 sales if anything (or boost them).  Given Nvidia's excellent track record at SLI scaling and given the pricing rumours, I'd rather pay less for a higher-performing 690 than a Titan.
> 
> ...



I mostly agree with what you say; if nVidia prices this card above the 690, they'll be shooting themselves in the foot. 

I've seen some benchmarks, and this card is around 70% faster than a single 680 in best-case scenarios, though that number could improve with better driver support. Still, it would seem that, barring some cases, a dual 680 setup or perhaps even a single 690 may be able to beat Titan in most instances, at least in properly SLI-supported games (which is the vast majority of current games).

But let's not be hasty and jump to conclusions; remember there were rumors that the 680 was gonna be priced at $699 before it came out, and we all know how that turned out. If nVidia can price this card in the right slot, they may have a winner on their hands.


----------



## Horrux (Feb 18, 2013)

25% faster than a 7970 ghz edition is pretty good. But it's just that - pretty good, especially given that the AMD GPU is over a year old. A 25% performance boost in a year is pretty ho-hum if you ask me.


----------



## erocker (Feb 18, 2013)

Horrux said:


> 25% faster than a 7970 ghz edition is pretty good. But it's just that - pretty good, especially given that the AMD GPU is over a year old. A 25% performance boost in a year is pretty ho-hum if you ask me.



If this is the case, the card isn't worth more than 600 bucks.


----------



## the54thvoid (Feb 18, 2013)

Horrux said:


> 25% faster than a 7970 ghz edition is pretty good. But it's just that - pretty good, especially given that the AMD GPU is over a year old. A 25% performance boost in a year is pretty ho-hum if you ask me.





erocker said:


> If this is the case, the card isn't worth more than 600 bucks.



Bad percentages.  The 7970 has 74% of the performance of the Titan.  That actually makes the Titan 35% faster than the 7970, based on:

Titan 100% = 100 fps
7970 74% = 74 fps
therefore 100 - 74 = 26 fps difference; 26/74 = ~35%.

The price/performance still sucks, though.  If it's twice the price of a 7970 GHz, it's losing big time.  IMO, 680s should have dropped in price to match 7970s, and that would let Titan roll in at about 50% more expensive.

But we don't know all for sure yet......
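The percentage flip above trips people up, so here is the same arithmetic as a quick sketch. The 100 vs 74 fps figures are just illustrative round numbers read off Nvidia's slide, not measured results:

```python
# Two different percentages describe the same gap, depending on the baseline.
titan_fps = 100.0
radeon_fps = 74.0

# Baseline = Titan: the 7970 is 26% slower.
deficit = (titan_fps - radeon_fps) / titan_fps

# Baseline = 7970: the Titan is ~35% faster.
lead = (titan_fps - radeon_fps) / radeon_fps

print(round(deficit * 100))  # 26
print(round(lead * 100))     # 35
```

Same 26 fps gap either way; only the denominator changes.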


----------



## bpgt64 (Feb 18, 2013)

I love the people posting in this thread complaining about price.  Nvidia and AMD are FOR-PROFIT companies.  Don't like the price? Don't buy it.  The only company that appears to care about your feelings is one that's spent money on advertising to create that image.

Companies place products at every price point they think they can make money at.  Why does that seem to upset people so?

The advantage of this card is that it's the first single-GPU card that might be able to handle 1440/1600p without issues.  I am rocking twin GTX 670s, and there are just some games that are not optimized for SLI at all.


----------



## tastegw (Feb 18, 2013)

Y'all looking at a chart for 1920x1200 and complaining,  just wait for real reviews with real benchmarks.

I bet both the Titan and the 7970 would be close at 800x600


----------



## erocker (Feb 18, 2013)

I don't see anyone upset. I see people having a conversation about the card, its performance, and how price relates to it.



tastegw said:


> I bet both the Titan and the 7970 would be close at 800x600



hahahaha!!


----------



## BigMack70 (Feb 18, 2013)

bpgt64 said:


> I love the people posting in this thread complaining about price.  Nvidia and AMD are FOR-PROFIT companies.  Don't like the price? Don't buy it.  The only company that appears to care about your feelings is one that's spent money on advertising to create that image.
> 
> Companies place products at every price point they think they can make money at.  Why does that seem to upset people so?
> 
> The advantage of this card is that it's the first single-GPU card that might be able to handle 1440/1600p without issues.  I am rocking twin GTX 670s, and there are just some games that are not optimized for SLI at all.



Pointing out that a card performing 35% above a 680/7970 is selling at a price point 100% above a 680/7970 isn't complaining; it's pointing out that this card has next to zero value.

Also, 680/7970 +35% performance isn't going to change the 1440/1600p landscape all that much. Dual-card configs will still be preferable by far for anyone wanting to push 60 fps at max settings in all the most demanding games.

Where I can see this card having value is SLI for triple-monitor setups.


----------



## HumanSmoke (Feb 18, 2013)

erocker said:


> I don't see anyone upset. I see people having a conversation about the card, it's performance and how price relates to it.


Well, that's easy to sum up: the more extreme the price and performance, the lower the performance per dollar, and the higher the likelihood that buyers will 1. have a different perspective on the price (and likely more disposable income), and 2. have a different perspective on an upper price limit for their hobby/passion.
Cases in point: any tri/quad-GPU gaming setup, a $1600 Asus Ares II, sub-zero/refrigerated/bespoke water cooling, etc., etc.


----------



## Crap Daddy (Feb 18, 2013)

That chart still gives hope. If in real-game scenarios it's anywhere between 35 and 50% better than a 680 at lower clocks, and if overclocking is fully allowed, we might see this expensive beast reaching 690 levels. The catch with the price is obvious: limited supply and an overkill 6 GB of VRAM = luxury product. Price/perf is not what NV is after; it's rather a premium brand statement. Take it or leave it. There is simply no better single-GPU card (by a considerable margin) in existence.


----------



## Delta6326 (Feb 18, 2013)

I will wait for W1zz's review, but I'm hoping the Titan is only about 10-25% slower than the 690 (well, it would be sweet if it was as fast...).

But my big question is: when the Titan comes out and then the GTX 780, is the 780 supposed to be faster or slower than the Titan? If it's slower, then we don't have much to look forward to, but if it's going to be faster, that will be sweet.


----------



## d1nky (Feb 18, 2013)

What's the point in having the best, when the fun is in tweaking, trying to improve your performance, and spending that money on, errm, girls or maybe drugs lol


----------



## HumanSmoke (Feb 18, 2013)

Crap Daddy said:


> That chart still gives hope. If in real-game scenarios it's anywhere between 35 and 50% better than a 680 at lower clocks, and if overclocking is fully allowed, we might see this expensive beast reaching 690 levels.


I took from the leaks so far that the Titan would support overvolting based upon thermal dissipation, which could add substantially to the OC potential. It might just explain W1ZZ's teaser











[source]

EDIT: Same info that tastegw linked to, tho' EH seems to be the original source material for WCCF (quelle surprise)


----------



## Horrux (Feb 18, 2013)

the54thvoid said:


> Bad percentages.  The 7970 has 74% the performance of the Titan.  It actually makes the Titan, 35% faster than the 7970, based on...
> 
> Titan 100% = 100fps
> 7970 74% = 74fps
> ...


You got your data from the slide?

Yeah, me too, except I took into account that it's an nVidia slide, which is bound to make the Titan look relatively good. They aren't using a representative bunch of benchmarks, they're using those that make the Titan look good. So, I decided to take the scoring that makes it look the least good off that slide, given its bias, in an un-scientific attempt to un-bias things.

It's all guesswork at this point anyway, right?

So I stand by my 25% number, with the annotation that it provides 25% more FPS. Doesn't mean that the card is only 25% faster, though.


----------



## bpgt64 (Feb 18, 2013)

No one buying this card is going to use it at 1920x1200.  If they are, they should kick themselves in the head.  The only reason to buy this card over a GTX 680 is for 2560x1600/1440 resolutions.


----------



## Horrux (Feb 18, 2013)

bpgt64 said:


> No one buying this card is going to use it at 1900x1200.  If they are, they should kick themselves in the head.  The only reason to buy this card, over a GTX 680 is for 2560x1600/1440 resolutions.



You think this card can max out games at even 1080p at 120fps?

I doubt it will do so for MOST games. Some, sure, but not most. You'd still need 2 of 'em.

Unless you are one of those people still using 60Hz dinosaur monitors.


----------



## radrok (Feb 18, 2013)

Horrux said:


> You think this card can max out games at 1920x1200 at 120fps?
> 
> I doubt it will do so for MOST games. Some, sure, but not most.
> 
> Unless you are one of those people still using 60hz dinosaur-monitors.



Give me 60 constant FPS at 2560x1600, then maybe we'll talk about 120Hz.
I played with a 120Hz monitor and had to switch back INSTANTLY to my 1600p; once you are "baptized" by higher resolutions you'll gladly sacrifice 120Hz.


----------



## the54thvoid (Feb 18, 2013)

If Titan turns out to be 50% better than the GTX 680 and is priced at twice its selling price then:

a) when the 780 releases it will not be 50% faster than the 680 (IMO) and will be priced substantially lower than Titan.  This creates a selling hierarchy whereby the series-agnostic Titan can transcend product lines, and

b) do you think there'll be that many Titans in retail to make it an issue?

But I'm no psychic and I could be completely wrong; it just doesn't make sense to release a product like Titan in any meaningful number if it will stop the next series' flagship from being a flagship.
Likewise, if the 780 is better than Titan, folk that paid the big bucks for it (if it does cost the rumoured big bucks) will be furiously pissed off.  I can't see the 780 being launched at a substantially higher price point than the 680 was.
There's a marketing ceiling for non-limited series-topping products.  I see $500-$600 being that ceiling.


----------



## radrok (Feb 18, 2013)

Well, we are not sure about the price; it could very well be lower than what speculation or preorder sites told us.

The only way to know is to wait until the NDA lifts and get the proper MSRP from Nvidia.

Anyway, I am pretty sure this GPU will be worth its weight in gold when put under water. I mean, look at all the 28 nm GPUs: they are insane overclockers, and if Nvidia really gives us unlocked voltages (which I hope for) we could very well be near 690 performance with an OC.

I fear the OCed power consumption though


----------



## the54thvoid (Feb 18, 2013)

Yeah, if it's overclockable, unexplodable and affordable, I'll be getting one.  If it's really affordable, two.


----------



## radrok (Feb 18, 2013)

the54thvoid said:


> unexplodable


----------



## Nordic (Feb 18, 2013)

radrok said:


> Well we are not sure about the price, it could very well be lower than what speculation or preorder sites told us.
> 
> The only way to know is to wait when the NDA lifts and get the proper MSRP from Nvidia.
> 
> ...



It would be funny if they locked the voltages


----------



## jihadjoe (Feb 18, 2013)

Don't forget how Nvidia dropped the 8800 GT, offering 90% of the performance of the super-pricey 8800 GTX while costing less than half as much.



zolizoli said:


> Not really. Even at the time of the GTX 280 it was affordable at release, and it was the fastest single chip for a while.
> I think they turned the GREED ENGINE on with the GTX 500 series, and that was just a refreshed 400 series.
> The 680 is so insanely priced, and the GK104 wasn't even designed to be high end. But it had better performance than expected, so why not fool the customers and rip them off as long as they can, keep yesterday's tech on a shelf, and when the customers recover financially sell it as the future's wonder tech.
> These greedy corporates are holding back our technological evolution.


----------



## radrok (Feb 18, 2013)

james888 said:


> It would be funny if they locked the voltages



That would be evil. Still, Nvidia is famous for supplying barely sufficient VRMs, so it could be a possibility.


----------



## Delta6326 (Feb 18, 2013)

the54thvoid said:


> Yeah, if it's overclockable, unexplodable and affordable, I'll be getting one.  If it's really affordable, two.



I will take one of your 7970's 













<---- System Specs


----------



## Finners (Feb 18, 2013)

me first delta!


----------



## Crowned Clown (Feb 18, 2013)

Are they going to cripple it again so it'll become useless for 3ds Max Design GPU-accelerated rendering? So that I'll be forced to buy their super uber premium stupendously expensive Quadro 6000 video card. 

My 680 can't even use Quicksilver and mental ray; it's slow in OpenGL apps like Google SketchUp as well. I wonder if this one is too.


----------



## 3lfk1ng (Feb 18, 2013)

radrok said:


> Give me 60 constant FPS at 2560x1600, then maybe we'll talk about 120hz.
> Played with a 120hz monitor, had to switch back INSTANTLY to my 1600p, once you are "baptized" by higher resolutions you'll gladly sacrifice 120hz.



Or you can get the best of both worlds with a 27" Overlord Tempest (X270OC) 2560x1440 @ 120Hz.


----------



## Nordic (Feb 18, 2013)

3lfk1ng said:


> Or you can get the best of both worlds with a 27" Overlord Tempest (X270OC) 2560x1440 @ 120Hz.



Don't those have a short life expectancy? That would be a monitor that would need a Titan.

Also, I just had a humorous thought. The 8000 series is named after the planets, right? And the planets were named after Greek or Roman gods.


----------



## radrok (Feb 18, 2013)

3lfk1ng said:


> Or you can get the best of both worlds with a 27" Overlord Tempest (X270OC) 2560x1440 @ 120Hz.



As far as I know the 120Hz refresh rate isn't guaranteed, right?


----------



## 3lfk1ng (Feb 19, 2013)

@james888
Yeah, you might only get 8 years of use instead of 12 years. Similar to overclocking your processor: it certainly won't make it last any longer, but the results are worth it. The panel is an LG S-IPS panel, the exact panel found in the color-accurate $999 Apple Cinema Display, for _just_ $529.99. It's currently the only 120Hz panel that comes with a warranty.

They will have new stock in come March.

@radrok
Correct. Some users have been able to hit 130Hz, others only as high as 115Hz. Mine topped out at 122Hz but I keep it at 120Hz.


----------



## NeoXF (Feb 19, 2013)

So even after nVidia published some benchmarks, most of you guys still expect some end-all do-all graphics card? Really?! It's barely 30% faster than the GTX 680, or 25% faster than the R7970GE, by nVidia's own dodgy charts, for crying out loud. ASUS Matrix Platinum 7970s that can clock to 1300-1325MHz are probably gonna be faster than the Titan in a lot of scenarios (like Metro VH/DoF/MSAA)...


----------



## radrok (Feb 19, 2013)

NeoXF said:


> So even after nVidia published some benchmarks most of you guys still expect some end-all do-all graphics card? Really?! It's barely 30% faster than GTX 680 or 25% faster than R7970GE by nVidia's own dodgey charts for crying out loud. ASUS Matrix Platinum 7970s that can clock to 1300-1325MHz, are probably gonna be faster the Titan in a lot of scenarios (like Metro VH/DoF/MSAA)...



30% more than the GTX 680 is fine for me, I'm due for an upgrade anyway.


----------



## T4C Fantasy (Feb 19, 2013)

radrok said:


> 30% more than GTX 680 is fine for me, I'm due to an upgrade anyway.



You have a thousand dollars to spend on a GPU?


----------



## THE_EGG (Feb 19, 2013)

Sexy beast!


----------



## tastegw (Feb 19, 2013)

NeoXF said:


> So even after nVidia published some benchmarks most of you guys still expect some end-all do-all graphics card? Really?! It's barely 30% faster than GTX 680 or 25% faster than R7970GE by nVidia's own dodgey charts for crying out loud. ASUS Matrix Platinum 7970s that can clock to 1300-1325MHz, are probably gonna be faster the Titan in a lot of scenarios (like Metro VH/DoF/MSAA)...



That's at 1920x1200.

Wait for higher-res benchmarks for a better comparison.


----------



## [H]@RD5TUFF (Feb 19, 2013)

Can't wait to get my hands on one of these!


----------



## ThunderStorm (Feb 19, 2013)

Oh snap!!
This card costs as much as 1 university credit!


----------



## HumanSmoke (Feb 19, 2013)

NeoXF said:


> So even after nVidia published some benchmarks most of you guys still expect some end-all do-all graphics card? Really?! It's barely 30% faster than GTX 680 or 25% faster than R7970GE by nVidia's own dodgey charts for crying out loud. ASUS Matrix Platinum 7970s that can clock to 1300-1325MHz, are probably *gonna be faster the Titan in a lot of scenarios* (like Metro VH/DoF/MSAA)...


You mean the Matrix Platinum that seems to have been discontinued?

I'd think you are being overly optimistic in your evaluation... or do you think it unlikely that Nvidia might have conceivably tested the Titan against currently available competition and tailored clocks/power to make sure that the board does what it is intended to do?


----------



## BigMack70 (Feb 19, 2013)

NeoXF said:


> So even after nVidia published some benchmarks most of you guys still expect some end-all do-all graphics card? Really?! It's barely 30% faster than GTX 680 or 25% faster than R7970GE by nVidia's own dodgey charts for crying out loud. ASUS Matrix Platinum 7970s that can clock to 1300-1325MHz, are probably gonna be faster the Titan in a lot of scenarios (like Metro VH/DoF/MSAA)...



It's always kinda dumb to compare an overclocked card to a stock one. The Titan card will overclock at least somewhat, and only then can you compare a manually overclocked 7970 to a (manually OC'd) Titan. While I agree that what Nvidia is bringing to the table isn't particularly impressive (according to the info we have now), the reason has more to do with price than anything else.

Not many 7970s, even Matrix ones, will hit 1300 MHz on air.


----------



## Fluffmeister (Feb 19, 2013)

NeoXF said:


> So even after nVidia published some benchmarks most of you guys still expect some end-all do-all graphics card? Really?! It's barely 30% faster than GTX 680 or 25% faster than R7970GE by nVidia's own dodgey charts for crying out loud. ASUS Matrix Platinum 7970s that can clock to 1300-1325MHz, are probably gonna be faster the Titan in a lot of scenarios (like Metro VH/DoF/MSAA)...


----------



## xenocide (Feb 19, 2013)

jihadjoe said:


> Dont forget how Nvidia dropped the 8800GT, offering 90% of the performance of the super-pricey 8800GTX while costing less than half as much.



Or the fact that the launch price of Nvidia's top-end GPUs has been on the decline since then:

Nvidia launch prices:
GTX 280 - $650
GTX 480 - $500
GTX 580 - $500
GTX 680 - $499

Granted, not a huge drop towards the end, but I think $500 for the pinnacle of GPU performance is something of a sweet spot.



Crowned Clown said:


> Are they going to cripple it again so it'll become useless for 3ds max design gpu accelerated rendering? so that I'll be forced to buy their super uber premium stupendously expensive quadro 6000 video card.
> 
> My 680 cant even use quicksilver and mental ray; its slow in opengl apps like google sketchup as well. I wonder if this one to is.



Nope.  The GK110 should be a revision of the GK100, which was designed as the successor to the GTX 580's GF110.  That means all the compute functionality that was stripped out of the GTX 680 is back, and presumably better than ever.



BigMack70 said:


> It's always kinda dumb to compare an overclocked card to a stock one. The Titan card will overclock at least somewhat, and only then can you compare a manually overclocked 7970 to a (manually OC'd) Titan. While I agree that what Nvidia is bringing to the table isn't particularly impressive (according to the info we have now), the reason has more to do with price than anything else.
> 
> Not many 7970s, even Matrix ones, will hit 1300 MHz on air.



If I recall, didn't the HD7970 launch with clocks similar to what is reported for Titan?  Around 800-850 MHz?  Who's to say the Titan won't overclock at least as well?  GK104 and Tahiti have had no problems being stretched upwards of 30% for retail cards...


----------



## Cuzza (Feb 19, 2013)

Dude, you can't pair four of something. A pair is two.


----------



## BigMack70 (Feb 19, 2013)

xenocide said:


> If I recall, didn't the HD7970 launch with clocks similar to what is reported for Titan?  Around 800-850 MHz?  Who's to say the Titan won't overclock at least as well?  GK104 and Tahiti have had no problems being stretched upwards of 30% for retail cards...



The 7970 launched at 925 MHz and the average overclock is around 1200 MHz, which is crazy.

Average percentage OCs are a bit lower for the GTX 670/680 in comparison. 

I am curious to see how this Titan card OCs. I'm glad they're supposedly restoring real voltage control to the card, but with the chip being so large (7.1B transistors!), I kind of doubt that it's going to OC much more than 10-15%. 

The big average overclocks seen from the 7950 and 7970 are somewhat uncommon for high-end GPUs.
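For anyone wanting to sanity-check the headroom figures being thrown around, a tiny Python sketch (the 925 MHz / 1200 MHz 7970 clocks are from the post above; the 837 MHz Titan base clock is from the rumored spec list, and the 15% figure is just the speculated ceiling, not a confirmed number):

```python
# Overclock headroom as a percentage over stock clock.
def oc_headroom(stock_mhz: float, oc_mhz: float) -> float:
    return (oc_mhz / stock_mhz - 1) * 100

# HD 7970: 925 MHz stock, ~1200 MHz typical manual OC (per the thread).
print(f"HD 7970 headroom: {oc_headroom(925, 1200):.0f}%")   # ~30%

# Rumored Titan: 837 MHz base; if it only manages the speculated +15%:
print(f"Titan at +15%: {837 * 1.15:.0f} MHz")               # ~963 MHz
```

So a ~30% bump on Tahiti versus a speculated 10-15% on GK110 is the gap being argued about here.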


----------



## NeoXF (Feb 19, 2013)

OK, I'm done in this thread. Had enough of fanboys and people with pockets deeper than their intellect/sanity for one day. Enjoy your crummy stillborn stopgap GPU.

And for the record, when I saw ocholic.ch's "review" of it, I was absolutely ecstatic about maybe getting one during the summer... even though it smelled like complete bullshit... Thanks, nVidia, for confirming that... no GPUs to look forward to anytime soon, then...


----------



## xorbe (Feb 19, 2013)

~850MHz?  What'll it do under water @ 1150 and 400 watts ...


----------



## tastegw (Feb 19, 2013)

NeoXF said:


> OK, I'm done in this thread. Had enough of fanboys or people with pockets deeper then their intellect/sanity well for one day. Enjoy your crummy stillborn stopgap GPU.
> 
> And for the record, when I saw ocholic.ch's "review" of it, I was absolutely ecstatic about getting one during the summer maybe... even tho they smelled like complete bullshit... Thanks nVidia for confirming that... no GPUs to look forward to anytime soon then...


----------



## eidairaman1 (Feb 19, 2013)

Crowned Clown said:


> Are they going to cripple it again so it'll become useless for 3ds max design gpu accelerated rendering? so that I'll be forced to buy their super uber premium stupendously expensive quadro 6000 video card.
> 
> My 680 cant even use quicksilver and mental ray; its slow in opengl apps like google sketchup as well. I wonder if this one to is.



For professional graphics it's all in how the driver is set up; a desktop driver is not the same as a professional driver. At least back in the day, some GeForce and Radeon cards could be converted over to the Quadro and FireGL series via a BIOS flash and the proper drivers for those cards.


----------



## valentyn0 (Feb 19, 2013)

NeoXF said:


> OK, I'm done in this thread. Had enough of fanboys or people with pockets deeper then their intellect/sanity well for one day. Enjoy your crummy stillborn stopgap GPU.
> 
> And for the record, when I saw ocholic.ch's "review" of it, I was absolutely ecstatic about getting one during the summer maybe... even tho they smelled like complete bullshit... Thanks nVidia for confirming that... no GPUs to look forward to anytime soon then...



That's ironic, considering I was about to tell you earlier to chill out, AMD fanboy!


----------



## tacosRcool (Feb 19, 2013)

Looks like their Tesla series of GPUs


----------



## Crowned Clown (Feb 19, 2013)

xenocide said:


> Nope.  The GK110 should be a revision of the GK100, which was the successor to the GTX580.  That means all that compute functionality that was stripped away from GTX680 is back and presumably better than ever.



Need some reviews before I'd jump on a $1K price-tagged GPU. I wanna see how it performs in 3D rendering.



eidairaman1 said:


> for professional graphics its all how the driver is setup a desktop driver is not the same as a professional driver, at least back in the day some GF and Radeon Cards could be converted over to the Quadro and Fire GL Series via a bios flash and use of the proper drivers for those cards



Never knew that, so I did some experimenting and bought a low-profile pro GPU: a $130 Quadro 410 that performs 10x better in SketchUp than my 680.


----------



## brandonwh64 (Feb 19, 2013)

Fan boi this fan boi that. I love these threads.


----------



## Horrux (Feb 20, 2013)

eidairaman1 said:


> for professional graphics its all how the driver is setup a desktop driver is not the same as a professional driver, at least back in the day some GF and Radeon Cards could be converted over to the Quadro and Fire GL Series via a bios flash and use of the proper drivers for those cards



I thought it had more to do with the silicon's error-testing process, which (I thought) was long and involved for professional workstation GPUs and rudimentary for gaming GPUs. The logic being that you really need error-free graphics computation in the pro market but not for gaming.

Actually, I'm fairly certain this is still the case.  Converting a gaming GPU to a "workstation" GPU through a BIOS flash will make it appear to be a workstation GPU, but it doesn't mean it will have the same freedom from computation errors...


----------



## Xzibit (Feb 20, 2013)

Cuzza said:


> Dude, you can't pair four of something. A pair is two.



I beg to differ.

Boobs 

If I have 1 girl I get 2 or X-Fire/SLI.
If I have a pair of girls I have 4 or Quad-Fire/Quad-SLI


----------



## eidairaman1 (Feb 20, 2013)

Horrux said:


> I thought it had more to do with the silicon's error testing process, which (I thought) was long and involved for professional workstation GPUs and rudimentary for gaming GPUs. The logic being that you really need error free graphics computation in the pro market but not for gaming.
> 
> Actually, I'm fairly certain this is still the case.  Converting a gaming GPU to a "workstation" GPU through a bios flash will make a gaming GPU appear to be a workstation GPU, but it doesn't mean it will have the same lack of computation error...




Have you ever compared specs between certain cards? They are exactly the same; on the professional side it's all about the drivers and software.


----------



## johnspack (Feb 20, 2013)

I want four of these,  and 2 of them I'll break up and smoke for 3 months straight.....


----------



## HumanSmoke (Feb 20, 2013)

eidairaman1 said:


> you ever compared specs between certain cards, they are exactly the same, on the professional side its all about those drivers and software


Not entirely. Both you and Horrux are correct.
Pro cards like the Tesla have a much more rigorous validation procedure, which goes far beyond crafting UMDs (User Mode Drivers) and in-place ongoing test evaluation. While the GPUs are binned for voltage and usable logic blocks, you'd find that a finer-scaled binning is also used to test the integrity of the logic blocks that are functional. I don't think the binning of pro GPUs differs a great deal from that of pro CPUs like Xeon and Opteron in that respect.


----------



## Horrux (Feb 20, 2013)

eidairaman1 said:


> you ever compared specs between certain cards, they are exactly the same, on the professional side its all about those drivers and software



I know my shiz bro. I come across as humble, because I am, but I still know tech.


----------



## eidairaman1 (Feb 21, 2013)

HumanSmoke said:


> Not entirely. Both you and Horrux are correct.
> Pro cards like the Tesla have a much more rigorous validation procedure which goes far beyond crafting UMD's (User Mode Drivers) and in-place ongoing test evaluation, so while the GPUs are binned for voltage and usable logic blocks, you'd find that a more fine scaled binning is also being used to test the integrity of the logic blocks that are functional. I don't think the binning of pro GPUs differs a great deal from pro CPUs like Xeon and Opteron in that respect.



Ya, I wouldn't doubt it. I did research on the 5870 and its FireGL/FireStream counterpart; the only things changed were the RAM capacity and clock speeds.



Horrux said:


> I know my shiz bro. I come across as humble, because I am, but I still know tech.



ok cool dude


----------

