# GeForce GTX 780 Ti Pictured in the Flesh



## btarunr (Nov 2, 2013)

Shortly after its specifications sheet leaked, pictures of a reference GeForce GTX 780 Ti (which aren't renders or press shots) surfaced on ChipHell Forums. The pictures reveal a board design that's practically identical to the GTX TITAN and GTX 780, with a "GTX 780 Ti" marking on the cooler. The folks over at ChipHell Forums also posted five sets of benchmark results, covering various 3DMark tests, Unigine Valley, Aliens vs. Predator 3, Battlefield 3, and BioShock Infinite, on a test bed running a Core i7-4960X at 4.50 GHz with 16 GB of quad-channel DDR3-2933 memory. Given its specifications, it comes as no surprise that the GTX 780 Ti beats both the GTX TITAN and the R9 290X, and goes on to offer performance on par with dual-GPU cards such as the GTX 690 and HD 7990. For a single-GPU card, that's a great feat.




The benchmark results from ChipHell's run follow.








*View at TechPowerUp Main Site*


----------



## TheoneandonlyMrK (Nov 2, 2013)

Well, that's unrealistic bollocks then, just my opinion.
Two-GPU-card beater, my arse, and if so, feck ya wallet, you'll need the credit card for another grand slapping.

I did note the CPU in these benches getting spanked too


----------



## xvi (Nov 2, 2013)

Interesting. I take it the Titan is now obsolete? (..or will this card cost ~$1,200?)


----------



## the54thvoid (Nov 2, 2013)

theoneandonlymrk said:


> *Well, that's unrealistic bollocks* then, just my opinion.
> Two-GPU-card beater, my arse, and if so, feck ya wallet, you'll need the credit card for another grand slapping.
> 
> I did note the CPU in these benches getting spanked too



Not so sure, look at the power consumption - it's quite a bit higher compared to the Titan and 290X.  The Titan is what, 40-50 watts less than the 290X? But this 780 Ti draws quite a bit more than the 290X in some cases - that would tie in with those figures.

Essentially, Nvidia (if these benchies are real) just abandoned the power efficiency model to get a higher 'brute force' player.

Still, nothing really new here - we can't call it until the reviews come in.



xvi said:


> Interesting. I take it the Titan is now obsolete? (..or will this card cost ~$1,200?)



I thought it had been priced already at $750?


----------



## MxPhenom 216 (Nov 2, 2013)

the54thvoid said:


> not so sure, look at the power consumption - it's quite a bit higher compared to titan and 290x.  Titan is what, 40-50 watts less than 290x but this 780ti is a higher draw than the 290x by quite a bit in some cases - that would tie with those figures.
> 
> Essentially, nvidia (if these benchies are real) just abandoned the power efficiency model to get a higher 'brute force' player.
> 
> ...



$699


----------



## Slomo4shO (Nov 2, 2013)

I wonder how much overclocking headroom this card will have, considering the amount of power it's drawing at reference clocks... Also, is that total system power? The card is rumored to have only 6+8-pin connectors, so I'm unsure how it would draw near 500W on its own.


----------



## the54thvoid (Nov 2, 2013)

MxPhenom 216 said:


> $699



Thanks, dunno where I got that number from 

What follows is conjecture based on the given graphs, not proof of existence...

Incidentally, red lines are fps, blue lines are power draw.  It looks a lot faster than the 290X but also consumes 10-15% more power, which is probably about 20% more than a Titan - which could be about 80-100 watts? 

But hey we've learned recently power draw doesn't matter.


----------



## Slomo4shO (Nov 2, 2013)

the54thvoid said:


> It looks a lot faster than 290X but also consumes 10-15% more power which is probably about 20% more than a Titan, which could be about 80-100 watts?



Take a look at the 4K benchmarks. Seems the card is limited by memory bandwidth...


----------



## the54thvoid (Nov 2, 2013)

Slomo4shO said:


> Take a look at the 4K benchmarks. Seems the card is limited by memory bandwidth...



Single card 4k is pointless tbh.  Need dualies for that.


----------



## Xzibit (Nov 2, 2013)

the54thvoid said:


> Not so sure, look at the power consumption - it's quite a bit higher compared to Titan and 290X.  Titan is what, 40-50 watts less than 290X but this 780Ti is a higher draw than the 290X by quite a bit in some cases - that would tie with those figures.
> 
> Essentially, Nvidia (if these benchies are real) just abandoned the power efficiency model to get a higher 'brute force' player.



Hopefully it doesn't run hot or sound like a turbine... 

Reference 780 & Titan already hitting 81C.

Next week is going to be interesting.

Remember when W1zzard did this.


----------



## qubit (Nov 2, 2013)

This is the card to have. Hope it's priced reasonably - reasonably for nvidia that is.


----------



## the54thvoid (Nov 2, 2013)

Xzibit said:


> Hopefully it doesn't run hot or sound like a turbine...



Well, you've got a point there.  That power, if true, will need cooling.....


----------



## Steevo (Nov 2, 2013)

qubit said:


> This is the card to have. Hope it's priced reasonably - reasonably for nvidia that is.



So you would willingly pay more for a card/chip that's been squeezed to its limit of heat and power consumption and can never be pushed further, rather than buy a cheaper chip/card that has obvious performance to be had from the mods most will do anyway, and that will still beat it in the end?


----------



## Am* (Nov 2, 2013)

Not bad. The graphs are laid out in an annoying format though; the 290X and 780 Ti should be side by side, with everything else away from them.


----------



## xvi (Nov 2, 2013)

Am* said:


> Not bad. The graphs are laid out in an annoying format though; the 290X and 780 Ti should be side by side, with everything else away from them.



While I agree, I think we're just lucky to have a benchmark leak.


----------



## DarkOCean (Nov 2, 2013)

Nice power consumption 
gtx 480 was nothing compared to this.
anybody selling a nuclear reactor?


----------



## xvi (Nov 2, 2013)

DarkOCean said:


> Nice power consumption
> gtx 480 was nothing compared to this.
> anybody selling a nuclear reactor?
> http://www.techpowerup.com/img/13-11-02/4c.jpg



Of all the benchmarks, Futuremark is the worst case for the 780 Ti. It's more reasonable in other tests, but it's true that the 780 Ti using that much power is quite scary.

Perhaps nVidia is ignoring power limits for known benchmarks to bolster performance/reviews?


----------



## Am* (Nov 2, 2013)

DarkOCean said:


> Nice power consumption
> gtx 480 was nothing compared to this.
> anybody selling a nuclear reactor?
> http://www.techpowerup.com/img/13-11-02/4c.jpg



I've been saying this for ages -- Kepler at full power is not much more power-efficient than Fermi/GCN without the gimping/throttling or BIOS trickery that Nvidia has been playing at these past two years. So long as this GTX 780 Ti has decent working compute performance or extra overclocking headroom/higher clock stability, some extra power consumption is a trade-off I'll be willing to accept.


----------



## RCoon (Nov 2, 2013)

Looks like both AMD and NVidia are sucking the balls lately. Neither of these cards are must buys in my mind. Nvidia is overpriced, and AMD makes plasma heated Dyson vacuums.


----------



## qubit (Nov 2, 2013)

Steevo said:


> So you would willingly pay more for a card/chip to be squeezed to its limit of heat and power consumption and not ever be able to push it further, than to buy a cheaper chip/card that has obvious performance to be had with mods most will do any way that will still beat it in the end?



I want a card that runs fast and (relatively) quiet like NVIDIA cards do at stock, with a decent driver control panel and more or less free of annoying driver bugs, especially with SLI. I'll bet the 780 Ti delivers on this in spades. Reviews will tell us for sure, so let's not argue about it. And I'll bet wizzy has already played with it and knows the answer.  Damn you NDA! 

Does it help if I tell you that in my opinion, all high end GPUs from either brand are stressed out enough as it is with all that heat meaning that I don't overclock them, even if the headroom is there? Therefore, getting more out of it than stock isn't something I look for in a graphics card.

EDIT: I really like the unique features that NVIDIA deliver with their graphics cards like 3D Vision, LightBoost and now G-Sync.


----------



## DF is BUSY (Nov 2, 2013)

RCoon said:


> Looks like both AMD and NVidia are sucking the balls lately. Neither of these cards are must buys in my mind. Nvidia is overpriced, and AMD* makes plasma heated Dyson vacuums.*


----------



## Am* (Nov 2, 2013)

RCoon said:


> Looks like both AMD and NVidia are sucking the balls lately. Neither of these cards are must buys in my mind. Nvidia is overpriced, and AMD makes plasma heated Dyson vacuums.



Don't forget Nvidia's greatest hits too :


----------



## ShurikN (Nov 3, 2013)

Hey TPU, when are you going to test a watercooled 290X? Just to see how fast it really is. And then compare it to the 780 Ti (when it releases).


----------



## the54thvoid (Nov 3, 2013)

ShurikN said:


> Hey TPU, when are you going to test a watercooled 290X? Just to see how fast it really is. And then compare it to the 780 Ti (when it releases).



Unlikely to be a formal test.

W1zzard generally doesn't review water-blocked cards.  And it would have to be an AIB card, not a custom-cooled one.  However, they'll start to appear in the benchmark threads - keep your eyes peeled.


----------



## MxPhenom 216 (Nov 3, 2013)

Steevo said:


> So you would willingly pay more for a card/chip to be squeezed to its limit of heat and power consumption and not ever be able to push it further, than to buy a cheaper chip/card that has obvious performance to be had with mods most will do any way that will still beat it in the end?



So do you have a GTX 780 Ti already and can confirm that you are unable to push it further? :shadedshu

I'll wait to see what Wizz says in terms of overclocking before making any statements on it. We might all be surprised.


----------



## Raptorpowa (Nov 3, 2013)

Whoever wins the crown, I will go with the fastest one and treat it really well with the pedestal addition to my SM8. Not only that, I pre-ordered the RIVE Black Edition to replace the old fart i7 920, so this GPU will be happy with its new home. The three HD 7950s will stay with the i7 920 in a new case, prolly a Corsair 750...


----------



## ensabrenoir (Nov 3, 2013)

*It's the law of the Techno-Jungle... baby*

.......wow, all this science, math, number crunching..... balderdash!!! You got the fastest GPU? You claim the right to whatever price you want! It's the Law!


----------



## 1d10t (Nov 3, 2013)

So these early leaks show nVidia "naturally" 4-5% faster than the R9 290X while requiring 12-15% more power. Now we need that guy who always bashes the R9 290X to claim this card has audible noise at max load


----------



## SIGSEGV (Nov 3, 2013)

1d10t said:


> So these early leaks show nVidia "naturally" 4-5% faster than the R9 290X while requiring 12-15% more power. Now we need that guy who always bashes the R9 290X to claim this card has audible noise at max load



Let's not forget that this is a $699 card and its reference cooler is kind of shiny and elegant. I'm so sure that cooler is able to cool this card better than the 290X's reference cooler ($549)  /sarcasm.
I love the way nvidia milks the cash cow


----------



## radrok (Nov 3, 2013)

This card needs two eight-pin power connectors and at least eight power phases just for the core. 

And lol to the people saying this has no overclocking headroom - 2688-CUDA-core Titans can reach 1300/1400 MHz core with 1.3 V. 

Wouldn't be surprised to see 1500 MHz core at 1.5 V on Classifieds with this chip.


----------



## ShurikN (Nov 3, 2013)

radrok said:


> Wouldn't be surprised to see 1500 mhz core on *1.5v* classifieds with this chip.


Yea... good luck with the power bill...


----------



## SIGSEGV (Nov 3, 2013)

ShurikN said:


> Yea... good luck with the power bill...



they don't care


----------



## Suka (Nov 3, 2013)

The power figures of the 780 Ti make the 290X look good now. The guys who complained about its power consumption will be like (fill in your thoughts here). Assuming all this is true.


----------



## 1d10t (Nov 3, 2013)

SIGSEGV said:


> let's don't forget that this is 699$ card and their reference design cooler is kind of shiny and elegant. I'm so sure that cooler is able to cool this card better than 290x reference (549$)  /sarcasm.
> I love how the way nvidia milking the cash cow



Oh, don't forget their features and proprietary stuff... $699 is nothing for such a fancy cooler that trades 2 dB bla... bla, two times louder bla... bla, 3D (sooo 2010) active-shutter glasses, LightBoost on a TN panel, and the future Gay-Sync to mark a duet between Justin Timberlake and Justin Bieber  
Subjectivity is a bliss, ignorance is the new logic


----------



## xorbe (Nov 3, 2013)

780Ti length: 281mm (11.0")
Titan length: 267mm (10.5")


----------



## The Von Matrices (Nov 3, 2013)

Slomo4shO said:


> Take a look at the 4K benchmarks. Seems the card is limited by memory bandwidth...



The spec sheet clearly shows 7 GHz-effective memory, which on a 384-bit bus gives it 336 GB/s of memory bandwidth.  That's more than the R9 290X's 320 GB/s, so by your reasoning the 780 Ti shouldn't lose to or tie the 290X at 4K (even though it does).  I suspect ROP performance is more of the issue here.
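The bandwidth figures above follow directly from the memory data rate and bus width; a quick sketch (assuming the 290X's commonly quoted 5 Gbps effective GDDR5 rate):

```python
def mem_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak theoretical memory bandwidth in GB/s:
    per-pin data rate (Gbps) x bus width (bits) / 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

# GTX 780 Ti (leaked spec): 7 Gbps effective GDDR5 on a 384-bit bus
print(mem_bandwidth_gb_s(7.0, 384))  # 336.0

# R9 290X: 5 Gbps effective GDDR5 on a 512-bit bus
print(mem_bandwidth_gb_s(5.0, 512))  # 320.0
```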



Suka said:


> Assuming all this is true



I think it's more likely than not that these numbers are correct.  But what I've learned from the R9 290X speculation and hype is just how cherry-picked these initial leaked benchmarks are (for better or for worse, depending on the bias of the source).  You can't get a full idea of a card's advantages and disadvantages based on just five benchmarks.  The R9 290X's performance looked great from the initial benchmarks and specifications, but then the final reviews showed the heatsink and power consumption, which significantly dulled the appeal.


----------



## mastrdrver (Nov 3, 2013)

the54thvoid said:


> Single card 4k is pointless tbh.  Need dualies for that.



I disagree, as the 4K benches help anyone with multiple monitors get a good feel for how the card will perform.

You also don't need multiple cards, but you do need bandwidth. That's the biggest killer of 4K and multi-monitor setups. You can see this in the benchmarks of the 290X as the resolution scales to 4K.


----------



## Eagleye (Nov 3, 2013)

I just hope W1zzard tests this card the same way he did the 290X, e.g. sticking his hand in front of the air vent to see how it does. I also hope all reviews, including W1zzard's, warm the card up before benching as was done for the 290X; otherwise the tests are null.

Now back to the card... Wow, this thing is going to take the record for the hottest, highest-power-usage, and probably loudest card ever made. The electric bill alone will double the price of this card within a year.


----------



## chinmi (Nov 3, 2013)

When the R9 290X came out, compared to the GTX 780, the R9 290X was:

cheaper 
faster 
hotter
more power-hungry
than the GTX 780... most Nvidia fanboys' reaction to the R9 290X on YouTube and in review comments: it's too hot, and that power bill is outrageous!! Who cares if the R9 290X is cheaper, no way I'm gonna buy the R9 290X! GTX 780 FTW!

Then the 780 Ti came out, and compared to it, the R9 290X is:

cheaper 
slower 
hotter
more power-hungry
than the GTX 780 Ti... I bet most Nvidia fanboys' reaction on YouTube and in review comments is gonna be: it's faster!! Who cares about heat and the power bill, even though the R9 290X is cheaper, no way I'm gonna buy the R9 290X! GTX 780 Ti FTW!


----------



## Sihastru (Nov 3, 2013)

An argument that can be used by both camps is not an argument at all. And you forgot about the noise levels.


----------



## jagd (Nov 3, 2013)

I agree with you, but the problem is more complicated: if a company gives a cherry-picked benchmark list to reviewers/review sites, asks them to show it, and dictates which specs must be mentioned, it's time to question how independent the reviewers are and how many steps away they are from that company's PR/marketing. A similar thing happened with the Xbox One, where most of the gaming media tried to downplay its 720p games vs. 1080p BF4 and CoD on the PS4.
http://www.neogaf.com/forum/showthread.php?t=704836 



The Von Matrices said:


> I think more likely than not these numbers are correct.  But what I've learned from R9 290X speculation and hype is just how cherry picked these initial leaked benchmarks are (for better or for worse depending on the bias of the source.)



Are you sure they are only fanboys? Shills? Social-media marketers? Focus-group members? Remember, nvidia got caught with its hand in the cookie jar   


chinmi said:


> then gtx780ti... i bet most nvidia fanboys reaction on youtube and review comment gonna be : it's faster !!


----------



## rainzor (Nov 3, 2013)

So no one noticed how, when it comes to power consumption, in half of those "tests" the 290X is on par with the Titan, and in the other half on par with the GTX 780? Every bench I've seen so far shows it consumes at least 40W more than the Titan and double that compared to the 780.

Oh yeah, $599 for the 3GB version and $649 for the 6GB, or GTFO


----------



## OC-Rage (Nov 3, 2013)

*HA HA, BEATS all GPUs*

Hi,

see, this card with 3GB of VRAM beats all single and dual GPUs.

The power is there, and it's cheaper than all GPUs.

Performance that's on par with dual-GPU cards such as the GTX 690 and HD 7990. For a single-GPU card, that's a great feat.


----------



## repman244 (Nov 3, 2013)

So why is nobody complaining about the power consumption now?


----------



## Raptorpowa (Nov 3, 2013)

Is it 4GB of VRAM and a 512-bit bus like the 290X? If not... the 290X is the one for me, cuz I will be rocking three 27" Crossover monitors soon...


----------



## the54thvoid (Nov 3, 2013)

repman244 said:


> So why is nobody complaining about the power consumption now?





> but also consumes 10-15% more power which is probably about 20% more than a Titan, which could be about 80-100 watts?



See that 'eek'?  I've mentioned its power consumption twice in this thread.  It's bloody high.  The only mitigating factor is that _*IF*_ the figures are true, it matches dual-GPU performance.  

The beautiful thing we are about to see is the power of hypocrisy.  If this thing is hot and noisy, then all those blasting the 290X will need to keep their mouths shut or criticise this card too.  The power usage isn't an issue in itself.  It's on the same node as the GK104 chip, and therefore has the same (in)efficiencies.  So if this single card matches a GTX 690, we should expect it to draw similar power.  If it draws lots more than the relative increase over a GTX 690 would suggest, then it is less efficient.

Power usage is only an argument as a performance-per-watt ratio.  Apologies for using the 290X graph, but it is relevant and has all the big players.


----------



## qubit (Nov 3, 2013)

the54thvoid said:


> The beautiful thing we are about to see is the power of hypocrisy. If this thing is hot and noisy then all those blasting the 290X will need to keep their mouths shut or also criticise this card.



Quite agree. If it's hot and noisy then you can bet I'll be criticising it. I might generally prefer NVIDIA's products, but if they put out a lemon, I'm gonna call them out on it.


----------



## Crap Daddy (Nov 3, 2013)

the54thvoid said:


> The beautiful thing we are about to see is the power of hypocrisy. If this thing is hot and noisy then all those blasting the 290X will need to keep their mouths shut or also criticise this card



First, these leaks are very far from what a professional review means. Videocardz says the 780 Ti was clocked 50 MHz above stock. I find it hard to believe that Nvidia will launch a card that's as noisy, hot, and power-hungry as the 290X. At stock clocks, expect the reference 780 Ti to draw less power than the 290X while performing better. While it seems impossible to convincingly surpass the 290X in several Gaming Evolved titles, I do think it's fair to assume that in every other game this card can achieve around a 10% improvement.
I also think Nvidia will finally allow better overclocking of the card - a situation where power consumption and heat will shoot through the roof. But that is to be expected.


----------



## 1c3d0g (Nov 3, 2013)

repman244 said:


> So why is nobody complaining about the power consumption now?



Because we get faster performance, greater driver stability, less noise (etc. etc. etc.) compared to whatever shitty card AMD puts out?


----------



## 20mmrain (Nov 3, 2013)

My question is: who cares about 4K resolution when 99% of people can't afford it and aren't using it?
Christ, most people still don't use a 2560x1440 monitor either. Why don't graphics card manufacturers concentrate on something more important... like, I don't know, building a card that doesn't use 350 watts by itself and doesn't require a nuclear facility to cool it. It won't be long before all cards come with water blocks or need a 700-watt PSU to power the card by itself.

All of this seems like laziness to me! It's been a long time since any real progress was made on the video card front! This re-badged card just proves it some more.


----------



## Pandora's Box (Nov 3, 2013)

20mmrain said:


> My question is who cares about 4k resolution when 99% of the people can't afford it and are not using it?
> Christ most people still don't use a 2560x1440 monitor either. Why don't graphics card manufactures concentrate on something more important .... like I know building a card that doesn't use 350 Watts by itself and doesn't require nuclear facility to cool it. It won't belong before all cards come with water blocks or need a 700 watt PSU to power the card by itself
> 
> All of this seems like laziness to me! It's been a long time since any real progress has been made in the video card front! This re-badge card just proves it some more.



We really need 20nm to move forward at this point. 28nm is holding AMD and Nvidia back.


----------



## RCoon (Nov 3, 2013)

Pandora's Box said:


> We really need 20nm to move forward at this point. 28nm is holding AMD and Nvidia back.



When 20nm comes, it's going to be a lot more costly for both AMD and NVidia. 28nm is refined and, as far as they are concerned, dirt cheap to produce, with much better failure rates - one of the many reasons 20nm is taking so long. If and when 20nm does come, the GPUs are probably going to be pretty expensive.


----------



## NeoXF (Nov 3, 2013)

It's Radeon HD 7970 vs GeForce GTX 680 all over again...

I see this being mostly a driver war... again. Which is great - we didn't have drivers that squeezed the living daylights out of the hardware (in terms of performance) before that generation (at least, in the limited way high-level APIs can allow).

It's fair to think that AMD might release a newer stepping of Hawaii Pro/XT (Pro-H2/XT2? Ha...) with better yields and maybe higher stock clocks (for one thing tho, the memory could be a lot higher clocked).

R9 290X w/ increased efficiency and 1075MHz reference boost clock & 8GB of 6500MHz GDDR5s please...

BTW, does anyone know if Hawaii XT is the full Hawaii chip? 3072 ALUs / 192 TMUs sound like way better numbers... Maybe those are only for the pro market. 

All in all, performance is on the way up and prices on the way down, and the (GP)GPU landscape isn't boring anymore, for the moment.


----------



## symmetrical (Nov 3, 2013)

Yayyy it looks like..... the other gpus.....


----------



## SIGSEGV (Nov 3, 2013)

1c3d0g said:


> Because we get faster performance, greater driver stability, less noise (etc. etc. etc.) compared to whatever shitty card AMD puts out?




A shitty card which tackles $650 and even $1000 cards
Yes sir, it's a shitty card indeed... driver blah blah classy...

Are you mad, bro?


----------



## repman244 (Nov 3, 2013)

1c3d0g said:


> Because we get faster performance, greater driver stability, less noise (etc. etc. etc.) compared to whatever shitty card AMD puts out?



U mad? 
Faster performance at higher power (noise usually goes hand in hand with power, so don't get your hopes up) and at a higher price, while AMD offered higher performance at a lower price 
Next thing you know, you'll also say that the higher power consumption is a plus 
The driver point is moot since both sides have their fair share of problems.

It's just funny to see how very few mention the power consumption now like they did when the 290X was tested. I personally don't care, but it's funny to read.


----------



## EpicShweetness (Nov 3, 2013)

I might hold out for the R9 290, but with the performance I get now, it's questionable why I wouldn't just wait for 20nm. All these "new" cards are 28nm pushed to its limits, and while impressive, there are key areas (in my book) these cards sacrificed for their sheer performance. So whatever, I'll wait; there are certain things I want from my cards.


----------



## OC-Rage (Nov 3, 2013)

*who say?*



repman244 said:


> U mad?
> Faster performance at higher power (noise usually goes hand in hand with power so don't get your hopes up) and at higher price while AMD offered higher performance at lower price
> Next thing you know you'll also say that the higher power consumption is a plus also
> Driver point is moot since both sides have it's fair share of problems.
> ...



I think this card is better, faster, and cheaper than the R9 290X.

When any company unleashes a faster and more powerful GPU, think about

power consumption, noise, and heat.

I don't think any 20nm GPU is coming soon.

Wait for more benches and you'll see the GTX 780 Ti beats all AMD GPUs, single and dual.


----------



## MxPhenom 216 (Nov 3, 2013)

OC-Rage said:


> i think this card is better faster and cheaper than R9 290x
> 
> when any company unleashe any GPU faster and powerfull think about
> 
> ...



Um, 

Cheaper no. Faster yes.

NVidia already confirmed the $699 price for the 780 Ti ($150 more than the R9 290X). It also probably won't beat dual-GPU cards, but if it even gets near one, it'll be right around GTX 690 performance.


----------



## Jaffakeik (Nov 3, 2013)

Are those really leaks? As they say in the description, I think it's been done on purpose just to add some hype for potential customers, so they're just naming it a leak to hide the advert. But it looks a monster.


----------



## HumanSmoke (Nov 3, 2013)

SIGSEGV said:


> Shitty card which tackle 650$ even 1000$ card
> Yes sir. It's a shitty card indeed.. driver blah blah classy...
> 
> Are you mad bro?


Conversely, the $550 290X offers 4% (quiet) / 11.1% (scream) more performance than the reference 780 for a 10% higher price and three fewer games ( or 16% higher price w/ 2 fewer games)...and that's assuming that you could find a 290X in stock.

Nice cherry picking. Even nicer incoherent rant.
Kirk Lazarus doesn't approve.


----------



## Recus (Nov 3, 2013)

Looks like smaller die is useless die.



> NVIDIA clearly targets Radeon R9 290X in their comparisons. The green team is well aware of AMD problems with the noise and the temperature of their new flagship. NVIDIA’s theory is that since R9 290X is using 455mm2 die and GTX 780 Ti is based on 533 mm2 die, it equals to lower thermal density, thus less power condensed into smaller area. Long story short, NVIDIA’s GPU will generate less thermal density per square millimeter, so it is easier to dissipate.
> 
> NVIDIA made a test 20 minute Crysis 3 run with R9 290X and 780 TI. According to their data after 2 minutes R9 290X drops to 720 MHz, while GTX 780 Ti sustains 940 MHz clock. The average clock speeds are 799 MHz and 968 MHz for 290X and 780 Ti respectively.



http://videocardz.com/47576/nvidia-geforce-gtx-780-ti-official-specifications


----------



## radrok (Nov 3, 2013)

Recus said:


> Looks like smaller die is useless die.
> 
> 
> 
> http://videocardz.com/47576/nvidia-geforce-gtx-780-ti-official-specifications



I don't get what you people gain by throwing gasoline on threads, especially you.


----------



## the54thvoid (Nov 3, 2013)

radrok said:


> I don't get what you people gain by throwing gasoline on threads, especially you.



You should know by now, Rad, that some people can only post in certain colours.  I like my gfx cards like I like my hoes - hot, expensive, and composed of quality silicon.


----------



## Crap Daddy (Nov 3, 2013)

Supposedly Nvidia's gaming performance chart comparing the 780 Ti and 290X (Quiet), courtesy of videocardz:


----------



## The Von Matrices (Nov 3, 2013)

I'm calling BS on these power consumption numbers; something is seriously wrong.

W1zzard tested the GTX 690 at 274W peak and the R9 290X at 282W peak.  Yet somehow the 780 Ti in these charts consumes 75W more than either of those.  That would make it a 375W card, which it can't be, since it only has a 300W power design (6+8-pin PCIe connectors plus the slot).

I'm not expecting the 780 Ti to use less power than either of the aforementioned cards, but it cannot be more than 15-20W above them while still remaining in spec.  These power consumption numbers can't be correct for the benchmarks.

The only way I could see those numbers making sense is if they were taken from Furmark or OCCT.  AMD clamps down on power much harder than NVidia in those power viruses, so the results are really unrepresentative of the card's normal power consumption.
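The 300W ceiling quoted above comes from the PCI Express electromechanical spec, which caps each power source; a quick sketch of that budget for the rumored 6+8-pin configuration:

```python
# Per-source power limits from the PCI Express Card Electromechanical spec
PCIE_X16_SLOT_W = 75   # power deliverable through the x16 slot itself
SIX_PIN_W = 75         # 6-pin auxiliary connector
EIGHT_PIN_W = 150      # 8-pin auxiliary connector

# Rumored GTX 780 Ti configuration: slot + one 6-pin + one 8-pin
board_limit_w = PCIE_X16_SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(board_limit_w)  # 300
```

Which is why a sustained ~375W draw would put the card roughly 25% out of spec.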


----------



## qubit (Nov 3, 2013)

Crap Daddy said:


> Supposedly Nvidia gaming performance chart comparing 780Ti and 290X (Quiet) courtesy videocardz:
> 
> http://img.techpowerup.org/131103/NVIDIA-GeForce-GTX-780-Ti-gaming-performance.png



NVIDIA wins in everything. Nice.


----------



## Crap Daddy (Nov 3, 2013)

The Von Matrices said:


> I'm not expecting the 780Ti to use less power than either of the aforementioned cards, but it cannot be more than 15-20W above those while still remaining in spec. These power consumption numbers can't be correct for the benchmarks.





> Moving on to power characteristics. GTX 780 Ti has much lower TDP than 290X, which is 250 watts. This is actually the same number as for TITAN and 780.



source: http://videocardz.com/47576/nvidia-geforce-gtx-780-ti-official-specifications

Apparently they have official specs of the card. Another interesting thing is:



> NVIDIA equipped its GTX 780 Ti with a new feature called Power Balancing. Without this feature power drawn from: 6-pin power connector, 8-pin power connector and PCI Express interface would be balanced across these three sources respectively depending on the current load. However if user overclocks the card power delivery becomes unbalanced, thus card draws more power from one source than the others. To fix this problem NVIDIA came up with an idea of Power Balancing. You probably see where this is going. With this feature enabled GPU can steer the power from one input to another. This will improve overclocking capabilities in comparison to GTX 780 or the TITAN.


----------



## NeoXF (Nov 3, 2013)

The amount of ignorance and fanboyism in this thread is too damn high!


----------



## MxPhenom 216 (Nov 3, 2013)

NeoXF said:


> The amount of ignorance and fanboyism in this thread is too damn high!



The amount of worthiness in this post is too damn low.


----------



## Eagleye (Nov 3, 2013)

The Von Matrices said:


> The only way I could possibly see those numbers make sense is if they were taken from Furmark or OCCT.  AMD clamps down on power much harder than NVidia in those power viruses so the results are really unrepresentative of the normal power consumption of the card.



WRONG; Nvidia clamps down on power, whereas AMD does nothing to throttle power in most if not all benchmarks, especially Furmark. See TPU reviews.

This so-called leak is fake. And if it were real, I'm sure it's being shown in its best-case scenario (power included)


----------



## nemesis.ie (Nov 3, 2013)

the54thvoid said:


> You should know by now Rad that some people can only post in certain colours.  I like my gfx cards like i like my hoes - hot, expensive and composed of quality silicon.



Are you sure you don't prefer your ladies with "quality silicone" instead?


----------



## The Von Matrices (Nov 3, 2013)

Eagleye said:


> WRONG; Nvidia clamps down on power, whereas AMD does nothing to throttle power in most if not all benchmarks, especially Furmark. See TPU reviews.
> 
> This so-called leak is fake. If it were real, I'm sure it's being shown in its best-case scenario (power included).



Are you looking at the same TPU data I am?  It's obvious that both AMD and NVidia clamp board power consumption to the specification.  Look at the maximum power consumption chart from the R9 290X review.  Titan and the 780 are both 250W cards, and they are held to within 7% of that specification.  R9 290X is a 300W card, and it is held to 5% of that specification.  The leaked specifications call the 780Ti a 250W card.  There is no way that the 780Ti is drawing 75W more than the R9 290X even though other 250W NVidia cards draw 50W less.

I share your skepticism, but unlike you I don't think in any way this is the best case power consumption scenario for the 780Ti; In fact, I would argue that this is much worse than the worst case.  *It looks like the tester tried to overclock a reference board to get better performance results but also is drawing vastly more power because of it.*  I would expect actual 780Ti performance to be slightly less than these results but board power to be significantly less.
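The tolerance argument above is simple arithmetic, sketched here for clarity. The TDPs come from the post (250 W for Titan/780/780 Ti, 300 W-class for the 290X); the 267 W "measured" Titan figure is an illustrative round number chosen to land near the ~7% overshoot the post cites, not a quoted measurement.

```python
# Sketch of the spec-overshoot argument; measured values are illustrative.
def pct_over_spec(measured_w, tdp_w):
    """How far a measured maximum draw sits above rated TDP, in percent."""
    return (measured_w - tdp_w) / tdp_w * 100.0

# A 250 W NVIDIA card held to within ~7% of spec:
titan_overshoot = pct_over_spec(267.0, 250.0)   # 6.8% over

# A 780 Ti drawing 75 W more than a 300 W 290X would imply:
implied_draw = 300.0 + 75.0                      # 375 W
ti_overshoot = pct_over_spec(implied_draw, 250.0)  # 50% over a 250 W TDP
```

A 50% overshoot on a 250 W part is far outside the few-percent band both vendors normally hold to, which is the crux of the skepticism.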


----------



## erocker (Nov 3, 2013)

Release a card for $500 bucks over all other GPU's. Some buy it, some are on the fence, some think it's a ripoff.

...Then release a card for $200 over all other GPU's and it's a bargain!!

Cunning moves!


----------



## Melvis (Nov 3, 2013)

Crap Daddy said:


> Supposedly Nvidia gaming performance chart comparing 780Ti and 290X (Quiet) courtesy videocardz:
> 
> http://img.techpowerup.org/131103/NVIDIA-GeForce-GTX-780-Ti-gaming-performance.png



Quiet mode... really?

Where's the uber mode chart?


----------



## qubit (Nov 3, 2013)

I think this thread has run its course. Let's wait until some hard evidence of its performance turns up. It won't be long now.


----------



## SIGSEGV (Nov 3, 2013)

HumanSmoke said:


> Conversely, the $550 290X offers 4% (quiet) / 11.1% (scream) more performance than the reference 780 for a 10% higher price and three fewer games ( or 16% higher price w/ 2 fewer games)...and that's assuming that you could find a 290X in stock.
> 
> Nice cherry picking. Even nicer incoherent rant.
> Kirk Lazarus doesn't approve.



You should be thankful for the competition AMD has created. Without the 290X, your favourite card wouldn't have gotten a price cut and would have stayed at the $650 price point.

What's the difference between "scream" mode and Boost 2.0?


----------



## The Von Matrices (Nov 3, 2013)

I looked at the source (translated; I can't read Chinese) and it seems like the card has a "quiet" and "uber" mode just like the R9 290X, but in the case of the 780Ti the "uber" only has a high temperature target and doesn't restrict power consumption like the R9 290X's "uber" mode does.  It looks like this card was tested in "uber" mode for these results, which would explain the findings exactly and why it uses 75W more than the 290X.  I would expect the "quiet" mode to stick to the 250W power target and probably only slightly edge out Titan.


----------



## 1d10t (Nov 4, 2013)

Crap Daddy said:


> First,* these leaks are very far* from what a professional review means.



You don't believe leaks from ChipHell that make nVidia look bad, eh?



Crap Daddy said:


> *Supposedly* Nvidia gaming performance chart comparing 780Ti and 290X (Quiet) courtesy videocardz:
> 
> http://img.techpowerup.org/131103/NVIDIA-GeForce-GTX-780-Ti-gaming-performance.png



Now you're quoting some rumour from videocardz and calling it "supposed performance"?

Yeah, fanaticism > logic


----------



## Steevo (Nov 4, 2013)

qubit said:


> I want a card that runs fast and (relatively) quiet like NVIDIA cards do at stock, with a decent driver control panel and more or less free of annoying driver bugs, especially with SLI. I'll bet the 780 Ti delivers on this in spades. Reviews will tell us for sure, so let's not argue about it. And I'll bet wizzy has already played with it and knows the answer.  Damn you NDA!
> 
> Does it help if I tell you that in my opinion, all high end GPUs from either brand are stressed out enough as it is with all that heat meaning that I don't overclock them, even if the headroom is there? Therefore, getting more out of it than stock isn't something I look for in a graphics card.
> 
> EDIT: I really like the unique features that NVIDIA deliver with their graphics cards like 3D Vision, LightBoost and now G-Sync.





qubit said:


> As long as it doesn't blow up my card, like those other drivers recently I'll be happy.





qubit said:


> I tell you, my GTX 590 runs so damned hot, even in desktop mode that I'd appreciate any reduction in power.





qubit said:


> It's odd how the WHQL driver has the same version number as the beta driver.
> 
> Also, it's not on the home page of the geforce.com website, but the article links do point to the WHQL driver being published just yesterday and searching for a driver returns the same result.
> 
> I wonder if nvidia have gotten a bit confused somewhere?






qubit said:


> It's a good card in terms of framerate and features, but I really don't like that cooler.
> 
> It's noisy, hot and ugly - nothing to recommend it. nvidia normally make really quiet and efficient coolers, so I think they've let the side down today.
> 
> If I was in the market for this card, I'd go for a non-reference model with a custom cooler without thinking twice about it.






qubit said:


> I do love a good rebrand!
> 
> /sarcasm





qubit said:


> I'm glad I haven't gotten round to installing it yet. As I've got a GTX 590 which can be easily damaged through overstress, it might be extra prudent to leave it out.
> 
> In fact, I don't overclock any of my graphics cards to avoid potentially damaging them.




Both companies have their issues, and how is that hot, noisy GTX 590 treating you, the one that can easily be damaged by drivers killing it?

Seriously though, the 290x was a cheaper answer to the Nvidia gouging for those who want to and like to mod their cards to get the most. I would still pay for a 290x so I could stick on my waterblock and remove the voltage limit to see what it was capable of. For much less than the green camp.


----------



## qubit (Nov 4, 2013)

Steevo said:


> Both companies have their issues, and how is that hot noisy 790 treating you that can easily be damaged by drivers killing it?
> 
> Seriously though, the 290x was a cheaper answer to the Nvidia gouging for those who want to and like to mod their cards to get the most. I would still pay for a 290x so I could stick on my waterblock and remove the voltage limit to see what it was capable of. For much less than the green camp.



Damn, I can't believe you spent all that time to dig out all my quotes!  Bottom line is that NVIDIA cards work pretty well regardless if the odd model is a bit noisy or a rebrand etc. It also proves I don't fanboy any particular brand as I do call them out when things should be improved.

Yeah sure, if you want to mod the 290x with a waterblock or something go right ahead, that's proper enthusiast territory. I'd be interested to see how it improves the framerate performance and noise.  It'll be significant whatever the actual numbers are.


----------



## nem (Nov 4, 2013)

I hope all this is not just benchmarketing.


----------



## harry90 (Nov 4, 2013)

Crap Daddy said:


> Supposedly Nvidia gaming performance chart comparing 780Ti and 290X (Quiet) courtesy videocardz:
> 
> http://img.techpowerup.org/131103/NVIDIA-GeForce-GTX-780-Ti-gaming-performance.png



Not fair benchmarks: the AMD R9 290X's quiet mode lowers its core to 800 MHz under full load to save power. Try finding benchmarks with the R9 290X in Uber mode and the 780 Ti at stock before you (NVidia fans) call AMD cards shitty. Even if slower, the R9 290X is a steal at $550. Cheers.


----------



## Bjorn_Of_Iceland (Nov 4, 2013)

qubit said:


> Yeah sure, if you want to mod the 290x with a waterblock or something go right ahead, that's proper enthusiast territory. I'd be interested to see how it improves the framerate performance and noise.  It'll be significant whatever the actual numbers are.



Well, you can always mod your 780 Ti with water and an over-volted BIOS, so that's pretty much back to square one in terms of the 780 Ti being faster vs. a modded 290X.


----------



## HumanSmoke (Nov 4, 2013)

harry90 said:


> Not fair benchmarks: the AMD R9 290X's quiet mode lowers its core to 800 MHz under full load to save power.


More a sound level (fan RPM) dynamic...probably why it's called "Quiet Mode" rather than "Power Saving Mode". And this is representative of actual clocking:







harry90 said:


> Even if slower R290x is a steal at $550 cheers


Yep, my sentiments were exactly the same. $485 for an overclocked GTX 780 with 3 AAA games added. Couldn't justify paying 20% more to get a few percentage points' increase in framerate and a single game... that's if I could actually find a 290X in stock at MSRP.


----------



## btarunr (Nov 4, 2013)

ShurikN said:


> Hey TPU, when are you going to test a watercooled 290X? Just to see how fast it really is. And then compare it to the 780 Ti (when it releases).



When someone sends us one, which has a closed loop water cooler (think Sapphire Atomic). We've never tested cards that are just PCB + FC blocks.


----------



## Steevo (Nov 4, 2013)

Bjorn_Of_Iceland said:


> Well, you can always mod your 780 Ti with water and an over-volted BIOS, so that's pretty much back to square one in terms of the 780 Ti being faster vs. a modded 290X.



Not close. The 290 is pulling better numbers at 200 MHz less than the Titan, and with no voltage limit it will perform much faster than a chip that is already at maximum. Plus, the piece of silicon Nvidia is selling costs more, as it's significantly larger. I am not saying it will be the fastest ever, but on bang for the buck, only Nvidia fanboys will be buying one.


----------



## MxPhenom 216 (Nov 4, 2013)

Steevo said:


> Not close. The 290 is pulling better numbers at 200 MHz less than the Titan, and with no voltage limit it will perform much faster than a chip that is already at maximum. Plus, the piece of silicon Nvidia is selling costs more, as it's significantly larger. I am not saying it will be the fastest ever, but on bang for the buck, only Nvidia fanboys will be buying one.



Or for people that don't give a rat's ass about bang for buck. Why do you think people bought Titans, or why I went from a 680 to a 780 right away?


----------



## qubit (Nov 4, 2013)

Bjorn_Of_Iceland said:


> Well, you can always mod your 780 Ti with water and an over-volted BIOS, so that's pretty much back to square one in terms of the 780 Ti being faster vs. a modded 290X.



Comparing them both on water would be interesting.


----------



## harry90 (Nov 4, 2013)

HumanSmoke said:


> More a sound level (fan RPM) dynamic...probably why it's called "Quiet Mode" rather than "Power Saving Mode". And this is representative of actual clocking:
> http://img.techpowerup.org/131104/000powertune2.0.jpg
> 
> Yep, my sentiments were exactly the same. $485 for an overclocked GTX 780 with 3 AAA games added. Couldn't justify paying 20% more to get a few percentage points' increase in framerate and a single game... that's if I could actually find a 290X in stock at MSRP.



Well, your article actually confirmed my claim. In quiet mode not only do the fans spin slower, the clock rate is also reduced.


----------



## The Von Matrices (Nov 4, 2013)

harry90 said:


> Not fair benchmarks: the AMD R9 290X's quiet mode lowers its core to 800 MHz under full load to save power. Try finding benchmarks with the R9 290X in Uber mode and the 780 Ti at stock before you (NVidia fans) call AMD cards shitty. Even if slower, the R9 290X is a steal at $550. Cheers.





harry90 said:


> Well, your article actually confirmed my claim. In quiet mode not only do the fans spin slower, the clock rate is also reduced.



Actually, W1zzard found that "Quiet" mode uses more power than "Uber" mode in his R9 290X testing, so the conclusion is exactly the opposite.  Why this is I don't know.


----------



## Xzibit (Nov 4, 2013)

The Von Matrices said:


> Actually, W1zzard found that "Quiet" mode uses more power than "Uber" mode in his R9 290X testing, so the conclusion is exactly the opposite.  Why this is I don't know.
> 
> http://tpucdn.com/reviews/AMD/R9_290X/images/power_average.gif



Just to be clear: it only tells us the average power draw in Crysis 2.



> •Average: Crysis 2 at 1920x1080, Extreme profile, representing a typical gaming power draw. Average of all readings (12 per second) while the benchmark was rendering (no title/loading screen).



The charts, be they real or fake, aren't limited to just one game.
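The quoted methodology reduces to averaging instantaneous readings taken ~12 times per second over the render window. A minimal sketch, with made-up sample values (real readings would come from the review's power-measurement rig):

```python
# Averaging power readings as described in the quoted methodology:
# instantaneous samples at ~12 Hz while the benchmark renders, then a mean.
def average_power(samples_w):
    """Mean of instantaneous power readings (watts) over the render window."""
    if not samples_w:
        raise ValueError("no samples")
    return sum(samples_w) / len(samples_w)

# Two seconds of hypothetical readings at 12 Hz (24 samples):
samples = [230.0, 245.0, 250.0, 260.0] * 6
avg = average_power(samples)  # 246.25 W
```

A single-game average like this can miss workloads where a card's draw spikes, which is exactly the objection: one Crysis 2 mean doesn't bound behavior across a whole benchmark suite.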


----------



## Nihilus (Nov 4, 2013)

Whoa, the 7990 owns 4K. Not bad for 3 GB of memory. I would have liked to have seen more demanding games like BF4 or Crysis. Btw, the 780 Ti's power consumption shows that it's mostly a hyper-clocked Titan: little performance gain at the price of a big power increase.


----------

