# NVIDIA GeForce GTX 780 Ti 3 GB



## W1zzard (Nov 7, 2013)

NVIDIA's new GTX 780 Ti uses the full GK110 GPU with 2880 shaders. This enables impressive performance that is 10% faster than the GTX Titan / R9 290X. The card also comes with massive overclocking potential that let us raise the GPU frequency by almost 30%.

*Show full review*


----------



## qubit (Nov 7, 2013)

Great card. Definitely gonna look to put this in my rig within the next couple of months.


----------



## RCoon (Nov 7, 2013)

Well it's the same price as the 780 when it was released. It beats the 290X at everything except in a few rare circumstances. He who controls the crown sets the price. May even sell my 780 and buy into a 780ti for the sake of having a blower cooler in my ITX instead of the windforce heating things up. I dislike the price a great deal.


----------



## Naito (Nov 7, 2013)

Apart from the price premium, stuff is really getting interesting. Can't wait to see what Maxwell brings.


----------



## Fairlady-z (Nov 7, 2013)

Great card, but gotta say I'm not feeling the itch to move from my 780s one bit... maybe two gens down the road. If I was in the market I'd buy one of these and be set for 1080p/1440p 120 Hz for sure.


----------



## Fluffmeister (Nov 7, 2013)

Awesome card


----------



## Cheeseball (Nov 7, 2013)

It's the gaming performance king at the moment. Although the price is steep ($150+ more than the 290X), this thing overclocks pretty well.


----------



## Suka (Nov 7, 2013)

Fluffmeister said:


> Awesome card
> 
> http://blogtotheoldskool.com/blog/images/do-want-squirell.jpg


You know you want some of that 
All in all, great performance from the 780 Ti. I'm sure those who were waiting for it will be very happy. Great review.


----------



## v12dock (Nov 7, 2013)

Go Go 290X GHz Ed


----------



## Castiel (Nov 7, 2013)

Wow, this is nice. Even though the price tag is a little much, I feel that this was a great move by Nvidia.


----------



## Frick (Nov 7, 2013)

So is it $150 better than the 290X? Or $110 better than the 290X with good cooling?


----------



## Sihastru (Nov 7, 2013)

The king is dead. Long live the king! That ~30% overclock is amazing.


----------



## Sasqui (Nov 7, 2013)

Competition at its finest.  We all win!!!  As always, awesome review.

BTW - W1z, were you the first to post a 290 (non-x) review?


----------



## msamelis (Nov 7, 2013)

It seems like an epic card indeed and I've been looking to upgrade for a while now. I'll be patient though and wait for other editions - cooler wise - to come out. I also can't wait to see AMD's next move.


----------



## GreiverBlade (Nov 7, 2013)

Not bad, not bad... now Titan owners have the right to feel ripped off (unless they absolutely need DP compute). Oh well, sticking with my 770 for now, and if an upgrade is needed then a second 770, or if lucky maybe a 780 Ti or an R9 290X.


----------



## Fourstaff (Nov 7, 2013)

Frick said:


> So is it $150 better than 290x?, or $110 better than the 290x with good cooling?



Definitely not $250 better than 290


----------



## xorbe (Nov 7, 2013)

Do want, but not worth the hassle of side-stepping from an OC'd Titan for a few frames while losing 3 GB of VRAM. I can't see how they are going to move this with the 4 GB 290 at $399.


----------



## Naito (Nov 7, 2013)

msamelis said:


> It seems like an epic card indeed and I've been looking to upgrade for a while now.



We can tell. That GTX 480 is looking awfully worn...


----------



## N3M3515 (Nov 7, 2013)

Wow, this bad boy is faster than i expected!
Although sli 780ti vs 290x cf at 4K will be a tie me thinks


----------



## 1d10t (Nov 7, 2013)

Great card from NVIDIA; they managed to take the crown back. As for the VRM design, NVIDIA is well aware that it needs cleaner, more stable power, so adding some filtering caps and a dual-channel redesign helped this card a lot at higher clocks.


----------



## Amrael (Nov 7, 2013)

Hanging on to my GTX 780 Classified, this card OC'd didn't even break 10% over my trusty Classy. Yes it is faster. Would I take the loss and sell my classy for this? Don't think so, same technology, a little bit faster.


----------



## radrok (Nov 7, 2013)

Holy...  Look at those overclocks,  seems like the 28 nm process is getting very refined

Just wow. 



GreiverBlade said:


> Not bad, not bad... now Titan owners have the right to feel ripped off (unless they absolutely need DP compute). Oh well, sticking with my 770 for now, and if an upgrade is needed then a second 770, or if lucky maybe a 780 Ti or an R9 290X.




Why do we have to feel ripped off? We knew the price all along and it would have been delusional to think something better wouldn't have come out,  stop saying that please  

I'm definitely upgrading. All I'm waiting for is a 6 GB variant.
This for the gaming rig and the Titans on the workhorse.


----------



## LAN_deRf_HA (Nov 7, 2013)

With that high price tag, it feels like they expect the average buyer to weigh all the points it has going for it: heat, noise, power, OC, performance. But I'd imagine most will just look at the FPS, and for them that $150 premium over the 290X might look pretty bad. $620 would have been a good bit more attractive while still making the point of "our card is the best."


----------



## BiggieShady (Nov 7, 2013)

Wow, a fully functional GK110 ... with double precision units disabled ... and that massive overclock. I'm wondering what was the fan noise at load while overclocked ... that info is missing.


----------



## Hilux SSRG (Nov 7, 2013)

That's a monster OC on a stock cooler. Can't wait to see what Gigabyte and Evga do with their custom cooling solutions. Great review.


----------



## Lionheart (Nov 7, 2013)

Do want


----------



## R0H1T (Nov 7, 2013)

N3M3515 said:


> Wow, this bad boy is faster than i expected!
> Although sli 780ti vs 290x cf at 4K will be a tie me thinks


Not even close; the 290 & 290X really shine at higher resolutions, as does the new CFX over PCIe ~
http://images.anandtech.com/graphs/graph7492/59654.png
http://images.anandtech.com/graphs/graph7492/59664.png
http://images.anandtech.com/graphs/graph7492/59668.png
http://images.anandtech.com/graphs/graph7492/59672.png
http://images.anandtech.com/graphs/graph7492/59676.png


----------



## TRWOV (Nov 7, 2013)

I think that W1zzard's assessment is right: 290 + aftermarket cooler seems to be the way to go for the budget conscious. 

Still this card is beautiful.


----------



## erixx (Nov 7, 2013)

I am fine until the next generation with my 670. But the battle is so funny!


----------



## W1zzard (Nov 7, 2013)

TRWOV said:


> I think that W1zzard's assessment is right: 290 + aftermarket cooler seems to be the way to go for the budget conscious.
> 
> Still this card is beautiful.



i'm not saying "is the way to go", i'm saying "could be an option to consider"


----------



## qubit (Nov 7, 2013)

So how do NVIDIA gimp the compute performance? Is it just a BIOS/driver nobbling or is the GPU itself somehow nobbled?


----------



## rpsgc (Nov 7, 2013)

269W peak power consumption
260W *maximum* power consumption

NVIDIA throttling cards in Furmark. Those values are pointless, why even bother? To make AMD look bad?


----------



## Yellow&Nerdy? (Nov 7, 2013)

Price is steep, but maybe not so steep if you factor in overclocking. The reference 290X barely overclocks because of its massive heat output. The 780 Ti on the other hand overclocks very well, so comparing overclocked results, it pulls ahead by quite a margin. It's still not worth $700 though; realistically I think it should be priced at $625-650. But obviously, being the single-GPU "king", it can command a bit of a price premium. 

Not that I can afford any of the graphics cards in this range, but a bit of price war is always great.


----------



## Aithos (Nov 7, 2013)

Do we know if the aftermarket cards (EVGA Superclocked and Classified) will be out in time for the holiday bundle?  I have been putting off my buying decision until this card was out and I'd really like to take advantage of the free games but I don't want a reference model...


----------



## mmaakk (Nov 7, 2013)

Great card. OC numbers are magic!!

But im not replacing my GTX670 (@ 1080p) until I get a 4K screen.

290X & 780Ti imo still aren't a perfect fit @ 4K res...

I bet late 2014/2015 will be the boom for ultra resolution.


----------



## tt_martin (Nov 7, 2013)

@W1zzard
R9 290 is $400, not $450


----------



## iO (Nov 7, 2013)

Huge price premium but still a really nice card.
But that OC potential kinda smells like a golden sample...


----------



## Tatty_One (Nov 7, 2013)

Very impressive, to be honest; I too didn't think they would quite achieve this performance. What's exciting for me now is to see what the non-reference 290Xs can do. If a non-reference, overclocked version of the 290X can close an 8% gap across resolutions, then even better for all of us, because we will see even cheaper overclocked 780 Tis!


----------



## Sasqui (Nov 7, 2013)

W1zzard said:


> i'm not saying "is the way to go", i'm saying "could be an option to consider"



Judging by the fan noise, I'm officially saying "is the way to go"


----------



## Rebel333 (Nov 7, 2013)

Great card; it does everything that the reference 290X cannot. I'm still gonna buy the Radeon 290X, though, but with a custom cooler. Then I believe I'll have GTX 780 Ti performance or more for cheaper.


----------



## swagnuggets123 (Nov 7, 2013)

*Disappointing*

Still, NVIDIA hasn't taken the performance crown from the Radeon 7990. Much disappoint. Power consumption is actually pretty good. Price premium still bad. :shadedshu Overclocking!!!!! Yes!!!!! Still, NVIDIA, step it up. Beat the 7990. Any chance of a GTX 790? Or an AMD R9 300X?


----------



## Amrael (Nov 7, 2013)

iO said:


> Huge price premium but still a really nice card.
> But that OC potential kinda smells like a golden sample...



I think you might be on to something; let's wait for retail reviews and we'll see. As it stands, I think the time for my second GTX 780 Classified is nigh: at $580, I think that would hit the spot nicely for $120 less. Classys in SLI, woohoo!!


----------



## W1zzard (Nov 7, 2013)

rpsgc said:


> 269W peak power consumption
> 260W *maximum* power consumption
> 
> NVIDIA throttling cards in Furmark. Those values are pointless, why even bother? To make AMD look bad?



AMD cards throttle in Furmark, too. These numbers are mostly to provide an idea what kind of PSU will be good enough, worst case.



tt_martin said:


> @W1zzard
> R9 290 is $400, not $450



fixed


----------



## MxPhenom 216 (Nov 7, 2013)

swagnuggets123 said:


> Still, NVIDIA hasn't taken the performance crown from the Radeon 7990. Much disappoint. Power consumption is actually pretty good. Price premium still bad. :shadedshu Overclocking!!!!! Yes!!!!! Still, NVIDIA, step it up. Beat the 7990. Any chance of a GTX 790? Or an AMD R9 300X?



What in the fuck. Who cares about dual gpu cards.


----------



## Casecutter (Nov 7, 2013)

Got to say I didn't think an extra SMX and a bump in clocks would be this impressive. Power is what takes me aback! Almost like they're using binned chips with the best efficiency that originally went only to Tesla professional products. Given the maturity of the process, and that most of the big professional-side demand has been met, can NVIDIA now allocate some of those chips to this market?

I'd think they know they won't move big numbers of these, as the Titan and GTX 780 have already captured the biggest volume, while those who stepped up to those cards can't really sell a Titan/780 and see enough return to make the switch justifiable. This GTX 780 Ti is more a "see what we could've done"; they don't need it to be a big mover, it's more just basking in what Kepler had on tap versus GCN.

Here's the question: will NVIDIA release a 10-11 SMX part to compete with the R9 290 at $400, or do they just absorb anything that falls short as the price of GK110 production? I'd think there would still be a large volume of geldings that fall below the standard 780's 12 SMXes, and how well might those do in perf/watt against an R9 290?


----------



## W1zzard (Nov 7, 2013)

Casecutter said:


> Almost like they're using binned chips with the best efficiency that originally went to only Tesla professional products.



I, too, suspect that this is the case. Good for the customer. On the other hand, ASIC quality on my sample came back with around 75%, so nothing magical there.


----------



## Steevo (Nov 7, 2013)

Nice card. I am guessing a power-plane change may be part of the B1 respin; getting more stable power to the whole die helps reduce thermal load and voltage droop, and thus power consumption. 


I await a G-Sync review too, I may switch camps!!


----------



## ST.Viper (Nov 7, 2013)

swagnuggets123 said:


> Still nvidia hasn't taken the performance crown from the radeon 7990.



It was never intended to do so. IMHO it is a great card; considering it has more transistors than the competitor's card yet produces less heat, that is impressive. If nothing else it may force AMD to start bundling more games with R9 series graphics cards, so it will be a win-win situation for us customers.


----------



## hardcore_gamer (Nov 7, 2013)

Does NVIDIA have any plan to reduce the price of the GTX 780 to $399 to counter the 290? If so, 780 SLI for $100 extra would be a much better deal than this card. I can only hope.


----------



## Tonim89 (Nov 7, 2013)

Amazing card, but I see no point in pricing it at $699. An aftermarket cooler would make the R9 290X faster, and its final price would still be $50-75 lower.

$699 should be the price for top cooled and overclocked versions of the GTX 780 Ti, like the MSI Lightning or ASUS Matrix.


----------



## ST.Viper (Nov 7, 2013)

Tonim89 said:


> An aftermarket cooler would make R9 290X faster



This remains to be seen. Wait for the non-reference 290/290X, then we can judge. Peace


----------



## Eroticus (Nov 7, 2013)

ST.Viper said:


> This remains to be seen. Wait for the non-reference 290/290X, then we can judge. Peace



A CrossFire of 290Xs is already better than SLI 780 Tis, plus the price is lower


----------



## Fourstaff (Nov 7, 2013)

Eroticus said:


> A CrossFire of 290Xs is already better than SLI 780 Tis



So obviously you buy 290x for CF, and 780 for single graphics solution right now


----------



## hardcore_gamer (Nov 7, 2013)

Fourstaff said:


> So obviously you buy 290 for CF, and 780 for single graphics solution right now



Fixed


----------



## beck24 (Nov 7, 2013)

Hilux SSRG said:


> That's a monster OC on a stock cooler. Can't wait to see what Gigabyte and Evga do with their custom cooling solutions. Great review.



Exactly! With a top custom cooler like Windforce it should get another 20% performance while still being cool and quiet. Go MSI, Gigabyte, and EVGA!


----------



## Eroticus (Nov 7, 2013)

Fourstaff said:


> So obviously you buy 290x for CF, and 780 for single graphics solution right now



Nope... NVIDIA lost me; I will wait for Maxwell.


10-15 FPS for $300 is too high a price for me ~.~

I work hard for every penny and am not going to waste $300 on nothing.


----------



## arterius2 (Nov 7, 2013)

Looks like THIS is the CARD I'm getting this Christmas. Shit just got better, boys!


----------



## brandonwh64 (Nov 7, 2013)

W1z, when will you be implementing BF4 to the benchmark lineup?


----------



## hardcore_gamer (Nov 7, 2013)

brandonwh64 said:


> W1z, when will you be implementing BF4 to the benchmark lineup?



..and more importantly,a 4K monitor.


----------



## TRWOV (Nov 7, 2013)

Sasqui said:


> Judging by the fan noise, I'm officially saying "is the way to go"



AMD should just sell the bare boards


----------



## Tonim89 (Nov 7, 2013)

ST.Viper said:


> This remains to be seen. Wait for the non-reference 290/290X, then we can judge. Peace



You are correct, but keep in mind that PowerTune will give you about 100 MHz just by changing the cooler.

I expect those 100 MHz to shift the tide in AMD's direction. That's why I think the GTX 780 Ti should be priced at $600-650, with $699 for the top cooler versions.


----------



## arterius2 (Nov 7, 2013)

In China the 290X is 800 USD while the GTX 780 Ti is 830 USD; it's a no-brainer to get this card.


----------



## Boozad (Nov 7, 2013)

RCoon said:


> May even sell my 780 and buy into a 780ti



I think that's exactly what I'll be doing.


----------



## R0H1T (Nov 7, 2013)

beck24 said:


> Exactly! With a top custom cooler like Windforce it should get another 20% performance while still being cool and quiet. Go MSI, Gigabyte, and EVGA!


This is seriously stretching it. 20% on top of ~30% will not be possible without water or LN2; it's not as if NVIDIA is going to ship binned chips to every one of their buyers!


----------



## W1zzard (Nov 7, 2013)

brandonwh64 said:


> W1z, when will you be implementing BF4 to the benchmark lineup?



next rebench (December). also adding cod ghosts and assassin's creed IV (if it turns out to be usable for benching, ie no fps lock). and kicking out Skyrim



hardcore_gamer said:


> ..and more importantly,a 4K monitor.


If I have to buy one with my own money, probably later next year, once they have new HDMI and come down in price.

If someone sends one to me tomorrow? Day after tomorrow.

Who here is using a 4K monitor? Nobody, AMD is just pushing 4K because their architecture scales slightly better at high resolutions


----------



## Am* (Nov 7, 2013)

Decent GPU... but it sure as hell isn't $150 better than a 290X or $300 better than a 290. Had they made this version $600 and the 6 GB version $700, it'd be a winner, but paying $150 more for a card that's VRAM starved is just ridiculous. I wouldn't settle for anything less than 4 GB at this sort of price, not even at my "low" 1080p resolution, as I've seen BF4 and Crysis go into the 2.5-2.8 GB range on the old Tahitis. 4 GB is what I consider "safe with some headroom left" for future titles at 1080p; for any setup above that I wouldn't recommend going anywhere near a card with less than 4 GB.

And I'm not buying into the BS of the crazy OC this card managed to pull off either; I'm pretty certain NVIDIA sent out their best samples. Everyone and their grandma were shouting about the old Keplers overclocking to the moon, and yet I've had two different ones that could barely stay stable at normal boost clocks. Fair play to you if yours managed it, but I'm still going to take my pessimistic approach when looking at overclocks of review samples.

I think that about settles it: R9 290X/290 with non-reference cooling for me. This lead is nothing that can't be made up with better drivers down the road, IMHO; it's a lead I think is going to close once AMD's Mantle and normal driver updates come into play, and even if they don't, at $700 I want either a 6 GB version or a ton of freebies to make up the difference.

BTW W1z, are you going to add BF4 to your review benchmarks anytime soon? I'd be interested to see what sort of lead this card has over the 290X in this game, if any.

*EDIT* Never mind, missed the ninja'd post above mine.


----------



## arterius2 (Nov 7, 2013)

"I felt a great disturbance in the Force, as if millions of voices of AMD fanboys suddenly cried out in terror and were suddenly silenced. I fear something terrible has happened."

-Obi-wan Kenobi


----------



## hardcore_gamer (Nov 7, 2013)

W1zzard said:


> Nobody, AMD is just pushing 4K because their architecture scales slightly better at high resolutions



It is a bit ironic because their frame pacing driver doesn't work at this resolution.


----------



## Eroticus (Nov 7, 2013)

arterius2 said:


> "I felt a great disturbance in the Force, as if millions of voices of AMD fanboys suddenly cried out in terror and were suddenly silenced. I fear something terrible has happened."
> 
> -Obi-wan Kenobi


In fear of what? Low resolution settings?

Or being overpriced.....?

Limiting the 290X to "quiet" mode, even better -.-


----------



## TRWOV (Nov 7, 2013)

arterius2 said:


> "I felt a great disturbance in the Force, as if millions of voices of AMD fanboys suddenly cried out in terror and were suddenly silenced. I fear something terrible has happened."
> 
> -Obi-wan Kenobi



 Seriously? Oh, noes! 15% more performance for 27% more money. Yeah, AMD fanbois are surely crying. 


Wondering how many will actually make it to the market. I'm pretty sure nVidia would rather sell these as Tesla cards.
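The premium figures being tossed around in this thread are simple percentage deltas over the prices posters quote ($549/$699 launch MSRPs and the 800/830 USD China street prices, taken from the posts above, so treat them as assumptions). A quick sketch of the arithmetic:

```python
def premium(base_price: float, new_price: float) -> float:
    """Percentage premium of new_price over base_price."""
    return (new_price - base_price) / base_price * 100

# 780 Ti over 290X at MSRP (assumed $699 vs $549)
print(round(premium(549, 699)))  # -> 27, TRWOV's "27% more money"
# China street prices (assumed 830 vs 800 USD)
print(round(premium(800, 830)))  # -> 4, arterius2's "about 4%"
```

Which number you get depends entirely on which pair of prices you plug in, which is why both sides of the argument can be right at once.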


----------



## Fluffmeister (Nov 7, 2013)

The 4K results are nice, but with those monitor prices the value argument also goes out of the window.

I'll stick with my 250 quid 1440P Crossover thanks very much.


----------



## Amrael (Nov 7, 2013)

Tonim89 said:


> Amazing card, but I see no point in pricing it at $699. An aftermarket cooler would make the R9 290X faster, and its final price would still be $50-75 lower.
> 
> $699 should be the price for top cooled and overclocked versions of the GTX 780 Ti, like the MSI Lightning or ASUS Matrix.



Totally agree


----------



## Anth0789 (Nov 7, 2013)

Oh my next card probably!


----------



## Big_Vulture (Nov 7, 2013)

Very impressive in the test, but unfortunately these test cards are built with selected chips, and the ones in the retail channel will be slower, warmer, and more power hungry. It would be better to see randomly picked cards from stores. Either way it's too expensive; I'll buy an R9 290 and an Arctic Accelero Xtreme III cooler.


----------



## SIGSEGV (Nov 7, 2013)

arterius2 said:


> "I felt a great disturbance in the Force, as if millions of voices of AMD fanboys suddenly cried out in terror and were suddenly silenced. I fear something terrible has happened."
> 
> -Obi-wan Kenobi



troll attempt failed..


----------



## arterius2 (Nov 7, 2013)

TRWOV said:


> Seriously? Oh, noes! 15% more performance for 27% more money. Yeah, AMD fanbois are surely crying.
> 
> 
> Wondering how many will actually make it to the market. I'm pretty sure nVidia would rather sell these as Tesla cards.



27% more money?? What?

In China the 780 Ti costs 830 USD right now, while the 290X is still selling for 800 USD; that's about a 4% price increase. I think the choice is pretty clear here; I don't want another vacuum cleaner in the house.


----------



## Am* (Nov 7, 2013)

TRWOV said:


> Wondering how many will actually make it to the market. I'm pretty sure nVidia would rather sell these as Tesla cards.



I'm pretty certain they don't have a choice in this -- they've been sitting on GK110 salvages for so long, they must have 14-18 months' worth of scrap Tesla/Quadro GK110s ready to sell in GeForce GPUs just like this one. Judging by the power consumption, these are clearly pro cards that didn't make the cut.


----------



## Casecutter (Nov 7, 2013)

Hey it was shown in the forums yesterday that there’s some significant changes to the PCB and power sections.  Can we get an understanding of those changes?  How did those assist the OC'n performance or efficiency.  I mean if Nvidia really improved the PCB that would certainly made the Uber OC'n crowd see value to drop what they own even with a glaring forfeiture just to grab this.  

How much would you all think a Titan or GTX780 that's been battered is worth today? As a buyer, hard ridden 780's could be $350-400, while Titan $550-600, for those who early adopted that not too bad.  Either owner is looking to have $45-50 hammered off each month if they bought super early, buck-n-half a day... not the worst.   While there's another $100 to add from a Titan to drop toward the new one... the GTX780 owner is needing to ante-up at minimum $300!


----------



## R0H1T (Nov 7, 2013)

arterius2 said:


> 27% more money?? What?
> 
> In China the 780 Ti costs 830 USD right now, while the 290X is still selling for 800 USD; that's about a 4% price increase. I think the choice is pretty clear here; I don't want another vacuum cleaner in the house.


Why are you bringing the PRC price into this? In most countries around the world this card will be 10~50% higher in price than the 290X, so your argument is invalid to begin with :shadedshu


----------



## Eroticus (Nov 7, 2013)

arterius2 said:


> 27% more money?? What?
> 
> In China the 780 Ti costs 830 USD right now, while the 290X is still selling for 800 USD; that's about a 4% price increase. I think the choice is pretty clear here; I don't want another vacuum cleaner in the house.



LMAO... it's just in China!

It's like the Titan's price in Israel, which was about 6K ($1,714).


----------



## arterius2 (Nov 7, 2013)

R0H1T said:


> Why are you bringing the PRC price into this? In most countries around the world this card will be 10~50% higher in price than the 290X, so your argument is invalid to begin with :shadedshu



Because I live there? Why wouldn't I take local prices into consideration when making my purchases? The 290X costs about 800 USD in the UK as well. The only place where the 290/290X is actually selling at its advertised MSRP is Newegg.

There was a thread here where people posted local prices of the 290X around the world, and pretty much all of them were overpriced.


----------



## Eric_Cartman (Nov 7, 2013)

Looks like AMD is really in trouble now.

A year later and GK110 is still beating the crap out of them.

It just leaves more time for nVidia to refine maxwell.

And where are all the people that were saying nVidia's PWM is weaker than AMD's?

Well guess what, it doesn't matter, the 780 Ti overclocks like mad and doesn't throttle to insanely low clocks!

A 290X and a 780 Ti, both overclocked: the 780 Ti will smoke the 290X.


----------



## Eroticus (Nov 7, 2013)

arterius2 said:


> Because I live there? Why wouldn't I take local prices into consideration when making my purchases? The 290X costs about 800 USD in the UK as well. The only place where the 290/290X is actually selling at its advertised MSRP is Newegg.



So it's your problem, not AMD's.

Your government made that price higher, not AMD.


----------



## arterius2 (Nov 7, 2013)

Eroticus said:


> So it's your problem not AMDs
> 
> your government made that price higher in 100% and not AMD.



here you go
http://www.techpowerup.com/forums/showthread.php?t=193142


----------



## Eroticus (Nov 7, 2013)

Eric_Cartman said:


> Looks like AMD is really in trouble now.
> 
> A year later and GK110 is still beating the crap out of them.
> 
> ...



No, it's not. The plastic cooler is not the same as NVIDIA's.


----------



## Eroticus (Nov 7, 2013)

arterius2 said:


> here you go
> http://www.techpowerup.com/forums/showthread.php?t=193142
> 
> not my problem, I'll just buy the better card, which is the 780ti, I make shit load of money to not really give a shit anyways.



Here you go:

http://en.wikipedia.org/wiki/Tax


Go to school first. Thank you.


----------



## Am* (Nov 7, 2013)

arterius2 said:


> *...290x cost about 800USD in UK as well. *



You're wrong. 290X was around £440-£480 (around $700 minimum) with ltd edition cards at £515 max. Some etailers price gouged more than others here, but £440 was the cheapest. This card is £560-£600 so far (around $900 minimum) and we have yet to see the ltd edition's price of this card. 290X pricing was somewhat crappy at launch, but Nvidia took the ultimate piss yet again, now with this 780 Ti GPU.


----------



## Recus (Nov 7, 2013)

ST.Viper said:


> This remains to be seen. Wait for the non-reference 290/290X, then we can judge. Peace



I think non-reference 290/290X cards are a myth. They've launched, and there hasn't been a single leak about a custom model.

Unlike the GTX 780 Ti.



Eroticus said:


> Nope... NVIDIA lost me; I will wait for Maxwell.
> 
> 
> 10-15 FPS for $300 is too high a price for me ~.~
> ...



So you bought a CPU for $560 but begrudge a few pennies for the GTX 780 Ti? Not sure if serious.



Eroticus said:


> In fear before what  ? low resolution settings  ?
> 
> or Over priced ..... ?
> 
> ...



Funny thing, they fear the highest game settings.

Metro: Last Light at high, not very high. Crysis 3 at medium <--- is this even legal? I wouldn't be surprised if someone with an R9 290X is getting 60 fps at 4K on low settings.


----------



## Aithos (Nov 7, 2013)

No one cares about 4K performance. None of you are playing games on 4K monitors. If you think 4K is "the future", then you're probably the same people who bought into 1080p at 4x the cost with ZERO content coming for 5 years. There isn't even a significant gaming population above 1080p, which is why reviewers generally look at 1080p and 1440p or 1600p results, where the 780 Ti smokes the 290X without even taking overclocking into consideration.

You AMD fans need to stop trying to skew results; it's ridiculous. Wait until aftermarket 290X and aftermarket 780 Ti cards are out and reviewed, and then let's see what the 1440p results say. I can tell you this right now: no factory 290X (not on water) is going to beat the 780 Ti, and I highly doubt that even on water any of them will beat a 780 Ti on water. The 290X has major heat issues and major noise issues (SLI 780 Tis are quieter than a 290X in uber mode).

I will gladly pay the $150 to avoid AMD's software, drivers, heat, and noise. When you're talking about two GPUs at $1,100 or $1,400, who cares? If you can afford that system in the first place, then $300 is pretty much nothing when considering the total system cost...


----------



## sanadanosa (Nov 7, 2013)

Eroticus said:


> Another reason : People in your country are liars and trying making money on stupid people.
> 
> That what you said now.



I'm not from PRC but I think your words are very offensive.


----------



## Casecutter (Nov 7, 2013)

arterius2 said:


> why wouldn't I take the local prices into consideration when making my purchases?


While I agree, regional pricing really isn't a fair basis for judging the merits of the hardware this early in a release (NVIDIA or AMD); it's more a conversation about global economies. MSRP is what we can reasonably compare at this point; early market jitters, or whatever creates such discrepancies across regional borders, shouldn't be our focus.

But you're right, you have to live where your life is.


----------



## qubit (Nov 7, 2013)

Am* said:


> And I'm not buying into the BS of the crazy OC this card managed to pull off either; I'm pretty certain NVIDIA sent out their best samples. Everyone and their grandma were shouting about the old Keplers overclocking to the moon, and yet I've had two different ones that could barely stay stable at normal boost clocks. Fair play to you if yours managed it, but I'm still going to take my pessimistic approach when looking at overclocks of review samples.



Couldn't agree more.


----------



## ShurikN (Nov 7, 2013)

The only thing left to see is how 290x works with aftermarket coolers and with no throttling (if it can be removed at all).
AMD said they are waiting for the 780ti benchies to give AIBs the green light anyway...


----------



## Hilux SSRG (Nov 7, 2013)

Kind of sad that NVIDIA rained on the 290X's parade without working too hard. Seems as if they snatched the crown for top single GPU right back. 

This chip should have been released a year ago for far less. At least AMD has the 290 priced pretty well for the specs.


----------



## arterius2 (Nov 7, 2013)

Eroticus said:


> Another reason : People in your country are liars and trying making money on stupid people.
> 
> That what you said now.



lmao I'm a Canadian working overseas, hardly counts as "people in my country" but you should probably be banned for racial slurs anyhow.


----------



## arterius2 (Nov 7, 2013)

Am* said:


> You're wrong. 290X was around £440-£480 (around $700 minimum) with ltd edition cards at £515 max. Some etailers price gouged more than others here, but £440 was the cheapest. This card is £560-£600 so far (around $900 minimum) and we have yet to see the ltd edition's price of this card. 290X pricing was somewhat crappy at launch, but Nvidia took the ultimate piss yet again, now with this 780 Ti GPU.



You didn't count tax; I'm talking about prices after tax.


----------



## R0H1T (Nov 7, 2013)

arterius2 said:


> Because I live there? why wouldn't I take the local prices into consideration when making my purchases? 290x cost about 800USD in UK as well. the only place where the 290/290x is actually selling for their advertised MSRP price is on newegg.
> 
> there was a thread here where people were posting local prices of the 290x around the world, and pretty much all of them are overpriced.


See this ~





Am* said:


> You're wrong. 290X was around £440-£480 (around $700 minimum) with ltd edition cards at £515 max. Some etailers price gouged more than others here, but £440 was the cheapest. This card is £560-£600 so far (around $900 minimum) and we have yet to see the ltd edition's price of this card. 290X pricing was somewhat crappy at launch, but Nvidia took the ultimate piss yet again, now with this 780 Ti GPU.


It's you who seems to be trying to paint your exceptional case as the norm, because for the majority of us (anywhere in the world) Nvidia is more expensive than AMD, period. BTW, you're also conveniently omitting the part where the current prices reflect only a ~5% premium for the 780 Ti over the 290X at your place.


----------



## manofthem (Nov 7, 2013)

Nice beast card, thanks for the review W1zz


----------



## Am* (Nov 7, 2013)

arterius2 said:


> you didn't count tax, im talking about after tax.



I'm also talking about prices after tax, including UK-standard VAT @ 20% for both AMD and Nvidia.


----------



## Eroticus (Nov 7, 2013)

arterius2 said:


> lmao I'm a Canadian working overseas, hardly counts as "people in my country" but you should probably be banned for racial slurs anyhow.



$449 + your country's tax... AMD'S PRICE IS $449! If the price is higher, that means TAX!




Recus said:


> I think non reference 290/290x are myth. When they released and not a single leak about custom model.
> 1: GIFT
> 2: http://www.tomshardware.com/reviews/radeon-r9-290-review-benchmark,3659-19.html
> 
> ...


----------



## Ja.KooLit (Nov 7, 2013)

sanadanosa said:


> I'm not from PRC but I think your words are very offensive.





arterius2 said:


> lmao I'm a Canadian working overseas, hardly counts as "people in my country" but you should probably be banned for racial slurs anyhow.



Agreed...

I'm also not from Korea, just living here...

Korean prices for GPUs, and hardware in general, are insane...

Would you believe the 7970 is still at $500-ish here?

The 780 Ti costs $900 here, and the 290X costs $700.

Should I blame AMD or NVIDIA? Definitely not...

Think about shipping, tax, and so on...

Should I buy from here? Maybe yes; it saves me a lot of shipping cost when RMA'ing...


----------



## arterius2 (Nov 7, 2013)

Am* said:


> I'm also talking about prices after tax, including UK-standard VAT @ 20% for both AMD and Nvidia.



I was just looking at amazon.co.uk to get a sense of a baseline price there.

What I'm seeing is the 290X being listed at £500 at the cheaper end and upwards of £525, which works out to a little over 800 USD (500.00 GBP = 803.605 USD).
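For what it's worth, the conversion quoted above is easy to sanity-check. A minimal sketch, assuming the exchange rate implied by the post's own numbers (803.605 / 500.00 ≈ 1.607 USD per GBP, not an authoritative figure); UK retail prices already include 20% VAT:

```python
# Sanity-check the GBP -> USD conversion quoted above.
# The exchange rate is inferred from the post's own figures,
# not taken from any authoritative source.
GBP_TO_USD = 803.605 / 500.00  # ~1.6072 USD per GBP

def gbp_to_usd(gbp):
    """Convert a UK retail price (VAT already included) to USD."""
    return gbp * GBP_TO_USD

for price in (500.00, 525.00):
    print("GBP %.2f -> USD %.2f" % (price, gbp_to_usd(price)))
```

So the £500-£525 range does work out to roughly $804-$844 at that rate.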


----------



## Eroticus (Nov 7, 2013)

night.fox said:


> agree......
> 
> im also not from korea just living here....
> 
> ...



Hero !


----------



## nem (Nov 7, 2013)

But when Mantle rolls out, what happens then? R9 290 owners are gonna laugh at GTX 780 Ti users.

And where is the BF4 bench? O.O


----------



## Am* (Nov 7, 2013)

arterius2 said:


> I was just looking at amazon.co.uk to get a sense of a baseline price there
> 
> what I'm seeing is 290x being listed at 500 quids being the cheaper end and upwards to 525 quids which works out to a little over 800 USD.



DON'T pay any attention to Amazon UK prices. They are not a computer component retailer and stock very few PC components, if any, and their prices are almost never at RRP (always above) unless it's an old product with too much supply. Also bear in mind that any prices set by third-party sellers include a 15% markup (I think, if I remember correctly) due to Amazon's charges for selling on their site.

Our standard UK retailers for PC components are the likes of scan, aria, ocuk, dabs, ebuyer, novatech etc. My prices are based on all of the above.


----------



## hardcore_gamer (Nov 7, 2013)

Aithos said:


> No one cares about 4k performance. None of you are playing games on 4k monitors. If you think 4k is "the future" then you're probably the same people who bought into 1080p for 4x the cost with ZERO content coming for 5 years.



You don't have to be an early 4K buyer to think that 4K is the future. Like it or not, it is the future. Being a PC enthusiast is all about getting the latest and greatest, and they won't be disappointed when the price comes down after some time. The same applies to those who bought the Titan for $1000.

The comment about "ZERO" content is not true. Games can easily scale the resolution.


----------



## Hilux SSRG (Nov 7, 2013)

nem said:


> but went Mantle roll out what happen Ati users with R9 290pro gona a laugh of GTX780TI users



But what will happen if Nvidia implements Mantle too? I think there was talk about it being an open API, but I'm not sure.

Also, I wouldn't recommend buying for "future features", because who knows if they will turn out better or worse?


----------



## arterius2 (Nov 7, 2013)

nem said:


> but went Mantle roll out what happen Ati users with R9 290pro gona a laugh of GTX780TI users



I doubt there's much to laugh about when they already got months of use out of it.

If you are always afraid that something down the road is going to devalue whatever you buy now you will never buy anything.


----------



## Am* (Nov 7, 2013)

Hilux SSRG said:


> But what will happen if Nvidia implements Mantle too?  I think there was talk about it being open API but I'm not sure.



People seem to misunderstand what "open" or "free to use" means when AMD or any other company states this. "Open" or "free to implement" for developers DOES NOT mean free to use by competing hardware manufacturers. AMD own the technology, so Nvidia would have to pay them a license fee to use their Mantle tech. The same way mp3 is a FREE to use music format for anyone recording or making music, but anyone manufacturing an mp3 player has to pay licensing fees to the likes of Thomson for using chips capable of playing back mp3 files.


----------



## R0H1T (Nov 7, 2013)

Hilux SSRG said:


> But what will happen if Nvidia implements Mantle too?  I think there was talk about it being open API but I'm not sure.
> 
> Also wouldn't recommend buying for "future features" bc who knows if it will be better or worse?


There is absolutely zero indication from anywhere that Mantle will be open, especially to a competitor like Nvidia. So no, AMD is gonna ride the Mantle wave as long as they can deliver better results with it and leverage their console monopoly alongside it.


----------



## PopcornMachine (Nov 7, 2013)

Not crazy about the power usage and heat, but the $400 R9 290 is still the card to get.

Not paying 75% more for just a 13% performance improvement. The 290 will do me just fine.


----------



## Eroticus (Nov 7, 2013)

PopcornMachine said:


> Not crazy about the power usage and heat, but the $400 R9 290 is still the card to get.
> 
> Not paying %75 more for just 13% performance improvement.  290 will do me just fine.



Heat & Noise already solved.

http://www.tomshardware.com/reviews/radeon-r9-290-review-benchmark,3659-19.html


----------



## msamelis (Nov 7, 2013)

Naito said:


> We can tell. That GTX 480 is looking awfully worn...


That's affirmative sir


----------



## HTC (Nov 7, 2013)

I was indeed right: AMD shot themselves in the foot with a cannonball when they decided on that crappy cooler.

By using such a performance-hampering cooler, they left themselves open to a "counter attack", which just happened today.



Congratulations, nVidia, on the fastest single-GPU card. Not only is it faster, it's much less noisy too.

Too bad the price sucks: at $700, it's steep. I could see this card selling a lot more at $600, and that price would make it more attractive than the reference R9 290X because of its performance/noise combo. At $700, either purchasing an aftermarket cooler or waiting for non-reference AMD cards should prove the better option, IMO.

The huge overclocking headroom does seem fishy, but it could be genuine: there isn't enough data for a more definitive opinion on that.


This card throttles a lot less than AMD's R9 cards, but it still throttles sometimes: I would like to know the *efficiency of the throttling technology from both camps*.

Maybe W1zzard can help with that.


----------



## nem (Nov 7, 2013)

Hilux SSRG said:


> But what will happen if Nvidia implements Mantle too?  I think there was talk about it being open API but I'm not sure.
> 
> Also wouldn't recommend buying for "future features" bc who knows if it will be better or worse?



I was thinking the same: maybe NVIDIA could resurrect Glide in response, but then I realized it's impossible, or at least that NVIDIA would struggle to make it more efficient than Mantle. To understand why Mantle emerged, note that Mantle would not exist if AMD were not present in all the consoles; whatever NVIDIA says about consoles not being worthwhile business, underestimating them may turn out to be their worst mistake.

Mantle is a close-to-the-metal, console-style optimization API, which means it cannot outperform the consoles' native APIs. Microsoft has its own DX derivative that is more efficient than Mantle, and the same goes for the PS4's API, which is also more efficient than Mantle. Logically, then, Mantle is a common denominator of these console-specific APIs (GCN): it took only what they have in common and left out the redundant parts. As I see it, Mantle's emergence is a consequence of the consoles being GCN, because most developers make their games first for consoles (GCN) and then port them to PC. Since they are working with the same architecture on console and PC, a port with near-console-level optimization becomes feasible: the low-level hardware instructions are the same, GCN on consoles and GCN on PC. But understand that if GCN were not in the consoles, Mantle would not be doable.

Mantle itself could not be on consoles; it's only logical that each company works with its own API. What I mean is that each console has its own console-optimized API, so Mantle at that level cannot be on the consoles.

PS: Don't blame me, blame Google Translate.


----------



## PopcornMachine (Nov 7, 2013)

Eroticus said:


> Heat & Noise already solved.
> 
> http://www.tomshardware.com/reviews/radeon-r9-290-review-benchmark,3659-19.html



Right, well, I'm getting a water block.



HTC said:


> I was indeed right: AMD shot themselves in the foot with a cannon ball when they decided on that crappy cooler.
> 
> By using such a performance hampering cooler, they left themselves open to a "counter attack", which just happened today.



I was going to put a water block on any card I got.  So not a big deal for me.

I disagree they shot themselves in the foot by a long way.  Just wait for aftermarket coolers, or do like I'm doing.

I just ordered a 290, so I think they missed their foot. 

...


----------



## the54thvoid (Nov 7, 2013)

I'm digging the blatant hypocrisy...

As someone who sold his children to work in a Foxconn silicon mining factory for 25 hours/day, 8 days a week so I could buy a Titan, I'm amazed at how folk use the cost argument and then say how good it is on 4K monitors which currently are extortionate (as such). 

If you're budget minded, you're probably not thinking about 4K.  

So let's ignore the price argument if we're talking 4K.  

All that being said, it's not *worth* the premium over the 290X but it's what Nvidia can charge for having the fastest GPU - it's what they do, like it or not.

I aint shifting from my Titan anyways, not till my kids have worked the mines for another 30 years.


----------



## HTC (Nov 7, 2013)

@W1zzard: any chance you could show this card's version of this chart?







Would like a better sense of the throttling amount this card has, if any.


----------



## Am* (Nov 7, 2013)

the54thvoid said:


> I'm digging the blatant hypocrisy...
> 
> As someone who sold his children to work in a Foxconn silicon mining factory for 25 hours/day, 8 days a week so I could buy a Titan, I'm amazed at how folk use the cost argument and then say how good it is on 4K monitors which currently are extortionate (as such).
> 
> ...



Nice post there, good sir. Gave me a good chuckle.  



Just wanted to point out, though, that 4K, although a moot point now, is relevant to me because I tend to keep my cards for a while (I went back to rolling on my old 460 after I RMA'd my last card). I've gone through several monitors despite mostly staying on my ancient card, with the exception of some stopgap cards that didn't last (680, 660). This is why I still look at 4K benches. If I buy a card now, it'll be for at least 3 years, and by then 4K will more than likely be the norm.


----------



## rawad (Nov 7, 2013)

Interesting,  on guru3d I see they managed to overclock it 10%.


----------



## Pandora's Box (Nov 7, 2013)

Just used my EVGA Step-Up on one of my 780s for a 780 Ti, and ordered a second 780 Ti from Amazon. Should be here Saturday; can't wait. At the end of the day this will cost me around $300.


----------



## Bjorn_Of_Iceland (Nov 7, 2013)

So when will AMD be releasing their overclocked R290x? Because.. I am quite excited over an overclocked graphics card.


----------



## nem (Nov 7, 2013)

HTC said:


> @W1zzard: any chance you could show this card version of this chart?
> 
> http://tpucdn.com/reviews/AMD/R9_290X/images/analysis_quiet.gif
> 
> Would like a better sense of the throttling amount this card has, if any.



All this can be solved with custom cooling, and the throttling only happens in quiet mode; in uber mode the R9 290X's performance doesn't fall at all, even if it is louder.


----------



## HTC (Nov 7, 2013)

nem said:


> All this has a solution customized the systems of dissipation, and also that only happens in quiet mode while in uber mode performance  R9 290x not fall for anything even be louder



I think you misunderstood: I just want a graph like the one I posted, but for this card.

The graph I posted shows severe throttling. I know this card doesn't throttle anywhere near as much, but I still want to know how much.


----------



## the54thvoid (Nov 7, 2013)

Am* said:


> Nice post there, good sir. Gave me a good chuckle.
> 
> 
> 
> Just wanted to point out though that, 4K, although a moot point now, is relevant to me because I tend to keep my cards for a while (went back to rolling on my old 460 after I RMA'd my last card). I've went through several monitors, despite mostly staying on my ancient card, with the exception of some stopgap cards that didn't last (680, 660). This is why I still look at 4K benches. If I buy a card now, it'll be for at least 3 years and by then 4K will more than likely be the norm.



Can't disagree. The 290X is the better choice with 4GB. But in a couple of years, what happens if the memory requirement for 4K with high-res textures goes above 4GB? The 290X should hold out longer, but eventually all of today's great tech will start to suffer, be it red, green, or blue.


----------



## ST.Viper (Nov 7, 2013)

Recus said:


> I think non reference 290/290x are myth.



Well... the question is not if they'll manufacture them, but when. Correct me if I'm wrong, but I don't remember any high-end/mainstream GPU from recent history that wasn't also offered in custom designs (perhaps the Titan and the 690, but other than that, I dunno).


----------



## Hayder_Master (Nov 7, 2013)

This is what the nvidia 700 series should have been.


----------



## MxPhenom 216 (Nov 7, 2013)

hardcore_gamer said:


> You don't have to be an early 4k buyer to think that 4k is the future. Like it or not, It is the future. Being a PC enthusiast is all about getting the greatest and latest. They won't get disappointed when the price comes down after some time. Same applies for those bought Titan for $1000.
> 
> The comment about "ZERO" content is not true. Games can easily scale the resolution.



I sure as hell don't care about 4K performance right now, as I don't have a monitor that does 4K, nor will the GPUs out today be able to run 4K at frame rates I like unless I run CF or SLI. And at that price, ~$6000 to play a game today, I'd rather spend that money on more worthwhile things.

Not to say 4K isn't the future, but right now it's useless for most people. Once 4K gets a bit more popular and affordable, these current GPUs will be a thing of the past...


----------



## 15th Warlock (Nov 7, 2013)

Whoa! The king is dead, long live the king!

What an exciting couple of weeks these have been; I don't remember the two teams trading punches like this in many years. Now the ball is in AMD's court. What's next, an R290XTX?

Too much for me; I'm done for this gen and will wait to see what Maxwell brings to the table next year. Too many changes are happening too fast. At least Titan was king for 8 months; the 290X held the crown for less than a month, but it remains undefeated in price-to-performance ratio, although I guess the 290 is a better choice...

I bet AMD is working on an XTX version of Hawaii with a more efficient cooler, and I bet they'll charge a premium for it too. This is insane!


----------



## nullington (Nov 7, 2013)

> Arguably one of the most anticipated online shooters of recent times, Battlefield 3 is the latest addition to some of the most engaging online multiplayer shooter franchises.


----------



## Tonim89 (Nov 7, 2013)

4K is the future, not the present. When 4K gaming becomes an accessible reality, both the 290X and the 780 Ti will struggle to run anything.

When 1080p monitors debuted in reviews and benchmarks, we had cards like the 7900 GTX available. Nowadays they are totally irrelevant and performance-wise inferior to any $70 card.


----------



## Am* (Nov 7, 2013)

15th Warlock said:


> I bet AMD is working on an XTX version of Hawaii using a more efficient cooler, and I bet they'll charge a premium for it too, this is insane!



I don't think they'll do anything that major for another year -- this was only supposed to be a stopgap generation until the 20nm node comes into full effect next year. Anyway, all AMD has to do now is release the k̶r̶a̶k̶e̶n̶ AIBs with non-reference coolers to even out the playing field once again.


----------



## MxPhenom 216 (Nov 7, 2013)

Tonim89 said:


> 4K is the future, not the present. When 4k gaming becomes an accessible reality, both *290x and 780 Ti* will suffer o run anything.
> 
> When 1080p monitors debuted in reviews and benchmarks, we had available cards like 7900GTX. Nowadays they are totally irrelevant and performance-wise inferior than any 70$ card.



Like I said, they will be long gone and forgotten by that time.


----------



## Zubasa (Nov 7, 2013)

arterius2 said:


> in China 290x is 800USD, while gtx780ti card is 830USD, its a no brainer to get this card.


I don't know where in China you live, but most 290X cards are around $700,
and some go as low as 3900 RMB, around $600
http://item.taobao.com/item.htm?spm=a230r.1.14.75.ihyco5&id=35690597422


----------



## Tatty_One (Nov 7, 2013)

It strikes me that some of you clearly do not know how to:

1.  Debate or disagree with any degree of finesse or maturity.
2.  Be open-minded enough to argue a good, honest, and unbiased case without a clouded fanboy mindset.
3.  In some cases, show common decency to your fellow community members.

Really? Damn, my granddaughter would put one or two of you to shame, and she is 6! Only once have I ever found it appropriate to wade into a thread and hand out multiple infractions, and oddly enough that was over a graphics card comparison. But let me make one thing clear: if some of you cannot disagree without flaming and insults, there will be a second time, and it's coming soon.

On a more cheerful note... I will crawl back into my armchair and stop the rant!


----------



## xenocide (Nov 7, 2013)

MxPhenom 216 said:


> Like I said, they will be long gone and forgotten by that time.



I would be surprised if 4K became accessible within about 10 years...


----------



## Zubasa (Nov 7, 2013)

xenocide said:


> I would be surprised if before about 10 years 4K were accessible...


At this rate, maybe when we all become grandpas 
Monitor manufacturers are more than happy to milk 1080p monitors for all eternity.


----------



## ST.Viper (Nov 7, 2013)

Tonim89 said:


> When 1080p monitors debuted in reviews and benchmarks, we had available cards like 7900GTX. Nowadays they are totally irrelevant and performance-wise inferior than any 70$ card.

So true... I hope that one day, when 4K monitors become mainstream, they also make them in 22-24" sizes, because 32" is way too big to sit just 30cm away from your head.


----------



## Big_Vulture (Nov 7, 2013)

Pandora's Box said:


> Just used my evga step-up for one of my 780's to a 780ti. Ordered a second 780ti from amazon. should be here saturday. can't wait. end of the day this will cost me around $300



How much are you going to sell your 780s for, maybe around $300 each? I feel very sorry for Nvidia users; they were cheated out of so much money and don't even realize it. The R9 290 for $400 is the way to go, or two of them, leaving the 780 Ti dead in the water.


----------



## W1zzard (Nov 7, 2013)

Tatty_One said:


> It strikes me that some of you clearly do not know how to:
> 
> 1.  Debate or disagree with any degree of finesse or maturity.
> 2.  Be open minded enough to argue a good honest and unbiased case without clouded fanboyist minds
> ...



earlier today:


----------



## ST.Viper (Nov 7, 2013)

W1zzard said:


> earlier today: http://img.techpowerup.org/131107/Capture2954.jpg



lol


----------



## Tatty_One (Nov 7, 2013)

W1zzard said:


> earlier today: http://img.techpowerup.org/131107/Capture2954.jpg



Of course...... I expected no less, I am rarely disappointed!


----------



## Tatty_One (Nov 7, 2013)

the54thvoid said:


> 3 1/2 years ago I got a 5 point infraction for saying the quoted text below (please don't give me one for quoting myself!)
> 
> 
> 
> ...



I became a Mod the month after (April 2010) otherwise I would have given you 10 points!    Now deleting your post as it's off topic


----------



## HumanSmoke (Nov 7, 2013)

Am* said:


> Just wanted to point out though that, 4K, although a moot point now, is relevant to me because I tend to keep my cards for a while (went back to rolling on my old 460 after I RMA'd my last card). I've went through several monitors, despite mostly staying on my ancient card, with the exception of some stopgap cards that didn't last (680, 660). This is why I still look at 4K benches. If I buy a card now, it'll be for at least 3 years and by then 4K will more than likely be the norm.


That raises the questions:
1. What is the likelihood that games in three years will have outgrown the performance of a card bought today, whether through image-quality enhancements or changes to the API specs?
You already have designed obsolescence in effect for cards currently on store shelves regarding optional features (VLIW4 and VLIW5 cards w/ Mantle and TrueAudio), so how long before AMD and Nvidia put further distance between their offerings and Intel by leveraging post-shader compute functionality as virtually a core component of gaming rather than an optional extra?

2. What are the chances that, when 4K tends towards the norm (by your calculation), a second or third [insert current card preference] could be had for relative peanuts?
Seen the prices on three-year-old cards lately?


----------



## Casecutter (Nov 7, 2013)

HTC said:


> By using such a performance hampering cooler, they left themselves open to a "counter attack", which just happened today.


I don't know if there's much difference in the actual abilities of either vapor-chamber design, although I've yet to see them laid out bare, next to each other, for a clear comparison. From some random pictures it appears AMD has more densely packed fins, which would be more restrictive, but it's hard to say with certainty whether that matters. The 290X card is about a half-inch longer, so perhaps AMD got a longer vapor chamber under there. Honestly, there's only so much a vapor chamber constrained to that size envelope will permit.

The one difference I notice is the fan and blade design: AMD looks to hold to the tried-and-true curve, while Nvidia has a slight kink, for lack of a better term, in the leading edge before curving like the blades I've come to know. Could Nvidia have locked up a fan supplier holding a patent on that blade and fan design, leaving AMD stuck with the old, traditionally noisy blower?

Then there's the thermal load AMD has to deal with through roughly 30% less die surface area. And while Kepler has shown good efficiency, I think some of the better wattage and thermal numbers come from higher-binned chips (B1 stepping) that were almost certainly originally destined for Tesla professional products. The new power-balancing feature, which W1zzard skipped over, is probably also assisting a lot with efficiency and overclocking.

It's really the last two things that make the GTX 780 Ti epic, and they have marginalized the Titan and the 780 to a lesser level than they ever should have reached. The GTX 780 Ti is a professional-grade gaming product, not a regular enthusiast one.


----------



## Boilerhog (Nov 7, 2013)

Aithos said:


> I will gladly pay the $150 dollars to avoid AMDs software, drivers, the heat and noise. When you're talking about two GPUs at $1100 or $1400 who cares? If you can afford that system in the first place then $300 is pretty much nothing when considering the total system cost...



My e-peen says I have to spend the extra 300 bucks. FTW!

Just wondering how many people actually have a 2560x1600 display. I have one, and I know a few reviewers do, but most are still on 1080p. When I bought it I had 2x 512MB BFG Tech 6800 Ultras, and they barely ran anything at 2560x1600. Same scene with 4K, methinks: we'll end up doing 2560x1600 on our 4K displays because the hardware can't manage it. By the time Battlefield 7 arrives at 4K resolution, 290X CF and 780 Ti SLI will be like my 512MB 6800 Ultra SLI... OLD!!!!


----------



## erocker (Nov 7, 2013)

I can't think of any console port that needs this card. Not for that price anyways.


----------



## HR_The_Butcher (Nov 7, 2013)

I don't know if this has been discussed, but it would be nice to include some BF4 benchmarks.

Or at least change the description for BF3 as it's no longer "Arguably one of the most anticipated online shooters of recent times"


----------



## harry90 (Nov 7, 2013)

Aithos said:


> No one cares about 4k performance.  None of you are playing games on 4k monitors.  If you think 4k is "the future" then you're probably the same people who bought into 1080p for 4x the cost with ZERO content coming for 5 years.  There isn't even a significant gaming population above 1080p, which is why reviewers generally look at 1080p and 1440p or 1600p results, where the 780ti smokes the 290x without taking overclocking into consideration.
> 
> You AMD fans need to stop trying to skew results, it's ridiculous.  Wait until aftermarket 290x and aftermarket 780ti are out and reviewed and then let's see what the 1440p results say.  I can tell you this right now:  No factory card (not on water) from the 290x line is going to beat the 780ti and I highly doubt that even on water any of them will beat a 780ti on water.  The 290x has major heat issues, major noise issues (SLI 708ti is quieter than 290x uber).
> 
> I will gladly pay the $150 dollars to avoid AMDs software, drivers, the heat and noise.  When you're talking about two GPUs at $1100 or $1400 who cares?  If you can afford that system in the first place then $300 is pretty much nothing when considering the total system cost...



Do you understand that the GPU doesn't make any noise by itself? It's the cooler that causes all that noise. Also, Tom's Hardware put an Accelero III on the R9 290: the maximum temperature was 65°C and it didn't throttle at all. They overclocked it to 1150MHz without changing the voltage!!! It gained 5.5FPS over stock in Metro: Last Light. So a 290X with an aftermarket cooler will equal the 780 Ti. Also check this from HardOCP.


----------



## TRWOV (Nov 7, 2013)

Eroticus said:


> Heat & Noise already solved.
> 
> http://www.tomshardware.com/reviews/radeon-r9-290-review-benchmark,3659-19.html



AMD should just sell the bare boards at this point. 

That being said, I really dig the reference cooler's look with the red highlights. I wonder if it would fit onto a 7970 (unlikely)


----------



## ST.Viper (Nov 7, 2013)

HR_The_Butcher said:


> I don't know if this has been discussed, but it would be nice to include some BF4 benchmarks.
> 
> Or at least change the description for BF3 as it's no longer "Arguably one of the most anticipated online shooters of recent times"



W1zzard mentioned that he'll add BF4 along with other games next month.


----------



## erocker (Nov 7, 2013)

TRWOV said:


> AMd should just sell the bare boards at this point.
> 
> That being said, I really dig the reference cooler's look with the red highlights. I wonder if it would fit onto a 7970  (unlikely)



A 280x cooler should..maybe. Not sure if the VRM section would line up "perfectly".


----------



## PopcornMachine (Nov 7, 2013)

Aithos said:


> No one cares about 4k performance.  None of you are playing games on 4k monitors.  If you think 4k is "the future" then you're probably the same people who bought into 1080p for 4x the cost with ZERO content coming for 5 years.  There isn't even a significant gaming population above 1080p, which is why reviewers generally look at 1080p and 1440p or 1600p results, where the 780ti smokes the 290x without taking overclocking into consideration.
> 
> You AMD fans need to stop trying to skew results, it's ridiculous.  Wait until aftermarket 290x and aftermarket 780ti are out and reviewed and then let's see what the 1440p results say.  I can tell you this right now:  No factory card (not on water) from the 290x line is going to beat the 780ti and I highly doubt that even on water any of them will beat a 780ti on water.  The 290x has major heat issues, major noise issues (SLI 708ti is quieter than 290x uber).
> 
> I will gladly pay the $150 dollars to avoid AMDs software, drivers, the heat and noise.  When you're talking about two GPUs at $1100 or $1400 who cares?  If you can afford that system in the first place then $300 is pretty much nothing when considering the total system cost...



I agree with you on 4K. Worthless to even bring up right now; 1440p is what many people can afford.

But talk about skewing things: maybe $300 is nothing to you, but that is skewed financial sense to me.

Particularly when W1zzard's performance numbers at 2560x1600 show the 780 Ti at 100% and the 290 (non-X) at 87%.

Again, maybe 13% more performance is worth 75% more in cost to you, but I would say that is skewed reasoning.
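To make the arithmetic explicit: the 13% figure is the 290's deficit relative to the 780 Ti; expressed the other way, the 780 Ti's lead over the 290 works out closer to 15%. A quick sketch using only the prices and summary percentages quoted above ($700 vs. $400, 100% vs. 87%):

```python
# Price/performance arithmetic from the figures quoted above.
price_780ti, price_290 = 700.0, 400.0  # approximate street prices in USD
perf_780ti, perf_290 = 100.0, 87.0     # relative performance at 2560x1600

price_premium = (price_780ti / price_290 - 1) * 100  # 780 Ti costs 75% more
perf_deficit = (1 - perf_290 / perf_780ti) * 100     # 290 is 13% slower
perf_lead = (perf_780ti / perf_290 - 1) * 100        # 780 Ti is ~14.9% faster

print("price premium: %.0f%%" % price_premium)
print("290 deficit: %.0f%%, 780 Ti lead: %.1f%%" % (perf_deficit, perf_lead))
```

Either way you slice it, the price premium dwarfs the performance gap.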

...


----------



## swagnuggets123 (Nov 7, 2013)

ST.Viper said:


> It was never intended to do so. IMHO it is great card, considering it has more transistors than competitor card yet it produce less heat that is impressive. If nothing else it may force AMD to starts thinking about bundle more games with R9 series graphics cards so it will be win win situation for us customers.


But the NVIDIA chips are also on a much bigger die.


----------



## Fluffmeister (Nov 7, 2013)

PopcornMachine said:


> I agree with you on 4K.  Worthless to even bring up right now.  1440p is what many people can afford.
> 
> But talk about skewing things.  Maybe $300 is nothing to you, but that is skewed financial sense to me.
> 
> ...



The problem is people have been bringing up 4K benchmarks in order to show the benefits of the 290(x) over the 780 Ti.

Kinda flies in the face of their "good value" argument.


----------



## qubit (Nov 7, 2013)

Is there any game that can max out 2GB of RAM at 1080p?

Just wondering, because the 690 can now be had for around the same price as a 780 Ti in some places and it's generally faster, if you don't mind SLI (I don't). However, it's only got 2GB of usable RAM (2x2 config), whereas the 780 Ti has 3GB. Obviously, if the RAM maxes out, performance will tank, or the game will crash if badly written, so this is to be avoided at all costs.

By the time 4K performance actually matters, with affordable single-panel 4K monitors, all of today's cards will be obsolete anyway, hence I don't care about 4K performance (sorry AMD).


----------



## PopcornMachine (Nov 7, 2013)

Fluffmeister said:


> The problem is people have been bringing up 4K benchmarks in order to show the benefits of the 290(x) over the 780 Ti.
> 
> Kinda flies in the face of their "good value" argument.



I'm sure with their 512-bit bus they will do fine at higher resolutions.

Personally, I just think that talk about 4K is still very premature.

In making their decision, people should stick to what matters now and realize that $300 is a lot of money and they are not using 4K.

I find people bringing up unimportant points for the sake of arguing pointless.

...


----------



## ST.Viper (Nov 7, 2013)

erocker said:


> A 280x cooler should..maybe. Not sure if the VRM section would line up "perfectly".



What do you say, would it fit?

1st Gigabyte WindForce 280X, 2nd Asus DirectCU 280X, 3rd 290X


----------



## erocker (Nov 7, 2013)

ST.Viper said:


> What you say, would it fit?
> 
> 1st gigabyte windforce 280x, 2nd asus directcu 280x, 3rd 290x



I don't think it would work with any of those as examples.

This MSI 280x though: http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming/images/front.jpg

Looks pretty spot-on to a 7970.


----------



## Serpent of Darkness (Nov 7, 2013)

HTC said:


> @W1zzard: any chance you could show this card version of this chart?
> 
> http://tpucdn.com/reviews/AMD/R9_290X/images/analysis_quiet.gif
> 
> Would like a better sense of the throttling amount this card has, if any.



Has anybody ever done this test or review with Ultra Low Power State (ULPS) set to 0 in the registry?  Personally, I think the down-throttling is simply the card not needing to push higher core frequencies or GPU loads to do the same amount of work.

In CrossFireX, in a lot of current games, the first GPU won't push full GPU load or core frequency because it isn't necessary to produce the output needed to finish drawing frames.  In other, more intensive games, both GPUs will push 100% at full stock frequencies, at 95°C tops.

Disabling ULPS would probably prove whether people are making a big deal out of the throttling anomalies, or whether there's merit to back up the claims.  I'm leaning towards the possibility that it's just NVIDIA consumers blowing something simple and insignificant out of proportion...

About the GTX 780 Ti: nice card, but in some ways I feel as if NVIDIA kicked its customers in the balls after they bought the GTX Titan for $1,069-$1,099 with only part of its 2,880 CUDA cores enabled, and the K6000 with a whopping price tag of almost $4,000 for 6 GB more VRAM, 64-bit floating-point precision, and the small additions that come with workstation cards.

The R9 290X still has a higher maximum power consumption, over 300 W, but the GTX 780 Ti was only about 10 W away at full load.  Max temperatures are less than 10°C apart: the R9 290X sits at 95°C under full load, the GTX 780 Ti at 89°C?...  I only took a glimpse at the numbers.  For games optimized for AMD there's only a 2-7 FPS difference, and for games optimized for NVIDIA a 10-20 FPS difference.  Still, the GTX 780 Ti is supposed to be a theoretical 15% performance increase, yet the GTX Titan inches closer to it at resolutions above 1600p.

For the $699.99 MSRP, I'm thinking more like $749.99 on Newegg, because they need to make a profit; that's probably what you're looking at paying on release day.  I remember when the GTX 680 first hit the shelves, Newegg jacked the price up from $599.99 towards $699.99, or somewhere in between.  Also, to take into account: with that price tag, NVIDIA users are getting full DX11.2 support.  Yeah... if you look at the nitty-gritty, NVIDIA consumers aren't getting a whole lot more back.  Just a GTX 780 refresh with the full complement of Titan cores and some additional perks, like the frame-time variance graphs on the GTX 780 Ti: the curve band looks narrower, and the minimum extreme is a little lower than the R9 290X's in a single-card setup.  This is something we should have seen back with the GTX 680, 690, Titan, and 780.  Lower frame times equate to higher FPS, and a narrow frame-time band equates to less deviation or stalling.

AMD is now better at multi-GPU setups, because scaling with the new PCIe-based CrossFireX is roughly 1.8-2.0x the FPS.  NVIDIA is now, again, the better single-card/GPU solution.  It seems like AMD and NVIDIA are playing musical chairs with these two factors.

One other thing: the GTX 780 Ti OCs past 1,100 MHz.  The Asus ROG Ares II has a turbo clock of 1,100 MHz as a dual-GPU solution, and it can actually OC past 1,200 MHz core / 1,750 MHz memory with an ASIC quality of 71%...  I expected more from the GTX 780 Ti.  I'll be expecting more from the R9 290X with a better cooling solution, going past the 1,250-1,300 MHz mark with a higher power envelope than its competitors.

@btarunr and W1zzard,

You should provide screenshots of GPU-Z for your testing setup.  The reason is this: you don't really state it in your write-ups, but a lot of readers are under the assumption that you're testing at PCIe 3.0 x16, and some other sites don't.  There won't be a difference between PCIe 3.0 x16 and PCIe 2.0 x16 except for the bandwidth, but for your readers I think you should add the screenshot to show which graphics card is in use and that it's running on that PCIe interface in the test.  Just something minor to consider.  The only site that shows a GPU-Z screenshot during its benches is Legitreviews.com.


----------



## radrok (Nov 7, 2013)

Serpent of Darkness said:


> One other thing. GTX 780 Ti OC's past 1100 Mhz. Asus ROG Ares II has a turbo clock of 1100 Mhz, dual GPU solution, and it can actually OC past 1200 Mhz Core, 1750 Mhz Mem with an ASCI Quality of 71%... I've expected more from GTX 780 Ti. I'll be expecting more from RX9-290x with a better cooling solution to go past the 1250 Mhz to 1300 Mhz mark with a higher power envelope then it's competitors.



Usually w1z does not increase the voltage in his reviews.

Be sure that this card, provided you get a good bin like the sample he reviewed, will hit 1400 MHz with 1.35v on the core.

But that's just trivial because you are comparing just frequencies without taking architectures and core count into the equation.

On a side note, the 780 Ti 3GB just came up here at our retailers. I'm kinda on the fence about waiting for the 6GB models; I'd only use more than 3GB with Skyrim, TBH.


----------



## EarthDog (Nov 7, 2013)

radrok said:


> But that's just trivial because you are comparing just frequencies without taking architectures and core count into the equation.




All matching up the clock speeds does is show the differences in architectural efficiency.

Legit does that (GPUz screen)? http://www.legitreviews.com/evga-geforce-gtx-780-superclocked-acx-cooling-video-card-review_2206/5

I don't see a GPUz ss anywhere...


----------



## W1zzard (Nov 7, 2013)

Serpent of Darkness said:


> You should provide screenshots of GPU-Z for your testing setup. You don't really state it in your write-ups, but a lot of readers are under the assumption that you're testing at PCIe 3.0 x16, and some other sites don't. There won't be a difference between PCIe 3.0 x16 and PCIe 2.0 x16 except for the bandwidth, but for your readers I think you should add the screenshot to show which graphics card is in use and that it's running on that PCIe interface in the test. Just something minor to consider. The only site that shows a GPU-Z screenshot during its benches is Legitreviews.com.



the oc page shows a gpuz screenshot, and we do test at x16 3.0, of course


----------



## Fluffmeister (Nov 7, 2013)

PopcornMachine said:


> I'm sure with their 512-bit bus they will do fine at higher resolutions.
> 
> Personally just think that talk about 4k is still very premature.
> 
> ...



Oh I absolutely agree, I'm just a little bemused that on one hand people argue $300 savings, whilst using benchmarks done on $2800+ monitors to prove their point.


----------



## nem (Nov 7, 2013)

Point #1
NVIDIA shitting in its customers' faces yet again... I don't know.
If NVIDIA had actually followed through, it could have launched
big Kepler as the Titan with its full 2880 CUDA cores unlocked,
but it didn't, so as not to compete with the (fully unlocked)
Quadro K6000. It could have done that for the people who bought
the Titan as the best it had to offer. Disappointing... pfff ¬ ¬


Point #2
Kepler won't improve much, but the 290X's performance will go up with better drivers, so it might end up a technical tie.

Under load, consumption of the 290X and 780 Ti is almost equal:

TechPowerUp .com/reviews/NVIDIA/GeForce_GTX_780_Ti/25.html


Point #3
Still no actual stock... they haven't even started selling them. Not on launch day, and still nothing beyond the demo showing XD

noticias3d .com/articulo.asp?idarticulo=1873&pag=30


Point #4
Mantle has yet to show what happens if a simple R9 290 Pro gives the GTX 780 Ti a thrashing.

Point #5
Kepler is CPU-dependent, so if you don't have an i7-4770, you'd better forget the FPS you see in the reviews XD


----------



## arterius2 (Nov 7, 2013)

Fluffmeister said:


> Oh I absolutely agree, I'm just a little bemused that on one hand people argue $300 savings, whilst using benchmarks done on $2800+ monitors to prove their point.



It's pretty obvious that they come here not to actually argue a sensible point, but to criticize by any means necessary, whether it actually makes sense or not.


----------



## radrok (Nov 7, 2013)

Fluffmeister said:


> Oh I absolutely agree, I'm just a little bemused that on one hand people argue $300 savings, whilst using benchmarks done on $2800+ monitors to prove their point.



Also, taking the price tag out of the equation, 4K just isn't worth it as a purchase yet.

The current monitors on offer suck overall; I would never pick up the ASUS monitor, not even for €1K, considering what it offers.

Go take a look at the AnandTech review: it doesn't deliver on colors (I admit I'm a bit picky on the subject) or responsiveness (that much latency could cause issues even in single-player games), and it's a bloody tiled display.

What 4K needs is a native 4K panel that doesn't use MST over DP and does 4K at 60Hz without two streams; then I'd be tempted to purchase one.

Oh, I forgot the new HDMI; that's important, too.


----------



## HTC (Nov 7, 2013)

Serpent of Darkness said:


> Anybody ever did this test or review with Ultra Low Power State (ULPS) set to 0 in the registry?  Personally,  I think down-throttling is due to the fact that the card doesn't need to push higher Core Frequencies or GPU Loads to do the same level of work.
> 
> In CrossfireX, in a lot of current games, the 1st GPU won't push full GPU Load, or Core Frequency because it isn't necessary to get the same amount of output needed to finish drawing frames.  In other games that are more intensive, the GPUs will push 100% at full stock frequencies on both GPUs at 95 degs C tops.
> 
> ...



I think you missed what i'm trying to know.

With the R9 290x:

1 - take any benchmark you like that's able to push the card so that it throttles a lot @ stock settings (everything, fan included) and try and get the average speed of it throughout the test
2 - set the default speed of the card to the value you discovered in point #1
3 - the card will now throttle way less and, in theory, it should produce the same result as with everything @ stock

If throttling lots of times makes it slower than throttling a few times in the above scenario, then the fact that it throttles too much due to the shoddy cooler is actually hampering performance: *that's what i want to know.*


nVidia's approach is different and i just don't know if it's possible to test 

I sure hope it is possible: i'm very curious to know how the different technologies compare, efficiency wise!
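Step 1 of the procedure above could be sketched in a few lines, assuming a GPU-Z-style log of core clocks sampled during the benchmark run (the sample values here are made up for illustration):

```python
# Step 1 of the proposed test: average the core clock over a throttling
# run. Step 2 would then fix the card at this average clock and re-run
# the benchmark; if the scores differ, the throttling itself (not just
# the lower average clock) is what's costing performance.

clock_log_mhz = [1000, 1000, 890, 840, 1000, 900, 860, 1000, 940, 870]

avg_clock = sum(clock_log_mhz) / len(clock_log_mhz)
print(f"average clock over the run: {avg_clock:.0f} MHz")  # 930 MHz
```

With this hypothetical log, the card would be locked at 930 MHz for the second run and the two benchmark scores compared.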


----------



## arterius2 (Nov 7, 2013)

HTC said:


> I think you missed what i'm trying to know.
> 
> With the R9 290x:
> 
> ...



1 and 2 are the same thing: you're taking an average, so why should the result be any different? (4+8)/2 is the same as (6+6)/2.


----------



## Casecutter (Nov 7, 2013)

swagnuggets123 said:


> But also the nvidia chips are on a much bigger die


Although the count is one thing, it's really about transistor density. AMD's is much greater, and packing those transistors that much closer while keeping them cool is a big achievement.



Fluffmeister said:


> The problem is people have been bringing up 4K benchmarks in order to show the benefits of the 290(x) over the 780 Ti.





PopcornMachine said:


> I'm sure with their 512-bit bus they will do fine at higher resolutions.
> 
> Personally just think that talk about 4k is still very premature.
> 
> ...



Exactly. I'm not saying 4K is relevant in the market, but to really see the engineering and processing power that pushes all those pixels is something to revel in, and I applaud the technical hurdle being cleared for a die of that size!


----------



## HTC (Nov 7, 2013)

arterius2 said:


> 1 and 2 is the samething, you are taking an average of something why should the result be any different? 4+8 is the same as 6+6



If the throttling is efficient, then yes. If not, then no.

That's the very thing i want to know: the efficiency!


----------



## arterius2 (Nov 7, 2013)

HTC said:


> If the throttling is efficient, then yes. If not, then no.
> 
> That's the very thing i want to know: the efficiency!



ok im starting to see your point.

what you are saying is that the card running on a constant (slightly) lower temperature is more efficient than doing a cycle of low-to-maximum spikes


----------



## HTC (Nov 7, 2013)

arterius2 said:


> ok im starting to see your point.
> 
> *what you are saying is that the card running on a constant (slightly) lower temperature is more efficient than doing a cycle of low-to-high spikes*



Not temperature but speed, but yes: low-to-high spikes in speed.


----------



## EarthDog (Nov 7, 2013)

It's a good question, but the switching is so fast, I do not think it would make a difference. Not to mention, you lose the peaks too by starting out lower. 

Turn the fan up.


----------



## radrok (Nov 7, 2013)

EarthDog said:


> Turn the fan up.



Slap a waterblock on it!


----------



## harry90 (Nov 7, 2013)

PopcornMachine said:


> I'm sure with their 512-bit bus they will do fine at higher resolutions.
> 
> Personally just think that talk about 4k is still very premature.
> 
> ...



The problem is that the 780 Ti suffers at resolutions higher than 1080p. It's 6-7% faster at 1080p but roughly equal at 1600p and higher resolutions. 3GB just isn't enough with full AA at 4x or 8x. Moreover, the R9 290X still lowers its clock even at 55% fan in Uber mode. So the 780 Ti's victory is pretty narrow, 3-4% on average. We'll see an updated review of both cards with non-reference coolers soon.


----------



## HTC (Nov 7, 2013)

EarthDog said:


> Its a good question, but, the switching is so fast, I do not think it would make a difference. Not to mention, you lose the peaks too by starting out lower.
> 
> Turn the fan up.



If the test reveals inefficiency, then there's legitimate reason to avoid any lower quality cooled cards, regardless of which camp.


----------



## EarthDog (Nov 7, 2013)

If

(Big "if" LOL!)


----------



## radrok (Nov 7, 2013)

harry90 said:


> 3GB just isn't enough with full AA, 4X or 8X.



To be fair, I don't know if I can confirm this statement: after W1zzard asked me to bring him proof of more than 3GB usage on my Titans, I couldn't find any game going over 2.8GB at 1600p, even with 16x AA.

The only game that would go over 3GB was Skyrim, not vanilla.

Talking about 1600p.


----------



## arterius2 (Nov 7, 2013)

HTC said:


> If the test reveals inefficiency, then there's legitimate reason to avoid any lower quality cooled cards, regardless of which camp.



My question is: did AMD not know or care about this during their months (or years) of internal testing?


----------



## EarthDog (Nov 7, 2013)

arterius2 said:


> my question is, AMD didn't know or cared about this during their months(or years) of internal testing?


Great point, which is what I was thinking in formulating my opinion... If the performance difference was more than negligible, they likely would have done something about it.



radrok said:


> To be fair I don't know if I can confirm this statement, after W1zzard asked me to bring him proof about more than 3GB usage on my Titans I couldn't find any game going over 2.8GB on 1600p even with 16X AA.
> 
> The only game that would go over 3GB was Skyrim, not vanilla.
> 
> Talking about 1600p.


It's not the VRAM that's the problem there, it seems... the 384-bit bus plus AA may be the biggest difference. 

I know this is apples and oranges but here is some GREAT testing (IMO) across several games at 1080p and 5760x1080 as well as 3D for vram use...

http://www.overclockers.com/forums/showthread.php?t=718118

In BF3 I hit 2.3-2.4GB used; in BF4 I hit 2.4-2.5GB used (1440p, default Ultra settings).


----------



## arterius2 (Nov 7, 2013)

EarthDog said:


> Great point, which is what I was thinking in formulating my opinion... If the performance difference was more than negligible, they likely would have done something about it.



Things don't happen by chance these days; they have their reasons. My take on this?

It was probably never AMD's intention for the 290/290X to be clocked this high to begin with. It seems that somewhere along their development path they made a last-ditch effort to push the power envelope to its limit (possibly when they learned of NVIDIA's plans).

The same sort of last-ditch effort they made with the 290, "suddenly" lifting the fan speed limit to 47% at the last moment.

In summary, it looks like AMD was forced to ship their cards clocked this high at the last minute (which also explains the inadequate cooler: it was never meant to cool a card with this power envelope) to remain competitive with whatever they thought NVIDIA had planned, and it seems they were right on the money. Literally.


----------



## PopcornMachine (Nov 7, 2013)

radrok said:


> Slap a waterblock on it!



Ordered 290, but waterblocks not available just yet.

Going to when I can.


----------



## HTC (Nov 7, 2013)

arterius2 said:


> my question is, AMD didn't know or cared about this during their months(or years) of internal testing?



Dunno.

If it's proven to be inefficient and the result is then "shoved in their faces", don't you think they'll be forced to up the quality of the reference coolers?

But don't think i'm only talking about AMD here: i'm also talking about nVidia, and should its throttle efficiency prove bad, i'll "direct my guns" at them as well.


But this talk is premature: the efficiency may actually be very good. Don't know until someone actually tests this *subtle hint for W1zzard* ...


----------



## the54thvoid (Nov 7, 2013)

I think the >3GB Vram usage argument is mostly irrelevant, especially with findings from the link given by Earthdog.  

People have jumped on it as validation of why 4GB is better than the 3GB of the 780 and now the 780 Ti, without _even using any evidence_.  The irony being that a single 290X or 780 Ti is barely, if at all, powerful enough to game at 4K res.  NVIDIA even states that single GPUs are not yet capable of rendering at 4K.  The proof is in the reviews.

When was the last 'what monitor size do you use' poll?  How many people that argue about 4K even have 1440p?  Very few.  Even fewer have 1600p.  

Gaming on BF4 with ultra and max gfx settings I do not see above 2.8GB usage.  Most maps actually settle at 2.5-2.6GB.  In fact, I've gamed at 1440p for a good while now and have never seen anything higher for single-monitor gaming (and I've got a pretty redundant 6GB of VRAM on my card).





EDIT:

On a total side note - it's an appalling lack of professionalism that so many reviews I've read have used PCB pics from a 780 or Titan.  Well done W1zzard for not taking a half assed lazy approach.  W1zz's shots show us there is a different power circuitry in place.


----------



## EarthDog (Nov 7, 2013)

If you check out the steam stats, I believe that less than 1% of users have a resolution higher than 1920x1200 or multimonitor setups.


----------



## the54thvoid (Nov 7, 2013)

EarthDog said:


> If you check out the steam stats, I believe that less than 1% of users have a resolution higher than 1920x1200 or multimonitor setups.



I guarantee someone will wade in with a "Steam surveys are useless" post.


----------



## swagnuggets123 (Nov 7, 2013)

W1zzard said:


> [page=Introduction & Specifications]



Why no BF4 benchmark, W1z?


----------



## HTC (Nov 7, 2013)

EarthDog said:


> If you check out the steam stats, I believe that less than 1% of users have a resolution higher than 1920x1200 or multimonitor setups.



IMO, that's the wrong way to think about it.

One should check what percentage of Steam users have mid-high to high-end cards from either camp and then, out of those, how many have a resolution higher than 1920x1200 or multi-monitor setups.


----------



## EarthDog (Nov 7, 2013)

the54thvoid said:


> I guarantee someone will wade in with a "steam surveys are useless post".


Clearly it is not a completely accurate representation of ALL users; however, it's big and broad enough to at least get an idea of what the landscape is like...


----------



## manofthem (Nov 7, 2013)

swagnuggets123 said:


> Why no bf4 benchmark wiz?



He already said it's going to be added to the next round of benchmark reviews, along with a few others; I think it was the new AC and CoD


Here: 


brandonwh64 said:


> W1z, when will you be implementing BF4 to the benchmark lineup?





W1zzard said:


> next rebench (December). also adding cod ghosts and assassin's creed IV (if it turns out to be usable for benching, ie no fps lock). and kicking out Skyrim


----------



## arterius2 (Nov 7, 2013)

EarthDog said:


> Clearly it is not a completely accurate representation of ALL users, however, its big and broad enough to at least get an idea of what the landscape is like...



last time I checked Intel HD graphics was still the most popular graphics card used by steam users.


----------



## EarthDog (Nov 7, 2013)

manofthem said:


> He already said it's going to be added to next round of benchmark reviews, as well as a few others; I think it was the new AC and CoD


Yeah, gotta give reviewers time, for this type of game with no built-in benchmark, to find a good location (may I suggest Tashgar in the SP campaign?) and then retest... oh how I hate retesting......


----------



## Gadgety (Nov 7, 2013)

"While 3 GB doesn't seem like a lot of memory, I am convinced it is enough for all games, in all resolutions, and in the foreseeable future. My guess is that we'll also see board partners release additional SKUs with 6 GB, to cater to buyers who think they absolutely need more memory (even though they don't)."

Yes, 3GB might be enough for all games, but the "buyers who think they absolutely need more memory" (and really do) are, for example, those who work with rendering and want to maximize both CUDA cores and memory in order to render larger scenes faster. I'm sure there's room for 4, 5, and 6GB versions, or even larger, in the marketplace. Value-wise, this is what it looks like currently:


The Quadro K6000 is $5000 for 2880 CUDA cores: 0.576 CUDA cores/dollar
The TeslaK20 is $3500 for 2496 CUDAs: 0.7 CC/dollar.
When the Titan launched it was a bargain at $1000 for 2688 CUDAS: 2.69 CC/dollar
When the memory limited GTX780 launched it was 3.55 CC/dollar


The GTX 780 Ti provides 2880 CUDAs for $699: 4.12 CC/dollar. Apart from FP64, the thing the Titan still has over the 780 Ti is the memory. Bring out larger-memory 780 Tis and they will have a market among those who really do need them.
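As a sanity check, the cores-per-dollar figures above can be re-derived in a few lines. Note the GTX 780's 2304 cores and $649 launch price are filled in from its launch specs, since they aren't stated in the post:

```python
# CUDA cores per launch dollar, using the prices listed above.
cards = {
    "Quadro K6000": (2880, 5000),
    "Tesla K20":    (2496, 3500),
    "GTX Titan":    (2688, 1000),
    "GTX 780":      (2304, 649),   # core count and price filled in from launch specs
    "GTX 780 Ti":   (2880, 699),
}

for name, (cores, price) in cards.items():
    print(f"{name}: {cores / price:.2f} CUDA cores/dollar")
# Quadro K6000: 0.58, Tesla K20: 0.71, GTX Titan: 2.69,
# GTX 780: 3.55, GTX 780 Ti: 4.12
```

The numbers match the post's figures, with the 780 Ti the clear value leader among GK110 parts.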


----------



## W1zzard (Nov 7, 2013)

EarthDog said:


> Its not the vram that is the problem there it seems... it is the 384bit bus and AA that may be the biggest difference.



you do realize that gtx 780 ti has more memory bandwidth than r9 290x, right? it's not only how many lanes you have on your highway but also how fast the cars go



EarthDog said:


> oh how I hate retesting..


look at how many cards I have to retest :/ gonna drop a few older ones but should still be 25-30


----------



## arterius2 (Nov 7, 2013)

W1zzard said:


> you do realize that gtx 780 ti has more memory bandwidth than r9 290x, right? it's not only how many lanes you have on your highway but also how fast the cars go



Hrm, interesting. I always used to think of bandwidth as how many lanes and clock speed as how fast the cars go. And memory size? I guess you can think of it as how much weight the bridge can hold before it collapses.


----------



## W1zzard (Nov 7, 2013)

arterius2 said:


> Hrm, interesting. I always used to think of bandwidth as how many lanes and clock speed as how fast the cars go. And memory size? I guess you can think of it as how much weight the bridge can hold before it collapses.



bandwidth = bus width * clock rate (+ more math for gddr5, and 8 bit=1 byte)

should become obvious if you think about it for a minute: bandwidth = how many cars per hour can go through the toll station on the highway

for memory size you could think of how many cars fit in the parking lot, connect that to your memory bus highway, and now ask how many people can get home when work ends and won't be late for dinner (= wife aggro)
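The formula works out like this for the two cards being compared; the bus widths and effective GDDR5 data rates below are the published reference specs:

```python
# Peak memory bandwidth = bus width (in bytes) x effective data rate.
# For GDDR5 the "effective" rate is the quad-pumped per-pin figure.

def bandwidth_gb_s(bus_width_bits: int, effective_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_gbps

# GTX 780 Ti: 384-bit bus at 7 Gbps effective -> narrower highway, faster cars
print(bandwidth_gb_s(384, 7.0))  # 336.0 GB/s

# R9 290X: 512-bit bus at 5 Gbps effective -> wider highway, slower cars
print(bandwidth_gb_s(512, 5.0))  # 320.0 GB/s
```

So despite the narrower 384-bit bus, the 780 Ti ends up with more total bandwidth than the 290X's 512-bit bus.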


----------



## theonedub (Nov 7, 2013)

Are the heatsink fins behind the plexi window black or did Nvidia tint the window a little bit? The heatsinks on the Ti look jet black whereas on my 780 they look bright silver (sorry if this was mentioned in the review somewhere, I just took a quick glance).


----------



## Ja.KooLit (Nov 7, 2013)

radrok said:


> To be fair I don't know if I can confirm this statement, after W1zzard asked me to bring him proof about more than 3GB usage on my Titans I couldn't find any game going over 2.8GB on 1600p even with 16X AA.
> 
> The only game that would go over 3GB was Skyrim, not vanilla.
> 
> Talking about 1600p.




Max Payne 3: turn on all the eye candy... it requires a lot. Even my CrossFired 7950s can't manage it...

Talking about HBAO and SSAO... I don't know the difference or what they do 

This is at 1600p.


----------



## johnnyfiive (Nov 7, 2013)

This is getting stupid (in a good way).

Here's your call, AMD: where is the R9 290X GHz Edition with a superior cooling solution?! You're losing again! (Already..)

I bet we see an R9 290X GHz within the next 3 months, and it will cost $629.99... lol


----------



## Fluffmeister (Nov 7, 2013)

night.fox said:


> max payne 3. turn on all eye candy.... requires alot  even my crossfired 7950 cant....
> 
> talking about the hbao and ssao... dont know the difference and function
> 
> this is at 1600p



GTX 770 4GB vs 2GB showdown, benches run at 1920×1080, 2560×1600 and 5760×1080:

http://alienbabeltech.com/main/gtx-770-4gb-vs-2gb-tested/3/



			
				alienbabeltech.com said:
			
		

> There is one last thing to note with Max Payne 3:  It would not normally allow one to set 4xAA at 5760×1080 with any 2GB card as it claims to require 2750MB.  However, when we replaced the 4GB GTX 770 with the 2GB version, the game allowed the setting.  And there were no slowdowns, stuttering, nor any performance differences that we could find between the two GTX 770s.



Can't help but think people blow VRAM requirements out of proportion all the time.


----------



## arterius2 (Nov 7, 2013)

johnnyfiive said:


> This is getting stupid (in a good way).
> 
> Here is your call AMD, where is the R290X GHz Edition that comes with a superior cooling solution?! You're losing again! (Already..)
> 
> I bet we see a R290X GHz within the next 3 months, and it will cost $629.99... lol



Considering the power envelope already, I don't think a GHz Edition is viable.

You simply cannot ship a factory-default card already stressed to the absolute max.


----------



## Eroticus (Nov 7, 2013)

johnnyfiive said:


> This is getting stupid (in a good way).
> 
> Here is your call AMD, where is the R290X GHz Edition that comes with a superior cooling solution?! You're losing again! (Already..)
> 
> I bet we see a R290X GHz within the next 3 months, and it will cost $629.99... lol



Sorry bro... AMD is not NVIDIA...

You only get that price because AMD are the good guys!

Without AMD, you fanboys would still be buying the Titan for $1K, and maybe even higher.


A $1,000 demo version... joke of the year edition.


----------



## The Von Matrices (Nov 7, 2013)

Out of curiosity, is AMD's high power consumption for the same level of performance related to GCN or something else?

I ask because I'm wondering if Mantle is actually going to hurt AMD in the future.  It's obvious that NVIDIA's Kepler architecture is more power efficient than GCN at the same performance level.  Announcing Mantle has essentially forced AMD to retain GCN for the foreseeable future, and if the high power consumption is something inherent to GCN, then AMD is stuck with it for half a decade or longer.


----------



## Eroticus (Nov 7, 2013)

The Von Matrices said:


> Out of curiosity, is AMD's high power consumption for the same level of performance related to GCN or something else?
> 
> I ask because I'm wondering if Mantle is actually going to hurt AMD in the future.  It's obvious that NVIDIA's Kepler architecture is more power efficient than GCN at the same performance level.  Announcing Mantle has essentially forced AMD to retain GCN for the foreseeable future, and if the high power consumption is something inherent to GCN, then AMD is stuck with it for half a decade or longer.



Says the guy with a 1250W power supply and two 7970s... turn one off and be GREEN! (+ 3 monitors, LMAO)

Hahaha, god please... it's pretty funny to hear about energy saving from a guy like you.


----------



## The Von Matrices (Nov 7, 2013)

Eroticus said:


> Speaking the guy with 1250wat power supply and  2 7970 ... turn off 1 and be GREEN !



I used to have three 7970s to mine Bitcoin, and I sold one when it was no longer profitable.  That's where the 1250W power supply comes from.  As for turning one of the two off, I already do that when I play games, because the as-yet-unresolved stuttering means the second card produces no useful frames.

Saving money on electricity is not the concern here; it's reducing the heat generated and the noise required to remove that heat.  Just because I have a system that consumes a lot of electricity doesn't mean that efficiency improvements are worthless.  The particular thorn in AMD's side is multi-monitor power consumption: I could consume 50W less at idle with an NVIDIA card.

Back to the original question, can AMD really improve its efficiency while still retaining GCN?


----------



## Eroticus (Nov 7, 2013)

The Von Matrices said:


> I used to have three 7970s to mine Bitcoin, and I sold one when it was no longer profitable.  That's where the 1250W power supply comes from.  As for turning one of the two off, I already do that when I play games, because the as-yet-unresolved stuttering means the second card produces no useful frames.
> 
> Saving money on electricity is not the concern here; it's reducing the heat generated and the noise required to remove that heat.  Just because I have a system that consumes a lot of electricity doesn't mean that efficiency improvements are worthless.  The particular thorn in AMD's side is multi-monitor power consumption: I could consume 50W less at idle with an NVIDIA card.
> 
> Back to the original question, can AMD really improve its efficiency while still retaining GCN?



http://www.tomshardware.com/reviews/radeon-r9-290-review-benchmark,3659-19.html

The card isn't noisy and the heat isn't too high either.

Plus, 50W doesn't change anything ...

Same story with the 5000 and 7000 series when they launched with stock coolers.

We all know the only good AMD cooler was on the 7990 ... and I don't know why they didn't save it for the new series. I'd rather pay $50~100 more for a card that came with the 7990's cooling system.


----------



## Casecutter (Nov 7, 2013)

The Von Matrices said:


> Back to the original question, can AMD really improve its efficiency while still retaining GCN?


That's a good thought, and this might or might not affect the answer... I would say that on a 20nm process the chip could look better in perf/watt; I thought I heard a transistor density advantage helps both power and performance.  "Advantage" might not be the best word.


----------



## the54thvoid (Nov 7, 2013)

Eroticus said:


> Says the guy with a 1250W power supply and two 7970s ... turn one off and be GREEN! + (3 monitors, LMAO)
> 
> Hahaha, god please ... it's pretty funny to hear about energy saving from a guy like you.



Ignorance is bliss, isn't it? Having an "x" watt power supply is meaningless for your argument; efficiency is the relevant factor. A 1200W PSU will only pull what the system requires, while an inefficient supply will draw more from the wall than the system needs.

As for your other posts, you are revealing yourself to be a bit of a cave dweller with a tendency for pseudo-abusive posting. How about you give it a rest and try to be constructive?
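The efficiency point above is just arithmetic; here's a quick sketch (the load and efficiency figures are made-up illustrations, not measurements of any particular PSU):

```python
# Wall-socket draw depends on PSU efficiency, not on the PSU's rated capacity.
# A 1200 W unit running a 400 W load does not pull 1200 W from the wall.

def wall_draw(system_load_w: float, efficiency: float) -> float:
    """Power pulled from the wall for a given DC load and PSU efficiency."""
    return system_load_w / efficiency

load = 400.0  # hypothetical gaming load in watts
print(wall_draw(load, 0.90))  # efficient unit (80 Plus Gold-ish): ~444 W
print(wall_draw(load, 0.75))  # inefficient unit: ~533 W
```

Same system, same rated capacity; only the efficiency changes what the meter sees.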


----------



## The Von Matrices (Nov 7, 2013)

Casecutter said:


> That's a good thought, and this might or might not affect the answer... I would say that on a 20nm process the chip could look better in perf/watt; I thought I heard a transistor density advantage helps both power and performance.  "Advantage" might not be the best word.



You have a point.  It seems that GCN is more efficient in performance per transistor, but Kepler is more efficient in performance per watt.  I still wonder how much AMD can change GCN to improve it while still maintaining compatibility with the Mantle codepaths of games.  At a high level it seems like they can only change large-scale features like the number of shaders or ROPs, but the underlying architecture is fixed.

Then again, maybe GCN will turn out like x86, where there is a core set of instructions that never changes and new instructions continuously get added.  If those newer instructions are used, efficiency improves while compatibility is maintained for older games that use older instructions.  Of course, x86 is a good example of becoming a huge burden for very low-level programming, with the thousands of instructions available (https://en.wikipedia.org/wiki/X86_instruction_listings).


----------



## Amrael (Nov 7, 2013)

People, this is basic economics and marketing at its most extreme. First off, AMD's monster R9s were tested and released with the shoddy cooler so that aftermarket vendors would get a big chunk of the action. For example: the card heats up like a freaking toaster, so what do we do? We buy aftermarket cooling. AMD already made a profit on the reference design, and they will continue to sell those amazing GPUs (for those who actually saw the performance without throttling, like Tom's Hardware's review of the R9 290 + Accelero Xtreme III) no matter what. Now the board partners (Sapphire, MSI, Asus, XFX, etc.) will release their own versions with better cooling, so they will also sell lots of them, and so on ad infinitum.

NVIDIA just unlocked the full potential of Kepler and in essence half-crippled the Titan. I don't think a 780 Ti 6GB version is coming, because there is that small "insignificant" thing that makes people buy Titans and water blocks: they want those 6GB of RAM. Meanwhile, those of us who want to get ahead but don't have the means end up with 780s with Elpida memory, which is what's going to happen when people start receiving their retail 780 Tis. Reviewers and the media get the golden samples with Hynix or Samsung memory, and we get the ones with mediocre-ASIC GPUs and Elpida memory. Those won't get to where most review samples get, and we'll end up with basically the same GTX 780 with a bump in the core clock. Then after a few weeks the aftermarket crew comes in (WindForce 3, Lightning, Classified, DirectCU II/III, etc.) and it will sell again. People, I urge you to pick the card of your choice and stick by it. You paid for it, enjoy it; go play.

In the end, to actually enjoy gaming (and no 4K, it's still too immature), I don't think anyone needs more than a single GTX 770 or a 7970 (aka R9 280X). If you need more, then fine, just aim for what's most convenient for you, because I have seen too many weird variations in the reviews I've read over the past few weeks, and I've tested enough myself with my own NVIDIA and AMD GPUs to know that something fishy is going on. The monster 780 Ti is maybe 700 3DMarks (Fire Strike, non-Extreme) away from my overclocked GTX 780 Classified, and yet in games the Ti, even overclocked, doesn't overtake me by more than a couple of frames (2-5), so I have no reason, just like most of you, to upgrade. It's just hype; wait it out and the best products will be available soon. Those reference cards won't be kings much longer. Anyway, I repeat: go play and let the companies sort themselves out; after a while come back and buy whichever pleases you the most.


----------



## tuklap (Nov 8, 2013)

*Worth the Money??*

Any compute performance? If it's not that good, it's probably better to go with the R9 290.


----------



## radrok (Nov 8, 2013)

Could we please stop talking about racism, countries, and all the things not related to graphics cards?

Just please, please.


----------



## MxPhenom 216 (Nov 8, 2013)

tuklap said:


> Any compute performance?? if its not that good, probably better to go with r9 290



When does compute matter in games?


----------



## radrok (Nov 8, 2013)

MxPhenom 216 said:


> When does compute matter in games?



Only when it's relevant in making an argument, like 4K for performance.


----------



## Wittermark (Nov 8, 2013)

radrok said:


> Only when it's relevant in making an argument, like 4K for performance.



Go easy on him, it's his first post. 

I guess AMD must be pretty desperate over there to be sending them en masse.


----------



## ViperXTR (Nov 8, 2013)

So when is the Black edition coming out? :O


----------



## Amrael (Nov 8, 2013)

ViperXTR said:


> So when is the Black edition coming out? :O



GTX Titan Ti Black 12GB, so you can play on a movie theater screen at 30 FPS overclocked.


----------



## radrok (Nov 8, 2013)

^^^

I honestly don't know what use 6GB would have on a card like this. I could fill up 6GB pretty easily in Octane Render for 3ds Max, but as I said before, I couldn't fill it while gaming unless playing Skyrim (and I still don't know if that memory was being used or JUST allocated, big difference).

Seems like CUDA performance is gimped too, so Titan still holds its segment.

I'll have to give it a try when I get my hands on one of these; until then the CUDA part remains a mystery to me, but I find it hard to believe they left it fully functional.


----------



## Amrael (Nov 8, 2013)

Has anyone tried to put a normal GTX 780 and a Ti in mismatch Sli?


----------



## HammerON (Nov 8, 2013)

Tatty_One said:


> It strikes me that some of you clearly do not know how to:
> 
> 1.  Debate or disagree with any degree of finesse or maturity.
> 2.  Be open minded enough to argue a good honest and unbiased case without clouded fanboyist minds
> ...



Re-post for those that missed it...


----------



## Tonduluboy (Nov 8, 2013)

Just wondering how many people are willing to replace their current card with this $700 card. That's a lot of money; it's ALMOST 3 months of car payments for me!


----------



## radrok (Nov 8, 2013)

Amrael said:


> Has anyone tried to put a normal GTX 780 and a Ti in mismatch Sli?



It's nVidia we are talking about, I'm almost 99.9% sure it won't work.


----------



## Rowsol (Nov 8, 2013)

Fluffmeister said:


> GTX 770 4GB vs 2GB showdown, benches run at 1920×1080, 2560×1600 and 5760×1080:
> 
> http://alienbabeltech.com/main/gtx-770-4gb-vs-2gb-tested/3/
> 
> ...



Those are some nice charts.


----------



## Fourstaff (Nov 8, 2013)

Rowsol said:


> Those are some nice charts.



There is a need to chart Skyrim with HD textures at 2GB and 4GB though, and I can't find it.


----------



## haswrong (Nov 8, 2013)

radrok said:


> I honestly don't know what kind of use 6GB would have on a card like this..



Depends on the renderer. Try thinking outside the box a bit and assume somebody's going to introduce a real-time raytracer ... then you'd utilize it and maybe need even more. I understand that VRAM has grown in size quite a lot over the past few years; I started with 2 MB of video RAM and now have 512 MB. But that doesn't mean it's balanced for the next rendering solution.


----------



## Nihilus (Nov 8, 2013)

0


----------



## Nihilus (Nov 8, 2013)

Eroticus said:


> Sorry bro .. AMD is not nVidia ...
> You only got that price down because AMD are the good guys!
> Without AMD, your fanboys would still be buying the Titan for $1,000, and maybe even higher.
> 1,000 USD demo version ... joke of the year edition.



So true; this is the HD 4870/4850 all over again.  Look how much they overcharged for the GTX 260 until AMD came in.  As for the GTX 780 Ti, it's just another overpriced e-peen card.  In a single-card configuration it does nothing useful over the 290X at resolutions below 4K except 5 frames here and there, and then it nosedives at 4K when paired in SLI (see BF4).
Yeah, yeah, the 290X has a crap cooler, but that is a much easier fix than a $700 price tag and 3GB of memory.


----------



## Xzibit (Nov 8, 2013)

Are there any other reviews that detail throttling ?

Tom's Hardware noted you had to run the reference cooler at 80% to avoid thermal throttling in Metro: Last Light over a 30-minute gaming session.
Nvidia GeForce GTX 780 Ti Review: GK110, Fully Unlocked - Fan and Video Comparison

PCPer did a similar thing with the 290X and found that it needed 50%+ fan speed to avoid throttling.

AMD and NVIDIA are benchmark-marketing these cards even more now.

I don't play benchmarks, but I do like to play the occasional game here and there, and right now it's BF4. Gaming sessions don't last 3 minutes either, so are any sites doing these tests yet?


----------



## HumanSmoke (Nov 8, 2013)

Xzibit said:


> Are there any other reviews that detail throttling ?


It would depend on what environment the card was tested in - and possibly ambient temp, although most review sites are European and North American, so the latter shouldn't be an issue.
Hardware France - whose reviews I trust more than TH's - didn't note any significant throttling:


> The GeForce GTX 780 operates between 863 and 915 MHz, with an average of 876 MHz, against a maximum frequency of 1006 MHz.  As for the GTX 780 Ti, its frequency varies between 876 and 967 MHz for an average of 915 MHz, whereas our sample can rise up to 1019 MHz.



A 915 MHz average sits fairly close to the minimum guaranteed boost of 928 MHz, taking into account a fluctuating workload. That may change with the non-default +106% setting, but then, the R9 290X's "Uber" setting isn't the default either... which just goes to show how quickly conventional benchmarking comparisons are becoming irrelevant.


----------



## Xzibit (Nov 8, 2013)

HumanSmoke said:


> It would depend on what environment the card was tested in - and possibly ambient temp, although most review sites are European and North American so the latter shouldn't be an issue.
> Hardware France - whose reviews I trust more than TH, didn't note any significant throttling
> 
> 915MHz average sits fairly close to the minimum guaranteed boost of 928 taking into account a fluctuating workload. That may change with the non-default +106% setting, but then, the R9-290X's "Uber" setting isn't the default either...which just goes to show how quickly conventional benchmarking comparisons are becoming irrelevant.



I don't trust one source either. 

PCPer didn't mention which NVIDIA card they used in the 290X test, but in the long-duration comparison they showed the NVIDIA boost clock dropping and never going back up again.





What I'm currently understanding, from these limited examples, is that you need to run the 290X at 60% fan speed and the 780 Ti at 80% fan speed to keep throttling in check, and increasing fan speed will increase power consumption and noise.

With the introduction of PowerTune and GPU Boost we might just be looking at top clocks being reached only at game intros, and you would have to manually adjust settings to reproduce the benchmark results.

I'm mostly looking for a long-duration throttling/clock test per game, at a minimum like what Tom's Hardware did: a benchmark loop run for a longer duration to see whether performance is decreased or sustained.


----------



## Bjorn_Of_Iceland (Nov 8, 2013)

erocker said:


> I can't think of any console port that needs this card. Not for that price anyways.



CoD Ghosts needs all that power for its dog A.I.


----------



## W1zzard (Nov 8, 2013)

HumanSmoke said:


> It would depend on what environment the card was tested in - and possibly ambient temp, although most review sites are European and North American so the latter shouldn't be an issue.
> Hardware France - whose reviews I trust more than TH, didn't note any significant throttling
> 
> 
> 915MHz average sits fairly close to the minimum guaranteed boost of 928 taking into account a fluctuating workload. That may change with the non-default +106% setting, but then, the R9-290X's "Uber" setting isn't the default either...which just goes to show how quickly conventional benchmarking comparisons are becoming irrelevant.



I did a statistical clock speed analysis on page 31: 
*(clock speed distribution table)*
As you can see, it does go down, but the average and median are very high.
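That kind of statistical clock analysis boils down to sampling the core clock during a run and summarizing the distribution. A minimal sketch of the idea, using invented sample values rather than the review's actual data:

```python
from statistics import mean, median

# Hypothetical per-second core-clock samples (MHz) from a gaming run.
samples = [1019, 1019, 1006, 993, 980, 967, 967, 954, 967, 980]

print(f"max:    {max(samples)} MHz")
print(f"min:    {min(samples)} MHz")
print(f"mean:   {mean(samples):.0f} MHz")
print(f"median: {median(samples):.0f} MHz")
```

The max shows the peak boost, while the mean and median tell you what clock the card actually sustains once Boost settles.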


----------



## the54thvoid (Nov 8, 2013)

Xzibit said:


> Are there any other reviews that detail throttling ?
> 
> Tom's Hardware noted you had to have the reference cooler running at 80% to stop thermal throttling in Metro Last Light to avoid it for a 30min gaming session.
> Nvidia GeForce GTX 780 Ti Review: GK110, Fully Unlocked - Fan and Video Comparison
> ...



I'm pretty sure it will throttle (all Boost 2.0 cards do).  My Titan could hit 993 MHz at stock (no OC), but after a few minutes it would start to downclock to keep the temp at 79 degrees.

However, the difference is that the Kepler cards advertise a base clock and a base boost, whereas the actual boost is far higher.  In advertising terms, a Kepler card will generally remain *above* the marketed 'base boost' value.
Hawaii, on the other hand, is being 'touted' as a 1000 MHz card, so when it throttles down it seems to be underperforming.  I know AMD have not stated anywhere that it is guaranteed to run at that speed, but I think it's miscommunication on their (PR) part that makes it look like the cards are throttling badly.  It's more a case of them working as designed, but the thermals won't allow them to hit the 'advertised' boost.  What does the actual retail packaging say for a 290X?


----------



## alwayssts (Nov 8, 2013)

Very well done review, as always.

I think I take away from this what most people do:

The AMD cards will be very similar when you clock them to a similar level with a decent cooler that won't throttle (as badly).  Sure, you might need ~1050 MHz and slightly more than 5000 MHz RAM on the 290X to compete with this card at stock, and ~1200 MHz and >6000 MHz to compete with it overclocked, but who says that isn't possible, even if it does use more power?  The RAM is rated for 6 GHz, which should allow the clock to scale to that level in a non-gimped (by BIOS, voltage, or whatever other nonsense comes about) configuration, and the voltage scaling in the 290X article clearly shows this as a possibility... perhaps hitting 1200 MHz right around that 1.263 V level ATI _always seems to unofficially target_.

Someone said 'bring on the 290X GHz edition'.  In one capacity or another (officially or through partner cards), I don't think that is far from the truth... what AMD needs I've pretty much outlined above, and others have probably reached the same conclusion.

I agree with Xzibit, who said they are both making ever more 'benchmark review' cards, and it often appears obvious they are targeting certain limitations of the competition's configuration (like AMD's 5 GHz RAM clock in this case, which would bottleneck it below the 780 Ti's median boost clock).  It would appear that TPU is a valuable source for their analysis teams, as the ~1% average wins (across this suite) and the average/median clocks versus the main competitor's specs perpetually show this to be the case.

I think you should feel loved, W1z.


----------



## HumanSmoke (Nov 8, 2013)

the54thvoid said:


> What does the actual retail packaging say for a 290X?


Well, truth is obviously the first casualty in XFX and MSI advertising...although there's probably a disclaimer somewhere on the site that notes that the fan must be run at 100% in a walk-in freezer.





You can add Asus and Gigabyte to that list. Sapphire and HIS ( "up to 1000MHz") and Club3D ("~1000MHz") seem a little more circumspect.


----------



## alwayssts (Nov 8, 2013)

the54thvoid said:


> However, the difference is that the Kepler cards advertise a base clock and base boost whereas the actual boost is far higher.  In advertising terms, the Kepler card will generally remain *above* the marketed 'base boost' value.



Correct.  It's a marketing fallacy.

Look back at W1z's 600/700 series reviews.  While cards are marketed with an x base clock and x+1 boost clock (median), typical speeds usually vary from x+2 to x+3 (consistently across most boards from varying AIBs for the same product).

I don't know if it's them being conservative so they don't look like AMD does now (and arguably they did with the 'up to 1112 MHz' 680 that ran around 1085 MHz), to throw off the competition from the real clock (which is usually a bin or two higher than the difference listed), or simply to appear to have more performance at a lower clock than they actually do.

For instance, the only somewhat honest comparison you can make here is that a (993-)1020 MHz 780 Ti is ~15% faster than a 'usually' 947 MHz 290, as those are consistent values for the most part.

Whatever way you look at it, it's frustrating.  I personally find it deceiving, but understand how others may see it differently.


----------



## Xzibit (Nov 8, 2013)

the54thvoid said:


> I'm pretty sure it will throttle (all Boost 2.0 cards do).  My Titan could hit 993 MHz at stock (no OC), but after a few minutes it would start to downclock to keep the temp at 79 degrees.
> 
> However, the difference is that the Kepler cards advertise a base clock and a base boost, whereas the actual boost is far higher.  In advertising terms, a Kepler card will generally remain *above* the marketed 'base boost' value.
> Hawaii, on the other hand, is being 'touted' as a 1000 MHz card, so when it throttles down it seems to be underperforming.  I know AMD have not stated anywhere that it is guaranteed to run at that speed, but I think it's miscommunication on their (PR) part that makes it look like the cards are throttling badly.  It's more a case of them working as designed, but the thermals won't allow them to hit the 'advertised' boost.  What does the actual retail packaging say for a 290X?



What I would like to know is whether these cards are even capable of delivering their clocks in a gaming run.

For benchmarking, well, we have the answers. What we don't know is the duration and settings needed to maintain the performance.

It would be time consuming: the same benchmarks, looped for 30 minutes, to see if the clocks can be maintained. In this case it will be thermal limits that cause the downclocking, so it will vary by card.

I can't use any other examples since no one has bothered yet.

*COLD RUN* (chart)

*WARM RUN* (chart)

> The Run 2 graph shows the same 40% fan speed results for the 290X as the previous pages, but it also shows how the GeForce GTX GPU reacts.  The NVIDIA card also drops in clock speed (from about 1006 MHz to 966 MHz, 4%) but it maintains that level throughout.  That frequency is also above the advertised *base clock*.


*It doesn't mention maintaining boost clocks.*
It's hard to tell since he didn't disclose which card he was using.

*COLD RUN* (chart)

*WARM RUN* (chart)
^This is only a 60-second Crysis 3 run @ 2560x1440.

*Metro Last Light Max Settings 30-Minute Gaming Loop* (chart)

^Now, if this could be done per game/resolution/settings to find the throttling thresholds per card and add them to the benchmark results, it would serve a site well.  Then you would know, at least in a controlled environment such as an open-air test bench, what you can expect to do once you purchase one of these cards to maintain its performance in gaming.  Not just a quick bench run of less than 3 minutes, but a 30-minute gaming experience.

*That's why I asked If anyone knew any sites doing such test.*


----------



## VulkanBros (Nov 8, 2013)

Does the 780 TI support DirectX 11.2?

*Nope - just found out:*
GTX 780 Ti GPU Engine Specs:
- CUDA Cores: 2880
- Base Clock: 875 MHz
- Boost Clock: 928 MHz
- Texture Fill Rate: 210 GigaTexels/sec

GTX 780 Ti Memory Specs:
- Memory Clock: 7.0 Gbps
- Standard Memory Config: 3072 MB
- Memory Interface: GDDR5
- Memory Interface Width: 384-bit
- Memory Bandwidth: 336 GB/sec

GTX 780 Ti Support:
- OpenGL: 4.3
- Bus Support: PCI Express 3.0
- Certified for Windows 7, Windows 8, Windows Vista or Windows XP: Yes
- Supported Technologies: GPU Boost 2.0, 3D Vision, CUDA, DirectX 11, PhysX, TXAA, Adaptive VSync, FXAA, NVIDIA Surround, SLI-ready
- 3D Vision Ready: Yes
- Microsoft DirectX 11.1 API
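The bandwidth figure in that list follows directly from the memory data rate and bus width; a quick sanity check:

```python
# Memory bandwidth = effective data rate per pin (Gbps) * bus width (bits) / 8 bits per byte
data_rate_gbps = 7.0   # effective GDDR5 data rate
bus_width_bits = 384

bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8
print(bandwidth_gb_s)  # 336.0 GB/s, matching the spec sheet
```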


----------



## sweet (Nov 8, 2013)

HumanSmoke said:


> Well, truth is obviously the first casualty in XFX and MSI advertising...although there's probably a disclaimer somewhere on the site that notes that the fan must be run at 100% in a walk-in freezer.
> You can add Asus and Gigabyte to that list. Sapphire and HIS ("up to 1000MHz") and Club3D ("~1000MHz") seem a little more circumspect.



LOL, choose a higher fan target and your card will be stable at 1000 MHz. PowerColor did the right thing when they removed the quiet BIOS:
http://www.techpowerup.com/reviews/Powercolor/R9_290X_OC/31.html

Just choose Uber mode and you will be fine. If you can't live with that noise, 40 bucks for an aftermarket cooler will help you, like in my case.


----------



## qubit (Nov 8, 2013)

VulkanBros said:


> Does the 780 TI support DirectX 11.2?
> 
> *Nope - just found out:*
> GTX 780 Ti GPU Engine Specs:
> ...



Oh well, that scuppers any speculation that the B revision GK110 is for DX11.2 support.


----------



## EarthDog (Nov 8, 2013)

Who was speculating that???!


----------



## The Von Matrices (Nov 8, 2013)

qubit said:


> Oh well, that scuppers any speculation that the B revision GK110 is for DX11.2 support.



Do you realize how confusing that would be?  GTX 780 Ti would be a guaranteed B revision but GTX Titan and GTX 780 would be a mix of A and B.  Then you'd have some retailers sorting A and B cards and marking up the B cards for a premium price.  I know this is done to an extent with revisions of CPUs for overclocking ability, but very few retailers ever do this, and as far as I know there have never been major feature changes on different silicon revisions, only minor errata fixes.



sweet said:


> LOL, choose a higher fan target and your card will be stable at 1000 MHz. PowerColor did the right thing when they removed the quiet BIOS:
> http://www.techpowerup.com/reviews/Powercolor/R9_290X_OC/31.html
> 
> Just choose Uber mode and you will be fine. If you can't live with that noise, 40 bucks for an aftermarket cooler will help you, like in my case.



Completely out of curiosity, what heatsink did you use that only cost $40?  The Accelero Xtreme III that everyone is recommending starts at $75.


----------



## qubit (Nov 8, 2013)

EarthDog said:


> Who was speculating that???!





The Von Matrices said:


> Do you realize how confusing that would be?  GTX 780 Ti would be a guaranteed B revision but GTX Titan and GTX 780 would be a mix of A and B.  Then you'd have some retailers sorting A and B cards and marking up the B cards for a premium price.  I know this is done to an extent with revisions of CPUs for overclocking ability, but very few retailers ever do this, and as far as I know there have never been major feature changes on different silicon revisions, only minor errata fixes.



lol, no one in particular. I was simply saying just in case anyone was speculating about this, it scuppers it, that's all.


----------



## EarthDog (Nov 8, 2013)

Only muppets speculate that.


----------



## BiggieShady (Nov 8, 2013)

EarthDog said:


> Only people who don't know that revision is called revision because it does not introduce new functionality, speculate that.



ftfy


----------



## the54thvoid (Nov 8, 2013)

Xzibit said:


> ^Now if this can be done per game/resolution/settings to find out throttling thresholds per cards to add to the benchmarks results.  It would serve a site well.  Then you would know at the very least in a control environment such as a open air test bench an idea of what you can expect to do once you purchase one of these to maintain the performance in gaming.   Not just a quick bench run less then 3mins but a 30min gaming experience.
> 
> *That's why I asked If anyone knew any sites doing such test.*



I know where you're coming from.  Having learned all about throttling from the Titan's stock BIOS, lots of guys simply post up Afterburner logs.  For a review site to do this would not be that intensive; AB logs aren't that large.  Even screenshots after a test run would give an indication of throttling.

My card throttled (on all-stock settings) from heat.  I only needed a small bump in fan parameters to negate it, to be honest.  But I live in Scotland, where my room temp is never above 20-23 degrees (even in our "Summer" ).

The other throttling, from power limits, is also present, and that requires BIOS flashing - bad Nvidia
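Scanning such a log for throttling is easy to script. This is a rough sketch assuming a simple CSV of timestamped core clocks; the column names, values, and layout are invented for illustration and don't match Afterburner's real log format:

```python
import csv
import io

BASE_BOOST_MHZ = 928  # advertised minimum boost for the 780 Ti

# Fake log data standing in for an exported monitoring log.
log = io.StringIO("""time,core_mhz
0,1019
30,1019
60,993
90,915
120,902
""")

# Flag every sample that falls below the advertised boost clock.
throttled = [(row["time"], int(row["core_mhz"]))
             for row in csv.DictReader(log)
             if int(row["core_mhz"]) < BASE_BOOST_MHZ]

for t, mhz in throttled:
    print(f"t={t}s: {mhz} MHz is below the advertised {BASE_BOOST_MHZ} MHz boost")
```

Run against a real 30-minute log, this would show at a glance whether the clock drops are transient spikes or a sustained downclock.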



The Von Matrices said:



> Do you realize how confusing that would be?  GTX 780 Ti would be a guaranteed B revision but GTX Titan and GTX 780 would be a mix of A and B.  Then you'd have some retailers sorting A and B cards and marking up the B cards for a premium price.  I know this is done to an extent with revisions of CPUs for overclocking ability, but very few retailers ever do this, and as far as I know there have never been major feature changes on different silicon revisions, only minor errata fixes.


FWIW, this is from the Tech Report review.




> When I asked Nvidia where it found the dark magic to achieve this feat, the answer was more complex than expected. For one thing, this card is based on a new revision of the GK110, the GK110B (or is it GK110b? GK110-B?). The primary benefit of the GK110B is higher yields, or more good chips per wafer. Nvidia quietly rolled out the GK110B back in August aboard GTX 780 and Titan cards, so it's not unique to the 780 Ti.





BiggieShady said:



> ftfy


looooooooooool.  I had to google "ftfy".  I thought it meant..



Spoiler: fuck that fuck you



What can I say?  I'm Scottish, swearing is our native tongue.


----------



## radrok (Nov 8, 2013)

the54thvoid said:


> What can I say? I'm Scottish, swearing is our native tongue.



http://wiki.teamfortress.com/wiki/Demoman





haswrong said:


> Depends on the renderer. Try thinking outside the box a bit and assume somebody's going to introduce a real-time raytracer ... then you'd utilize it and maybe need even more. I understand that VRAM has grown in size quite a lot over the past few years; I started with 2 MB of video RAM and now have 512 MB. But that doesn't mean it's balanced for the next rendering solution.



This doesn't make the slightest sense.


----------



## SIGSEGV (Nov 9, 2013)

The Von Matrices said:


> Completely out of curiosity, what heatsink did you use that only cost $40?  The Accelero Xtreme III that everyone is recommending starts at $75.



50 bucks is enough to cool a 290X.


----------



## Aerpoweron (Nov 9, 2013)

VulkanBros said:


> Does the 780 TI support DirectX 11.2?
> 
> Nope - just found out:
> GTX 780 Ti GPU Engine Specs:
> ...



Is it real DX11.1 this time, or just 11.0 like on the other 700-series cards?

NVIDIA is still confusing me with their DirectX labeling.


----------



## sweet (Nov 9, 2013)

The Von Matrices said:


> Do you realize how confusing that would be?  GTX 780 Ti would be a guaranteed B revision but GTX Titan and GTX 780 would be a mix of A and B.  Then you'd have some retailers sorting A and B cards and marking up the B cards for a premium price.  I know this is done to an extent with revisions of CPUs for overclocking ability, but very few retailers ever do this, and as far as I know there have never been major feature changes on different silicon revisions, only minor errata fixes.
> 
> 
> 
> Completely out of curiosity, what heatsink did you use that only cost $40?  The Accelero Xtreme III that everyone is recommending starts at $75.



Well, it's a bit ugly though
http://www.ebay.com.au/itm/Gelid-So...o_Card_GPU_Cooling&hash=item19e4830891&_uhb=1


----------



## qubit (Nov 9, 2013)

*What that B1 revision is for*



> As for what’s new for B1, NVIDIA is telling us that it’s a fairly tame revision of GK110. NVIDIA hasn’t made any significant changes to the GPU, rather they’ve merely gone in and fixed some errata that were in the earlier revision of GK110, and in the meantime tightened up the design and leakage just a bit to nudge power usage down, the latter of which is helpful for countering the greater power draw from lighting up the 15th and final SMX. Otherwise B1 doesn’t have any feature changes nor significant changes in its power characteristics relative to the previous revision, so it should be a fairly small footnote compared to GTX 780.



AnandTech


----------



## Animalpak (Nov 9, 2013)

I read that a Black edition with a black shroud will be released... 

Any information about it? Differences? Specs?


----------



## MxPhenom 216 (Nov 9, 2013)

Animalpak said:


> I read that a Black edition with a black shroud will be released...
> 
> Any information about it? Differences? Specs?



Probably just 6GB and 12GB memory variants. The 12GB card will probably be $1000.


----------



## cm33le (Nov 9, 2013)

Breaking news:

The 780 Ti is being recalled in China.
A RAM power supply problem is affecting all brands. The new 780 Ti uses the 6-pin instead of the PCIe slot (as on the 780) for RAM power, but the first batch of 780 Ti cards still uses the old PCB, possibly resulting in a burnt board.


----------



## HTC (Nov 9, 2013)

cm33le said:


> Breaking news:
> 
> The 780 Ti is being recalled in China.
> A RAM power supply problem is affecting all brands. The new 780 Ti uses the 6-pin instead of the PCIe slot (as on the 780) for RAM power, but the first batch of 780 Ti cards still uses the old PCB, possibly resulting in a burnt board.



Really?

Source, please?


----------



## cm33le (Nov 9, 2013)

HTC said:


> Really?
> 
> Source, please?




It's only in Chinese.

From the forum:
http://www.chiphell.com/thread-897383-1-1.html

Also from a local HK retailer's Facebook (which makes it official):
https://www.facebook.com/Centralfield


----------



## HTC (Nov 9, 2013)

cm33le said:


> It's only in Chinese.
> 
> From the forum:
> http://www.chiphell.com/thread-897383-1-1.html
> ...



If true, I wonder what those saying "temps on the R9 290 series are too high" will say to this ...


----------



## TheoneandonlyMrK (Nov 9, 2013)

Animalpak said:


> i read that will be released a black edition with black case...
> 
> Any information about it ? Differences ? Specs ?



Probably it'll be dearer and come with a toy and a Black Edition sticker on the cooler.


----------



## Animalpak (Nov 9, 2013)

Pff, now maybe I should wait for MSI or Asus to put their custom coolers on it.

MSI Lightning 2?


----------



## radrok (Nov 9, 2013)

I'd call it the GTX 780 black dong edition; whoever buys a 12 GB gaming graphics card surely finds themselves with a dong :O

6GB, on the other hand, makes sense for some applications.


----------



## the54thvoid (Nov 9, 2013)

cm33le said:


> breaking news
> 
> 780ti is being recalled in China
> Ram power supply problem affecting all brands. The new 780ti uses 6pin instead of pci-e(780) for ram power supply, but the first batch 780ti still using the old pcb, result in possiblely burning the broad.



I took the liberty of Googling such an issue and found nothing mentioned. I then Google-translated all the posts of the Chiphell OP, Luminox. From what I read, it seems to suggest it is mostly a Galaxy (i.e. makers of the HOF cards) problem. Is this not so?
Also, it is already fixed. More info is needed before any real conclusions can be drawn, but Galaxy HOF 780s, when overclocked (on custom BIOSes), were getting a few burny burny problems.


----------



## HTC (Nov 9, 2013)

the54thvoid said:


> I took the liberty of Googling such an issue and found nothing mentioned. I then Google-translated all the posts of the Chiphell OP, Luminox. From what I read, it seems to suggest it is mostly a Galaxy (i.e. makers of the HOF cards) problem. Is this not so?
> Also, it is already fixed. More info is needed before any real conclusions can be drawn, but Galaxy HOF 780s, when overclocked (on custom BIOSes), were getting a few burny burny problems.



I tried finding it here and @ XS but didn't spot any posts referring to this, other than his, so I thought this was a false claim; and it's from this dude's 1st and 2nd posts, btw.


----------



## qubit (Nov 9, 2013)

If this is really a significant problem, then it'll be all over the news soon enough without anyone having to decipher obscure Chinese forum posts.


----------



## erocker (Nov 9, 2013)

One version of the Galaxy 780 Ti is having problems. That's it. There is no problem with the 780 Ti as a whole... other than its price.


----------



## BiggieShady (Nov 9, 2013)

Aerpoweron said:


> Is it the real DX11.1 this time, or just 11.0 like on the other 7xx cards?
> 
> Nvidia is still confusing me with their Direct X labeling.



I'm not happy either; it's still only partial DirectX 11.1 support. They don't have 4 non-gaming features:

- Target-Independent Rasterization (2D rendering only)
- 16xMSAA Rasterization (2D rendering only)
- Orthogonal Line Rendering Mode
- UAV in non-pixel-shader stages

Everything 3D-gaming related from 11.1 is covered.


----------



## Fluffmeister (Nov 9, 2013)

BiggieShady said:


> I'm not happy either; it's still only partial DirectX 11.1 support. They don't have 4 non-gaming features:
> 
> - Target-Independent Rasterization (2D rendering only)
> - 16xMSAA Rasterization (2D rendering only)
> ...



Indeed, everything that is relevant for a gaming card is supported on this card aimed at gamers.

Although I still wake up in the middle of the night in a hot sweat thinking my card doesn't support Orthogonal Line Rendering Mode.


----------



## Arjai (Nov 10, 2013)

Fluffmeister said:


> Indeed, everything that is relevant for a gaming card is supported on this card aimed at gamers.
> 
> Although I still wake up in the middle of the night in a hot sweat thinking my card doesn't support Orthogonal Line Rendering Mode.



LOL


----------



## TRWOV (Nov 10, 2013)

^seconded


----------



## cm33le (Nov 10, 2013)

Just sharing: it seems not all 780 Ti cards are affected, otherwise it would be all over the news.
Also, the admin of that Chinese forum is trying to block the news....
I will keep you updated.

http://www.chiphell.com/thread-897838-2-1.html

780ti pcb?


----------



## The Von Matrices (Nov 10, 2013)

cm33le said:


> Just sharing: it seems not all 780 Ti cards are affected, otherwise it would be all over the news.
> Also, the admin of that Chinese forum is trying to block the news....
> I will keep you updated.
> ...



From my interpretation, it looks like an issue with the power balancing introduced with the 780 Ti. It almost looks like the entire GPU load was being drawn through the PCIe slot, which would burn out the small traces designed to deliver only 75W.


----------



## qubit (Nov 10, 2013)

cm33le said:


> Just sharing: it seems not all 780 Ti cards are affected, otherwise it would be all over the news.
> Also, the admin of that Chinese forum is trying to block the news....
> I will keep you updated.
> ...



It wouldn't surprise me if it wasn't actually the card's fault.

You could get this if the card wasn't making good contact with the PCI-E slot, perhaps, or the same again with the PCI-E power connectors. Perhaps the user was dumb enough to pull out one or both of those connectors while the card was running? Perhaps the mobo itself was faulty?

No need to immediately blame the card for the damage.


----------



## R0H1T (Nov 11, 2013)

erocker said:


> One version of the Galaxy 780Ti is having problems. That's it. There is no problem with 780ti as a whole... Other than it's price.


Not here to insinuate anything, but the Tom's article about the 290/X retail parts being inferior to their review samples quickly went viral, especially since it was Nvidia that pointed it out. So to say that such a thing is no "big deal" is being disingenuous, and Nvidia is known to employ such tactics to hurt their competition, aka AMD. Just saying it out loud, you know.


----------



## the54thvoid (Nov 11, 2013)

R0H1T said:


> Not here to insinuate anything, but the Tom's article about the 290/X retail parts being inferior to their review samples quickly went viral, especially since it was Nvidia that pointed it out. So to say that such a thing is no "big deal" is being disingenuous, and Nvidia is known to employ such tactics to hurt their competition, aka AMD. Just saying it out loud, you know.



Anandtech has printed a follow-up to their own review in which they reference the Tom's article. The deviation in performance was down to fan speeds being variable. To rectify this, there is a new beta driver that takes care of it by.... making the fan RPM higher (creating a more effective cooling solution to allow higher, more stable clocks). The result is a 100-150 RPM increase. This creates no detriment on the 290, but it does make the 290X cards somewhat noisier. The performance change is negligible, but more stable.

http://www.anandtech.com/show/7501/amd-changes-290-series-fan-algorithms


----------



## Pandora's Box (Nov 11, 2013)

This is what happens when you push it to the wall:
















1286MHz Core, 7.7GHz Memory, 1.212 Volts

Card was using 106% of TDP on this custom BIOS. 100% TDP = 300 watts, so I was pulling 318 watts, lol.
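For anyone wanting to redo that power-target arithmetic for their own card, it's just a percentage conversion; here's a minimal sketch (the helper name is made up, not from any tool):

```python
def board_power_watts(tdp_watts: float, tdp_percent: float) -> float:
    """Convert a power-target (TDP slider) percentage into an absolute draw in watts."""
    return tdp_watts * tdp_percent / 100.0

# 106% of a 300 W power target, as reported above:
print(board_power_watts(300, 106))  # 318.0
```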


----------



## Pandora's Box (Nov 11, 2013)

W1zzard, in TechPowerUp's Sleeping Dogs benchmark, are you guys using High Quality textures?












This is at 1.2GHz core, 7.7GHz mem.

Crazy that I'm hitting a CPU bottleneck here. 3770K at 4.5GHz, and the 780 Ti is only hitting 80% utilization.

Also, it'd be awesome if TPU could start providing minimum framerates in their benchmarks.


----------



## DeadSkull (Nov 11, 2013)

HTC said:


> I tried finding it here and @ XS but didn't spot any posts referring to this, other than his, so I thought this was a false claim; and it's from this dude's 1st and 2nd posts, btw.



It's mostly on Overclock.net. You can also check the Newegg reviews and see the problem is reported there.


----------



## R0H1T (Nov 11, 2013)

the54thvoid said:


> Anandtech has printed a follow-up to their own review in which they reference the Tom's article. The deviation in performance was down to fan speeds being variable. To rectify this, there is a new beta driver that takes care of it by.... making the fan RPM higher (creating a more effective cooling solution to allow higher, more stable clocks). The result is a 100-150 RPM increase. This creates no detriment on the 290, but it does make the 290X cards somewhat noisier. The performance change is negligible, but more stable.
> 
> http://www.anandtech.com/show/7501/amd-changes-290-series-fan-algorithms


I'm very much aware of the developments at Tom's & AT after this story went viral. However, it was Nvidia that first reported it to some journos, and then it was picked up at Tom's, after which nearly every other review site has had an article about it as if it was the end of the world! If this isn't stooping to a new low then I don't know what is; however, the fact that Nvidia did this is not a surprise anymore.


----------



## Pandora's Box (Nov 11, 2013)

cm33le said:


> Breaking news:
> 
> The 780 Ti is being recalled in China.
> A RAM power supply problem is affecting all brands. The new 780 Ti uses the 6-pin instead of the PCIe slot (as on the 780) for RAM power, but the first batch of 780 Ti cards still uses the old PCB, possibly resulting in a burnt board.





the54thvoid said:


> I took the liberty of Googling such an issue and found nothing mentioned. I then Google-translated all the posts of the Chiphell OP, Luminox. From what I read, it seems to suggest it is mostly a Galaxy (i.e. makers of the HOF cards) problem. Is this not so?
> Also, it is already fixed. More info is needed before any real conclusions can be drawn, but Galaxy HOF 780s, when overclocked (on custom BIOSes), were getting a few burny burny problems.





HTC said:


> I tried finding it here and @ XS but didn't spot any posts referring to this, other than his, so I thought this was a false claim; and it's from this dude's 1st and 2nd posts, btw.





DeadSkull said:


> It's mostly on Overclock.net. You can also check the Newegg reviews and see the problem is reported there.




So... lol? Turns out that the burnt video card connector is not from a 780 Ti; it's in fact from a 7970...

Burnt Card:






780 TI PCB:






7970 PCB






http://www.overclock.net/t/1438886/official-nvidia-gtx-780-ti-owners-club/1300_50#post_21166432

http://www.overclock.net/t/1440518/various-geforce-gtx-780ti-reviews/1000_50#post_21166984


----------



## HTC (Nov 11, 2013)

Pandora's Box said:


> So... lol? Turns out that the burnt video card connector is not from a 780 Ti; it's in fact from a 7970...
> 
> Burnt Card:
> 
> ...



Really? OMFG!!!!!

What on earth was the dude that posted that photo as a burnt nVidia 780 Ti card thinking?

The sheer amount of FAIL baffles me ...


----------



## qubit (Nov 11, 2013)

HTC said:


> Really? OMFG!!!!!
> 
> What on earth was the dude that posted that photo as a burnt nVidia 780 Ti card thinking?
> 
> The sheer amount of FAIL baffles me ...



Indeed. This is why I didn't get too excited when I first saw this posted on here. Something didn't look right about it from the off.

Good detective work by Pandora's Box to identify the actual card.


----------



## TheoneandonlyMrK (Nov 11, 2013)

Pandora's Box said:


> So... lol? Turns out that the burnt video card connector is not from a 780 Ti; it's in fact from a 7970...
> 
> Burnt Card:
> 
> ...



A post on one forum Chinese-whispering another Chinese forum, and the case is solved? Don't talk ass chowder.
Probably all FUD anyway, but third-hand news isn't definitive fact, IMHO.


----------



## cm33le (Nov 11, 2013)

Case closed

Only one batch of 780 Ti cards is affected in China, mainly from Galaxy: they wrongly placed a metal-containing label on the MOSFETs, making them overheat. Per the official Galaxy China announcement, only serial nos. 13B0020705-13B0020759, 55 cards in total, need to be replaced.


----------



## nemesis.ie (Nov 11, 2013)

ST.Viper said:


> So true... Hope that one day, when 4K monitors become mainstream, they also make them at 22-24" sizes, because 32" is way too big to sit just 30cm away from your head.



I disagree. I've been running a 30" since 2007 and it's actually about perfect at 30cm (and you can easily sit another 5 to 10cm away if you like, unless you have a tiny desk); it almost immerses the main field of view. ~42" at 50cm for 4K sounds like a winner to me (when more affordable).


----------



## Boilerhog (Nov 11, 2013)

nemesis.ie said:


> I disagree. I've been running a 30" since 2007 and it's actually about perfect at 30cm (and you can easily sit another 5 to 10cm away if you like, unless you have a tiny desk); it almost immerses the main field of view. ~42" at 50cm for 4K sounds like a winner to me (when more affordable).



Me too. I've been using a 30" Dell @ 2560x1600 since 2006; I cannot go back, only forward. Come on, affordable 4K at 30" would be fine with me! Got the ol' lady a nice Asus 27" recently, just so I could sit at her computer from time to time.. lol


----------



## TheHunter (Nov 11, 2013)

Pandora's Box said:


> W1zzard, in TechPowerUp's Sleeping Dogs benchmark, are you guys using High Quality textures?
> 
> http://i.imgur.com/0Lxi1v2.gif
> 
> ...




Well, TechPowerUp used a 4770K @ 4.2GHz, and that is a little faster in CPU-bound locations.


----------



## DeadSkull (Nov 11, 2013)

Pandora's Box said:


> So... lol? Turns out that the burnt video card connector is not from a 780 Ti; it's in fact from a 7970...
> 
> Burnt Card:
> 
> ...



nvm


----------



## TheoneandonlyMrK (Nov 12, 2013)

DeadSkull said:


> nvm




???????????????? what?

=




cm33le said:


> Case closed
> 
> Only one batch of 780 Ti cards is affected in China, mainly from Galaxy: they wrongly placed a metal-containing label on the MOSFETs, making them overheat. Per the official Galaxy China announcement, only serial nos. 13B0020705-13B0020759, 55 cards in total, need to be replaced.



exactly


----------



## DeadSkull (Nov 12, 2013)

theoneandonlymrk said:


> ???????????????? what?
> 
> =
> 
> ...



I thought they were still talking about the GTX 780 Hall of Fame cards cooking off because of bad VRMs.


----------



## haswrong (Nov 12, 2013)

DeadSkull said:


> I thought they were still talking about the GTX 780 Hall of Fame cards cooking off because of bad VRMs.



:shadedshu 
http://i.imgur.com/02X7X7c.png


----------



## qubit (Nov 12, 2013)

W1zz, are we gonna see an SLI review with this? I'm really curious to see what it can do.

I was thinking perhaps downclocking the EVGA ACX card to reference speeds and using that if you don't have another reference card to hand.


----------



## nullington (Nov 16, 2013)

Thanks for the review!

Btw, the Battlefield 3 description is out of date. It's not the latest addition anymore.


----------

