# EVGA GTX 780 Ti SuperClocked w/ ACX Cooler 3 GB



## W1zzard (Nov 12, 2013)

EVGA strapped their ACX cooler onto the GTX 780 Ti just days after NVIDIA's launch. The new card is also overclocked out of the box, which results in a large performance increase over the stock GTX 780 Ti - EVGA's latest card now even matches the performance of the dual-GPU GTX 690.

*Show full review*


----------



## qubit (Nov 12, 2013)

Dammit I want one!


----------



## FreedomEclipse (Nov 12, 2013)

Interesting conclusion.

But IMO... What money you are *SAVING* buying a 290X you end up paying for in your electric bill. Obviously that's not true for everyone, but here in the UK people are being forced to decide between skipping meals to pay for heating or vice versa.

But I digress... It's gonna be a harsh winter, and the 95°C of the 290X may well come in handy as a room heater.

My crossfired 6970s that routinely ran up to 85-90°C surely did (thank you XFX).


----------



## SKBARON (Nov 12, 2013)

FreedomEclipse said:


> Interesting conclusion.
> 
> But IMO... What money you are *SAVING* buying a 290X you end up paying for in your electric bill. Obviously that's not true for everyone, but here in the UK people are being forced to decide between skipping meals to pay for heating or vice versa.
> 
> ...




I used to warm my hands with the exhaust from my old 4890


----------



## Frick (Nov 12, 2013)

FreedomEclipse said:


> But IMO... What money you are *SAVING* buying a 290X you end up paying for in your electric bill. Obviously that's not true for everyone, but here in the UK people are being forced to decide between skipping meals to pay for heating or vice versa.




http://tpucdn.com/reviews/EVGA/GTX_780_Ti_SC_ACX_Cooler/images/power_peak.gif

In Furmark it rises to a 36W difference though, which absolutely no one can use as an argument for this card being more economical than a 290X. That's less than an old lightbulb.

And if you have to choose to buy food or heating you're probably not in a position to buy a high end GPU anyway.
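To put a rough figure on what that worst-case 36W delta actually costs, here is a back-of-the-envelope sketch. The tariff and daily gaming hours are assumptions for illustration (roughly a 2013 UK rate), not numbers from the review:

```python
# Rough annual cost of a 36 W power-draw difference.
# All inputs below are assumptions, not figures from the review.
WATT_DELTA = 36        # worst-case (Furmark) difference in watts
HOURS_PER_DAY = 4      # assumed heavy-gaming hours per day
PRICE_PER_KWH = 0.15   # assumed UK tariff in GBP per kWh (2013-ish)

kwh_per_year = WATT_DELTA * HOURS_PER_DAY * 365 / 1000
cost_per_year = kwh_per_year * PRICE_PER_KWH

print(f"{kwh_per_year:.1f} kWh/year -> GBP {cost_per_year:.2f}/year")
```

Under those assumptions the gap works out to around 53 kWh, or under £10 a year, which is why it is negligible next to the card's purchase price.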


----------



## Go To Sleep (Nov 12, 2013)

That... is neat!


----------



## Kaynar (Nov 12, 2013)

Awesome card with that stock speed... but I can't agree with FreedomEclipse's point... We are talking about a 10W difference from the 290X in average usage... that's not even a light bulb's consumption... the difference is negligible... it's not like 250W vs 400W.

On the other hand, as W1zz says, let's see what will happen with the retail cards. On AMD's side things look terrible, with some reviews finding retail models 10+% slower than review samples (due to throttling from heat and driver changes).


----------



## the54thvoid (Nov 12, 2013)

That's insane.

Wtf is the EVGA 780Ti Classified going to do?


----------



## Pandora's Box (Nov 12, 2013)

the54thvoid said:


> That's insane.
> 
> Wtf is the EVGA 780Ti Classified going to do?



780 Ti Classified PCB:

http://cdn.overclock.net/7/78/78e2da61_PCBB.jpeg


----------



## the54thvoid (Nov 12, 2013)

Pandora's Box said:


> 780 TI Classified PCB:
> 
> http://cdn.overclock.net/7/78/78e2da61_PCBB.jpeg



That's a 680 Classified PCB  

http://www.hardwareluxx.com/index.p...-evga-geforce-gtx-680-classified.html?start=2


----------



## FreedomEclipse (Nov 12, 2013)

Frick said:


> http://tpucdn.com/reviews/EVGA/GTX_780_Ti_SC_ACX_Cooler/images/power_peak.gif
> 
> In Furmark it ranks up to a 36W difference though, which absolutely no one can use as an argument for this card being more more economic than a 290x. That's less than an old lightbulb.
> 
> And if you have to choose to buy food or heating you're probably not in a position to buy a high end GPU anyway.



FYI - I'm comparing the max power draw, not the peak draw.

http://img.techpowerup.org/131112/Power.jpg

Though, even 11W bulbs are pretty bright nowadays.

Starving or not, it doesn't make my point any less valid - the 290X still isn't as power efficient as the 780 Ti, however marginal the differences are. It still has an impact on the electricity bill at the end of the month.

I went from an older system - a C2Q with 6970s in crossfire - and there was quite a substantial drop in my electricity bill when that particular rig was retired and I transferred over to a 2500K @ 4.9GHz and 680s in SLI.


----------



## Frick (Nov 12, 2013)

FreedomEclipse said:


> FYI - I'm comparing the max power draw, not the peak draw.
> 
> http://img.techpowerup.org/131112/Power.jpg
> 
> ...



36W then. Which is an unrealistic load (Furmark), and you can get the same savings from... spending less time at the can? Yes, there are savings, but at that point they are negligible. Now if you had like 4 cards and ran Furmark 24/7 it would make a nice difference.

Different systems have different power draws, that is obvious. I don't get your point with that example.

BTW, the jump from the 780 Ti to this particular model is nearly the same as the jump from this model to a 290X.


----------



## LAN_deRf_HA (Nov 12, 2013)

No need to harp on power savings here, the performance gap is enough. You can talk about power when some custom 290X tries to close the gap with factory overclocks. Then you'll have a ridiculous power consumption difference.


----------



## FreedomEclipse (Nov 12, 2013)

Frick said:


> 36W then. Which is an unrealistic load (Furmark), and you can get the same savings from... spending less time at the can? Yes, there are savings, but at that point they are negligible. Now if you had like 4 cards and ran Furmark 24/7 it would make a nice difference.
> 
> Different systems have different power draws, that is obvious. I don't get your point with that example.
> 
> BTW, the jump from the 780 Ti to this particular model is nearly the same as the jump from this model to a 290X.



780Ti still has less power draw.


----------



## manofthem (Nov 12, 2013)

That card is awesome! I wish I had the money for this! Thanks W1zz


----------



## Casecutter (Nov 12, 2013)

Peak is gaming under a loaded condition and closest to real world; looking at Furmark means nothing. What would be more correct on such cards is to pull the power usage over a group of titles (six minimum) and then also translate that into perf/watt.

I find it odd that W1zz uses Crysis 2 at 1920x1080 on the Extreme profile for those peak numbers when that isn't even a title FPS results are provided for. How is perf/watt calculated?


----------



## Pandora's Box (Nov 12, 2013)

Honestly, do we really care about perf/watt when we are talking about $700+ video cards?


----------



## W1zzard (Nov 12, 2013)

Pandora's Box said:


> Honestly, do we really care about perf/watt when we are talking about $700+ video cards?



that's up to you to decide. I simply provide the data. I'm usually not looking at it too hard for high-end, unless it affects heat/noise.



Casecutter said:


> I find it odd that Wizz uses Crysis 2 at 1920x1080, Extreme profile for those Peak numbers while that isn't even a title that FpS results are provided for?



We chose Crysis 2 as a standard test representing typical 3D gaming usage because it offers very high power draw and high repeatability, is a current game that is supported on all cards thanks to its DirectX 9 roots, has drivers actively tested and optimized for it, supports all multi-GPU configurations, completes a test run in a relatively short time, and renders a non-static scene with variable complexity.

If I change the test I'll have to retest all existing cards. I doubt any newer game will make a difference, but I'll give it a try before the next round of rebenches.

Edit: If we used something like BF4 at Ultra settings (which is probably what you are looking for), what will happen to all these cards with 2 GB and below? They will either not run or be bottlenecked by swapping from VRAM into system memory, showing wrong power consumption. What about IGP power consumption?

Don't worry, I do not pick tests randomly


----------



## Casecutter (Nov 12, 2013)

Pandora's Box said:


> Honestly, do we really care about perf/watt when we are talking about $700+ video cards?


We should "care" cards should provide some degree of improvement (probably something we can't truely expect holding to 28Nm), but not focus on that one limited data point to call it good, bad, or whatever isn’t a proper matrix.


----------



## Casecutter (Nov 12, 2013)

W1zzard said:


> Don't worry, I do not pick tests randomly


Oh, I understand the need for repeatable data, but at the least Crysis 2 should still be a game that FPS is provided for. As many of these cards fluctuate clocks while spitting out frames, it's hard to correlate a single collected point and apply it to all titles.

Perhaps a separate article that runs half a dozen newer games at 2560x, pulls watts from each of those, turns that into FPS per watt, and then averages it. I think it would, at the present juncture at least, set a baseline showing whether the current method is valid.

Thanks,
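The averaging being proposed here is simple to express. A minimal sketch, with made-up titles and figures purely as placeholders (none of these numbers are TPU data):

```python
# Average perf/watt across a group of titles (all figures are hypothetical).
measurements = {
    # title: (average FPS, board power in watts during that run)
    "Title A": (95.0, 240.0),
    "Title B": (72.0, 228.0),
    "Title C": (110.0, 251.0),
}

# Per-title efficiency, then a simple mean across titles.
per_title = {t: fps / watts for t, (fps, watts) in measurements.items()}
avg_perf_per_watt = sum(per_title.values()) / len(per_title)

for title, ppw in per_title.items():
    print(f"{title}: {ppw:.3f} FPS/W")
print(f"Average: {avg_perf_per_watt:.3f} FPS/W")
```

Averaging per-title ratios (rather than total FPS over total watts) keeps one power-hungry title from dominating the result, which seems closest to what is being asked for.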


----------



## TheoneandonlyMrK (Nov 12, 2013)

FreedomEclipse said:


> Interesting ConClusion.
> 
> But IMO... What money you are *SAVING* buying a 290X you end up paying for in your electric bill. Obviously thats not true for everyone but here in the UK, people are being forced to decide between having skipping meals to pay for heating or vice versa.
> 
> ...



I swear this is a point many companies and people need to look into, because a one-grand PC folding on two GPUs plus an 8-core CPU puts out pretty much the same heat as most 1kW heaters, yet does not use 1kW, and has the added bonus of doing more good than just warming fingers.

I really do think it's a big opportunity going missed, as any house heat generator should be made of multiple compute elements designed for high temps and long-term use. Where's the IoT, and why isn't it working on warming my ass (or fingers)?

Might as well get on topic: card looks good, can't see any issues bar price, and AIB-sourced R9 290Xs should help the price situation before Xmas.


----------



## VulkanBros (Nov 12, 2013)

Ha...talking about heat.....
I still toast my eggs on my GTX 480


----------



## Casecutter (Nov 12, 2013)

theoneandonlymrk said:


> I swear this is a point many companies and people need to look into


Gee, thanks, you're going to give my wife a reason to run the heater!
Hunny, I'm trying to save mankind...


----------



## qubit (Nov 12, 2013)

W1zzard said:


> Edit: If we used something like BF4 at Ultra settings (which is probably what you are looking for), what will happen to all these cards with 2 GB and below? They will either not run or be bottlenecked by swapping from VRAM into system memory



I've been curious about the effect on a high-end graphics card if it ran out of memory.

Say something like a GTX 690 with "just" 2GB was averaging 100fps, but then ran out of memory on an especially intensive part of the benchmark, what sort of framerate figures would we see?


----------



## TheoneandonlyMrK (Nov 12, 2013)

Casecutter said:


> Gee thanks your going to give my wife a reason to run the heater!
> Hunny I'm trying to save mankind...



That's the spirit, you could hunt for aliens or some such as an alternative.


----------



## W1zzard (Nov 12, 2013)

qubit said:


> Say something like a GTX 690 with "just" 2GB was averaging 100fps, but then ran out of memory on an especially intensive part of the benchmark, what sort of framerate figures would we see?



It depends on how much of those textures the application really uses on an ongoing basis. If it constantly has to shuffle stuff from main memory via PCIe into VRAM and back it will be extremely slow.

In reality it won't be as much of an issue, educated guess, around 30% performance lost. I'm not thinking about GTX 690 now, more like lower midrange 1 GB card.
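To put rough numbers on why shuffling over the bus hurts so much, a quick comparison of transfer times. Both bandwidth figures below are generic ballpark assumptions for 2013-era hardware (PCIe 3.0 x16 versus GDDR5 local memory), not measurements from this review:

```python
# Time to read a 1 GB texture working set from local VRAM vs. over PCIe
# (both bandwidth figures are ballpark assumptions, not benchmarks).
WORKING_SET_GB = 1.0
VRAM_GBPS = 190.0   # assumed GDDR5 local-memory bandwidth
PCIE_GBPS = 16.0    # assumed PCIe 3.0 x16 effective bandwidth

t_vram = WORKING_SET_GB / VRAM_GBPS * 1000   # milliseconds
t_pcie = WORKING_SET_GB / PCIE_GBPS * 1000   # milliseconds

print(f"VRAM: {t_vram:.1f} ms, PCIe: {t_pcie:.1f} ms, "
      f"slowdown: {t_pcie / t_vram:.0f}x")
```

Under these assumptions the bus is an order of magnitude slower than local memory, which is why a card that constantly spills out of VRAM stalls so badly: the GPU sits waiting on transfers instead of rendering.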


----------



## Slomo4shO (Nov 12, 2013)

Impressive results, when can we expect the Classy or non-reference 290[X] reviews?


----------



## W1zzard (Nov 12, 2013)

Slomo4shO said:


> Impressive results, when can we expect the Classy or non-reference 290[X] reviews?



No concrete date for either. The vendors probably don't know themselves.


----------



## qubit (Nov 12, 2013)

I've just properly read that review (I skimmed it previously) and I must say I'm impressed with what this cooler has done for the 780 Ti. It's even increased the bang-for-buck rating, which isn't normally the case.

I'd love to see what custom coolers from other players can do with this card.

Great review as always, W1zz.


----------



## Aithos (Nov 12, 2013)

Is there any chance of getting an ACX vs. reference SLI review? I was going to wait for the ACX EVGA card but ended up going with the EVGA SuperClocked reference model because it was available last week and the ACX wasn't (and people seem to think the blower coolers are better for SLI). I'd love to see an SLI benchmark pitting the SuperClocked versions against each other...


----------



## Slomo4shO (Nov 12, 2013)

W1zzard said:


> No concrete date for either. The vendors probably don't know themselves.



How about including 5760x1080 in the performance summaries, since you have run many of the individual benchmarks at those settings?


----------



## W1zzard (Nov 12, 2013)

Aithos said:


> Is there any chance of getting a ACX vs Reference SLI review?  I was going to wait for the ACX EVGA card but ended up going with the EVGA superclocked reference model because it was available last week and the ACX wasn't (and people seem to think the blower coolers are better for SLI).  I'd love to see a SLI benchmark pitting the superclocked versions against each other...



no plans for any 780 Ti SLI reviews



Slomo4shO said:


> How about including 5760x1080 in the performance summaries, since you have run many of the individual benchmarks at those settings?



that's planned for the next rebench. the results can't be included in the overall summary of all resolutions though, because I don't test all cards at that resolution, nor do they all support triple monitor


----------



## Zubasa (Nov 12, 2013)

*For people complaining about power draw*

Power draw?
It is one of the most efficient GPUs out there at higher resolutions, 'nuff said.


----------



## Slomo4shO (Nov 12, 2013)

I would like to see the actual performance difference between equally clocked 780 and 780 Ti.


----------



## Kaynar (Nov 12, 2013)

W1zzard said:


> No concrete date for either. The vendors probably don't know themselves.



I can understand that there are no custom-PCB cards for the new AMDs for about 1-2 months after release, but why are there no custom coolers from day 1 on AMD's side? XFX showed it was possible with the 7970 (though other brands didn't have any options for more than a month), and on NVIDIA's side we also get custom-cooled cards really fast.

If at least AMD's stock cooler were not a total failure...




Slomo4shO said:


> I would like to see the actual performance difference between equally clocked 780 and 780 Ti.



Yeah, I'd like that too... plus the Titan. All 780s and Titans can do a base 3D clock of 1000MHz without a voltage boost, I guess...


----------



## msamelis (Nov 13, 2013)

It seems I don't understand marketing very well...

First, Nvidia announces an uber-expensive, uber-performing GPU, the Titan. Afterwards it releases the 780, which is pretty close and makes it look like good value for money compared to the Titan - but still expensive nonetheless (a good strategy to get more money, I guess). Then they release the "Titan killer" - which AMD was supposed to do - with the 780 Ti. If I am not mistaken, practically the only thing the Titan has to offer over the 780 Ti is extra RAM.

On the AMD side, they came out with cards that run hot and are noisy as hell but are excellent performers at a good price. The 290X started trading blows with the 780 - never mind the Titan any longer - but then they won't let them be released with aftermarket coolers. After that, they released the 290, which was very close performance-wise to the 290X but even cheaper, and it was a better OCer.

Is it only me that is getting confused here? Quite honestly, I want to upgrade my GPU since the 480 is starting to rust, so to speak, and I no longer have any clue as to what to get. Sell my left kidney and buy a 780 Ti, or keep it and get deafened and sweaty by a 290 or 290X? I honestly don't know any more, but it seems I should wait until they release more 780 Tis and for AMD to let aftermarket coolers be added to their new GPUs.


----------



## qubit (Nov 13, 2013)

msamelis said:


> It seems I don't understand marketing very well..
> 
> First, Nvidia announces some uber expensive and performing GPU, the Titan. Afterwards it releases the 780 which is pretty close and makes it look like good value for money compared to the Titan - but still expensive nonetheless (good strategy to get more money I guess). Then they release the "Titan killer" - which AMD was supposed to do - with the 780Ti. If I am not mistaken, the only thing that the Titan has to offer over the 780Ti is extra RAM practically.
> 
> ...



Good points you raise there - and I'm not surprised you're confused as these things are always a bit of a grey area.

Personally, I prefer NVIDIA as they tend to have the better all-round product and I've been using them happily for years - just check the reviews on here to see what I mean.

You can stick a 780 / 780 Ti in your system and have superb framerate performance for reasonable noise without having to worry about any annoying issues. Oh and that stock cooler is very nice in terms of performance and looks. All this comes at a rather premium price, of course and that does suck, I know.

However, after all is said and done, the only thing wrong I can really see with the 290 / 290X is that daft stock cooler. Yes, it seems to be beyond comprehension why AMD are forcing their partners to wait with custom coolers and it's frustrating, but they are.

I'd say wait a month or so before buying anything. The 780 Ti will drop in price and those 290 / 290X cards with custom coolers will be out along with the reviews. We'll then see what this chip can _really_ do when not strangled by heat. You'll then be in a much better position to decide what to buy - and you'll get more for your money.

Whatever you do, don't buy the Titan as it's stupidly expensive and a total waste of money. That 6GB RAM isn't required, it runs slower than a 780 Ti in games with its only advantage being compute, which isn't used by most enthusiasts. It's more for scientific modelling, things like that.


----------



## xenocide (Nov 13, 2013)

msamelis said:


> It seems I don't understand marketing very well..
> 
> First, Nvidia announces some uber expensive and performing GPU, the Titan. Afterwards it releases the 780 which is pretty close and makes it look like good value for money compared to the Titan - but still expensive nonetheless (good strategy to get more money I guess). Then they release the "Titan killer" - which AMD was supposed to do - with the 780Ti. If I am not mistaken, the only thing that the Titan has to offer over the 780Ti is extra RAM practically.
> 
> ...



You're forgetting that Titan is nearly a year old and was basically a poor man's Tesla K20/K20X. It was a really good card for people that needed compute and saved them some money (about $2,000+ in fact); it just happened to have solid enough gaming performance. The Titan was a bad deal for gamers but a great deal for people who wanted Teslas. It was a workstation GPU that was poorly marketed to gamers.


----------



## EpicShweetness (Nov 13, 2013)

Looking at the power consumption (nearly identical to the R9 290/X), this gives me high hopes for aftermarket cooling on those AMD products. Seriously, if someone that's as much of a dunce at cooling as EVGA can do that, imagine what the nuts at MSI or ASUS will manage.


----------



## xenocide (Nov 13, 2013)

EpicShweetness said:


> Looking at the power consumption (nearly identical to the R9 290/X), this gives me high hopes for aftermarket cooling on those AMD products. Seriously, if someone that's as much of a dunce at cooling as EVGA can do that, imagine what the nuts at MSI or ASUS will manage.



Keep in mind the recent AMD releases were designed to run hot, so I wouldn't expect the temperatures to absolutely plummet, but there's no way that stock/reference cooler is anything short of loud garbage.


----------



## SIGSEGV (Nov 13, 2013)

xenocide said:


> Keep in mind the recent *AMD releases were designed to run hot*



Sadly, that would do a serious damage hit to the score in many reviews out there. I'm sure many reviewers would still give at least the same score or better even if NVIDIA or their board partners charged $900 for this card, as long as it stays cooler and remains quieter than the competition.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/36.html


----------



## sweet (Nov 13, 2013)

Kaynar said:


> Awesome card with that stock speed... but I can't agree with FreedomEclipse's point... We are taking about at 10W difference with the 290X  average usage... that's not even a light bulb's consumption... the difference is negligible... its not like 250W vs 400W....
> 
> On the other hand, as Wizz says, let's see what will happen with the retail cards. On AMD's side things look terrible according to some reviews with retail models being 10+% slower than review samples... (due to throttling from heat and driver changes)



W1zz already benched a retail card, the PowerColor 290X OC:
http://www.techpowerup.com/reviews/Powercolor/R9_290X_OC/


----------



## buggalugs (Nov 13, 2013)

W1zzard said:


> No concrete date for either. The vendors probably don't know themselves.



Well, I think the vendors (AIB partners) would know, and they are probably making them now. It takes a while to design and manufacture non-reference cards, or any new cards. If they are expected before Christmas, they must be making them now.

Same with the 780 Ti: they are available in stores and in stock now, just days after AMD's 290X release. Those cards must have started design and manufacture months ago, just waiting for AMD's announcement.

It's obvious both companies know exactly what cards are coming and the performance that can be expected long before the public or media know about it. That makes sense if both chips are made at the same factory.

So while Nvidia were spruiking Titan as a killer $1,000 card, they were already working on the 780 Ti and knew that in just a few months consumers would have a much faster and much cheaper card in the 780 Ti. Waiting patiently for the AMD 290X announcement, then BAAM, here's a 780 Ti.

If anyone thinks Nvidia/AMD only find out about the competition's cards when they are publicly released - it just doesn't work that way.

We need an insider in Taiwan. I don't know how they keep this information secret for so long.


----------



## W1zzard (Nov 13, 2013)

buggalugs said:


> well I think the vendors (AIB partners)would know, and are probably making them now.



Last I checked with one of the biggest AIBs, the answer was "the only thing we know is that we don't know". That's for custom AMD cards; custom NVIDIA cards are progressing as planned.

Regarding the EVGA Classified, I haven't asked. Still haven't found time to review the 780 Non-Ti Classified that they offered.


----------



## Frick (Nov 13, 2013)

sweet said:


> W1zz already benched a retail card, Powercolor 290x OC
> http://www.techpowerup.com/reviews/Powercolor/R9_290X_OC/



Doesn't count, it's a stock cooler.


----------



## buggalugs (Nov 13, 2013)

W1zzard said:


> Last I checked with one of the biggest AIBs the answer was "the only thing we know is that we don't know". That's for custom AMD cards. plans for custom NVIDIA cards are progressing as planned
> 
> Regarding EVGA Classified, I haven't asked. Still haven't found time to review the 780 Non-Ti Classified that they offered



They must be good at keeping secrets, or maybe the PR guys don't know what's going on in the factories, which is probably more likely.

We know AMD and Nvidia come down hard on any leaks. I guess it's working...




Frick said:


> Doesn't count, it's a stock cooler.



He's talking about a RETAIL card, not a non-reference card.


----------



## W1zzard (Nov 13, 2013)

buggalugs said:


> They must be good at keeping secrets, or maybe the PR guys don't know whats going on in the factories which is probably more likely.
> 
> We know AMD and Nvidia come down hard on any leaks, I guess its working....



They tell me what's going on with their NVIDIA lineup in December and exactly what changes they are making to the card designs and when they ship, and then they tell me "no idea" when talking about AMD. Your theory seems flawed.

Everybody knows I'm not leaking, so they are comfortable talking about stuff. Oh boy, TPU news would be different; you'd be the first to know.


----------



## the54thvoid (Nov 13, 2013)

W1zzard said:


> They tell me what's going on with their NVIDIA lineup in December and exactly what changes they are making to the card designs, when they ship, and then they tell me "no idea" when talking about AMD. Your theory seems flawed
> 
> *Everybody knows I'm not leaking*, so they are comfortable talking about stuff. oh boy TPU news would be different, you'd be the first to know



I dunno man, the years of drugs and hookers must have taken their toll.  You sure you're not at least dribbling a bit?


----------



## Eric_Cartman (Nov 13, 2013)

Poor AMD fanboys.

They had their hopes set so high that aftermarket coolers would save the poor R9 290X.

Then EVGA comes out with this GTX 780 Ti and just destroys the R9 290X.

No aftermarket cooler could ever make the R9 290X as good as this card.

It is extremely sad that AMD can't make cards to compete with nVidia.

At this point nVidia is just re-using their year old technology and AMD still can't catch up.


----------



## BiggieShady (Nov 13, 2013)

Eric_Cartman said:


> No aftermarket cooler could ever make the R9 290X as good as this card.



I don't know, maybe the time has come for quad-slot coolers; crossfired 290Xs on air are more or less out of the question anyway.

EVGA made a great card - it's 35 dB at load and overclockable for almost 10% more performance on top of the out-of-the-box OC. The review doesn't say how much the fan ramps up while overclocked, though, and it would be nice to know.


----------



## manofthem (Nov 13, 2013)

Eric_Cartman said:


> Poor AMD fanboys.
> 
> They had their hopes set so high that aftermarket coolers would save the poor R9 290X.
> 
> ...




Cool story bro.  You're fantastic, aren't you :shadedshu

You going to upgrade your 660 to a 780ti? Didn't think so.


----------



## Eric_Cartman (Nov 13, 2013)

manofthem said:


> Cool story bro.  You're fantastic, aren't you :shadedshu
> 
> You going to upgrade your 660 to a 780ti? Didn't think so.



I don't have to.

I bought an nVidia card that is bad ass already.

It is the AMD fanboys that have been waiting all this time for a card that can finally compete with nvidia so they could upgrade and AMD failed.

AMD just released a bunch of rebrands of cards the AMD fanboys already had and one that requires a nuclear reactor in your house to power and overheats if you use it for more than 2 minutes at a time.


----------



## erocker (Nov 13, 2013)

Did someone learn a new word today? Yeesh! Tone it down man!


----------



## MxPhenom 216 (Nov 13, 2013)

That's an Nvidia fanboy if I have ever seen one.

Nvidia cards are good, but the prices can get bent.



Eric_Cartman said:


> AMD just released a bunch of rebrands of cards the AMD fanboys already had and one that requires a nuclear reactor in your house to power and overheats if you use it for more than 2 minutes at a time.



I guess you do not remember Nvidia and their rebranding frenzies of a few GPUs in every generation since the 8000 cards....

If Nvidia does that, AMD can too.


----------



## qubit (Nov 13, 2013)

Eric_Cartman said:


> I don't have to.
> 
> I bought an nVidia card that is bad ass already.
> 
> ...



Look, I'm an NVIDIA owner who loves his cards and gets your sentiment, but you should realize that this domination by NVIDIA isn't a good situation for us customers.

What we want is two (or preferably more) competitors that are more or less equal duking it out in the marketplace. That gives us the best graphics cards at the cheapest prices. Hopefully with a full-on price war. 

Great example: don't you think the 780 Ti is the card NVIDIA should have released 8 months ago instead of the crippled Titan at a ludicrous $1000? The minute AMD comes back with something moderately competitive we get a better product at a much cheaper price, the 780 Ti. If AMD were equal to NVIDIA in everything just think how good the products would be that we could choose from both companies.

_This_ is where we want to be. Not feeling smug at the other side for having a supposedly inferior product.


----------



## Steevo (Nov 13, 2013)

qubit said:


> _This_ is where we want to be. Not feeling smug at the other side for having an inferior product.



Not inferior, cheaper. 

All these arguments remind me of people talking trash about sports cars and all when they drive a Prius.


----------



## MxPhenom 216 (Nov 13, 2013)

Steevo said:


> Not inferior, cheaper.
> 
> All these arguments remind me of people talking trash about sports cars and all when they drive a Prius.


----------



## TheoneandonlyMrK (Nov 13, 2013)

Eric_Cartman said:


> I don't have to.
> 
> I bought an nVidia card that is bad ass already.
> 
> ...



Very strange few posts you made, as no one was throwing mud in your or NVIDIA's direction.

Lots of bile considering you're running an AMD CPU and (at least in my eyes) an overpriced midrange GPU with no intent of changing up. Note I'm not knocking the 780 Ti; as I already stated a while ago in this thread: nice card, would have one, but a bit too dear for my pocket.

Are you feeling butt hurt that 660 Tis are not worth what you paid, now or back then?


----------



## erocker (Nov 13, 2013)

theoneandonlymrk said:


> Very strange few posts you made, as no one was throwing mud in your or NVIDIA's direction.
> 
> Lots of bile considering you're running an AMD CPU and (at least in my eyes) an overpriced midrange GPU with no intent of changing up. Note I'm not knocking the 780 Ti; as I already stated a while ago in this thread: nice card, would have one, but a bit too dear for my pocket.
> 
> Are you feeling butt hurt that 660 Tis are not worth what you paid, now or back then?



Who cares. It's off topic anyway. I asked him to tone it down from a moderator standpoint. The best way to combat these posts is to ignore them completely.

Cheers.


----------



## LAN_deRf_HA (Nov 13, 2013)

I've heard these ACX coolers resonate around 1300 RPM; does that still apply to this one?


----------



## the54thvoid (Nov 13, 2013)

LAN_deRf_HA said:


> I've heard these ACX coolers resonate around 1300 RPM, that still apply to this?



It's in the review:



> Under load, I did notice the two fans emitting a whine from time to time (depending on RPM), which appears to be due to air interference. Not much noisier, it is just of a higher frequency, which makes it noticeable. The noise has more of a whoosh-like quality to it once fan speeds change.


----------



## qubit (Nov 13, 2013)

Steevo said:


> Not inferior, cheaper.



No, I meant inferior, but I don't think you quite understood my context because perhaps it could have been worded a little more clearly. I've now edited it to read "_This_ is where we want to be. Not feeling smug at the other side for having a supposedly inferior product."


----------



## xorbe (Nov 13, 2013)

How does this compare against the EVGA SC model without the ACX cooler?  Kinda interested in the boost MHz chart for that one too.


----------



## LiveOrDie (Nov 14, 2013)

I have one on order; I should be getting it by the end of next week  .


----------



## Amrael (Nov 15, 2013)

Right now I don't see the point. I'm hitting frames equal to this card at stock with my GTX 780 Classified (overclocked, it gets like 10.02% over mine in Battlefield 3), so three months ago I would've bought one of these or the upcoming GTX 780 Ti Classified. But right now I see it as a more viable option to buy another 780 Classified, put them both together, and then make them have babies. Seriously, I think two Classies in SLI would tide me over for a couple of years at the least, and right now it's on special at the Egg for $554.99, so yeah, I think that's the plan. Really powerful SLI for $1,254.99 (if you buy both now, instead of like me, it would set you back $1,110.00, 350 bucks cheaper) instead of $1,460.00. And then explain this to me: 

https://www.facebook.com/photo.php?...41830.117293321684748&type=1&relevant_count=1


----------



## beck24 (Nov 15, 2013)

*Awesome*



qubit said:


> Dammit I want one!


WOW! DITTO! Super fast, quiet, and so friggin' cool! Amazing job, NVIDIA and EVGA!
I want this and a G-Sync monitor. That should be enough for the next year or so, no problem.


----------



## qubit (Nov 15, 2013)

beck24 said:


> WOW! DITTO! Super fast, quiet, and so friggin' cool! Amazing job, NVIDIA and EVGA!
> I want this and a G-Sync monitor. That should be enough for the next year or so, no problem.



I tell you, this card, even more than the stock one, shows up NVIDIA's Titan as the hyped-up marketing exercise that it was.

I couldn't believe it when they released a card for an eye-watering $1000 (£800 in English money) with an effing crippled GPU! It looks like they named it "Titan" and gave it a completely unnecessary 6GB of RAM as shiny baubles to get the gullible to buy it. :shadedshu That company certainly knows how to milk its customers.


----------



## the54thvoid (Nov 15, 2013)

qubit said:


> I tell you, this card, even more than the stock one, shows up NVIDIA's Titan as the hyped-up marketing exercise that it was.
> 
> I couldn't believe it when they released a card for an eye-watering $1000 (£800 in English money) with an effing crippled GPU! It looks like they named it "Titan" and gave it a completely unnecessary 6GB of RAM as shiny baubles to get the gullible to buy it. :shadedshu That company certainly knows how to milk its customers.



Qubit, sometimes your mouth runs away with you.  :shadedshu

How was buying a Titan being gullible?  It's inane comments like that that make folk like you sound like you're in some kind of fascist finance gang.  

One last time, for the very cheapest seats: it's called Titan because it is what's used in the Titan supercomputer.  All they did was remove the ECC.  It has the same memory, the same DP compute function, and the same memory size.  It's a compute card made viable for a gaming setup.  It costs so much because it IS a compute card.  

And where are all these amazing time machines people have?  Titan came out 8-9 months ago.  I've been enjoying blistering frame rates and/or ultra settings with playable frames in every game I've played.  My Titan replaced two misfiring 7970s and made a huge improvement to my gaming quality.

Nobody buying a Titan was gullible.  It was, at the time, the most powerful single GPU that could render the highest frames for me.  It still kicks ass.  It's been replaced by other cards, but it is still a superb card.  Overpriced?  No, not if you wanted to buy it.  Would I buy one now?  Of course not; there are better options.  Back in Feb?  It was the best option to replace my CrossFire cards.

Think before you post, Qubit.  Sometimes you seem clever; other times you sound really naive.


----------



## qubit (Nov 16, 2013)

54th, I stand by my statement, and I don't sugar-coat these things either, as you can see. I see that you have a Titan, which would give you a reason to feel miffed about it if you took it the wrong way. Rest assured I wasn't directing this comment at you specifically. 

Look, Titan is actually slightly less of a compute card than the 780 Ti, for the simple reason that its GPU is crippled. The only reason it's "better" at compute is that NVIDIA have artificially crippled the 780 Ti in the driver to protect sales of their crippled $1000 card. Hack the driver (or convince NVIDIA to remove the performance limit! lol) and the 780 Ti will handily beat Titan at this too. It has to, because it's got the full GPU, with both GPU and memory running at higher clock speeds than Titan. OK, it's got half the memory, but does that matter so much, even for compute? I dunno, but I'll bet that extra 3GB doesn't actually cost that price difference to put on, and don't forget the crippled GPU and slower clocks too, which should lower the price difference further. Titan should actually be cheaper than the 780 Ti, not the other way round. Finally, the original compute card, the K20 or whatever it's called, has the full GK110 while Titan doesn't.

Yeah, it was named after a supercomputer called Titan. Very convenient for marketing, no? 

Gamers who bought Titan were gullible, because it was quite obvious from the start that in a few months NVIDIA would bring out a better, uncrippled card (you can tell I really don't like crippled GPUs, can't you? lol).

OK, I'll grant that some people have money to spare, wanted the fastest thing on earth _now_, and didn't give a damn that there would be a better version at a cheaper price in a few months' time. Fair play there, but I think most people who bought a Titan didn't see this coming and were caught out.


----------



## beck24 (Nov 16, 2013)

This card kicks serious butt! It's in another league of refinement and power compared to the 290X, which is noisy, much hotter, and slower. IMHO the $150 premium is reasonable over the life of the card. I'm amazed at the temps. It's made for overclocking.


----------



## HTC (Nov 16, 2013)

beck24 said:


> This card kicks serious butt! It's in another league of refinement and power compared to *the 290X, which is noisy, much hotter*, and slower. IMHO the $150 premium is reasonable over the life of the card. I'm amazed at the temps. It's made for overclocking.



The 290s are made to withstand higher temps than the 780s.

That being said, the reference 290s are definitely noisier and, being hotter, switching the cooler to an aftermarket non-blower type would most definitely increase the temps of other hardware, which could prove troublesome. Not everybody is able/willing to water cool.

Personally, *and if I had the need for such a card*, I wouldn't go for the 290s unless they came up with a *blower cooler* at a far, far more acceptable noise level. On the other hand, I wouldn't go for the 780 Ti either, because of its stupid price: the 780 would do just fine in that regard.


----------



## Amrael (Nov 16, 2013)

HTC said:


> The 290s are made to withstand higher temps than the 780s.
> 
> That being said, the reference 290s are definitely noisier and, being hotter, switching the cooler to an aftermarket non-blower type would most definitely increase the temps of other hardware, which could prove troublesome. Not everybody is able/willing to water cool.
> 
> Personally, *and if I had the need for such a card*, I wouldn't go for the 290s unless they came up with a *blower cooler* at a far, far more acceptable noise level. On the other hand, I wouldn't go for the 780 Ti either, because of its stupid price: the 780 would do just fine in that regard.



You see, people: sound reasoning. It's not about which card is pricier, hotter, or louder, or who has money to burn. It's about which card you choose and which card gets you where you want to be, that's all. Everyone went real quiet about the GTX 780, but in reality that is a really great card. It doesn't have the huge price tag (although it had one); it's a little slower than a GTX 780 Ti and a tad slower than the 290X, but it overclocks really well, and it doesn't heat up as much or create a racket when set to full blast. In other words, the product that actually covers the most bases is the original GTX 780; it's a more versatile, mature, and affordable product. Do the math, and please can someone explain why, on the same hardware, two GTX 780s in SLI trounced the 780 Ti in the guru's reviews.


----------



## qubit (Nov 16, 2013)

Amrael said:


> Do the math, and please can someone explain why, on the same hardware, two GTX 780s in SLI trounced the 780 Ti in the guru's reviews.



Because two cards are faster than one.


----------



## BiggieShady (Nov 16, 2013)

Amrael said:


> Do the math, and please can someone explain why, on the same hardware, two GTX 780s in SLI trounced the 780 Ti in the guru's reviews.





qubit said:


> Because two cards are faster than one.



I see what you did there  ... What he probably meant was: why is 780 SLI sometimes just as fast as 780 Ti SLI, or why is a single 780 sometimes just a tad slower than 780 Ti SLI? In either case, the answer is bad SLI scaling in the drivers for that particular game, a CPU bottleneck, or both. Dat Hitman Absolution.
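(A toy illustration of that scaling point; every number below is hypothetical, chosen only to show how driver scaling can invert the ranking between a faster and a slower pair of cards.)

```python
# Hypothetical single-card frame rates: assume the 780 Ti is ~10% faster.
fps_780 = 100.0
fps_780ti = 110.0

# Hypothetical SLI scaling for a game whose driver profile favours the
# 780: near-perfect scaling for the 780 pair, poor for the 780 Ti pair.
scale_780 = 0.95    # second card adds 95% of one card's performance
scale_780ti = 0.70  # second card adds only 70%

sli_780 = fps_780 * (1 + scale_780)
sli_780ti = fps_780ti * (1 + scale_780ti)

# With these made-up numbers, the nominally slower pair comes out ahead
# purely because of driver scaling.
faster_pair = "780 SLI" if sli_780 > sli_780ti else "780 Ti SLI"
```

With a later driver profile that scales both pairs equally, the ranking flips back, which is why this kind of result usually gets ironed out.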


----------



## TheoneandonlyMrK (Nov 16, 2013)

Amrael said:


> You see, people: sound reasoning. It's not about which card is pricier, hotter, or louder, or who has money to burn. It's about which card you choose and which card gets you where you want to be, that's all. Everyone went real quiet about the GTX 780, but in reality that is a really great card. It doesn't have the huge price tag (although it had one); it's a little slower than a GTX 780 Ti and a tad slower than the 290X, but it overclocks really well, and it doesn't heat up as much or create a racket when set to full blast. In other words, the product that actually covers the most bases is the original GTX 780; it's a more versatile, mature, and affordable product. Do the math, and please can someone explain why, on the same hardware, two GTX 780s in SLI trounced the 780 Ti in the guru's reviews.



Now, it covers most of your bases; just yours and a few others', though. 
Even that (the 780) was and is too dear for me now.

You need to realise there are a great many perspectives out there. For me, I'd go CrossFire 7970s, as it's cheaper, cooler, and runs better (CrossFire on these is well sorted), but then that's from my perspective.


----------



## Steevo (Nov 16, 2013)

FreedomEclipse said:


> Interesting ConClusion.
> 
> But IMO... What money you are *SAVING* buying a 290X you end up paying for in your electric bill. Obviously thats not true for everyone but here in the UK, people are being forced to decide between having skipping meals to pay for heating or vice versa.
> 
> ...



So the fact that the real thermal output is the same has no bearing on your post, which shows you either have no understanding of thermal load and cooling, or you're just a rabid fanboi to even bring it up in an unrelated, competitive thread. 

Good work, captain.


I'm more interested in the efficiency loss of the overclock: only 9.9% despite a core and memory increase of 12%. Is it at the point of diminishing returns, or what else is at play?
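(That efficiency question reduces to a one-line calculation; this is just an illustrative sketch using the 12% clock increase and 9.9% performance gain quoted above, not data from the review.)

```python
# Figures quoted in the post above: a 12% core/memory overclock
# yielding a 9.9% average performance increase.
clock_gain = 0.12
perf_gain = 0.099

# Fraction of the clock increase that materialises as performance;
# anything well below 1.0 suggests another bottleneck (memory, power
# limit, CPU) is starting to dominate.
scaling_efficiency = perf_gain / clock_gain  # roughly 0.82
```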


----------



## Amrael (Nov 16, 2013)

qubit said:


> Because two cards are faster than one.





BiggieShady said:


> I see what you did there  ... What he probably meant was: why is 780 SLI sometimes just as fast as 780 Ti SLI, or why is a single 780 sometimes just a tad slower than 780 Ti SLI? In either case, the answer is bad SLI scaling in the drivers for that particular game, a CPU bottleneck, or both. Dat Hitman Absolution.



Duh, of course I know that two cards are faster than one, and of course a lot of SLI profiles are bad. I know all that (Far Cry 3's crap SLI performance and freeze-ups are a testament to that); what I'm referring to is this: 

https://www.facebook.com/photo.php?...772.1073741830.117293321684748&type=1&theater



theoneandonlymrk said:


> Now, it covers most of your bases; just yours and a few others', though.
> Even that (the 780) was and is too dear for me now.
> 
> You need to realise there are a great many perspectives out there. For me, I'd go CrossFire 7970s, as it's cheaper, cooler, and runs better (CrossFire on these is well sorted), but then that's from my perspective.



Well, maybe you can consider 4GB GTX 770s in SLI. A 7970 CrossFire setup works really well too. Yes, I know that $400 or $500 is a steep price even for a commodity, and that value-wise two 7970s/R9 280Xs are a really good option nowadays, and they scale really well over multi-monitor setups, so two thumbs up to your rig. On the matter of the GTX 780, well, I kinda like the performance and this is what works for me. I might try to get a second one, not because I particularly like multi-card setups, but it's cool to have some extra power for future-proofing. The 780 Ti would interest me if I didn't have a GTX 780 already, since in my opinion it is the best card for single-card configurations available today, with the exception of its price. (Yes, the 290X does really well too, but I would have to go with aftermarket cooling, which in and of itself would lessen the value crown.) If AMD's partners come up with some nice aftermarket solutions (XFX Ghost, MSI Lightning, ASUS DirectCU, Gigabyte Windforce) for a reasonable price, then I could sell off and go back to AMD for a while, until the next real upgrade comes around.


----------



## qubit (Nov 16, 2013)

Amrael said:


> Duh, of course I know that two cards are faster than one, and of course a lot of SLI profiles are bad. I know all that (Far Cry 3's crap SLI performance and freeze-ups are a testament to that); what I'm referring to is this:
> 
> https://www.facebook.com/photo.php?...772.1073741830.117293321684748&type=1&theater



Well, it is what you _said_, so you can't blame someone for replying to that, and there was no link in that post to clarify. 

Looking at that comparative graph, I can see that the 780 Ti SLI does worse than the 780 SLI, which it obviously shouldn't. That looks like a driver or game glitch that I'd expect to be ironed out in short order.

Mind you, it's a bit annoying when you just wanna play your game and you have to put up with this, isn't it? Knowing that it'll be fixed "soon" isn't all that comforting at that moment.


----------



## Steevo (Nov 16, 2013)

qubit said:


> Well, it is what you _said_, so you can't blame someone for replying to that, and there was no link in that post to clarify.
> 
> Looking at that comparative graph, I can see that the 780 Ti SLI does worse than the 780 SLI, which it obviously shouldn't. That looks like a driver or game glitch that I'd expect to be ironed out in short order.
> 
> Mind you, it's a bit annoying when you just wanna play your game and you have to put up with this, isn't it? Knowing that it'll be fixed "soon" isn't all that comforting at that moment.



Diminishing returns. This chip has reached it. I want to see what they have next.


----------



## qubit (Nov 16, 2013)

Steevo said:


> Diminishing returns. This chip has reached it. I want to see what they have next.



Well, we may be seeing diminishing returns, but those graphs clearly look like a driver/game issue, as it's obvious the Ti should be faster.

I'm keen to see what Maxwell can bring, and it makes me wonder if I should sit out the 780 Ti, since it's very expensive and my 580s are plenty powerful.


----------



## radrok (Nov 16, 2013)

Steevo said:


> Diminishing returns. This chip has reached it. I want to see what they have next.



This GPU still has a lot to give for whoever is willing to feed it what it needs: voltage.

This thing will literally fly with a Classified/Lightning PCB.

Take a look at this review; it will give you an idea of how much GK110 scales with voltage:

http://www.overclockers.com/evga-gtx780-classified-hydro-copper-waterblock-review


----------



## Steevo (Nov 17, 2013)

radrok said:


> This GPU still has a lot to give for whoever is willing to feed it what it needs: voltage.
> 
> This thing will literally fly with a Classified/Lightning PCB.
> 
> ...



Not voltage. Intel tried that with the P4, and it needed 4-5 GHz clocks to reach what they thought was possible, and still failed. I'm as much concerned with IPC as with pure speed, and what I was referring to is the drop in efficient work done relative to the core speed increase. 

Good job relying on only voltage and core speed; that helps the "290X requires too much power" argument.



qubit said:


> Well, we may be seeing diminishing returns, but those graphs clearly look like a driver/game issue, as it's obvious the Ti should be faster.
> 
> I'm keen to see what Maxwell can bring, and it makes me wonder if I should sit out the 780 Ti, since it's very expensive and my 580s are plenty powerful.



I'm interested in the push for performance efficiency that AMD brought to the table with the die size/speed of the 290. It has a high thermal load for its die size, but not for the work done at that speed, so we can infer that the efficiency of the shaders is at least 25% higher than what NVIDIA has in this die. Imagine this die size shrunk to 20 nm and as efficient per transistor in IPC: we're talking about a 50% increase in computational performance. I hope they get Maxwell out soon, and that it's what I believe it could be.


----------



## radrok (Nov 17, 2013)

Steevo said:


> Not voltage. Intel tried that with the P4, and it needed 4-5 GHz clocks to reach what they thought was possible, and still failed. I'm as much concerned with IPC as with pure speed, and what I was referring to is the drop in efficient work done relative to the core speed increase.
> 
> Good job relying on only voltage and core speed; that helps the "290X requires too much power" argument.



To be fair, I never complained about the 290X's power consumption; high end shouldn't care about it. 

I only praised that card while whining about the cooler itself. 

In fact, I run my GPUs at 1.35 V, which makes my custom BIOS hover at 500 W for each card.  

Performance comes at a price after certain thresholds.


----------



## qubit (Nov 17, 2013)

Steevo said:


> I'm interested in the push for performance efficiency that AMD brought to the table with the die size/speed of the 290. It has a high thermal load for its die size, but not for the work done at that speed, so we can infer that the efficiency of the shaders is at least 25% higher than what NVIDIA has in this die. Imagine this die size shrunk to 20 nm and as efficient per transistor in IPC: we're talking about a 50% increase in computational performance. I hope they get Maxwell out soon, and that it's what I believe it could be.



Yes, that design efficiency is interesting. I'm keen to see what a 290x can do with a proper aftermarket cooler.


----------



## BiggieShady (Nov 17, 2013)

qubit said:


> Yes, that design efficiency is interesting. I'm keen to see what a 290x can do with a proper aftermarket cooler.



Too bad we won't see an aftermarket-cooled 290X until December. I guess that's because it's hard to cool all the VRMs properly without a blower cooler, so they need time to engineer the cooling system. People who replaced the stock cooler with Arctic Cooling's or Gelid's solution had to run the fans directly off a Molex connector to be able to OC and keep the VRMs under 80°C.


----------



## radrok (Nov 17, 2013)

qubit said:


> I'm keen to see what a 290x can do with a proper aftermarket cooler.





http://www.overclock.net/t/1436635/ocn-gk110-vs-hawaii-bench-off-thread/1090#post_21166617

http://www.overclock.net/t/1436635/ocn-gk110-vs-hawaii-bench-off-thread/740#post_21101666


----------



## qubit (Nov 17, 2013)

radrok said:


> http://www.overclock.net/t/1436635/ocn-gk110-vs-hawaii-bench-off-thread/1090#post_21166617
> 
> http://www.overclock.net/t/1436635/ocn-gk110-vs-hawaii-bench-off-thread/740#post_21101666



Those look like great scores and give us an idea, but I'd like to see an apples-to-apples comparison with a regular 290X playing real games, like we do in TPU reviews. We'll see those on here in a month or so.


----------



## radrok (Nov 17, 2013)

It's probably going to be like my FPS increase when I overclocked: 20-35% gains across the board.


----------



## erocker (Nov 17, 2013)

radrok said:


> http://www.overclock.net/t/1436635/ocn-gk110-vs-hawaii-bench-off-thread/1090#post_21166617
> 
> http://www.overclock.net/t/1436635/ocn-gk110-vs-hawaii-bench-off-thread/740#post_21101666



1340 MHz GPU clock... nice.


----------



## LiveOrDie (Nov 21, 2013)

Look what I got in the mail today.


----------



## the54thvoid (Nov 21, 2013)

Live OR Die said:


> Look what I got in the mail today.



Nice 

Now get benching!


----------



## LiveOrDie (Nov 24, 2013)

I'm getting these frame rates in BioShock Infinite, maxed out at 2560x1440; frames only drop as low as 50 when it auto-saves. 

Frames: 111498 - Time: 1124657ms - Avg: 99.140 - Min: 50 - Max: 130
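(Those FRAPS-style numbers are internally consistent; the reported average is just total frames divided by total time.)

```python
# Figures from the benchmark line above.
frames = 111498
time_ms = 1124657

# Average fps = frames / seconds; matches the reported 99.140.
avg_fps = frames / (time_ms / 1000.0)
```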


----------



## micropage7 (Nov 24, 2013)

Live OR Die said:


> Look what I got in the mail today.


Uuumm... mini-ITX with a beast card. Love that.


----------



## beck24 (Dec 8, 2013)

BiggieShady said:


> Too bad we won't see an aftermarket-cooled 290X until December. I guess that's because it's hard to cool all the VRMs properly without a blower cooler, so they need time to engineer the cooling system. People who replaced the stock cooler with Arctic Cooling's or Gelid's solution had to run the fans directly off a Molex connector to be able to OC and keep the VRMs under 80°C.


I think the delay for 290X aftermarket coolers is because the card is pushed so hard and runs so hot already that it won't be nearly as easy to work with as the 780 Ti.


----------



## Vlada011 (Dec 15, 2013)

These ACX cards are perfect... I was a fan of reference cards before ACX.
The biggest reason is that ACX models look incredibly nice through the side panel... I don't like seeing cards with heat pipes on the side in my PC.
For several years, nobody remembered to make a simple, nice, clean dual-fan card like this, across over 10 manufacturers of both AMD and NVIDIA cards.
All of them were in a race to make the weirdest, biggest metal shapes and coolers.
Now we have the looks, nice temps, and a silent card.


----------

