# NVIDIA GeForce GTX 980 4 GB



## W1zzard (Sep 15, 2014)

Today, NVIDIA releases their new GeForce GTX 980, which brings the Maxwell architecture to the high-end. It features massive power efficiency improvements and reduced noise, but also beats the GTX 780 Ti in raw performance. Pricing is quite acceptable, too, with an MSRP of $549.



----------



## newtekie1 (Sep 19, 2014)

Wow!  This card is so bad ass.


----------



## utengineer (Sep 19, 2014)

"Oh, and AMD seems fucked."  That should have been the title of the article, ha!


----------



## Mistral (Sep 19, 2014)

Have to say, I'm freaking genuinely impressed by every single aspect of this card except the price. It's amazing how much nV has managed to suck out of the 28nm process, mad props to them.

While it doesn't seem like any games that really demand an upgrade are coming in the near future, I can't wait to see what kind of performance 20nm will bring next year.


----------



## v12dock (Sep 19, 2014)

Oh god bring on the wave of "AMD is screwed" comments...


----------



## Athlon2K15 (Sep 19, 2014)

v12dock said:


> Oh god bring on the wave of "AMD is screwed" comments...



Truth is the truth, they have been shafted for years by Nvidia and Intel.


----------



## silapakorn (Sep 19, 2014)

The performance is not a big wow for me. Still waiting for the 980ti.


----------



## v12dock (Sep 19, 2014)

AthlonX2 said:


> Truth is the truth, they have been getting shafted for years by Nvidia and Intel.


Let's forget the fact that they have been trading blows with Nvidia for so long now...


----------



## bim27142 (Sep 19, 2014)

THAT power consumption at THIS performance level... just picked up my jaw on the floor!


----------



## PatoRodrigues (Sep 19, 2014)

What an amazing card. I'm selling my R9 290 CFX ASAP. I mean, A.S.A.P.

Maxwell came along kicking the s**t out of everybody. But I wonder, just like with Kepler... isn't the GTX 970 ($220 cheaper) friendly fire?


----------



## Nihilus (Sep 19, 2014)

Not even the most rabid of AMD fanboys (me included) can deny the 970 and 980 are spectacular cards.  This reminds me of when Nvidia released the 8800 GTS that blew the socks off of ATI in every measure - until ATI released the equally impressive HD 4800.  AMD better go all in with the R9 390 series.


----------



## Steevo (Sep 19, 2014)

Only a 9.6? Seriously, I think I just became an Nvidia fan.  I wonder what it would do with a larger memory bus and liquid cooling. 
I feel even worse for ATI/AMD now about the shit that was the 285; they are making them look like monkeys throwing poo at the circus with that fuckup.


----------



## PatoRodrigues (Sep 19, 2014)

Man, AMD needs jaw-dropping performance from the R9 3xx series. Immediately.

The 295x2 looks utterly pathetic now. I'm beyond impressed.


----------



## LAN_deRf_HA (Sep 19, 2014)

Interesting that they chose power consumption over big performance gains yet again. Given AMD's usual approach to closing a performance gap, "overclock it till it competes" at the expense of everything else, I fear the power efficiency and noise gap will never close. It's things like that which are why Nvidia is considered the premium card brand.


----------



## badtaylorx (Sep 19, 2014)

PatoRodrigues said:


> Man, AMD needs jaw-dropping performance from the R9 3xx series. Immediately.
> 
> The 295x2 looks utterly pathetic now. I'm beyond impressed.


did you read the review???


----------



## HalfAHertz (Sep 19, 2014)

This is the new GTX8800!


----------



## HumanSmoke (Sep 19, 2014)

Steevo said:


> I feel bad for ATI/AMD for the shit that was the 285 even more now, they are making them look like monkeys throwing poo at the circus with that fuckup.


OK, that's an image I didn't need in my head 

How weird is the turnaround? Four years ago AMD were putting out an awesome Cypress GPU that killed on both performance and power usage metrics, and Nvidia were relegated to the sidelines and couldn't get a fully enabled GF 100 out the door. Now we have Nvidia going for dial-your-own performance/power and AMD debuting a GPU with a salvage part for the first time in recent memory.


----------



## GhostRyder (Sep 19, 2014)

The funny thing I look back on is all the people saying the GTX 980 was not going to compete with the GTX 780 Ti, and yet it seems to be a winner all around.

One heck of a card in all honesty, even at that price, since the performance matches.  The 4K power of this card is what makes it so appealing to me, on top of the overclocking performance!



AthlonX2 said:


> Truth is the truth, they have been getting fucked ass and mouth for years by Nvidia and Intel.


I would not say that. AMD has held the dual-GPU crown continuously for quite some time, with the only exception being the 690 vs 7990.  As for single GPU, the 7970 and GTX 680 were neck and neck, with the 7970 being the better card at high resolution; the same argument applies when comparing the GTX 780 Ti to the R9 290X.  It's a newer-generation card, of course it's going to be better...

I see some major price cuts in my crystal ball !

But out of all that, the real winner here is the GTX 970.


----------



## hippogriff (Sep 19, 2014)

Bah, nothing special besides consumption (which is not very important to the majority of people). It is an OCed 780 Ti with 4GB of RAM, $50 cheaper. 

The 970 is really good for the price.


----------



## Sempron Guy (Sep 19, 2014)

Yeah, comparing these two cards, I'd take the GTX 970 for more performance per buck and the GTX 980 for performance per watt. And I guess this is common among some types of people at every card launch: praising a card for how good it is through another card's disadvantage.


----------



## fullinfusion (Sep 19, 2014)

Nice card for sure, and I love the big ass cooler and the back plate! But IMO, even though AMD is a few months behind, I believe this card is going to drop an easy $100 in price once AMD brings out its 3xx-series cards.


----------



## Jstn7477 (Sep 19, 2014)

At least I got my R9 290 from Xazax for $225, but damn, I want one (or two) for my 5820K/X99 setup arriving next week.


----------



## 15th Warlock (Sep 19, 2014)

Thanks for the review W1zz.

I'm amazed at the efficiency of this architecture, can't wait for big Maxwell at 20nm, Nvidia seems to have found the same secret sauce recipe Intel found years ago when they abandoned the Netburst architecture. Compared to Fermi, Maxwell truly is an outstanding feat of thermal and power efficiency engineering.


----------



## puma99dk| (Sep 19, 2014)

Looks like I found my new card for my 27-inch 2560x1440@120Hz, and it doesn't use much power either compared to my GTX 780.


----------



## Steevo (Sep 19, 2014)

HumanSmoke said:


> OK, that's an image I didn't need in my head
> 
> How weird is the turnaround? Four years ago AMD were putting out an awesome Cypress GPU that killed on both performance and power usage metrics, and Nvidia were relegated to the sidelines and couldn't get a fully enabled GF 100 out the door. Now we have Nvidia going for dial-your-own performance/power and AMD debuting a GPU with a salvage part for the first time in recent memory.


I am really excited to see what this would do on a mature 20nm process: 165W now, and with a shrink perhaps 25-35W less, and generally you get a frequency bump from it as well. At this rate we could see 2GHz GPU cores at 14nm. 

AMD has been suffering from poor management, and from what seems to be an unsure direction and a forced melding of ATI into the ranks. I really hope they bring out something better and more reasonable than the broken card they tried to pass off, within the next 6 months. 

I think the 970 is the card I am going to buy for GTA5.


----------



## LightningJR (Sep 19, 2014)

Very nice cards, Nvidia, and a great review too. Imagine when 20nm is available and Nvidia gives us the full Maxwell. The efficiency they got out of 28nm is nothing short of amazing. Anyone who TRULY understands tech, which I know is a lot of us on here, knows that Nvidia has just innovated the proper way. Great job. I knew the 750 Ti's architecture was great, but I didn't know if it would scale well... Man, that 970 though: $220 less and a powerhouse. That would be my preferred card for sure, maybe two of them.


----------



## birdie (Sep 19, 2014)

I wonder why I don't hear a roar of moaning decrying NVIDIA for the fact that the company charges so much for a card with a limited transistor budget (vs. their competitor)?

Sarcasm and jokes aside, NVIDIA has outdone themselves. I'm now eagerly awaiting a Maxwell-based Tegra.


----------



## HumanSmoke (Sep 19, 2014)

fullinfusion said:


> Nice card for sure, and I love the big ass cooler, and the back plate! but IMO Even though AMD is a few months behind I believe this card is going to drop an easy $100 in price once AMD bring out it's 3 series cards.


That kind of depends on when Bermuda actually arrives. Most of the AMD-centric talk seems centred upon Fiji (the large GPU), which might have its own problems against the GM 200. In any event, both Bermuda and Fiji are slated for 2015 introduction, so I'm guessing AMD will have a rather meagre Christmas.


----------



## Steevo (Sep 19, 2014)

birdie said:


> I wonder why I don't hear a roar of moaning decrying NVIDIA for the fact that the company charges so much for a card with a limited transistor budget (vs. their competitor)?
> 
> Sarcasm and jokes aside, NVIDIA has outdone themselves. I'm now eagerly awaiting a Maxwell-based Tegra.



Nvidia needs to keep focusing on GPUs, compute, and getting their hands on an x86 license. A Tegra running Angry Birds isn't interesting, and any phone with hardware decode can run the new Steam In-Home Streaming client with as good or better support.


----------



## HumanSmoke (Sep 19, 2014)

Steevo said:


> Nvidia needs to keep focusing on GPUs, compute, and getting their hands on an x86 license


That won't happen I don't think. The Intel settlement/cross-license agreement pretty much shot down Nvidia on the x86 front - both architecturally and through emulation (a la Transmeta's Crusoe). Nvidia would be worse equipped than AMD to take on Intel - at least AMD have a reasonable complement of x86 chip architects. Even if Intel gave them an early Christmas present of an x86 license, it would be years before anything eventuated from it - years of R&D that would be better spent in the ARM commodity market.

Back to the GM 204...
How long before Asus teases the Mars III   ?


GhostRyder said:


> The funny thing I look back on is all the people saying the GTX 980 was not going to compete with the GTX 780ti and yet it seems to be a winner all around.


Well, I'll put my hand up as one of those people. In my defence I would say that I never expected to see a 400mm², 28nm GPU boosting to 1250+MHz in stock standard configuration. The fact that overclocked cards are already pushing 1650+MHz on day one with limited SKUs makes good reading for future offerings (MSI Lightning, ASUS Matrix, EVGA Classified)...god knows what a Galax HOF, or EVGA KingpIn edition would clock at.


----------



## puma99dk| (Sep 19, 2014)

HumanSmoke said:


> That won't happen I don't think. The Intel settlement/cross-license agreement pretty much shot down Nvidia on the x86 front - both architecturally and through emulation (a la Transmeta's Crusoe). Nvidia would be worse equipped than AMD to take on Intel - at least AMD have a reasonable complement of x86 chip architects. Even if Intel gave them an early Christmas present of an x86 license, it would be years before anything eventuated from it - years of R&D that would be better spent in the ARM commodity market.
> 
> Back to the GM 204...
> How long before Asus teases the Mars III   ?



The problem with the Mars series is the limited production of the cards and the high price...

I still remember EVGA's GTX 560 Ti 2Win, and I think it was sold for about $516, which was only like 20 bucks over the price of two in SLI - which in my world is more than acceptable.


----------



## HumanSmoke (Sep 19, 2014)

puma99dk| said:


> The problem with the Mars series is the limited production of the cards and the high price...


When has that ever stopped Asus?
There will be a dual GTX 990(?) based on the power envelope demonstrated here... and for duallie lovers - of which I am not one - a dual GTX 970 for ~$600-650 would be a killer SKU (unfortunately it probably won't happen). If - more likely when - a dual-GPU GM 204 arrives (and it looks a certainty, if only as payback to AMD for having the temerity to embarrass the hugely-expensive-card-that-shall-not-be-named), Asus will surely have to go one better.


----------



## fullinfusion (Sep 19, 2014)

HumanSmoke said:


> That kind of depends when Bermuda actually arrives. Most of the AMD-centric talk seems centred upon Fiji (the large GPU)- which might have its own problems with the GM 200. In any event, both Bermuda and Fiji are slated for 2015 introduction, so I'm guessing AMD will have a rather meagre Christmas.


I agree, AMD isn't known for low power, but hey, I'm betting they give these new green cards a good run for their money, if not better on raw HP!

I love AMD cards, but this new GPU interests me to no end. Would I buy one? Nope, but they sure work nice. Either way, AMD vs Nvidia is a win all around when they finally launch them.

This time I'm going to be on the reference bus, not like last time; those miners fucked it all up lol... no offense guys, I'm just saying


----------



## birdie (Sep 19, 2014)

HalfAHertz said:


> This is the new GTX8800!



More like the 8800 GT (~980) after the 8800 GTX (~780 Ti).


----------



## Naito (Sep 19, 2014)

Wow. Colour me impressed! Definitely wasn't expecting it to be hitting higher frames than the GTX 780 Ti. Pretty much 7% faster all round! 

Brilliant review as per usual W1z! Always a great read.


----------



## HumanSmoke (Sep 19, 2014)

fullinfusion said:


> I agree, AMD isn't known for low power, but hey I'm betting they give these new green cards a good run for their money if not better on raw HP!


History would say you are correct. Timeframe will be the important part. Pity HBM looks slated only for big-die/expensive cards; a two-thirds reduction in memory power usage could be applied quite nicely to a performance range of cards. I think the current vRAM power usage for a 3-4GB card runs about 50-80W - quite a sizeable chunk of the power budget.
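As a back-of-the-envelope check, here is what a two-thirds cut would mean against that 50-80W estimate (both figures are the ballpark numbers from the comment above, not measurements):

```python
# Rough savings if HBM cut vRAM power by two-thirds, using the 50-80W
# estimate quoted above for a 3-4GB GDDR5 card (ballpark, not measured).
vram_power_range_w = (50, 80)
reduction = 2 / 3

savings = [round(w * reduction) for w in vram_power_range_w]
print(f"estimated savings: {savings[0]}-{savings[1]} W")  # estimated savings: 33-53 W
```

So even on the low end, that would free up a meaningful slice of a midrange card's power budget.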


----------



## CounterZeus (Sep 19, 2014)

Buying this card when I can. Will be a nice upgrade from a three year old gtx 560ti


----------



## Melvis (Sep 19, 2014)

Well well well, the GTX 980 is faster than a GTX 780 Ti and it's cheaper. Well done Nvidia, I'm impressed with the GTX 980 for sure. The GTX 970 is also pretty good for the price. Great cards indeed  It's all about price/performance, and these cards are looking good for the first time in a long time for high-end Nvidia cards (Aus) 

Edit: Now this is an impressive price for a GTX 970, got me interested Nvidia big time. http://www.pccasegear.com/index.php?main_page=product_info&cPath=193_1692&products_id=29120


----------



## fullinfusion (Sep 19, 2014)

@Nvidia I love your cards but hate your driver software!

Gimme an AMD type software and you got my cash!


----------



## Pancho (Sep 19, 2014)

Dang, I was hoping this would be a nice improvement over last generation, but it is a mediocre upgrade at best... Now I will be waiting till April or May, for nVidia to bring something to the table worth spending my money on.... count me not impressed.


----------



## fullinfusion (Sep 19, 2014)

Pancho said:


> Dang, I was hoping this would be a nice improvement over last generation, but it is a mediocre upgrade at best... Now I will be waiting till April or May, for nVidia to bring something to the table worth spending my money on.... count me not impressed.


Yeah, but think: a 780 Ti is slower, not by much, but these clock like mad; the Sammy memory I'm sure will have no problem cracking 8000MHz.

And the power difference!

Think of it like this: a 780 Ti would need 100% of its game settings for it to be beautiful, but these cards do it at more or less half the power and half the clocks.


----------



## Pancho (Sep 19, 2014)

fullinfusion said:


> Yeah, but think: a 780 Ti is slower, not by much, but these clock like mad; the Sammy memory I'm sure will have no problem cracking 8000MHz.
> 
> And the power difference!
> 
> Think of it like this: a 780 Ti would need 100% of its game settings for it to be beautiful, but these cards do it at more or less half the power and half the clocks.



Power means nothing to me, I am all for performance, and the only company that offers true performance is Nvidia, so this release is disappointing.  I want a stock card that is at the very least 20% to 30% greater than last generation... a 3% to 4% increase is pulling an AMD gimmick, no thanks... even at OC's people are seeing a 14% to 17% increase over last gen, sorry, but that is still terrible.   Anyways just my opinion, you don't have to agree.


----------



## fullinfusion (Sep 19, 2014)

Pancho said:


> Power means nothing to me, I am all for performance, and the only company that offers true performance is Nvidia, so this release is disappointing.  I want a stock card that is at the very least 20% to 30% greater than last generation... a 3% to 4% increase is pulling an AMD gimmick, no thanks... even at OC's people are seeing a 14% to 17% increase over last gen, sorry, but that is still terrible.   Anyways just my opinion, you don't have to agree.


I agree, power means shit! I'm the same way, but really I see the same or better on my 290s.

No big deal, but I agree: the same performer, but turned down to 780 Ti speeds for the same eye candy.


----------



## MxPhenom 216 (Sep 19, 2014)

fullinfusion said:


> @Nvidia I love your cards but hate your driver software!
> 
> *Gimme an AMD type software and you got my cash!*



Wait seriously? 98% of people would have it the other way.



Pancho said:


> Dang, I was hoping this would be a nice improvement over last generation, but it is a mediocre upgrade at best... Now I will be waiting till April or May, for nVidia to bring something to the table worth spending my money on.... count me not impressed.


They will wait till AMD counters with the R9 3xx, then release something to beat that. Same cycle it has been for years.


----------



## overpass (Sep 19, 2014)

What a smart card! What an efficient card! 
This card reminds me of something out of sci-fi, something that materializes out of the blue and does amazing things, with so much technology (MFAA!) and mastery of rendering packed in, effortlessly and beautifully. nVidia makes leaps in generational tech that make me a believer in not only the performance but open my eyes to possibility per watt and per decibel; that 'Ge' in 'GeForce' must now mean 'Great efficiency'.  The 970 should be a standard part in every Steam box; it is as 'reference' a part in computer graphics as any I've seen for a long, long time. Even the 980's MSRP, $549, nears that golden bar for a high-performing, new-generation part, and with rebates I can certainly see it reaching that figure. A mature, applicable technology, smart design, and fair price: nVidia crosses all the t's and dots all the i's, and then some (Ts and Is... hmm, do I smell a 980 TI down the road???). It is only a shame that nVidia doesn't design CPUs, or it would be a beautiful and greener world, the way Louis Armstrong envisioned. 

THIS REVIEW AND WASTELAND 2 RELEASE THIS VERY DAY MAKE ME A VERY HAPPY PERSON IS ALL.


----------



## LeonVolcove (Sep 19, 2014)

I was about to spend my money on an R9 290, and now this "Behemoth" is coming. Damn, what should I do?????


----------



## HumanSmoke (Sep 19, 2014)

LeonVolcove said:


> I am about to spend my money for R9 290 and now this "Behemoth" coming, Damn, what should i do?????


Wait until ~~GTX 670~~ GTX 970's are widely available - then you'll have your choice of an expensive fully enabled GTX 980, a much cheaper ~~GTX 670~~ GTX 970 with decent OC options, or a R9 290 with newly cratered pricing.


overpass said:


> A mature, applicable technology, smart design, and fair price, nVidia crosses all the t's and dots all i's and then some (Ts and Is...hmm do I smell a 980 TI down the road???).


Probably not since the GTX 980's GM 204 is fully enabled anyway. Although a GTX 970 Ti is almost a certainty.


----------



## Lionheart (Sep 19, 2014)

Okay Nvidia you won me over


----------



## HammerON (Sep 19, 2014)

What an amazing card (even more so is the 970)
This really puts the pressure on AMD.


----------



## fullinfusion (Sep 19, 2014)

MxPhenom 216 said:


> Wait seriously? 98% of people would have it the other way.
> 
> 
> They will wait till AMD counters with the R9 3xx, then release something to beat that. Same cycle it has been for years.


I'm the 2% lol

and @HammerON  I don't think so... AMD's is still going to be the top heat- and power-sucking sob of a card, but I do believe it's going to be on par fps-wise..... and cheaper


----------



## Melvis (Sep 19, 2014)

LeonVolcove said:


> I am about to spend my money for R9 290 and now this "Behemoth" coming, Damn, what should i do?????



I'm in the same boat as you; got my cards up for sale, ready to buy an R9 290 once my cards are sold, but now I think I'll be getting the GTX 970 instead. And people called me an AMD fanboy lol


----------



## HammerON (Sep 19, 2014)

This might be the card which allows me to move away from a dual-card setup to run a 2560x1600 resolution and still have a decent/average 60 fps.....
For a decent price


----------



## ensabrenoir (Sep 19, 2014)

......dude, mini-ITX users all across the world just smiled at the same time....... and power supply manufacturers just cursed....... scrapping plans for 1800 and 2000 watt PSUs and mulling price cuts on the 1000 to 1600 watt range.... oh, such glorious times....


----------



## erixx (Sep 19, 2014)

Any suggestions for ordering now in Europe? (I never did anything this crazy before, but hey, it is a 50% improvement over the 670! )  "Oh, and *** seems fucked." LOL Wiz!


----------



## Ja.KooLit (Sep 19, 2014)

wow. Look at that power consumption. 

Now let's wait to see what AMD has to say.


----------



## Rahmat Sofyan (Sep 19, 2014)

How will AMD respond to Maxwell?


----------



## erixx (Sep 19, 2014)

just finished reading W1z 970 SLI review... wow!!!!! did I say WOW?


----------



## the54thvoid (Sep 19, 2014)

Pancho said:


> Dang, I was hoping this would be a nice improvement over last generation, but it is a mediocre upgrade at best... Now I will be waiting till April or May, for nVidia to bring something to the table worth spending my money on.... count me not impressed.



You're very much missing the point of the product.  The GM204 chip has been engineered to better the GTX 680 part by 2x.  This is Nvidia moving with the times and creating the perfect platform for developing future GPUs.  Some are moaning about it not being x% faster than a GTX 780 Ti.   You people need to go back to the lab and study up on physics.  This card uses 2 billion fewer.... yes, 2 BILLION fewer transistors than the 780 Ti.  This card still performs higher than the 780 Ti (though at stock clocks that might just be a frequency effect).  That said, it overclocks to stupid levels; so far most reviewers are getting it to boost at 1400-1500MHz.  Cmon, ffs - that's insane.  Even at those clocks it consumes less than a 780 Ti and bests it by 20-25%.
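For what it's worth, the transistor argument above can be put into rough numbers. The counts below are the commonly quoted die figures for GK110 and GM204, and the ~7% delta is the review's overall performance figure; all three are approximations:

```python
# Per-transistor performance, GTX 980 (GM204) vs GTX 780 Ti (GK110).
# Transistor counts are the commonly quoted die figures; the performance
# ratio is the ~7% overall lead reported in the review. All approximate.
gk110_transistors = 7.1e9   # GTX 780 Ti
gm204_transistors = 5.2e9   # GTX 980
perf_ratio = 1.07           # GTX 980 relative to GTX 780 Ti

gain = perf_ratio * gk110_transistors / gm204_transistors
print(f"~{(gain - 1) * 100:.0f}% more performance per transistor")
```

Under those assumptions the GM204 does roughly 45-46% more work per transistor, which is the architectural point being made here.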

This is a fantastic move.  And if you really must bitch on about it not being so much more awesomely fantastic in performance - it's not the x10 chip, it's the x04 chip.  To replace the 780/780ti chip you need the GM210 part.  You'll get your jump in performance there but as folk have said, they'll sit on that and perfect it and wait for AMD to make their move.

If people are disappointed with the technical performance of this card, they really ought to get a grip on scientific reality.

All that said, I'll keep my 'relatively speaking' gas guzzling 780ti until the big maxwell comes out.


----------



## LeonVolcove (Sep 19, 2014)

HumanSmoke said:


> Wait until GTX 670's are widely available - then you'll have your choice of an expensive fully enabled GTX 980, a much cheaper GTX 670 with decent OC options, or a R9 290 with newly cratered pricing.
> 
> Probably not since the GTX 980's GM 204 is fully enabled anyway. Although a GTX 970 Ti is almost a certainty.



GTX670???


----------



## hardcore_gamer (Sep 19, 2014)

LeonVolcove said:


> GTX670???



He was talking about my 670s. Two of 'em will be available after I get a 970 SLI


----------



## HumanSmoke (Sep 19, 2014)

LeonVolcove said:


> GTX670???


Sorry, typo. GTX *9*70


----------



## HumanSmoke (Sep 19, 2014)

night.fox said:


> Now lets wait for AMD what to say.


AMD Exec: "Hey guys, I'm going out for fresh underwear I seem to have shit mine....anyone need a pai-.......OK, a pair for everyone"


----------



## buildzoid (Sep 19, 2014)

I really hate this card. It has the performance I want, but it has the Nvidia treatment that I hate Nvidia for: crap PCB, (IMO) crap control panel, locked voltage, low power limits, and a price tag that is still higher than the R9 290X's at launch (my R9 290X cost 12,000 CZK; one of these is 15,000+ CZK).


----------



## HTC (Sep 19, 2014)

Nice performance, consumption and noise: quite impressed.

Now all that's needed is a "reply" from AMD with cards that come VERY close to this performance (above or below: doesn't matter) so the price war starts.


----------



## 1d10t (Sep 19, 2014)

Big thanks to Wizz for (at last) providing graphs for multimonitor geeks 
Great card, and that power consumption though  
So this is their weapon of choice: cut down the transistor count, making it cooler and more efficient... they had reason to cut the VRM module down to a mediocre level, because these chips aren't gonna need more than 200W. Great feat, nVidia


----------



## adulaamin (Sep 19, 2014)

Although I'm super impressed with the power consumption, I'm gonna stick with my 780ti until 20nm comes around or until AMD releases their new GPUs to see which would offer better perf/$ and perf/watt. 

Thanks for the review Wiz!


----------



## HumanSmoke (Sep 19, 2014)

buildzoid said:


> I really hate this card. It has the performance I want but it has the Nvidia treatment that I hate Nvidia for. Crap PCB, IMO crap control panel, locked voltage, low power limits


Yep, pretty shitty. It's a wonder Nvidia engineers can look at themselves in the mirror...









buildzoid said:


> and a price tag that is still higher than the one on the R9 290X when it launched(my R9 290X cost 12,000czk, one of these is 15,000+czk).


Well, if price is a concern, that's what the GTX 970 is for, isn't it? I'm pretty sure those are cheaper than the 290X's launch price.







Rahmat Sofyan said:


> How will AMD respond to Maxwell?


The usual - price cuts. The 290X gets chopped down to $399, plus a refreshed game bundle, next week apparently.


----------



## ChristTheGreat (Sep 19, 2014)

Tri-SLI on a 1000W PSU without any problem? Wow.

I won't be selling my R9 290 OC, for the price I paid for it ($280) and having now a waterblock on it, but wow.

This will create a lot of used cards I think... and at low prices. AMD will drop prices for sure; the R9 290 will have to go to $300... and the 970 draws almost half the power of a R9 290 at max...


----------



## hardcore_gamer (Sep 19, 2014)

So 4K is now playable with a card setup under $700. Peasants are still stuck at  1080/900/720, depending on which "nextgen" cablebox they bought.


----------



## hardcore_gamer (Sep 19, 2014)

Sony Xperia S said:


> Hmm, and you can get for less than $700 the performance of TitanZ which is $2999.99.
> 
> 
> 
> AMD executing management is not very strong but nvidia's is not either.



I don't give a flying fvck about AMD/nvidia's management as long as they give me cards like these.


----------



## JBVertexx (Sep 19, 2014)

You guys absolutely killed it with your coverage of this release today - best coverage on the net.  Well done!


----------



## Sanhime (Sep 19, 2014)

Currently running GTX 690.  Worth upgrading to GTX 980?


----------



## crow1001 (Sep 19, 2014)

" Oh, and AMD seems fucked "

Real professional. Looks like someone is in bed with Nvidia.


----------



## BigMack70 (Sep 19, 2014)

the54thvoid said:


> You're missing very much the point of the product.  The GM204 chip has been engineered to better the GTX680 part by X2.  This is Nvidia moving with the times and creating the perfect platform for developing future GPU's.  Some are moaning about it not being x% faster than a GTX780ti.   You people need to go back to the lab and study up on physics.  *This card uses 2 billion less.... yes, 2 BILLION less transistors than 780ti.  This card still performs higher than 780ti (though at stock clocks that might just be a frequency effect).*  That said, it overclocks to stupid levels - so far most reviewers are getting it to boost at 1400-1500Mhz.  Cmon, ffs - that's insane.  Even at those clocks it consumes less than a 780Ti and bests it by 20-25%.
> 
> This is a fantastic move.  And if you really must bitch on about it not being so much more awesomely fantastic in performance - it's not the x10 chip, it's the x04 chip.  To replace the 780/780ti chip you need the GM210 part.  You'll get your jump in performance there but as folk have said, they'll sit on that and perfect it and wait for AMD to make their move.
> 
> ...



What matters in the end, unless you are paying some kind of absolutely absurd rate for your electricity, is performance. Performance/watt is a nice metric, but acting like end users should be impressed by that (as many reviewers are doing) is silly. There are only three times performance/watt matters:

1) You pay through the nose for electricity and see a meaningful difference in your monthly electricity bill by saving 100W or so of electricity (99.9% of people don't pay electricity rates this high)
or
2) Performance/watt is so bad that the card can't cope (too hot/loud... think GTX 480 / R9 290X)
or
3) You want to use multiple GPUs but have a small/midrange PSU and don't want to upgrade it


#1 applies to almost nobody, and #2 hasn't really applied to an Nvidia card since the 480, so all the fawning over performance/watt is overblown. #3 is the only real-world usefulness to this, I think - it's pretty epic that you could more or less go 980 SLI on a 650W power supply. But do you spend all your time when your PC is on obsessing over the readout of your Kill-A-Watt meter? No? I didn't think so. So let's talk more about performance and less about performance/watt.
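Point #1 is easy to sanity-check with quick arithmetic. The wattage, daily usage, and electricity rate below are illustrative assumptions, not figures from the review:

```python
# Rough annual cost of a 100 W difference in card power draw.
# All inputs are illustrative assumptions, not review figures.
watts_saved = 100        # load-power difference between two cards
hours_per_day = 4        # assumed gaming hours per day
rate_usd_per_kwh = 0.15  # assumed residential electricity rate

kwh_per_year = watts_saved / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * rate_usd_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")
```

That comes out to about 146 kWh, or roughly $22 a year under these assumptions - real money, but hardly decisive next to a $549 card.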

And saying that the point of this product is to replace the GK104 cards is silly. They EOL'd the GK110 gaming cards; clearly this is seen as a replacement for them. And from a pure performance standpoint, it's unimpressive. ~10% better than previous gen. *Yawn* ... all the lavish praise being given to this card is unjustified. It's nice, but "the new 8800" this is not.

The 970, on the other hand, is a real home run at $330. Insanely good value there. But the 980 is very by-the-numbers "meh".

Call me when the real high end Maxwell gets here.

Also, that last line in the review is hilarious (and almost certainly accurate), but quite unprofessional. Are we back in grade school here? Disappointing, W1zzard.


----------



## ensabrenoir (Sep 19, 2014)

Wow..... some people... The fact that Nvidia released something this great and didn't price it ridiculously is a win for everyone. This is just progress. Next AMD will answer, and the cycle will continue


----------



## BigMack70 (Sep 19, 2014)

ensabrenoir said:


> Wow.....some people. ... The fact that Nvidia released something this great and didn't price it ridiculously is a win for every one.   this is just progress.   Next amd will answer and the cycle will continue


Because the performance increase of the 980 is so small relative to the 780 ti, this release is not a win for:
1) Anyone who owns a 290/780 or better
2) Anyone wanting 4K gaming to become more viable

#2 is the big disappointment for me... it looks like we'll have to wait for big Maxwell to get any meaningful improvements to the 4k experience as far as performance goes.

It's a big win for those with GTX 6xx/HD 7xxx or older cards who are looking for an upgrade, though.


----------



## Katanai (Sep 19, 2014)

"Oh, and AMD seems fucked."


To read this on techpowerup was sooooooooooo good. Ahahahahahahahaha!


----------



## ensabrenoir (Sep 19, 2014)

BigMack70 said:


> Because the performance increase of the 980 is so small relative to the 780 ti, this release is not a win for:
> 1) Anyone who owns a 290/780 or better
> 2) Anyone wanting 4K gaming to become more viable
> 
> ...




The skip-a-generation rule will never go away for top-tier products. Everyone on mid to low end will love this, though. The best feature of all is the PRICE... coming from Nvidia of all people...


----------



## hippogriff (Sep 19, 2014)

I didn't expect anything phenomenal, so I am not disappointed. I will wait for Titan II or whatever will be a 20% improvement over my 780Ti Classified....


----------



## BigMack70 (Sep 19, 2014)

ensabrenoir said:


> The best feature of all is the PRICE...coming from Nvidia of all people ..



I wholeheartedly agree with this if we're talking about the 970. $330 for what is basically a 780 ti is absolutely insane. It's almost equivalent to a 50% price cut on their flagship card. That's just freaking nuts and I never expected that from Nvidia given their recent pricing trends.

I don't really agree with the impressiveness of the price on the 980 though. It's basically a 10% price cut and 10% performance boost compared to the 780 ti. That's nice and certainly nothing to complain about, but it's a very ordinary setup for a new GPU release, nothing special.

IMO I don't see why anyone would buy a 980 outside of forum-signature e-peen. The 970 is a vastly better value proposition, and if you have the spare cash, 970 SLI is just a measly $100 more than a single 980 for WAY more performance.


----------



## ensabrenoir (Sep 19, 2014)

BigMack70 said:


> I wholeheartedly agree with this if we're talking about the 970. $330 for what is basically a 780 ti is absolutely insane. It's almost equivalent to a 50% price cut on their flagship card. That's just freaking nuts and I never expected that from Nvidia given their recent pricing trends.
> 
> I don't really agree with the impressiveness of the price on the 980 though. It's basically a 10% price cut and 10% performance boost compared to the 780 ti. That's nice and certainly nothing to complain about, but it's a very ordinary setup for a new GPU release, nothing special.



Yeah... this is still Nvidia we're talking about here... I'm sure they've got a Ti with the performance you want, and a price we'll both hate, up their sleeves just waiting.


----------



## BigMack70 (Sep 19, 2014)

ensabrenoir said:


> Yeah... this is still Nvidia we're talking about here... I'm sure they've got a Ti with the performance you want, and a price we'll both hate, up their sleeves just waiting.



Yup GM210/GM200 or whatever the big maxwell chip winds up being called when it's released. That's what I'm interested in... I want to see 4k 60 fps achievable with two top cards in SLI, and we're not there yet. Fingers crossed that big maxwell will get us there.


----------



## the54thvoid (Sep 19, 2014)

crow1001 said:


> " Oh, and AMD seems fucked "
> 
> Real professional. Looks like someone is in bed with Nvidia.



Why? For stating the obvious? W1zzard has a personality, and TPU isn't a highbrow, etiquette-driven formal government body. If it were, there would be no profanity allowed in any post; in fact, there'd probably not even be a forum.

NV have dropped a bomb smack in the middle of AMD's performance product line-up. They have nothing to answer with for now, and what is troubling for AMD is that we know GM210 is coming too. The GTX 970 is a very sweet price/perf card. The AIB versions are even better.

The fact that our esteemed leader speaks his mind is a fucking breath of fresh air.  If you don't like it GTFO, it's his website.



BigMack70 said:


> What matters in the end, ..._<all the things you said>_...Call me when the real high end Maxwell gets here.
> 
> Also, that last line in the review is hilarious (and almost certainly accurate), but quite unprofessional. Are we back in grade school here? Disappointing, W1zzard.



What matters *IS* performance-per-watt. If you think that you and the folk that say MORE POWAH are all spot on, then wow, the tech industry must ALL be going in the wrong direction. I won't argue with you, as it's your point of view, but frankly, what NV have done is spot on from both a business and a tech point of view. I see you have 2x 780s, so I don't know what your beef is. This upgrade isn't for you, as it isn't for me. Although TBH, if I can move sideways to a better-performing product that consumes less energy, then hey, that's a positive to me.

And it's not about the money on electricity, it's the requirement of a growing population to be more frugal with finite resources.  But I'm not here to lecture anyone.

EDIT: and I'm not having a go at you BigMack, you're a decent forumer


----------



## 64K (Sep 19, 2014)

BigMack70 said:


> I wholeheartedly agree with this if we're talking about the 970. $330



I agree. I made my mind up early this morning after reading the reviews to go with the EVGA GTX 970 SC with ACX cooler. W1zzard says it's only 8% slower than a reference GTX 980 and it's $210 cheaper. It can be had at Newegg for $330 including the $10 EVGA rebate. It's a very nice upgrade from my GTX 680.


----------



## BigMack70 (Sep 19, 2014)

the54thvoid said:


> And it's not about the money on electricity, it's the requirement of a growing population to be more frugal with finite resources.



Then why aren't we marking these midrange/high-end cards down a lot and encouraging everyone to buy a 750 ti or use integrated graphics? Better yet, stop playing video games altogether and change your hobby to hiking. 

Seems to me that using this as a philosophical standard to evaluate GPUs is fundamentally hypocritical.

Performance/watt matters in the sense that if you can improve that metric, it will ultimately enable you to achieve much higher performance. But performance/watt is a means to an end, not an end in itself, and so we shouldn't be praising cards solely on the basis of performance/watt, as I'm seeing done so much around the 980 reviews.


----------



## the54thvoid (Sep 19, 2014)

BigMack70 said:


> Then why aren't we marking these midrange/high-end cards down a lot and encouraging everyone to buy a 750 ti or use integrated graphics? Better yet, stop playing video games altogether and change your hobby to hiking.



Thought someone would say that. Resorting to "giving up" technology is pretty much what the Taliban seek, so I'm not going to align myself with crazy fundamentalists. No, while we work with technology that requires power, we will always strive to rein in that power. I'm sorry you don't see it that way, but that is how technology develops. If you think that's wrong, argue with Intel, not me.



BigMack70 said:


> Seems to me that using this as a philosophical standard to evaluate GPUs is fundamentally hypocritical.
> 
> Performance/watt matters in the sense that if you can improve that metric, it will ultimately enable you to achieve much higher performance. But performance/watt is a means to an end, not an end in itself, and so we shouldn't be praising cards solely on the basis of performance/watt, as I'm seeing done so much around the 980 reviews.



Like someone has said, and like I said earlier to another post, this is the GM204 chip. You should bloody well know that the "big" Maxwell part is still coming. The stack was aligned as 770->780->780ti (GK104->GK110->GK110). This is the start of that process again: 970->980->980ti (we assume); it's still GM204->GM210->GM210 (we assume).

Don't worry, the more powerful part will appear.  You'll have your performance beast.


----------



## ensabrenoir (Sep 19, 2014)

crow1001 said:


> " Oh, and AMD seems fucked "
> 
> Real professional. Looks like someone is in bed with Nvidia.



No.... if he was, AMD would have listened to him and never tried to release that 285 nonsense to compete with the 760s in the face of the true competition.


----------



## GhostRyder (Sep 19, 2014)

ensabrenoir said:


> No.... if he was, AMD would have listened to him and never tried to release that 285 nonsense in the face of the true competition


At what point do you think the R9 285 was at all competition for the GTX 970 or 980? It's like saying the GTX 750 Ti was competition for the R9 290X: it was not even in the same category. It was intended to fit into the mid-range and replace the old mid-range cards with a new architecture introduction, which is exactly what it did: lower power consumption and more performance than the card it replaces...

The GTX 980 price point is not that bad, but it seems horrid because the GTX 970 sits at such a low spot. The GTX 970 is the real winner here and seems to be the choice card right now. The 980 is seated right where it's expected to be, considering it's better overall than the GTX 780 Ti, which costs more, while using less power and carrying more RAM. It's an overall excellent launch.


----------



## RazrLeaf (Sep 19, 2014)

BigMack70 said:


> Because the performance increase of the 980 is so small relative to the 780 ti, this release is not a win for:
> 1) Anyone who owns a 290/780 or better
> 2) Anyone wanting 4K gaming to become more viable
> 
> ...


Honestly, with the sheer amount of power required for 4K, I don't think you could pack enough transistors onto the die at a process larger than 20nm.
You'd need double the performance (of 1440p @ 60fps) in a generation, but that will never happen at this point. =/ It's not physically possible, and it wouldn't make business sense either.


----------



## BigMack70 (Sep 19, 2014)

the54thvoid said:


> Thought someone would say that.  The resort to 'give up' technology is pretty much what the Taliban seek, so I'm not going to align myself with crazy fundamentalists.  No, while we work with technology that requires power, we will always strive to reign in that power.  I'm sorry you don't see it that way but that is how technology develops.  If you think that's wrong, argue with Intel, not me.



This doesn't at all answer my objection that performance/watt is a means to an end and not an end in itself, and thus we should not be heaping praise on a high end GPU for reason of that metric alone. Performance/watt is meaningless in and of itself - the reason it's meaningful is because it enables either higher performance OR lower power consumption. It's the same thing with regard to how technology develops... technology doesn't push forward in an arbitrary quest for more performance/watt. It pushes forward toward the goal (depending on the application) of either more performance or lower power, and performance/watt improvements are the MEANS by which technology gets there.



RazrLeaf said:


> Honestly, with the sheer amount of power required for 4K, I don't think you could pack enough transistors on the die at a process larger than 20nm. You could expect to get double the performance (of 1440p @ 60fps) in a generation, but it will never happen at this point. =/



Given that Maxwell approaches 2x the efficiency of Kepler, I think it's reasonable to expect a pair of GM210/200 to hit 4k 60 Hz. That's just speculation of course, but I think reasonable. We're close as it is - only the most demanding games require you to turn settings down on a pair of GK110/GM204/Hawaii cards.


----------



## buggalugs (Sep 19, 2014)

The 980 is a nice card, but wow, some people have short memories.

AMD is already building a competitor for the 980. When the 7970 came out, it was much better than the 580: "Oh no, Nvidia is fucked." Then Nvidia released the 680 a few months later, 10% faster than the 7970. Then AMD released the 7970 GHz, the same or better than the 680. Then Nvidia released the 780: "oh no, AMD is fucked." Then AMD released the 290X, matching 780 performance. Now Nvidia releases the new-generation 980, and "AMD is fucked again," even though AMD is already manufacturing a competitor.

You can't compare old gen to new gen. AMD and Nvidia never release new generations at the same time; they're usually a few months apart.

4870 vs 280, same gen: 4890 vs GTX 285
5870 vs 480
6970 vs 580
7970 vs 680, same gen: 7970 GHz edition
290X vs 780, same gen: 780 Ti
390X vs 980: hasn't happened yet

If there is some inside knowledge about the 390X not being able to compete with the 980, please let us know; otherwise it's too early to write off AMD.


----------



## v12dock (Sep 19, 2014)

buggalugs said:


> 980 is nice card  but wow some people have short memories.
> 
> AMD is already building a competitor for the 980. When the 7970 came out, it was much better than the 580: "Oh no, Nvidia is fucked." Then Nvidia released the 680 a few months later, 10% faster than the 7970. Then AMD released the 7970 GHz, the same or better than the 680. Then Nvidia released the 780: "oh no, AMD is fucked." Then AMD released the 290X, matching 780 performance. Now Nvidia releases the new-generation 980, and "AMD is fucked again," even though AMD is already manufacturing a competitor.
> 
> ...


Thank You.


----------



## GhostRyder (Sep 19, 2014)

buggalugs said:


> 980 is nice card  but wow some people have short memories.
> 
> AMD is already building a competitor for the 980. When the 7970 came out, it was much better than the 580: "Oh no, Nvidia is fucked." Then Nvidia released the 680 a few months later, 10% faster than the 7970. Then AMD released the 7970 GHz, the same or better than the 680. Then Nvidia released the 780: "oh no, AMD is fucked." Then AMD released the 290X, matching 780 performance. Now Nvidia releases the new-generation 980, and "AMD is fucked again," even though AMD is already manufacturing a competitor.
> 
> ...


^^^This guy!  Great response!


----------



## W1zzard (Sep 19, 2014)

BigMack70 said:


> There are only three times performance/watt matters:
> 
> 1) You pay through the nose for electricity and see a meaningful difference in your monthly electricity bill by saving 100W or so of electricity (99.9% of people don't pay electricity rates this high)
> or
> ...


Performance/watt matters everywhere nowadays. I agree that the cost of electricity is not significant for most people, and that nobody really cares about saving the environment.

You can't just copy and paste more and more transistors/shaders/units into a GPU to make it more powerful. You always have to cool it somehow, and you need the voltage regulation circuitry to feed it, too.

So let's assume AMD doubles their Hawaii chip; that means twice the power consumption = twice the heat generated inside a 2-slot graphics card. Any ideas how to cool it? Alright, water. What about a few years down the line, when people are asking for another performance doubling? Can't cool it. Not even with water.

Also, the voltage regulation circuitry for such a card would be big. How do you plan on powering such a card? 3x 8-pin? 4x 8-pin? Maybe its own wall-powered PSU? You also need to feed these insane currents into the GPU through the little pins/solder balls. 400 W = 400 A at 1 V GPU voltage.

Option 1) Wait for TSMC to come out with 20 nm, and then wait another few years for 16/14/12 nm. And beg they'll take your chip manufacturing order before Apple's.
Option 2) Improve power efficiency from the ground up. Think about stuff that happens inside the chip and ask your smartest engineers how to do it differently with less power, even if it might cost you more transistors or die area.

NVIDIA did that and, boom, twice as efficient. If you need it faster, you can just copy and paste more units onto the chip without having to worry about power consumption.

Now take that efficiency and build a notebook gaming chip with it. Most important for laptop: battery life + heat. Win! -> $$
Then start building smartphone and tablet processors with awesome GeForce tech and better power consumption than everybody else: more $$
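W1zzard's 400 A figure above is just Ohm's-law arithmetic (I = P / V). A minimal sketch of that calculation; the per-ball current rating is purely a hypothetical number for illustration, not a real package spec:

```python
# Current the GPU core draws at a given package power and core voltage (I = P / V).
def gpu_core_current(power_w: float, vcore_v: float) -> float:
    """Amps delivered to the GPU core at the given power and core voltage."""
    return power_w / vcore_v

# W1zzard's example: a 400 W chip at 1.0 V core voltage pulls 400 A
# through the package's pins/solder balls.
print(gpu_core_current(400, 1.0))  # -> 400.0

# With a hypothetical per-ball rating of 0.5 A, you'd need on the order of
# 800 solder balls dedicated to power delivery alone.
print(gpu_core_current(400, 1.0) / 0.5)  # -> 800.0
```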


----------



## Max Mojo (Sep 19, 2014)

Didn't I swear just yesterday that I will not upgrade until my GTA V PC delay pain is finito on January 27th, 2015?
Luckily I haven't stuck to my word. Obviously a super card at first glance.

Now I'm waiting for the EVGA card reviews. The interesting model for me is the EVGA GTX 980 Superclocked ACX 2.0, though I'd prefer the vanilla stock version, which looks stunningly cool. By the way, the I/O connector positions look strange. Why not 3 DisplayPorts in a row, one after the other?
Definitely I will upgrade from GTX 680 SLI to a single GTX 980.


----------



## BigMack70 (Sep 19, 2014)

W1zzard said:


> Performance/watt matters everywhere nowadays. I agree that the cost of electricity is not significant for most people, and that nobody really cares about saving the environment.
> 
> You can't just copy and paste more and more transistors/shaders/units into a GPU to make it more powerful. You always have to cool it somehow, and you need the voltage regulation circuitry to feed it, too.
> 
> ...



I agree 100% that performance/watt matters in the sense that you must get performance/watt improvements to achieve either lower power consumption or higher performance. 

I do not agree that performance/watt matters in abstraction from those other metrics (power use or overall performance).

On a high end GPU, I do not think you can make a plausible argument that low power consumption is the desired metric by itself (if low power is the only concern, why are you buying a high end GPU at all?). I suppose you could make an argument that a balance between power consumption and raw performance is important (something which GM204 excels at), but I personally don't agree with this unless the performance/watt balance is so out of whack as to upset one of those criteria I mentioned earlier (your electric bill, noise/heat of the card, PSU requirements). 

I still think that unless perf/watt is at offensive levels, it is an irrelevant metric on a high end GPU and only the performance itself matters.


----------



## ensabrenoir (Sep 19, 2014)

GhostRyder said:


> At what point do you think the R9 285 was at all competition for the GTX 970 or 980?

Aaahhh...... NONE...... that's why I said 760.... The point of my statement was that their efforts and resources should have been placed elsewhere.... The 760 should not have been the benchmark to beat. After the 750 Ti.... everyone knew where Nvidia was headed. What AMD did is essentially like Ford building a new (2014) car to compete against a 2012 Honda when Honda is about to release the 2015 model. I feel that in order for AMD to truly right itself, it must think beyond the "just good enough" mentality. Unless they're moving tremendous volume, they will never grow, just maintain. I honestly believe....... honestly hope... AMD has better ideas and products than what we're seeing now. And yeah, I have Nvidia now, but I started with ATI and went AMD for most of my enthusiast life.


----------



## W1zzard (Sep 19, 2014)

BigMack70 said:


> that low power consumption is the desired metric by itself


I agree, at least not for our subset of users. But low power consumption is the gateway to success these days, and as I mentioned, it trickles down into your other products. Intel's x86 isn't so good for mobile devices right now, but what if they save another 20% on power through design changes and another 50% through a new process? (Intel has their own fabs, the best and most advanced process tech, and the money for R&D and construction.)


----------



## Nabarun (Sep 19, 2014)

W1zzard said:


> Performance/watt matters everywhere nowadays. I agree that the cost of electricity is not significant for most people, and that *nobody* really cares about saving the environment.
> ...


I care... And the cost of electricity is un-freakin-believable in this neighbourhood. Maybe prices are much lower in the US/Europe because of the many nuclear reactors; in developing countries the situation is nightmarish. The cost per unit of electricity burnt rises exponentially over here. My PC runs 24x7 mostly, so power efficiency is a huge factor. From what I've gathered courtesy of your 980/970 reviews, it looks like the 970 will be my new card after all. Thanks for such extensive coverage of game-specific performance with so many different cards.


----------



## W1zzard (Sep 19, 2014)

Nabarun said:


> I care... And the cost of electricity is un-freakin-believable in this neighbourhood. Maybe prices are much lower in the US/Europe because of the many nuclear reactors; in developing countries the situation is nightmarish. The cost per unit of electricity burnt rises exponentially over here. My PC runs 24x7 mostly, so power efficiency is a huge factor. From what I've gathered courtesy of your 980/970 reviews, it looks like the 970 will be my new card after all. Thanks for such extensive coverage of game-specific performance with so many different cards.


Why don't you turn off your PC as often as possible if you care so much?  

edit: Personally, one of the reasons I use an NV graphics card in my work PC is that I run multi-monitor, where AMD has very high power consumption, so I get you


----------



## Nabarun (Sep 19, 2014)

W1zzard said:


> Why don't you turn off your PC as often as possible if you care so much?


Because I also care about sharing (read: seeding)


----------



## thebluebumblebee (Sep 19, 2014)

> Just like on GTX 750 Ti, NVIDIA's new architecture brings *massive power consumption improvements.*


Your statement is true if you are talking performance/watt.  But the GM204 uses the same amount of power as does the GK104.  The performance has been improved, the power consumption has not.  According to your own testing, the peak power numbers are GTX 770 = 182 watts, GTX 980 = 184 watts.


----------



## ZoneDymo (Sep 19, 2014)

Guess I'll be devil's advocate through all this praise.

Yes, the power usage is impressive, BUT the performance really is not: barely any better than the GTX 780 Ti, which is a huge disappointment, as buying this card, IMO, with 4K gaining ground as fast as it is, just seems pointless.
This card will not do 4K well; in fact, anything shy of the 295X won't, so what's the point? BRING OUT FASTER CARDS!


----------



## Nabarun (Sep 19, 2014)

thebluebumblebee said:


> Your statement is true if you are talking performance/watt.  But the GM204 uses the same amount of power as does the GK104.  The performance has been improved, the power consumption has not.  According to your own testing, the peak power numbers are GTX 770 = 182 watts, GTX 980 = 184 watts.


It's the same thing. 


ZoneDymo said:


> Guess I'll be devil's advocate through all this praise.
> 
> Yes, the power usage is impressive, BUT the performance really is not: barely any better than the GTX 780 Ti, which is a huge disappointment, as buying this card, IMO, with 4K gaining ground as fast as it is, just seems pointless.
> This card will not do 4K well; in fact, anything shy of the 295X won't, so what's the point? BRING OUT FASTER CARDS!


2x970 will be a much better option imho.


----------



## MxPhenom 216 (Sep 19, 2014)

thebluebumblebee said:


> Your statement is true if you are talking performance/watt.  But the GM204 uses the same amount of power as does the GK104.  The performance has been improved, the power consumption has not.  According to your own testing, the peak power numbers are GTX 770 = 182 watts, GTX 980 = 184 watts.



Good point, but I'll take a pretty decent performance increase while staying at relatively the same power consumption any day of the week.


----------



## Assimilator (Sep 19, 2014)

HumanSmoke said:


> The usual - price cuts. 290X gets chopped down to $399 + a refreshed game bundle next week apparently.



Except the GTX 970 @ $329 beats the R9 290X @ $399 in every metric. So unless you want free games, it's a no-brainer as to which card to go for. IMO AMD will have to price the 290X at $349 to even HOPE of competing, and considering they're probably hemorrhaging cash at $399... like W1zz said, they're fucked.


----------



## Sony Xperia S (Sep 19, 2014)

Assimilator said:


> Except the GTX 970 @ $329 beats the R9 290X @ $399 in every metric. So unless you want free games, it's a no-brainer as to which card to go for. IMO AMD will have to price the 290X at $349 to even HOPE of competing, and considering they're probably hemorrhaging cash at $399... like W1zz said, they're fucked.



I agree, R9 290X at $399.99 is still a NO-GO. 

AMD needs very urgently to slash it at least to $299.99 and be thankful that they have the two consoles to supply.


----------



## Assimilator (Sep 19, 2014)

buggalugs said:


> 7970 vs 680 same gen 7970 Ghz edition
> 290X vs 780 same gen 780Ti



Except those *aren't* the same generation. 7970 is Southern Islands, 680 is Kepler. 290X is Volcanic Islands, 780Ti is still Kepler. What you actually have is a "next" generation architecture (Volcanic Islands) being only marginally better than "old" generation architecture (Kepler). And now we have Maxwell, which is newer and better than 290 - so nVIDIA has actually leapfrogged AMD!

Pirate Islands/R9 300 series will either be a massive improvement, in which case AMD is safe - or it'll be an incremental update to the Volcanic Islands architecture. If the latter (seems likely considering the lack of R9 300 performance figure "leaks"), W1zz's remarks about AMD are correct, and they can either choose to launch R9 300 at money-losing prices and hope to gain/retain market share - or go back to the drawing board, which loses them time and hence market share. Either way, I would guess that the AMD GPU design team is going to be having some very late nights over the next few months...


----------



## GhostRyder (Sep 19, 2014)

Assimilator said:


> Except those *aren't* the same generation. 7970 is Southern Islands, 680 is Kepler. 290X is Volcanic Islands, 780Ti is still Kepler. What you actually have is a "next" generation architecture (Volcanic Islands) being only marginally better than "old" generation architecture (Kepler). And now we have Maxwell, which is newer and better than 290 - so nVIDIA has actually leapfrogged AMD!


Umm, no, that is not true... the HD 7970/GHz edition (Tahiti) and Hawaii (R9 290/X) are all still part of the GCN architecture, the larger picture, similar to the Kepler family's GK104 and GK110 (680/770 or GTX 780/Ti/Titans). Hawaii is just the full-powered GCN chip, like the GK110 was the full-powered Kepler chip.


Assimilator said:


> Except the GTX 970 @ $329 beats the R9 290X @ $399 in every metric. So unless you want free games, it's a no-brainer as to which card to go for. IMO AMD will have to price the 290X at $349 to even HOPE of competing, and considering they're probably hemorrhaging cash at $399... like W1zz said, they're fucked.


Also, no, it does not... The R9 290X is still better in certain areas, and its price will come down to reflect that.

The GTX 970 is one heck of a value for a card, and the GTX 980 is the best card on the market at the moment.


----------



## thebluebumblebee (Sep 19, 2014)

The GM204 is doing exactly what it is supposed to do.  It is beating the previous generation's top dog.  That's what the GK104 did.  That's what the GF104 did.
GTX 980 @ 184 watts beats the GTX 780 Ti @ 268 watts
GTX 770 @ 182 watts beats the GTX 580 @ 226 watts (GTX 480 @ 257)
GTX 460 @ 119 watts beats the GTX 285 @ 164 watts
I should mention that the GTX 560 Ti (GF114) used 147 watts.
(all w1zzard's peak power consumption numbers)
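As a back-of-the-envelope reading of the numbers above: if the new card at least matches the old flagship while drawing less power, its perf/watt gain is bounded below by the power ratio. A quick sketch, where the default performance index of 1.0 is an assumed "at least equal" baseline rather than a measured figure:

```python
# Lower bound on the perf/watt improvement when a new card is perf_ratio times
# as fast as the old one while drawing new_w watts instead of old_w watts.
def min_perf_per_watt_gain(old_w: float, new_w: float, perf_ratio: float = 1.0) -> float:
    return perf_ratio * old_w / new_w

# GTX 980 (184 W) vs GTX 780 Ti (268 W), assuming merely equal performance:
print(round(min_perf_per_watt_gain(268, 184), 2))  # -> 1.46

# GTX 460 (119 W) vs GTX 285 (164 W):
print(round(min_perf_per_watt_gain(164, 119), 2))  # -> 1.38
```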

@W1zzard what GPU do you see as the one that shocked the GPU world the most?  8800 GTX, 8800GT, 8800 GTS/512, HD5870, GTX 980, or something else?  My history with GPU's is limited, but my vote goes for the HD5870.

For those who see doom and gloom at AMD, remember how bad they were with the HD2xxx series?  Sometimes it takes a good punch from the competition to get the ball rolling.  Think Intel's Core response to Athlon 64.  But, the brain drain that has occurred at AMD over the past few years does concern me. Competition is good for all of us.

Now, the big question: is Nvidia phasing out the Gx210 GPUs? With global pressure on power consumption, I would not be surprised if they did.


----------



## overpass (Sep 19, 2014)

They can overclock this card to 1,500 MHz on air. That is where power consumption and temperature matter, but perhaps the review sites get "platinum"-binned chips that are hand-picked to showcase Maxwell's true power. Variance will be there, and hopefully 780 Ti-style touches of higher-quality components or extra memory bandwidth will help the 980 fulfill its potential. I can't wait to see the reviews of the MSI or Gigabyte cards, to see if they are the ones to push the advantage of the chip even further.


----------



## apoe (Sep 19, 2014)

64K said:


> I agree. I made my mind up early this morning after reading the reviews to go with the EVGA GTX 970 SC with ACX cooler. W1zzard says it's only 8% slower than a reference GTX 980 and it's $210 cheaper. It can be had at Newegg for $330 including the $10 EVGA rebate. It's a very nice upgrade from my GTX 680.


I purchased one, but all the EVGA cards currently being sold use the old ACX 1.0 cooler rather than the new ACX 2.0. After finding out, I cancelled the order. Amazon noted this in their listings, but Newegg didn't mention it anywhere; it wasn't until I took a closer look at the images that I realized my mistake.


----------



## Rahmat Sofyan (Sep 19, 2014)

Can anybody explain these?

*(two power-consumption charts from other reviews were attached here; images not preserved)*

----------



## v12dock (Sep 19, 2014)

Rahmat Sofyan said:


> Can anybody explain these?
> 
> 
> 
> ...


I have also seen other reviews that put the 980 a little higher in power draw.


----------



## Hilux SSRG (Sep 19, 2014)

I'm getting to the conversation late, but holy sh*t. I did not expect such low idle power consumption.

The cut-down GTX 970 looks to be the better value overall. NVidia hit one out of the park!


----------



## The Von Matrices (Sep 19, 2014)

thebluebumblebee said:


> Your statement is true if you are talking performance/watt.  But the GM204 uses the same amount of power as does the GK104.  The performance has been improved, the power consumption has not.  According to your own testing, the peak power numbers are GTX 770 = 182 watts, GTX 980 = 184 watts.


It all depends on your perspective of what the previous product was.  If you go by price bracket, the previous $550 card was the GTX 780 Ti, and the GTX 980 has better performance and lower power.  If you go by the market segment of the die, then GM204 only has higher performance than GK104.  I think that price is a better way to compare than the die used.


----------



## Rahmat Sofyan (Sep 19, 2014)

v12dock said:


> I have also seen other reviews that are putting the 980 a little more power hungry



Yeah, not by too much indeed, but not quite that "efficient" either.

I know that TPU measures power at the card itself, but it shouldn't differ that much from other reviews.

That's why I'd like an explanation: what actually causes the difference?


----------



## HumanSmoke (Sep 19, 2014)

crow1001 said:


> " Oh, and AMD seems fucked "
> Real professional. Looks like someone is in bed with Nvidia.


More like someone with a firm grasp of reality.
AMD's present financial viability relies on their graphics sales ($82m in income versus $9m for the processor division). Without that graphics revenue, AMD falls further into the red. AMD is already below the $1bn in cash/securities it needs to sustain the business (and its R&D), sitting at $948m, and its interest payments on (rising) debt amount to $50m per quarter.
Will AMD bounce back with better sales in the future? Very likely.
Does that help them in Q4 2014 and the all-important holiday season sales bonanza? No.
In the short term, AMD will just be adding to its debt burden (more so if it has to cut and run on some SKUs, since price cuts tend to put pressure on margins down the product stack), decreasing its ability to fund R&D for future products.

W1zzard just encapsulated this whole post into a short and succinct statement of fact.


Rahmat Sofyan said:


> Can anybody Explain These


Techspot and Tom's used the Gigabyte G1 Gaming cards - almost the highest factory overclocked parts available at the moment (EVGA's GTX 970 FTW ships with a slightly faster 1367MHz boost clock). I'm sure if you clocked any Kepler or AMD card that high it might also be reflected in power usage.


thebluebumblebee said:


> Now, the big question: Is Nvidia phasing out the Gx210 GPU's?  With global pressure on power consumption, I would not be surprised if they did.


Unlikely. GM200 is already circulating for testing and system validation. Why would Nvidia can a GPU intended primarily for professional (high-margin) use?
Even moving to lower-power GPUs doesn't really help, since three or four of them in CrossFireX/SLI put you pretty much back at square one on power consumption.


----------



## erixx (Sep 19, 2014)

If anybody who says "I am not buying now, no massive performance improvement!" is NOT buying the Titan 2 the day it comes out, they will be ordered to kneel and suck!!


----------



## 1d10t (Sep 19, 2014)

Why has the F-word become the norm here? I thought this was a civilized site. Or has everybody already lost their minds over a piece of card?


----------



## W1zzard (Sep 19, 2014)

1d10t said:


> I thought this was a civilized site


nope, not here.


----------



## eidairaman1 (Sep 20, 2014)

Thumbs down; this feels more like an amateur review, and very biased in the way it is written.

One thing is right, though: not much gain over a 780 Ti. This feels more like an 880 at best.


----------



## Nordic (Sep 20, 2014)

I remember when it was said that w1zzard was biased towards amd.


----------



## 64K (Sep 20, 2014)

james888 said:


> I remember when it was said that w1zzard was biased towards amd.




I remember.


----------



## D007 (Sep 20, 2014)

Are you sure it is HDMI 2.0?
MSI's page says HDMI 1.4:
http://us.msi.com/product/vga/GTX_980_4GD5.html#hero-specification

I have been looking everywhere for clarification, but even Nvidia's site doesn't list which HDMI version it supports.
If it's 1.4, man, they dropped the ball.


----------



## Maban (Sep 20, 2014)

D007 said:


> Are you sure it is HDMI 2.0?
> On MSI it says HDMI 1.4
> http://us.msi.com/product/vga/GTX_980_4GD5.html#hero-specification
> 
> ...


It is indeed HDMI 2.0


----------



## D007 (Sep 20, 2014)

Maban said:


> It is indeed HDMI 2.0


I hope so because I have 2 on the way right now..lol

EDIT: I talked to EVGA on the phone just now and they verified HDMI 2.0.


----------



## Maban (Sep 20, 2014)

D007 said:


> I hope so because I have 2 on the way right now..lol


If you'd like to look at all of what Maxwell can do you can take a look at the whitepaper: http://international.download.nvidi...nal/pdfs/GeForce_GTX_980_Whitepaper_FINAL.PDF


----------



## D007 (Sep 20, 2014)

Maban said:


> If you'd like to look at all of what Maxwell can do you can take a look at the whitepaper: http://international.download.nvidi...nal/pdfs/GeForce_GTX_980_Whitepaper_FINAL.PDF



Well, thanks again. 
I was getting tired of electric bills over $300 a month lol.


----------



## The Von Matrices (Sep 20, 2014)

Why do they send such fancy boxes to the reviewers?  I can say though that the Maxwell–Ampère equation is a nice touch.


----------



## Steevo (Sep 20, 2014)

This is one of the most civil and well-discussed threads I have ever seen on an open forum. 


Keep it up fuckers!


----------



## BiggieShady (Sep 20, 2014)

I'm astonished ... it took me time to put in perspective that GTX 980 SLI sips less power than a single 290X ... then it took me some more time to realize we are finally getting quiet air cooled SLI setups


----------



## Sony Xperia S (Sep 20, 2014)

1d10t said:


> Why has the F-word become the norm here? I thought this was a civilized site. Or has everybody already lost their minds over a piece of card?



In this case you are right. 

No more usage of those words with f*** and s&I*. 

Please, guys, let's keep a higher level of conversations.


----------



## the54thvoid (Sep 20, 2014)

Rahmat Sofyan said:


> Can anybody Explain These
> 
> 
> 
> ...



The second chart at least says SYSTEM power consumption.  W1zzard measures card-only power consumption.  A lot of reviewers measure system power; you sometimes need to double-check the test setups to see it.
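The card-versus-system distinction explains most of these chart disagreements. A minimal sketch of the arithmetic, using an assumed PSU efficiency, rest-of-system draw, and 290X peak figure purely for illustration:

```python
# Card-level watts vs. at-the-wall watts. Wall readings fold in the
# rest of the system plus PSU conversion losses, so gaps between cards
# look larger there. All numbers here are illustrative assumptions.
def wall_power(card_w, rest_of_system_w=120.0, psu_efficiency=0.88):
    """AC draw at the wall = total DC load / PSU efficiency."""
    return (card_w + rest_of_system_w) / psu_efficiency

gtx_980 = wall_power(184)   # ~184 W card -> ~345 W at the wall
r9_290x = wall_power(282)   # assumed 290X peak -> ~457 W at the wall
print(f"card gap: {282 - 184} W, wall gap: {r9_290x - gtx_980:.0f} W")
```

With these assumptions, a 98 W card-level gap shows up as roughly 111 W at the wall, and the whole baseline shifts by the platform's draw, so card-only charts and system-level charts are not directly comparable.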



eidairaman1 said:


> Thumbs down, this feels more like an amateur review plus very biased they way it is written.
> 
> One thing is right though. Not much gain over a 780Ti. This feels more like a 880 at best.



The only bias people see on TPU from any of the reviewers is when a review doesn't suit that individual's brand alignment.  For all the civility we have in this thread, there is a small enough section of tears to show some folk can't take the truth.

Here's W1zz referring to the Titan Z (It had to make it into the thread, didn't it!)



> is nowhere as *obnoxious* as the $3000 NVIDIA will ask for its GeForce GTX TITAN-Z since that is $1000 costlier than buying a pair of GTX TITAN-Black cards. In a way, NVIDIA’s marketing *arrogance* is working in AMD's favor.



Yeah, W1zz is biased, for sure.


----------



## HumanSmoke (Sep 20, 2014)

Sony Xperia S said:


> In this case you are right.
> No more usage of those words with f*** and s&I*.
> Please, guys, let's keep a higher level of conversations.


I'm OK then. I never use the word f***, and seemingly you don't either





I promise to only use "fuck" and "shit" and not those weird "f***" and "s&!*" notations. Two thumbs up Sony, F*** and s&!* is for pussies!


----------



## birdie (Sep 20, 2014)

D007 said:


> Are you sure it is HDMI 2.0?
> On MSI it says HDMI 1.4
> http://us.msi.com/product/vga/GTX_980_4GD5.html#hero-specification
> 
> ...



It's 100% HDMI 2.0 (enabled) - AnandTech specifically clarifies it.


----------



## birdie (Sep 20, 2014)

Can we please stop discussing the single use of curse words in the article and move onto the article and GPUs themselves? It's getting boring.


----------



## the54thvoid (Sep 20, 2014)

birdie said:


> Can we please stop discussing the single use of curse words in the article and move onto the article and GPUs themselves? It's getting boring.



Only if you stop double posting 

In fairness, the topic is relevant, as what we are seeing is _grossly hypocritical behaviour_ that forums should call out.  Sony aimed that rant at me and yet he's suggesting we all shouldn't swear?  Yeah, right.  Using colloquial language is not an offense; using it to offend is.  If I say "that gfx card is fucking awesome", that's OTT but not offensive.  If I call someone a f****** douchebag, that is, and I'd expect an infraction for it.

But back to the card.

I'd like to say, back at BigMac, that he's right about the power issue. I see his point; we're just arguing from different landmarks.  I'd be quite happy to have a card 30-50% faster than what I have now at the same energy cost, rather than a card at the same performance with a lower energy cost.

But I know this card is only a 204 core and not the 210, so I'm still impressed.  This chip effectively replaces the 680, not the 780 or 780 Ti.  What some folk are missing is that the GTX 670/680 and GTX 780/780 Ti are _all Kepler cards_.  This is Maxwell, and Maxwell is about 75% faster than its predecessor, the GTX 680 (the GK104 core).

What we have is:

GK104 -> GM204 = about 75% better performance at the same power cost
GK110 -> GM200/GM210 = nobody knows yet.

Although this is a 9xx-series card, it is only one generation on from the 6xx series.  If Nvidia didn't have such schizophrenic naming schemes for its marketing, we'd all be clearer.  So again, yes, the performance gain over a 780/780 Ti is not that great, but that is not the card it's replacing: the 980 replaces the 680 (architecturally), not the 780/780 Ti.

Now, if the GM210 is not way better than the 780/780 Ti, then hey, I'll be deeply unimpressed.


----------



## Frick (Sep 20, 2014)

Dunno if anyone has said this, but an Extreme Edition would be interesting. Put some insane cooling on them, blow the power considerations to smithereens, and see how fast they can get. Or is that what EVGA is doing with the Kingpins, or however you spell them?

EDIT:

http://www.techpowerup.com/205444/e...ak-new-records-with-evga-geforce-gtx-980.html

Wait, yeah, that's it. 2 GHz. Shitsticks.


----------



## DaedalusHelios (Sep 20, 2014)

Well, I bought the faster-clocked EVGA GTX 980 at Newegg. After seeing W1zzard's review I couldn't resist. Once drivers mature the SLI scaling of the new architecture, I might go SLI.





64K said:


> I remember.



Do you remember when AMD/ATI fanboys insisted power consumption was more important than performance when the GTX 480 came out? What will they say now?
I just hope AMD comes up with something that comes close in some way. We need competition to drive innovation.


----------



## GhostRyder (Sep 20, 2014)

BiggieShady said:


> I'm astonished ... it took me time to put in perspective that GTX 980 SLI sips less power than a single 290X


No it does not?  Even under max load it is not double, because that would mean a single 290X uses ~380 watts, which it does not, unless you're factoring in a big inefficiency loss in SLI that keeps power from scaling to its max and drops each card to ~150 watts.  Decent 4-way SLI review...


On a different note, I think overclocking the GTX 980 is going to be something fun to watch.  When we factor in the Lightnings and Classifieds, I bet those will really bring something awesome to the table this go-round.  I would really like to see some of those hit the market.


----------



## ZoneDymo (Sep 20, 2014)

the54thvoid said:


> The second chart at least says SYSTEM power consumption.  W1zzard uses card power consumption.  A lot of reviewers measure system power - you need to double check the test set ups to see it sometimes.



I think it's not about the overall number but about how the cards stack up against the established cards, and on those charts they don't look nearly as efficient as portrayed in the review here.


----------



## BiggieShady (Sep 20, 2014)

GhostRyder said:


> No it does not?



Yeah, I meant the entire system. Average at load, I presume, from PC Perspective. I don't know how, but an entire system with GTX 980 SLI needs less power at load than a system with a single 290X.


----------



## NorthboundOcclusive (Sep 20, 2014)

"Oh, and AMD seems fucked."


----------



## N3M3515 (Sep 20, 2014)

Pancho said:


> Dang, I was hoping this would be a nice improvement over last generation, but it is a mediocre upgrade at best... Now I will be waiting till April or May, for nVidia to bring something to the table worth spending my money on.... count me not impressed.



One can't ask too much from 28nm, let's wait for 20nm


----------



## Protagonist (Sep 20, 2014)

It's been a while since I was in here; I see TPU is better than ever... nice review, keep it up.

The GM204 and the 9xx naming hint that we most likely won't see a refresh for Maxwell, because this appears to be the refresh already: the 750 Ti was first-gen Maxwell (GM107). Maybe the next node, be it 20nm or 16nm, will be Pascal... just my thoughts.


----------



## Frick (Sep 20, 2014)

BiggieShady said:


> Yeah, I meant entire system. Average at load I presume from pcperspective - I don't know how, but entire system with GTX 980 SLI needs less power at load than system with single 290X



... we have entered a phase in which we can power high end desktop systems with cheap 400W PSU's. Ye gods.


----------



## DaedalusHelios (Sep 20, 2014)

Frick said:


> ... we have entered a phase in which we can power high end desktop systems with cheap 400W PSU's. Ye gods.


I would say 500 W would be safe for GTX 980 4GB SLI and 400 W would be safe for a single GTX 980 4GB. You want at least 100 W of breathing room to make sure your PSU has a long life.

That being said, my 1200 W PSU is seriously overkill now.
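The sizing rule above (peak GPU draw per card, plus the rest of the system, plus breathing room) can be sketched as follows. The per-component wattages are ballpark assumptions, not measurements:

```python
# Rough PSU sizing: GPU peak draw times card count, plus the rest of
# the system, plus safety headroom. All wattages are assumptions.
def recommend_psu_w(gpu_peak_w, n_gpus=1, rest_of_system_w=150, headroom_w=100):
    return n_gpus * gpu_peak_w + rest_of_system_w + headroom_w

single = recommend_psu_w(184)             # one GTX 980
sli = recommend_psu_w(184, n_gpus=2)      # GTX 980 SLI
print(single, sli)  # 434 and 618 -> roughly a 450 W and a 650 W unit
```

With these assumed numbers the answers land a little above the 400 W / 500 W estimates in the post, which is the whole point of headroom: peak transients and PSU aging eat into the margin.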


----------



## tajoh111 (Sep 21, 2014)

GhostRyder said:


> At what point do you think the R9 285 was at all competition for the GTX 970 or 980?  It's like saying the GTX 750 Ti was competition for the R9 290X; it was not even in the same category and was intended to fit into the midrange and replace the old midrange cards with a new architecture introduction.  Which is exactly what it did: lower power consumption and more performance than the card it replaces...
> 
> The GTX 980's price point is not that bad, but it seems horrid because the GTX 970 is at such a low spot.  The GTX 970 is the real winner here and seems to be the choice card right now.  It's seated right where it's expected to be, considering it's better overall than the GTX 780 Ti, which costs more, while using less power and carrying more RAM.  It's an overall excellent launch.



From a design standpoint, the R9 285 is the competition for GM204, i.e. the GTX 970 and 980. If you didn't know, Tonga is the first of a new series of cards called Pirate Islands, which is analogous to Nvidia's Maxwell cards.

What makes the R9 285, or the Tonga chip, the direct competitor to GM204 is that both will be the second-biggest chip in their product stacks. In addition, these two chips will probably have the closest die sizes of any pairing in the upcoming generation, and a similar die size means the GPUs cost a similar amount to produce.

Financially, this means Nvidia can charge more for a GPU that costs about as much to produce as Tonga. That is a technological advantage producing a financial one.

The problem for AMD is that the GTX 970 is shaving around 150 dollars off the margins of their chips, and as these prices drop, the rest of AMD's product stack has to drop in price (an R9 285 doesn't make sense at $250 if an R9 290X is $299), to the extent where they might not make any money, at least not enough to recover the R&D they spent.



buggalugs said:


> 980 is a nice card, but wow, some people have short memories.
> 
> AMD is already building a competitor for the 980X. When the 7970 came out, it was much better than the 580: "Oh no, Nvidia is fucked." Then Nvidia released the 680 a few months later, 10% faster than the 7970. Then AMD released the 7970 GHz, same or better than the 680. Then Nvidia released the 780: "oh no, AMD is fucked." Then AMD released the 290X, matching 780 performance. Now Nvidia releases the new-generation 980 and "AMD is fucked again," even though AMD is already manufacturing a competitor.
> 
> ...



To reiterate the above, the 390X is not going to be facing the GTX 980; it will be facing GM200/GM210, which is big Maxwell.

The 390X, or Fiji, will be the replacement for Hawaii, the 290X; Tonga is the replacement for the Tahiti/280 chips. For Nvidia, the GTX 980 is the replacement for the GTX 680, and GM210 is the replacement for the GK110 chips.

What this means is that for cards sold above 200 dollars, AMD will be making cards using Tonga and Fiji while Nvidia will be using GM204 and GM210. This poses two problems for AMD. First, since Tonga and Fiji are relatively new or upcoming products, AMD can't drum up a new GPU in that product range in the near future; new chips take a while to produce, and if the cards are not competitive, AMD has to take a price hit and wait a long time for the next generation to get back in the game. The second problem compounds this: since GM204 chips smash Tonga chips, their big brothers, which are based on the same architectures, are likely to have the same relationship (what also helps this prediction is that GM204 and Tonga are so similar in size and power consumption). Meaning GM210 chips should smash Fiji/390X chips.

At least the GTX 680 and 7970 were very competitive with each other, so it's no surprise that their big brothers, the GTX 780 Ti and 290X, were competitive too. The same cannot be said of Tonga and GM204; the 75 percent difference is massive (TechPowerUp charts).

If Fiji chips are uncompetitive with GM210 and closer in performance to Nvidia's GM204 chips like the GTX 980 or 970, their prices will have to reflect that. The problem is that rumors suggest all-in-one cooling (expensive), a big 500mm2+ chip, and high power consumption (a complex board with more expensive electrical components), which adds up to a very expensive card to produce. If the GTX 980 performs similarly to Fiji or the 290X, all Nvidia has to do to kill the margins is drop the GTX 980's price to 450 dollars, and AMD makes no money (and will lose money on R&D expenditure). Nvidia will still make money at that price, considering AMD is able to sell Tonga at $250.

AMD cannot afford to lose any more money. Their cash on hand is below their operating threshold, their R&D budget has shrunk (the whole company's R&D is more than 20% lower than Nvidia's, which doesn't research desktop CPUs), and real estate assets have been sold to prevent bankruptcy.

AMD will remain open; they are resilient and have the console program, which is just enough to pay their debt interest. But they may be forced to contract further, which could hurt their long-term ability to compete.


----------



## GhostRyder (Sep 21, 2014)

BiggieShady said:


> Yeah, I meant entire system. Average at load I presume from pcperspective - I don't know how, but entire system with GTX 980 SLI needs less power at load than system with single 290X


Well, there is no doubt that a 290X uses much more power; it's plainly obvious, since the 980 uses less power than the 780 Ti, which uses less power than the 290X.  That being said, I find it hard to see a second 980 adding only 100 watts; that seems like a lowball figure, seeing as power usage under high stress is around ~180 watts for a single card on its own, depending on clocks and cooler.  However, it's still significantly less in the long run...



tajoh111 said:


> From a design standpoint, the R9 285 is the competition for GM204, i.e. the GTX 970 and 980. If you didn't know, Tonga is the first of a new series of cards called Pirate Islands, which is analogous to Nvidia's Maxwell cards.
> 
> What makes the R9 285, or the Tonga chip, the direct competitor to GM204 is that both will be the second-biggest chip in their product stacks. In addition, these two chips will probably have the closest die sizes of any pairing in the upcoming generation, and a similar die size means the GPUs cost a similar amount to produce.
> 
> ...



Tonga is similar to the GM107 chip in that it was a new chip to give a taste of something to come.  The difference here is that it was designated to replace the aging Tahiti architecture while showing a few improvements and landing at a decent midrange price (versus launching in a product slot that was not available yet).  It was literally advertised as competing with the GTX 760, not the GTX 970 or 980, and was never intended to be such.  Even its number is still in the 2xx series, which puts it in line with the other cards in that series rather than with the next-generation cards that will have more power to spare.  The R9 390X is a ways off at this point, but it is going to be the competition for the 980, in the same way the HD 7970 came out and then, after a few months, the GTX 680 came out to fight back.  It's not any different from normal; it's just how things work in this game.

AMD got hit with a curveball not by the GTX 980 but by the GTX 970 having such a low price point.  The overall average still says the R9 290X is a bit ahead, depending on clocks, especially at high resolutions, while the GTX 980 goes beyond it.  The 290X will likely fall to the $350-400 range, the 290 will fall to $300, and the rest of the cards will get knocked down to follow suit.  As for the R&D discussion, that is a different argument altogether and better suited to a different topic, since pushing it much further turns a GPU review thread into the wrong type of discussion.  But AMD's R&D is not in danger of losing everything just because of this...


----------



## HumanSmoke (Sep 21, 2014)

GhostRyder said:


> Well, there is no doubt that a 290X uses much more power; it's plainly obvious, since the 980 uses less power than the 780 Ti, which uses less power than the 290X.  That being said, I find it hard to see a second 980 adding only 100 watts; that seems like a lowball figure, seeing as power usage under high stress is around ~180 watts for a single card on its own, depending on clocks and cooler.  However, it's still significantly less in the long run...


Depends on the utilization of both the first and second card. Any CPU limitation or lack of SLI optimization will affect overall power draw, so any power figures need to take the workload into account.


GhostRyder said:


> Tonga is similar to the GM 107 chip in that it was a new chip to give a taste of something to come.  Difference here is that it was designated to replace the aging Tahiti Architecture while showing us a few improvements and offering as a decent midrange price


I think you'll find that Tonga isn't Tahiti's successor (just as GM204 isn't GK110's successor; successors don't usually land in the same performance ballpark as the chip they're replacing). It is Pitcairn/Curacao's successor. Tahiti's successor will be Bermuda (Pirate Islands). Fiji, AMD's large-die answer to GM200, has no current analogue in AMD's lineup. BTW: Tonga is Volcanic Islands, not Pirate Islands. There is an overlap of architectural tweaks that crosses GPU series with AMD (Hawaii and Bonaire: Sea Islands; Curacao: Southern Islands; Tonga and Iceland(?): Volcanic Islands; Bermuda and Fiji: Pirate Islands).


GhostRyder said:


> AMD's R&D is not in danger of losing everything just because of this...


No, but it certainly won't help matters. Bear in mind that this also has a knock-on effect:
1. Mobile GM204 (and likely GM206, which will be mobile-centric) will definitely be on OEMs' radars.
2. A 10-11 SMM GTX 960 is guaranteed, which creates pressure on the high-volume lower segment of the product stack, and I'm also betting that Nvidia is holding a 14-15 SMM GTX 970 Ti in reserve in case AMD brings out a fully enabled Tonga SKU. A pricing realignment would just compound the present situation.
3. With a wider uptake of GM204 cards (and I've seen a number of forumers here and elsewhere looking to change camps if they haven't already), Mantle and AMD's other features are marginalized even further as the installed user base shrinks relative to the opposition (and IMO AMD's Mantle/Gaming Evolved growth led directly to the GTX 970's aggressive pricing). AMD has already invested time, money, and effort into making Radeon a more saleable proposition, and a large part of that is being obliterated by a shift in current sales. What is the point of Mantle if the opposition's DX11 cards peg performance equal or higher? Without the hardware to walk the walk, the features' talk becomes rather insignificant.


----------



## tajoh111 (Sep 21, 2014)

GhostRyder said:


> Well, there is no doubt that a 290X uses much more power; it's plainly obvious, since the 980 uses less power than the 780 Ti, which uses less power than the 290X.  That being said, I find it hard to see a second 980 adding only 100 watts; that seems like a lowball figure, seeing as power usage under high stress is around ~180 watts for a single card on its own, depending on clocks and cooler.  However, it's still significantly less in the long run...
> 
> 
> 
> ...



Tonga is not a GM107-type chip, and GM107's purpose was not to give consumers a taste of things to come. There are no concept chips that give consumers a taste of things to come, as that would serve no purpose (this is not a car show where concept cars are shown off). The reason GM107 was released was to steal the discrete laptop market and to plug a weak point in the low end of Nvidia's portfolio. 

Occasionally there are test chips that trial a new process node and get released, but those chips are usually small because the process is immature.

Tonga, on the other hand, is just a bizarre chip, and its review score on this site reflected that. Tonga is simply too big to be a test chip (and no new node is being tested). It seems out of place; if Tonga were 200mm2 rather than 360mm2 and performed at the level it did, it would be a chip that showed potential in much the same way GM107 did. 

The fact that it didn't drop power consumption much, was slightly larger than its predecessor, and added only 5% more performance means it does not accomplish what Nvidia did with GM107. The only thing it did was add the AMD-specific features that Bonaire and Hawaii already had (TrueAudio, adaptive sync) at the same performance and die size as Tahiti. 

Tonga underperforms for its die size, and that is why it competes with the GTX 760 and not something higher-end that AMD could sell for more and make more money on. I think AMD knows this, and it's why there was generally less fanfare for this launch: no huge press conference, a slow trickle of reviews, and generally less excitement from fans.


----------



## Animalpak (Sep 21, 2014)

To W1zzard:

I just wanted to ask whether you noticed any coil whine coming from the reference model?

I know light coil whine is normal on powerful video cards.


----------



## GhostRyder (Sep 21, 2014)

tajoh111 said:


> Tonga is not a GM107-type chip, and GM107's purpose was not to give consumers a taste of things to come. There are no concept chips that give consumers a taste of things to come, as that would serve no purpose (this is not a car show where concept cars are shown off). The reason GM107 was released was to steal the discrete laptop market and to plug a weak point in the low end of Nvidia's portfolio.


GM107 was intended to be a taste of things to come; the best marketing around is showing a low-end version of your next-generation work early and letting people gawk at it.  It worked, too, because everyone talked for quite some time about how little power the card used and how much better it was than the previous GTX 650 Ti.



tajoh111 said:


> Tonga, on the other hand, is just a bizarre chip, and its review score on this site reflected that. Tonga is simply too big to be a test chip (and no new node is being tested). It seems out of place; if Tonga were 200mm2 rather than 360mm2 and performed at the level it did, it would be a chip that showed potential in much the same way GM107 did.


Not every review agrees with you, and it outperforms the card it replaces while consuming less power, which was the point...



tajoh111 said:


> The fact that it didn't drop power consumption much, was slightly larger than its predecessor, and added only 5% more performance means it does not accomplish what Nvidia did with GM107. The only thing it did was add the AMD-specific features that Bonaire and Hawaii already had (TrueAudio, adaptive sync) at the same performance and die size as Tahiti.


It's more than that performance-wise... However, the point was not to be the greatest thing since sliced bread but to bring more GCN 1.1-feature cards to the middle ground for future support of certain game features, biding time until the next generation is ready while also showing they are making improvements.



tajoh111 said:


> Tonga underperforms for its die size, and that is why it competes with the GTX 760 and not something higher-end that AMD could sell for more and make more money on. I think AMD knows this, and it's why there was generally less fanfare for this launch: no huge press conference, a slow trickle of reviews, and generally less excitement from fans.


Because it's the R9 285, not the R9 3XX... It got some press time, but nowhere was it billed as the newest card that would bring everyone to their knees in awe.  It does what they said it would: beat the GTX 760, consume less power than the card it replaces, and bring all the GCN 1.1 features to the middle ground.

In the end, who cares; its point is well made, and at the end of the day it's not for everyone.  It's a middle-ground card designed for 1080p ultra settings, not high-resolution ultra gaming.  The GTX 970 and 980 are next generation, and in time we will get AMD's response; until then, prices will change to reflect that.  For now this is the way things are, and the argument never changes no matter which company comes out with its next-generation card first.


----------



## JDG1980 (Sep 21, 2014)

I was curious if someone here had tested the GTX 980's power draw with a full GPGPU compute load (perhaps something like Scrypt). Tom's Hardware claims that the reference 980 draws 285 watts under this kind of load, but that can't be right, can it? Nvidia cards generally don't overshoot their TDP, certainly not by a full 100 watts (that sort of thing would get them in big trouble with OEMs if it were true). I suspect it's more likely that the reviewer was simply interpreting the readings from their shiny new oscilloscope incorrectly, but some confirmation would be nice - on one other message board I frequent, there's already a lot of FUD being spread on this subject.


----------



## HumanSmoke (Sep 21, 2014)

JDG1980 said:


> I was curious if someone here had tested the GTX 980's power draw with a full GPGPU compute load (perhaps something like Scrypt). Tom's Hardware claims that the reference 980 draws 285 watts under this kind of load, but that can't be right, can it? Nvidia cards generally don't overshoot their TDP, certainly not by a full 100 watts (that sort of thing would get them in big trouble with OEMs if it were true). I suspect it's more likely that the reviewer was simply interpreting the readings from their shiny new oscilloscope incorrectly, but some confirmation would be nice - on one other message board I frequent, there's already a lot of FUD being spread on this subject.


Until Tom's actually tells everyone what they are using for their GPGPU testing and is a little more transparent about how they arrive at their readings, it might be something to keep an eye on (if you use the card for GPGPU), but I wouldn't take it as gospel. The Beyond3D forum is discussing the same information with people better versed in electrical measurement than most, so it might pay to bookmark it.
As for it being a hot topic... as is the case when any new dominant card arrives, a certain percentage of people will be desperate to highlight any flaw it has. In this case, whether they're right or wrong, I think you'll have to wait for compute-centric (F@H, CG rendering, etc.) reviews to arrive. It seems a little strange that mining (a fairly intensive workload) doesn't peg the card above its TDP until overclocked.


----------



## The Von Matrices (Sep 21, 2014)

HumanSmoke said:


> I think you'll find that Tonga isn't Tahiti's successor (just as GM 204 isn't GK 110's successor - successors don't usually have the same ballpark performance as the chip they're replacing); it is Pitcairn/Curacao's successor. Tahiti's successor will be Bermuda (Pirate Islands). Fiji, AMD's large-die answer to GM 200, does not have a current analogue in AMD's lineup. BTW: Tonga is Volcanic Islands, not Pirate Islands. There is an overlap of architectural tweaks that crosses GPU series with AMD (Hawaii, Bonaire - Sea Islands; Curacao - Southern Islands; Tonga, Iceland(?) - Volcanic Islands; Bermuda, Fiji - Pirate Islands)


I have to wonder if the reason AMD continues to use the "groups of islands" theme is to intentionally obscure which generation each GPU belongs to.  AMD has in the past admitted to switching to names instead of numbers in order to make leaks less useful, but at least we could tell a GPU was part of the 5000 series since it had to be named after a tree, even if that information by itself provided no hint of the GPU's performance tier.  Since the 6000 series it's all been islands, and if AMD's goal is to confuse everyone, they're doing a mighty good job of it.  It's too bad they won't run out of islands any time soon.


----------



## HumanSmoke (Sep 21, 2014)

The Von Matrices said:


> I have to wonder if the reason AMD continues to use the "groups of islands" theme is to intentionally obscure which generation each GPU belongs to.  AMD has in the past admitted to switching to names instead of numbers in order to make leaks less useful, but at least we could tell a GPU was part of the 5000 series since it had to be named after a tree, even if that information by itself provided no hint of the GPU's performance tier.  Since the 6000 series it's all been islands, and if AMD's goal is to confuse everyone, they're doing a mighty good job of it.  It's too bad they won't run out of islands any time soon.


I think it stems from the fact that AMD's R&D is spread fairly thinly. They seem to have an internal roadmap of what they want to achieve with GCN, but the parts that make up the whole are evolving at different rates thanks to R&D prioritization. They went full bore on the high end to match Nvidia, but the architecture isn't that suited to be applied down the product stack in its present form. The sad indictment of this prioritization is that AMD's mobile segment is held together by Pitcairn and Cape Verde based SKUs which look likely to have to soldier on for a while yet...maybe into their third year (Feb/Mar 2015). One thing is for certain, I don't think AMD can afford to keep trimming the R&D budget.


----------



## Hayder_Master (Sep 21, 2014)

Nice, we're back to GTX 680 times.


----------



## BiggieShady (Sep 21, 2014)

HumanSmoke said:


> Depends on the utilization of both the first and second card. Any CPU limitation or lack of SLI optimization will affect overall power draw, so any power usage comparison needs to take workload into account.



Workload is already taken into account because power draw is measured with the same workload.


----------



## SIGSEGV (Sep 21, 2014)

Nice and awesome card.
Hope AMD brings the 20nm process in early 2015 with the 300 series as they promised (since they have already taped out 20nm on both APU and GPU).
As a consumer, I love competition.


----------



## JethroBodine (Sep 21, 2014)

HammerON said:


> This might be the card that allows me to move away from a dual-card setup, run at 2560x1600, and still get a decent/average 60 fps.....
> For a decent price



I've been using an EVGA Titan SC for 25x16, and I just bought an EVGA 980 SC. The numbers in this site's review at 25x16 compared to the 690 and 7990 convinced me.

(the 7990 is 7% higher, the 690 8% higher)

That level of performance from a single chip, let alone one as cool and quiet as this one, is pretty amazing. The factory-OC versions of this card should offer performance indistinguishable from the 7990/690 on one GPU, for $600 or less.

Good times to be a gamer; this is one of those pivotal moments in GPU history (e.g. 9700 Pro, 8800 GTX).


----------



## jabbadap (Sep 21, 2014)

Nice and efficient card, nicely done. It just leaves a bitter taste that GM107 did not get HDMI 2.0, the new NVENC, and HEVC/H.265 decode; buying a card with GM204 in it for an HTPC only is kind of dumb. I hope NVIDIA will release a GM207 just to catch up with the added Maxwell 2 features.



Spoiler



offtopic...



tajoh111 said:


> *snip*
> Tonga, on the other hand, is just a bizarre chip, and its review score on this site reflected that. Tonga is simply too big a chip to be a test chip (and it isn't even trialling a new node). It seems out of place; if Tonga were 200mm2 rather than 360mm2 and performed at the level it did, it would be a chip that showed potential in much the same way GM107 did.
> *snip*



While I agree on most points about Tonga being a disappointment, I think the main reason is that it is so horribly late. If AMD had managed to release Tonga shortly after Hawaii, well before GM107, under the names R9 280X and R9 280, I think it could have been a good chip at that time. Back then it would have made more sense; think about Cayman and Barts:
hd6970 -> r9-290x
hd6950 -> r9-290
hd6870 -> Tonga XT as r9-280x
hd6850 -> Tonga Pro as r9-280

...offtopic


----------



## YautjaLord (Sep 21, 2014)

Just what I thought: no need to upgrade from my GTX 760 this year or next*. But I loved how this thing performed in Crysis 3 & Wolfenstein: The New Order; why you haven't included Carma: R & Serious Sam 3 bugged me for a moment, but the inclusion of Wolfenstein in the benchmark suite fixed it for me (one of the games I'm willing to have). Thanx Wiz.

P.S. Loved the "Oh, and AMD seems f*cked" @ the end of the review. lol

P.P.S. *Definitely no need to upgrade from 2x GTX 760s this year or next either; gonna have the 2nd one by November.


----------



## frdmftr (Sep 21, 2014)

Will the real GTX 980 Please Stand Up!

More of NVIDIA charging flagship prices for a mid-range chip. Yes, the performance is good, but the reality is that in terms of chip size this card is a GTX 660/560/460.

Imagine if Intel did business the same way NVIDIA has since the GTX 600 series. The 4790K would be a $1,000 processor, and Intel's flagship processors would carry pie-in-the-sky prices just because there was no competition.

To prove my point: the GTX 680 and 660 Ti were the same chip. How can a company offer a flagship chip at such a price difference? Because even the 680 was a mid-range chip.

Until the rest of you Nvidia fanboys catch on to this, you will continue to get the second-best chip for a premium price.


----------



## JethroBodine (Sep 21, 2014)

frdmftr said:


> Will the real GTX 980 Please Stand Up!
> 
> More of NVIDIA charging flagship prices for a mid-range chip. Yes, the performance is good, but the reality is that in terms of chip size this card is a GTX 660/560/460.
> 
> ...



You may not have noticed, but NVIDIA hasn't been charging $550 for their ultra-high-end chips for a while now. While it's probably true that higher-performing Maxwell variants will follow, it's also probably true they won't cost $550.

There are two kinds of buyers:

Guys like me who buy based on price/performance in the current market.

Guys like you who say "WAIT! I think NVIDIA should be charging less for this because they're going to release some more expensive chips later!"

I'd be more surprised if NVIDIA _didn't _price this at $550:
A. 13% faster than AMDs best at 25X16
B. Either 6, or 16(!) dB lower noise than AMDs best depending which mode you run it in. (BTW- 50dB?! That's FX5800 Ultra Dustbuster level)
C. 110W less peak power consumption/heat dumped in case
D. Better multi GPU
E. Better 3D solution
F. Better feature set (AA, etc)

The cheapest 290Xs on Newegg today are $489; 980s would be worth the extra $60 on point A alone.

AMD may well release a water-cooled, overclocked card, but my guess is it will be either HIGHLY binned or close to 300W. I'm expecting a 9590-like product personally (where they do the OCing and charge a premium for giving you the water cooler).
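For context on the noise figures in point B, decibels are logarithmic, so those deltas are larger than they look at first glance. A quick sketch of the implied sound-power ratios, using nothing but the standard 10·log10 dB conversion (the 6 and 16 dB deltas are the ones quoted above):

```python
def power_ratio(delta_db: float) -> float:
    """Sound-power ratio implied by a dB difference (10 dB = 10x the power)."""
    return 10 ** (delta_db / 10)

# The 6 dB and 16 dB deltas cited above:
for delta in (6, 16):
    print(f"{delta} dB lower -> {power_ratio(delta):.1f}x less sound power")
```

So a 16 dB gap is nearly a 40x difference in radiated sound power, which is why it reads as dramatically quieter rather than incrementally so.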


----------



## DaedalusHelios (Sep 21, 2014)

Animalpak said:


> To Wizzard :
> 
> I just wanted to ask if you noticed on the refernce model some coil whine coming from the card ?
> 
> I know a light coil whine on powerful videocards are normality.



Did anyone check in any reviews for possible coil whine? I am curious, despite having one on the way. When mine arrives, what is the best way to get coil whine to present itself? Isn't the best way to have it run an older game at super-high FPS? I can't remember.


----------



## Sony Xperia S (Sep 21, 2014)

HumanSmoke said:


> I'm OK then. I never use the word f***, and seemingly you don't either
> 
> I promise to only use "fuck" and "shit" and not those weird "f***" and "s&!*" notations. Two thumbs up Sony, F*** and s&!* is for pussies!



I highly recommend you go and get your health checked at a hospital!

The reason I wrote it was that I realised how stupid the conversation had become and that it should be improved (and I didn't intend to insult that guy by any means; it was just dirty but innocent wording)!

Why are you so annoyingly arrogant and sarcastic?


----------



## HumanSmoke (Sep 21, 2014)

BiggieShady said:


> Workload is already taken into account because power draw is measured with the same workload.
> 
> 
> HumanSmoke said:
> ...


Not sure what you're talking about here. GPUs load balance in SLI/CFX - driver profile, vRAM and/or CPU limitation, vSync, and app coding will all affect usage (as will GPUs with differing clock/dynamic boost rates). Just because a game can peg GPU usage to near 100%...


Spoiler












...doesn't mean that adding a second card will automatically mean that both are running at 100% in the same circumstances. Same system, same application with a second card added....< 70% GPU usage


Spoiler











And of course, depending upon the same app/driver coding and efficiency, vRAM and GPU requirement, not all applications are created equal







Sony Xperia S said:


> I highly recommend you go and get your health checked at a hospital!


Oh, Sonny I didn't know you cared! I'll pass though, my super-hypocrite-sense is tingling, so I think I'm good to go.


Sony Xperia S said:


> (and I didn't intend to insult that guy by any means; it was just dirty but innocent wording)!


Well, the poster (the54thvoid) you aimed it at certainly didn't see it that way, either then or now (judging by the fact he thanked my post and clarified his thoughts just afterwards).


Sony Xperia S said:


> Why are you so annoyingly arrogant and sarcastic?


Check my sig - can't say you weren't warned. Maybe if you applied the same rules to yourself that you are trying to impress upon others, and showed some degree of argumentative consistency, we could put it all behind us and be friends! You have a nice day now.


----------



## frdmftr (Sep 21, 2014)

JethroBodine said:


> You may not have noticed, but NVIDIA hasn't been charging $550 for their ultra-high-end chips for a while now. While it's probably true that higher-performing Maxwell variants will follow, it's also probably true they won't cost $550.
> 
> There are two kinds of buyers:
> 
> ...




I did not dispute the performance of the card, just stating that we are getting half of a chip for the price of a flagship. Based on your reasoning, anyone who has an Intel 4790 should be sending in extra cash to Intel to make up for the price/performance ratio.

Since the 600 series, all Nvidia has been doing is protecting its product stack.

Case in point: the GTX 780 could have been released long ago, but it wasn't because they didn't feel they had to. When they finally did, it came in at $650, only to be slashed to $500 a month later. What about all those who bought at $650?

I'm just sick of Nvidia giving us "good enough". Had the Titan been $650, or had the 780 been released with the Titan, would AMD have come to market with the 290 series sooner? I don't know; what I do know is it would have put more pressure on AMD to come to market with a competing product. Had that been the case, everyone wins: faster product turnover, more reason to upgrade, and 4K becoming available on a mass-market scale. Not to mention more GPU resources for game developers to give us richer, more detailed games.

Basically, this "good enough" attitude from Nvidia is holding up progress on many fronts. It would be nice for Nvidia to give us their best from the get-go, something they haven't done since Fermi.

Like I said, if Intel did business this way, the most powerful CPU available would be the 4790K.


----------



## HumanSmoke (Sep 22, 2014)

frdmftr said:


> Case in point: the GTX 780 could have been released long ago, but it wasn't because they didn't feel they had to. When they finally did, it came in at $650, only to be slashed to $500 a month later. What about all those who bought at $650?


That's pretty much the way of the world with high-end cards - you should also check your facts: the GTX 780 price cut came *5 months* after its launch, not one month. Care to work out the likely depreciation rate on the 290X? Prices are starting to fall rather rapidly - a card that sold for ~$500 a month ago can be had for 20% less now. That price is an outlier, but I'm guessing that AMD won't be selling too many cards at the current revised MSRP at the moment. The HD 7970 received a 10% price cut a few weeks after it launched thanks to the GTX 680's arrival, and the R9 280, launched at $280, has now cratered (~$200, or nearly a 30% price cut) thanks to AMD EOL'ing the card after a three-month lifespan to make way for the R9 285. BTW: that 30% price cut actually exceeds the GTX 780's.


frdmftr said:


> I'm just sick of Nvidia giving us "good enough". Had the Titan been $650, or had the 780 been released with the Titan, would AMD have come to market with the 290 series sooner?


No they wouldn't have. Chip design takes years.


frdmftr said:


> I don't know; what I do know is it would have put more pressure on AMD to come to market with a competing product.


Really? If all it takes is a competing product to get AMD to shift gears, why haven't they updated their server/enthusiast desktop platform? R&D is very much a finite commodity for AMD; don't expect them to stay on the pace when they're fighting a three-front war (x86, ARM, discrete graphics).


> Basically, this "good enough" attitude from Nvidia is holding up progress on many fronts. It would be nice for Nvidia to give us their best from the get-go, something they haven't done since Fermi.


Odd viewpoint, bearing in mind that both IHVs are dependent upon TSMC's manufacturing process, and both maximize ROI by respinning product (GTX 680 -> GTX 770 and HD 7970 -> HD 7970 GHz Edition -> R9 280X, for example).


----------



## JethroBodine (Sep 22, 2014)

Hard to really question NVIDIA's business methods.

They currently have a market cap of over $10B, and their only competitor has a market cap under $3B (and a big chunk of that is the CPU business).

I see a lot of guys on forums saying "Darn you NVIDIA! Give us your best products for $500 as soon as you can get them out the door!".

I sure wouldn't. If I were in charge at NVIDIA and currently had chips 5X more powerful than what we see here, I'd bleed them out just fast enough to keep stomping on AMD and bringing in the high profit quarters. NVDA answers to their board of directors/stockholders, not gamer whim.


----------



## frdmftr (Sep 22, 2014)

HumanSmoke said:


> That's pretty much the way of the world with high-end cards - you should also check your facts: the GTX 780 price cut came *5 months* after its launch, not one month. Care to work out the likely depreciation rate on the 290X? Prices are starting to fall rather rapidly - a card that sold for ~$500 a month ago can be had for 20% less now. That price is an outlier, but I'm guessing that AMD won't be selling too many cards at the current revised MSRP at the moment. The HD 7970 received a 10% price cut a few weeks after it launched thanks to the GTX 680's arrival, and the R9 280, launched at $280, has now cratered (~$200, or nearly a 30% price cut) thanks to AMD EOL'ing the card after a three-month lifespan to make way for the R9 285. BTW: that 30% price cut actually exceeds the GTX 780's.
> 
> No they wouldn't have. Chip design takes years.
> 
> ...




Product progression is one thing; 480 to 580 would be another example, or how about the 8800 GT and its progression. In those examples we still got the full chip, with AMD/Nvidia putting their best chips to market. Had AMD not had Hawaii, we would never have seen the 780 Ti, the 780 would still be $650, and Maxwell would still be in the pipeline waiting for 20nm.
Not to mention Nvidia has discontinued the 770/780 and 780 Ti. The 780/780 Ti I understand, but since a 770 and a 760 cost the same to produce, why not keep the 770 with a price cut?
Bottom line is consumers drive the market, and as long as people are willing to rush out and spend flagship money on a mid-range chip, it is us, the consumers, who will continue to be shortchanged.
Again, if Intel did business this way the 4790K would be the $1k flagship. Kudos to Intel for offering the consumer the best chip regardless of what the competition is doing.
Instead of defending Nvidia, maybe you should defend your pocketbook.


----------



## thomashrb (Sep 22, 2014)

Thank you @W1zzard for a really good review. To me, though, the performance-per-dollar analysis is incomplete. Due to display restrictions I play at 1080p.

At 1080p the GTX 980 has a 19% performance advantage and a 138 W power advantage over the R9 290X. According to a survey of 34M gamers conducted in May this year, as reported by VentureBeat, the average hardcore gamer plays for 22 hours/week. If you live in NY, that 138 W translates to $26.70/year. The current prices for the GTX 980 and the R9 290X are $550 and $500 respectively. Most manufacturers will give you at least a 2-year warranty, and usually a 3-year warranty on their top-end cards, making the total cost of ownership for the lifespan of the card equal to the capex + (opex for 3 years).

For 22 hours a week, with the R9 290X consuming 138 W more than the GTX 980, that comes to an extra 157 kWh/year. In NY that amounts to $26.70 in additional power bills; over 3 years that comes to $80.10, making the GTX 980 approximately $30 cheaper to own over the lifespan of the card. In fact, even the areas with the cheapest power in the continental US would pay $47.50 extra. So the question is: who in their right mind would be willing to save (in the very best possible case) $2.50 and buy a device that has a 19% performance disadvantage? Simple: those who have not been properly informed of the total cost of ownership.

I really enjoyed the article, and especially the extra effort taken to really get into some useful usage metrics. But I did find the Performance per Dollar section to be an incomplete analysis and borderline misleading (even though this oversight is obviously not intentional).
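The total-cost-of-ownership arithmetic above can be sketched in a few lines. The 138 W delta, 22 h/week, and the ~$0.17/kWh NY electricity rate are the post's own figures (assumptions, not independent measurements), and the function names are purely illustrative:

```python
# Back-of-envelope TCO comparison for the GTX 980 vs. R9 290X at 1080p,
# using the figures quoted in the post above.

def extra_energy_kwh(delta_watts: float, hours_per_week: float, years: int) -> float:
    """Extra energy the hungrier card burns over the warranty period."""
    return delta_watts / 1000 * hours_per_week * 52 * years

def cost_of_ownership(price: float, delta_watts: float, hours_per_week: float,
                      years: int, usd_per_kwh: float) -> float:
    """Purchase price plus the electricity premium relative to the efficient card."""
    return price + extra_energy_kwh(delta_watts, hours_per_week, years) * usd_per_kwh

YEARS, HOURS, NY_RATE = 3, 22, 0.17  # warranty span, weekly play time, approx. $/kWh

gtx_980 = cost_of_ownership(550, 0, HOURS, YEARS, NY_RATE)    # efficiency baseline
r9_290x = cost_of_ownership(500, 138, HOURS, YEARS, NY_RATE)  # +138 W at load

print(f"R9 290X extra energy: {extra_energy_kwh(138, HOURS, 1):.0f} kWh/year")
print(f"3-year cost: GTX 980 ${gtx_980:.2f} vs R9 290X ${r9_290x:.2f}")
```

At roughly $0.17/kWh this puts the 290X about $30 more expensive over three years despite the $50 lower sticker price, matching the post's conclusion.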


----------



## JethroBodine (Sep 22, 2014)

frdmftr said:


> Product progression is one thing; 480 to 580 would be another example, or how about the 8800 GT and its progression. In those examples we still got the full chip, with AMD/Nvidia putting their best chips to market. Had AMD not had Hawaii, we would never have seen the 780 Ti, the 780 would still be $650, and Maxwell would still be in the pipeline waiting for 20nm.
> Not to mention Nvidia has discontinued the 770/780 and 780 Ti. The 780/780 Ti I understand, but since a 770 and a 760 cost the same to produce, why not keep the 770 with a price cut?
> Bottom line is consumers drive the market, and as long as people are willing to rush out and spend flagship money on a mid-range chip, it is us, the consumers, who will continue to be shortchanged.
> Again, if Intel did business this way the 4790K would be the $1k flagship. Kudos to Intel for offering the consumer the best chip regardless of what the competition is doing.
> Instead of defending Nvidia, maybe you should defend your pocketbook.



So what position do you hold at NVIDIA?

The reason I ask is the only way you could actually _know_ any of what you allege is if you worked in NVIDIA engineering or management.

If you don't, everything you have postulated is nothing more than conspiracy theory and speculation.


----------



## Champ (Sep 22, 2014)




----------



## DaedalusHelios (Sep 22, 2014)

frdmftr said:


> Product progression is one thing; 480 to 580 would be another example, or how about the 8800 GT and its progression. In those examples we still got the full chip, with AMD/Nvidia putting their best chips to market. Had AMD not had Hawaii, we would never have seen the 780 Ti, the 780 would still be $650, and Maxwell would still be in the pipeline waiting for 20nm.
> Not to mention Nvidia has discontinued the 770/780 and 780 Ti. The 780/780 Ti I understand, but since a 770 and a 760 cost the same to produce, why not keep the 770 with a price cut?
> Bottom line is consumers drive the market, and as long as people are willing to rush out and spend flagship money on a mid-range chip, it is us, the consumers, who will continue to be shortchanged.
> Again, if Intel did business this way the 4790K would be the $1k flagship. Kudos to Intel for offering the consumer the best chip regardless of what the competition is doing.
> Instead of defending Nvidia, maybe you should defend your pocketbook.



Pricing is set by trying to maximize revenue and dominate the competition (gain market share) with differentiated offerings. Intel, AMD, and Nvidia are just out to make their stockholders the most money; anything else is a figment of your imagination or a projection of your own emotions into your analysis. Companies are indifferent to how you feel about their pricing, and public opinion is fickle for the most part. Upgrading from my GTX 680 to the GTX 980 was driven by perceived obsolescence: the GTX 680 suited me just fine, but after two generations arrived I felt it was time to buy the best that would fit into my budget. Nvidia gave me a reason to upgrade with the improved efficiency plus better performance at a price that was affordable to me. I was hoping it would offer better performance, and it stayed in the same power envelope as my GTX 680 while being noticeably better. That was reason enough.

Intel is losing market share to ARM devices, and that is what pushes them to innovate. ARM could catch up to the x86 architecture if Intel doesn't continue innovating. Intel has enough money to develop ARM processors, but AMD, in its current state, cannot afford what it would take to compete in the ARM market. ARM might actually be the eventual death of AMD. ATI would have done better separate from AMD, if you ask me.



Champ said:


>



Intel Leap ahead?


----------



## frdmftr (Sep 22, 2014)

JethroBodine said:


> So what position do you hold at NVIDIA?
> 
> The reason I ask is the only way you could actually _know_ any of what you allege is if you worked in NVIDIA engineering or management.
> 
> If you don't, everything you have postulated is nothing more than conspiracy theory and speculation.



NVIDIA MID-RANGE CHIPS/CARDS
GTX 460 = GF 104
GTX 560 = GF 114
GTX 680/660 Ti/770 = GK 104
GTX 980 = GM 204

NVIDIA FLAGSHIP CHIPS/CARDS
GTX 480 = GF 100
GTX 580 = GF 110
GTX 780/780 Ti/TITAN = GK 110

Notice the pattern? Also check the size of the die.

The GTX 980 is a mid-range chip; that is a fact!


----------



## DaedalusHelios (Sep 22, 2014)

frdmftr said:


> NVIDIA MID-RANGE CHIPS/CARDS
> GTX 460 = GF 104
> GTX 560 = GF 114
> GTX 680/660 Ti/770 = GK 104
> ...



Is there a portion of the chip that is not being used? I agree there will most likely be a 980 Ti; I don't know what timetable it is on, though. Will there be a Titan and Titan Black successor? idk

I don't know if anybody here can say the GTX 980 is a midrange card as a "fact" unless Nvidia confirms it with you, though. Or are you saying you consider it a midrange card and that is your opinion? Fact and opinion are not the same thing. Maybe you have information you are breaking NDA on?

Why do you consider the GTX 680 a midrange card? It filled the "high end" single-GPU spot in Nvidia's GTX 600 series product lineup.


----------



## HumanSmoke (Sep 22, 2014)

frdmftr said:


> Product progression is one thing. 480 to 580 would be another example. Or how about the 8800GT and its progression. In these examples we still got the full chip and amd/nvidia putting the best chip to market. Had amd not had Hawaii we would have never seen the 780ti and the 780 would be still $650 and Maxwell would still be in the pipeline waiting for 20nm.


Unlikely in the extreme. Nvidia at the very least have an internal cadence they need to continue. HPC and workstation - Nvidia's large dies do double duty, and the largest die is primarily a compute chip. Secondly, the same architecture is now applied to SoCs - so that's reason two to keep arch cadence, as are OEM requirements - OEMs make up the largest buyers of discrete graphics chips, and no new bullet points means people don't upgrade - you can only fool some of the people some of the time with rebrands.....and the primary reason? Shareholders.
Using your reasoning, Intel should have shelved Ivy Bridge/-E, Haswell/-E, and Broadwell...and Skylake for all the competition AMD offer.


frdmftr said:


> Not to mention nvidia has discontiuned the 770/780 and 780ti. The 780/780ti I understand. But since a 770 and a 760 cost the same to produce why not keep the 770 with a price cut?


Because all of them use 28nm wafers, which are a finite commodity as far as allocation goes. Why would you spend a second thought on producing old 294mm² GK 104 chips when you need wafers to produce GM 206 and GM 107? Likewise, 551mm² GK 110s make the same zero sense when the company needs to allocate wafers for 398mm² GM 204 and GM 200 production.



frdmftr said:


> Again, if Intel did business this way the 4790k would be the $1k flagship. Kudos to Intel for offering the consumer the best chip regardless of what the competition is doing.


Some people might say that that is rubbish. Intel innovate just enough to keep people upgrading. You really think the "best chip regardless" philosophy covers decreasing the PCI-E lanes available to the 5930K? Or deliberately keeping separation between mainstream platforms and HEDT? The 4790K is the price it is because of the product stack above it, and the people who buy $1K processors aren't by any means the majority of users.


frdmftr said:


> Instead of defending Nvidia, maybe you should defend your pocketbook.


Sorry if the logic of the situation and business in general is unpalatable to you....
Meanwhile, a $339 GTX 970 basically equals the gaming performance of a $450 R9 290X - does that class as defending my pocketbook?


DaedalusHelios said:


> I don't know if anybody here can say the GTX 980 is a midrange card as a "fact" unless Nvidia confirms it with you, though.


Within Nvidia's GPU hierarchy, GM 204 is indeed a mid-range (second-tier) chip; GM 200 (or GM 210, as some people are calling it) is the high end. What frdmftr isn't considering is that Tahiti's successor (Bermuda) will also be a mid-range chip, since Fiji will be their large-die compute chip.
What frdmftr is highlighting about Nvidia's lineup is exactly what AMD are now in the process of doing. Fiji has no analogue in AMD's current lineup. The big chip of each (GM 200 and Fiji) will be heavily compute orientated - IIRC both are likely to feature 1:2 FP64 - while the mid-range (and lower) chips, starting with GM 204 and Bermuda (the chip associated with the R9 390X), feature die space area-ruled for gaming workloads. frdmftr can pillory Nvidia to his heart's content, but the fact remains that AMD are following their example, because bifurcating the product line is the most effective use of expensive silicon production.


----------



## frdmftr (Sep 22, 2014)

HumanSmoke said:


> Unlikely in the extreme. Nvidia at the very least have an internal cadence they need to continue. HPC and workstation - Nvidia's large dies do double duty, and the largest die is primarily a compute chip. Secondly, the same architecture is now applied to SoCs - so that's reason two to keep arch cadence, as are OEM requirements - OEMs make up the largest buyers of discrete graphics chips, and no new bullet points means people don't upgrade - you can only fool some of the people some of the time with rebrands.....and the primary reason? Shareholders.
> Using your reasoning, Intel should have shelved Ivy Bridge/-E, Haswell/-E, and Broadwell...and Skylake for all the competition AMD offer.
> 
> Because all of them use 28nm wafers, which are a finite commodity as far as allocation goes. Why would you spend a second thought on producing old 294mm² GK 104 chips when you need wafers to produce GM 206 and GM 107? Likewise, 551mm² GK 110s make the same zero sense when the company needs to allocate wafers for 398mm² GM 204 and GM 200 production.
> ...




The 760 and the 770 use the exact same chip; Nvidia disables a portion of the chip to make the 760. That being said, why keep a $220 GTX 760 and not go to $250 on the 770? Again, just Nvidia protecting its product stack.

I am not a socialist and do not think Nvidia should be a charity. However, how long had the Titan been around before we saw the 780? Even the Titan remained unchanged until after the 290X. They were shipping the chips anyway; once yields improved, they could have fully enabled the Titan long ago. Instead they disabled a portion of the chip?

No argument here about the 970. Again, a little spendy for a second-tier chip, but it really delivers on the performance.

As I have been saying all along, the GM 204 is a mid-range chip, and at $550 that is a top-tier price.


----------



## HumanSmoke (Sep 22, 2014)

frdmftr said:


> The 760 and the 770 use the exact same chip; Nvidia disables a portion of the chip to make the 760. That being said, why keep a $220 GTX 760 and not go to $250 on the 770? Again, just Nvidia protecting its product stack.


Honestly, I don't know how to make it any clearer. Nvidia is drawing a line under Kepler (GK 104, GK 110) production in favour of Maxwell production (GM 107, GM 204, GM 200), where wafer starts are a limited commodity. Why would Nvidia continue producing a $250 770 when they have a stack of salvage GM 204s ready to go as the GTX 960 next month? You think that a $250 GTX 770 makes more sense than a likely faster GTX 960 at the same (or lower) price point?


frdmftr said:


> I am not a socialist and do not think Nvidia should be a charity. However, how long had the Titan been around before we saw the 780?


Different argument entirely. The Titan -> GTX 780 -> 780 Ti -> Titan Black is generally accepted as marketing strategy - no question, and I wouldn't argue otherwise.


frdmftr said:


> Even the Titan remained unchanged until after the 290X. They were shipping the chips anyway; once yields improved, they could have fully enabled the Titan long ago.


That smells suspiciously like your opinion dressed up as fact- and unfortunately for you it is wrong. The fact is that the fully enabled GTX 780 Ti and Titan Black are both revised B1 silicon. Titan and the GTX 780 - not fully enabled - are A1 silicon. Easy reference: B1 silicon is specced for 7GHz (effective) memory, A1 is specced for 6GHz.


----------



## JethroBodine (Sep 22, 2014)

frdmftr said:


> As I have been saying all along, the GM 204 is a mid-range chip, and at $550 that is a top-tier price.



You're working really hard to make it look like NVIDIA is somehow "cheating" people by giving us the top single-GPU performance for $550 and beating their competitor's $500 card on all fronts for $339. DARN them for selling people products that are superior on all fronts for $160 less!

On the rest of your theories:

Yes, the GTX 980 may well not be the largest, most powerful chip possible on the Maxwell arch. That doesn't mean larger Maxwell chips are done yet, or that they could even be produced at 28 nm.

So for right now, the GTX 980 "is" the high end of PC gaming, because there's no other single GPU that competes. (And AMD/NVIDIA is starting to look a lot like AMD/Intel - just a beat-down, even on the same process node - pretty amazing.)

Also, while $550 may be a "top card" price for a second-tier vendor like AMD, NVIDIA can (and does) charge more than that for their top card with fair regularity. Their "big" Maxwell chip will likely cost $1000.


----------



## pr0fessor (Sep 22, 2014)

Buying two of these 970 cards would be my style, because of the big 4 GB RAM size and the cheaper price compared to the 980. I like a dual-card setup, and the low power consumption means lower noise. NVIDIA wants me to buy. But then again, I'll wait for AMD's turn and buy some similar cards for $100 less. Or I may even skip this generation, since there's no need.

What if AMD fails with its next cards? Then you should buy these cards now, before they get even more expensive. Nice cards anyway.


----------



## N3M3515 (Sep 22, 2014)

JethroBodine said:


> (and AMD/NVIDIA is starting to look a lot like AMD/intel- just a beat down, even on the same process node- pretty amazing)



For as long as I can remember GPUs, they have ALWAYS been trading blows. And maybe I'm understanding people the wrong way here - they seem so excited about Nvidia beating AMD or the other way around - but don't you people see that if one side single-handedly beats the other out of competition, WE are the ones SCREWED!?


----------



## JethroBodine (Sep 22, 2014)

N3M3515 said:


> For as long as I can remember GPUs, they have ALWAYS been trading blows. And maybe I'm understanding people the wrong way here - they seem so excited about Nvidia beating AMD or the other way around - but don't you people see that if one side single-handedly beats the other out of competition, WE are the ones SCREWED!?


Personally I'm just excited to see products this good this cheap.

The GK series chips were great, but fairly costly at first (pre-GTX 780 launch).

The R9 290X launch was a face-plant. Sure, it competed, but the OEM design was a hairdryer.

I disagree that one company winning harms us.

Intel has had no competition for many years, but they still have to sell us better parts at a price we'll pay to keep the doors open.

Unless you think they're saying "Gee fellas, if we DON'T sell our bada$$ chips for $300, people will buy FX-8350s!"? LOL - like that would happen.


----------



## frdmftr (Sep 23, 2014)

JethroBodine said:


> You're working really hard to make it look like NVIDIA is somehow "cheating" people by giving us the top single-GPU performance for $550 and beating their competitor's $500 card on all fronts for $339. DARN them for selling people products that are superior on all fronts for $160 less!
> 
> On the rest of your theories:
> 
> ...




The cost difference comes into play based on die size. The size of the die used in the 980 and the fact that it has a 204 at the end tell us the simple FACT that this is a mid-range, second-tier chip, period. Now, this chip was supposed to have been on the 20nm process, and since they went with 28nm, the die is a little bigger than the GK 104 it replaced, but not the size of the GK 110. The 110 is the flagship chip. The 980 will cost less to produce than the 780, yet is priced higher.
Yes, it is impressive for its size, but it is not the full monty.

Everyone wants to bring the 970 into the conversation. That is not what we are talking about; I am specifically referring to the 980. You can talk about price/performance all you want. Anyone can make a Kia as fast as a BMW; that doesn't make it worth the price of a BMW. The 204 chip is economy class; technical advancements alone do not make it worth luxury prices. Think about that next time you buy a TV - year after year they get cheaper and bigger. And since Kepler, Nvidia has been offering less silicon for more and more dollars.


----------



## HumanSmoke (Sep 23, 2014)

JethroBodine said:


> I disagree on the one company winning harms us.
> Intel has had no competition for many years, but they have to sell us better parts at a price we'll pay to keep the doors open.
> Unless you think they're saying "Gee fellas if we DON'T sell our bada$$ chips for $300, people will buy FX-8350s!"? LOL- like that would happen.


A different situation with graphics, since hardware, features, and end products (games and apps) have a greater interdependency.
If Nvidia were left as the sole discrete GPU supplier, OpenCL would suffer and game developers would likely become more dependent upon the company. If AMD were the sole GPU supplier, they would immediately divert funds/R&D from their graphics program to bolster the processor division, as is already apparent.
Competition between architectures and gaming programs keeps innovation and development at a somewhat reasonable pace compared to what we would have without it.


----------



## JethroBodine (Sep 23, 2014)

frdmftr said:


> The 204 chip is economy class; technical advancements alone do not make it worth luxury prices.



Pretty harsh assessment, frdmftr.

As you think the 980 at $550 is an "economy class chip" and a "Kia", what do you think of the 290X?

High school science project? Tobacco Road prop?

After all, the 980 pretty much embarrasses the 290X on every relevant performance, engineering, and efficiency metric.

AMD has become synonymous with slow, hot, loud, and inefficient in all markets they compete in. Not having any money in the tech world is like having one foot in the grave, the other sliding off the edge, and the rain starting to fall....hard.


----------



## HumanSmoke (Sep 23, 2014)

frdmftr said:


> The 980 will cost less to produce than the 780, yet is priced higher.


Oh, FFS, the GTX 980 is a fully functional GPU. *The GTX 780 is a third-tier salvage part.* Why don't you compare pricing for fully functional GPU vs. fully functional GPU? The top GM 204 part is the GTX 980; the top single GK 110 part is the GTX Titan Black (which hasn't been EOL'd).


frdmftr said:


> And since Kepler Nvidia has been offering less silicon for more and more dollars.


Paraphrase: _GM 204 is bigger than GK 104, but I won't compare any further back than the last GPU because the argument doesn't hold up._ You just admitted:


frdmftr said:


> The size of the die used in the 980 and the fact that it has a 204 at the end tell us the simple FACT that this is a mid-range, second-tier chip, period.


Yet you continue to base pricing across two distinct segments in the hierarchy. Die sizes for the second-tier GPU have been rising as a general trend - picking out a relative outlier in GK 104 says more about your motivation than it does about process technology (G92 (324mm²) -> GF 104/GF 114 (367mm²/358mm²) -> GK 104 (294mm²) -> GM 204 (398mm²)). The same is true, unsurprisingly, for AMD at the same performance level as their Nvidia counterparts.


----------



## frdmftr (Sep 23, 2014)

HumanSmoke said:


> Oh, ffs, the GTX 980 is a fully functional GPU. *The GTX 780 is a third tier salvage part*.   Why don't you compare pricing for fully functional GPU vs. fully functional GPU - the top GM 204 part is the GTX 980, the top single GK 110 part is the GTX Titan Black (which hasn't been EOL'd)
> 
> Paraphrase : _GM 204 is bigger than GK 104 but I won't compare any further back than the last GPU because the argument doesn't hold up._ You just admitted:
> 
> Yet continue to base pricing across two distinct segments in the hierarchy. Die sizes for the second tier GPU have been rising as a general trend - picking out a relative outlier in GK 104 says more about your motivation than it does about process technology  (G92 (324mm²) -> GF 104/GF114 (367mm²/358mm²) -> GK 104 (294mm²) -> GM 204 (398mm²) ) . The same is true for AMD unsurprisingly at the same performance level of their Nvidia counterparts.




Actually, it does hold up going all the way back to the 8800 Ultra. Even that chip is bigger than the GM 204 in the 980. The GM 204 may be a fully enabled chip, but as I'm sure you know, each chip comes from a wafer. Fully enabled or not, the FACT remains that the GM 204 takes up less real estate on said wafer, which means they will get more chips per wafer compared to the GK 110!! Just about every company I can think of will base part of their cost on how many chips they get per wafer - well, except Nvidia, apparently.

I paid the same for my S2 as my Note 3, my 60in TV cost less in 2014 than my 40in did in 2010, and motherboards, RAM, and CPUs are all about the same price. By no means am I here to defend AMD. I'm just pointing out that of all the things I just listed, Nvidia is the only company giving me less of something for more $$$$.


----------



## RealNeil (Sep 23, 2014)

Good review W1zzard, of a really good performing card. I like the price point it has too. This can only mean lower prices in the GPU market and ~that's~ good news for us consumers.


----------



## HumanSmoke (Sep 23, 2014)

frdmftr said:


> Actually, it does hold up going all the way back to the 8800 Ultra. Even that chip is bigger than the GM 204 in the 980.


Nice trolling. You spend an entire thread prattling on about the GM 204 being a second-tier chip; then, when the comparison to other second-tier chips is made, you switch to comparing it to the top GPU in the stack.


frdmftr said:


> The GM 204 may be a fully enabled chip, but as I'm sure you know, each chip comes from a wafer. Fully enabled or not, the FACT remains that the GM 204 takes up less real estate on said wafer, which means they will get more chips per wafer compared to the GK 110!!


Ooooh well done! Now work out the next part!
A wafer contains a number of GPUs of varying functionality.
GM 204 harvest:
GTX 980 ($549)
GTX 970 ($329)
GTX 960 (~$250)

GK 110 harvest (consumer)
GTX Titan Black ($999)
GTX Titan  ($999)
GTX 780 Ti ($649)
GTX 780 ($649 -> $500)

Now, why don't you work out some averages and tell me how chips from a GM 204 wafer are more expensive than those from a GK 110 wafer? You do know how to calculate yields from a 12" wafer, I take it, since you're lecturing me with fancy terms such as "wafer".

IF a wafer contained ALL fully functional dies, then your argument would hold up. THEY DON'T, so YOURS DOESN'T.
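For the sake of argument, here's a rough back-of-the-envelope sketch of the die-count side of this, using the common first-order dies-per-wafer approximation. The 398 mm² GM 204 figure is from this thread; GK 110's ~561 mm² is the commonly cited size and is my assumption, as is ignoring scribe lines and defect yield entirely:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """First-order gross die estimate: wafer area / die area, minus an
    edge-loss term. Ignores scribe lines and defect yield."""
    radius = wafer_diameter_mm / 2
    gross = math.pi * radius ** 2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

gm204 = dies_per_wafer(398)  # GM 204 die size quoted in this thread
gk110 = dies_per_wafer(561)  # GK 110 size commonly cited elsewhere (assumption)
print(f"GM 204 die candidates per 300 mm wafer: ~{gm204}")
print(f"GK 110 die candidates per 300 mm wafer: ~{gk110}")
```

More candidates per wafer does not by itself settle the cost argument - what each die sells for after binning (the "averages" above) matters just as much.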


frdmftr said:


> I paid the same for my S2 as my


I'm going to cut you off right there, since I don't care about your shopping.


frdmftr said:


> Of all the things I just listed, Nvidia is the only company giving me less of something for more $$$$.


Um, well, don't buy it. Making flawed comparisons to justify your position just makes it sound like whining. You have seven posts in this thread, and the fact that I could basically deconstruct your argument by just quoting your own words back to you should tell you something about how solid it is.
Why rage on about a card you aren't interested in? The GTX 970 is the star performer of this launch - and even you've admitted that it is a good card - yet out of the FOUR GM 204 reviews posted, THREE of them concern the GTX 970 and you haven't posted a single comment on any of them. Agenda much?


----------



## DaedalusHelios (Sep 23, 2014)

frdmftr said:


> Just pointing out that of all the things I just listed Nvidia is the only company that is giving me less of something for more $$$$.



What matters to most consumers is performance per dollar, not how much wafer they get. I have never bought a card based on the size of its wafer. The power efficiency is an important perk, though.

Does anyone know someone who buys hardware for the size of its chip rather than performance or value? I am starting to wonder what you are arguing for. That Nvidia shouldn't succeed at doing more with less wafer and profiting? I want AMD to recover, but they seem behind in every way.


----------



## radrok (Sep 23, 2014)

Thanks for the awesome review, always a pleasure reading TPU articles.

Seems like my Titan SLI has been an awesome investment considering how well it is still holding up, especially with my hefty overclock, which brings performance on par with, if not superior to, this GPU.

Waiting for big Maxwell to upgrade


----------



## JethroBodine (Sep 23, 2014)

DaedalusHelios said:


> What matters to most consumers is performance per dollar and not how much wafer they get. I have never bought a card based on the size of its wafer. The power efficiency is an important perk though.



Maybe there will be a new review category - "wafer per dollar".

Or maybe no one really cares about wafer size at all.

Personally, I don't care if AMD or NVIDIA figure out how to get their performance off parts with 3 transistors that cost them fifty cents to make. If the part offers more performance than the old high end, uses less power, and gives off less heat, it's worth the money to me.

Some seem to be arguing against NVIDIA finding a way to maintain their margins and profitability. What would they prefer - for NVIDIA to be going broke like AMD? That _can't_ be in the best interest of gaming.

With $330 GTX 970s outperforming $500 290Xs at better power draw and thermals, one could say that AMD is now robbing consumers: giving them less desirable performance everywhere that matters for 50% more cash. Getting more wafer for that $170 wouldn't be a key purchase factor for me.


----------



## N3M3515 (Sep 23, 2014)

JethroBodine said:


> Personally I'm just excited to see products this good this cheap.
> 
> The GK series chips were great, but fairly costly at first. (pre-GTX780 launch)
> 
> ...



I knew you were going to pull out that Intel stuff. You see, since Intel got the Core arch out, they have been incredibly slow to increase performance - so slow that I have a 4-year-old CPU and it is still going strong, such that I won't need to replace it for 2 more years... Competition is what brings archs like Nehalem over NetBurst.


----------



## RealNeil (Sep 23, 2014)

DaedalusHelios said:


> I have never bought a card based on the size of its wafer.



Me neither... I just buy the most performance and features that I can afford at the time I'm buying a GPU. I read reviews from about four tech sites that let me know if a part has any problems, and I rule out some GPUs based on that.
I'm still new to TPU, but I like the reviews here so far.

My two GTX 570s were great when I bought them years ago. My two GTX 680s were even better, and together they still deliver good performance.
I bought two R9 280X OC GPUs and never looked back. They're pretty good cards, and worth every penny spent.

My latest buy is a single 4GB GTX 760, and that's going into an M-ITX build with room for just one GPU. These new NVIDIA offerings are something to look forward to.


----------



## JethroBodine (Sep 23, 2014)

N3M3515 said:


> I knew you were going to pull out that Intel stuff. You see, since Intel got the Core arch out, they have been incredibly slow to increase performance - so slow that I have a 4-year-old CPU and it is still going strong, such that I won't need to replace it for 2 more years... Competition is what brings archs like Nehalem over NetBurst.



I could have sworn it was a combination of engineering breakthroughs and invention.

Are you implying NVIDIA would have released Maxwell three years ago if AMD had had a chip at this level three years ago, because "competition" drove them to it?

I have to differ. AMD would be launching a Maxwell-esque part this week if that were true. I think they're going to launch a revision of the 290X with a water cooler and some minor manufacturing process improvements instead, and give away games with it.

Why isn't competition driving them to catch up with Intel? Surely Intel has been beating them down for years and years; that competition should have driven them to innovate a better arch than Faildozer?


----------



## N3M3515 (Sep 23, 2014)

JethroBodine said:


> I could have sworn it was a combination of engineering breakthroughs and invention.
> 
> Are you implying NVIDIA would have released Maxwell three years ago if AMD had a chip at this level three years ago, because "competition" drove them to it?
> 
> ...



Maybe not three years earlier, but definitely sooner.

Remember when AMD was nothing and Intel was everything? I rest my case.


----------



## JethroBodine (Sep 23, 2014)

N3M3515 said:


> Maybe not three years earlier, but definitely sooner.
> 
> Remember when AMD was nothing and Intel was everything? I rest my case.



Remember?

What? This year? Last year? 2012? 2011? I have a harder time remembering when AMD was something, because I think that was June 2006.

My guess is this will only get worse. No competition for Intel on the horizon. Maxwell effectively boots them out of laptops. Low margins on console chips. If I'm right about slight core revisions for upcoming GPUs, there's tough sledding ahead for AMD.


----------



## N3M3515 (Sep 23, 2014)

JethroBodine said:


> Remember?
> 
> What? This year? Last year? 2012? 2011? I have a harder time remembering when AMD was something, because I think that was June 2006.
> 
> My guess is this will only get worse. No competition for Intel on the horizon. Maxwell effectively boots them out of laptops. Low margins on console chips. If I'm right about slight core revisions for upcoming GPUs, there's tough sledding ahead for AMD.



What I was implying is that AMD was nothing long ago (14 years), and managed to beat Intel (performance- and power-wise) with fewer resources than it has now. Can they do it again? I don't know, but there's a good chance it happens. It's normal.


----------



## JethroBodine (Sep 24, 2014)

N3M3515 said:


> What I was implying is that AMD was nothing long ago (14 years), and managed to beat Intel (performance- and power-wise) with fewer resources than it has now. Can they do it again? I don't know, but there's a good chance it happens. It's normal.



There's a "good chance" AMD will beat intel when they're behind in IPC, process node, losing money practically every quarter for many years, and having a market cap equal to about one quarter of profits for intel?!


----------



## 64K (Sep 24, 2014)

DaedalusHelios said:


> I would say 500Watt would be safe for SLI gtx 980 4GB and 400 watt would be safe for a single gtx 980 4GB. You want at least 100watt breathing room to make sure your PSU has a long life.
> 
> That being said, my 1200watt PSU is seriously overkill now.



No, it wouldn't. Running GTX 980s in SLI would overload a 500 watt PSU.


----------



## frdmftr (Sep 25, 2014)

DaedalusHelios said:


> What matters to most consumers is performance per dollar and not how much wafer they get. I have never bought a card based on the size of its wafer. The power efficiency is an important perk though.
> 
> Does anyone know someone who buys hardware for the size of its chip rather than performance or value? I am starting to wonder what you are arguing for. That Nvidia shouldn't succeed doing more with less wafer and profiting? I want AMD to recover but they seem behind in every way.



Performance should be rewarded, but with Kepler, Nvidia switched from being the Green Team to the Greed Team. The $200 price point for Nvidia's mid-range goes all the way back to the 8000 series. Watch what happened with Kepler:

GF 104 332mm² $199 (GTX 460)
GF 114 360mm² $199 (GTX 560)
GK 104 294mm² $500 (GTX 680)

Overnight the price went up 150%. By sleight of hand, the x80 series became a mid-range card in the lineup, and the new flagship is now $1000.
Again, of course performance should be rewarded, but at what point does it become gouging?
Look at what this pricing did to those who later purchased a 780, once the 290 was released.
Had Nvidia kept the 680 on the GK 110 as originally planned, we the consumers could actually buy an Nvidia card without wasting money on something that gets devalued 33% overnight.

The same is true for the 780 Ti and the hit anyone who bought one is taking. Not a very good way to treat loyal customers.

Of course chip development takes time, and obviously AMD wasn't sitting on the Hawaii chip. All I'm saying is, if Nvidia had released the 680 with the GK 110 chip and charged $600 from the get-go, how much sooner could AMD have come to market with the 290 series? Not to mention, at that point it would have held the $600 price point for some time, and the consumer could have bought with confidence. AND at the end of the day, Nvidia makes more money: increased duration of sales for the GK 104 at a price point of $400 and the GK 110 at $600. AND along with it would have come a steeper price reduction on the 7900 series from AMD.
As you can see, the greed from Nvidia hurt the game developers, the gamers, and the development of this industry as a whole.

So today we are left with a chip class that not long ago sat at a $200 price point. Of course the performance is very impressive, but if they were making money with this class of chip at $200, they are now making truckloads of money at $550 and laughing their way to the bank with your money.
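To put the price jump in concrete terms, here is a quick launch-price-per-mm² comparison using the die sizes and prices listed above (the GTX 980 row uses this review's $549 MSRP and the 398 mm² figure quoted elsewhere in the thread; which metric actually matters is, of course, exactly what's being argued):

```python
# Launch price per mm² of die, from the figures cited in this thread
cards = [
    ("GTX 460 (GF 104)", 332, 199),
    ("GTX 560 (GF 114)", 360, 199),
    ("GTX 680 (GK 104)", 294, 500),
    ("GTX 980 (GM 204)", 398, 549),
]
for name, area_mm2, price_usd in cards:
    print(f"{name}: ${price_usd / area_mm2:.2f} per mm² of die")
```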


----------



## DaedalusHelios (Sep 25, 2014)

64K said:


> No it wouldn't. To SLI the GTX 980 would overload the 500 watt PSU.



Yeah, with 100 watt+ of breathing room, 600 watt sounds safer: http://us.hardware.info/reviews/562...way-sli-review-test-results-power-consumption

Haswell-E with GTX 980 SLI consumes 429 W in their test. I am sure you couldn't run Furmark and stay at that, though.
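A quick sanity check on that sizing logic (the 10% transient margin is my own assumption, not a figure from the thread):

```python
# PSU sizing sketch for GTX 980 SLI, per the figures above
measured_draw_w = 429    # whole-system draw, Haswell-E + GTX 980 SLI (hardware.info)
headroom_w = 100         # breathing room suggested in this thread
transient_margin = 1.10  # assumed 10% extra for load spikes / Furmark-style stress

recommended_w = (measured_draw_w + headroom_w) * transient_margin
print(f"Minimum sensible PSU: {recommended_w:.0f} W -> a 600 W unit clears it")
```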



frdmftr said:


> Performance should be rewarded, but with Kepler, Nvidia switched from being the Green Team to the Greed Team. The $200 price point for Nvidia's mid-range goes all the way back to the 8000 series. Watch what happened with Kepler:
> ~snip~
> So today we are left with a chip class that not long ago sat at a $200 price point. Of course the performance is very impressive, but if they were making money with this class of chip at $200, they are now making truckloads of money at $550 and laughing their way to the bank with your money.



Pure capitalism is a system self-regulated by greed. Nvidia is just another supplier. If you made, say, one thousand cards a month and charged $200 when people were fully willing to pay $550, would that make you very good at running a business?

I would like to get more for less too. They will charge as much as they think will maximize profits, because publicly traded companies are out to maximize profits for their shareholders, balancing what volumes they can produce against how many customers are willing to pay a given amount. They don't sit down, talk about what they think would be fair, and decide that way.


----------



## JethroBodine (Sep 25, 2014)

frdmftr said:


> Performance should be rewarded, but with Kepler, Nvidia switched from being the Green Team to the Greed Team. The $200 price point for Nvidia's mid-range goes all the way back to the 8000 series. Watch what happened with Kepler:
> 
> GF 104 332mm2 $199 (GTX 460)
> GF 114 360mm2 $199 (GTX 560)
> ...



A. NVIDIA sells on performance, not wafer size. People buy based on performance, not wafer size.

http://www.techpowerup.com/reviews/Zotac/GeForce_GTX_660/27.html

The 660 Ti (mid-range chip for the Kepler gen at $300) outperforms last gen's top chip, the GTX 580. The GTX 660 (4th-highest-performing Kepler) outperforms last gen's 2nd-highest chip, the 570.

B. As big of an AMD supporter as you are, you should be very happy NVIDIA sold GTX 680s for $500 instead of $200 (or even $300). Know why? _Not one person on the planet would have bought an AMD part if GTX 680s cost $200. Everyone would have GTX 680s and AMD would be out of business._

C. $500 is not NVIDIA's high-end price point now. If people stop buying their $750-$1000 cards, prices may come back down to $500, but I wouldn't count on it unless AMD actually competes. If you want to blame anyone for NVIDIA's pricing, blame AMD. Look at their last launch, the 290X. Yes, they caught up to the Titan in performance in the 50 dB meltdown mode, but the cards were definitely second class compared to Titans and 780s: noisy little blast furnaces, while NV was selling cool and quiet cards.

Roy Taylor of AMD says "We love competition!" and "You ain't seen nothing yet!", but to date their engineering isn't backing him up. Maybe we'll see a change today with their announcement, but until they actually compete, NVIDIA will charge first-tier prices and AMD will take the scraps and keep losing market share.


----------



## rtwjunkie (Sep 25, 2014)

DaedalusHelios said:


> I don't know if anybody here can say the gtx 980 is a midrange card as a "fact" unless Nvidia confirms it with you though. Or are you saying you consider it a midrange card and that is your opinion? Fact and opinion are not the same thing. Maybe you have information you are breaking NDA on?
> 
> Why do you consider GTX 680 a midrange card? It filled the "high end" single GPU spot in the Nvidia GTX 600 series product lineup.


 
What he is referring to is not the cards being mid-range, but the chips used IN those cards. And on that matter he is correct. The current high-end card is built on the mid-range Maxwell chip (GM 204). GM 200 is nearly ready, so the high-end chip will eventually be released in a high(er)-end card that no one here can actually guess at correctly yet.


----------



## frdmftr (Sep 25, 2014)

JethroBodine said:


> A. NVIDIA sells on performance, not wafer size. People buy based on performance, not wafer size.
> 
> http://www.techpowerup.com/reviews/Zotac/GeForce_GTX_660/27.html
> 
> ...


 
In all honesty, I should have compared it to the 560 Ti to be fair, as that was the fully enabled GF 114 at $250.

Also, I have been a loyal Nvidia customer since my first video card, the MX 440? There is even a pic of my 660 SLI setup posted on Nvidia's site. My current setup is a 290X with a GTX 480 as my PhysX card. The 660s were moved to my kids' computers and I sold my GTX 680; I can provide you a link to my Craigslist post from when I sold that card.

I never said they should have sold the GK 104 for $200. Later in my post I mentioned a price point of $400 for that chip. Like I said, I think innovation should be rewarded.


----------






## JethroBodine (Sep 25, 2014)

Nice. Yesterday, a 9/25 announcement was alluded to, with Roy Taylor saying we haven't seen anything yet and that AMD loves competition.

Today on Roy's Twitter, he's got an India FirePro link, something about PayPal and Bitcoin, and "Why APUs are OK for gaming".

No competition for the GTX 980 and 970 there.

Other rumors say no 290 successor is coming until the first half of 2015. So I guess until then there might not be any competition.

Note to AMD guys: when your reference design at launch has an inferior cooling solution, that's the first impression you make and what people see in the reviews.


----------



## RealNeil (Sep 25, 2014)

I agree about first impressions. Get it right to begin with and you'll have a lot more traction selling your products.


----------



## frdmftr (Sep 25, 2014)

DaedalusHelios said:


> Yeah with a 100watt+ breathing room 600watt sounds safer. http://us.hardware.info/reviews/562...way-sli-review-test-results-power-consumption
> 
> Haswell-E with GTX 980 SLI consumes 429W on their test. I am sure you couldn't run furmark and get that though.
> 
> ...


 

I'm glad you brought up the shareholders, because withholding the 2304-core GK 110 was a dangerous play for them, a disservice to the gaming industry, and a huge disservice to the consumer. Allow me to explain.

First of all, had the 2304-core GK 110 been released around the time of the GK 104, that card would have sold for $600 a full year longer than it did, and they would still have realized huge profits on a GK 104 at $400. Nvidia would have completely owned the top of the market, undisputed, for at least 9 months. Plus, with the GK 110 being more available, game developers would have had more resources to work with on a larger scale, and game development would advance.

Now, we know for a fact they were making GK 110s because of the Titan. We know there would have been a decent percentage of chips that did not make the 2688-core standard for the Titan. What were they doing with these chips? They were sitting on a shelf collecting dust, that's what.

Here is where the $500 greed on the GK 104 harms everyone, including the shareholders. Now, this is a what-if, and it didn't play out this way as we know; it just goes to show how pure greed is dangerous...

Had the Hawaii (290/290X) chip been a mere 10% faster, at best the 780 would have sold at $650 for only 5 months, then would have gone to $430, maybe $440. AND the 780 Ti would have only been equal to the 290X, and at best the 780 Ti would have only been a $620-or-so card.

As we know, this didn't happen, but it is almost as bad as what actually did, as I have explained above. Additionally, Nvidia's 780 pricing structure was very unstable, and the consumer clearly lost value.

Capitalism is putting your best to market at a fair price regardless of what the competition is doing. What Nvidia did, and currently is doing, is a plague that has infected this world. Chasing ever-increasing margins while withholding tech is PURE GREED, plain and simple. Think about that as you rush to the store to spend $550 on a class of chip that had held an introductory market price of $200-$250 since 2005.


----------



## W1zzard (Sep 25, 2014)

frdmftr said:


> Chasing ever-increasing margins while withholding tech is PURE GREED, plain and simple.


Playing devil's advocate here, even though I agree with you: that's how corporations work. If you gave your money to NVIDIA as an investor, wouldn't you expect maximized returns on your $$?
If you had 1 million to invest today, would you give it to AMD or NVIDIA?


----------



## JethroBodine (Sep 26, 2014)

W1zzard said:


> Playing devil's advocate here, even though I agree with you: That's how corporations work. If you gave your money to NVIDIA as investor wouldn't you expect maximized returns on your $$ ?
> If you had 1 million to invest today, would you give it to AMD or NVIDIA ?



AMD, W1zzard, because they give a gamer a decent chunk of wafer for his $500! Plus, it's cold in Wisconsin in the winter; a 290X in uber mode running on an FX-8350 CPU could heat the whole house! Lots of cold states out there.


----------



## btarunr (Sep 26, 2014)

JethroBodine said:


> AMD W1zzard, because they give a gamer a decent chunk of wafer for his $500! Plus, it's cold in Wisconsin in the winter, 290X on uber mode running on a FX8350 cpu could heat the whole house! Lots of cold states out there.



Friendly advice, don't invest in anything.


----------



## W1zzard (Sep 26, 2014)

btarunr said:


> Friendly advice, don't invest in anything.


Universal advice: rather, spend it in some more or less meaningful way.


----------



## JethroBodine (Sep 26, 2014)

btarunr said:


> Friendly advice, don't invest in anything.



I was being facetious, because the question was self-evident and stated as proof that NVIDIA's strategy is a successful one.

NVIDIA and Intel enjoy the position of being the first-tier vendors to a market of people whose income has been largely unaffected by the economic downturn. Computer gamers are often people connected to IT/IS, and other technophiles. As such, they can charge higher prices to people who have more disposable income for their hobbies.


----------



## frdmftr (Sep 27, 2014)

W1zzard said:


> Playing devil's advocate here, even though I agree with you: That's how corporations work. If you gave your money to NVIDIA as investor wouldn't you expect maximized returns on your $$ ?
> If you had 1 million to invest today, would you give it to AMD or NVIDIA ?




Hubris is more dangerous than incompetence. NVIDIA dodged the Hawaii bullet due to outside influences: the only reason the 780 and 780 Ti were able to hold their price points was that you could not buy an AMD card at retail because of crypto mining. NVIDIA underestimated AMD and got lucky; I don't think that makes them a better investment.

From what I have seen on other sites the 290X is not a fully enabled chip. Given the improvements of Tonga over Tahiti and those improvements applied to a fully enabled Hawaii, I think AMD is still very much in this game.

As a side note, my hybrid PhysX rig works very well and I am impressed with my 290X, BUT if NVIDIA gets back on track with pricing I would most likely come back. I think the driver support is better, and in the few games that have PhysX I enjoy the extra effects (which explains the hybrid PhysX).


----------



## JethroBodine (Sep 27, 2014)

frdmftr said:


> From what I have seen on other sites the 290X is not a fully enabled chip. *Given the improvements of Tonga over Tahiti and those improvements applied to a fully enabled Hawaii,* I think AMD is still very much in this game.



http://www.techpowerup.com/reviews/Sapphire/R9_285_Dual-X_OC/30.html



			
TechPowerUp said:

> Quite slow as performance has not improved
> 
> Not very power efficient
> 
> ...




http://www.anandtech.com/show/8460/amd-radeon-r9-285-review/19



			
AnandTech said:

> At the end of the day it brings a very minor 3-5% performance increase over the R9 280 with virtually no change in price or power consumption.



So you think that if AMD brings the 3-5% improvements of Tonga to Hawaii, they will overcome the 19% performance deficit at 1080p, or the 13% deficit at 2560x1600, reported by this site? Am I missing something?

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_980/26.html
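For reference, the arithmetic behind that question can be sketched quickly (a rough estimate using only the deficit figures quoted above; real-world scaling is rarely this clean):

```python
# Rough sketch: if Hawaii gained only Tonga's measured 3-5% uplift,
# how much of the GTX 980's lead would remain? The deficits are the
# 19% (1080p) and 13% (2560x1600) figures from the TPU review.
deficits = {"1080p": 0.19, "2560x1600": 0.13}
for res, deficit in deficits.items():
    for uplift in (0.03, 0.05):
        remaining = (1 + deficit) / (1 + uplift) - 1
        print(f"{res}: +{uplift:.0%} uplift leaves ~{remaining:.1%} deficit")
```

Even the optimistic 5% case still leaves a double-digit gap at 1080p.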


----------



## frdmftr (Sep 27, 2014)

JethroBodine said:


> http://www.techpowerup.com/reviews/Sapphire/R9_285_Dual-X_OC/30.html
> 
> http://www.anandtech.com/show/8460/amd-radeon-r9-285-review/19
> 
> ...



1) Performance would scale with the number of cores, so a 10% or greater increase in performance for a chip with 40% more cores is very realistic.

2) The 3-5% increase was accomplished with 33% less memory and a 33% narrower memory interface.

3) A fully enabled Hawaii chip would by itself be close to 10% faster:
http://wccftech.com/amd-hawaii-gpu-...chip-48-compute-units-3072-stream-processors/

TPU is a great site; however, it is not the best site for a fair comparison of AMD vs NVIDIA. The use of older games like Diablo 3, which heavily favor NVIDIA, skews the summary. Granted, 290 vs 780 only gives the 780 a 10% performance advantage, but it is the 20 extra fps that skew the average. Hitman is a more relevant game to benchmark in 2014 than Diablo 3, and Hitman is a game that very much favors AMD.

A few quotes from Tom's Hardware, and a link to the page, to prove my point:

"We'd also like to address the GeForce GTX 760's position on our average performance chart. Seeing the 760 sit just *below* the Radeon *R9 270X* wasn't something we expected"

"this situation is the result of a combination of factors including *some newer game titles* such as Thief (that favor the GCN architecture), *mixed with some high-detail settings*"

http://www.tomshardware.com/reviews/amd-radeon-r9-285-tonga,3925-14.html
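Point 3's figure checks out as simple core-count arithmetic (assuming performance scales linearly with compute units, which is optimistic):

```python
# A fully enabled Hawaii (per the wccftech rumor above) would have
# 48 compute units / 3072 shaders, vs the R9 290X's 44 CUs / 2816 shaders.
full_cus, r9_290x_cus = 48, 44
uplift = full_cus / r9_290x_cus - 1
print(f"extra compute units: ~{uplift:.1%}")  # ~9.1%, i.e. "close to 10%"
```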


----------



## JethroBodine (Sep 28, 2014)

frdmftr said:


> 1
> 
> 3) A fully enabled Hawaii chip would by itself be close to 10% faster
> http://wccftech.com/amd-hawaii-gpu-...chip-48-compute-units-3072-stream-processors/



Dave Baumann of AMD PR has already stated publicly on his old site that the R9 290X is a full chip, so people probably shouldn't wait for the "full chip" that AMD launched long ago:

http://forum.beyond3d.com/showpost.php?p=1817984&postcount=2269



			
Dave Baumann/AMD said:

> R9 290X is a full chip.



Which makes sense. If there were a "fully enabled" 290X chip, they probably would have launched it instead of watching NVIDIA hold the top single-GPU spot for the last 11 months.


----------



## Fluffmeister (Oct 8, 2014)

The GTX 980 is an overclocking monster whilst being frugal on power.

http://www.hardocp.com/article/2014...overclocking_video_card_review/5#.VDWtlqhdWe4

NVIDIA has achieved wonders with Maxwell.


----------



## nunyabuisness (Oct 17, 2014)

frdmftr said:


> NVIDIA MIDRANGE CHIPS/CARDS
> GTX 460 = GF 104
> GTX 560 = GF 114
> GTX 680/660TI/770 =GK 104
> ...



Tom from NVIDIA, who appeared on the PC Per demo video, confirmed the 980 is the fully enabled version of that chip; there is no more headroom in it, he replied!

NVIDIA has dug itself a large hole in the midrange market. The 970 and 980 both beat the 290-series cards EASILY, in both power and horsepower, but not releasing the 960 has left a hole in the midrange. For people constrained to the sub-$350 or even sub-$300 market, the 285 and 280X are still the best chips for the cash.
Which lends credence to the idea that the 960 is either getting really bad yields, or that the GM206 silicon isn't a good chip and isn't performing as well as the GM204 (980/970).
Either way, as long as NVIDIA leaves the 960 out of the midrange market, they are just handing over the most profitable sector of the graphics card market.


----------



## W1zzard (Oct 17, 2014)

nunyabuisness said:


> Which lends credence to the idea that the 960 is either getting really bad yields, or that the GM206 silicon isn't a good chip and isn't performing as well as the GM204 (980/970).
> Either way, as long as NVIDIA leaves the 960 out of the midrange market, they are just handing over the most profitable sector of the graphics card market.


Or it simply means NVIDIA gets people to buy the GTX 970 and 980 even though they could make do with the 960, because they don't want to wait for it


----------



## nunyabuisness (Oct 17, 2014)

I'm not sure what world you live in, but most normal people are constrained by their BUDGETS. They won't buy something at double the price because NVIDIA made them.

Take me for example. I have a 7850 OC. I've been holding out for years through two respins of the same chips, but they haven't released any new silicon except for the 290, and that is out of my price range, as are the 970 and 980. Which leaves me the 280, 285, and 280X, plus the 760. The 280 series beats the 760 easily, so the 280 is still the best choice for me, as I need the extra horsepower for next month's new games.


----------

