# AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price



## btarunr (Jan 29, 2015)

AMD decided to cash in on the GeForce GTX 970 memory controversy, with a bold move and a cheap (albeit accurate) shot. The company is having its add-in board (AIB) partners lower the price of its Radeon R9 290X graphics card, which offers performance comparable to the GTX 970, to as low as US $299.

And then there's a gentle reminder from AMD to graphics card buyers with $300-ish in their pockets. With AMD, "4 GB means 4 GB." AMD also emphasizes that the R9 290 and R9 290X can fill their 4 GB video memory to the last bit, and feature a 512-bit wide memory interface, which churns up 320 GB/s of memory bandwidth at reference clocks, something the GTX 970 can't achieve, even with its fancy texture compression mojo.
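The bandwidth figure is easy to sanity-check: peak GDDR5 bandwidth is just the effective transfer rate times the bus width in bytes. A minimal sketch, using the published reference clocks (5 GHz effective for the R9 290X, 7 GHz effective for the GTX 970):

```python
# Peak memory bandwidth = effective memory clock (transfers/s) x bus width (bytes).
# The figures below are the reference specs cited in the post.
def peak_bandwidth_gbps(effective_clock_ghz, bus_width_bits):
    """Return peak theoretical bandwidth in GB/s."""
    return effective_clock_ghz * (bus_width_bits / 8)

# R9 290X: 5 GHz effective GDDR5 on a 512-bit bus
r9_290x = peak_bandwidth_gbps(5.0, 512)   # 320.0 GB/s

# GTX 970: 7 GHz effective GDDR5 on a 256-bit bus
gtx_970 = peak_bandwidth_gbps(7.0, 256)   # 224.0 GB/s

print(r9_290x, gtx_970)
```

Texture compression raises *effective* throughput on Maxwell, but not this raw number, which is the distinction AMD's jab relies on.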






*View at TechPowerUp Main Site*


----------



## Cybrnook2002 (Jan 29, 2015)

Nice to throw in a punch every now and then  

Excited for the 380X already.....


----------



## Parn (Jan 29, 2015)

Further lowering the price will only hurt the profit margins of AMD and their partners. Hawaii XT dies and 290X cards are not cheap to manufacture.


----------



## btarunr (Jan 29, 2015)

Parn said:


> Further lowering the price will only hurt the profit margins of AMD and their partners. Hawaii XT dies and 290X cards are not cheap to manufacture.



I guess making some money is better than making no money (nobody buying R9 290X over GTX 970).


----------



## 64K (Jan 29, 2015)

$280 after rebate. Nice.




Is not happy.


----------



## THE_EGG (Jan 29, 2015)

Yet here in Australia they still go for roughly $500-$650+ depending on the model. Oh well...


----------



## Eroticus (Jan 29, 2015)

btarunr said:


> I guess making some money is better than making no money (nobody buying R9 290X over GTX 970).



The reasons are: NVIDIA has far more fanboys / it's more power-efficient / the 970 is a newer product (by about 1.6 years) / and the main reason, the 380X is coming. No point buying the old gen when AMD will win, like always, in the next one...


----------



## DarkOCean (Jan 29, 2015)

Parn said:


> Further lowering the price will only hurt the profit margins of AMD and their partners. Hawaii XT dies and 290X cards are not cheap to manufacture.


I don't think it's that much more expensive to make than a GM204; the chip is less than 10% bigger.


----------



## Parn (Jan 29, 2015)

btarunr said:


> I guess making some money is better than making no money (nobody buying R9 290X over GTX 970).



According to other news, a lot of 970 owners in the EU are going to return their cards. AMD could have been a bit more patient before cutting the price, as those people would have sprinted for the 290X afterwards. There is no other option that provides comparable performance in the 970/290X price range.


----------



## Jorge (Jan 29, 2015)

For those who don't know, top-of-the-line CPUs and GPUs are cash cows with huge margins compared to mainstream models. Only enthusiasts or businesses buy the very high-end, high-margin products. Lowering the price slightly is a no-brainer and will be more than offset by the increased volume. It's not difficult to do the math when you know the margins and the volume.


----------



## Zakin (Jan 29, 2015)

I think it's more that AMD knew they had to drop these prices half a year ago, and this was just a convenient time to do it. They've been overpriced for quite some time otherwise. Enjoy your GPUs that AMD touts running at 94 Celsius as a feature.


----------



## Parn (Jan 29, 2015)

DarkOCean said:


> I don't think it's that much more expensive to make than a GM204; the chip is less than 10% bigger.



6.2 billion transistors versus 5.2 billion. That's roughly 20% fewer dies per wafer (and that doesn't even take defects into account). Coupled with the 512-bit bus, the card is quite a bit more expensive to make than the 980.
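The ~20% figure follows from the poster's simplifying assumption that die area scales with transistor count; in reality Hawaii is about 438 mm² versus GM204's ~398 mm², a smaller gap. A rough sketch of the dies-per-wafer comparison under that stated assumption (wafer size and baseline area are illustrative numbers, not sourced):

```python
import math

def gross_dies(wafer_diameter_mm, die_area_mm2):
    """Classic gross-die estimate: wafer area over die area, minus an edge-loss term."""
    r = wafer_diameter_mm / 2
    return math.floor(math.pi * r**2 / die_area_mm2
                      - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Hypothetical areas scaled from transistor counts (6.2B vs 5.2B) at equal density
base_area = 398.0                      # GM204's real area, taken as the baseline
hawaii_area = base_area * 6.2 / 5.2    # ~475 mm^2 under the equal-density assumption

gm204_dies = gross_dies(300, base_area)      # candidates per 300 mm wafer
hawaii_dies = gross_dies(300, hawaii_area)   # noticeably fewer for the bigger die
print(gm204_dies, hawaii_dies)
```

Defect yield then compounds this: the bigger die is also more likely to catch a defect, so the real per-good-die cost gap is wider than the gross count alone suggests.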


----------



## Sasqui (Jan 29, 2015)

Zakin said:


> enjoy your GPUs that AMD enjoys touting 94 celsius running as a feature.



My XFX 290X DC doesn't get over 78°C; my reference 290X under H2O doesn't break 40°C.

Reference cooler? Yea, that's another story, lol


----------



## Zakin (Jan 29, 2015)

Sasqui said:


> My XFX 290X DC doesn't get over 78°C; my reference 290X under H2O doesn't break 40°C.
> 
> Reference cooler? Yea, that's another story, lol



I would sure hope under water it would be that cool...although I'm surprised I don't recall the XFX doing that decently, still a bit on the high side though. I ditched AMD after two straight generations of dealing with 2-3 months post launch games with no fixes/optimizations, let alone the older games they never would fix. I still have to deal with them on my brother's build unfortunately, typically every other month or so, they've definitely gotten better at least. Still not sure the appeal on the super inefficient chips in the R9 200s though, I'm not a big fan of throwing money at an unfinished engineering project.


----------



## TheGuruStud (Jan 29, 2015)

Zakin said:


> I would sure hope under water it would be that cool...although I'm surprised I don't recall the XFX doing that decently, still a bit on the high side though. I ditched AMD after two straight generations of dealing with 2-3 months post launch games with no fixes/optimizations, let alone the older games they never would fix. I still have to deal with them on my brother's build unfortunately, typically every other month or so, they've definitely gotten better at least. Still not sure the appeal on the super inefficient chips in the R9 200s though, I'm not a big fan of throwing money at an unfinished engineering project.



You should probably look at nvidia's hand slinging money for game issues (crossfire is a different story but single gpu has been pretty solid).


----------



## Menta (Jan 29, 2015)

https://rog.asus.com/forum/showthread.php?57022-asus-970-strix-false-specs


----------



## NightOfChrist (Jan 29, 2015)

I agree with AMD. A 4GB card means it has 4GB and should be able to use all the vRAM available when needed. I am glad I went for a Zotac GTX 980 reference card instead of a custom GTX 970, despite the latter being much cheaper than the former. I can say I am very satisfied with my purchase but I hope there will be a better, if not best solution for all GTX 970 owners who decided not to return their cards. Let this be a very important lesson for NVIDIA so they will not repeat the same mistake again.


----------



## Zakin (Jan 29, 2015)

TheGuruStud said:


> You should probably look at nvidia's hand slinging money for game issues.


I'd usually be all about this, except I remember that back in the day the Nvidia-sponsored titles never really seemed to have too many issues on AMD cards, so long as you weren't stupid enough to try to run PhysX. If anything, those games typically came off as better PC ports because of the endorsement. That actually still seems true today. I wouldn't doubt Nvidia slings money to keep its name in good standing with devs; you know, as if devs enjoy having likely half of their user base pissed off at them for long periods of time because they won't fork over a little information to AMD. Doesn't seem wise at all to me.


----------



## TheGuruStud (Jan 29, 2015)

Zakin said:


> I'd usually be all about this, except I comically remember back in the day the Nvidia titles never really seemed to have too many issues on AMD cards so long as you weren't stupid enough to try and run physx. If anything those games typically came off as better PC ports because of the endorsement. That actually still seems true today. I wouldn't doubt Nvidia slinging game to keep their name better with devs, you know if devs enjoy having likely half of their user base pissed off at them for long periods of time because they won't fork over a little information to AMD. Doesn't seem wise at all to me.


Nvidia sends in so many programmers to control game optimization... it's like the Stasi. Devs are afraid of them. They will never say a bad thing about Nvidia unless they want to get fired. Devs haven't had control of their games in years.

Crysis 2 is a standout example. I can't remember if that's the one where the devs refused to even comment, though. Nvidia was basically given full control and pumped in massive amounts of tessellation for flat surfaces. IIRC, there was even an underground river being rendered. It was a 17% slowdown on Nvidia and 30% on AMD; the GTX 580 could churn through tessellation much better. Nvidia dropped this tactic when the new AMD cards came out with heavy tessellation firepower.


----------



## btarunr (Jan 29, 2015)

Menta said:


> https://rog.asus.com/forum/showthread.php?57022-asus-970-strix-false-specs



I would argue that the store is the last (and only) link in the supply-chain between you and the product. So if you want a return, you should take it up with the store. PT has good consumer laws.


----------



## KarymidoN (Jan 29, 2015)

So many Nvidia Fanboys...
US $280 after rebate -> R9 290X (non-reference cooler): 4 GB VRAM, 512-bit...
US $350 -> GTX 970 (MSI or Asus): 3.5 GB VRAM, 256-bit...




You can pay JUST US $70 more for 4.7 frames (and the graph uses a reference R9 290X)...
Temperature... noise... etc., etc...

My MSI GTX 970 Gaming 4GB is coming home, but I'm not happy, because I'm going to use it on three FHD displays and, yes, that needs some extra video memory...


----------



## Cybrnook2002 (Jan 29, 2015)

I have a belly button....


----------



## Shambles1980 (Jan 29, 2015)

I honestly would just keep a 970 if I had bought one.
It seems a bit silly to complain over GDDR usage when the performance is there.
If they return them, I'm not sure what you would buy instead of a 970. Possibly a 780 Ti? Although I still think I'd prefer a 970.
The 290X is a powerful card, but I don't think I'd choose one over the 970, simply for heat reasons.


----------



## bogami (Jan 29, 2015)

If they can't drop the price to a realistic $230 (and even that is too much), the cards should be piling up in front of their doors! Greedy bastards. Processors that belong in the trash because the whole chip doesn't work properly just get the defective part of the silicon locked off, and instead of merely recouping some of the production cost they want to turn that garbage into a fat profit. The sad thing is they will do the same with the TITAN X.
That's what happens when you paper over hardware design problems in the software drivers, and then it bites you in the ass, NVIDIA. Mind you, this lesson won't sober them up. I want to see them fall to the bottom; then they will start respecting customers again. We live in an age of information and misinformation, and it seems NVIDIA leans toward the misinformation. Many buy blindly and don't know how many tricks are being played...
The AMD R9 290X is a much better card regardless of FPS: much more stable operation, a 512-bit bus, and remember, six monitors... and so on.


----------



## Ja.KooLit (Jan 29, 2015)

THE_EGG said:


> yet here in Australia they still run for roughly $500-$650+ depending on the model. Oh well...



The price cuts are only in the US. I don't know about Europe. But same as you: here in Korea it costs the same as the launch price.


----------



## NightOfChrist (Jan 29, 2015)

night.fox said:


> The price cuts are only in the US. I don't know about Europe. But same as you: here in Korea it costs the same as the launch price.


There is no price cut here in Japan either. My girlfriend wants to buy one for The Sims 4 and Final Fantasy. Perhaps after a few days?


----------



## Menta (Jan 29, 2015)

btarunr said:


> I would argue that the store is the last (and only) link in the supply-chain between you and the product. So if you want a return, you should take it up with the store. PT has good consumer laws.




They said no, because they have no recall order.


----------



## newtekie1 (Jan 29, 2015)

Good, hopefully nVidia follows and drops the 970 price so I can grab a second.


----------



## RejZoR (Jan 29, 2015)

Hate it when price drops never reach Europe. Still selling them for 400+ EUR...


----------



## ShurikN (Jan 29, 2015)

Gigabyte's is the only one with a $300 price tag; the rest are $340-360.
That's basically the same price it's had for a couple of months now.
I fail to see the major price drop...


----------



## Maban (Jan 29, 2015)

@btarunr What's the source of that AMD propaganda image? Is it actually from AMD?


----------



## MxPhenom 216 (Jan 29, 2015)

NVIDIA dropped the ball so hard on that 970.


----------



## ironwolf (Jan 29, 2015)

ShurikN said:


> Gigabyte's is the only one with a $300 price tag; the rest are $340-360.
> That's basically the same price it's had for a couple of months now.
> I fail to see the major price drop...


Newegg US:

(Diamond brand $279.99 after MIR)
http://www.newegg.com/Product/Product.aspx?Item=N82E16814103246&ignorebbr=1

(Gigabyte brand $279.99 after MIR)
http://www.newegg.com/Product/Product.aspx?Item=N82E16814125499&ignorebbr=1


----------



## 15th Warlock (Jan 29, 2015)

ironwolf said:


> Newegg US:
> 
> (Diamond brand $279.99 after MIR)
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814103246&ignorebbr=1
> ...



Thanks for the link. I wonder how these puppies work in 3-way CrossFire; for $280 the Gigabyte card seems super tempting. I could add more muscle to my Radeon avenger.


----------



## hyp36rmax (Jan 29, 2015)

Parn said:


> Further lowering the price will only hurt the profit margins of AMD and their partners. Hawaii XT dies and 290X cards are not cheap to manufacture.



You totally missed the point: this move will broaden AMD's market share as well as lock prospective customers into the brand. It's also a great way to move product, which would happen anyway with the R9 300 series on the horizon.


----------



## Sony Xperia S (Jan 29, 2015)

Cybrnook2002 said:


> Excited for the 380X already.....



You will have to wait a little bit.... well, a few months. 



THE_EGG said:


> yet here in Australia they still run for roughly $500-$650+ depending on the model. Oh well...



Not much of a pain to die from. You can always ask a friend to import you one from elsewhere. 



newtekie1 said:


> Good, hopefully nVidia follows and drops the 970 price so I can grab a second.



I would recommend getting rid of the Nvidia card and buying two 290Xs instead. I can assure and guarantee you a better user experience.


----------



## IINexusII (Jan 29, 2015)

Maban said:


> @btarunr What's the source of that AMD propaganda image? Is it actually from AMD?



https://twitter.com/Thracks/status/560511204951855104


----------



## ShurikN (Jan 29, 2015)

ironwolf said:


> Newegg US:
> 
> (Diamond brand $279.99 after MIR)
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814103246&ignorebbr=1
> ...


Ugh wouldn't touch that Diamond with a 10 ft pole


----------



## 15th Warlock (Jan 29, 2015)

Sony Xperia S said:


> I would recommend getting rid of the Nvidia card and buying two 290Xs instead. I can assure and guarantee you a better user experience.



Why would you recommend that? The 970 performs exactly the same now as it did before this whole wrong-specs ordeal came to light.

Why would he dump a perfectly working card? As a user of both Nvidia and AMD cards, I can tell from personal experience they both have their pros and cons, but to be honest, if Nvidia drops the price of the 970, getting a second card for cheap remains a perfectly viable option for current 970 owners, and one I'm sure no one would regret.


----------



## rruff (Jan 29, 2015)

Parn said:


> Further lowing the price will only hurt the profit margin for AMD and their partners. Hawaii XT dies and 290X cards are not cheap to manufacture.



The beauty of mass production is that once the design is done and the manufacturing is set up, the marginal cost of spitting out an extra widget is quite small. AMD will not lose money by selling the 290X cheaper than the 970 at this point in time, and they will hopefully claw back some market share. AMD lost a lot of market share in Q3 and Q4 2014, and any company that intends to stay in business cannot afford to let that continue.


----------



## krimetal (Jan 29, 2015)

Sasqui said:


> My XFX 290X DC doesn't get over 78°C; my reference 290X under H2O doesn't break 40°C.
> 
> Reference cooler? Yea, that's another story, lol



How many people buy reference cards anyway?


----------



## rruff (Jan 29, 2015)

ironwolf said:


> Newegg US:
> 
> (Diamond brand $279.99 after MIR)
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814103246&ignorebbr=1
> ...



Much better price on a Powercolor at NCIX. Only $233: http://www.ncixus.com/products/?usa...XnhUkV12US3NTtDzY0&ir_cid=3092&ir_affid=10451


----------



## ShurikN (Jan 29, 2015)

rruff said:


> Much better price on a Powercolor at NCIX. Only $233: http://www.ncixus.com/products/?usaffiliateid=1000031504&sku=103515&vpn=AXR9 290X 4GBD5-TDHE&manufacture=PowerColor&promoid=1452&ir_clickid=Wf51I9yQYyNqzXaUZTW5GXnhUkV12US3NTtDzY0&ir_cid=3092&ir_affid=10451


Wow, now that's a steal. You could buy 2 and destroy everything in your wake. For less than a single 980.


----------



## 64K (Jan 29, 2015)

rruff said:


> Much better price on a Powercolor at NCIX. Only $233: http://www.ncixus.com/products/?usaffiliateid=1000031504&sku=103515&vpn=AXR9 290X 4GBD5-TDHE&manufacture=PowerColor&promoid=1452&ir_clickid=Wf51I9yQYyNqzXaUZTW5GXnhUkV12US3NTtDzY0&ir_cid=3092&ir_affid=10451



That's a steal but I don't know anything about Powercolor. Are they a good quality brand that backs up their warranty well?


----------



## KarymidoN (Jan 29, 2015)

64K said:


> That's a steal but I don't know anything about Powercolor. Are they a good quality brand that backs up their warranty well?



PowerColor = good quality, good RMA...
Only issue: the TurboDuo cards at 100% fan are the LOUDEST dual-fan cards... just use automatic fan control and play quietly...


----------



## 15th Warlock (Jan 29, 2015)

64K said:


> That's a steal but I don't know anything about Powercolor. Are they a good quality brand that backs up their warranty well?



Was gonna ask the same question, anyone here care to share their thoughts on this particular card?


----------



## rruff (Jan 29, 2015)

64K said:


> That's a steal but I don't know anything about Powercolor. Are they a good quality brand that backs up their warranty well?



They've been around a while. They are a low-end video card company, but legit. Here are the Newegg reviews on this specific card: http://www.newegg.com/Product/Product.aspx?Item=N82E16814131569

I don't see any tests of this model, but plenty of the 3-fan 290X.


----------



## REAYTH (Jan 29, 2015)

KarymidoN said:


> So many Nvidia Fanboys...
> U$ 280 After Rebate... R9 290X (non-Reference cooler)... 4GB VRAM - 512bits...
> U$ 350 -> GTX 970 (msi or Asus) = 3,5GB VRAM - 256 bits....
> 
> ...


And? I have two 670's that devour that for less money.


----------



## ThE_MaD_ShOt (Jan 29, 2015)

Damn you Amd. Just when I was thinking of selling my 290x you have to drop prices and drive the used price even lower.


----------



## 64K (Jan 29, 2015)

rruff said:


> They've been around awhile. They are a low-end video card company, but are legit. Here are Newegg reviews on this specific card: http://www.newegg.com/Product/Product.aspx?Item=N82E16814131569
> 
> Don't see any tests of this model, but lots of the 3 fan 290x.



Yeah, I had a quick look around for info and reviews on the 2-fan 290X and didn't find much. It seems the 3-fan 290X PCS is much more popular. I did find this on the 2-fan 290X:

http://www.reddit.com/r/buildapcsales/comments/2s50zf/gpupowercolor_radeon_r9_290x_975mhz_4gb_26999/


----------



## Fluffmeister (Jan 29, 2015)

All this controversy and they still feel the need to lower their prices, doh!

AMD know that cheap[er] GTX 970's hitting the market ain't going to do them any favours, in fact it's the last thing they need.


----------



## GreiverBlade (Jan 29, 2015)

krimetal said:


> How many people buy reference cards anyway?


I do ... because an Aquacomputer Kryographics Hawaii block + backplate on a second-hand reference 290 @ $190 is even a bit cheaper than some of the custom designs (and looks better).

Well ... I love both brands, but the 970 "drama" is still real ... even if it doesn't impact real performance, they still tried to hide it, and that fact alone is enough for me. (Not that any other brand doesn't "cheat", but still.)
Waiting to see some second-hand 290/290X prices come down where I am ... if that reaches my country before the 380X or something else from NV (I think my 290 will be sitting on my shelf by then ... I cannot bring myself to sell the best card I ever had, every category included: bang for buck, check).


----------



## Casecutter (Jan 29, 2015)

64K said:


> That's a steal but I don't know anything about Powercolor. Are they a good quality brand that backs up their warranty well?


 
I've had various PowerColor cards (~5) over the last 4-5 years and not one complaint. I always get my rebates; the one for the 280 I bought at the beginning of December came in about 3 weeks. I did have an old 6790 with a noisy fan; I contacted service and they just sent a fan, no questions asked.

Robust construction, many with backplates. As above, you need good air movement or the fans will ramp up; I changed the fans and it's quiet.

It might be too early for pricing to move ahead of the 380X, but what the heck, now is as good a time as any to start...
That said, AMD shouldn't be so blatant with the "4 GB is 4 GB" line; better to simply tout full utilization of the 320 GB/s memory bandwidth on a 512-bit bus.


----------



## rruff (Jan 29, 2015)

Fluffmeister said:


> AMD know that cheap[er] GTX 970's hitting the market ain't going to do them any favours, in fact it's the last thing they need.



Nah! The 970s are now tainted, so AMD lowers the price a little on the 290 and 290X to get more people to jump from Nvidia to AMD.

The work on these cards has already been done; marginal costs are low. What AMD needs is *market share*, and this is a smart move for them. If/when Nvidia cuts the price of the 970, they can follow it down, and then the 380X will be released, which will hopefully blow away the 980, and they can sell that one at a premium.


----------



## Fluffmeister (Jan 29, 2015)

rruff said:


> Nah! The 970s are now tainted, so they lower the price a little on the 290 and 290x to get more people to jump from Nvidia to AMD.
> 
> The work has already been done on these cards. Marginal costs are low. What AMD needs is *market share*, and this is a smart move for them. If/when Nvidia cuts the price of 970s they can follow them down, and then the 380x will be released which will hopefully blow away the 980, and they can sell that one for a premium.



I hope you're right, I really do; losing market share and money like AMD has been doing can't be fun for the red team.

Besides, I'm up for a second cheap and "tainted" 970, so keep me posted on any great deals you see. If what you say is true, they won't sell out quickly anyway.


----------



## damric (Jan 29, 2015)

Damn, these 290Xs are tempting at that price, but I think I would need a heftier PSU than just my 550 W Capstone.


----------



## Sasqui (Jan 29, 2015)

krimetal said:


> How many people buy reference cards anyway?



I made that mistake once. That's the one under water.

Really though, the card would throttle even with the fan on full blast! (Running FurMark, mind you.)


----------



## Toothless (Jan 29, 2015)

I'm still wanting two 970's for triple 1080p. Though at this point I might have to go AMD..


----------



## hyp36rmax (Jan 29, 2015)

damric said:


> Damn these 290x are tempting at that price, but I would need a heftier PSU I think than just my 550W Capstone.




You'll be fine as long as you don't go crazy with the voltage.


----------



## HumanSmoke (Jan 29, 2015)

krimetal said:


> How many people buy reference cards anyway?


And yet, in the highest-demand phase of a high-end product's life cycle, the launch period, AMD allows ONLY reference designs.
That's management gold right there!


----------



## rruff (Jan 29, 2015)

damric said:


> Damn these 290x are tempting at that price, but I would need a heftier PSU I think than just my 550W Capstone.





hyp36rmax said:


> You'll be fine as long as you don't go crazy with the voltage.



I agree... it's a decent PSU and will likely have no problem unless the rest of your system is a power hog.


----------



## GreiverBlade (Jan 29, 2015)

damric said:


> Damn these 290x are tempting at that price, but I would need a heftier PSU I think than just my 550W Capstone.





hyp36rmax said:


> You'll be fine as long as you don't go crazy with the voltage.





rruff said:


> I agree... it's a decent PSU and will likely have no problem unless the rest of your system is a power hog.


Agreed too. This is the calculation for my main rig (including the OC on the CPU but not the GPU); http://extreme.outervision.com/PSUEngine has proved pretty reliable for me most of the time:

Minimum PSU Wattage: 478 W

I have a 650 W Bronze Integra R2 (a good cheap PSU that rates almost like a Silver cert), and I OC my 290 to 290X stock levels (with some benchmark runs @ 1150/1500) without any hiccups.

Edit: the total wattage includes 5x 120 mm LED fans, 1x 140 mm LED fan, 1x 60 mm RAM fan, 2x Phobya DC12-220, 1x 30 cm 12 V LED strip, 1 SSD, and 2x 7,200 RPM SATA III HDDs.
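For anyone eyeballing whether a 550 W unit covers a 290X build, the calculator result above is easy to reproduce by hand. A minimal sketch; every wattage here is an assumed nominal figure for illustration, not a measurement:

```python
# Back-of-envelope PSU budget: sum nominal component draws, compare to the rating.
components_w = {
    "gpu_r9_290x": 290,     # assumed board power under gaming load
    "cpu": 120,             # assumed overclocked quad-core
    "motherboard_ram": 50,  # flat allowance
    "drives_fans_misc": 40, # flat allowance for drives, fans, pumps, LEDs
}

total = sum(components_w.values())
headroom = 550 - total      # against the 550 W Capstone mentioned in the thread
print(total, headroom)
```

With roughly 500 W of nominal draw, a 550 W unit leaves only a thin margin at full synthetic load, which lines up with the "fine as long as you don't go crazy with the voltage" advice in the replies.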


----------



## Tonduluboy (Jan 29, 2015)

In my country the MSI Gaming R9 290X has already been selling at US $310 for months.

$1 = RM3.60 today!


----------



## RealNeil (Jan 29, 2015)

Cybrnook2002 said:


> Nice to throw in a punch every now and then
> 
> Excited for the 380X already.....



Agreed,.......Smart move on their part to press any advantage they can.
380X is interesting to me too.


----------



## AsRock (Jan 30, 2015)

Zakin said:


> I think this is more so, AMD knew they had to price drop these half a year ago, and this was just a decent time to do it. Otherwise they've been overpriced for quite some time, enjoy your GPUs that AMD enjoys touting 94 celsius running as a feature.



94°C, yeah, back when they sold them with the stock cooler, and even those could be fixed with some know-how.



Zakin said:


> I would sure hope under water it would be that cool...although I'm surprised I don't recall the XFX doing that decently, still a bit on the high side though. I ditched AMD after two straight generations of dealing with 2-3 months post launch games with no fixes/optimizations, let alone the older games they never would fix. I still have to deal with them on my brother's build unfortunately, typically every other month or so, they've definitely gotten better at least. Still not sure the appeal on the super inefficient chips in the R9 200s though, I'm not a big fan of throwing money at an unfinished engineering project.



Mine's doing perfectly fine. If there were an issue it would have been due to one of the VRMs, but with good case airflow, which you should have in the first place, there is no real issue.


----------



## GreiverBlade (Jan 30, 2015)

AsRock said:


> 94°C, yeah, back when they sold them with the stock cooler, and even those could be fixed with some know-how.
> Mine's doing perfectly fine. If there were an issue it would have been due to one of the VRMs, but with good case airflow, which you should have in the first place, there is no real issue.


So does mine ... 47°C GPU, 41°C VRM when gaming ... (when I used it with the stock reference cooler I had to set the fan to a permanent 65% xD; I reckon the temps were more like 75° and 77° back then). Water is a must for a reference sample (the custom designs were too expensive for me at the time).


----------



## Rowsol (Jan 30, 2015)

I laughed when I saw the picture.  Stay classy, AMD.


----------



## Caring1 (Jan 30, 2015)

Parn said:


> 6.2 billion transistors versus 5.2 billion. That's roughly 20% fewer dies per wafer (and that doesn't even take defects into account). Coupled with the 512-bit bus, the card is quite a bit more expensive to make than the 980.


How do you figure that?
Using tried and tested technology keeps costs lower for a given die size; the GM chip is newer tech and costly to implement, so more bad chips are likely during ramp-up, and costs are higher.


----------



## btarunr (Jan 30, 2015)

Menta said:


> they said no because they have no recall order



That doesn't mean you can't return a product that doesn't work as advertised. Take them to consumer court and get back not just your money but also the legal fees incurred.


----------



## GhostRyder (Jan 30, 2015)

This made me laugh, given how up-front they were with the wording. Nice to see the price drop a little more, though I have seen some at that price already (heck, lower in some cases), and it's going to make some people think harder now, I guess.


----------



## RichF (Jan 30, 2015)

NightOfChrist said:


> There is no price cut here in Japan either. My girlfriend wants to buy one for The Sims 4 and Final Fantasy. Perhaps after a few days?


Tell her to stick with the Sims 3. Sims 4 stinks.


----------



## xfia (Jan 30, 2015)

My sister loves The Sims 4... I think a lot of girls do.


----------



## RichF (Jan 30, 2015)

xfia said:


> My sister loves The Sims 4... I think a lot of girls do.


OK, but it's a worse game. It doesn't have the open world The Sims 3 has, and it was radically stripped of content in order to sell more DLC. It doesn't even have toddlers or pools.

EA didn't even bother to make a 64-bit binary, which shows just how little effort the company thinks it needs to put into a game it expects to extract hundreds of dollars per person from. (Prior versions could run into memory limitations with enough third-party content on top of the tons of expansions, content needed to make the game more interesting.)


----------



## Pumper (Jan 30, 2015)

RejZoR said:


> Hate it when price drops never reach Europe. Still selling them for 400+ EUR...



Well, in Lithuania at least, the Gigabyte 970s are ~€400 (used to be ~350 before the damn euro plummeted against the almighty dollar) while Gigabyte 290Xs already were at €350-370.


----------



## NC37 (Jan 30, 2015)

Who cares? DX12 is coming. Before too long, all these cards will be worthless in value as everyone shifts to 12. Even if games don't support it yet, the desire to get on board with a new standard will be enough. Especially with M$ finally getting off their butt thanks to AMD's Mantle showing them up.


----------



## xfia (Jan 30, 2015)

Microsoft has made DX12 widely compatible with older AMD and Nvidia GPUs. There may be a few features not supported, but everything most important will be anyway.
They have always worked with hardware companies during development to ensure compatibility; it's just smart business.
Hell, if game developers had embraced new DX APIs we could have been on DX14 by now.
I'm sure they felt a little slapped in the face when DX11 was released and game developers continued using DX9.


----------



## Pumper (Jan 30, 2015)

NC37 said:


> Who cares? DX12 is coming. Within not too long, all these cards will be worthless in value as everyone will shift to 12. Even if games don't support it yet, the desire to get on board with a new standard will be enough. Specially with M$ finally getting off their butt thanks to AMD Mantle showing them their butt.



lol, what are you talking about? Every single DX11 card will be able to use DX12; it's just a software update.


----------



## xfia (Jan 30, 2015)

Yeah, DX12 is going to work on the Jaguar APUs in the XB1 and PS4... exactly what that hardware needs.


----------



## Big_Vulture (Jan 30, 2015)

Who would switch to the Radeon, which is slower and draws twice the power? That would be stupid. Maybe $250 would be an acceptable price for the 290X.


----------



## AsRock (Jan 30, 2015)

NC37 said:


> Who cares? DX12 is coming. Before long, all these cards will be worthless as everyone shifts to DX12. Even if games don't support it yet, the desire to get on board with a new standard will be enough. Especially with M$ finally getting off their butt thanks to AMD Mantle showing them their butt.



Some fully support DX12, as I understand it.

http://www.amd.com/en-us/press-releases/Pages/amd-demonstrates-2014mar20.aspx



Big_Vulture said:


> Who would switch to the slower, twice-as-power-hungry Radeon? That would be stupid. Maybe $250 would be an acceptable price for the 290X.



What's that bird that shoves its head in the sand? Oh yes, an ostrich.


----------



## buggalugs (Jan 30, 2015)

Big_Vulture said:


> Who would switch to the slower, twice-as-power-hungry Radeon? That would be stupid. Maybe $250 would be an acceptable price for the 290X.



It isn't slower. I get 980-level performance with my 290X. I get around 15,000 in 3DMark 11 with a 290X and 4790K, and it's completely silent; temps are around 69 degrees at full load...... If any of you guys have a 970, run 3DMark 11 and post your score.

I'm kind of glad I bought my 290X when I did last year, long before the 970/980 came out, because all these cards will be worthless once DX12 cards come out very soon. At least I got a good year out of it. If you just bought a 970/980 when we are so close to Windows 10 and DX12, it's going to be a very poor investment for you. DX12 is a game changer.


----------



## Prima.Vera (Jan 30, 2015)

buggalugs said:


> I'm kind of glad I bought my 290X when I did last year, long before the 970/980 came out, because all these cards will be worthless once DX12 cards come out very soon. At least I got a good year out of it. If you just bought a 970/980 when we are so close to Windows 10 and DX12, it's going to be a very poor investment for you. DX12 is a game changer.



Neh, I don't think so, tbh. I hope D3D12 won't be a "_game changer_" like D3D11 was over D3D9.


----------



## Xzibit (Jan 30, 2015)

Most of the game engines would need a heavy rewrite, and developers are too lazy for that. Twelve years later and we are still getting DX9 game engines. Not to mention how long it's taken to transition to 64-bit. Plus they aren't going to spend the time and money, because bigger texture packs and DLC content are all the rage nowadays.

We'll see DX9 game engines being ported over to DX12, and we'll be lucky to get newer DX10 engines ported by 2016. Maybe Microsoft will pump money into a game to have DX12 native, but it will just be a showcase product. At the earliest, maybe we'll start seeing a handful of games by the 2016 holiday season.

Oops, forgot: DX12 is on XB1, so we are still going to get half-assed ports. They are just going to take less effort to port.

If you get past all that, you still have to deal with Patch-apalooza.


----------



## GreiverBlade (Jan 30, 2015)

NC37 said:


> Who cares? DX12 is coming. Before long, all these cards will be worthless as everyone shifts to DX12.


sweet utopia... [sarcasm]because DX11 was so widely adopted on launch day, and all DX9/10 cards were rendered worthless/obsolete by it[/sarcasm] ... not to mention that DX12 features are not all mandatory; any DX11/11.2 GPU will be able to run games and be usable in W10.

but that's getting off topic for the thread, let's stick to it.


----------



## Recus (Jan 30, 2015)

btarunr said:


> That doesn't mean you can't return a product what doesn't work as advertised. Take them to consumer court, get back not just your money but also legal fees incurred.



So why can't you refund games on Steam? Double standards?


----------



## btarunr (Jan 30, 2015)

Recus said:


> So why can't you refund games on Steam? Double standards?



Because you're not buying games, you're buying a licence to play that game, which is subject to Steam's and the game dev's EULAs.


----------



## 64K (Jan 30, 2015)

btarunr said:


> Because you're not buying games, you're buying a licence to play that game, which is subject to Steam's and the game dev's EULAs.



True. You don't own the games you have on Steam. They can lock you out of your account and games if they want to. It's in the EULA.


----------



## Recus (Jan 30, 2015)

btarunr said:


> Because you're not buying games, you're buying a licence to play that game, which is subject to Steam's and the game dev's EULAs.



But the game is stored on your PC, so technically you bought it. Also, physical copies have to be activated through Steam.


----------



## 64K (Jan 30, 2015)

Recus said:


> But the game is stored on your PC, so technically you bought it. Also, physical copies have to be activated through Steam.



Yes, it's stored on your drive, but to play most games on Steam you have to log in to Steam to start the game. When you click the game icon, that's what happens. If your account is locked (usually for using hacks in MP or a hack to get free games from Steam), then the game sitting on your drive is useless.


----------



## lukart (Jan 30, 2015)

Right now the 290s are a steal. At this price, anyone who goes for the 970 is a 101% Nvidia fanboy. I mean, the 290X obviously has more muscle than the 970, a full 4 GB, better 4K gameplay... any reason left to buy the 970?


----------



## TRWOV (Jan 30, 2015)

lukart said:


> Right now the 290s are a steal. At this price, anyone who goes for the 970 is a 101% Nvidia fanboy. I mean, the 290X obviously has more muscle than the 970, a full 4 GB, better 4K gameplay... any reason left to buy the 970?



_Thermals, loudness and power requirements are some. _Granted, all three are easily dismissible, though. AIB designs take care of the first two, and as for the third, most people already have >500 W PSUs. Power bills would be a factor only if you kept your PC folding 24/7 or something. Also PhysX, but that's more of a footnote than a full reason (haven't seen many new games using PhysX).


----------



## rruff (Jan 30, 2015)

TRWOV said:


> _Thermals, loudness and power requirements are some._



QC, reliability, drivers, features. 

For power consumption, the 290X uses ~8 W more at idle, 43 W more with multi-monitor, 67 W more playing a Blu-ray, and 155 W more at max. Figure ~$1/yr per watt for 24/7 operation. It can be a lot, depending on what you are doing with it.

http://www.techpowerup.com/reviews/ASUS/GTX_970_STRIX_OC/23.html
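For reference, the "$1 per watt per year" rule of thumb is easy to verify; a minimal sketch, assuming a US average rate of about $0.11/kWh (the rate, like any local price, is an assumption):

```python
# Sanity-check the ~$1 per watt-year rule of thumb for 24/7 operation.
PRICE_PER_KWH = 0.11       # assumed US average electricity price, dollars
HOURS_PER_YEAR = 24 * 365  # 8760

def annual_cost(extra_watts: float) -> float:
    """Yearly cost in dollars of drawing `extra_watts` continuously."""
    return extra_watts / 1000 * HOURS_PER_YEAR * PRICE_PER_KWH

print(annual_cost(1))    # ~0.96 dollars per continuous watt-year
print(annual_cost(155))  # the 290X's max-load delta, if somehow sustained 24/7
```

So at ~11 cents/kWh the rule slightly overstates the cost (about $0.96 per watt-year), and the 155 W max-load gap would only approach $150/yr if the card ran flat out around the clock.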


----------



## Sony Xperia S (Jan 30, 2015)

rruff said:


> QC, reliability, drivers, features.



You would believe your rich imagination because you work at Nvidia and are very deep into their world?... :meh:


----------



## AsRock (Jan 30, 2015)

Big_Vulture said:


> Who is changing it to the double power hungry slower Radeon, that is stupid. Maybe $250 would be acceptable for 290X.





TRWOV said:


> _Thermals, loudness and power requirements are some. _Granted, all three are easily dismissible, though. AIB designs take care of the first two, and as for the third, most people already have >500 W PSUs. Power bills would be a factor only if you kept your PC folding 24/7 or something. Also PhysX, but that's more of a footnote than a full reason (haven't seen many new games using PhysX).



Power usage can vary even more, especially if you use vsync.


----------



## Hayder_Master (Jan 30, 2015)

Bwahahaha, the funniest video I've ever seen about the 970.


----------



## LiveOrDie (Jan 30, 2015)

Who cares? The card still runs the same. The 290X is a hair dryer, end of story.


----------



## AsRock (Jan 30, 2015)

Live OR Die said:


> Who cares? The card still runs the same. The 290X is a hair dryer, end of story.



WOW, only half-truths now, you're starting to sound like nVidia.


----------



## rruff (Jan 30, 2015)

Sony Xperia S said:


> You would believe your rich imagination because you work at Nvidia and are very deep into their world?... :meh:



I don't work at Nvidia. And I have no dog in this hunt.

If you want some ideas, look at reviews from owners on both sides and see what they say. Nvidia cards have fewer issues than AMD's.


----------



## Cybrnook2002 (Jan 30, 2015)

Lightning at Newegg is $299 after rebate, that's an awesome deal for a mean overclocker:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127787


----------



## GhostRyder (Jan 30, 2015)

Big_Vulture said:


> Who would switch to the slower, twice-as-power-hungry Radeon? That would be stupid. Maybe $250 would be an acceptable price for the 290X.


The 290X is more powerful than a GTX 970, so that point is invalid... But then again, that comment is obvious troll bait...



rruff said:


> QC, reliability, drivers, features.
> 
> For power consumption, the 290X uses ~8 W more at idle, 43 W more with multi-monitor, 67 W more playing a Blu-ray, and 155 W more at max. Figure ~$1/yr per watt for 24/7 operation. It can be a lot, depending on what you are doing with it.
> 
> http://www.techpowerup.com/reviews/ASUS/GTX_970_STRIX_OC/23.html


QC is about the same, dude, including reliability; the differences posted are normally very small, except in cases involving massive problems with certain specific cards, which is not something seen very often. Drivers are just as fine no matter what brand you're using, so that argument is completely irrelevant, and the same goes for features, because both sides have a counter to each feature within reason.

On top of that power consumption is already proven time and again to be a moot point except in small situations.  In most cases you would have to run a card for such a high amount of time under load through the year to really equate to power differences becoming present on your bill.  On top of that it normally would take years of doing that just to equate to a reasonable price difference between the cards especially when one card is cheaper than the other.  Not to mention you have to include people who use Vsync or similar which alleviates a lot of stress off the GPU and lowers the power usage as well.  The only major concern for power usage would be a PSU for users which a ~500watt is generally what a gamer buys and will run the card so its still a moot point.

Anyway, it's funny AMD is doing this to cash in on people returning the card, with that kind of joke ad. Either way, I'm sure they're going to get some sales at that price, since their cards are still some of the best high-resolution performers out there at the moment. With prices this good on high-end gaming cards, more people can join the fray and get serious gaming hardware for a good price.



Cybrnook2002 said:


> Lightning at Newegg is $299 after rebate, that's an awesome deal for a mean overclocker:
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127787


 
Dang, now I wish I wanted/needed one of those variants.


----------



## Sony Xperia S (Jan 30, 2015)

rruff said:


> Nvidia cards have fewer issues than AMD's.



That's good for the stupid, because they never have to consider that the problem might be in them or in their system.


----------



## rruff (Jan 30, 2015)

GhostRyder said:


> QC is about the same, dude, including reliability; the differences posted are normally very small, except in cases involving massive problems with certain specific cards, which is not something seen very often. Drivers are just as fine no matter what brand you're using, so that argument is completely irrelevant, and the same goes for features, because both sides have a counter to each feature within reason.



You can say that, but have you looked? Do you have evidence? Newegg is probably the best source. Here are the GTX 970 and R9 290x organized by rating: http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709 600536049 600473871 &IsNodeId=1&page=1&bop=And&Order=RATING&PageSize=90

There are six Nvidia cards ahead of any AMD card, and the sole AMD card with more than 10 reviews and a 5-egg rating is an 8GB Sapphire that costs >$400. If I wanted to kill a few hours I could calculate the mean and median rating for all the cards, but at a glance I can see that Nvidia would win.

If you want to go to the bottom of the Maxwell line, the GTX 750 vs the R7 260x, it's even more dramatic. http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709 600487564 600473874&IsNodeId=1&bop=And&Pagesize=90&Page=1

If you can find a case where AMD would beat the competing Nvidia card in Newegg reviews, I'd be interested.




GhostRyder said:


> On top of that power consumption is already proven time and again to be a moot point except in small situations.  In most cases you would have to run a card for such a high amount of time under load through the year to really equate to power differences becoming present on your bill.



It isn't tough to calculate, and I already gave the numbers. If you pay a normal US price for electricity (~11 cents/kWh) it's $1/W/yr continuous. A 290X uses 8 W more at idle and way more than that at other times. For my typical use (computer on 24/7 but heavy card use only a couple hours per day), it would probably amount to around $20-30/yr. Is that a weird case, and is that amount of money trivial? Not to me.


----------



## rruff (Jan 30, 2015)

Sony Xperia S said:


> That's good for the stupid, because they never have to consider that the problem might be in them or in their system.



Why would Nvidia cards be more idiot proof than AMD?


----------



## Sony Xperia S (Jan 30, 2015)

rruff said:


> Why would AMD cards be more idiot proof than nvidia?



Fixed.


----------



## rruff (Jan 30, 2015)

Sony Xperia S said:


> Fixed.



You are contradicting yourself...


----------



## Cybrnook2002 (Jan 30, 2015)

While y'all are bitching, I'm gaming    (at 100+ FPS)


----------



## GhostRyder (Jan 30, 2015)

rruff said:


> You can say that, but have you looked? Do you have evidence? Newegg is probably the best source. Here are the GTX 970 and R9 290x organized by rating: http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709 600536049 600473871 &IsNodeId=1&page=1&bop=And&Order=RATING&PageSize=90
> 
> There are six Nvidia cards ahead of any AMD card, and the sole AMD card with more than 10 reviews and a 5-egg rating is an 8GB Sapphire that costs >$400. If I wanted to kill a few hours I could calculate the mean and median rating for all the cards, but at a glance I can see that Nvidia would win.
> 
> ...


 
So Newegg is a deciding factor, based on product reviews? You do realize anyone can choose to write a review and anyone can choose not to, which basically makes them pointless. Especially when many reviewers are "Anonymous", don't have the "Purchased" symbol next to their name, or post repeat reviews (which you can do, btw). On top of all that, many of those complaints are not even about failing cards; sometimes they are just lesser complaints aimed at a multitude of things (not happy with shipping, game deal/rebate not working, etc.). Retailer site reviews, no matter what they're for, are rarely ever useful, dude...



rruff said:


> It isn't tough to calculate, and I already gave the numbers. If you pay a normal US price for electricity (~11 cents/kWh) it's $1/W/yr continuous. A 290X uses 8 W more at idle and way more than that at other times. For my typical use (computer on 24/7 but heavy card use only a couple hours per day), it would probably amount to around $20-30/yr. Is that a weird case, and is that amount of money trivial? Not to me.


So you are concerned about electricity yet leave your computer on 24/7? That's like saying I'm concerned about my light bulbs using too much electricity, then buying CFL/LED bulbs and leaving them on 24/7. Even so, your calculations have a lot more to factor in than just that, and most reviews of power consumption test gaming stress with no limits on, which does affect power usage. Not really in the mood to debate this, but places like Linus Tech Tips have done similar tests and come to the same conclusion.





Cybrnook2002 said:


> While y'all are bitching, I'm gaming    (at 100+ FPS)


 I wish I could be gaming right now


----------



## rruff (Jan 30, 2015)

GhostRyder said:


> So Newegg is a deciding factor, based on product reviews?



It's imperfect but do you have a better one? Seriously, I want to know if there is a good source of information on QC and reliability? Many of the issues you mention *should* apply to both and would not consistently skew results. The only thing that would is if Nvidia is spamming reviews more than AMD.



GhostRyder said:


> So you are concerned about electricity yet leave your computer on 24/7?



It's working 24/7. It can't work if it's off. And I'm not concerned about electricity, I'm concerned about $. If I'm going to keep a card for 2 years and that is an extra $50 in cost, then it is definitely a factor in comparing price vs performance.


----------



## 64K (Jan 30, 2015)

rruff said:


> It's imperfect but do you have a better one? Seriously, I want to know if there is a good source of information on QC and reliability? Many of the issues you mention *should* apply to both and would not consistently skew results. The only thing that would is if Nvidia is spamming reviews more than AMD.



I wouldn't put too much stock in private reviews. There have been instances on Metacritic for example where employees from a developer went and posted flattering reviews of a game that they had worked on and only focused on the good part of the game and got caught red handed doing it. Then you look at Battlefield and Call of Duty reviews and there's some kind of fanboy feud going on between the two camps for whatever reason and they smear the other camp with negative reviews for each release.


----------



## CrAsHnBuRnXp (Jan 30, 2015)

Eroticus said:


> The reasons are - nvidia has much more fan boys / power effective / 970 is newer product  ( 1.6 years )/ and the main reason - 380x is coming . no point to buy old gen when amd will win like always in next one ....


Like always? When AMD releases a new GPU, they win for a week or two until Nvidia turns around and releases another card to compete with it, and AMD gets knocked back down.



rruff said:


> You are contradicting yourself...


No, he's just being a fanboy. Difference.


----------



## Xzibit (Jan 30, 2015)




----------



## Sony Xperia S (Jan 30, 2015)

rruff said:


> You are contradicting yourself...



No, I am not.

When you say "waterproof", it means that it is resistant and protected. The same with AMD cards: they are resistant to and protected from idiots.


----------



## AsRock (Jan 30, 2015)

GhostRyder said:


> The 290X is more powerful than a GTX 970, so that point is invalid... But then again, that comment is obvious troll bait...
> 
> 
> QC is about the same, dude, including reliability; the differences posted are normally very small, except in cases involving massive problems with certain specific cards, which is not something seen very often. Drivers are just as fine no matter what brand you're using, so that argument is completely irrelevant, and the same goes for features, because both sides have a counter to each feature within reason.
> ...



Why bother? Ignorance is bliss, and it seems he cannot get his head around a few facts.

And WTF is this 24/7 BS? If you're gaming 24/7 you've got other issues that are much more important, so the math is flawed right from the get-go.

Yes, vsync makes a hell of a difference at 60 Hz, which most people are still on, and it's typically best for gaming.


----------



## GhostRyder (Jan 30, 2015)

rruff said:


> It's imperfect but do you have a better one? Seriously, I want to know if there is a good source of information on QC and reliability? Many of the issues you mention *should* apply to both and would not consistently skew results. The only thing that would is if Nvidia is spamming reviews more than AMD.


Sadly there are not that many, but a few are at least acceptable to go off of: for instance, a post on Linus Tech Tips links to a French site that has some decent coverage of that. There is also the Puget Systems link that shows their personal experience with cards from in-house testing and the field. But either way, reviews on manufacturer/retail sites are not useful, because as stated by @64K they are easily skewed by fanboys or people making random complaints. I see plenty of complaints that sometimes come from the same person 3 or 4 times, and even some from people who do not own the card.



rruff said:


> It's working 24/7. It can't work if it's off. And I'm not concerned about electricity, I'm concerned about $. If I'm going to keep a card for 2 years and that is an extra $50 in cost, then it is definitely a factor in comparing price vs performance.


If it costs you an extra $50 in 2 years, that would be a heavy amount of use on the card at a constant rate.  That also assumes usage stays at its peak, and most cards do not stay at peak power draw for very long, except in situations like cryptocurrency mining.



Xzibit said:


>


LOL, are you serious? There's another one of those... Wow.


----------



## rruff (Jan 30, 2015)

GhostRyder said:


> Sadly there are not that many, but a few are at least acceptable to go off of: for instance, a post on Linus Tech Tips links to a French site that has some decent coverage of that.



Thanks for the link. It seems to tell a similar story as the Newegg reviews, with the Nvidia cards getting fewer returns: 


- Radeon HD 7850 : 2,69%
- Radeon HD 7870 : 12,45%
- Radeon HD 7950 : 5,32%
- Radeon HD 7970 : 7,24%

- GeForce GTX 560 Ti : 1,43%
- GeForce GTX 660 Ti : 3,06%
- GeForce GTX 670 : 3,42%
- GeForce GTX 680 : 2,66%

- Radeon HD 7850 : 3,74%
- Radeon HD 7870 : 5,48%
- Radeon HD 7870 XT : 4,25%
- Radeon HD 7950 : 5,75%
- Radeon HD 7970 : 5,31%

- GeForce GTX 660 : 1,01%
- GeForce GTX 660 Ti : 2,81%
- GeForce GTX 670 : 2,87%
- GeForce GTX 680 : 1,99%

The Puget Systems results were very unfavorable to AMD for initial reliability, but that was a small sample size. 
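Averaging the return rates quoted above gives a rough per-vendor comparison (the per-model numbers are the ones listed in this post; the grouping into two survey batches is as listed):

```python
# Mean return rate per vendor for each survey batch listed above.
batch1 = {
    "AMD":    [2.69, 12.45, 5.32, 7.24],
    "Nvidia": [1.43, 3.06, 3.42, 2.66],
}
batch2 = {
    "AMD":    [3.74, 5.48, 4.25, 5.75, 5.31],
    "Nvidia": [1.01, 2.81, 2.87, 1.99],
}

def mean(rates):
    return sum(rates) / len(rates)

for label, batch in (("batch 1", batch1), ("batch 2", batch2)):
    for vendor, rates in batch.items():
        print(f"{label} {vendor}: {mean(rates):.1f}%")
```

That works out to roughly 6.9% vs 2.6% in the first batch and 4.9% vs 2.2% in the second, though an average says nothing about why the cards came back.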



> If it costs you an extra $50 in 2 years, that would be a heavy amount of use on the card at a constant rate.



I don't know if that is heavy, but it's not super light either. There is +8 W at idle. The 290X uses 67 W more just playing a Blu-ray, and ~100 W more in typical gaming. If I gamed 2 hr/day and watched 2 hr/day of video, that would be an average of 14 W, or 22 W total adding the idle consumption, or ~$22/yr. Maybe $50 in two years is a bit much, but it isn't that crazy either.
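Those figures can be plugged straight into a quick sketch (the watt deltas and hours are the ones from this post; the ~$0.11/kWh rate is assumed, and idle here covers only the remaining 20 hours of the day):

```python
# Average extra draw and yearly cost for the usage pattern described above.
PRICE_PER_KWH = 0.11                  # assumed electricity price, dollars
GAME_W, VIDEO_W, IDLE_W = 100, 67, 8  # extra watts vs the GTX 970
GAME_H, VIDEO_H = 2, 2                # hours per day
IDLE_H = 24 - GAME_H - VIDEO_H        # machine stays on the rest of the day

extra_wh_per_day = GAME_H * GAME_W + VIDEO_H * VIDEO_W + IDLE_H * IDLE_W
avg_extra_w = extra_wh_per_day / 24
yearly_cost = extra_wh_per_day / 1000 * 365 * PRICE_PER_KWH

print(f"average extra draw: {avg_extra_w:.1f} W")  # about 20-21 W
print(f"yearly cost: ${yearly_cost:.0f}")          # ~$20
```

That lands around $20/yr, the low end of the $20-30 range above; heavier gaming hours push it up quickly.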


----------



## Digital Dreams (Jan 31, 2015)

haha


----------



## efikkan (Jan 31, 2015)

It's time to stop this nonsense. The performance loss compared to the GTX 980 is negligible, and the GTX 970 still remains the best-value GPU choice. Graphics cards with some slower memory banks are not new; people have long forgotten the GTX 660 Ti. Having 512 MB of slower memory in the GTX 970 might be a small issue for some CUDA uses, but for current games it remains irrelevant. The memory bus is *way too slow* to utilize all the memory within a single frame anyway.

It's laughable that AMD tries to cash in on this fuss. The GTX 970 still has 4 GB of memory and is still a superior choice to the R9 290X. Spreading misinformation about this controversy is quite immoral.
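The bandwidth claim is easy to sanity-check with back-of-the-envelope numbers (using the GTX 970's advertised 224 GB/s peak; real sustained bandwidth is lower, which only strengthens the point):

```python
# Upper bound on how much memory the bus can touch in a single frame.
BANDWIDTH_GB_S = 224.0  # GTX 970 advertised peak memory bandwidth
for fps in (30, 60, 144):
    gb_per_frame = BANDWIDTH_GB_S / fps
    print(f"{fps} fps: at most {gb_per_frame:.2f} GB per frame")
```

At 60 fps the bus can move at most ~3.7 GB in one 16.7 ms frame, so a game cannot stream the full 4 GB every frame regardless of how the memory is partitioned; only at 30 fps and below does the whole pool become reachable within a single frame.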


----------



## xfia (Jan 31, 2015)

efikkan said:


> It's time to stop this nonsense. The performance loss compared to the GTX 980 is negligible, and the GTX 970 still remains the best-value GPU choice. Graphics cards with some slower memory banks are not new; people have long forgotten the GTX 660 Ti. Having 512 MB of slower memory in the GTX 970 might be a small issue for some CUDA uses, but for current games it remains irrelevant. The memory bus is *way too slow* to utilize all the memory within a single frame anyway.
> 
> It's laughable that AMD tries to cash in on this fuss. The GTX 970 still has 4 GB of memory and is still a superior choice to the R9 290X. Spreading misinformation about this controversy is quite immoral.



haha, do you realize it is flawed by basic parallel principles to make a GPU like this? If Intel made their Xeon GPUs like this, they would be garbage. Some CUDA uses? Try most, if not all.
there is no misinformation going around.. it is a piece of shit that people have been complaining about since it launched, and Nvidia ignored them, so they were thinking it was just drivers and SLI performance would improve haha


----------



## buggalugs (Jan 31, 2015)

efikkan said:


> It's time to stop this nonsense. The performance loss compared to the GTX 980 is negligible, and the GTX 970 still remains the best-value GPU choice. Graphics cards with some slower memory banks are not new; people have long forgotten the GTX 660 Ti. Having 512 MB of slower memory in the GTX 970 might be a small issue for some CUDA uses, but for current games it remains irrelevant. The memory bus is *way too slow* to utilize all the memory within a single frame anyway.
> 
> It's laughable that AMD tries to cash in on this fuss. The GTX 970 still has 4 GB of memory and is still a superior choice to the R9 290X. Spreading misinformation about this controversy is quite immoral.



You are completely wrong, on everything, and I still don't understand why some people are defending Nvidia when they have been dishonest. It doesn't matter how good something is; if it is advertised to have something and it doesn't, that's a problem. When they advertised the card as "having the same memory subsystem as the 980" and it doesn't, that's a problem. When Nvidia came clean only after the issue was reported on tech websites 3 months later, that's a problem. You're the one with bad morals if you are defending false advertising.


----------



## efikkan (Jan 31, 2015)

buggalugs said:


> You are completely wrong, on everything, and I still don't understand why some people are defending Nvidia when they have been dishonest. It doesn't matter how good something is; if it is advertised to have something and it doesn't, that's a problem. When they advertised the card as "having the same memory subsystem as the 980" and it doesn't, that's a problem. When Nvidia came clean only after the issue was reported on tech websites 3 months later, that's a problem. You're the one with bad morals if you are defending false advertising.


No one is defending Nvidia for advertising the wrong specs in terms of memory bandwidth and so on; that is their responsibility. But at the end of the day it's not a big enough problem to change the verdict on the product and its market position, and it remains a big PR blunder with minimal impact for actual product owners. Truth be told, almost no one would be able to notice it anyway, but unfortunately every problem will now be blamed on this issue, even though most problems attributed to the memory issue have nothing to do with the slow 512 MB of memory.


----------



## AsRock (Jan 31, 2015)

That link? All I see is a lot of misleading numbers.


rruff said:


> Thanks for the link. It seems to tell a similar story as the Newegg reviews, with the Nvidia cards getting fewer returns:
> 
> 
> - Radeon HD 7850 : 2,69%
> ...




Don't believe everything you read, even more so on a forum, and the 100 W part is total BS, as most games are better played with vsync on, with the odd exception like Watch Dogs.

All the games I play run at 180-240 W total system power, and it idles at 80 W. With vsync off, you are somewhat correct in what you're saying.

Our electric bill has not gone up since I got my 290X clocked at 1050 over the last year I've had it; in fact the 290X draws the same power as my 6970 did, but with higher details in games.


----------



## Lionheart (Jan 31, 2015)

Digital Dreams said:


> haha



You are a legend for posting that, lmfao. That was the funniest shit I've seen in a long time.


----------



## Caring1 (Jan 31, 2015)

Lionheart said:


> You are a legend for posting that, lmfao.


Ummm, NO, he's not. It's been posted about 5 times already in these forums, if you had looked and read them.


----------



## Super XP (Jan 31, 2015)

I would have to agree, the Radeon R9 290(x) takes it home big time.


----------



## Lionheart (Jan 31, 2015)

Caring1 said:


> Ummm, NO, he's not. It's been posted about 5 times already in these forums, if you had looked and read them.


 
Umm, yes, I can have my own goddamn opinion if I like, so go cry somewhere else, lolz


----------



## btarunr (Jan 31, 2015)

Recus said:


> But the game is stored on your PC, so technically you bought it. Also, physical copies have to be activated through Steam.



Technically you don't buy the game. You buy the license (a permission) to play the game. The physical copy is a piece of plastic containing the software. You pay money for the license. Just because the software is on your hard drive doesn't mean you bought it.

Look at it this way. You don't buy a passport from your government, you apply for it, you pay the required fees, and then they give you a passport TO HOLD. Your government still OWNS your passport. Same with credit cards. Your bank OWNS your credit card. When you buy games on Steam, or buy a physical copy, and you have it installed, you're HOLDING the software, along with a LICENSE to use it. You don't own the game, even if it came in a $200 collector's edition set with a gold disc, sitting on a satin pillow, in an expensive wood box.

That's why you can't compare Steam purchases with graphics card purchases. A graphics card is a tangible commodity that isn't subject to any EULA. You buy it, and then you can use it to play games, watch videos, create CGI, or use it as a paperweight (like W1zzard does).


----------



## RejZoR (Jan 31, 2015)

rruff said:


> Thanks for the link. It seems to tell a similar story as the Newegg reviews, with the Nvidia cards getting fewer returns:
> 
> 
> - Radeon HD 7850 : 2,69%
> ...



I'd take these statistics with a pinch of salt without knowing why users actually returned them. I returned my HD 7870 Toxic as well, because it was loud as hell despite their famous Vapor-X cooler (it could have been defective, I don't know), and bought an HD 7950 afterwards. Meaning the card wasn't returned because the chip's performance was flawed (it was in fact really fast); it was because Sapphire somehow managed to fuck up the cooling on that particular model. The current HD 7950 WindForce 3X is by far the best card I've owned in years, despite also having a few minor issues.


----------



## Devon68 (Jan 31, 2015)

This is what makes me kind of mad and sad at the same time:
One card costs $300; converted into my country's currency that's:
$300 = 25,560.06 RSD
25,560.06 RSD x 2 = 51,120.12 RSD

and this is how much one card costs in my country:
http://www.winwin.rs/racunari-i-kom...vi-hdmi-dp-512bit-gv-r929xoc-4gd-1189519.html

BTW, the price in red is the price if you order it online;
the green is the price if you want to buy it from the shop.


----------



## 64K (Jan 31, 2015)

Devon68 said:


> This is what makes me kind of mad and sad at the same time:
> One card costs $300; converted into my country's currency that's:
> $300 = 25,560.06 RSD
> 25,560.06 RSD x 2 = 51,120.12 RSD
> ...



Yikes, the online price is 52,000 din. That's about $476 US. I doubt that the difference is solely due to an import tariff. I would be mad too.


----------



## rruff (Jan 31, 2015)

64K said:


> Yikes, the online price is 52,000 din. That's about $476 US. I doubt that the difference is solely due to an import tariff. I would be mad too.



Weird thing is, being in the US I can buy stuff from overseas, no matter the country, and I don't pay any tariff or extra fee. But I also sell stuff overseas, and *in every case the country I ship to charges a hefty tariff or tax*, like 30% or so, even shipping to poor countries. I thought "free trade" was supposed to go both ways?


----------



## Hyphen (Jan 31, 2015)

Today's "Daily Deal Slasher" over at TigerDirect is this card at $300.


----------



## rruff (Jan 31, 2015)

RejZoR said:


> Meaning the card wasn't returned because the chip performance was flawed (it was in fact really fast), it was because Sapphire somehow managed to fuck up the cooling on that particular model.



Those sorts of defects are not specific to AMD, but there seems to be a consistent trend of defects of some sort being higher with AMD cards. Maybe driver related?


----------



## TRWOV (Jan 31, 2015)

RejZoR said:


> I'd take these statistics with a pinch of salt without knowing why users actually returned them. I've returned my HD7870 Toxic as well, because it was loud as hell despite their famous VaporX cooler (could be defective, I don't know). And bought a HD7950 afterwards. Meaning the card wasn't returned because the chip performance was flawed (it was in fact really fast), it was because Sapphire somehow managed to fuck up the cooling on that particular model. The current HD7950 WindForce3X is by far the best card I've owned in years despite also having few minor issues.



Also remember that AMD has a smaller user base so 1 AMD card accounts for a higher percentage than 1 nVidia card.


----------



## rruff (Jan 31, 2015)

TRWOV said:


> Also remember that AMD has a smaller user base so 1 AMD card accounts for a higher percentage than 1 nVidia card.



Been looking at percentages.


----------



## GhostRyder (Jan 31, 2015)

rruff said:


> Thanks for the link. It seems to tell a similar story as the Newegg reviews, with the Nvidia cards getting fewer returns:
> 
> 
> - Radeon HD 7850 : 2,69%
> ...


You're talking about a very small difference in returns between the two, with one exception in that test. It's not enough to really count as a severe difference in reliability, apart from that one outlier in the chart (and you can find similar outliers going back through different models on both sides). The problem with most failure stats, as others have said, is that it depends on why the cards were returned, because it could be cooler malfunctions or many other reasons, including things that are not the card's fault. I have met many people who return cards as faulty for random reasons when it turns out to be their own fault, and those returns get counted in many statistics as faulty because the card has to go through a testing/refurb process before it can be sold or used again. Both sides get those problems, but AMD tends to take the brunt of it because many people blame AMD/the card first before checking for other issues, since that is how they get labeled on the internet. You see a dozen posts about a certain problem with a card, so that must be what's causing your issues, right? The same thing will unfortunately happen with the GTX 970, where the VRAM problem will be associated from this point on with every problem a user faces, which in many cases will not be the cause.

As far as wattage goes, $22 a year is still asking a lot as a power difference between cards. I can tell you I have 3 R9 290X cards, I game at 4K pretty often when I am not working (including BF4 and LoL, though LoL only uses 1 card), and my electric bill has not gone up, at least not that I have noticed compared to my last couple of builds. Then again I have had dual HD 6990s, 2 GTX 580s, 2 GTX 460s, 2x GTX 295 (dual-PCB variants), etc., so maybe I am a bad matchup for that. Generally, while one computer can use significantly more power than another under load, those loads fluctuate a lot, and even though the numbers sound high, in the grand scheme compared to most other household appliances (even just lighting) it really is not as much as you would think.



TRWOV said:


> Also remember that AMD has a smaller user base so 1 AMD card accounts for a higher percentage than 1 nVidia card.


But there are a lot more OEM machines with NVIDIA inside than AMD, which is where a lot more of those numbers tend to come from, especially in the mobile market. Also, when a machine is returned for a problem, even if it is the GPU, those returns often don't show up in most of these polls when the manufacturer finds a bad GPU.


----------



## efikkan (Jan 31, 2015)

*Let's look at the simple facts here for a moment:*
GTX 970 has 13 SMMs with a computational power of 3494 GFlop/s (75.76% of GTX 980), with a theoretical memory bandwidth of 196 GB/s vs. 224 GB/s (87.5%). When a GPU accesses a single tiny block of memory, it will never read it at 224 GB/s. A single block will be placed in one of the eight memory chips, where each one is accessible at 28 GB/s (on separate 32-bit buses). Let's say you have a hypothetical GPU with 32 SMMs and a 512-bit memory bus (total); loading the same single block of memory from one SMM will still be just as slow as on the GTX 970. The reason for this is that the GPU allocates a single block on one memory controller. If you think about it for a moment, you'll realize why every single SMM can't load from memory at 224 GB/s: that would make the GPU extremely complex and defeat the purpose of a GPU processing different data as efficiently as possible.

When looking back on GTX 970 and GTX 980, simple maths shows the GTX 970 has more memory bandwidth per GFlop (0.056 vs. 0.049 bytes per flop), meaning in a gaming setting where each frame time is limited, each SMM has more memory bandwidth at its disposal than on a GTX 980. Even though each GPU can store 4 GB in memory, no game will ever come close to using all of that in a single frame rendering, and typical load is generally around 1.5 GB per frame or lower. So provided that the GPU/driver is smart enough to place data appropriately in the two memory pools, the last slow 512 MB can be completely transparent to the end user. Heck, even if the GTX 970 had only 3 GB of fast memory this could still be achieved. Given that the GTX 970 has more memory bandwidth per SMM than the GTX 980, *GTX 970 is still less bottlenecked by the memory bus in a gaming setting*. This is why the slowdown for GTX 970 vs. GTX 980 is negligible, and the slowdown we can see has more to do with fewer SMMs than with the memory bus.

When using the GPU for computing (CUDA or OpenCL), the consequences of the slower memory might be a bit different, in specific situations where the program accesses memory randomly across all of GPU memory.

But for gaming, the *GTX 970 remains just as good a choice as it was two weeks ago*. If it weren't for the specific compute situations, no one would probably ever have noticed the slowdown. And any minor issues resulting from this memory configuration can likely be fixed in software.
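The bandwidth-per-flop arithmetic above is easy to sanity-check. A minimal Python sketch, using only the figures quoted in this post (the GTX 980's GFlop/s value is inferred from the stated 75.76% ratio, not taken from a spec sheet):

```python
# Sanity-check of the bandwidth-per-flop ratios quoted above.
# All inputs come from the post itself; GTX 980 GFlop/s is derived
# from "3494 GFlop/s (75.76% of GTX 980)".
gtx970_gflops = 3494.0
gtx980_gflops = gtx970_gflops / 0.7576        # ~4612 GFlop/s

gtx970_bw = 196.0   # GB/s (fast 3.5 GB partition: 7 chips x 28 GB/s)
gtx980_bw = 224.0   # GB/s (8 chips x 28 GB/s)

ratio_970 = gtx970_bw / gtx970_gflops  # bytes available per flop
ratio_980 = gtx980_bw / gtx980_gflops

print(round(ratio_970, 3), round(ratio_980, 3))  # 0.056 0.049
```

So per unit of compute, the 970's fast partition actually offers slightly more bandwidth than the 980's full bus, which is the core of the argument.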


----------



## newtekie1 (Jan 31, 2015)

LOL


----------



## The N (Jan 31, 2015)

AMD wants to get in between NVIDIA and NVIDIA's users so they can get something out of it, with effective, meaningful advertising. That's called strategy. An effective move, to some extent.


----------



## dj-electric (Jan 31, 2015)

newtekie1 said:


> LOL



Yup. Hypocrisy at its very best. The text there should be much bigger


----------



## Jurassic1024 (Jan 31, 2015)

Jorge said:


> For those who don't know, top-of-the-line CPUs and GPUs are cash cows with huge margins compared to mainstream models. Only enthusiasts  or Biz buy the very high end, high margin products. Lowering the price slightly is a no brainer and will more than be offset by the increased volume. It's not difficult to do the math when you know the margins and volume.



Cash cows? Keep dreaming. The greater chunk of revenue comes from low and midrange. EVERYONE knows this but you?!

ie: Have you seen a Steam Hardware Survey... ever?!


----------



## R-T-B (Jan 31, 2015)

Jurassic1024 said:


> Cash cows? Keep dreaming. The greater chunk of revenue comes from low and midrange. EVERYONE knows this but you?!
> 
> ie: Have you seen a Steam Hardware Survey... ever?!




He was speaking in a per-GPU context. Completely different.


----------



## SK-1 (Feb 1, 2015)

Dang.... I just saw one for $229.00 after discount. I spent almost as much on an R9 270X less than a year ago


----------



## HumanSmoke (Feb 1, 2015)

RejZoR said:


> I'd take these statistics with a pinch of salt without knowing why users actually returned them. I've returned my HD7870 Toxic as well, because it was loud as hell despite their famous VaporX cooler (could be defective, I don't know). And bought a HD7950 afterwards. Meaning the card wasn't returned because the chip performance was flawed (it was in fact really fast), it was because Sapphire somehow managed to fuck up the cooling on that particular model. The current HD7950 WindForce3X is by far the best card I've owned in years despite also having few minor issues.


The actual answer in many cases, especially Sapphire, lies in the model number. A high number of the Sapphire cards being returned are reduced-BoM models, where Sapphire has done stuff like putting the Vapor-X shroud over some pretty basic componentry (coil whine seems a large factor in some of the SKUs). A keen eye would see that many of their reduced-BoM cards feature the blue PCB, while the premium cards often sport the black PCB.
Anyhow, returns usually fall under two categories - failure or customer dissatisfaction. Whichever the reason, the numbers are pertinent to those who might purchase.
As an aside, those numbers are for an older timeframe. I actually aggregated the latest numbers from Hardware France in a post a week ago (I won't try to C&P here).


----------



## RejZoR (Feb 1, 2015)

Interesting. I've never had any graphics card fail on me (neither AMD/ATi nor NVIDIA based), if I exclude the HD5850 which I fried myself by fiddling with the cooling system. And then I at least had an excuse to buy an HD6950


----------



## HumanSmoke (Feb 1, 2015)

RejZoR said:


> Interesting. I've never had any graphic card fail on me (neither AMD/ATi or NVIDIA based) if I exclude the HD5850 which i fried myself by fiddling with the cooling system. And then I at least had an excuse to buy HD6950


I had five HD 5850's. Started with two XFX Black Editions (returned one because of the GSoD issue, the other wouldn't hold the default OC), went with the three Sapphire Toxic 2GB - all three are still running strong today with their 2nd owners. Also had fails from a reference (XFX) 5970 which was a basket case from day one, as were two BFG GTX 280 H2OC's which hadn't been binned for the overclock they were running (a common BFG failing).
Of the machines I've been called on to diagnose and repair, most faults are from user error - assuming five minutes gaming is sufficient testing for a maxed out overclock. The largest card issues were from HD 4870's and 4890's, followed by GTX 280's running too high a clock, and those single slot 8800GT's with the cheapo 7-blade blower fan.


----------



## RejZoR (Feb 1, 2015)

I've had a GeForce 7600GT fail, but that was after I sold it to someone else. He called me to say the graphics card didn't work (and I know it worked fine before I took it out of my PC). Luckily it was still in warranty when I sold it, so he got a new one. I guess it was an issue with static electricity or something else that person messed up somehow. So, technically it wasn't NVIDIA's fault here either.


----------



## AsRock (Feb 1, 2015)

RejZoR said:


> I've had a GeForce 7600GT to fail, but that was after I sold it to someone else.  He called me that gfx card doesn't work (and I know it worked fine before I took it out of my PC). Luckily it was still in warranty when I sold it so he got a new one. I guess it was an issue with static electricity or something else that person messed up somehow. So, technically it wasn't NVIDIA's fault here either.



Sometimes just replacing the user will solve the issue





Anyways, I have had 2 go wrong on me over the last 25 years. One was a 2900XT, but that was my fault as I forgot to plug in the fan that cooled the top-side RAM chips, which were overclocked as high as I could get them.

The other was an XFX 4890; when the cooler was put back on, it would make an odd sound, so it was sent back to XFX and they replaced it with a newer-series card and sent it directly to its new user.


----------



## rruff (Feb 1, 2015)

HumanSmoke said:


> I actually aggregated the latest numbers from Hardware France in a post a week ago.



http://www.hardware.fr/articles/927-5/cartes-graphiques.html

Good stuff. Definitely higher return rates for AMD cards.


----------



## AsRock (Feb 1, 2015)

rruff said:


> http://www.hardware.fr/articles/927-5/cartes-graphiques.html
> 
> Good stuff. Definitely higher return rates for AMD cards.



So they say; I have returned more NVIDIA cards than I have AMD. I had hell with the NVIDIA 7xxx range, never mind the 4x0 range, where on one the heatsink fell off.

It wouldn't stop me from buying one though. I think it's more like hard drives: some brands you have good luck with, others not so much. I've had my fair share of Seagate and WD drives; Seagate has always given me issues, whereas someone else had bad luck with WDs.


----------



## Ferrum Master (Feb 1, 2015)

What are those numbers? If only two 7970s are sold and one is returned, the return rate becomes 50%. Give me a break, really.

There are no known manufacturing errors for either camp, and any responsibility should go to the AIB partners anyway.


----------



## GhostRyder (Feb 1, 2015)

Dj-ElectriC said:


> Yup. Hypocrisy at its very best. The text there should be much bigger


Maybe if it was true?



rruff said:


> http://www.hardware.fr/articles/927-5/cartes-graphiques.html
> 
> Good stuff. Definitely higher return rates for AMD cards.


Return rates have too much to account for, as I said when I posted those links to you. The problem is that, for starters, people can return a card for ANY issue and it gets counted; on top of that, even if we look at those numbers, the difference is very small except in one case. The only real reason to check reviews/return rates on a card is if you see an abnormally high number on one specific card, because that can indicate a real problem. Neither camp has a problem with return rates, and neither is really better than the other at quality control. We will probably see a slightly higher than normal return rate for the GTX 970, and we all know the reason for that; it does not mean NVIDIA's quality control was horrible or anything.

I have had 2 cards fail on me in my life and both were NVIDIA (1 9800GT and 1 9800GX2), but that does not say much as I have bought mostly NVIDIA from the beginning, so it's not a good comparison. Any card, unless there is a specific design flaw, can fail at a similar rate, and the only thing that really matters when comparing is abnormally high rates.


----------



## HumanSmoke (Feb 1, 2015)

Ferrum Master said:


> What are those numbers? If only two 7970 are sold and one returned then RR rate becomes 50%, give me a break really.


If you read the introduction to the articles, you'll find that the criteria for inclusion are a sample size of a minimum of 500 units for a vendor model, and 100 units for an individual AIB/AIC SKU. Further note that for any AIB/AIC SKU where the sample size is under 200 units, the entry in the list is _italicised_.


Ferrum Master said:


> There are not known manufacturing errors for both camps, and any responsibility should go to AIB partners anyway.


Why not the card manufacturer? All of AMD's reference cards are made by PC Partner (Sapphire's parent company), who also produce all the cards for Sapphire, Zotac, Inno3D, ELSA, Leadtek, Manli, Zogis, and Point of View, plus PNY's reference cards. How does the AIB adding a sticker absolve the manufacturer of blame?

Customer dissatisfaction (noise, cooling, insufficient clock-speed binning, etc.): ODM/AIB/AIC issue, or OEM issue if reference design.
Manufacturing defect: board manufacturer (ODM) for individual SKUs, or OEM if the reference design is at fault.


----------



## perryra1968 (Feb 1, 2015)

Sasqui said:


> my XFX 290x DC doesn't get over 78c, my 290x reference under H20 doesn't break 40c.
> 
> Reference cooler?  Yea, that's another story, lol


Well, my reference designed Sapphire still runs under 70c after 4 hours of BF4.  Idle has been 36c. No special cooling or anything. I never understood what this heat factor is that everyone continues to talk about.  I couldn't be happier with this card.


----------



## eidairaman1 (Feb 1, 2015)

perryra1968 said:


> Well, my reference designed Sapphire still runs under 70c after 4 hours of BF4.  Idle has been 36c. No special cooling or anything. I never understood what this heat factor is that everyone continues to talk about.  I couldn't be happier with this card.



thats all that matters.

I have a VaporX, no need to try overclocking it either


----------



## HumanSmoke (Feb 1, 2015)

perryra1968 said:


> Well, my reference designed Sapphire still runs under 70c after 4 hours of BF4.  Idle has been 36c. No special cooling or anything. I never understood what this heat factor is that everyone continues to talk about.  I couldn't be happier with this card.


A reference 290X that doesn't exceed 70C under extended full 3D load! I'm guessing you're either deaf, or this is your house.


----------



## rruff (Feb 1, 2015)

HumanSmoke said:


> A reference 290X that doesn't exceed 70C under extended full 3D load!



I'm guessing that it *isn't* full 3D load. Throttled by the CPU or frame limiting, or something...


----------



## newtekie1 (Feb 1, 2015)

rruff said:


> I'm guessing that it *isn't* full 3D load. Throttled by the CPU or frame limiting, or something...


That'd be my guess too.  A 290X won't heat up and throttle if you're playing 1080p@60Hz because that isn't pushing the card.

Edit: According to his specs he has a 1920x1200 60Hz monitor.  Close enough to 1080p.  And it definitely won't push a 290X to its limits.  But you start running it at 1440p with MSAAx8 and peg the GPU usage to 100% in games, and I guarantee it'll throttle.



HumanSmoke said:


> A reference 290X that doesn't exceed 70C under extended full 3D load! I'm guessing you're either deaf, or this is your house.



Deaf might be a possibility.  Apparently when W1z was testing the reference 290X, his neighbors actually complained about the noise!  His neighbors!!!


----------



## TRWOV (Feb 1, 2015)

To be fair, the case might play into that. When I had my reference 7970 in a Cosmos Pure I barely heard it, but it sounded like a jet engine in a Source 210.

I don't know how soundproof his Antec 902 is though.


----------



## ManofGod (Feb 1, 2015)

newtekie1 said:


> LOL



Odd, I have never had that issue with a reference design. That is even when running VSR resolutions up to 3200 x ??? (I cannot recall because I am not in front of my computer). Oh well, enjoy spreading your FUD; I will enjoy playing on my quiet, fast computer.


----------



## ManofGod (Feb 2, 2015)

I read, which is why I responded as I did. I have a reference R9 290X and do not have any throttling issues, and it is quiet. Of course, perhaps I know what I am doing and that is why I do not have the issue he described in his mocking post?

Oh well, I have no issue with folks using what they want. However, FUD can be stupid.


----------



## Ferrum Master (Feb 2, 2015)

HumanSmoke said:


> If you read the introduction to the articles, you'll find that that the criteria for inclusion is a sample size of a minimum 500 units for a vendor model, and 100 units for an individual AIB/AIC SKU. Further note that for any AIB/AIC SKU where the sample size is under 200 units, the entry in the list is _italicised_.



Italicised under 200? It still gives quite a crippled number on a 100% scale, really, even if one sample is 500 and another is 1000.



HumanSmoke said:


> Sapphire, Zotac, Inno3D, ELSA, Leadtek, Manli, PNY's reference cards, Zogis, and Point of View



Of those, on this side of the pond only Sapphires are available here... ELSA is Japan-only now, as far as I know. Asus has the largest sales numbers, then MSI and Gigabyte; XFX cards are also not available on our side without eBay, so it's tough math really. So each of them should be analyzed in more detail, I agree. And telling apart a problem between the chair and the keyboard from a real hardware fault... the data doesn't give good info there either. For example, driver issues: 344.11 had BSODs on certain motherboards, and thus the RMA rate artificially increases. Some Catalysts always have a lot of them; it's just a part of daily life.



HumanSmoke said:


> A reference 290X that doesn't exceed 70C under extended full 3D load! I'm guessing you're either deaf, or this is your house.



That thing can't hold anything less than those 95°C even when running a dual-monitor desktop, not to mention 3D (throttle) mode.


----------



## HumanSmoke (Feb 2, 2015)

Ferrum Master said:


> Italicized under 200? It still give quite crippled number on 100% scale really even if it is a 500 and one is 1000.


Technically the disparity isn't that great. The minimum number of returns to make the list is 100 (which would equate to ~2000 cards of that individual model sold on average over 6 months). The return percentage is calculated by the model number sold - not overall sales of all cards, so the percentages are reasonably valid IMO.


Ferrum Master said:


> From those at this side of the pond only Sapphires are here... ELSA is Nippon banzai only now as far  I know. Asus got the largest sale number, then MSI and Gigabyte, XFX's also are not available on our side, without ebay, so a tough math really.


Well, the return rates are from a French hardware outlet. Point of View is a European brand, so it shouldn't be a surprise that they are represented across Europe (as are many brands). I'm not a big frequenter of French computer stores, but this one stocks a number of Zotacs, and a whole range of PNY workstation cards (unsurprisingly, since PNY is based in France).


Ferrum Master said:


> So each of them should be analyzed with more details, I agree. The problem in between the chair and the keyboard and a real hardware fault, although... it also does not give a good info too...  for example driver issues... like 344.11 had bsods on certain motherboards, and thus RMA rate artificially increases. Some Cats have them a lot always, it just a part of a daily life .


Yes, for sure there are many different reasons for returns that don't necessarily reflect the workmanship or design of the card. No doubt there are even cases of buyer's remorse, and maybe people hoping to return the card but keep any bundled games (not sure how that works out, since some etailers will bill the user for a used product key). As others have noted, there are likely also returns because of hard use in alt-currency mining.

It isn't any kind of absolute proof, but it does represent a reasonable indicator (also, IMO).


----------



## Sasqui (Feb 2, 2015)

perryra1968 said:


> Well, my reference designed Sapphire still runs under 70c after 4 hours of BF4.  Idle has been 36c. No special cooling or anything. I never understood what this heat factor is that everyone continues to talk about.  I couldn't be happier with this card.



LOL, I don't care who you are, that's funny right there.


----------



## newtekie1 (Feb 2, 2015)

ManofGod said:


> Odd, I have never had that issue with a reference design. That is even when running VSR resolutions up to 3200 x ??? I cannot recall because I am not in front of my computer. Oh well, enjoy spreading your FUD, I will enjoy playing my quiet, fast computer.



Yeah right.  The reference 290X isn't quiet; pretty much everyone knows that for a fact. And W1z even had throttling issues on his open test bench. So yeah, _I'm_ the one spreading FUD...


----------



## ManofGod (Feb 2, 2015)

newtekie1 said:


> Yeah right.  The reference 290X isn't quiet, pretty much everyone know that is fact.  And W1z even had throttling issues on his open testbench.  So yeah, _I'm_ the one spreading FUD...



Yes, you are, thank you for agreeing.


----------



## 64K (Feb 2, 2015)

From W1zzards review of the reference R9 290X


"Idle noise levels are decent, almost quiet. You can barely hear the card when it is installed in a case.

The picture changes completely once you fire up a gaming session. AMD's fan will ramp up very quickly to cope with skyrocketing temperatures. Enable "Quiet" BIOS, which limits fan speed to a maximum of 2000 RPM, and the card will run into its 94°C temperature limit after only a few minutes, which results in lower clocks and performance (to stay below 94°C). The "Quiet" BIOS will not deaden down the card, but its noise levels can be tolerated.

Using the "Uber" BIOS results in RPM limitations falling away and the fan spinning up as fast as it has to in order for the card to stay below its temperature target (94°C by default). You will hear nothing but the card's fan noise, which makes hearing enemy footsteps or similar sound effects impossible unless you play with headphones.

Overall, I am disappointed by the acoustic experience the R9 290X provides. AMD should have invested some time into developing a good cooler, like NVIDIA did with the GTX Titan."

http://www.techpowerup.com/reviews/AMD/R9_290X/26.html


----------



## ManofGod (Feb 2, 2015)

64K said:


> From W1zzards review of the reference R9 290X
> "Idle noise levels are decent, almost quiet. You can barely hear the card when it is installed in a case.
> The picture changes completely once you fire up a gaming session. AMD's fan will ramp up very quickly to cope with skyrocketing temperatures. Enable "Quiet" BIOS, which limits fan speed to a maximum of 2000 RPM, and the card will run into its 94°C temperature limit after only a few minutes, which results in lower clocks and performance (to stay below 94°C). The "Quiet" BIOS will not deaden down the card, but its noise levels can be tolerated.
> Using the "Uber" BIOS results in RPM limitations falling away and the fan spinning up as fast as it has to in order for the card to stay below its temperature target (94°C by default). You will hear nothing but the card's fan noise, which makes hearing enemy footsteps or similar sound effects impossible unless you play with headphones.
> ...


That's nice, but I have not had those experiences. If you have it on an open bench with your ear less than a foot from it, of course it will seem noisy to you. However, if you build it into a quiet system, you will not hear it and it does not throttle. (Mine does not throttle and I do not hear it while I am gaming.) Therefore, either his experience is based on that open-bench operation or I am lying. I know I am not lying, but hey, if you do not believe me, that is fine; I will still continue to use my quiet, non-throttling R9 290X and you use what you want to.
Remember, I built my system with a case that is designed for quietness which makes a difference.


----------



## GhostRyder (Feb 2, 2015)

newtekie1 said:


> Yeah right.  The reference 290X isn't quiet, pretty much everyone know that is fact.  And W1z even had throttling issues on his open testbench.  So yeah, _I'm_ the one spreading FUD...


 


64K said:


> From W1zzards review of the reference R9 290X
> 
> 
> "Idle noise levels are decent, almost quiet. You can barely hear the card when it is installed in a case.
> ...


I own (or owned) 3 R9 290X cards with the reference cooler. They are not the loudest cards I have ever owned, nor were they intolerable under "Uber" settings while testing (I never actually gamed on them much with the reference cooler; I did a couple of passes of Heaven and some FurMark to heat them up). The cooler was enough to manage the cards without any throttling, *except* when two of them were side by side in my test rig in an Antec Lanboy Air, where the top one would drop to about ~900 MHz under extended benching unless I routed the two side-panel fans to feed them fresh air. Inside my Corsair Obsidian 800D for testing they were not bad, actually quieter than a few cards I have owned in the past, and I managed to keep them below 90°C leaving the fan on auto in "Uber" mode. Bumping the fan speed up 5% more dropped the temps down to ~83°C, and while the cards were noticeable during benching, they were not the worst, nor anything I couldn't deal with for just 1 or maybe 2 cards. Though I almost always go liquid cooling, and had chosen to regardless, mostly because I like to have more overclocking headroom and I had 3 cards. To be fair, my 800D is a pretty hefty, thick case compared to my past cases.

Y'all have to realize open-air test benches tend to be a lot louder and showcase a card's noise a lot more than cases do, even cases like the Lanboy which are pretty much open air. While an open-air test bench may give a card a little more access to cold air for reduced temps, a good fan blowing intake air will help keep temps down in a case, and depending on the case and how thick it is, it will dampen noise pretty easily. That is how most people achieve decent noise and temps; I know many people who have 290s/290Xs with stock coolers and are not bothered by them. The cooler was nothing special by any means, but it was not horrendous, nor was it unbearable in a normal gaming rig as long as you were not intentionally trying to starve it of air.

If y'all want to talk about really unbearable cards that literally never were quiet, we can talk about my HD 6990s and my GTX 295s (both of the 295s were the dual-PCB variants). It took a lot of work to make those a bit tolerable on their air coolers, and both ended up in quad liquid configurations.

To finish this off: why is this now the central focus of the discussion? Aftermarket coolers are more available now than the stock cooler anyway, so why is this even a discussion at this point? Even if we still want to make a point about stock cooling, AMD apparently heard everyone's cry for blood and is working on an AIO for their next generation, so I guess we will find out then what happens with stock coolers.


----------



## AsRock (Feb 2, 2015)

GhostRyder said:


> I own 3 R9 290X cards that have the reference cooler (Or had).  They are not the loudest cards I have ever owned nor were they intolerable under uber settings while testing (I never actually gamed on them with the reference cooler much, I did a couple passives of heaven and some furmark to heat them up).  The cooler was enough to manage the cards without any throttling *Except* when two of them were side by side in my test rig that is in an Antec Lanboy Air which the top one would drop to about ~900 under extended benching unless I routed the two side panel fans with fresh air towards them.  Inside my Corsair Obsidian 800D for testing they were not bad and actually quieter then a few cards I have owned in the past and I managed to keep them below 90 leaving the fan on auto "uber".  Bumping the fan speed up 5% more dropped the temps down to 83 and while the cards were noticeable during benching they were not the worst nor were even something I would consider to the point of being something I could not deal with just 1 or maybe 2 cards.  Though I almost always go liquid cooling and had chosen to no matter what mostly because I like to have more overclocking headroom and I had 3 cards.  Though to be fair my 800D is a pretty hefty thick case compared to my past cases.
> 
> Y'all have to realize open air test benches tend to be a lot louder and showcase a cards noise a lot more than cases even when they are like the Lanboy which is pretty much open air.  While open air testing benches may give it a little more access to cold air for reduced temps, a good fan blowing some air from an intake will help keep temps down in a case and depending on the case and how thick it is will alleviate noise pretty easily which is how most people achieve decent noise and temps as I know many people who have 290's/290X's with stock coolers and are not bothered by them.  The cooler was nothing special by any means however it was not horrendous nor was it unbearable in a normal gaming rig as long as you were not intentionally trying to starve it from air.
> 
> ...



Yeah, complaining about an issue which is a non-issue is dumb to say the least. NV fans are having a go at AMD when really they should be having a go at each other about the current 970 issue, or non-issue.


----------



## 64K (Feb 2, 2015)

I don't think the reference R9 290X is relevant anymore, but someone accused another person of spreading FUD, which he wasn't doing. Both Nvidia and AMD have had noisy reference coolers in the past. The GTX 480 comes to mind.


----------



## newtekie1 (Feb 2, 2015)

ManofGod said:


> That's nice, but I have not had those experiences. However, if you have it on an open bench with your ear less than a foot from it, of course it will seem noisy to you. If you build it into a quiet system, you will not hear it and it does not throttle. (Mine does not throttle, and I do not hear it while I am gaming.) Therefore, either his experience is based on that open-bench operation or I am lying. I know I am not lying, but hey, if you do not believe me, that is fine; I will continue to use my quiet, non-throttling R9 290X and you use what you want to.
> Remember, I built my system with a case that is designed for quietness which makes a difference.


I'd believe him more than I do some random unknown user on a forum.

Also from his review of the 290X:



> Even at 100%, it could barely keep the card from overheating and was noisier than any cooler I've ever experienced. *My neighbors actually complained*, asking why I used power tools that late at night.



You can put all the soundproofing you want in the case; if the card is loud enough that his neighbor could hear it, open bench or not, the card is insanely loud!

And furthermore, an open bench generally means much cooler temps, which generally means much lower fan speed and much less noise produced.


----------



## ManofGod (Feb 2, 2015)

newtekie1 said:


> I'd believe him more than I do some random unknown user on a forum.
> 
> Also from his review of the 290X:
> 
> ...



That's nice. Yep, a computer right up against the wall in an apartment would definitely carry sound. However, inside a case, they would probably not have heard it. Also, you most certainly will not get better temps on an open-air bench, simply because there is no airflow occurring across the computer components.


----------



## newtekie1 (Feb 2, 2015)

ManofGod said:


> That's nice.  Yep, a computer right up against the wall in an apartment would definitely carry sound. However, inside a case, they would probably not have heard it. Also, you most certainly will not get better temps in an open air bench simply because there is no airflow occurring across the computer components.



If you don't think a GPU runs cooler on an open bench than it does in a case you have no idea what you're talking about.


----------



## 64K (Feb 3, 2015)

MSI R9 290X Gaming for $280 after rebate on Newegg. 

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127773&cm_re=r9_290x-_-14-127-773-_-Product


----------



## Cybrnook2002 (Feb 3, 2015)

64K said:


> MSI R9 290X Gaming for $280 after rebate on Newegg.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127773&cm_re=r9_290x-_-14-127-773-_-Product


Great deals! I picked up a second 290X lightning for crossfire.


----------



## GhostRyder (Feb 3, 2015)

Cybrnook2002 said:


> Great deals! I picked up a second 290X lightning for crossfire.


Oh you lucky devil, I wish I had either waited on the Lightnings or gone with the Sapphire Vapor-X 8 GB variants for my rig!


----------



## ManofGod (Feb 3, 2015)

Cybrnook2002 said:


> Great deals! I picked up a second 290X lightning for crossfire.



If I did not already own an XFX reference R9 290X, I would pick up the Lightning card for sure. My only disappointment is that the reference card does not support UEFI GOP mode.


----------



## anubis44 (Feb 3, 2015)

TheGuruStud said:


> Nvidia sends in so many programmers to control game optimization... it's like the Stasi. Devs are afraid of them. They will never say a bad thing about Nvidia unless they want to get fired. Devs haven't had control of their games in years.
> 
> Crysis 2 is a standout example. I can't remember if that's the one where the devs refused to even comment or not, though. Nvidia was basically allowed full control and pumped in massive amounts of tessellation for flat surfaces. IIRC, there was even an underground river getting rendered. 17% slowdown on Nvidia and 30% on AMD. The GTX 580 could churn through tessellation much better. Nvidia dropped this tactic when the new AMD cards came out with heavy tessellation firepower.



There's nothing new here, but thanks for shedding light on the dark activities of this nefarious company. nVidia has been trying to screw us over so many times I've lost count. But here's just a few examples:

1) They didn't compensate owners of laptops equipped with nVidia GPUs (I myself had a Toshiba Tecra that died because of this infamous 'bumpgate' scandal).
2) They were behind many tech websites maintaining that 2GB was 'more than enough' graphics memory during the GTX670/GTX680 launches, when AMD was selling 3GB 7950 and 7970 cards, and there was already a lot of evidence that 2GB was inadequate. So these nVidia cards were obsolete right out of the box. What a scam.
3) They've tried to corner the PC gaming market using tactics like the above-mentioned 'game optimizations', which were really just intended to make games run worse on AMD hardware rather than well on nVidia hardware. PhysX was an even more blatant example of this kind of nonsense, introducing utterly useless additional effects merely for the sake of locking AMD cards out of rendering the game identically. Now that AMD has wisely grabbed the gaming consoles, I'm pretty sure many of these efforts to cheat/threaten developers will subside, but we must remain ever-vigilant.
4) G-Sync is their latest attempt to insult our intelligence by implementing a proprietary, locked version of a technology that most monitors can already essentially support. They just want an excuse to put a green goblin logo on the monitor and charge us $100 extra over the same monitor as a FreeSync-enabled version, while locking out AMD cards. This will likely be their biggest failure yet, as FreeSync monitors will be out within a month, and it wouldn't surprise me if manufacturers made all their monitors FreeSync-capable going forward, since it would cost them practically nothing and give them another checkbox feature.
5) Finally, more recently, Jen Hsun and his minions have picked millions of pockets by dishonestly selling a '4GB' GTX970 card that can really only address 3.5GB of memory at full speed. How many of those millions would have thought twice about those GTX970s if they'd known about this issue before they bought? 

Personally, I went with a $260 Gigabyte Windforce R9 290 standard edition card just before Christmas, flashed it with the R9 290 O/C BIOS (available on this very website: http://www.techpowerup.com/vgabios/ , thanks TechPowerUp!), and now it runs rock-solid stable at 1050 MHz. It runs quiet and cool, and it was more than $100 less than the flawed GTX970.

I think we should just say no to nVidia at this point, at least until they're really hurting. That would be a fitting punishment for all this crap they've been pulling.


----------



## xfia (Feb 3, 2015)

I'm down with that...   say no to ngreedia!!!


----------



## Cybrnook2002 (Feb 3, 2015)

ManofGod said:


> If I did not already own a R9 290X XFX Reference card, I would pick this up the Lightning card for sure. My only disappointment is that the reference card does not support UEFI GOP mode.


Not to go off topic, but it seems it's not ALL reference cards, just the BIOS XFX used. And even then, it seems there "might?" be a UEFI-compatible BIOS for the XFX reference card:

http://www.techpowerup.com/forums/threads/xfx-r9-290x-not-uefi.205450/

Might want to PM the XFXSupport TPU member.


----------



## XFXSupport (Feb 4, 2015)

Cybrnook2002 said:


> Not to go off topic, but seems it's not ALL reference cards. Just the BIOS XFX used. And even with that, seems there "might?" be a UEFI compatible BIOS for the XFX reference card:
> 
> http://www.techpowerup.com/forums/threads/xfx-r9-290x-not-uefi.205450/
> 
> Might want to PM with XFXSupport TPU member.




We only have one documented case that I can think of where a customer had a 290X that wasn't UEFI compatible, but it wasn't tested in house, nor was the motherboard.

As a rule, we like to address the motherboard first and see whether it's an issue with the R9 GPUs, then visit the UEFI settings. Contact me so we can investigate this further, or open a ticket at xfxsupport.com. Thank you.


----------



## btarunr (Feb 9, 2015)

Maban said:


> @btarunr What's the source of that AMD propaganda image? Is it actually from AMD?



Yes. https://twitter.com/amd_roy/status/561222965023363073


----------

