Thursday, January 29th 2015

AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price

AMD decided to cash in on the GeForce GTX 970 memory controversy with a bold move and a cheap (albeit accurate) shot. The company is having its add-in board (AIB) partners lower pricing of the Radeon R9 290X graphics card, which offers comparable performance to the GTX 970, to as low as US $299.

And then there's a gentle reminder from AMD to graphics card buyers with $300-ish in their pockets: with AMD, "4 GB means 4 GB." AMD also emphasizes that the R9 290 and R9 290X can fill their 4 GB of video memory to the last bit, and feature a 512-bit wide memory interface that delivers 320 GB/s of memory bandwidth at reference clocks, something the GTX 970 can't achieve even with its fancy texture-compression mojo.
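For reference, the 320 GB/s figure follows directly from bus width and memory data rate. A minimal sketch, assuming the reference 5 Gbps GDDR5 data rate on the R9 290X and 7 Gbps on the GTX 970 (data rates the article itself does not state):

```python
# Memory bandwidth = bus width (bits) * effective data rate (Gbps) / 8 bits per byte.
# Data rates are the reference specs, assumed here; the article quotes only the results.

def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

print(bandwidth_gbs(512, 5))  # R9 290X reference: 320.0 GB/s
print(bandwidth_gbs(256, 7))  # GTX 970 as originally advertised: 224.0 GB/s
print(bandwidth_gbs(224, 7))  # GTX 970, 3.5 GB partition only: 196.0 GB/s
```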

181 Comments on AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price

#126
RejZoR
rruff: Thanks for the link. It seems to tell a similar story to the Newegg reviews, with the Nvidia cards getting fewer returns:


- Radeon HD 7850 : 2,69%
- Radeon HD 7870 : 12,45%
- Radeon HD 7950 : 5,32%
- Radeon HD 7970 : 7,24%

- GeForce GTX 560 Ti : 1,43%
- GeForce GTX 660 Ti : 3,06%
- GeForce GTX 670 : 3,42%
- GeForce GTX 680 : 2,66%

- Radeon HD 7850 : 3,74%
- Radeon HD 7870 : 5,48%
- Radeon HD 7870 XT : 4,25%
- Radeon HD 7950 : 5,75%
- Radeon HD 7970 : 5,31%

- GeForce GTX 660 : 1,01%
- GeForce GTX 660 Ti : 2,81%
- GeForce GTX 670 : 2,87%
- GeForce GTX 680 : 1,99%

The Puget Systems results were very unfavorable to AMD for initial reliability, but that was a small sample size.



I don't know if it is heavy, but it's not super light either. There's +8W at idle. The 290X uses 67W more just playing a Blu-ray, and ~100W more in typical gaming. If I gamed for 2 hr/day and watched 2 hr/day of video, that would average 14W, or 22W total adding the idle consumption, or about $22/yr. Maybe $50 in two years is a bit much, but it isn't that crazy either.
I'd take these statistics with a pinch of salt without knowing why users actually returned them. I returned my HD 7870 Toxic as well, because it was loud as hell despite their famous Vapor-X cooler (it could have been defective, I don't know), and bought an HD 7950 afterwards. Meaning the card wasn't returned because the chip performance was flawed (it was in fact really fast); it was because Sapphire somehow managed to fuck up the cooling on that particular model. The current HD 7950 WindForce 3X is by far the best card I've owned in years, despite also having a few minor issues.
Posted on Reply
#127
Devon68
This is what makes me kind of mad and sad at the same time:
One card costs $300; converted into my country's currency, that's:
300 dollars = 25,560.06 RSD
25,560.06 RSD x 2 = 51,120.12 RSD

and this is how much one card costs in my country:
www.winwin.rs/racunari-i-komponente/racunarske-komponente/graficke-kartice/graficka-kartica-amd-radeon-r9-290x-gigabyte-oc-4gb-ddr5-dvi-hdmi-dp-512bit-gv-r929xoc-4gd-1189519.html

BTW, the price in red is the price if you order it online;
the green one is the price if you buy it in the shop.
Posted on Reply
#128
64K
Devon68: This is what makes me kind of mad and sad at the same time:
One card costs $300; converted into my country's currency, that's:
300 dollars = 25,560.06 RSD
25,560.06 RSD x 2 = 51,120.12 RSD

and this is how much one card costs in my country:
www.winwin.rs/racunari-i-komponente/racunarske-komponente/graficke-kartice/graficka-kartica-amd-radeon-r9-290x-gigabyte-oc-4gb-ddr5-dvi-hdmi-dp-512bit-gv-r929xoc-4gd-1189519.html

BTW, the price in red is the price if you order it online;
the green one is the price if you buy it in the shop.
Yikes, the online price is 52,000 din. That's about $476 US. I doubt that the difference is solely due to an import tariff. I would be mad too.
Posted on Reply
#129
rruff
64KYikes, the online price is 52,000 din. That's about $476 US. I doubt that the difference is solely due to an import tariff. I would be mad too.
Weird thing is, being in the US I can buy stuff from overseas, no matter what country, and I don't pay any tariff or extra fee. But I also sell stuff overseas, and in every case the country I ship to requires a hefty tariff or tax, like 30% or so, even when shipping to poor countries. I thought "free trade" was supposed to go both ways?
Posted on Reply
#130
Hyphen
Today's "Daily Deal Slasher" over at TigerDirect is this card at $300.
Posted on Reply
#131
rruff
RejZoR: Meaning the card wasn't returned because the chip performance was flawed (it was in fact really fast); it was because Sapphire somehow managed to fuck up the cooling on that particular model.
Those sorts of defects are not specific to AMD, but there seems to be a consistent trend of defects of some sort being higher with AMD cards. Maybe it's driver related?
Posted on Reply
#132
TRWOV
RejZoR: I'd take these statistics with a pinch of salt without knowing why users actually returned them. I returned my HD 7870 Toxic as well, because it was loud as hell despite their famous Vapor-X cooler (it could have been defective, I don't know), and bought an HD 7950 afterwards. Meaning the card wasn't returned because the chip performance was flawed (it was in fact really fast); it was because Sapphire somehow managed to fuck up the cooling on that particular model. The current HD 7950 WindForce 3X is by far the best card I've owned in years, despite also having a few minor issues.
Also remember that AMD has a smaller user base, so one AMD card accounts for a higher percentage than one nVidia card.
Posted on Reply
#133
rruff
TRWOV: Also remember that AMD has a smaller user base, so one AMD card accounts for a higher percentage than one nVidia card.
I've been looking at percentages, so the size of the user base is already accounted for.
Posted on Reply
#134
GhostRyder
rruff: Thanks for the link. It seems to tell a similar story to the Newegg reviews, with the Nvidia cards getting fewer returns:


- Radeon HD 7850 : 2,69%
- Radeon HD 7870 : 12,45%
- Radeon HD 7950 : 5,32%
- Radeon HD 7970 : 7,24%

- GeForce GTX 560 Ti : 1,43%
- GeForce GTX 660 Ti : 3,06%
- GeForce GTX 670 : 3,42%
- GeForce GTX 680 : 2,66%

- Radeon HD 7850 : 3,74%
- Radeon HD 7870 : 5,48%
- Radeon HD 7870 XT : 4,25%
- Radeon HD 7950 : 5,75%
- Radeon HD 7970 : 5,31%

- GeForce GTX 660 : 1,01%
- GeForce GTX 660 Ti : 2,81%
- GeForce GTX 670 : 2,87%
- GeForce GTX 680 : 1,99%

The Puget Systems results were very unfavorable to AMD for initial reliability, but that was a small sample size.



I don't know if it is heavy, but it's not super light either. There's +8W at idle. The 290X uses 67W more just playing a Blu-ray, and ~100W more in typical gaming. If I gamed for 2 hr/day and watched 2 hr/day of video, that would average 14W, or 22W total adding the idle consumption, or about $22/yr. Maybe $50 in two years is a bit much, but it isn't that crazy either.
You're talking about a very small difference in returns, though, with one exception in that test. That's not enough to count as a severe difference in reliability, apart from one outlier in that chart (and similar outliers can be found going back through different models on both sides). The problem with most failure statistics, as others have said, is that it depends on why the cards were returned: cooler malfunctions and plenty of other reasons, including things that aren't the card's fault at all. I have met many people who return cards as "faulty" for random reasons that turn out to be their own doing, and those returns still get counted in the statistics, because the card has to go through a testing/refurb process before it can be sold or used again. Both sides get those problems, but AMD tends to take the brunt of it, because many people blame AMD or the card first before checking for other issues; that is how they get labeled on the internet. You see a dozen posts about a certain problem with a card, so that must be what's causing your issue, right? The same thing will unfortunately happen with the GTX 970: from this point on, the VRAM problem will be associated with every issue a user faces, even when it has nothing to do with it.

As far as wattage goes, though, I'd say $22 a year is still asking a lot for a power difference between cards. I can tell you I have three R9 290X cards, I game at 4K pretty often when I am not working, including BF4, LoL (though that only uses one card), and others, and my electric bill has not gone up noticeably compared with my last couple of builds. Then again, I have had dual HD 6990s, two GTX 580s, two GTX 460s, two GTX 295s (dual-PCB variants), etc., so maybe I am a bad match-up for that. Still, while one card can use significantly more power than another under load, those loads fluctuate a lot, and while the numbers sound high, in the grand scheme of things, compared with most other household appliances (even just lighting), it really is not as much as you would think.
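For reference, the annual-cost arithmetic quoted earlier in the thread can be reproduced with a short sketch. The power deltas and daily hours are taken from that quote; the electricity price of $0.12/kWh is an assumption and varies by region.

```python
# Rough reproduction of the quoted annual-cost arithmetic (the ~$22/yr figure).
# Power deltas and daily hours come from the quote; the $/kWh price is assumed.

IDLE_DELTA_W = 8      # extra idle draw of the R9 290X (from the quote)
VIDEO_DELTA_W = 67    # extra draw during Blu-ray playback
GAMING_DELTA_W = 100  # extra draw in typical gaming

gaming_h, video_h = 2, 2            # hours per day, as in the quote
idle_h = 24 - gaming_h - video_h    # rest of the day at the idle delta

wh_per_day = (GAMING_DELTA_W * gaming_h
              + VIDEO_DELTA_W * video_h
              + IDLE_DELTA_W * idle_h)
kwh_per_year = wh_per_day * 365 / 1000
cost_per_year = kwh_per_year * 0.12  # assumed electricity price in $/kWh

print(f"average extra draw: {wh_per_day / 24:.0f} W")   # ~21 W
print(f"extra energy/year:  {kwh_per_year:.0f} kWh")    # ~180 kWh
print(f"extra cost/year:    ${cost_per_year:.0f}")      # ~$22
```

The quote applies the 8 W idle delta across the full 24 hours, which is how it lands on 22 W rather than the ~21 W here; either way the order of magnitude is the same.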
TRWOV: Also remember that AMD has a smaller user base, so one AMD card accounts for a higher percentage than one nVidia card.
But there are a lot more OEM machines with NVidia inside than AMD, which is where a lot of those numbers tend to come from, especially in the mobile market. And often, when a machine is returned for a problem, even if it is the GPU, those failures never show up in most of these polls, because the manufacturer catches the bad GPU internally.
Posted on Reply
#135
efikkan
Let's look at the simple facts here for a moment:
The GTX 970 has 13 SMMs with a computational power of 3494 GFlop/s (75.76% of the GTX 980), and a theoretical memory bandwidth of 196 GB/s vs. 224 GB/s (87.5%). When a GPU accesses a single tiny block of memory, it will never read it at 224 GB/s. A single block is placed in one of the eight memory chips, each of which is accessible at 28 GB/s (on its own 32-bit bus). Even on a hypothetical GPU with 32 SMMs and a 512-bit memory bus in total, loading the same single block of memory from one SMM would still be just as slow as on the GTX 970, because the GPU allocates a single block on one memory controller. If you think about it for a moment, you'll realize why every single SMM can't load from memory at 224 GB/s: that would make the GPU extremely complex and defeat the purpose of a GPU processing different data as efficiently as possible.

Comparing the GTX 970 and GTX 980, simple maths shows the GTX 970 has more memory bandwidth per GFlop (0.056 vs. 0.049 GB per GFlop), meaning that in a gaming setting, where each frame's time budget is limited, each SMM has more memory bandwidth at its disposal than on a GTX 980. Even though each GPU can store 4 GB in memory, no game will ever come close to using all of that while rendering a single frame; the typical load is generally around 1.5 GB per frame or lower. So provided the GPU/driver is smart enough to place data appropriately in the two memory pools, the last slow 512 MB can be completely transparent to the end user. Heck, even if the GTX 970 had only 3 GB of fast memory, this could still be achieved. Given that the GTX 970 has more memory bandwidth per SMM than the GTX 980, the GTX 970 is still less bottlenecked by the memory bus in a gaming setting. This is why the slowdown of the GTX 970 vs. the GTX 980 is negligible, and the slowdown we do see has more to do with the fewer SMMs than with the memory bus.

When using the GPU for compute (CUDA or OpenCL), the consequences of the slower memory segment might be a bit different, in specific situations where the program accesses memory randomly across the whole of GPU memory.

But for gaming, the GTX 970 remains just as good a choice as it was two weeks ago. If it weren't for those specific compute situations, probably no one would ever notice the slowdown. And any minor issues resulting from this memory configuration, if there are any, can likely be fixed in software.
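As a quick check of the back-of-the-envelope figures above, here is a minimal sketch. The core counts and boost clocks (1664 CUDA cores at 1050 MHz for the GTX 970, 2048 at 1126 MHz for the GTX 980) are the public reference specs, assumed here because the post only quotes the resulting GFlop/s and ratios; the 196 GB/s figure is the bandwidth of the 3.5 GB partition (224-bit at 7 Gbps).

```python
# Bandwidth-per-GFlop comparison quoted above; reference specs assumed.

def gflops(cuda_cores, clock_mhz):
    # 2 floating-point ops (one FMA) per core per cycle
    return 2 * cuda_cores * clock_mhz / 1000

cards = {
    "GTX 970": {"gflops": gflops(1664, 1050), "bw_gbs": 196},  # 224-bit * 7 Gbps
    "GTX 980": {"gflops": gflops(2048, 1126), "bw_gbs": 224},  # 256-bit * 7 Gbps
}

for name, c in cards.items():
    print(f"{name}: {c['gflops']:.0f} GFlop/s, {c['bw_gbs']} GB/s, "
          f"{c['bw_gbs'] / c['gflops']:.3f} GB per GFlop")

# GTX 970: 3494 GFlop/s, 196 GB/s, 0.056 GB per GFlop
# GTX 980: 4612 GFlop/s, 224 GB/s, 0.049 GB per GFlop
```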
Posted on Reply
#137
The N
AMD wants to get in between NVIDIA and NVIDIA's users so they can get something out of it, with effective, meaningful advertising. That's called strategy. An effective move, to some extent.
Posted on Reply
#138
dj-electric
newtekie1: LOL

Yup. Hypocrisy at its very best. The text there should be much bigger
Posted on Reply
#139
Jurassic1024
Jorge: For those who don't know, top-of-the-line CPUs and GPUs are cash cows with huge margins compared to mainstream models. Only enthusiasts or businesses buy the very high-end, high-margin products. Lowering the price slightly is a no-brainer and will more than be offset by the increased volume. It's not difficult to do the math when you know the margins and volume.
Cash cows? Keep dreaming. The greater chunk of revenue comes from low and midrange. EVERYONE knows this but you?!

ie: Have you seen a Steam Hardware Survey... ever?!
Posted on Reply
#140
R-T-B
Jurassic1024: Cash cows? Keep dreaming. The greater chunk of revenue comes from low and midrange. EVERYONE knows this but you?!

ie: Have you seen a Steam Hardware Survey... ever?!
He was speaking in a per-GPU context. Completely different.
Posted on Reply
#141
SK-1
Dang... I just saw one for $229.00 after discount. I spent almost as much on an R9 270X less than a year ago :eek:
Posted on Reply
#142
HumanSmoke
RejZoR: I'd take these statistics with a pinch of salt without knowing why users actually returned them. I returned my HD 7870 Toxic as well, because it was loud as hell despite their famous Vapor-X cooler (it could have been defective, I don't know), and bought an HD 7950 afterwards. Meaning the card wasn't returned because the chip performance was flawed (it was in fact really fast); it was because Sapphire somehow managed to fuck up the cooling on that particular model. The current HD 7950 WindForce 3X is by far the best card I've owned in years, despite also having a few minor issues.
The actual answer in many cases, especially with Sapphire, lies in the model number. A high number of the Sapphire cards being returned are reduced-BoM models, where Sapphire has done things like putting the Vapor-X shroud over some pretty basic componentry (coil whine seems a large factor in some of those SKUs). A keen eye would notice that many of their reduced-BoM cards feature the blue PCB, while the premium cards often sport the black PCB.
Anyhow, returns usually fall under two categories - failure or customer dissatisfaction. Whichever the reason, the numbers are pertinent to those who might purchase.
As an aside, those numbers are from an older timeframe. I actually aggregated the latest numbers from Hardware.fr in a post a week ago (I won't try to C&P them here).
Posted on Reply
#143
RejZoR
Interesting. I've never had any graphics card fail on me (neither AMD/ATi nor NVIDIA based), if I exclude the HD 5850 which I fried myself by fiddling with the cooling system. And then I at least had an excuse to buy an HD 6950 :D
Posted on Reply
#144
HumanSmoke
RejZoR: Interesting. I've never had any graphics card fail on me (neither AMD/ATi nor NVIDIA based), if I exclude the HD 5850 which I fried myself by fiddling with the cooling system. And then I at least had an excuse to buy an HD 6950 :D
I had five HD 5850s. I started with two XFX Black Editions (returned one because of the GSoD issue; the other wouldn't hold the default OC), then went with three Sapphire Toxic 2 GB cards; all three are still running strong today with their second owners. I also had failures from a reference (XFX) 5970, which was a basket case from day one, as were two BFG GTX 280 H2OCs, which hadn't been binned for the overclock they were running (a common BFG failing).
Of the machines I've been called on to diagnose and repair, most faults are down to user error, such as assuming five minutes of gaming is sufficient testing for a maxed-out overclock. The biggest card issues were from HD 4870s and 4890s, followed by GTX 280s running too high a clock, and those single-slot 8800 GTs with the cheapo 7-blade blower fan.
Posted on Reply
#145
RejZoR
I've had a GeForce 7600GT fail, but that was after I sold it to someone else. He called me saying the graphics card didn't work (and I know it worked fine before I took it out of my PC). Luckily it was still under warranty when I sold it, so he got a new one. I guess it was static electricity or something else that person messed up somehow. So, technically it wasn't NVIDIA's fault here either.
Posted on Reply
#146
AsRock
TPU addict
RejZoR: I've had a GeForce 7600GT fail, but that was after I sold it to someone else. He called me saying the graphics card didn't work (and I know it worked fine before I took it out of my PC). Luckily it was still under warranty when I sold it, so he got a new one. I guess it was static electricity or something else that person messed up somehow. So, technically it wasn't NVIDIA's fault here either.
Sometimes just replacing the user will solve the issue.


Anyway, I have had two go wrong on me over the last 25 years. One was a 2900 XT, but that was my fault, as I forgot to plug in the fan that cooled the top-side RAM chips, which were overclocked as high as I could get them.

The other was an XFX 4890; when the cooler was put back on, it would make an odd sound. It was sent back to XFX, and they replaced it with a newer-series card and sent it directly to its new owner.
Posted on Reply
#148
AsRock
TPU addict
rruff: www.hardware.fr/articles/927-5/cartes-graphiques.html

Good stuff. Definitely higher return rates for AMD cards.
So they say. I have returned more NVIDIA cards than AMD ones; I had hell with the NVIDIA 7xxx range. Never mind the 4x0 range, where on one card the heatsink fell off.

Although it would not stop me from buying one. However, I think it's more like hard drives: some brands you have good luck with, others not so much. I have had my fair share of Seagate and WD drives; Seagate has always given me issues, whereas someone else might have had bad luck with WDs.
Posted on Reply
#149
Ferrum Master
What are those numbers based on? If only two 7970s were sold and one was returned, the return rate becomes 50%. Give me a break, really.

There are no known manufacturing errors in either camp, and any responsibility should go to the AIB partners anyway.
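To make the sample-size point concrete, here is a minimal sketch using a normal-approximation confidence interval; the (returns, sold) counts are made up for illustration, since hardware.fr publishes only the percentages, not how many units each one is based on.

```python
# How sample size changes what a return-rate percentage actually tells you.
# The (returns, sold) pairs below are hypothetical.
import math

def return_rate_ci(returns, sold, z=1.96):
    """Return rate with an approximate 95% confidence interval."""
    p = returns / sold
    margin = z * math.sqrt(p * (1 - p) / sold)
    return p, max(0.0, p - margin), min(1.0, p + margin)

for returns, sold in ((1, 2), (5, 100), (50, 1000)):
    p, lo, hi = return_rate_ci(returns, sold)
    print(f"{returns}/{sold} returned -> {p:.1%} (95% CI roughly {lo:.1%} to {hi:.1%})")

# 1/2 returned -> 50.0% (95% CI roughly 0.0% to 100.0%), i.e. meaningless
# 5/100 returned -> 5.0% (95% CI roughly 0.7% to 9.3%)
# 50/1000 returned -> 5.0% (95% CI roughly 3.6% to 6.4%)
```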
Posted on Reply
#150
GhostRyder
Dj-ElectriC: Yup. Hypocrisy at its very best. The text there should be much bigger
Maybe if it was true?
rruff: www.hardware.fr/articles/927-5/cartes-graphiques.html

Good stuff. Definitely higher return rates for AMD cards.
Return rates have too much to account for, as I said when I posted those links to you. For starters, people can return a card for ANY issue and it still gets counted, and on top of that, even if we look at those numbers, the difference is very small except in one case. The only real reason to check reviews or return rates on a card is if you see an abnormally high number on one specific model, because that can indicate a real problem. Neither camp has a return-rate problem, and neither is really better than the other at quality control. We will probably see a slightly higher than normal return rate for the GTX 970, and we all know the reason for that; it does not mean NVIDIA's quality control was horrible or anything.

I have had two cards fail on me in my life and both were NVIDIA (a 9800 GT and a 9800 GX2), but that does not say much, as I have bought mostly NVIDIA from the beginning, so it's not a good comparison. Unless there is a specific design flaw, any card can fail at a similar rate, and the only thing that really matters when comparing is an abnormally high rate.
Posted on Reply