Thursday, January 29th 2015

AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price

AMD has decided to cash in on the GeForce GTX 970 memory controversy, with a bold move and a cheap (albeit accurate) shot. The company is having its add-in board (AIB) partners lower pricing of the Radeon R9 290X graphics card, which offers performance comparable to the GTX 970, to as low as US $299.

And then there's a gentle reminder from AMD to graphics card buyers with $300-ish in their pockets. With AMD, "4 GB means 4 GB." AMD also emphasizes that the R9 290 and R9 290X can fill their 4 GB of video memory to the last bit, and feature a 512-bit wide memory interface, which delivers 320 GB/s of memory bandwidth at reference clocks, something the GTX 970 can't achieve even with its fancy texture compression mojo.
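As a sanity check, that bandwidth figure follows directly from the bus width and the effective GDDR5 data rate; a quick sketch, assuming the R9 290X's reference memory speed of 1250 MHz (5 GT/s effective, quad-pumped):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective transfers per second)
bus_width_bits = 512     # R9 290X memory interface width
effective_gtps = 5.0     # GT/s effective: 1250 MHz GDDR5 reference clock, quad-pumped
bandwidth_gb_s = (bus_width_bits / 8) * effective_gtps
print(bandwidth_gb_s)    # 320.0 GB/s, matching the figure quoted above
```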

181 Comments on AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price

#101
rruff
GhostRyderQC is about the same, dude, including reliability; the differences posted are normally very small except in cases involving massive problems with certain specific cards, which is not something seen very often. Drivers are just as fine no matter what brand you're using, so that argument is completely irrelevant, and the same goes for features, because both sides have a counter to each feature within reason.
You can say that, but have you looked? Do you have evidence? Newegg is probably the best source. Here are the GTX 970 and R9 290x organized by rating: www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709 600536049 600473871 &IsNodeId=1&page=1&bop=And&Order=RATING&PageSize=90

There are six Nvidia cards ahead of any AMD card, and the sole AMD card with more than 10 reviews and a 5-egg rating is an 8GB Sapphire that costs >$400. If I wanted to kill a few hours I could calculate the mean and median rating for all the cards, but at a glance I can see that Nvidia would win.

If you want to go to the bottom of the Maxwell line, the GTX 750 vs the R7 260x, it's even more dramatic. www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709 600487564 600473874&IsNodeId=1&bop=And&Pagesize=90&Page=1

If you can find a case where AMD would beat the competing Nvidia card in Newegg reviews, I'd be interested.
GhostRyderOn top of that, power consumption has been proven time and again to be a moot point except in niche situations. In most cases you would have to run a card under load for a huge number of hours through the year for the power difference to show up on your bill.
It isn't tough to calculate, and I already gave the numbers. If you pay a normal US price for electricity (~11 cents/kWh), it's $1/W/yr continuous. A 290X uses 8W more at idle and way more than that at other times. For my typical use (computer on 24/7 but heavy card use only a couple of hours per day), it would probably amount to around $20-30/yr. Is that a weird case, and is that amount of money trivial? Not to me.
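For what it's worth, the $1/W/yr rule of thumb checks out; a quick sketch, assuming the ~11 cents/kWh rate mentioned in the post:

```python
# One watt of continuous draw, billed at a typical US residential rate
rate_usd_per_kwh = 0.11        # assumed rate from the post
hours_per_year = 24 * 365      # 8760 hours
usd_per_watt_year = rate_usd_per_kwh * hours_per_year / 1000
print(round(usd_per_watt_year, 2))  # 0.96 -- roughly $1 per watt-year
```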
Posted on Reply
#102
rruff
Sony Xperia SThat's good for the stupid, because then they never have to consider that the problem might be with them or their system. :laugh: :rolleyes:
Why would Nvidia cards be any more idiot-proof than AMD's?
Posted on Reply
#103
Sony Xperia S
rruffWhy would AMD cards be any more idiot-proof than Nvidia's?
Fixed. :)
Posted on Reply
#104
rruff
Sony Xperia SFixed. :)
You are contradicting yourself...
Posted on Reply
#105
Cybrnook2002
While y'all are bitching, I'm gaming :-) (at 100+ FPS)
Posted on Reply
#106
GhostRyder
rruffYou can say that, but have you looked? Do you have evidence? Newegg is probably the best source. Here are the GTX 970 and R9 290x organized by rating: www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709 600536049 600473871 &IsNodeId=1&page=1&bop=And&Order=RATING&PageSize=90

There are six Nvidia cards ahead of any AMD card, and the sole AMD card with more than 10 reviews and a 5-egg rating is an 8GB Sapphire that costs >$400. If I wanted to kill a few hours I could calculate the mean and median rating for all the cards, but at a glance I can see that Nvidia would win.

If you want to go to the bottom of the Maxwell line, the GTX 750 vs the R7 260x, it's even more dramatic. www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709 600487564 600473874&IsNodeId=1&bop=And&Pagesize=90&Page=1

If you can find a case where AMD would beat the competing Nvidia card in Newegg reviews, I'd be interested.
So Newegg is a deciding factor based on product reviews? You do realize anyone can choose to write one and anyone can choose not to, which basically makes them pointless. Especially when many reviewers are "Anonymous", don't have the "Purchased" symbol next to their name, or post repeat reviews (which you can do, btw). On top of all that, many of those complaints are not even about failing cards; sometimes they are lesser complaints aimed at a multitude of things (not happy with shipping, game deal/rebate not working, etc.). Retailer site reviews, no matter what they're for, are rarely ever useful, dude...
rruffIt isn't tough to calculate, and I already gave the numbers. If you pay a normal US price for electricity (~11 cents/kWh), it's $1/W/yr continuous. A 290X uses 8W more at idle and way more than that at other times. For my typical use (computer on 24/7 but heavy card use only a couple of hours per day), it would probably amount to around $20-30/yr. Is that a weird case, and is that amount of money trivial? Not to me.
So you are concerned about electricity, yet leave your computer on 24/7? That's like saying I am concerned with my light bulbs using too much electricity, buying CFL/LED bulbs, and then leaving them on 24/7. Even so, your calculations have a lot more to factor in than just that, and most reviews of power consumption show gaming stress with no frame limits, which inflates power usage. Not really in the mood to debate this, but places like Linus Tech Tips have done similar tests and come to the same conclusion.
Cybrnook2002While y'all are bitching, I'm gaming :) (at 100+ FPS)
I wish I could be gaming right now :(
Posted on Reply
#107
rruff
GhostRyderSo newegg is a deciding factor based on products reviews?
It's imperfect, but do you have a better one? Seriously, I want to know if there is a good source of information on QC and reliability. Many of the issues you mention *should* apply to both and would not consistently skew results. The only thing that would is if Nvidia were spamming reviews more than AMD.
GhostRyderSo you are concerned about electricity yet leave your computer on 24/7?
It's working 24/7. It can't work if it's off. And I'm not concerned about electricity, I'm concerned about $. If I'm going to keep a card for 2 years and that is an extra $50 in cost, then it is definitely a factor in comparing price vs performance.
Posted on Reply
#108
64K
rruffIt's imperfect, but do you have a better one? Seriously, I want to know if there is a good source of information on QC and reliability. Many of the issues you mention *should* apply to both and would not consistently skew results. The only thing that would is if Nvidia were spamming reviews more than AMD.
I wouldn't put too much stock in private reviews. There have been instances on Metacritic, for example, where employees of a developer posted flattering reviews of a game they had worked on, focused only on the good parts, and got caught red-handed doing it. Then you look at Battlefield and Call of Duty reviews, and there's some kind of fanboy feud going on between the two camps for whatever reason, where they smear the other camp with negative reviews for each release.
Posted on Reply
#109
CrAsHnBuRnXp
EroticusThe reasons are: Nvidia has many more fanboys / it's power-efficient / the 970 is the newer product (by 1.6 years) / and the main reason: the 380X is coming. No point buying the old gen when AMD will win, like always, in the next one...
Like always? When AMD releases a new GPU, they win for like a week or two until Nvidia turns around and releases another card to compete with it, and AMD gets knocked back down.
rruffYou are contradicting yourself...
No he's just being a fanboy. Difference.
Posted on Reply
#111
Sony Xperia S
rruffYou are contradicting yourself...
No, I am not.

When you say "waterproof", it means that it is resistant and protected. The same with AMD cards - they are protected from and resistant to idiots. :D
Posted on Reply
#112
AsRock
TPU addict
GhostRyder290X is more powerful than a GTX 970, so that point is invalid... But then again, that comment is obvious troll bait...

QC is about the same, dude, including reliability; the differences posted are normally very small except in cases involving massive problems with certain specific cards, which is not something seen very often. Drivers are just as fine no matter what brand you're using, so that argument is completely irrelevant, and the same goes for features, because both sides have a counter to each feature within reason.

On top of that, power consumption has been proven time and again to be a moot point except in niche situations. In most cases you would have to run a card under load for a huge number of hours through the year for the power difference to show up on your bill. On top of that, it would normally take years of doing that just to equal a reasonable price difference between the cards, especially when one card is cheaper than the other. Not to mention you have to include people who use Vsync or similar, which alleviates a lot of stress on the GPU and lowers power usage as well. The only major concern for power usage would be the PSU, and a ~500 watt unit is generally what a gamer buys and will run the card, so it's still a moot point.

Anyway, either way, it's funny AMD is doing this to cash in on people returning the card with that kind of joke ad. Either way, I am sure they are going to get some sales at that price, since their cards are still some of the best high-resolution GPUs out there at the moment. With prices this good on high-end gaming cards, more people can join the fray and get serious gaming cards for a good price.



Dang, now I wish I wanted/needed one of those variants.
Why bother? Ignorance is bliss, and it seems like he cannot get his head around a few facts.

And WTF is this 24/7 BS? If you're gaming 24/7 you've got other issues which are much more important. So the math is flawed right from the get-go.

Yes, vsync makes a hell of a difference @60Hz, which most people are still on, and it is typically best for gaming.
Posted on Reply
#113
GhostRyder
rruffIt's imperfect, but do you have a better one? Seriously, I want to know if there is a good source of information on QC and reliability. Many of the issues you mention *should* apply to both and would not consistently skew results. The only thing that would is if Nvidia were spamming reviews more than AMD.
Sadly there are not that many, but a few are at least acceptable to go off of: for instance, a post on linustechtips links to a French site with some decent coverage of that. There is also the Puget Systems link that shows their personal experience with cards in in-house testing and in the field. But either way, reviews on manufacturer/retail sites are not useful because, as stated by @64K, they are very easily skewed by fanboys or people making random complaints. I see plenty of complaints that come from the same person 3 or 4 times, and even ones from people who do not own the video card.
rruffIt's working 24/7. It can't work if it's off. And I'm not concerned about electricity, I'm concerned about $. If I'm going to keep a card for 2 years and that is an extra $50 in cost, then it is definitely a factor in comparing price vs performance.
If it costs you an extra $50 in 2 years, that would be a heavy amount of use on the card at a constant rate. That also assumes usage stays at its peak, and most cards do not remain at peak power output for very long, except in situations like cryptocurrency mining.
Xzibit
LOL are you serious there is another one of those...Wow.
Posted on Reply
#114
rruff
GhostRyderSadly there are not that many but a few that are at least acceptable to go off of are for instance a post on linustechtips that links to a French site has some decent coverage of that.
Thanks for the link. It seems to tell a similar story to the Newegg reviews, with the Nvidia cards getting fewer returns:


- Radeon HD 7850 : 2.69%
- Radeon HD 7870 : 12.45%
- Radeon HD 7950 : 5.32%
- Radeon HD 7970 : 7.24%

- GeForce GTX 560 Ti : 1.43%
- GeForce GTX 660 Ti : 3.06%
- GeForce GTX 670 : 3.42%
- GeForce GTX 680 : 2.66%

- Radeon HD 7850 : 3.74%
- Radeon HD 7870 : 5.48%
- Radeon HD 7870 XT : 4.25%
- Radeon HD 7950 : 5.75%
- Radeon HD 7970 : 5.31%

- GeForce GTX 660 : 1.01%
- GeForce GTX 660 Ti : 2.81%
- GeForce GTX 670 : 2.87%
- GeForce GTX 680 : 1.99%

The Puget Systems results were very unfavorable to AMD for initial reliability, but that was a small sample size.
GhostRyderIf it costs you an extra $50 in 2 years, that would be a heavy amount of use on the card at a constant rate.
I don't know if it is heavy, but it's not super light either. There is +8W at idle. The 290X uses 67W more just playing a Blu-ray, and ~100W more in typical gaming. If I gamed 2hr/day and watched 2hr/day of video, that would average 14W, or 22W total adding the idle consumption, or $22/yr. Maybe $50 in two years is a bit much, but it isn't that crazy either.
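The averaging behind that estimate can be reproduced directly; the power deltas below are the figures quoted in the post, and the ~$0.96 per watt-year factor is the 11 cents/kWh rule of thumb from earlier in the thread:

```python
# Extra draw of the R9 290X over the GTX 970, per the post's figures
idle_delta_w = 8       # applies around the clock
video_delta_w = 67     # during Blu-ray playback, 2 h/day
gaming_delta_w = 100   # typical gaming, 2 h/day

# Idle delta applies 24/7; activity deltas are averaged over the day
avg_extra_w = idle_delta_w + (2 * gaming_delta_w + 2 * video_delta_w) / 24
annual_cost_usd = avg_extra_w * 0.96   # ~$0.96 per watt-year at 11 cents/kWh
print(round(avg_extra_w), round(annual_cost_usd))  # 22 21
```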
Posted on Reply
#116
efikkan
It's time to stop this nonsense. The performance loss compared to the GTX 980 is negligible, and the GTX 970 still remains the best-value GPU choice. Graphics cards with some slower memory banks are nothing new; people have long forgotten the GTX 660 Ti. Having 512 MB of slower memory in the GTX 970 might be a small issue for some CUDA uses, but for current games it remains irrelevant. The memory bus is way too slow to utilize that memory within a single frame anyway.

It's laughable that AMD is trying to cash in on this fuss. The GTX 970 still has 4 GB of memory and is still a superior choice to the R9 290X. Spreading misinformation about this controversy is quite immoral.
Posted on Reply
#117
xfia
efikkanIt's time to stop this nonsense. The performance loss compared to the GTX 980 is negligible, and the GTX 970 still remains the best-value GPU choice. Graphics cards with some slower memory banks are nothing new; people have long forgotten the GTX 660 Ti. Having 512 MB of slower memory in the GTX 970 might be a small issue for some CUDA uses, but for current games it remains irrelevant. The memory bus is way too slow to utilize that memory within a single frame anyway.

It's laughable that AMD is trying to cash in on this fuss. The GTX 970 still has 4 GB of memory and is still a superior choice to the R9 290X. Spreading misinformation about this controversy is quite immoral.
haha do you realize it is flawed by the basic parallel principle to make a GPU like this? If Intel made the Xeon GPUs like this they would be garbage. Some CUDA uses? Try most, if not all.
There is no misinformation going around.. it is a piece of shit that people have been complaining about since it was launched, and Nvidia ignored them, so they were thinking it was just drivers and SLI performance would improve haha
Posted on Reply
#118
buggalugs
efikkanIt's time to stop this nonsense. The performance loss compared to the GTX 980 is negligible, and the GTX 970 still remains the best-value GPU choice. Graphics cards with some slower memory banks are nothing new; people have long forgotten the GTX 660 Ti. Having 512 MB of slower memory in the GTX 970 might be a small issue for some CUDA uses, but for current games it remains irrelevant. The memory bus is way too slow to utilize that memory within a single frame anyway.

It's laughable that AMD is trying to cash in on this fuss. The GTX 970 still has 4 GB of memory and is still a superior choice to the R9 290X. Spreading misinformation about this controversy is quite immoral.
You are completely wrong, on everything, and I still don't understand why some people are defending Nvidia when they have been dishonest. It doesn't matter how good something is; if it is advertised to have something and it doesn't, that's a problem. When they advertised the card as "having the same memory subsystem as the 980" and it doesn't, that's a problem. When Nvidia comes clean only after the issue was reported on tech websites 3 months later, that's a problem. You're the one with bad morals if you are defending false advertising.
Posted on Reply
#119
efikkan
buggalugsYou are completely wrong, on everything, and I still don't understand why some people are defending Nvidia when they have been dishonest. It doesn't matter how good something is; if it is advertised to have something and it doesn't, that's a problem. When they advertised the card as "having the same memory subsystem as the 980" and it doesn't, that's a problem. When Nvidia comes clean only after the issue was reported on tech websites 3 months later, that's a problem. You're the one with bad morals if you are defending false advertising.
No one is defending Nvidia for advertising the wrong specs in terms of memory bandwidth and so on; that is their responsibility. But at the end of the day it's not a big enough problem to change the verdict on the product and its market position, and it remains a big PR blunder with minimal impact on actual product owners. Truth be told, almost no one would be able to notice it anyway, but unfortunately every problem will now be blamed on this issue, even though most problems attributed to the memory issue have nothing to do with the slow 512 MB of memory.
Posted on Reply
#120
AsRock
TPU addict
All I see in that link is a lot of misleading numbers.
rruffThanks for the link. It seems to tell a similar story to the Newegg reviews, with the Nvidia cards getting fewer returns:


- Radeon HD 7850 : 2.69%
- Radeon HD 7870 : 12.45%
- Radeon HD 7950 : 5.32%
- Radeon HD 7970 : 7.24%

- GeForce GTX 560 Ti : 1.43%
- GeForce GTX 660 Ti : 3.06%
- GeForce GTX 670 : 3.42%
- GeForce GTX 680 : 2.66%

- Radeon HD 7850 : 3.74%
- Radeon HD 7870 : 5.48%
- Radeon HD 7870 XT : 4.25%
- Radeon HD 7950 : 5.75%
- Radeon HD 7970 : 5.31%

- GeForce GTX 660 : 1.01%
- GeForce GTX 660 Ti : 2.81%
- GeForce GTX 670 : 2.87%
- GeForce GTX 680 : 1.99%

The Puget Systems results were very unfavorable to AMD for initial reliability, but that was a small sample size.



I don't know if it is heavy, but it's not super light either. There is +8W at idle. The 290X uses 67W more just playing a Blu-ray, and ~100W more in typical gaming. If I gamed 2hr/day and watched 2hr/day of video, that would average 14W, or 22W total adding the idle consumption, or $22/yr. Maybe $50 in two years is a bit much, but it isn't that crazy either.
Don't believe everything you read, even more so on a forum, and the 100W part is total BS, as most games are better played with vsync on, with the odd exception like Watch Dogs.

All the games I play run 180W-240W total system power, and it runs 80W idle. With vsync off, you are somewhat correct in what you're saying.

Our electric bill has not gone up either since I got my 290X (clocked at 1050) over the last year I have had it; in fact, the 290X is drawing the same power as my 6970 was, but with higher details in games.
Posted on Reply
#121
Lionheart
Digital Dreams
haha
You are a legend for posting that lmfao. That was the funniest shit I've seen in a long time :roll::roll:
Posted on Reply
#122
Caring1
LionheartYou are a legend for posting that lmfao.
Ummm, NO, he's not. It's been posted about 5 times already in these forums if you looked and read them.
Posted on Reply
#123
Super XP
I would have to agree; the Radeon R9 290(X) takes it home big time.
Posted on Reply
#124
Lionheart
Caring1Ummm, NO, he's not. It's been posted about 5 times already in these forums if you looked and read them.
Uumm yes I can have my own god damn opinion if I like, so go cry somewhere else lolz
Posted on Reply
#125
btarunr
Editor & Senior Moderator
RecusBut the game is stored in your PC so technically you bought him. Also physical copies have to be activated through Steam.
Technically you don't buy the game. You buy the license (a permission) to play the game. The physical copy is a piece of plastic containing the software. You pay money for the license. Just because the software is on your hard drive doesn't mean you bought it.

Look at it this way. You don't buy a passport from your government, you apply for it, you pay the required fees, and then they give you a passport TO HOLD. Your government still OWNS your passport. Same with credit cards. Your bank OWNS your credit card. When you buy games on Steam, or buy a physical copy, and you have it installed, you're HOLDING the software, along with a LICENSE to use it. You don't own the game, even if it came in a $200 collector's edition set with a gold disc, sitting on a satin pillow, in an expensive wood box.

That's why you can't compare Steam purchases with graphics card purchases. A graphics card is a tangible commodity that isn't subject to any EULA. You buy it, and then you can use it to play games, watch videos, create CGI, or use as a paperweight (like W1zzard does).
Posted on Reply