Tuesday, October 7th 2014

AMD Cuts Prices of R9 290 Series and R9 280 Series Even Further

AMD cut prices of its Radeon R9 290 series and R9 280 series graphics cards further, down from last month's price-cuts. The cuts see the company's flagship single-GPU product, the Radeon R9 290X, drop from $449 to $399, a $150 overall reduction from its launch price of $549. The Radeon R9 290, on the other hand, has its price cut to $299, from its launch price of $399. The drop in price of the R9 290 is squeezing AMD's sub-$300 lineup like never before. The R9 280X is down to $270, just $30 less than the R9 290. The R9 285, which launched barely two months ago, has its price squeezed to $229, just $10 more than NVIDIA's GTX 760. If you're in the market for a graphics card with about $250 in hand, you're now open to a ton of options, including eating ramen for a week in exchange for the $329 GeForce GTX 970.
Source: Tweaktown

140 Comments on AMD Cuts Prices of R9 290 Series and R9 280 Series Even Further

#126
Sony Xperia S
Naito: The following section may be a bit disjointed, as I wrote it late at night while trying to process the staggered launches of the chips, the reviews, the performance, and my own observations at the time. Please bear with me if you can.
  • AMD's (then ATI's) last decent lead over Nvidia was in 2006, with an architecture born from a time before AMD's acquisition of ATI. The (R400) X8** series and the later (R5*0) X19** series saw many successes against Nvidia (the GeForce 6000 and 7000 series, respectively) and ultimately won the fixed-pipeline/fixed-shader battle. To end ATI's reign in 2006, Nvidia released the (G80) 8800 GTX. The X1950 XTX still managed to trade punches with the hot and noisy G80, but ultimately lost out in performance, particularly once optimizations for the newer GPGPU architecture came about.
  • Come May 2007, ATI releases the abysmal (R600) HD 2900 XT; it was hot, noisy, and performed worse in most cases than the prior R5*0 architecture. Nvidia fixes the G80's issues and releases the (G92) 8800 GT that same year in October, with ATI quickly releasing the (RV670) HD 3870 to fix the horror that was the R600. The HD 3870 was not powerful enough to topple Nvidia's G80s or the later G92s, so ATI, perhaps with a hint of desperation, releases dual-GPU cards to try and take the performance crown. To their credit, the HD 3870 at least corrected most of the issues with the HD 2900 XT. To add further insult to ATI's failings, Nvidia simply refreshed the G92 for the GeForce 9000 series, presumably enjoying decent profit.
  • Mid-2008 comes around and Nvidia releases its new (GT200) GTX 280 just before ATI releases a decent answer to the G92/G80, the (RV770) HD 4870. Unfortunately, while the HD 4870 finally took the lead from the G92s, it could not match the GT200s, so once again ATI relied on dual-GPU cards to hassle Nvidia's latest offerings. That can't have been cheap for them to do.
  • 2009 sees some refreshing from both sides, with the RV790 and GT200b appearing. By the end of 2009, ATI releases its new TeraScale 2-based (Cypress XT) HD 5870.
  • We had to wait until the beginning of 2010 to see Nvidia's next architecture: the Fermi-based (GF100) GTX 480. While Fermi took the outright performance title, it came at a cost; the GPU was hot and noisy and, to make matters worse, not that much faster than ATI's latest offerings (or slower, if you consider the dual-GPU cards). This was the first time in a long while that ATI/AMD had released something arguably better than what Nvidia could offer. To try and recover from the embarrassment, Nvidia releases the (GF110) GTX 580 at the tail end of 2010, possibly under added pressure from AMD's latest Barts XT chips. Luckily for Nvidia, December saw AMD's (Cayman XT) HD 6970 flop (to a degree); the VLIW4 architecture's performance simply would not scale as expected.
  • 2012 is the year something major occurs; it's the first time, even with a staggered launch, that the companies don't go head to head with the best their architectures can offer. January sees AMD release the (Tahiti XT) HD 7970, but in response Nvidia only releases its mid-tier Kepler GK104 as the GTX 680. As I have stated before, the GK110 was revealed the same month the GK104 launched and was released in November of the same year.
So back to the original argument: while there has always been a staggered launch, it wasn't until the last few generations that something like Kepler vs. Tahiti occurred, a mid-tier GPU going up against a top-tier GPU with the same or better performance (until the Tahiti XT2, at least). This meant Nvidia, rather than fully destroying AMD's offerings with the release of the GK110, enjoyed large profits on a marked-up mid-tier GPU whilst keeping up an illusion of competition. History has now repeated itself with the release of Maxwell. Is this Nvidia being kind to AMD? Or are they just looking to fool the consumer and enjoy larger profits on a marked-up chip? Price and performance have always conveniently slotted in between the two brands, even when such a difference in architectural performance occurs.
Very good post. :)
Posted on Reply
#127
Nabarun
WTF is going on ??? :confused:
Posted on Reply
#128
eidairaman1
The Exiled Airman
arbiter: That happens on how many 970 cards, a handful? How many 290(X) reference cards throttled unless you ran them at vacuum-cleaner noise levels? All of them. AMD clearly screwed up by using that reference cooler, and most reviewers slammed them for it, rightfully so.
Just like this:

www.techpowerup.com/forums/threads/205648/unread
Posted on Reply
#130
Wolf32
What's crazy is that two AMD R9 290Xs in CrossFire are still faster and smoother in new games like Watch Dogs and other new titles than the GTX 900 series in SLI. So if you are running at high resolutions, the AMD cards are still better. In single-card match-ups the GTX cards are faster, but when running two together, AMD's CrossFire comes out ahead. The GTX 900s win the power-usage battle, but with more than one card they lose in dual mode at higher resolutions, because SLI needs to be upgraded. Check out the HardOCP review of the GTX 980 in SLI vs. the 290X in CrossFire; go to their site and look for yourself. Sure, the AMD cards use more power, but who cares if they deliver better performance at higher resolutions like 4K? I'm more interested in the performance, and SLI appears to be broken at the higher resolutions I play at. I got the GTX 980 because it was supposed to be better, and it is in a single-card setup, but in dual it's not.
Posted on Reply
#131
Nabarun
Wolf32: What's crazy is that two AMD R9 290Xs in CrossFire are still faster and smoother in new games like Watch Dogs and other new titles than the GTX 900 series in SLI. So if you are running at high resolutions, the AMD cards are still better. In single-card match-ups the GTX cards are faster, but when running two together, AMD's CrossFire comes out ahead. The GTX 900s win the power-usage battle, but with more than one card they lose in dual mode at higher resolutions, because SLI needs to be upgraded. Check out the HardOCP review of the GTX 980 in SLI vs. the 290X in CrossFire; go to their site and look for yourself. Sure, the AMD cards use more power, but who cares if they deliver better performance at higher resolutions like 4K? I'm more interested in the performance, and SLI appears to be broken at the higher resolutions I play at. I got the GTX 980 because it was supposed to be better, and it is in a single-card setup, but in dual it's not.
WTF, man! I hope my friends aren't gonna waste many words on this. AMD sucks. I wish they didn't. But they do.
Posted on Reply
#132
arbiter
Wolf32: What's crazy is that two AMD R9 290Xs in CrossFire are still faster and smoother in new games like Watch Dogs and other new titles than the GTX 900 series in SLI. So if you are running at high resolutions, the AMD cards are still better. In single-card match-ups the GTX cards are faster, but when running two together, AMD's CrossFire comes out ahead. The GTX 900s win the power-usage battle, but with more than one card they lose in dual mode at higher resolutions, because SLI needs to be upgraded. Check out the HardOCP review of the GTX 980 in SLI vs. the 290X in CrossFire; go to their site and look for yourself. Sure, the AMD cards use more power, but who cares if they deliver better performance at higher resolutions like 4K? I'm more interested in the performance, and SLI appears to be broken at the higher resolutions I play at. I got the GTX 980 because it was supposed to be better, and it is in a single-card setup, but in dual it's not.
Yeah, the AMD cards still win, but they have 2x the memory bus width and, on top of that, 2x the power consumption as well. But if you are using a 1440p monitor, even TPU's review of GTX 970s in SLI had them matching or beating the 295X2. Being a new GPU architecture, driver updates could probably fix a lot of things.

www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_970_SLI/


Don't care how you cut it, 300 watts is a lot.
Posted on Reply
#133
64K
arbiter: Yeah, the AMD cards still win, but they have 2x the memory bus width and, on top of that, 2x the power consumption as well. But if you are using a 1440p monitor, even TPU's review of GTX 970s in SLI had them matching or beating the 295X2. Being a new GPU architecture, driver updates could probably fix a lot of things.

www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_970_SLI/


Don't care how you cut it, 300 watts is a lot.
It seems like a lot, and I guess it depends on where you live and how expensive electricity is, but here in the USA the average is 12 cents per kWh. If you game for 20 hours a week, that extra 300 watts would cost $3.12 per month. That's not a lot, really.
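For what it's worth, a minimal Python sketch of that arithmetic (assuming a constant 300 W of extra draw while gaming and the 12 cents/kWh figure quoted above; the helper name is just for illustration):

    def extra_cost_per_month(extra_watts, hours_per_week, price_per_kwh):
        # extra kWh used in an average month (52 weeks / 12 months)
        kwh_per_month = (extra_watts / 1000.0) * hours_per_week * (52 / 12)
        return kwh_per_month * price_per_kwh

    # 300 W extra, 20 hours of gaming per week, $0.12 per kWh
    print(round(extra_cost_per_month(300, 20, 0.12), 2))  # -> 3.12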
Posted on Reply
#134
Wolf32
When it comes to gaming, it's about the FPS, not how much power it uses. AMD at 4K still holds the upper hand, and that's what I'm after. Nvidia needs to fix SLI; it does not work as well as CrossFire, and if you read the article you would see it will not be fixed until Nvidia changes SLI in hardware like AMD did with CrossFire. Nvidia has a great card, but they still have it handicapped on the memory side with the small bus. They either do not have enough memory or they make the bus too small; Nvidia needs to correct this and then they would have a great card. Look, I have a GTX 980, but after seeing this test it looks like I went the wrong way, since I will be playing at 4K. I would have been better off with AMD's CrossFire, performance-wise. I'm not concerned about how many watts they use but how many FPS they deliver and how consistently those frames are produced, and AMD has the better cards in that regard. Two R9 290Xs will cost hundreds less than two GTX 980s, and that savings will cover any extra cost in electricity. And I would also have better performance to boot at the higher resolutions than what Nvidia provides. Read this: www.hardocp.com/article/2014/10/27/nvidia_geforce_gtx_980_sli_4k_video_card_review/5#.VFMy2ul0y88
Posted on Reply
#135
arbiter
Wolf32: Nvidia needs to fix SLI; it does not work as well as CrossFire, and if you read the article you would see it will not be fixed until Nvidia changes SLI in hardware like AMD did with CrossFire.
What do they need to fix in hardware?
Posted on Reply
#136
Wolf32
The SLI hardware part; just read the article, it will tell you about it.

Frame Rate Consistency and Scaling
We experienced something with SLI we aren't used to at 4K gaming. We experienced some inconsistent frames, some low efficiency, and poor SLI scaling. We were used to seeing this on AMD GPUs until AMD fixed their issues. AMD implemented a technology called Frame Pacing, and ultimately went the hardware route with XDMA on the AMD Radeon R9 290/X.
At the end of the day, what we find is that GeForce GTX 980 SLI performance is left wanting at 4K, not because Maxwell isn't fast, but because the current implementation of SLI is more inconsistent and less efficient compared to AMD's XDMA technology on the AMD Radeon R9 290X. This is a case of aging SLI actually hindering very capable GPUs. SLI needs an upgrade, it needs to evolve.
We do not think the true potential of the GeForce GTX 980 GPUs is being exploited with current 4K SLI gaming. It is being held back from its full potential. If the GTX 980 could be fully and efficiently tapped, two GTX 980 GPUs have the potential to offer a better gameplay experience.
AMD hit NVIDIA hard with this new XDMA technology. Everyone was expecting NVIDIA would strike back with Maxwell by offering its own evolved SLI technology. However, it did not for this generation. That may end up biting NVIDIA in the butt as far as 4K gaming goes in the future.

www.hardocp.com/article/2014/10/27/nvidia_geforce_gtx_980_sli_4k_video_card_review/9#.VFNXfOl0y88
Posted on Reply
#137
Tonduluboy
64K: It seems like a lot, and I guess it depends on where you live and how expensive electricity is, but here in the USA the average is 12 cents per kWh. If you game for 20 hours a week, that extra 300 watts would cost $3.12 per month. That's not a lot, really.
In my country the average is 28 cents per kWh. I've got 2 kids, and they game around 4 hours a day (2 hours each), so the extra watts cost me about $10 a month, or $120 a year. Let's say the lifespan of the card is 3 years before I upgrade to a new one; the 3-year cost of the extra electricity is $360 (I could buy a new GTX 970 with that).
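The same back-of-the-envelope arithmetic roughly reproduces those figures (assuming about 300 W of extra draw, 4 hours of gaming a day, and the 28 cents/kWh rate mentioned):

    kwh_per_month = 0.3 * 4 * 30            # ~36 kWh of extra energy per month
    monthly_cost = kwh_per_month * 0.28     # ~$10 per month at 28 cents/kWh
    print(round(monthly_cost, 2), round(monthly_cost * 36, 2))  # -> 10.08 362.88 (~$360 over 3 years)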
Posted on Reply
#138
eidairaman1
The Exiled Airman
Tonduluboy: In my country the average is 28 cents per kWh. I've got 2 kids, and they game around 4 hours a day (2 hours each), so the extra watts cost me about $10 a month, or $120 a year. Let's say the lifespan of the card is 3 years before I upgrade to a new one; the 3-year cost of the extra electricity is $360 (I could buy a new GTX 970 with that).
Which country? Which currency?
Posted on Reply
#140
Tonduluboy
In my country, where the average income is US$1,000-1,500 a month, a lot of people, especially those living in the city, pay $100 per month for electricity alone. That's 10% of their monthly income.
So any GPU using less power is welcome.
The 290X Sapphire Tri-X OC recently went on sale for $325 in my location, way cheaper compared to its introduction price of $615. The gaming performance of this card is on par with the Gigabyte G1 GTX 970 (the 970 is slightly more expensive than the 290X), but the 970 uses wayyy less power. So for me, I will choose whichever card has lower power consumption, because in the long run I will save a lot of $$$ on the electricity bill.
Posted on Reply