
AMD Cuts Prices of R9 290 Series and R9 280 Series Even Further

Joined
Mar 28, 2014
Messages
586 (0.15/day)
Processor AMD FX-8320
Motherboard AsRock 970 PRO3 R2.0
Cooling Thermalright Ultra120 eXtreme + 2 LED Green fans
Memory 2 x 4096 MB DDR3-1333 A-Data
Video Card(s) SAPPHIRE 4096M R9 FURY X 4G D5
Storage ST1000VX000 • SV35.6 Series™ 1000 GB 7200 rpm
Display(s) Acer S277HK wmidpp 27" 4K (3840 x 2160) IPS
Case Cooler Master HAF 912 Plus Black + Red Lights
Audio Device(s) Onboard Realtek
Power Supply OCZ ProXStream 1000W
Mouse Genius NetScroll 100X
Keyboard Logitech Wave
Software Windows 7 Ultimate 64-bit
The following section may be a bit disjointed, as I wrote it late at night while trying to process the staggered launch of the chips, the reviews, the performance figures, and my own observations at the time. Please bear with me.
  • AMD's (then ATI's) last decent lead over Nvidia was during 2006, with an architecture born from a time before AMD's acquisition of ATI. The (R400) X8** series and the later (R5*0) X19** series saw many successes against Nvidia (the GeForce 6000 and 7000 series, respectively) and ultimately won the fixed-pipeline/fixed-shader battle. To end ATI's reign of 2006, Nvidia released the (G80) 8800 GTX. The X1950 XTX still managed to trade punches with the hot and noisy G80, but ultimately lost out in performance, particularly once optimizations for the newer GPGPU architecture came about.

  • Come May 2007, ATI releases the abysmal (R600) HD 2900 XT; it is hot, noisy, and in most cases performs worse than the prior R5*0 architecture. Nvidia fixes the G80's issues and releases the (G92) 8800 GT that same year in October, with ATI quickly releasing their (RV670) HD 3870 to fix the horror which was the R600. The HD 3870 is not powerful enough to topple Nvidia's G80s or the later G92s, so ATI, perhaps with a hint of desperation, releases dual-GPU cards to try to take the performance crown. To their credit, the HD 3870 at least corrects most of the issues with the HD 2900 XT. To add further insult to ATI's failings, Nvidia simply refreshes the G92 for the GeForce 9000 series, possibly enjoying decent profit.

  • Mid-2008 comes around and Nvidia releases their new (GT200) GTX 280 just before ATI releases a decent answer to the G92/G80, the (RV770) HD 4870. Unfortunately, while the HD 4870 finally takes the lead from the G92s, it cannot match the GT200s, so, again, ATI relies on dual-GPU cards to hassle Nvidia's latest offerings. This can't be cheap for them to do.

  • 2009 sees some refreshing from both sides, with the RV790 and GT200b appearing. By the end of 2009, ATI releases their new TeraScale 2-based (Cypress XT) HD 5870.

  • We had to wait till the beginning of 2010 to see Nvidia's next architecture: the Fermi-based (GF100) GTX 480. While Fermi took the outright performance title, it came at a cost; the GPU was hot and noisy and, to make matters worse, not that much faster than ATI's latest offerings (or slower, if you consider the dual-GPU cards). This was the first time in a long while that ATI/AMD had released something arguably better than what Nvidia could offer. To try and recover from their embarrassment, Nvidia releases the (GF110) GTX 580 at the tail end of 2010, possibly with added pressure from AMD's latest Barts XT chips. Luckily for Nvidia, December sees AMD's (Cayman XT) HD 6970 flop (to a degree); the VLIW4 architecture's performance simply would not scale as expected.

  • 2012 is the year something major occurs; it's the first time, even with a staggered launch, that the companies don't go head to head with the best their architectures can offer. January sees AMD release the (Tahiti XT) HD 7970, but in response, Nvidia only releases their mid-tier Kepler, the GK104, as the GTX 680. As I have stated before, the GK110 was revealed the same month as the GK104's release and was released in November of the same year.
So back to the original argument: while there has always been a staggered launch, it wasn't until the last few generations that something like Kepler vs. Tahiti occurred: a mid-tier GPU going up against a top-tier GPU with the same or better performance (until the Tahiti XT2, at least). This meant Nvidia, rather than fully destroying AMD's offerings with the release of the GK110, enjoyed large profits on a marked-up mid-tier GPU whilst keeping up an illusion of competition. History has now repeated itself with the release of the Maxwells. Is this Nvidia being kind to AMD? Or are they just looking to fool the consumer and enjoy larger profits on a marked-up chip? Price and performance have always conveniently slotted in between the two brands, even when such a difference in architecture performance occurs.

Very good post. :)
 
Joined
May 25, 2013
Messages
739 (0.18/day)
Location
Kolkata, India
System Name barely hangin on...
Processor Intel I5 4670K @stock
Motherboard Asus H81m-cs (nothing else available now)
Cooling CM Hyper 212X (in push-pull)
Memory 16GB Corsair Vengeance Dual Channel 1866MHz
Video Card(s) Asus RX 580 4GB Dual
Storage WD Blue 1TB, WD Black 2TB, Samsung 850 Evo 250GB
Display(s) Acer KG241QP 144Hz
Case Cooler Master CM 690 III (Transparent side panel) - illuminated with NZXT HUE RGB
Audio Device(s) FiiO E10K>Boom 3D>ATH M50/Samson SR850/HD599SE
Power Supply Corsair RM 850
Mouse Redragon M901 PERDITION 16400 DPI Laser Gaming Mouse
Keyboard HyperX Alloy FPS Mechanical Gaming Keyboard (Cherry MX Brown)
Software 7-64bit MBR, 10-64bit UEFI (Not Multi-boot), VBox guests...
WTF is going on ??? :confused:
 


Wolf32

New Member
Joined
Oct 30, 2014
Messages
3 (0.00/day)
What's crazy is that two AMD R9 290Xs in CrossFire are still faster and smoother in new games like Watch Dogs and other new titles than the GTX 900s in SLI. So if you are running at high resolutions, the AMD cards are still better. In single-card matchups the GTX cards are faster, but when running two together, AMD's CrossFire comes out ahead. The GTX 900s win the power-usage battle, but when using more than one, they lose in dual mode at higher resolutions because SLI needs to be upgraded. Check out the HardOCP review of the GTX 980 in SLI vs. the 290X in CrossFire; go to their site and look for yourself. Sure, the AMDs use more power, but who cares if they deliver better performance at higher resolutions like 4K? I'm more interested in the performance, and SLI appears to be broken at the higher resolutions I play at. I got the GTX 980 because it was supposed to be better, and it is in a single-card setup, but in dual mode it's not.
 
Joined
May 25, 2013
Messages
739 (0.18/day)
Location
Kolkata, India
What's crazy is that two AMD R9 290Xs in CrossFire are still faster and smoother in new games like Watch Dogs and other new titles than the GTX 900s in SLI. So if you are running at high resolutions, the AMD cards are still better. In single-card matchups the GTX cards are faster, but when running two together, AMD's CrossFire comes out ahead. The GTX 900s win the power-usage battle, but when using more than one, they lose in dual mode at higher resolutions because SLI needs to be upgraded. Check out the HardOCP review of the GTX 980 in SLI vs. the 290X in CrossFire; go to their site and look for yourself. Sure, the AMDs use more power, but who cares if they deliver better performance at higher resolutions like 4K? I'm more interested in the performance, and SLI appears to be broken at the higher resolutions I play at. I got the GTX 980 because it was supposed to be better, and it is in a single-card setup, but in dual mode it's not.
WTF man! I hope my friends aren't gonna waste many words on this. AMD sucks. I wish they didn't, but they do.
 
Joined
Jun 13, 2012
Messages
1,388 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overlock -.1volts)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
What's crazy is that two AMD R9 290Xs in CrossFire are still faster and smoother in new games like Watch Dogs and other new titles than the GTX 900s in SLI. So if you are running at high resolutions, the AMD cards are still better. In single-card matchups the GTX cards are faster, but when running two together, AMD's CrossFire comes out ahead. The GTX 900s win the power-usage battle, but when using more than one, they lose in dual mode at higher resolutions because SLI needs to be upgraded. Check out the HardOCP review of the GTX 980 in SLI vs. the 290X in CrossFire; go to their site and look for yourself. Sure, the AMDs use more power, but who cares if they deliver better performance at higher resolutions like 4K? I'm more interested in the performance, and SLI appears to be broken at the higher resolutions I play at. I got the GTX 980 because it was supposed to be better, and it is in a single-card setup, but in dual mode it's not.

Yeah, the AMD cards still win, but they have twice the memory-bus width and, on top of that, twice the power consumption as well. And if you are using a 1440p monitor, TPU reviewed GTX 970s in SLI and they were matching or beating the 295X2. Being a new GPU architecture, driver updates could probably still fix a lot of things.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_970_SLI/


Don't care how you cut it, 300 watts is a lot.
 

64K

Joined
Mar 13, 2014
Messages
6,773 (1.73/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
Yeah, the AMD cards still win, but they have twice the memory-bus width and, on top of that, twice the power consumption as well. And if you are using a 1440p monitor, TPU reviewed GTX 970s in SLI and they were matching or beating the 295X2. Being a new GPU architecture, driver updates could probably still fix a lot of things.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_970_SLI/


Don't care how you cut it, 300 watts is a lot.

It seems like a lot, and I guess it depends on where you live and how expensive electricity is, but here in the USA the average is 12 cents per kWh. If you game for 20 hours a week, that extra 300 watts would cost $3.12 per month. That's not a lot, really.
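If anyone wants to sanity-check that figure, here's a minimal sketch of the math; the 4.33 weeks per month (52 weeks / 12 months) is my assumption, and it reproduces the $3.12:

```python
# Rough extra electricity cost of a higher-power GPU setup.
# The wattage, hours, and rate come from the post above;
# weeks_per_month = 4.33 is an assumption (52 weeks / 12 months).

def extra_monthly_cost(extra_watts, hours_per_week, price_per_kwh,
                       weeks_per_month=4.33):
    """Extra cost per month, in the same currency as price_per_kwh."""
    kwh = (extra_watts / 1000.0) * hours_per_week * weeks_per_month
    return kwh * price_per_kwh

# US example: 300 W extra, 20 h/week of gaming, $0.12/kWh
print(round(extra_monthly_cost(300, 20, 0.12), 2))  # -> 3.12
```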
 

Wolf32

New Member
Joined
Oct 30, 2014
Messages
3 (0.00/day)
When it comes to gaming, it's about the FPS, not how much power the card uses. AMD at 4K still holds the upper hand, and that's what I'm after. Nvidia needs to fix SLI; it does not work as well as CrossFire, and if you read the article you would see it will not be fixed until Nvidia changes SLI in hardware the way AMD did with CrossFire. Nvidia has a great card, but they still have it handicapped on the memory side with the small bus. They either do not have enough memory or they make the bus too small; if Nvidia corrected this, they would have a great card. Look, I have a GTX 980, but after seeing this test it looks like I went the wrong way, since I will be playing at 4K. I would have been better off with AMD's CrossFire, performance-wise. I'm not concerned with how many watts they use but with how many FPS they produce and how consistently those frames are delivered, and AMD has the better cards in that regard. Two R9 290Xs will cost hundreds less than two GTX 980s, and that savings will cover any extra cost in electricity. And I would also have better performance to boot at the higher resolutions than what Nvidia provides. Read this: http://www.hardocp.com/article/2014...x_980_sli_4k_video_card_review/5#.VFMy2ul0y88
 
Joined
Jun 13, 2012
Messages
1,388 (0.31/day)
Nvidia needs to fix SLI; it does not work as well as CrossFire, and if you read the article you would see it will not be fixed until Nvidia changes SLI in hardware the way AMD did with CrossFire.

What do they need to fix in hardware?
 

Wolf32

New Member
Joined
Oct 30, 2014
Messages
3 (0.00/day)
The SLI hardware part; just read the article, it will tell you about it.

Frame Rate Consistency and Scaling
We experienced something with SLI that we aren't used to in 4K gaming: inconsistent frames, low efficiency, and poor SLI scaling. We were used to seeing this on AMD GPUs until AMD fixed its issues. AMD implemented a technology called Frame Pacing, and ultimately went the hardware route with XDMA on the AMD Radeon R9 290/X.
At the end of the day, what we find is that GeForce GTX 980 SLI performance is left wanting at 4K, not because Maxwell isn't fast, but because the current implementation of SLI is more inconsistent and less efficient compared to AMD's XDMA technology on the AMD Radeon R9 290X. This is a case of aging SLI actually hindering very capable GPUs. SLI needs an upgrade; it needs to evolve.
We do not think the true potential of the GeForce GTX 980 GPUs is being exploited with current 4K SLI gaming. It is being held back from its full potential. If the GTX 980 could be fully and efficiently tapped, two GTX 980 GPUs have the potential to offer a better gameplay experience.
AMD hit NVIDIA hard with this new XDMA technology. Everyone was expecting NVIDIA to strike back with Maxwell by offering its own evolved SLI technology. However, it did not for this generation. That may end up biting NVIDIA in the butt as far as 4K gaming goes in the future.

http://www.hardocp.com/article/2014...x_980_sli_4k_video_card_review/9#.VFNXfOl0y88
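For anyone wondering what "inconsistent frames" means in practice: reviewers log per-frame render times and look at the spread, not just the average FPS. Here's a minimal sketch of that idea; the frame-time numbers are invented for illustration, not taken from the review:

```python
# Two runs with identical average FPS but very different pacing.
# Frame times are in milliseconds and are made up for illustration.

def consistency(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    # 99th-percentile frame time: the slow frames you feel as stutter
    p99 = sorted(frame_times_ms)[int(n * 0.99) - 1]
    return avg_fps, p99

smooth  = [16.7] * 100          # evenly paced, ~60 FPS
stutter = [12.0, 21.4] * 50     # same average, alternating fast/slow
for name, run in (("smooth", smooth), ("stutter", stutter)):
    fps, p99 = consistency(run)
    print(f"{name}: {fps:.0f} avg FPS, 99th-pct frame time {p99:.1f} ms")
```

Both runs report the same average FPS, but the second has much worse 99th-percentile frame times, which is exactly the kind of inconsistency the review describes.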
 
Joined
Jun 19, 2012
Messages
141 (0.03/day)
Location
Sabah, Malaysia
System Name My White Theme Desktop PC
Processor 4770k / 4670k
Motherboard ASUS
Cooling CM 412 slim
Memory Kingston 8GB DDR3-1600
Video Card(s) GTX 770 / GTX 670
Storage Kingston SSD v300 120GB + 1TB Black + 1TB Green + 1TB Green + 1TB Blue
Display(s) 23" Philips IPS White Slim x 3 / Acer
Case Thermaltake Revo White Snow Edition / Cooler Master
Audio Device(s) Creative Sound Blaster 5.1
Power Supply AC 650w 80+ bronze x 2 White
It seems like a lot, and I guess it depends on where you live and how expensive electricity is, but here in the USA the average is 12 cents per kWh. If you game for 20 hours a week, that extra 300 watts would cost $3.12 per month. That's not a lot, really.

In my country the average is 28 cents per kWh... I have 2 kids. They game around 4 hours a day, 2 hours each, so the extra watts cost me $10 a month, or $120 a year. Let's say the lifespan of the card is 3 years before I upgrade to a new one; the 3-year cost of the extra electricity is $360 (I could buy a new GTX 970 with that).
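Those numbers check out with the same sort of math as the earlier sketch (hours per day instead of per week; 30 days per month is my assumption):

```python
# Malaysian example from the post: 300 W extra, 4 h/day of gaming, $0.28/kWh
kwh_per_month = (300 / 1000.0) * 4 * 30     # 36 kWh
monthly = kwh_per_month * 0.28
print(round(monthly, 2))                    # -> 10.08 per month
print(round(monthly * 36, 2))               # -> 362.88 over 3 years
```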
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,198 (6.64/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
In my country the average is 28 cents per kWh... I have 2 kids. They game around 4 hours a day, 2 hours each, so the extra watts cost me $10 a month, or $120 a year. Let's say the lifespan of the card is 3 years before I upgrade to a new one; the 3-year cost of the extra electricity is $360 (I could buy a new GTX 970 with that).

What country? What currency?
 
Joined
Jun 19, 2012
Messages
141 (0.03/day)
Location
Sabah, Malaysia
In my country, where the average income is USD $1,000-1,500 a month, a lot of people, especially those living in the city, pay $100 per month for electricity alone. That's 10% of monthly income.
So any GPU using less power is welcome.
The 290X Sapphire Tri-X OC recently went on sale for $325 here, way cheaper than its introduction price of $615. Its gaming performance is on par with the Gigabyte G1 GTX 970 (the 970 is slightly more expensive than the 290X), but the 970 uses way less power. So for me, I will choose the card with less power consumption, because in the long run I will save a lot of $$$ on the electricity bill.
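As a rough sanity check on that trade-off, here is a minimal break-even sketch. The $30 price gap and 120 W load-power difference are my assumptions for illustration, not figures from the thread:

```python
# Break-even time: cheaper-but-hungrier 290X vs. pricier-but-efficient 970.
# Assumed for illustration: the 970 costs $30 more and draws ~120 W less
# in games; 4 h of gaming per day at the poster's $0.28/kWh rate.
price_gap     = 30.0    # USD (assumption)
watts_saved   = 120.0   # W (assumption)
hours_per_day = 4
rate          = 0.28    # USD per kWh

monthly_saving = (watts_saved / 1000.0) * hours_per_day * 30 * rate
print(f"saves ~${monthly_saving:.2f}/month; "
      f"breaks even in {price_gap / monthly_saving:.1f} months")
```

Under those assumptions the efficient card pays back its price premium in well under a year, which supports the poster's conclusion.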
 