Thursday, January 29th 2015
AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price
AMD decided to cash in on the GeForce GTX 970 memory controversy with a bold move and a cheap (albeit accurate) shot. The company is having its add-in board (AIB) partners lower the pricing of its Radeon R9 290X graphics card, which offers comparable levels of performance to the GTX 970, down to as low as US $299.
And then there's a gentle reminder from AMD to graphics card buyers with $300-ish in their pockets. With AMD, "4 GB means 4 GB." AMD also emphasizes that the R9 290 and R9 290X can fill their 4 GB video memory to the last bit, and feature a 512-bit wide memory interface, which churns out 320 GB/s of memory bandwidth at reference clocks, something the GTX 970 can't achieve, even with its fancy texture compression mojo.
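For reference, the bandwidth math is simple: bus width in bits times the effective memory data rate, divided by 8. Here's a quick back-of-the-envelope sketch using the reference GDDR5 data rates (5 Gbps for Hawaii, 7 Gbps for the GTX 970); treat the figures as rough arithmetic, not vendor-quoted numbers.

```python
# Peak theoretical memory bandwidth: bus width (bits) * effective data rate (Gbps) / 8.
# Reference-spec values; factory-overclocked cards will differ.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8

# R9 290/290X: 512-bit bus, 5 Gbps GDDR5 (1250 MHz memory clock, quad-pumped)
print(bandwidth_gb_s(512, 5.0))   # 320.0 GB/s

# GTX 970: 256-bit bus, 7 Gbps GDDR5, before even considering
# the 3.5 GB + 0.5 GB memory partitioning
print(bandwidth_gb_s(256, 7.0))   # 224.0 GB/s
```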
181 Comments on AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price
Well... I love both brands, but the 970 "drama" is still real... even if it doesn't impact real-world performance, they still tried to hide it, and that fact alone is enough for me (not that any other brand doesn't "cheat", but still :roll: )
Waiting to see some 2nd-hand 290/290X prices drop where I am... if it reaches my country before the 380X or something else from NV (I think my 290 will sit on my shelf by that time... I can't bring myself to sell the best card I've ever had :roll: every category included, bang for buck: check)
Robust construction, many with backplates. As above, you need good air movement or you can tell the fans will ramp up; swap in different fans and it's quiet.
It might be too early for 380X pricing to come out, but what the heck, as good a place as any to start...
AMD shouldn't be that blatant with the "4 GB is 4 GB" line, though; better to just point to the full 320 GB/s of memory bandwidth utilization on the 512-bit bus.
The work has already been done on these cards, and marginal costs are low. What AMD needs is market share, and this is a smart move for them. If/when Nvidia cuts the price of 970s, they can follow them down, and then the 380X will be released, which will hopefully blow away the 980, and they can sell that one for a premium.
Besides, I'm up for a second cheap and "tainted" 970; keep me posted on any great deals you see. If what you say is true, they won't sell out quickly anyway.
Really though, the card would throttle even with the fan on full blast! (running FurMark, mind you)
That's management gold right there!
extreme.outervision.com/PSUEngine has proved pretty reliable for me most of the time
Minimum PSU Wattage: 478 W
Recommended PSU Wattage: *
I have a 650 W bronze Integra R2 (a good cheap PSU that rates almost like a silver cert) and I OC my 290 to 290X stock levels (with some runs for benchies @ 1150/1500) without any hiccups
edit: the total wattage includes 5x 120 mm LED fans, 1x 140 mm LED fan, 1x 60 mm RAM fan, 2x Phobya DC12-220 pumps, 1x 30 cm 12 V LED strip, 1 SSD, and 2x 7,200 RPM SATA III HDDs
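For what it's worth, you can sanity-check a figure like that by hand: sum up rough per-component draw, then leave headroom so the PSU sits around 80% load at peak. A minimal sketch below; the wattages are my own ballpark guesses for a build like the one above, not OuterVision's database values.

```python
# Rough PSU sizing: sum estimated component draw, then add headroom so the
# unit runs at roughly 80% load at peak. All wattages are ballpark guesses.

components_w = {
    "CPU (overclocked)":        125,
    "R9 290 at 290X clocks":    275,
    "motherboard + RAM + SSD":   60,
    "2x 7200 RPM HDDs":          20,
    "fans, pumps, LED strip":    40,
}

peak_draw_w = sum(components_w.values())
recommended_w = peak_draw_w / 0.8   # keep the PSU near 80% load at worst case

print(f"Estimated peak draw: {peak_draw_w} W")        # 520 W
print(f"Recommended PSU:     {recommended_w:.0f} W")  # 650 W
```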
$1 = RM3.6 today!
380X is interesting to me too.
Using tried-and-tested technology at that die size actually keeps costs lower; the GM chip is newer tech and costlier to implement, so more bad chips are likely during the ramp-up, which means higher costs.
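To put rough numbers behind that reasoning: with a simple Poisson yield model, the cost per good die climbs with both die area and the defect density of an immature ramp. A minimal sketch below; the wafer cost and defect densities are made-up illustrative values, not actual foundry data.

```python
import math

# Simple Poisson yield model: yield = exp(-die_area * defect_density).
# Wafer cost, wafer area, and defect densities are illustrative guesses only.

def cost_per_good_die(die_area_mm2: float, defects_per_mm2: float,
                      wafer_cost: float = 5000.0,
                      wafer_area_mm2: float = 70000.0) -> float:
    dies_per_wafer = wafer_area_mm2 / die_area_mm2             # ignores edge loss
    yield_fraction = math.exp(-die_area_mm2 * defects_per_mm2)
    return wafer_cost / (dies_per_wafer * yield_fraction)

# Mature process vs. an early ramp assumed to have double the defect density:
print(cost_per_good_die(438, 0.001))   # ~Hawaii-sized die, mature ramp   -> ~$48 per good die
print(cost_per_good_die(398, 0.002))   # ~GM204-sized die, early ramp     -> ~$63 per good die
```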
EA didn't even bother to make a 64-bit binary, which shows just how little effort the company thinks it needs to put into a game it expects to extract hundreds of dollars per person from. (Prior versions could run into memory limitations, since a 32-bit process tops out at 4 GB of address space, given enough 3rd-party content in conjunction with the tons of expansions needed to make the game more interesting.)
They have always worked with hardware companies during development to ensure compatibility... it's just smart business.
Hell, if game developers had embraced new DX APIs, we could have been on DX14 by now.
I'm sure they felt a little slapped in the face when DX11 was released and game developers continued using DX9.