Thursday, November 6th 2014

MSI Announces Radeon R9 290X Gaming 8GB Graphics Card

MSI is proud to announce the availability of the new R9 290X GAMING 8G graphics card. Packed with 8 GB of GDDR5 memory operating at 5500 MHz and a host of extra features, the new 290X GAMING 8G is sure to drive UltraHD gaming resolutions without any problem. The MSI Twin Frozr IV Advanced ensures your card runs cool so you can enjoy maximum performance, while AMD's PowerTune technology enables the R9 290X GAMING 8G to run at its highest clock speeds.

The card comes with support for the latest industry standards and thrilling new technology, such as Mantle support in Battlefield 4. Thanks to the bundled MSI GAMING App, gamers can quickly switch between three presets, including a Silent Mode optimized for power efficiency and an OC Mode to get the most power out of your graphics card, without worrying about learning how to overclock. The R9 290X GAMING 8G has been designed to give you a fluid and silent gaming experience that delivers true next-gen performance at 4K UHD resolutions and up, without sacrificing thermals or noise.

Stay cool in combat
The MSI Twin Frozr IV Advanced has been completely customized for the R9 290X GAMING 8G graphics card to deliver the best thermal experience. The design uses a larger copper base for heat absorption, and the heat pipes are in contact with a bigger part of the heat sink and exposed to more airflow, keeping temperatures optimal so the GPU can sustain its highest performance. Combined with the dual form-in-one heat sinks that add both cooling capacity and structural reinforcement, the R9 290X GAMING 8G stays cool, quiet and safe.

Optimized for Gamers
The new R9 290X GAMING 8G is packed with features that benefit every gamer. AMD TrueAudio technology allows far more realistic positional audio, with the added benefit of surround sound over connected displays. Mantle allows game developers to speak directly to the GPU, optimizing its performance. This can all be displayed at a glorious UltraHD / 4K resolution and beyond, as the new R9 290X GAMING 8G offers unmatched performance at 4K resolutions. Displays connect easily through the DVI, HDMI and DisplayPort connectors, and up to six dedicated displays can be driven by the R9 290X GAMING 8G for an amazing Eyefinity experience.

R9 290X GAMING 8G Technical specifications
  • GPU: Hawaii XT
  • Clock speed: 1040 MHz (OC Mode)
  • Memory size / Speed: 8192 MB / 5500 MHz
  • Connectivity: DisplayPort / HDMI / 2x DL-DVI-D
  • Card Dimensions: 276x127x39 mm
  • Afterburner OC support: GPU / VDDCI Overvoltage
  • Power connectors: 1x 8-pin, 1x 6-pin PCI Express
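
As a quick sanity check on those figures, the card's peak memory bandwidth follows directly from the effective memory clock and the memory bus width. The short Python sketch below is illustrative only and assumes the 512-bit memory interface of the Hawaii XT GPU, which is not part of the specification list above.

```python
# Back-of-the-envelope peak memory bandwidth for the R9 290X GAMING 8G.
# The 512-bit bus width is an assumption based on the Hawaii XT GPU;
# it is not stated in the specification list above.
effective_rate_mts = 5500   # GDDR5 effective data rate, MT/s ("5500 MHz")
bus_width_bits = 512        # assumed Hawaii XT memory interface width

bandwidth_gbs = effective_rate_mts * 1e6 * bus_width_bits / 8 / 1e9
print(f"Peak memory bandwidth: ~{bandwidth_gbs:.0f} GB/s")  # ~352 GB/s
```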

49 Comments on MSI Announces Radeon R9 290X Gaming 8GB Graphics Card

#1
the54thvoid
Super Intoxicated Moderator
Here we go with overkill. That goes for when Nvidia release their version too (as recently rumoured).
new 290X GAMING 8G is sure to drive UltraHD gaming resolutions without any problem.
That is a lie. One card?



Single cards don't yet run 4K... Their PR should be more honest and mention CrossFire. And that runs fine on 4 GB, does it not?

Just in from Hexus:

#2
Aquinus
Resident Wat-man
the54thvoid: That is a lie. One card?
You do realize that even if you run CrossFire, you're still limited by the frame buffer on just one of the GPUs. The benefit of adding more memory is for multiple GPUs, not a single GPU. I just feel that needs to be thrown out there.
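
A minimal, illustrative sketch of that point, assuming alternate-frame rendering where every card mirrors the full set of textures and buffers (the helper below is hypothetical, not from the thread):

```python
# In CrossFire/SLI alternate-frame rendering, each card holds its own copy
# of the frame data, so extra cards add compute but not usable VRAM.
def crossfire_vram(per_card_vram_gb: float, num_cards: int):
    installed = per_card_vram_gb * num_cards   # what the spec sheets add up to
    usable = per_card_vram_gb                  # what a single frame can actually address
    return installed, usable

print(crossfire_vram(4, 2))  # (8, 4): two 4 GB cards still give a 4 GB frame buffer
print(crossfire_vram(8, 2))  # (16, 8): two 8 GB cards give an 8 GB frame buffer
```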
#3
the54thvoid
Super Intoxicated Moderator
Aquinus: You do realize that even if you run CrossFire, you're still limited by the frame buffer on just one of the GPUs. The benefit of adding more memory is for multiple GPUs, not a single GPU. I just feel that needs to be thrown out there.
Seriously... you don't think I know that?

It's been shown the extra VRAM does very little - for now. In a year or so, we may need it - likely so given advances. But it's the GPU horsepower that is lacking. Even two top-end GPUs struggle in top-tier games to get to 60 FPS.
#4
Aquinus
Resident Wat-man
the54thvoid: Seriously... you don't think I know that?

It's been shown the extra VRAM does very little - for now. In a year or so, we may need it - likely so given advances. But it's the GPU horsepower that is lacking. Even two top-end GPUs struggle in top-tier games to get to 60 FPS.
Just because it doesn't have an impact now doesn't mean it won't down the road. I'm not saying that it isn't a marketing ploy, I'm just saying there are cases where AA cranked up in CrossFire with high-resolution textures will simply use more memory, and going forward it's not like things will go backward, with developers starting to make lower-resolution textures and smaller games.

Also it will run 4K, it just won't run it well. Intel IGPs have been able to do 4K since IVB; that doesn't mean they do it well, though.
#5
the54thvoid
Super Intoxicated Moderator
Aquinus: ...Also it will run 4K, it just won't run it well...
Just so we're clear, I did quote the article in my first post:
is sure to drive UltraHD gaming resolutions without any problem
without any problem =/= just won't run it well

We both agree, let's not be mistaken on that one. An 8 GB GPU is not good enough now for 4K. Next year's AMD & Nvidia performance GPUs at 8 GB might be a different story. There just isn't a point in strapping 8 GB to a GPU for gaming right now.

The only equivalent I have is that the overclocked Titan Z running at 1000-1100 runs nearly as fast as stock SLI 780 Tis, but its extra 3 GB of VRAM does little if anything for performance*. Unfortunately, I can't find any bloody review site that has benched overclocked Titan Zs in the same format they've tested SLI 780 Tis (at least not in English).

* though admittedly, a 3 GB frame buffer at 4K is too low, so the 780 Ti stutter issues are from that. The 4 GB on the 900 series makes that improvement.
#6
D007
I honestly don't think we will need 8 GB of memory for at least 4 years.
#7
btarunr
Editor & Senior Moderator
Aquinus: Just because it doesn't have an impact now doesn't mean it won't down the road. I'm not saying that it isn't a marketing ploy, I'm just saying there are cases where AA cranked up in CrossFire with high-resolution textures will simply use more memory, and going forward it's not like things will go backward, with developers starting to make lower-resolution textures and smaller games.

Also it will run 4K, it just won't run it well. Intel IGPs have been able to do 4K since IVB; that doesn't mean they do it well, though.
You won't need AA with 4K, unless you're using 65-inch or 80-inch 4K TVs, instead of 28-inch or 32-inch monitors. At monitor sizes, the jagged lines simply aren't discernible.
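
As a rough illustration of that size argument, here is a small Python sketch (illustrative panel sizes only, not from the thread) showing how pixel density drops as the same 3840x2160 grid is stretched over a larger diagonal:

```python
import math

# Approximate pixel density (PPI) of a 3840x2160 panel at various diagonals.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

for size in (28, 32, 65, 80):
    print(f'{size}" 4K panel: ~{ppi(3840, 2160, size):.0f} PPI')
# 28" ~157 PPI, 32" ~138 PPI, 65" ~68 PPI, 80" ~55 PPI
```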
#8
Aquinus
Resident Wat-man
the54thvoid: * though admittedly, a 3 GB frame buffer at 4K is too low, so the 780 Ti stutter issues are from that. The 4 GB on the 900 series makes that improvement.
That's really my point though. How long will it be until we utilize 4 GB if 3 GB GPUs are already starting to show inadequacy? Yeah, 8 GB is overkill, but that's simply the next step up from 4 GB because of the width of the memory bus.
btarunr: You won't need AA with 4K, unless you're using 65-inch or 80-inch 4K TVs, instead of 28-inch or 32-inch monitors. At monitor sizes, the jagged lines simply aren't discernible.
I've yet to use a 4K display so I can't speak from experience, but you can very well still have jagged lines simply from how the scene is rendered, regardless of monitor resolution. That has nothing to do with the monitor itself. You're right that higher resolution would mean more detail on that line, but that doesn't mean it gets smoothed out if it's jagged in the first place. I just find it hard to believe that it would eliminate it altogether, so I think a statement like that is rather bold.
#9
chinmi
More VRAM but it's still the same GPU, will it matter much? I don't think the GPU has the juice to process all that data in that huge VRAM.
#10
the54thvoid
Super Intoxicated Moderator
Aquinus: That's really my point though. How long will it be until we utilize 4 GB if 3 GB GPUs are already starting to show inadequacy? Yeah, 8 GB is overkill, but that's simply the next step up from 4 GB because of the width of the memory bus.
We're starting to do a dance of circles here. I know that's your point, but the point of my initial post was that putting more VRAM on a GPU that already lacks the grunt to drive the workloads that require such huge amounts of VRAM is pointless.
4K looks to require >3 GB right now on some titles (assuming it's coded well).
4K is too much for a single Hawaii or Maxwell card right now.
The GPU horsepower, cores, and processing output are simply not good enough.
So, for now, on the card in the OP, 8 GB is utterly unnecessary for gaming.

Of course we'll use more VRAM in the future, but on current-generation cards there is no tangible benefit.

I agree with you, but the question we have to answer is: does a 290X benefit from 8 GB of memory? No, it does not. And by the time it does need that 8 GB buffer, it'll be far too weak to use it.

EDIT:

the mark of a good man is one that can take defeat without bitching:

Seems one game does like it:

#11
EarthDog
btarunr: You won't need AA with 4K, unless you're using 65-inch or 80-inch 4K TVs, instead of 28-inch or 32-inch monitors. At monitor sizes, the jagged lines simply aren't discernible.
And that is the thing... the size of the monitor and how close you sit to it will determine if you need AA. But if you pour it on at 4K res... vRAM use skyrockets.

While we only see its benefits in one game at this second, as time goes on, more and more will respond that way... so it is needed now for that title.
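
To put rough numbers on "vRAM use skyrockets": the sketch below estimates only the colour and depth render targets at 4K with MSAA (illustrative assumptions; real games add G-buffers, shadow maps and high-resolution textures on top of this):

```python
# Approximate memory for a single 3840x2160 render target with MSAA.
def target_mb(width: int, height: int, bytes_per_pixel: int = 4, samples: int = 1) -> float:
    return width * height * bytes_per_pixel * samples / 2**20

for samples in (1, 4, 8):
    colour = target_mb(3840, 2160, 4, samples)  # RGBA8 colour buffer
    depth = target_mb(3840, 2160, 4, samples)   # 32-bit depth/stencil
    print(f"{samples}x MSAA: ~{colour + depth:.0f} MB for colour + depth alone")
# 1x ~63 MB, 4x ~253 MB, 8x ~506 MB
```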
#12
64K
Well, according to Tom's Hardware, testing a Sapphire Vapor-X R9 290X 8 GB at 4K showed an average 14% performance increase over an R9 290X 4 GB. Different card than the one in the OP, but a general idea of what to expect.

www.tomshardware.com/reviews/sapphire-vapor-x-r9-290x-8gb,3977.html

If the price is reasonable it might be worth it to some for 4K, but it doesn't look too good right now; according to AnandTech the card is priced at £600 (or £500/$848 ex. VAT) in the UK.
#13
GhostRyder
64K: Well, according to Tom's Hardware, testing a Sapphire Vapor-X R9 290X 8 GB at 4K showed an average 14% performance increase over an R9 290X 4 GB. Different card than the one in the OP, but a general idea of what to expect.

www.tomshardware.com/reviews/sapphire-vapor-x-r9-290x-8gb,3977.html

If the price is reasonable it might be worth it to some for 4K, but it doesn't look too good right now; according to AnandTech the card is priced at £600 (or £500/$848 ex. VAT) in the UK.
If it's that much it's way too expensive, but it still could be useful in an extreme setup for 4K, like with 3-4 cards. At least you could say you're future-proofed for a while in the area of VRAM.

I think the card is a little late, in all honesty, for a release like this, but I guess it's something that would be desired by hardcore users. At this point, if the price is as listed, it's not going to be much of a deal except in a very small number of situations, some of which could also involve professional work. But I believe this one is more focused on gaming...
#14
the54thvoid
Super Intoxicated Moderator
64K: Well, according to Tom's Hardware, testing a Sapphire Vapor-X R9 290X 8 GB at 4K showed an average 14% performance increase over an R9 290X 4 GB. Different card than the one in the OP, but a general idea of what to expect.

www.tomshardware.com/reviews/sapphire-vapor-x-r9-290x-8gb,3977.html

If the price is reasonable it might be worth it to some for 4K, but it doesn't look too good right now; according to AnandTech the card is priced at £600 (or £500/$848 ex. VAT) in the UK.
£380 at OcUK on this week's offer.

www.overclockers.co.uk/showproduct.php?prodid=GX-353-SP
#15
badtaylorx
I guess it was too much to ask to cover those other 6 VRAM chips....

just lazy... "tsk tsk"
#16
GhostRyder
the54thvoid: £380 at OcUK on this week's offer.

www.overclockers.co.uk/showproduct.php?prodid=GX-353-SP
Now that is a much more reasonable price point, but still a little up there in my book. Though I guess you are getting the 8 GB and the Vapor-X variant of the card, which generally are some of the more expensive versions.
#17
RCoon
the54thvoid: We're starting to do a dance of circles here. I know that's your point, but the point of my initial post was that putting more VRAM on a GPU that already lacks the grunt to drive the workloads that require such huge amounts of VRAM is pointless.
4K looks to require >3 GB right now on some titles (assuming it's coded well).
4K is too much for a single Hawaii or Maxwell card right now.
The GPU horsepower, cores, and processing output are simply not good enough.
So, for now, on the card in the OP, 8 GB is utterly unnecessary for gaming.

Of course we'll use more VRAM in the future, but on current-generation cards there is no tangible benefit.

I agree with you, but the question we have to answer is: does a 290X benefit from 8 GB of memory? No, it does not. And by the time it does need that 8 GB buffer, it'll be far too weak to use it.

EDIT:

the mark of a good man is one that can take defeat without bitching:

Seems one game does like it:

8 GB on a 290X is basically worthless; there is no other way around the subject. AMD has nothing else right now, so the only way to drive sales is to arbitrarily add extra numbers to their card specs. A 1 FPS increase in a single game at 4K from doubling the RAM and dropping an additional £50 doesn't make sense to me. By all accounts I see no reason to recommend an 8 GB 290 or 290X to anyone; I see absolutely zero cases where any gamer would need it besides bragging rights, and even then somebody can wander along with a 4 GB 980 and boast superiority on node size and power consumption.

In the event future games require 8 GB of VRAM due to massive improvements in textures, you still wouldn't buy a 290X, because by then the core horsepower behind the memory will still be the bottleneck.

tl;dr: 8 GB of VRAM is irrelevant on current-generation GPUs, both NVIDIA and AMD.
badtaylorx: I guess it was too much to ask to cover those other 6 VRAM chips....

just lazy... "tsk tsk"
Modern VRAM chips don't need anything beyond the normal airflow they receive from the fans. Heatsinks simply aren't a requirement these days due to the max temp allowances.
#18
fullinfusion
Vanguard Beta Tester
badtaylorx: I guess it was too much to ask to cover those other 6 VRAM chips....

just lazy... "tsk tsk"
They are covered, with the cooler and thermal tape.
#19
DavidFennerR
RCoon: 8 GB on a 290X is basically worthless; there is no other way around the subject. AMD has nothing else right now, so the only way to drive sales is to arbitrarily add extra numbers to their card specs. A 1 FPS increase in a single game at 4K from doubling the RAM and dropping an additional £50 doesn't make sense to me. By all accounts I see no reason to recommend an 8 GB 290 or 290X to anyone; I see absolutely zero cases where any gamer would need it besides bragging rights, and even then somebody can wander along with a 4 GB 980 and boast superiority on node size and power consumption.

In the event future games require 8 GB of VRAM due to massive improvements in textures, you still wouldn't buy a 290X, because by then the core horsepower behind the memory will still be the bottleneck.

tl;dr: 8 GB of VRAM is irrelevant on current-generation GPUs, both NVIDIA and AMD.

Modern VRAM chips don't need anything beyond the normal airflow they receive from the fans. Heatsinks simply aren't a requirement these days due to the max temp allowances.
You would be surprised how many people talk about the card's RAM quantity first when asked what card they have, even before the card version or brand. "What video card do you have?" "Mmmm... it's an NVIDIA, GT something, but it has 4 gigs of RAM! (smiles)". Most buyers are ignorant about these things, so I say 8 GB is a good marketing strategy, especially since AMD is doing so badly lately... I really hope they sell like crazy, so we have more competition for some years.
On the other hand, the real winners are the people that use these cards for rendering. I use Cycles rendering with CUDA, and I have easily gone over 4 GB with scenes; 8 GB would really come in handy!! So thanks AMD, because now NVIDIA will have to respond to take some of the ignorant market themselves :D
#22
eidairaman1
The Exiled Airman
These were too late; they should have been released earlier this year.
#23
Fluffmeister
To be fair, Sapphire have had an 8 GB model out for ages, although it also costs around £600 here in the UK.

Even at £380 these really don't make much sense; reference cards plus the odd custom 980 can be had for £20-30 more, and that is just plain better.

Beyond that, cards like the MSI 290X 4GB Gaming can be had for over £100 less, let alone the amazing, all-conquering GTX 970.
#24
haswrong
D007: I honestly don't think we will need 8 GB of memory for at least 4 years.
Some of us may find a use for it. I don't object to a large amount of memory... although I would be happier if the GPUs got a little more horsepower to utilize it with more joy. I don't ask for a quantum computer here. These days you can optimize a chip's blueprint within a month or two with sophisticated software, and the next month you can start test production... Seriously, I don't know why AMD does not advance in this field anymore... 4K monitors arrived a while ago, but we are stuck in the bronze age. No FreeSync in sight either... AMD is a bubble that's about to pop.
eidairaman1: These were too late; they should have been released earlier this year.
Or at least release it with better-clocked GPUs and memory... that would spur some interest in these aging monster devices.
#25
D007
I keep seeing people saying 4K is too much for one card.
I have to say, in a lot of cases you are wrong.
I was playing in 4K on one GTX 680 for some time.
One GTX 980 runs Star Citizen at 4K perfectly smoothly.
Tested it myself on ultra.

I have never seen anywhere near 8 GB of memory usage, though.
The most I have seen is like 2 GB.