Thursday, November 6th 2014
MSI Announces Radeon R9 290X Gaming 8GB Graphics Card
MSI is proud to announce the availability of the new R9 290X GAMING 8G graphics card. Packed with 8 GB of GDDR5 memory running at 5500 MHz, along with all of MSI's extra GAMING features, the new 290X GAMING 8G is sure to drive Ultra HD gaming resolutions without any problem. The MSI Twin Frozr IV Advanced cooler keeps the card cool so you can enjoy maximum performance, while AMD's PowerTune technology enables the R9 290X GAMING 8G to run at its highest clock speeds.
The card supports the latest industry standards and thrilling new technology such as Mantle in Battlefield 4. Thanks to the bundled MSI GAMING App, gamers can quickly switch between three presets, including a Silent mode optimized for power efficiency and an OC Mode that gets the most power out of the graphics card, without having to learn how to overclock. The R9 290X GAMING 8G has been designed to deliver a fluid and silent gaming experience with true next-gen performance at 4K UHD resolutions and beyond, without sacrificing thermals or noise.
Stay Cool in Combat
The MSI Twin Frozr IV Advanced has been completely customized for the R9 290X GAMING 8G to deliver the best thermal experience. The design uses a larger copper base for heat absorption, and the heat pipes contact a bigger part of the heat sink while being exposed to more airflow, keeping temperatures optimal so the GPU can sustain its highest performance. Combined with the dual form-in-one heat sinks, which add both cooling capacity and structural reinforcement, the R9 290X GAMING 8G stays cool, quiet and safe.
Optimized for Gamers
The new R9 290X GAMING 8G is packed with features that benefit every gamer. AMD TrueAudio technology enables far more realistic positional audio, with the added benefit of surround sound over connected displays. Mantle allows game developers to talk directly to the GPU, optimizing its performance. All of this can be displayed at a glorious Ultra HD / 4K resolution and beyond, as the new R9 290X GAMING 8G offers unmatched performance at 4K. Displays connect through the DVI, HDMI and DisplayPort connectors, and up to six dedicated displays can be driven by the R9 290X GAMING 8G for an amazing Eyefinity experience.
R9 290X GAMING 8G Technical specifications
- GPU: Hawaii XT
- R9 290X GAMING 8G Clock speeds: 1040 MHz (OC Mode)
- Memory size / Speed: 8192 MB / 5500 MHz
- Connectivity: DisplayPort / HDMI / 2x DL-DVI-D
- Card Dimensions: 276x127x39 mm
- Afterburner OC support: GPU / VDDCI Overvoltage
- Power connectors: 1x 8-pin, 1x 6-pin PCI Express
49 Comments on MSI Announces Radeon R9 290X Gaming 8GB Graphics Card
As far as single-card performance at 4K goes, it depends on the title and settings, but most aren't what one would consider 'playable' (30 fps+) with reasonable texture settings when looking at benchmarks. I'd rather rock 1440 and ultra than 4K and medium... but that's just me.
Not what I would consider playable here... but it isn't an FPS, so more forgiveness is built in, not to mention it's largely a CPU-bound game: www.pugetsystems.com/labs/articles/Star-Citizen-Benchmark-Arena-Commander-v0-8-571/page3
I'm not so sure about next year's games, or with AA, texture mods, and ENB, even though it's currently the fastest and most expensive single card (not counting old and odd Titans).
Comfortably running 4K with single card is still a year or two away.
Crysis is the perfect example of a badly optimized game.
I really don't think about "oh can I play crysis in 4k".
Who does?
I can comfortably play star citizen at 4k on one card.
I understand what you mean and to an extent I agree but most games you will be able to play in 4k on one gtx 980.
I find that on my 50" 4k TV, I use 2X AA though, never more.
Totally unnecessary to use more.
It runs perfectly in it.
Seeing as games will not use 8 GB of VRAM at this point, I would be leery of buying into 8 GB of RAM on a GPU.
Once this massive-RAM idea is utilized and more mainstream, these 8GB cards may not perform like the cards released at that time.
The real question is, when will 4GB not be enough and will it be soon? I think someone earlier posted that at 4k they are using as much as 3.5GB of VRAM. That's really close to eating up 4GB already. I think that's just worth mentioning.
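For a sense of scale (my own back-of-envelope numbers, not from the thread), here's a rough sketch of what the screen-sized buffers alone cost at 4K; real VRAM use is dominated by texture assets, which is what pushes totals toward that 3.5 GB figure:

```python
# Back-of-envelope: raw size of full-screen buffers at 4K. Illustrative
# only; actual VRAM use is dominated by textures and driver overhead.

WIDTH, HEIGHT = 3840, 2160      # 4K UHD
BYTES_PER_PIXEL = 4             # 32-bit RGBA color (or 24-bit depth + 8-bit stencil)

def buffer_mb(width, height, bytes_per_pixel):
    """Size of one full-screen buffer in megabytes."""
    return width * height * bytes_per_pixel / (1024 ** 2)

color = buffer_mb(WIDTH, HEIGHT, BYTES_PER_PIXEL)
depth = buffer_mb(WIDTH, HEIGHT, 4)
msaa4x = color * 4              # 4x MSAA stores four color samples per pixel

print(f"color buffer:  {color:6.1f} MB")   # ~31.6 MB
print(f"depth/stencil: {depth:6.1f} MB")   # ~31.6 MB
print(f"4x MSAA color: {msaa4x:6.1f} MB")  # ~126.6 MB
```

The framebuffers themselves are tiny; it's textures and AA that eat the rest of that 3.5 GB.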
In a year or two, I would imagine so. By the time the method becomes obsolete, the cards should long be obsolete as well. Outside of NVIDIA's compression technology, 256-bit is 256-bit. Unless they completely change the way GPUs do things, which is possible, it will still be viable in, say, 2 years. If they do change the way GPUs do things, then all other cards, VRAM amount be damned, will be in the same boat.
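To put "256-bit is 256-bit" into numbers, here's a minimal sketch of the standard peak-bandwidth formula; the example clock rates below are illustrative assumptions, not figures from this thread:

```python
def peak_bandwidth_gbs(bus_width_bits, effective_clock_mhz):
    """Peak theoretical memory bandwidth in GB/s:
    bytes per transfer (bus width / 8) times transfers per second."""
    return (bus_width_bits / 8) * (effective_clock_mhz * 1e6) / 1e9

# Illustrative GDDR5 configurations (effective data rates):
print(peak_bandwidth_gbs(256, 7000))  # 224.0 GB/s - a 256-bit card at 7 GHz effective
print(peak_bandwidth_gbs(512, 5500))  # 352.0 GB/s - a 512-bit card at 5.5 GHz effective
```

Compression only stretches how far that fixed number goes; it doesn't change the bus itself.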
Think of it this way: if you have two GPUs, one renders a frame, then the next renders a frame, then the first renders again, going back and forth. Both GPUs need all the assets to render that scene, therefore memory doesn't scale out. Also, sharing memory would imply a memory controller shared by two GPUs, and that adds latency and reduces bandwidth, which could cripple a GPU.
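A toy sketch of the alternate-frame rendering scheme described above; all class and method names here are hypothetical, not any real driver API:

```python
# Toy model of alternate-frame rendering (AFR); names are hypothetical,
# not a real driver API.

class GPU:
    def __init__(self, name, vram_mb):
        self.name = name
        self.vram_mb = vram_mb
        self.assets = {}  # each GPU keeps its own full copy of the assets

    def upload(self, asset, size_mb):
        self.assets[asset] = size_mb

    def render(self, frame):
        resident = sum(self.assets.values())
        print(f"{self.name} renders frame {frame} ({resident} MB resident)")

gpus = [GPU("GPU0", 8192), GPU("GPU1", 8192)]

# Assets are duplicated to BOTH GPUs: either one may render the next frame,
# so two 8 GB cards still behave like 8 GB total, not 16 GB.
for gpu in gpus:
    gpu.upload("textures", 3000)
    gpu.upload("geometry", 500)

# Frames alternate between GPUs; no cross-GPU memory requests are needed.
for frame in range(4):
    gpus[frame % len(gpus)].render(frame)
```

Because either GPU may be asked to render the next frame, every asset must be resident on both, which is why effective VRAM doesn't double.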
All in all, things are the way they are because people who are very smart and do this for a living haven't thought up a better way to do it. So before you complain a little too much, I beg you to remember that this isn't easy stuff to design and there are limitations to what hardware as we know it can do.
I remember that in the past it was possible to add more memory to some of my video cards. Plug-and-play memory would be cool nowadays.
I'm saying all this because CFX/SLI isn't going to change: both cards *NEED* to have everything to render any given scene. Adding any back-and-forth communication between the GPUs to share memory will slow down overall rendering. In fact, duplicating data such as textures is one of the ways they can run CrossFire without having games implement something insane, and without slowing down the GPUs with superfluous communication, since memory requests are blocking and will slow the entire thing down.
I'm responding in depth because people like you say, "Well, wouldn't it be nice if this worked like this?" The problem is that, more often than not, people over-simplify the problem and don't realize that one of the reasons CrossFire/SLI even works is because data exists on both GPUs. So if you want all of your VRAM, don't run CrossFire/SLI; what you're carping about is part of how those technologies work.