# MSI Radeon R9 280X Gaming 6 GB



## W1zzard (May 12, 2014)

MSI's Radeon R9 280X Gaming 6 GB comes with twice the video memory of the reference design. We will test multiple resolutions, including EyeFinity and 4K, to see whether it really makes a difference. Another highlight of the card is that it's really quiet, quieter than any other R9 280X we've tested before.

*Show full review*


----------



## Hilux SSRG (May 23, 2014)

Nice review.

$120 extra for a 50 MHz base-clock bump and 3 GB more memory is a huge waste of money. The price can't be right.


----------



## Suka (May 23, 2014)

Hilux SSRG said:


> Nice review.
> 
> $120 extra for a 50 MHz base-clock bump and 3 GB more memory is a huge waste of money. The price can't be right.


spot on.


----------



## Sony Xperia S (May 23, 2014)

It is the almost 3-year-old Tahiti, not Hawaii.

"AMD's Hawaii graphics processor uses the GCN shader architecture. It is produced on a 28 nm process at TSMC, Taiwan, with 6.2 billion transistors on a 438 mm² die."

http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming_6_GB/4.html



Man, the progress with GPUs is terrible. Moore's "law" is long long gone, dead and buried.


----------



## Fluffmeister (May 23, 2014)

Again, people insist on overstating VRAM requirements. Sure, more VRAM isn't a bad thing, but I would say even 6 GB on a 780 Ti would be overkill.


----------



## HumanSmoke (May 24, 2014)

Fluffmeister said:


> Again, people insist on overstating VRAM requirements. Sure, more VRAM isn't a bad thing, but I would say even 6 GB on a 780 Ti would be overkill.


*4K yo *

Nothing says future proofing like a three-year-old GPU using a display output that is about to be superseded by DisplayPort 1.3/HDMI 2.0, and that it couldn't use anyway (assuming firmware was offered) without a guarantee that its pixel clock could run reliably at 600 MHz (for 60 Hz operation).
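For reference, that 600 MHz figure falls straight out of display timing arithmetic. A quick sketch; the timing totals below are the standard CEA-861 and CVT reduced-blanking values for 3840x2160@60, rounded:

```python
# Pixel clock = total pixels per frame (including blanking) x refresh rate.
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

# 3840x2160@60 with CEA-861 blanking: 4400 x 2250 total pixels
print(pixel_clock_mhz(4400, 2250, 60))   # 594.0 MHz
# CVT reduced blanking trims the margins, but it's still well over 400 MHz
print(pixel_clock_mhz(4000, 2222, 60))   # ~533.3 MHz
```

Either way, 4K60 sits far above Tahiti's nominal 400 MHz pixel clock, which is the whole point of the sarcasm above.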


----------



## Tsukiyomi91 (May 24, 2014)

I doubt that a soon-to-be 3 year old chip that's being "reused" is going to be a good investment for "future proofing". The benchmarks already show that having more video RAM does not yield higher frame rates, and I agree that spending an extra $120 for a minor core speed bump (a measly 50 MHz) and 3 GB of extra video RAM is a waste of money (and time). I'd rather spend my dough on a GTX 780 Ti 3GD5 since it has 1) a full-blown GK110 chip, 2) 7 GHz effective memory speed, 3) a lowered optimal temperature range for GPU Boost to kick in, and lastly 4) it kicks the AMD R9 290X's arse in many ways despite being $100 more.


----------



## HumanSmoke (May 24, 2014)

Tsukiyomi91 said:


> I doubt that a soon-to-be 3 year old chip that's being "reused" is going to be a good investment for "future proofing".


It's sarcasm. The fact that I referenced Tahiti's nominal 400 MHz pixel clock not being able to reach 600 MHz (as Hawaii's can) for 4K60 operation should have been a big tell.


----------



## techy1 (May 24, 2014)

nice - a 280X (that is barely better than any other 280X) for the price of an R9 290 - well... thx MSI, but no thx


----------



## walterg74 (May 24, 2014)

I like the reviews here, but seriously:

Pro: 6 GB of RAM
Con: Extra RAM doesn't improve performance...

So wth? That's a little (actually a lot) contradictory...

Trying to decide the best value/perf between these: 270X/760/280X/770, to replace my old HD 4870.

Don't wanna go higher than that, since later on I will just get a new build with either an i5 4670K or an i7 4770K, etc. (currently running an older Phenom II X4 3.0 GHz BE on a Gigabyte 790X-series motherboard, and still DDR2 RAM...)


----------



## MrGrammaX (May 24, 2014)

meh.. the R9 290 4 GB ($400) is the better choice


----------



## Sony Xperia S (May 24, 2014)

walterg74 said:


> Trying to decide the best value/perf between these: 270X/760/280X/770, to replace my old HD 4870.
> 
> Don't wanna go higher than that, since later on I will just get a new build with either an i5 4670K or an i7 4770K, etc. (currently running an older Phenom II X4 3.0 GHz BE on a Gigabyte 790X-series motherboard, and still DDR2 RAM...)



The R9 270X, or even better the R7 260X (at 1920×1080).

http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming_6_GB/26.html


----------



## pky (May 24, 2014)

What's more interesting than the 6 GB of memory is the cooler, which is much quieter and more effective than the one on the 3 GB MSI Gaming. It would be best if they made a 3 GB model with the updated cooler at the same price tag as the previous one.


----------



## W1zzard (May 24, 2014)

pky said:


> What's more interesting than the 6 GB of memory is the cooler, which is much quieter and more effective than the one on the 3 GB MSI Gaming. It would be best if they made a 3 GB model with the updated cooler at the same price tag as the previous one.


yup, that would be nice. or upgrade the existing 3 GB model with the design of the 6 GB card.



walterg74 said:


> I like the reviews here, but seriously:
> 
> Pro: 6 GB of RAM
> Con: Extra RAM doesn't improve performance...
> ...


people have different requirements. so if you play nothing but Skyrim with texture mods you might actually need 6 GB (I'd rather suggest playing a different game)


----------



## GhostRyder (May 25, 2014)

In reality, you only need the extra 3 GB of VRAM for 4K. So if you want to invest in 3-4 of these cards for 4K, you will at least have enough RAM to make it worthwhile, but unless the game you're playing scales well enough across a 3-4 card setup, it becomes a waste. Plus, as seen, you might as well grab the R9 290: it will be enough for 4K, at the same price in the end, with more performance to boot.



W1zzard said:


> people have different requirements. so if you play nothing but Skyrim with texture mods you might actually need 6 GB (I'd rather suggest playing a different game)


lol, there are so many ways with that game to wreck your computer. I think Skyrim became an experiment to see how far they could make a computer cry before the game became unplayable.


----------



## alwayssts (May 26, 2014)

HumanSmoke said:


> *4K yo *
> 
> Nothing says future proofing like a three-year-old GPU using a display output that is about to be superseded by DisplayPort 1.3/HDMI 2.0, and that it couldn't use anyway (assuming firmware was offered) without a guarantee that its pixel clock could run reliably at 600 MHz (for 60 Hz operation).



QFT. That said, cards like this signal that 4 Gbit RAM is getting ready for primetime, which is a big deal (and, I think, largely the take-away from this product).

Now, I don't know if Tonga (iirc my codenames) for instance will have either of those connectivity options (probably not), but I would be willing to bet its pixel clock is fixed and there are 8-chip 4 GB models from the get-go, regardless of the big push in the rumor mill being a 2 GB reference design (implying AMD wants them to make a big splash for typical 1080p users).

A couple of those cards, at the right price, could be kinda-sorta interesting versus Hawaii, just as 2x Pitcairn was to Tahiti, although in the latter example 4 GB was a very rare sight.

On a side note, I'm glad to see (at least with this sample) that 4 Gbit Elpida chips are not as crappy as some of their 2 Gbit offerings. Perhaps their binning standards changed with new products because of the ability to differentiate chips for new standards (i.e. low-power); perhaps the chips themselves are just so new (implied by the price) that higher-leakage parts might flow into lower-end products (as is usually true in the beginning) before they are eventually binned more thoroughly; or maybe they were just tired of being crapped on by geeks versus Samsung/Hynix. I'm always glad W1z calls them out in his reviews. Who knows in this case, but if repeated enough times, things like that can have an effect on products.


----------



## HumanSmoke (May 26, 2014)

alwayssts said:


> Now, I don't know if Tonga (iirc my codenames) for instance will have either of those connectivity options (probably not), but I would be willing to bet its pixel clock is fixed


The 400 MHz pixel clock is a holdover from the old RAMDAC standard. Since VGA has gone the way of the dodo, and with the advent of faster-than-60 Hz monitors, I wouldn't think keeping the old 400 MHz limit has any relevance for today's cards.


alwayssts said:


> and there are 8-chip 4 GB models from the get-go, regardless of the big push in the rumor mill being a 2 GB reference design (implying AMD wants them to make a big splash for typical 1080p users).


Probably safe to say that even if the 2 GB model is the reference standard, you should see 4 GB cards at the same time, unless AMD sees the 4 GB variant as the salve for AIBs forced to toe the line with a reference cooler for a few months. One of the truly bizarre business strategies of AMD seems to be the "reference only" board at launch. You might have thought they'd have learned by now that a site will only do a single review for a reference board (maybe two if they do a separate CrossFireX review), yet Nvidia reaps the full benefit of PR on Day One thanks to a slew of DCII, Windforce, Jetstream, TwinFrozr, and SuperClockedFTWiChillAMP'ed versions.


alwayssts said:


> A couple of those cards, at the right price, could be kinda-sorta interesting versus Hawaii, just as 2x Pitcairn was to Tahiti, although in the latter example 4 GB was a very rare sight.


Well, with GDDR5 turning into the new DDR3 with the advent of HBM/Wide I/O/HMC and DDR4/GDDR6, maybe the incentive to lower prices to keep the old production lines running might make 4 GB of GDDR5 cost effective. 5 Gbps chips are already (seemingly) dirt cheap, so there's no reason 7 Gbps can't follow suit over the next couple of years.


----------



## GhostRyder (May 26, 2014)

alwayssts said:


> QFT. That said, cards like this signal that 4 Gbit RAM is getting ready for primetime, which is a big deal (and, I think, largely the take-away from this product).
> 
> Now, I don't know if Tonga (iirc my codenames) for instance will have either of those connectivity options (probably not), but I would be willing to bet its pixel clock is fixed and there are 8-chip 4 GB models from the get-go, regardless of the big push in the rumor mill being a 2 GB reference design (implying AMD wants them to make a big splash for typical 1080p users).
> 
> A couple of those cards, at the right price, could be kinda-sorta interesting versus Hawaii, just as 2x Pitcairn was to Tahiti, although in the latter example 4 GB was a very rare sight.



It's mostly because it's what's necessary; more RAM has become a huge requirement since we are jumping to higher resolutions faster than hardware has a chance to evolve. We never really even got a full introduction to the world of 1440p/1600p before 4K was the next big thing.

It will probably become a new standard for at least the middle-ground offerings to have at least 4 GB, but it's going to become one of those standards just to handle the new-gen games, and 1080p will soon be considered a low-baller in the gaming world.

In the end though, the 6 GB seems like a necessity while being a waste at the same time, at least with this price in mind. You might as well just buy the 4 GB R9 290 and gain a nice performance advantage while still being able to handle the current lineup of resolutions (if you invest in CFX).



alwayssts said:


> On a side note, I'm glad to see (at least with this sample) that 4 Gbit Elpida chips are not as crappy as some of their 2 Gbit offerings. Perhaps their binning standards changed with new products because of the ability to differentiate chips for new standards (i.e. low-power); perhaps the chips themselves are just so new (implied by the price) that higher-leakage parts might flow into lower-end products (as is usually true in the beginning) before they are eventually binned more thoroughly; or maybe they were just tired of being crapped on by geeks versus Samsung/Hynix. I'm always glad W1z calls them out in his reviews. Who knows in this case, but if repeated enough times, things like that can have an effect on products.


Oh, I'm with you on that. I hate the sneaky random assortment of RAM that you can end up with. It's nice to be able to see what a card has on it so you can at least assume it's going to contain that grade of RAM, especially when people like to overclock their RAM to the limits. I'm surprised in the end, though, because Elpida has taken such a hit with how bad their chips have been for the overclocking community; I guess I got lucky with my cards, I didn't run into any.


----------



## techy1 (May 26, 2014)

I like these discussions about how extra VRAM could give a real advantage in "some scenarios" and most definitely at "4K"... now let's look at real life (because W1zz made real-life tests for us): 1) is there any advantage from the extra memory? 2) are there any benefits at the highest settings and/or higher resolutions? 3) are there any gains at 4K (there is 4K in this test!!!)? The answer to all these questions is NO, NO and NO... so where does someone get the idea that extra VRAM could give an advantage, if the tests show otherwise?


----------



## pky (May 26, 2014)

@techy1 Dude, it's not all about frames per second. Resolution and settings affect the image you see on the screen. If you just care about FPS, then you can play at lower res/settings, but people want to play the game as it's meant to be played, not see blurriness everywhere.


----------



## SmokingCrop (May 26, 2014)

techy1 said:


> I like these discussions about how extra VRAM could give a real advantage in "some scenarios" and most definitely at "4K"... now let's look at real life (because W1zz made real-life tests for us): 1) is there any advantage from the extra memory? 2) are there any benefits at the highest settings and/or higher resolutions? 3) are there any gains at 4K (there is 4K in this test!!!)? The answer to all these questions is NO, NO and NO... so where does someone get the idea that extra VRAM could give an advantage, if the tests show otherwise?



Go play on the lowest resolution with the integrated gpu of your cpu then.
k, tnx bai.


----------



## techy1 (May 26, 2014)

"Go play on the lowest resolution with the integrated gpu of your cpu then.
k, tnx bai. " - only if my iGPU would have doulbe amout of vram I should be fine, right? 

"@@techy1 Dude, it's not all about frames per second. Resolution and settings affect the image you see on the screen. Example image. If you just care about the FPS, then you can play on lower res/settings, but people want to play the game as it's meant to be played, not to see blurriness everywhere."
I am talking about reallife tests (that are made on high settings - all equal for all tested GPUs - you can read description) and there double vRam did not show any benefit whatsoever... it is not that wizz tests "double vRam" GPU's on high settings and normal GPU's un normal settings


----------



## pky (May 26, 2014)

Oh, I got it... I guess I misunderstood your previous post.


----------



## thebluebumblebee (May 27, 2014)

Listed price: $400
Price today at Newegg: $330. After MIR: $299.


----------



## GhostRyder (May 27, 2014)

thebluebumblebee said:


> Listed price: $400
> Price today at Newegg $330.  After MIR: $299


Well at that price that's actually a pretty good deal if you want to run a multi-card setup at a high resolution!


----------



## IRQ Conflict (May 28, 2014)

The conclusion in the review shows an 8.9 but the main review page shows 9.1. Which is it?


----------



## W1zzard (May 28, 2014)

IRQ Conflict said:


> The conclusion in the review shows an 8.9 but the main review page shows 9.1. Which is it?


fixed


----------



## The Von Matrices (May 31, 2014)

GhostRyder said:


> Well at that price that's actually a pretty good deal if you want to run a multi-card setup at a high resolution!



I see no point in a 6 GB 280X since it can't practically be used in CrossFire to drive the displays that would actually need 6 GB of VRAM. Using any 60 Hz display setup over 4 MP, including Eyefinity and 4K, is asking for trouble with CrossFire 280Xs.

The frame pacing issues with the 280X/7970 are still not fixed for high resolution even with the latest drivers.  I have 3 7970s and a 7MP display setup but can't stand to use Crossfire - the stuttering for most games is that distracting.  The big issue is that those few games with optimizations for frame pacing are not the newest games that I actually want to play.  For example, the 14.3 drivers finally added reasonable frame pacing for Crysis 3 over one year after the game's release and after everyone who had wanted to play the game already did.  Even the "Gaming Evolved" titles (now with the exception of Crysis 3) are unsupported for 280X high resolution frame pacing.  Thief for example has a 70% variance in frame timing even though it's AMD's current premiere title.
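That variance point is worth illustrating, because uneven frame delivery can hide behind a perfectly healthy average FPS. A toy sketch with made-up frame times, not measured data:

```python
# Hypothetical alternating frame times (ms), the classic broken-pacing pattern.
frames = [10, 24, 11, 25, 10, 24, 11, 25]

avg = sum(frames) / len(frames)             # 17.5 ms, i.e. ~57 "FPS" on paper
spread = (max(frames) - min(frames)) / avg  # how uneven delivery really is
print(f"avg {avg:.1f} ms, spread {spread:.0%}")
```

The counter reports a smooth-sounding average while every other frame takes more than twice as long as its neighbor, which is exactly what the eye perceives as stutter.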

I seriously have wanted to play Bioshock Infinite in Eyefinity at high or extreme settings but I can't stand the stuttering in Crossfire and therefore run one card (Crossfire disabled) at normal settings.  I don't hold out hope for AMD releasing frame pacing optimizations for the game either.  Considering the age of the Tahiti GPU, I think stuttering is an intractable problem and people like me just got burned by AMD's promises within the past year of fixing the issue completely.  With all the headaches these 7970s have caused me, I know that as soon as the 20nm GPUs come out, I'm getting rid of the 7970s.


----------



## sliderider (Jul 24, 2014)

It seems the 6 GB is of no benefit at all, at least at the resolutions the vast majority of gamers play at. If you play with three 1280p monitors, it might be of some benefit, but for the rest of us who don't, it's a waste of money when there are better cards for less that you can use on a single monitor. It looks to me like any performance gain is coming from the factory overclock and not the additional RAM. Running a side-by-side comparison with other factory-overclocked R9 280X cards with 3 GB would likely prove that out. A Sapphire R9 280X Toxic would probably match this card and cost less.



MrGrammaX said:


> meh.. the R9 290 4 GB ($400) is the better choice



Or another massively overclocked 280X with only 3 GB, like the Sapphire Toxic or the 2nd-gen Vapor-X with Tri-X fans.


----------



## Aquinus (Jul 25, 2014)

It's entirely possible that the 6 GB might not show its colors unless you run CrossFire with 2 or 3 cards in the future, when more games have bigger textures. I know that with my 6870s, the 1 GB holds them back on newer titles; with a single card it's tolerable (for what it is), but 2 GB would help for CrossFire, where I have more flexibility on quality settings that may use heftier textures and such. I suspect the same could become true of this GPU if the conditions were right, which they're not yet. It's a lot of extra money to find out, though.


----------



## halorath (Aug 10, 2014)

Basically, it depends on whether you have multiple monitors or a single monitor, simply because when the GPU does the calculations that put all those pixels on screen, the result has to be stored somewhere on the card, and guess what that storage is... that's right, you guessed it, it's the VRAM.

So if you have multiple monitors, I highly recommend the 6 GB version. E.g. if you have a 2 GB card and you're running a high-end game at ultra settings, you're going to get FPS drops if you're running a second monitor at the same time on the same graphics card. 3 GB is solid for 2 monitors, but that depends on their resolution and whether one is using the HDMI port, because HDMI is harder on the card than DVI. The same goes for refresh rate when comparing a 60 Hz monitor to a 120 Hz monitor, and whether or not the game is locked to 60 FPS.
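For scale, the raw framebuffer each extra monitor needs is actually small; it's textures and intermediate render targets that dominate VRAM. A rough back-of-the-envelope sketch (approximate; real drivers allocate more for extra render targets):

```python
# Cost of a double-buffered 32-bit RGBA framebuffer per monitor, in MiB.
def framebuffer_mib(width, height, bytes_per_pixel=4, buffers=2):
    return width * height * bytes_per_pixel * buffers / 2**20

print(framebuffer_mib(1920, 1080))   # ~15.8 MiB per 1080p monitor
print(framebuffer_mib(3840, 2160))   # ~63.3 MiB even at 4K
```

So a second 1080p display by itself costs only tens of MiB; the real hit comes when the game renders its full scene at the larger combined resolution.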

But the claim that no game uses more than 2 GB is wrong:
a heavily modded Skyrim littered with custom HD packs etc. can use well over 4 GB of VRAM (not including system RAM);
Watch Dogs at ultra settings will use 3.4 GB of VRAM (not including system RAM), and that is without unlocking the hidden graphics settings that were only shown at E3.
Those are just a few examples, and you can look it up too; GPU statistics show cards maxing out at 4 GB of VRAM with one monitor playing one game.
Not all games will use more than 2 GB, but some games do, and games within the next 2-3 years will use more than 2 GB as well.

And with the 6 GB 280X running 10 degrees cooler under load than the 3 GB version, you can overclock it further.

Also, if you want to use CrossFire or SLI, make sure you have a good CPU, and depending on the cards, have the CPU overclocked above 4 GHz (above 4.5 GHz at the least if you CrossFire/SLI three high-end cards). GPUs can only work with what they are given, and it's the CPU that feeds them the game: the GPUs cannot access the hard drives directly, it all has to go through the CPU, and the same goes for system RAM; anything coming in and out goes through the CPU, hence why it's the heart of a computer. If you have three CrossFired/SLI'd cards and a CPU below 4 GHz, your CPU will choke your GPUs: they will drop frames, not perform as well as they can, and stutter, but they will still run at high temps even though they aren't being fed what they require.
E.g. two R9 290s on a CPU below 4 GHz get choked and will run Crysis 3 at 30-40 FPS, while one R9 290 will run Crysis 3 at 60 FPS with the same settings on the same CPU, because two CrossFired R9 290s are too taxing for a CPU below 4 GHz.


----------



## Dagamus(NM) (Aug 25, 2014)

So my export times in Adobe Premiere Pro are awesome with my setup. I am running three R9 280Xs on my Rampage IV with a 3930K, and my time for exporting a 30-minute HD video is 11 minutes. This computer has 32 GB of Kingston ValueRAM, lol.

This is about four minutes faster than my upstairs computer using SLI EVGA GTX 670 FTWs, an AMD FX-8350, and 32 GB of Kingston HyperX Beast @ 1866. That is with CUDA acceleration.

Before this most recent release of Adobe products, only Nvidia cards accelerated Adobe, but now that the Creative Cloud releases have adopted OpenCL and support it fully, AMD can compete in the professional arena that was previously closed off.

I would be running quad R9 280X 6 GB cards, but the EK Cooling Configurator originally listed my existing 7970 blocks as fitting these, which is why I got them. To my dismay, the layout of the MSI 6 GB is different from the 3 GB model. It is not just the cooler that is different but the overall PCB.

So nobody makes a waterblock other than a custom job from Aqua Tuning in Germany, and those prices are much higher than my desire to get a fourth card in. The spacing on the Rampage IV board is favorable for three air-cooled cards. I would match them with the AMD CPU and use Nvidia with the 3930K, but AMD board PCIe spacing is terrible and practically only works air-cooled with dual cards.

If anybody knows of an AM3+ board that has even spacing for 3 GPUs, please share. I wouldn't mind brand matching and getting a 3rd Nvidia card to see how it does head-to-head in video rendering in Premiere Pro or After Effects.

This tri-fire R9 280X 6 GB setup games pretty well too. Probably not any better than some other setups, but it does great in my experience. 3D Dead Island and Sleeping Dogs are sweet. I am just waiting for Dead Island 2 to come out.


----------



## eidairaman1 (Aug 27, 2014)

I think MSI should have focused on figuring out how to push 4 GiB on the board and making it a heavily overclocked GHz model.


----------



## Dagamus(NM) (Aug 27, 2014)

1050/6000 out of the box isn't good enough? These do 1200/7000 no problem without artifacts.


----------



## eidairaman1 (Aug 27, 2014)

You forget OC mileage varies. This unit only pulled 1155... Also, the 6 GiB of RAM doesn't justify the expense compared to the 3 GiB model when they perform the same; now, if it were a 768-bit bus, it'd be different. Now let's see a 285 unit.


----------



## Dagamus(NM) (Aug 28, 2014)

Where are you going to get a 768-bit bus at this price? Nowhere, that's where. Double the bus width for 10x the price: the 12 GB Titan Z, at $3K, is the cheapest way to get a (combined, dual-GPU) 768-bit bus.

It would be reasonable to want 512-bit, but that would cost a lot more, I think.
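To put numbers on what bus width buys: peak memory bandwidth is just bus width times per-pin data rate, which is why a wider bus is simply an alternative to faster (pricier) memory. A quick sketch using the stock GDDR5 speeds of the cards discussed in this thread:

```python
# Peak memory bandwidth in GB/s = bus width (bits) x data rate (Gbps) / 8.
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

print(bandwidth_gbs(384, 6.0))  # 288 GB/s: R9 280X (384-bit @ 6 Gbps)
print(bandwidth_gbs(512, 5.0))  # 320 GB/s: R9 290/290X (512-bit @ 5 Gbps)
print(bandwidth_gbs(384, 7.0))  # 336 GB/s: GTX 780 Ti (384-bit @ 7 Gbps)
```

Note the 780 Ti's 384-bit bus at 7 Gbps already outruns Hawaii's 512-bit bus at 5 Gbps, which is the cheaper way to the same place.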


----------

