
Sapphire Radeon RX 480 Nitro, Reference Cards Pictured

That's for the potential 4GB SKU; the 8GB will cost more.
As a card made mostly to dominate 1080p (with some excursions to 1440p), 4GB is just fine. 8GB is overkill, IMO.
 

"Overkill"

Small example of 4GB vs 8GB.


Extra: I would like to see benchmarks of the RX 480 4GB vs. 8GB.
 

One game is not representative. Furthermore, the 970 with 3.5GB does fine. That's some issue with the game itself more than anything else. Maybe even a driver issue.
Besides, if I'm not mistaken, TPU had an article where only 4K gaming has a need for >4GB of VRAM.
 

Two years ago, maybe...

(screenshots)

Ultra settings; I can't run Hyper.

Doom, Wolfenstein, GTA V, Shadow of Mordor, Total War, and many more use over 4GB of VRAM.
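
(For anyone who wants to check this on their own rig, here's a minimal sketch, not from this thread, that reads current VRAM usage through NVIDIA's NVML Python bindings. NVML is NVIDIA-only, so RX 480 owners would need a tool like GPU-Z instead.)

Code:
# Minimal VRAM-usage readout via NVML (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)   # .used/.total are in bytes
print(f"VRAM used: {mem.used / 1024**3:.2f} GiB of {mem.total / 1024**3:.2f} GiB")
pynvml.nvmlShutdown()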
 
Dunno if it's my processor (G3258 @ 4.5GHz) or GPU (R9 290 @ 1000/1250), but even Ultra seems to be too much for Mirror's Edge Catalyst at 1080p.
 
That resolution has about 2.5x the pixels of 1080p; of course it's going to need more VRAM.
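
(A quick sanity check on that multiplier. The screenshot's exact resolution isn't stated in the thread, so the two "3K" modes below are assumptions.)

Code:
# Pixel-count ratios vs 1080p; the "3K" entries are guesses at the
# screenshot's resolution.
base = 1920 * 1080  # 1080p, about 2.07 megapixels
for name, (w, h) in {
    "1440p": (2560, 1440),
    "3K (2880x1620)": (2880, 1620),
    "3K (3200x1800)": (3200, 1800),
    "4K": (3840, 2160),
}.items():
    print(f"{name}: {(w * h) / base:.2f}x the pixels of 1080p")
# Prints 1.78x, 2.25x, 2.78x and 4.00x -- so "2.5x" sits between the
# two common 3K modes.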
 

"TPU had an article where only 4K gaming has a need for >4GB vram."

3K < 4K.


+GTA V, DOOM,Shadow Of Mordor,Mirror's Edge,Dying Light - user over 4GB on 1080p resolution.


39$ for extra 4gb worth it.
 
Ok, if you say so...
 
Speaking of that game and its quality:

(screenshot)
 
Digital whatever retested it; watch the new video, this picture is from the old one.


----

 
We can hope; what we are told is that 2x RX 480s in CrossFire beat the GTX 1080, which is going to be way more of a bargain price/performance-wise: GTX 1080 for $700 *cough cough, wheeeeeze COUGH* and 2x RX 480s for $400-500.

Stop this. We're told they beat a 1080 in one game.



The retest has nothing to do with what medi01 is pointing out. In the above image, both are running at Ultra.
 
Well, at least we're finally seeing a little from the manufacturers' side on this GPU instead of just AMD holding a card up. At least we know it's really coming soon, since Sapphire posted what theirs looks like.
 
"One game is not representative. Furthermore, the 970 with 3.5GB does fine. [...] TPU had an article where only 4K gaming has a need for >4GB of VRAM."
And what about a year or two from now? 4GB is fine TODAY, but buying a $200 piece of equipment without thinking of the future is pretty silly, IMO.

GTA V, Far Cry Primal, and Shadow of Mordor can all push 4GB at 1080p, with games like Wolfenstein not far behind. When the 8GB upgrade is only $30 or so, and may get another year out of the GPU, it'd be just plain silly to get the 4GB version.

When I got my 770, I debated getting the 2GB or 4GB edition. People said the 2GB would be just fine.

Well, guess what? Two years later, it struggles. Forza, GTA, SoM, Wolfenstein: none of those games can be maxed out due to the 2GB limitation, while the 4GB versions can run them all no problem. If I had gone with the 4GB editions, I wouldn't be replacing my GPUs this year. To say nothing of SLI, where hitting the 2GB limit is painfully obvious: SLI mirrors VRAM across cards rather than pooling it, so I have GPU power just sitting idle because I only have 2GB of VRAM.

So yeah, 4GB is enough right now. But a year from now, not so much. Better to have too much VRAM that goes to waste than a GPU hamstrung by its paltry VRAM.
 
I thought the same when buying the 970 at release: 4GB of VRAM would be more than enough for 4-5 years at 1080p, I said. But after less than two years, here we are. You can NOT future-proof something like a GPU; that's why you pick what's necessary to play the games you want at the resolution and settings you want, and be done with it.
 
I really have to try the new Mirror's Edge on my GTX 980. Usually, anything I throw at it runs smoothly at the maximum possible settings. I do only have 4GB of VRAM. At least it's full speed, unlike the crappy GTX 970 :P
 
If Mirror's Edge Catalyst is what PC gaming is going to offer with us on the cusp of DX12 and 2017... let's just give it up!

To me it looks cartoonish, brooding, blatantly exaggerated, and generated. Computer gaming in 2017 should provide a cinematic experience, rendering scenes that approximate what the human eye actually perceives as lifelike, with depth of field and the full gamut of colors that provides true-to-life luminance. To me this looks like we are back in the late '90s/early 2000s. Game houses are just looking to get rich, not to advance the beauty of what 2017 hardware makes possible.

I saw wonderful things at the Odyssey Visual Design Computer Animation Festival at the Museum of Contemporary Art in La Jolla in 1986. Had you said games in 2017 would end up looking like some comic strip, many in attendance would have told you you're "nuts." But profit trumps doing the work and being proud of it in the 21st century.
 

NVIDIA asked DICE not to downgrade the graphics.

If you wish for downgraded PC games, we should stop bitching about Ubisoft.

Much respect to NVIDIA from me.

http://wccftech.com/nvidia-releases-gtx-1070-game-ready-driver/
 
"To me it looks cartoonish, brooding, blatantly exaggerated, and generated. [...] Game houses are just looking to get rich, not to advance the beauty of what 2017 hardware makes possible."
Do you understand what art style is? Or do you assume that EVERY game must be a photo-realistic shooter now that we have tons of computational power? Ever heard of cel-shading? Of games intended to look like cartoons? Games that are meant to be unrealistic? Games that are FUN? Again: ART STYLE. Not every game has to look the same.

If you want to ooh and ahh over the latest and greatest videos of truly lifelike quality, may I suggest you go outside, perhaps take up photography?
 
"NVIDIA asked DICE not to downgrade the graphics. [...] Much respect to NVIDIA from me."
Not sure what you're saying? It's not an NVIDIA thing so much as that some... or by now many game houses (I refrain from calling them developers) know that whatever pile they dump on PC gamers, they'll get enthralled and buy it, even when it replicates some Marvel comic book from the 1960s.

"Ever heard of cel-shading?"
Well, that's one technique. Personally, when I first saw a ball rolling around with its cast shadow following along, even passing along a wall as it replicated real life, I was impressed. But then I was 24 at the time, and computer animation was a wonderland; now it looks like a wasteland.
 
"...even when it replicates some Marvel comic book from the 1960s."
And what is wrong with looking like a comic book? Not every game has to be Crysis. Borderlands looking cartoonish did not make it a bad game; Mirror's Edge 1 looking simple did not make it a bad game; Shovel Knight looking like an 8-bit game did not make it a bad game; Batman not looking photo-realistic did not make it a bad game (not talking about Arkham Knight here); Infamous Second Son not looking photo-realistic did not make it a bad game. Crysis 3 looking photo-realistic did not necessarily make it a good game. Battlefront looking just like the movies CERTAINLY did not make it a good game.

Hell, Viewtiful Joe looks exactly like a comic book, and that was a fantastic game.

If you don't like anything but photo-realistic shooters, fine, that's your preference, but don't talk down about games that use other art styles.
 
"Not sure what you're saying? It's not an NVIDIA thing so much as..."
What?

First, Mirror's Edge Catalyst is a pretty well-optimized game, and like @TheinsanegamerN said, this is how MEC should look.


"NVIDIA’s Pascal GPUs Inspired DICE To Add Hyper Settings for Mirror’s Edge Catalyst"
 