# rx570: 4gb vs 8gb?



## qu4k3r (Nov 22, 2018)

Hi there.

I have been looking for a very nice and cheap deal on a graphics card because my old GTX 570 died a while ago, and since then I've been using an HD 5830 that I had in the closet as a spare part. This morning I saw the MSI RX 570 Armor OC 8GB at $150 on Amazon, but there was nothing left by night. So I decided to wait a little and keep looking, and I saw the PowerColor RX 570 Red Dragon 4GB OC at $140 on Amazon. Well, $10 cheaper is good, because I also want to buy a 512GB SSD ($61) and a 4TB hard drive ($98) on a very tight budget ($315 incl. shipping). I placed the order for these three items, and 2 hours later the MSI RX 570 8GB OC came back in stock, lol... I don't want to cancel the order because I don't want to spend more money.

The question is... is there any real difference between 8GB and 4GB for 1080p gaming with this card?

PC specs are in my profile (FX-6300), which I don't plan to upgrade in the near future, so I'm aware there could be a little bottleneck.

Thanks in advance and happy holidays.


----------



## EarthDog (Nov 22, 2018)

Depends on the game. Few use more than 4GB at 1080p, so you'll be fine for the most part. But as time goes on, games will use more memory. I'd go with 8GB for the $10 difference...



...but if you don't want to 'cancel the order and spend more money'... why are you coming here and asking?


----------



## yotano211 (Nov 22, 2018)

I believe the 8gb version runs a little faster on the memory.


----------



## qu4k3r (Nov 22, 2018)

EarthDog said:


> ...but if you don't want to 'cancel the order and spend more money'... why are you coming here and asking?


Because I just wanted some opinions. There's a limit to how much I can spend, and finding a balance with the other two items would be difficult if I chose the 8GB version.

Thanks for all the answers anyway.


----------



## ASOT (Nov 22, 2018)

Better to get the RX 480/580 4GB


----------



## Rahnak (Nov 22, 2018)

I think you'll be fine with the 4GB version. While it's true that a couple of games are pushing over 4GB of memory at 1080p, they only do so on the very highest graphical presets which, let's be honest, you probably won't be using if you care about frame rates.


----------



## kapone32 (Nov 22, 2018)

Get the PowerColor; it has a much better cooling array than the MSI, and it is cheaper. I had a 580 4GB and a 580 8GB and saw no difference in gaming. The 8GB is good for VR or resolutions above 1440p, but there is no discernible difference at 1080p.


----------



## SoNic67 (Nov 22, 2018)

kapone32 said:


> The 8GB is good for VR or resolutions above 1440p, but there is no discernible difference at 1080p.


I used to think so too, but when I saw Quake Champions taking a cool 7GB on my RX 580 with 8GB of memory, I corrected my opinion.
And no, I don't use virtual resolution; this is at 1920x1080.


----------



## Rahnak (Nov 22, 2018)

SoNic67 said:


> I used to think so too, but when I saw Quake Champions taking a cool 7GB on my RX 580 with 8GB of memory, I corrected my opinion.
> And no, I don't use virtual resolution; this is at 1920x1080.


That's probably a bug/anomaly/leak.

EDIT:
This is the recommended GPU for Quake Champions:
*Graphics:* AMD R9 290 4GB / Nvidia GTX 770 4GB
So yeah, using 7GB of memory is certainly not normal.


----------



## SoNic67 (Nov 22, 2018)

Not a leak; it is steady during gameplay, as you can see in the pic.
It's not just the textures; compiled shaders can also take a lot of space.


----------



## kapone32 (Nov 22, 2018)

SoNic67 said:


> I used to think so too, but when I saw Quake Champions taking a cool 7GB on my RX 580 with 8GB of memory, I corrected my opinion.
> And no, I don't use virtual resolution; this is at 1920x1080.



That is interesting, as I have 2 Vega 64s and I never see them use more than 6GB, even though I am running at 4K Ultra in all my games.


----------



## Rahnak (Nov 22, 2018)

SoNic67 said:


> Not a leak; it is steady during gameplay, as you can see in the pic.
> It's not just the textures; compiled shaders can also take a lot of space.


Did some googling. It seems the game engine is a mess. You can't even select higher than medium-quality textures unless you have 16GB of RAM, because of memory leaks.


----------



## sixor (Nov 22, 2018)

GO FOR 8GB FOR FUTURE PROOFING

sure, 4GB is enough today, but some newer games are already using 5-6GB

and yes, some games will fill all the available VRAM just because they can, to avoid stutters and such, e.g.:

FINAL FANTASY XV
GEARS OF WAR 4


----------



## SoNic67 (Nov 22, 2018)

Rahnak said:


> because of memory leaks.


Again, not memory leaks; the memory usage is a stable straight line, smaller than the max memory. Sure, maybe it's not optimized, I'll give you that. Anyway, it was just an example of a game that I can play with my 8GB and that might not play as well on a 4GB card.

Memory leak has become a buzzword that people throw around everywhere. A memory leak will keep filling the available memory until the app runs out of memory and crashes.
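That distinction can be sketched with a toy model (plain Python with made-up "resources"; nothing here is GPU-specific): a leak grows every frame until memory runs out, while a heavy but healthy game holds a large, flat working set.

```python
# Toy model (not a real game): distinguishing a memory leak from
# stable-but-high memory usage. "Allocations" are list entries in
# arbitrary units, standing in for textures, shaders, etc.

def run_frames(frames, leak):
    held = []                         # resources currently held in "memory"
    usage_per_frame = []
    for _ in range(frames):
        held.append("resource")       # allocate this frame's resources
        usage_per_frame.append(len(held))  # usage while rendering the frame
        if not leak:
            held.pop()                # a healthy app frees after the frame
    return usage_per_frame

healthy = run_frames(1000, leak=False)
leaky = run_frames(1000, leak=True)

print(healthy[0], healthy[-1])  # flat line: 1 1
print(leaky[0], leaky[-1])      # unbounded growth: 1 1000
```

A flat, high line on a monitoring graph is consistent with heavy usage; only unbounded growth over time points to a leak.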


----------



## Rahnak (Nov 22, 2018)

SoNic67 said:


> Again, not memory leaks; the memory usage is a stable straight line, smaller than the max memory. Sure, maybe it's not optimized, I'll give you that. Anyway, it was just an example of a game that I can play with my 8GB and that might not play as well on a 4GB card.
> 
> Memory leak has become a buzzword that people throw around everywhere. A memory leak will keep filling the available memory until the app runs out of memory and crashes.


Sorry if I wasn't clearer. I didn't mean your case specifically was a memory leak; that was the reason given for why you can't enable high/ultra textures if you only have 8GB of RAM.


----------



## eidairaman1 (Nov 22, 2018)

If money is tight, 4GB; if it's not that bad, go for the 8GB.


----------



## gamerman (Nov 22, 2018)

It's clear but useless info.

If you play at 4K, get an 8GB GPU; otherwise 4GB is more than enough. But the plain truth is that the RX 500 series are NOT 4K cards, so forget it.

But if you are smart and want a good-quality, fast GPU with great efficiency, and you want it to keep its value for when you might sell it later, choose an NVIDIA GTX 1060 6GB OC model for FHD gaming, or a GTX 1080 OC model or a 2080 for higher-resolution gaming.
The price is higher, but after 2-3 years, for example, when you offer it on the net, it sells at once. All AMD Polaris cards are dead value.


----------



## eidairaman1 (Nov 22, 2018)

gamerman said:


> It's clear but useless info.
> 
> If you play at 4K, get an 8GB GPU; otherwise 4GB is more than enough. But the plain truth is that the RX 500 series are NOT 4K cards, so forget it.
> 
> ...



He is asking about the RX 570, not GeForce.


----------



## SoNic67 (Nov 22, 2018)

gamerman said:


> otherwise 4GB is more than enough


I just gave a clear example above where 4GB is *not enough* for 1080p gaming.


----------



## MrGenius (Nov 22, 2018)

SoNic67 said:


> I just gave a clear example above where 4GB is *not enough* for 1080p gaming.


No you didn't. You gave a clear example of a game "using" more than 4GB of VRAM @ 1080p. Which is essentially meaningless. Just because a game "uses" any amount of VRAM doesn't mean it needs to "use" that much. Or that it can't run just as well "using" less(or even more). Some games will "use" as much VRAM as they can...just because they can. Which is pretty common knowledge.

Anyway...until just recently I was still running my 280X 3GB. And I never found a game that wouldn't run, and very well I might add, @ 1080p with it. I always run all my games with ultra settings too...for the record. So I can't see how 4GB wouldn't be plenty for 1080p. At present at least.

And...BTW...Quake Champions ran flawlessly on my 280X 3GB w/ ultra settings @ 1080p. I have no idea how much VRAM it was "using" though. I don't monitor that crap. If it runs, it runs. Who gives AF how much VRAM it's "using"? What good does knowing that do you?


----------



## SoNic67 (Nov 23, 2018)

It doesn't matter what you think a game *should* use in terms of memory. That's not an *example*; it is a wish.
An *example* is that, in the actual implementation of an actual game, you cannot play on the highest texture settings without that amount of memory. Reality sometimes is not perfect.
And also, an example is just that... it doesn't need to be good or bad.


----------



## John Naylor (Nov 23, 2018)

qu4k3r said:


> The question is... is there any real difference between 8 and 4gb for 1080p gaming with this card?



No... but don't take my word for it... look at the test results. Let's look at the data right here on TPU. Background: it's an absolute given that if 3 or 4GB were not enough at 1080p, then at the same settings at 1440p that gap would invariably have to widen. I don't have 4GB and 8GB cards to compare, but we do have 3GB and 6GB. Now, the 3GB 1060 has 11% fewer shaders than the 1060 6GB, so it's not apples to apples. When we look at the image below, we see that the 6GB model (1280 shaders) is 6% faster than the 3GB model with only 1152 shaders.






Now, we cannot ascertain what gives the 6% advantage here... is it VRAM or shaders? But we can conclude with absolutely no uncertainty that the 6% must widen at 1440p *if* VRAM is an issue. It doesn't... same 6%. Therefore, across the 18-game TPU test suite, the extra VRAM is not doing squat... not at 1080p, not at 1440p, according to the data. We do see the gap widen when looking at 2160p.

https://tpucdn.com/reviews/MSI/GTX_1060_Gaming_X_3_GB/images/perfrel_2560_1440.png

Now, I don't doubt that you can find a game that is affected, but I can't place much weight on a single instance, especially when this is a common frustration with bad console ports. You will find folks who believe they can prove they found a game that uses more VRAM... but as there is no utility in existence that measures actual VRAM usage, including the ones they are using as proof, they haven't proven anything. Testers have been comparing different versions of the same cards with 2 vs 4, 3 vs 6 and 4 vs 8GB for ages... and except for the oddball poor console port, they ain't finding any differences.

Here's the 2GB and 4GB 770... the web site of the original test is no longer up, but the charts are included in this video. Testing at 5760 x 1080, they found no significant difference between the 2GB and 4GB cards except in 5 games. But it didn't matter, as having 4GB instead of 2GB really doesn't matter when 2GB gets you 13 fps and 4GB gets you 17 fps. In the games that were actually "playable", the insignificant wins were split between the two cards: sometimes the 2GB was faster by a hair, sometimes the 4GB. But the most interesting result of that test was Max Payne 2. It would not install at the 5760 x 1080 setting with the 2GB card... so they went ahead and installed with the 4GB card and got their results. On a whim, they decided to retry the 2GB card... it worked fine. See, the way this works is that games "allocate" VRAM by looking at what's available, and the logic goes: "Ahhh... he's got 8GB, let's reserve 6GB for us."

But here's a simpler analogy... if you need an 8GB card to play at 1080p, then what do you need to play at 4 x 1080p, a.k.a. 4K? If you need 8GB for 1080p, you'd need 32GB of VRAM to play at 4K.
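The 4x-the-pixels arithmetic behind that analogy can be checked quickly; strictly speaking, it applies only to resolution-dependent buffers such as render targets, since texture memory doesn't scale with output resolution:

```python
# Back-of-the-envelope: memory for one RGBA8 render target at each resolution.
# Real games keep several such buffers (depth, G-buffer, ...), and texture
# memory does NOT scale with output resolution -- this covers only the
# buffers that do.

def framebuffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20

fhd = framebuffer_mb(1920, 1080)   # ~7.9 MB
uhd = framebuffer_mb(3840, 2160)   # ~31.6 MB
print(round(uhd / fhd))            # 4K has exactly 4x the pixels of 1080p
```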


----------



## EarthDog (Nov 23, 2018)

That's cool and true and all... but it doesn't always manifest itself in average fps.

What it can do is make the gaming experience less pleasurable, with swapping data in and out causing occasional stuttering and hitching. Average fps doesn't tell the whole story.

If a game needs more VRAM than the card has, it will get it by swapping data out... and swapping it out... and swapping it out, versus simply reading it because it is already there.
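That swap cost can be sketched with a toy LRU cache model (hypothetical asset counts and sizes, not a real driver): the same access pattern stays fully resident on the bigger "card" but thrashes on the smaller one.

```python
from collections import OrderedDict

def count_swaps(vram_capacity, accesses):
    """Toy LRU VRAM model: count how often an asset must be re-uploaded
    from system RAM because it was evicted (a 'swap')."""
    resident = OrderedDict()   # asset id -> None, ordered by last use
    swaps = 0
    for asset in accesses:
        if asset in resident:
            resident.move_to_end(asset)         # hit: just touch it
        else:
            swaps += 1                          # miss: upload over the bus
            resident[asset] = None
            if len(resident) > vram_capacity:
                resident.popitem(last=False)    # evict least-recently-used
    return swaps

# A frame loop cycling through 8 one-unit assets, repeated 100 times:
pattern = list(range(8)) * 100

print(count_swaps(8, pattern))  # 8 cold loads, then everything stays resident
print(count_swaps(4, pattern))  # thrashes: every single access misses (800)
```

Average fps can hide this; the misses show up as hitches whenever the evicted asset is needed again.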


----------



## John Naylor (Nov 23, 2018)

SoNic67 said:


> I just gave a clear example above when 4GB is *not enough* for 1080 gaming.



To borrow from The Princess Bride: "I don't think that word means what you think it means." You have made an assumption that GFX utilities do something they are not actually capable of.

You posted an image of a game that "allocated" 7GB of VRAM; allocated does not mean "used". I have a Visa card with an allocated credit limit of $75k and $500 charged on it. When I apply for a new car loan, the credit report doesn't say that I owe $75k; it says that Visa has allocated $75k of credit to me, and that my balance is $500. Reference the Max Payne example above... it could not install for 5760 x 1080, but when they tested it after installing with the 4GB card and switching back, they got the same FPS, at the same quality, with no discernible differences.

Guru3D did the same thing with the 960s .... Puget Sound did it with the 6xx series with same results.

https://www.guru3d.com/articles-pages/gigabyte-geforce-gtx-960-g1-gaming-4gb-review,12.html

ExtremeTech did it with 9xx .... https://www.extremetech.com/gaming/...y-x-faces-off-with-nvidias-gtx-980-ti-titan-x

And now back to The "Princess Bride"

No utility actually reports _"how much VRAM the GPU is actually using — instead, it reports the amount of VRAM that a game has requested. We spoke to Nvidia’s Brandon Bell on this topic, who told us the following: “*None of the GPU tools on the market report memory usage correctly*, whether it’s GPU-Z, Afterburner, Precision, etc. *They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn’t mean that they actually use it. They simply request it because the memory is available.”*_

So no, you can't use a utility to prove that 4GB is not enough, because no utility exists that measures actual VRAM usage. And again, if 4GB isn't enough for a game at 1080p, then you'd need more than 16GB of VRAM for that same game at 4 x 1080p (4K)... you can't get those yet.
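The requested-vs-used distinction in that quote can be sketched as a toy budgeting rule (the scene size and the 75% reserve fraction are made up for illustration): an engine that opportunistically reserves a slice of whatever VRAM exists will report a bigger number on a bigger card for the exact same scene.

```python
def reported_vram(card_vram_mb, scene_needs_mb, reserve_fraction=0.75):
    """Toy model of opportunistic allocation: the engine requests a fraction
    of whatever VRAM is physically present (or the scene's needs, if larger),
    so monitoring tools see the requested amount, not actual usage."""
    requested = max(scene_needs_mb, int(card_vram_mb * reserve_fraction))
    requested = min(requested, card_vram_mb)   # can't request more than exists
    return {"requested_mb": requested, "actually_used_mb": scene_needs_mb}

scene = 2500  # the same hypothetical 1080p scene on both cards, in MB

print(reported_vram(4096, scene))  # 4GB card: tool reports ~3GB "used"
print(reported_vram(8192, scene))  # 8GB card: tool reports ~6GB "used"
```

Same scene, same actual working set; only the requested number changes with card size, which is exactly the behavior the quote describes.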



EarthDog said:


> That's cool and true and all... but it doesn't always manifest itself in average fps.



My question is... if there's a problem, how come none of these testers have been able to observe it? In addition to reporting fps, they provide assessments of image quality, playability, and stuttering or hitching... "no observable difference"... even at 5760 x 1080.


----------



## EarthDog (Nov 23, 2018)

Games differ in how they are coded to use memory. That's a pretty interesting blanket statement (by NVIDIA), honestly...

I then wonder why the allocation would change in-game. For example, in those utilities the "allocated" memory is incredibly dynamic during a game, going up and down throughout. Why would it use/waste the extra cycles constantly reallocating space instead of acting more like a page file? If it peaks at 6GB, why does it appear to instantly 'reallocate' down to a lesser amount? That doesn't make sense to me.

A game knows how much it needs (GTA V lists the amount, so does SOTR)... why wouldn't it just make a slightly larger pool? I don't understand what efficiencies that method would have.

I also wonder how far off it is from the "allocated" amount the utilities show. I'll have to check to confirm, but I recall that in SOTR and GTA V, when benchmarking, what the game listed was damn close to what the utilities read. So... how far off is it really? While that statement may be true, is it just a couple percent, so what we see is 'good enough' for most? Or are we talking like 20%?

How did that site capture actual VRAM use?



John Naylor said:


> My question is... if there's a problem, how come none of these testers have been able to observe it? In addition to reporting fps, they provide assessments of image quality, playability, and stuttering or hitching... "no observable difference"... even at 5760 x 1080.


I've seen it mentioned when it happens. It isn't rare. And many just run a canned benchmark and record the score, so there's that.

Again, it's not so much about averages as it is about minimums and the gaming experience being affected in some cases. The severity depends on the title and how it manages memory. It isn't a black-and-white issue.


----------



## qu4k3r (Nov 23, 2018)

John Naylor said:


> No utility actually reports _"how much VRAM the GPU is actually using — instead, it reports the amount of VRAM that a game has requested. We spoke to Nvidia’s Brandon Bell on this topic, who told us the following: “*None of the GPU tools on the market report memory usage correctly*, whether it’s GPU-Z, Afterburner, Precision, etc. *They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn’t mean that they actually use it. They simply request it because the memory is available.”*_



I found a review from a Brazilian site with exactly these two cards. For the same game, sometimes they allocated more or less the same amount of VRAM, but sometimes not. However, they perform almost the same.










I don't play the latest games anyway, so I guess the PowerColor RX 570 4GB will be enough for what I need atm. Thanks for all your answers.


----------



## SoNic67 (Nov 23, 2018)

The fact that both the Windows 10 Task Manager (Performance tab) and GPU-Z show the same amount of GPU memory tells me that what the NVIDIA guy said might have to be taken with a grain of salt. Maybe it's true for NVIDIA cards due to their compression algorithm (but I doubt it).
Also, a random reviewer saying "there is no quality difference" is not proof. I could say the same in a YouTube video, so it's a "he said, I said" type of deal. Without actual tools to measure the texture quality, I tend not to believe it.


----------



## EarthDog (Nov 25, 2018)

EarthDog said:


> Games differ in how they are coded to use memory. That's a pretty interesting blanket statement (by NVIDIA), honestly...
> 
> I then wonder why the allocation would change in-game. For example, in those utilities the "allocated" memory is incredibly dynamic during a game, going up and down throughout. *Why would it use/waste the extra cycles constantly reallocating space instead of acting more like a page file? If it peaks at 6GB, why does it appear to instantly 'reallocate' down to a lesser amount? That doesn't make sense to me.*
> 
> ...


@John Naylor - Thoughts......


----------



## John Naylor (Nov 27, 2018)

It's not about how the game is coded; it's about access to the information... the info is not available. When installing, the routine looks at the physical amount of RAM on the card and says, OK, he has 8, let's grab 6 for us. What initially validated this presumption was that Max Payne 2 would refuse to install at 5760 x 1080 with the 2GB card... swap in a 4GB model and it was fine... swap back... same avg fps, same min fps, same image quality, same everything.

Of course two different tools will record close to the same numbers... that's the only information available to them.

As to why peeps have **seen** problems, well, it's pretty well known how you can create such problems... load up max settings at 4K, zoom in and out repeatedly, and you get your 15 minutes of fame.

Yes, these have all been addressed in the referenced reviews. To my eyes, when Guru3D, PugetSound, ExtremeTech and Alienbabeltech all get the same results on 6xx, 7xx and 9xx cards... and yes, same avg fps, same min fps, same image quality, same everything... it's passed "peer review". They did the same thing to get their 15 minutes of fame with the 3.5GB **problem** on the 970... except, as was shown, any time you could ***create*** the problem on the 970, doing the same thing created the same problem on the 980.


----------



## EarthDog (Nov 27, 2018)

@John Naylor


John Naylor said:


> When installing, the routine looks at the physical amount of RAM on the card and says, OK, he has 8, let's grab 6 for us.


OK, that is upon installation. But what we see in tools like MSI AB is that the RAM "allocation" CONSTANTLY changes. So while I do believe there are things that check requirements and prevent installation, I think it is bollocks that these tools aren't reading an actual VRAM amount.

Assuming it is true, how far off are MSI AB and the like? How did that website, which called out those applications for not being accurate, actually measure the RAM?

The information is nice, but it lacks logic and sense to me...


----------

