Friday, November 9th 2007

How Much Graphics Memory do You Really Need?

As monitors get bigger and run at higher resolutions, and video games demand ridiculous amounts of graphics memory to run at respectable settings, both AMD and NVIDIA have shoved more and more graphics memory into their cards. But how much is enough? The folks at YouGamers ran some serious tests and discovered some interesting facts about VRAM. While AMD and NVIDIA would both like you to think that a humongous amount of VRAM will magically make your games run at 1920x1200, YouGamers found that quantity is not what really matters. If you want to run the most stressful games at the highest resolutions possible, you will see far more benefit from faster graphics card memory, or simply a faster graphics card. You can read the full investigative article here.
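As a rough illustration of why resolution (rather than a big VRAM number on the box) drives memory pressure, the render targets alone scale with pixel count and AA sample count. This is a back-of-the-envelope sketch, not how any real driver accounts for memory; actual usage adds mipmaps, padding, and extra buffers, so treat these figures as lower bounds:

```python
def framebuffer_mb(width, height, aa_samples=1, bytes_per_pixel=4):
    """Rough size of the color + depth/stencil buffers in MB,
    scaled by the multisample (AA) count. Illustrative only."""
    color = width * height * bytes_per_pixel * aa_samples
    depth = width * height * 4 * aa_samples  # 24-bit depth + 8-bit stencil
    return (color + depth) / (1024 ** 2)

# 1280x1024 with no AA vs. 1920x1200 with 4x AA
print(round(framebuffer_mb(1280, 1024), 1))     # ~10 MB
print(round(framebuffer_mb(1920, 1200, 4), 1))  # ~70 MB
```

The buffers themselves stay small; it is the textures streamed on top of them (and the speed at which they can be moved) that eat the rest of the card's memory.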
Source: Nordic Hardware
Add your own comment

30 Comments on How Much Graphics Memory do You Really Need?

#1
Aeon19
I thought that was obvious
Posted on Reply
#2
Black Panther
Aeon19I thought that was obvious
Well, it might not be obvious to everyone. Don't forget we have members who still think toothpaste is good for use as a TIM... ;)

Interesting article btw. Never knew that "Vista doesn't differentiate video RAM from system RAM - it's all the same, as far as the operating system and games are concerned."
Posted on Reply
#3
mdm-adph
Black PantherDon't forget we have members who still think toothpaste is good for use as a tim paste... ;)
You're... kidding me... :twitch:
Posted on Reply
#4
newconroer
Then why does Crysis stomp the 320 GTS and not the 640?


Article quote :

"When the graphics processor wants to use them, it copies them across into its RAM, deleting other stuff to make room. Cue a spot of stuttering or slow down in the frame rate; this is because it takes quite a bit longer to swap textures around than just accessing them in the onboard (or to give it the correct name, local) RAM."


So yes, not having enough IS relevant. It seems like the article is saying that the faster the RAM on the GPU, the more capable it is of purging unused data and replacing it with active data. But until the hardware is capable of doing so.........
Posted on Reply
#5
Weer
VRAM is only important if it's a bottleneck.

Obviously if you have 2GB of VRAM, you're wasting a lot of power.

And for the same reason a 512MB 8400GS is a horrible idea.
Posted on Reply
#6
DaMulta
My stars went supernova
More the better!!!!!(Depends on what you are doing)

Lets load all of the textures into our video cards!!!!
Posted on Reply
#7
jydie
I would have to confirm that from the various testing I have done over the past 3-4 years. Using DDR2 vs. DDR3 in video cards is often the most significant difference between certain models... like the NVIDIA 8600 GS and 8600 GT, or the 2600 Pro and 2600 XT. They often bump up the core's clock speed, but the speed of the memory is what seemed to make the biggest difference. I have never noticed any significant improvement between a card with 256MB of memory and the same model with 128MB.

Besides clock speed, the other obvious thing to look for is the memory interface... 64-bit and lower should be avoided if you want to do any type of gaming, 128-bit is normally used in mid ranged cards, and 256-bit or higher is used in the higher end cards.
Posted on Reply
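The bandwidth point above can be made concrete: peak memory bandwidth is just bus width (in bytes) times the effective memory transfer rate, which is why a 256-bit card doubles a 128-bit one at the same memory clock. A small sketch (the 8600 GT clock figure is approximate):

```python
def bandwidth_gbs(bus_bits, effective_clock_mhz):
    """Peak memory bandwidth in GB/s:
    bus width in bytes x effective transfers per second."""
    return (bus_bits / 8) * effective_clock_mhz * 1e6 / 1e9

# 8600 GT-class card: 128-bit bus, ~1400 MHz effective GDDR3
print(round(bandwidth_gbs(128, 1400), 1))  # ~22.4 GB/s
# Same memory clock on a 256-bit bus doubles the figure
print(round(bandwidth_gbs(256, 1400), 1))  # ~44.8 GB/s
```

This is why a cheap card with a 64-bit bus chokes regardless of how many megabytes are soldered onto it.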
#8
Sasqui
DaMultaMore the better!!!!!(Depends on what you are doing)

Lets load all of the textures into our video cards!!!!
It totally depends. While right now 1GB is probably overkill, you can see that many of the games in that link use way over 512MB at 1600x1200. Once you exceed your video RAM, it swaps to system memory (much slower), and you get dropped frames, lag, etc.

Too bad they didn't include 1920x1200 in the charts - I'm sure the usage would go up to the 800 MB range on some games.
Posted on Reply
#9
hv43082
So what about gaming at 2560x1600? More Ram definitely matters at this resolution, right? Sorry but too lazy to read the entire article, just pulled a super late night study session for this morning exam...zzzz...
Posted on Reply
#10
jocksteeluk
If PC game developers put as much time into development for existing hardware as they do for consoles, 128MB of VRAM and a 2GHz CPU would no doubt last years rather than months as the PC format's top-spec system. But the fact is, companies like NVIDIA, AMD and Intel rely on the waste in the PC games industry to continually sell more products and keep the income flowing.
Posted on Reply
#11
musek
Well... another 'u-must-buy-new-hardware' article. :/ If, for example, Call of Duty 4 is such a VRAM eater (always ~400MB), then how was I able to play the demo (same as they did) on an R9800 with 128MB VRAM without any problem (on reasonable settings of course, 1024x768)? My card (and whole system, coz I have 1GB of system memory) should just struggle and tell me 'No Wai!'.

I must agree on one thing though - 256MB of VRAM is nowadays the absolute minimum (IMO anything less is not worth buying if someone wishes to play newer titles). But, hey, do we really need an article to know that?


Namaste,
musek


PS. Sorry for my english. :D
Posted on Reply
#12
OnBoard
musekWell... another 'u-must-buy-new-hardware' article. :/ If, for example, Call of Duty 4 is such a VRAM eater (always ~400MB), then how was I able to play the demo (same as they did) on an R9800 with 128MB VRAM without any problem (on reasonable settings of course, 1024x768)? My card (and whole system, coz I have 1GB of system memory) should just struggle and tell me 'No Wai!'.
Because you didn't have all the settings on HIGH :) Just dropping the texture level lowers the overall memory requirement too. Would be nice if they'd included medium settings for those new games, as most people will probably use them to get playable frame rates.

edit: And cheers for the article, seems it's time to go 512 like I was planning to. 256MB has been fine for now; Crysis was the first game that absolutely died when you enabled AA. I got 3fps and a friend got 1fps (with a bit higher resolution and a Pro version card) :p.
Posted on Reply
#13
trog100
the important thing is how much grunt the card has got.. sticking large amounts of memory on cards that cant handle the resolutions and settings that need it is the con..

the amount of memory on a card is a selling point.. mostly they come with more than they can use.. the article is misleading.. if the card aint got the grunt no amount of extra memory is gonna help it..

trog
Posted on Reply
#14
FreedomEclipse
~Technological Technocrat~
lucky for me, I still have A X1800XT 512Mb :P
Posted on Reply
#15
Tatty_Two
Gone Fishing
trog100the important thing is how much grunt the card has got.. sticking large amounts of memory on cards that cant handle the resolutions and settings that need it is the con..

the amount of memory on a card is a selling point.. mostly they come with more than they can use.. the article is misleading.. if the card aint got the grunt no amount of extra memory is gonna help it..

trog
Agree completely, and tests show that even with no AA/AF, once you EXCEED a resolution of 1280 x 1024 you are exceeding 256MB of GDDR, which results in some system RAM swapping... AKA jittering. How much RAM is used thereafter, and therefore how much system RAM swapping takes place, depends on the resolution and the detail level. Even at 1280 x 1024 in Oblivion with 8x AA and 8x AF, at times during the game the card will require more than 256MB.

I have a really nice set of tests somewhere that I have posted here before that really sums the process up well......I'll have to try digging it out.....just on the off chance that some of you managed to stay awake until the end of my post and are interested :laugh:
Posted on Reply
#16
musek
OnBoardBecause you didn't have all the settings on HIGH :) Just dropping texture levels lower take whole memory need also lower.
Still, on a friend's rig (256MB VRAM) the CoD4 demo is flying with everything maxed out.
Posted on Reply
#17
Tatty_Two
Gone Fishing
musekStill, on a friend's rig (256MB VRAM) the CoD4 demo is flying with everything maxed out.
Resolution?
Posted on Reply
#18
Steevo
I remember 128MB cards being questioned, whether there was a need for such extravagance and extra expense. I need spell check, I know it.
Posted on Reply
#19
jimmylao
Mmmm, it's just the technology trend. It should be common sense to everyone that if you're going to run new games at 1600x1200 or better with everything maxed out plus AA+AF, you're going to need a high-end card, which means faster-clocked RAM, GPU, and CPU. And like everyone else said, if you're going to run at that resolution you should have the RAM and VRAM so swapping doesn't occur... however, like the beginning posts said, some people think using Colgate toothpaste is OK as a TIM (those are usually newbies who are either learning about computers or hardcore gamers trying to understand what they need in order to build a faster comp).

All-in-all, it's a good post for someone who doesn't understand comps too much but loves running their games at max settings and wonders why they lag. :rockout:
Posted on Reply
#20
musek
@Tatty_One - 22'' wide, so 1680x1050 at X850XT PE & Pentium D.

@jimmylao - I agree. But I still think that if someone is using toothpaste as a TIM, he won't be here to read articles like this (and that's sad).
Posted on Reply
#21
musek
Well... maybe, just look at my current specs to know that on my rig I'm barely walking, mostly crawling. :P


PS. No AA was set.
Posted on Reply
#22
imperialreign
Tatty_TwoAgree completely, and tests show that even with no AA/AF, once you EXCEED a resolution of 1280 x 1024 you are exceeding 256MB of GDDR, which results in some system RAM swapping... AKA jittering. How much RAM is used thereafter, and therefore how much system RAM swapping takes place, depends on the resolution and the detail level. Even at 1280 x 1024 in Oblivion with 8x AA and 8x AF, at times during the game the card will require more than 256MB.
and that's where the rest of the system comes into play - especially system MEM and BUS speeds... which come into play no matter what resolution or AA/AF you're running.

Maybe it's just me - but you have to take the system as a whole into account when looking at video performance. For example, we all know how decently powerful an X1950 is, but throw a 1950 in with a P4 and you get sub-par performance, no matter what the clock speeds of the 1950, P4 or DRAM are... a little odd that the article barely touches on this.
Posted on Reply
#23
trog100
some games load all the stuff/textures and whatever in at the start of each level.. when they do this u get a long loading wait then it all runs smooth..quake 4 works this way for example..

some like oblivion load it in bit by bit as u go along.. the game slows down (stutters) each time the card needs more textures.. it takes time for the card to be loaded with textures from the system memory.. the bigger the cards memory the longer the slow down.. he he..

not much can be done about this annoying slowdown every so often.. except play the game at lower resolutions and settings.. the old load it all in at the level start worked.. but levels are so huge now it cant be done..

the bottom line is the more the cards memory the longer the stutter as it gets filled.. it all runs nice between stutters is about all u can say at high settings and resolutions with games like oblivion.. he he

trog
Posted on Reply
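The streaming behaviour described in the post above (textures pulled in bit by bit, with a stutter each time the card has to swap some out) can be modelled as a simple least-recently-used cache. A toy sketch; all names, sizes, and the eviction policy here are illustrative, not taken from any real engine or driver:

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU model of a card's local texture memory: when a new
    texture doesn't fit, the least-recently-used ones are evicted,
    and re-uploading them later over the bus is the 'stutter'.
    Sizes in MB; purely illustrative."""

    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.used = 0
        self.cache = OrderedDict()  # name -> size in MB

    def request(self, name, size_mb):
        if name in self.cache:               # already resident: cheap access
            self.cache.move_to_end(name)
            return "hit"
        while self.used + size_mb > self.capacity and self.cache:
            _, evicted = self.cache.popitem(last=False)  # evict the LRU entry
            self.used -= evicted
        self.cache[name] = size_mb
        self.used += size_mb
        return "miss"                        # upload over the bus: the stutter

cache = TextureCache(256)                    # a 256MB card
cache.request("terrain", 150)
cache.request("buildings", 90)
print(cache.request("npcs", 60))     # miss: evicts "terrain" to make room
print(cache.request("terrain", 150)) # miss again: the swap cost described above
```

A level-load scheme avoids the mid-game misses by requesting everything up front, at the price of a long initial wait, which is exactly the trade-off discussed in the comments here.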
#24
imperialreign
I honestly preferred the old method of loading all textures prior to the beginning of a level; I don't mind waiting a few extra minutes of load time if it means the sys won't have to swap out textures 5+ times in a single map . . .

but, you're right, newer games use such huge maps and such intricate textures that load times would be stoopid long . . . the only other fix would be to break a map up into smaller areas, but this would mean "loading zones" and that's not something I associate with PC games - only consoles.
Posted on Reply
#25
WarEagleAU
Bird of Prey
Very interesting article. I didn't know some of this stuff. Thanks.
Posted on Reply