
AMD Plays the VRAM Card Against NVIDIA


bug

Joined
May 22, 2015
Messages
13,790 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10

TechSpot just posted a new VRAM review about 50 minutes ago.

AMD is such good value, man. I would never buy an 8 GB VRAM card in today's gaming world. Wild that Nvidia still does this.
Their numbers for Hogwarts Legacy or RE4 don't match TPU's. Probably some others, too, can't check them all atm.
 
Joined
Apr 21, 2005
Messages
185 (0.03/day)
Their numbers for Hogwarts Legacy or RE4 don't match TPU's. Probably some others, too, can't check them all atm.

Obviously, games are big, and different reviewers use different scenes. Expecting the numbers to match between sites that test different parts of a game is folly.
 
Joined
Nov 3, 2011
Messages
695 (0.15/day)
Location
Australia
System Name Eula
Processor AMD Ryzen 9 7900X PBO
Motherboard ASUS TUF Gaming X670E Plus Wifi
Cooling Corsair H150i Elite LCD XT White
Memory Trident Z5 Neo RGB DDR5-6000 64GB (4x16GB F5-6000J3038F16GX2-TZ5NR) EXPO II, OCCT Tested
Video Card(s) Gigabyte GeForce RTX 4080 GAMING OC
Storage Corsair MP600 XT NVMe 2TB, Samsung 980 Pro NVMe 2TB, Toshiba N300 10TB HDD, Seagate Ironwolf 4T HDD
Display(s) Acer Predator X32FP 32in 160Hz 4K FreeSync/GSync DP, LG 32UL950 32in 4K HDR FreeSync/G-Sync DP
Case Phanteks Eclipse P500A D-RGB White
Audio Device(s) Creative Sound Blaster Z
Power Supply Corsair HX1000 Platinum 1000W
Mouse SteelSeries Prime Pro Gaming Mouse
Keyboard SteelSeries Apex 5
Software MS Windows 11 Pro
Their numbers for Hogwarts Legacy or RE4 don't match TPU's. Probably some others, too, can't check them all atm.
From https://www.techpowerup.com/review/resident-evil-4-benchmark-test-performance-analysis/4.html

For RE4:

TechPowerUp's 1080p Ultra RT
6800 XT = 105.8 fps
3070 = 88 fps

TechSpot's 1080p "Max" with RT
6800 = 91 fps
3070 = crashed

TechPowerUp's 1440p Ultra RT
6800 XT = 85 fps
3070 = crashed

TechSpot's 1440p "Max" with RT
6800 = 77 fps
3070 = crashed

TechPowerUp used the faster RX 6800 XT while TechSpot used the slower RX 6800; TechSpot's RX 6800 numbers are close to TechPowerUp's RX 6800 XT numbers.
 

bug

Joined
May 22, 2015
Messages
13,790 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
From https://www.techpowerup.com/review/resident-evil-4-benchmark-test-performance-analysis/4.html

For RE4:

TechPowerUp's 1080p Ultra RT
6800 XT = 105.8 fps
3070 = 88 fps

TechSpot's 1080p "Max" with RT
6800 = 91 fps
3070 = crashed

TechPowerUp's 1440p Ultra RT
6800 XT = 85 fps
3070 = crashed

TechSpot's 1440p "Max" with RT
6800 = 77 fps
3070 = crashed

TechPowerUp used the faster RX 6800 XT while TechSpot used the slower RX 6800; TechSpot's RX 6800 numbers are close to TechPowerUp's RX 6800 XT numbers.
I was talking about tests that passed. They're pretty close together on TPU and far apart on TechSpot. And that's despite TechSpot using a slower card, as you noted.
 
Joined
Sep 18, 2017
Messages
198 (0.08/day)
Honestly, VRAM usage really depends on the resolution. There are gaming cards with 20 or 24 GB of VRAM. That is such a waste and only makes the cards more expensive. Typically, the top-end cards are not that much faster than the next tier down, so they get loaded up with unneeded extras to justify the higher price and the 90, Ti, and XTX designations. All that additional VRAM just sits idle doing nothing for the life of a gaming card. Money well spent.

My recommendation: if you are buying a new card in 2023 and play at resolutions higher than 1080p, get a card with 12-16 GB of VRAM. For 99% of games on the market, 12 GB is enough. Game developers don't want high system requirements, because fewer people would be able to buy and play their games.

There will always be some poorly optimized console ports that run badly and use unreasonable system resources. And there will always be a game or two that pushes the envelope and makes us ask, "Can it run Crysis?"
 

jarpe

New Member
Joined
Feb 12, 2023
Messages
4 (0.01/day)
That is an opinion, not fact. Besides, FSR 3.0 is just around the corner; I'd wait for it before passing judgement.


Whether you like FG or not depends on how sensitive you are to input latency. The technology is not without issues at the moment, especially when you generate extra frames from a low base frame rate.

Personally, I consider FG frame rates irrelevant for comparison.


Not really.


Fair enough - I mostly shop at Scan, that's why I was comparing prices from there. A 5% difference might actually be worth it.
It is not an opinion; in many cases FSR's Quality mode looks worse than DLSS's Performance mode.


Also, FG goes along with Reflex to compensate for latency. And you don't use FG to hit 60 fps; you use it to hit 100+ fps, and the latency in that situation is very good. I was skeptical at first, but after trying FG in several games I can say it is a game changer, and every demanding game needs to have it.

This 16% better RT includes games that barely use RT. In games that use RT heavily, with a meaningful visual impact, the RT cores on the 7900 XTX get overwhelmed; that is why the 4080 is 25-45% faster in heavy RT workloads.
 
Joined
Jan 14, 2019
Messages
12,353 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
It is not an opinion; in many cases FSR's Quality mode looks worse than DLSS's Performance mode.
I haven't tried the latest versions of DLSS since my 2070 died about 6 months ago, so I'll take your word for it. The picture you posted may be an isolated case, but DLSS does look better there, I'll give you that.

Also, FG goes along with Reflex to compensate for latency. And you don't use FG to hit 60 fps; you use it to hit 100+ fps, and the latency in that situation is very good. I was skeptical at first, but after trying FG in several games I can say it is a game changer, and every demanding game needs to have it.
That's the thing... I don't need 100+ FPS. I need 60, or at least 40-45 minimum.

This 16% better RT includes games that barely use RT. In games that use RT heavily, with a meaningful visual impact, the RT cores on the 7900 XTX get overwhelmed; that is why the 4080 is 25-45% faster in heavy RT workloads.
I hope both AMD and Nvidia focus the development of their next architectures on RT. Maybe more RT cores with the same number of traditional shader cores. Raster performance has already plateaued, imo.
 
Joined
Sep 15, 2011
Messages
6,733 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
The keyword being "yet".
In just 3 months, we've had 4 large games where 8 GB started being a serious problem. I predict the trend isn't going to stop at all over the next two years.
We'll see just how long the 10 GB 3080 lasts, and I think the 12 GB 4070/Ti will fare worse. At least with the 3080 you had 2 good years of PS4-era holdovers until the requirements climbed hard. The 4070s feel sufficient "for now", I'm sure, but will that even last 2 years? I highly doubt it.
No man. You're just talking about broken console ports, which work fine even with 8 GB VRAM cards. It's not like all of a sudden they start adding 8K-resolution textures.
Plus, some game engines out there cache into the whole of VRAM, even if you have 24 or 32 GB.
 
Joined
Feb 1, 2019
Messages
3,610 (1.69/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
This is so much BS that it is ridiculous. I have yet to play a game that would require more than 10 GB of VRAM. Even the crappiest of all ports ever released, "The Last of Us Part I", is smooth as butter on Ultra with G-Sync on, even if the so-called in-game VRAM usage is around 12.8 GB.
This kind of post just reads as "I don't get it".

We're not talking about frames per second; we're talking about texture quality, textures going *poof*, and games crashing. Also, some games stutter due to excessive asset swapping (caused by low VRAM).

Nowadays many games have dynamic engines which adjust to available VRAM on the fly, so the effect of low VRAM is not as obvious as it could be.

Some of us are not OK with PS2/PS3-quality textures in 2023.

It's not important to you personally, that's fine; that doesn't mean it's BS though.
 
Joined
Sep 26, 2022
Messages
232 (0.29/day)
Location
Portugal
System Name Main
Processor 5700X
Motherboard MSI B450M Mortar
Cooling Corsair H80i v2
Memory G.SKILL Ripjaws V 32GB (2x16GB) DDR4-3600MHz CL16
Video Card(s) MSI RTX 3060 Ti VENTUS 2X OC 8GB GDDR6X
Display(s) LG 32GK850G
Case NOX HUMMER ZN
Power Supply Seasonic GX-750
No. All RTX 4000 graphics cards already use 2 GB chips. And there definitely won't be any 3 GB chips in the next year or two. If 3 GB chips do appear, it will only be with the release of the RTX 5000.

How does this one manage 20 GB on a 160-bit bus, then?
 
Joined
Mar 19, 2023
Messages
153 (0.25/day)
Location
Hyrule Castle, France
Processor Ryzen 5600x
Memory Crucial Ballistix
Video Card(s) RX 7900 XT
Storage SN850x
Display(s) Gigabyte M32U - LG UltraGear+ 4K 28"
Case Fractal Design Meshify C Mini
Power Supply Corsair RM650x (2021)
2 reasons to not wait:
  1. Low performance/€ improvement (so far)
  2. Growing tensions between China and Taiwan
Bonus:
7700/7700XT will likely have similar performance to the 6800 XT, but with (only) 12 GB of VRAM
Unlikely. The Angstronomics leak, which was very complete, mentioned a 256-bit bus for Navi 32; Navi 31 has 384.

Logically, if the 384-bit bus came with 24 GB, 256-bit should mean 16 GB.
No dice on Navi 33: it's still 128-bit, and unless they double the RAM for certain models, it's still going to be 8 GB.
 

Winssy

New Member
Joined
Mar 31, 2023
Messages
21 (0.03/day)
How does this one manage 20 GB on a 160-bit bus, then?
The chips here are placed on both sides of the PCB (a clamshell layout): two 2 GB chips per memory controller, one on the front side and one on the back. 160-bit / 32-bit = 5 controllers; 5 controllers × 2 sides = 10 places for 2 GB GDDR6 chips = 20 GB of VRAM. But since it is very expensive, we definitely won't see this layout on mid-range cards. The only gaming card with such a chip placement was the expensive 3090, with 24 GDDR6X chips of 1 GB each, 12 on each side (at the time of the 3090's release, there were no 2 GB GDDR6X chips available).
For example, here is a photo of the Asus Strix 3090.
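
To make the arithmetic concrete, here's a minimal sketch (illustrative only; the helper and its name are mine, and the only assumption is the standard 32-bit interface per GDDR6 chip):

```python
def vram_capacity_gb(bus_width_bits: int, chip_density_gb: int, clamshell: bool = False) -> int:
    """Total VRAM when every 32-bit controller carries one chip (or two, in clamshell)."""
    controllers = bus_width_bits // 32             # one memory controller per 32-bit channel
    chips = controllers * (2 if clamshell else 1)  # clamshell adds a chip on the back of the PCB
    return chips * chip_density_gb

print(vram_capacity_gb(160, 2, clamshell=True))  # 20 -> the 20 GB card discussed above
print(vram_capacity_gb(384, 1, clamshell=True))  # 24 -> RTX 3090 (24x 1 GB GDDR6X)
```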
 

Joined
Sep 26, 2022
Messages
232 (0.29/day)
Location
Portugal
System Name Main
Processor 5700X
Motherboard MSI B450M Mortar
Cooling Corsair H80i v2
Memory G.SKILL Ripjaws V 32GB (2x16GB) DDR4-3600MHz CL16
Video Card(s) MSI RTX 3060 Ti VENTUS 2X OC 8GB GDDR6X
Display(s) LG 32GK850G
Case NOX HUMMER ZN
Power Supply Seasonic GX-750
The chips here are placed on both sides of the PCB (a clamshell layout): two 2 GB chips per memory controller, one on the front side and one on the back. 160-bit / 32-bit = 5 controllers; 5 controllers × 2 sides = 10 places for 2 GB GDDR6 chips = 20 GB of VRAM. But since it is very expensive, we definitely won't see this layout on mid-range cards. The only gaming card with such a chip placement was the expensive 3090, with 24 GDDR6X chips of 1 GB each, 12 on each side (at the time of the 3090's release, there were no 2 GB GDDR6X chips available).
For example, here is a photo of the Asus Strix 3090.

Does it have to be done for all the chips? I.e., in the case of the 4070, would we have a 24 GB version, or could they just install enough chips on the back to get 16 GB?
 

Winssy

New Member
Joined
Mar 31, 2023
Messages
21 (0.03/day)
Does it have to be done for all the chips? I.e., in the case of the 4070, would we have a 24 GB version, or could they just install enough chips on the back to get 16 GB?
As far as I know, dual chip placement must be done for all controllers at once. In other words, the 4070 can have either 12 GB or 24 GB of VRAM.
In my opinion, there is only one way to make a 16 GB 4070 using dual chip placement: use 4 memory controllers. But in that case the memory bus would be 128-bit, which would hurt performance.
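
To spell those options out, a quick sketch (mine, not Nvidia's math; it assumes 2 GB chips and the 4070's 192-bit bus, i.e. six 32-bit controllers):

```python
def vram_gb(bus_bits: int, clamshell: bool, chip_gb: int = 2) -> int:
    # chips = controllers, doubled when a second chip sits on the back of the PCB
    chips = (bus_bits // 32) * (2 if clamshell else 1)
    return chips * chip_gb

print(vram_gb(192, clamshell=False))  # 12 -> the shipping 4070
print(vram_gb(192, clamshell=True))   # 24 -> clamshell on the full bus
print(vram_gb(128, clamshell=True))   # 16 -> only by dropping to 4 controllers (128-bit)
```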
 

bug

Joined
May 22, 2015
Messages
13,790 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Does it have to be done for all the chips? I.e., in the case of the 4070, would we have a 24 GB version, or could they just install enough chips on the back to get 16 GB?
Like he said, the arrangement is expensive, so probably not for mid-range cards. Even if the difference wasn't big, who would pay more for a 4070?
 
Joined
May 15, 2020
Messages
697 (0.42/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
As far as I know, dual chip placement must be done for all controllers at once. In other words, the 4070 can have either 12 GB or 24 GB of VRAM.
In my opinion, there is only one way to make a 16 GB 4070 using dual chip placement: use 4 memory controllers. But in that case the memory bus would be 128-bit, which would hurt performance.
Actually, it can be done, but it results in uneven bandwidth across different parts of the memory, which is as bad an idea as it sounds. So we won't be seeing anything like that this time.
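
To get a feel for how uneven it would be, a back-of-the-envelope sketch (entirely hypothetical numbers: a 192-bit card padded to 16 GB by clamshelling only 2 of its 6 controllers, with 21 Gbps GDDR6X assumed):

```python
PER_CHANNEL_GBPS = 21 * 32 / 8   # 84 GB/s per 32-bit channel at 21 Gbps

fast_gb = 6 * 2                  # first 12 GB striped across all 6 controllers
slow_gb = 2 * 2                  # last 4 GB lives behind only 2 controllers
fast_bw = 6 * PER_CHANNEL_GBPS   # 504 GB/s aggregate
slow_bw = 2 * PER_CHANNEL_GBPS   # 168 GB/s once the fast region is full
print(f"{fast_gb} GB @ {fast_bw:.0f} GB/s, then {slow_gb} GB @ {slow_bw:.0f} GB/s")
```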
 
Joined
Aug 2, 2012
Messages
1,988 (0.44/day)
Location
Netherlands
System Name TheDeeGee's PC
Processor Intel Core i7-11700
Motherboard ASRock Z590 Steel Legend
Cooling Noctua NH-D15S
Memory Crucial Ballistix 3200/C16 32GB
Video Card(s) Nvidia RTX 4070 Ti 12GB
Storage Crucial P5 Plus 2TB / Crucial P3 Plus 2TB / Crucial P3 Plus 4TB
Display(s) EIZO CX240
Case Lian-Li O11 Dynamic Evo XL / Noctua NF-A12x25 fans
Audio Device(s) Creative Sound Blaster ZXR / AKG K601 Headphones
Power Supply Seasonic PRIME Fanless TX-700
Mouse Logitech G500S
Keyboard Keychron Q6
Software Windows 10 Pro 64-Bit
Benchmark Scores None, as long as my games runs smooth.
I'd rather have Nvidia than a headache, thank you very much.
 
Joined
Sep 26, 2022
Messages
232 (0.29/day)
Location
Portugal
System Name Main
Processor 5700X
Motherboard MSI B450M Mortar
Cooling Corsair H80i v2
Memory G.SKILL Ripjaws V 32GB (2x16GB) DDR4-3600MHz CL16
Video Card(s) MSI RTX 3060 Ti VENTUS 2X OC 8GB GDDR6X
Display(s) LG 32GK850G
Case NOX HUMMER ZN
Power Supply Seasonic GX-750
Even if the difference wasn't big, who would pay more for a 4070?
I wouldn't. But maybe they would keep it at $600 and lower the standard 12 GB to $500.
 
Joined
Feb 1, 2019
Messages
3,610 (1.69/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Like he said, the arrangement is expensive, so probably not for mid-range cards. Even if the difference wasn't big, who would pay more for a 4070?
Well, if they weren't profiteering, it might cost an extra $50-100 to bump a 4070 to 16 GB. I'd probably be more likely to buy a $700 16 GB 4070 than a $600 12 GB one.

But knowing Nvidia, if they released a 16 GB model it would cost an extra $300.
 
Joined
Sep 15, 2011
Messages
6,733 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
This kind of post just reads as "I don't get it".

We're not talking about frames per second; we're talking about texture quality, textures going *poof*, and games crashing. Also, some games stutter due to excessive asset swapping (caused by low VRAM).

Nowadays many games have dynamic engines which adjust to available VRAM on the fly, so the effect of low VRAM is not as obvious as it could be.

Some of us are not OK with PS2/PS3-quality textures in 2023.

It's not important to you personally, that's fine; that doesn't mean it's BS though.
You understood nothing from what I wrote.
I said I was using Ultra settings in that game, INCLUDING textures, and didn't have any crashes, stuttering, sudden pop-in, or buffering.
 
Joined
Feb 1, 2019
Messages
3,610 (1.69/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
You understood nothing from what I wrote.
I said I was using Ultra settings in that game, INCLUDING textures, and didn't have any crashes, stuttering, sudden pop-in, or buffering.
This is so much BS that it is ridiculous. I have yet to play a game that would require more than 10 GB of VRAM. Even the crappiest of all ports ever released, "The Last of Us Part I", is smooth as butter on Ultra with G-Sync on, even if the so-called in-game VRAM usage is around 12.8 GB.

Ok boss.
 

bug

Joined
May 22, 2015
Messages
13,790 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
He's talking about how games preload stuff if there's VRAM available. It's done to minimize IO, but skipping it doesn't result in a meaningful performance impact.
 
Joined
Feb 1, 2019
Messages
3,610 (1.69/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
He's talking about how games preload stuff if there's VRAM available. It's done to minimize IO, but skipping it doesn't result in a meaningful performance impact.
Preloading can likely prevent stutters. It's a good thing; I'd rather have my assets preloaded than loaded on the fly, causing stutters.

I miss the days of loading everything into RAM and only then playing, with no live loading of stuff in the background. I wonder what prevents them from doing that now? Hmm.
 

bug

Joined
May 22, 2015
Messages
13,790 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Preloading can likely prevent stutters. It's a good thing; I'd rather have my assets preloaded than loaded on the fly, causing stutters.
Well, we're kinda drifting from "must have 16 GB VRAM". Unless you're playing something like Rage, textures only change during level changes or when moving from one area to another. Not exactly the part of the game that would be ruined by a few stutters.
I miss the days of loading everything into RAM and only then playing, with no live loading of stuff in the background. I wonder what prevents them from doing that now? Hmm.
There is no way to do that. Try as you may to fit everything into VRAM, there's a kid somewhere with Photoshop and time on their hands who will take your textures, apply 2x scaling on both axes, call it an HD texture pack, and boom! You're out of VRAM again. Also, 16 GB of VRAM is the same as the typical amount of RAM in a PC; that's pretty imbalanced.
I think what developers/engines do right now is pretty well thought out: set a baseline and, if they find more VRAM than that, try to load some more. Preloading is guesswork though, because you never know which area the player will visit next. You can try to predict the next area based on which "exit" the player approaches, but if the player changes their mind, you've preloaded things that won't be used and initiated IO that may cause other performance drops.
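
A toy sketch of that "baseline plus opportunistic preload" idea (purely illustrative; no real engine works exactly like this, and the asset names are invented):

```python
def plan_preload(vram_total_mb: int, baseline_mb: int, candidates: list[tuple[str, int]]) -> list[str]:
    """Greedily preload extra assets into whatever VRAM is left above the baseline.

    candidates: (asset_name, size_mb) pairs, ordered by predicted likelihood of use.
    """
    budget = vram_total_mb - baseline_mb
    chosen = []
    for name, size_mb in candidates:
        if size_mb <= budget:
            chosen.append(name)
            budget -= size_mb
    return chosen

areas = [("area_east", 900), ("area_west", 700)]
print(plan_preload(8192, 7000, areas))   # 8 GB card: only ['area_east'] fits
print(plan_preload(16384, 7000, areas))  # 16 GB card: caches both -- "idle" VRAM put to work
```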
 
Joined
Feb 1, 2019
Messages
3,610 (1.69/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Well, we're kinda drifting from "must have 16 GB VRAM". Unless you're playing something like Rage, textures only change during level changes or when moving from one area to another. Not exactly the part of the game that would be ruined by a few stutters.

There is no way to do that. Try as you may to fit everything into VRAM, there's a kid somewhere with Photoshop and time on their hands who will take your textures, apply 2x scaling on both axes, call it an HD texture pack, and boom! You're out of VRAM again. Also, 16 GB of VRAM is the same as the typical amount of RAM in a PC; that's pretty imbalanced.
I think what developers/engines do right now is pretty well thought out: set a baseline and, if they find more VRAM than that, try to load some more. Preloading is guesswork though, because you never know which area the player will visit next. You can try to predict the next area based on which "exit" the player approaches, but if the player changes their mind, you've preloaded things that won't be used and initiated IO that may cause other performance drops.
Ah, so you now acknowledge that VRAM capacity can be an issue then?
 