
NVIDIA GeForce RTX 4070 Super Founders Edition

Joined
Sep 17, 2014
Messages
22,642 (6.04/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
12 GB will be enough at 1080p till 2030.
At 1440p, till probably 2028.
At 4K, it's fine now and will probably do until 2026.

It's 8 GB that's not enough as of now. 12 GB is fine. What hurts this GPU more is its low VRAM bandwidth.
Dream on ;) I give it 3 years.
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Aaand right on cue here are the "12GB isn't enough" crowd. It is, for the simple reason that consoles don't even have 12GB.
 
Joined
Sep 17, 2014
Messages
22,642 (6.04/day)
I won't showcase every single game here, but if we don't count VRAM hogs (only a couple exist as of now), 4K gaming is perfectly accessible below the 10 GB VRAM point. What ruins the 4K experience for the 4070 series is that these GPUs are not fast enough to max 4K out, and you will need either upscaling or some settings turned down for games to reach 60+ FPS.

View attachment 329986

IoA is a UE5 game, and UE5 is one of the most popular game engines out there. A lot of upcoming AAA games will be based on this engine, and it doesn't slurp your VRAM nearly as much as you're whining about.

Starfield is also a clear example as to why 12 GB is NOT an issue:

View attachment 329987

Even the Resident Evil 4 Remake, one of the worst-case scenarios for nVidia GPUs in general, runs quite well at 4K on these GPUs:

View attachment 329988

Games like Cities: Skylines, The Last of Us, Hogwarts Legacy etc. are outliers, and they will remain outliers for at least a couple of years from now.

At 1440p, DLSS Quality lets you shrink VRAM requirements a bit, and at 4K you're even good to go with DLSS Balanced or even Performance mode. DLSS is by no means ideal, but this feature massively helps these GPUs and will improve over time.
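As a rough sketch of why upscaling eases the load: DLSS renders internally at a reduced resolution before upscaling. The per-axis scale factors below are the commonly quoted ones for each mode, so treat them as assumptions:

```python
# Commonly quoted per-axis DLSS scale factors (assumed here for illustration):
# Quality ~0.667, Balanced ~0.58, Performance ~0.5
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal resolution DLSS renders at before upscaling to out_w x out_h."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = render_resolution(3840, 2160, mode)
    print(f"4K {mode}: renders internally at {w}x{h}")
```

Fewer rendered pixels means smaller render targets, which is the main reason the VRAM footprint shrinks a bit.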

Overall, 12 GB on both these GPUs is fine. I'd very much love to see ~23 Gbps VRAM on the Super though, because this card suffers from limited VRAM bandwidth, especially at higher resolutions.
Games like Cities et al. aren't outliers; they've always been there. The better classification is 'mainstream' versus 'not-so-mainstream' games. And then you'll see that in both categories there are numerous games that already like to have more than 12GB if you want to really max them out, or mod something to make them better.

We can talk all day about the % of games that might or might not like more than 12GB, and in what time frame that amount might increase, but it's irrelevant. If you happen to like one of them, you've got yourself an inadequate GPU for it.
The same principle applies to CPUs. You want one that runs every game you throw at it and doesn't bottleneck you hard in game X or Y, right? And sure, there are games that are just horribly coded and run like shit on all CPUs, but even then, the faster CPU will still net you an advantage the slower one doesn't. You could also relate that to core count near the end of the quad-core era, for a case similar to VRAM 8 vs 12 vs 16 today. VRAM works the same way: you either have enough, or you don't. The only real defining factor in what is the best GPU, imho, is the one that runs out of core oomph last at the price you want to pay. VRAM shouldn't be the deal breaker.

But then, that's my opinion. The underlying question really is how much $ DLSS and RT are worth to you. Other than that, Nvidia simply offers a lesser product for your money throughout the entire stack. Those are the facts as they are laid out before us. We should really stop denying that games exist that require a certain amount of VRAM, whatever the cause: lack of experience, the reviews we did or didn't read, knowledge, or point of view. They exist. They did in 2022. They did in 2023. And they do going forward. And yes, it IS silly that we're getting the hardware we get at the price we're paying right now. Again, let's not be in denial. The current crop of GPUs is subpar. That includes AMD's - they're not priced aggressively enough for what they offer. We're choosing between evils.
 
Joined
Nov 14, 2012
Messages
95 (0.02/day)
It's so ridiculous that people claim ray tracing is better on a 4070 Super than on AMD when the performance penalty makes games almost unplayable.
Anyone who wants to play with ray tracing today needs a 4090, no less.
A light RT implementation costs ~20% FPS; a heavy RT implementation costs 50%+ FPS.
So what's the point of playing games at 30-60 FPS?
A feature you can only use at 1080p on a $600 GPU...
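To put rough numbers on that complaint (the -20%/-50% hits quoted above, and the 90 FPS baseline below, are illustrative assumptions, not measurements):

```python
def fps_with_rt(base_fps: float, penalty: float) -> float:
    """FPS after an RT performance penalty.

    penalty is a fraction, e.g. 0.5 means "-50% FPS".
    """
    return base_fps * (1 - penalty)

# A hypothetical card doing 90 FPS in pure raster, under the quoted penalties:
print(fps_with_rt(90, 0.20))  # light RT implementation
print(fps_with_rt(90, 0.50))  # heavy RT implementation
```

Which is the core of the argument: a comfortable raster frame rate can drop to the 45 FPS range once heavy RT is enabled.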
 
Joined
Nov 27, 2023
Messages
2,482 (6.41/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (24H2)
We're gonna need to hope more cards can overclock their memory like that dual card, so we can avoid absolutely pathetic results like the 4070 Super tying the 3080 10GB in CS2 at 4K because of that bandwidth. Yes, it's still an issue at 1440p; the 4070 Super should not be beating the 3080 10GB by only 5% either. None of AD104 is suited to 500GB/s of bandwidth, besides a theoretical 4070 with the full 48 MB of L2 cache.
I don’t think that CS2 results have anything to do with the memory bandwidth. It’s just the usual case of Source being Source and behaving in weird ways on newer hardware. It might be fixed with drivers/game updates. If it WAS bandwidth we would see the 7800XT beating at least the base 4070. It does not. In any resolution. The game is just not that memory hungry, it’s an e-sports title.
 
Joined
May 24, 2023
Messages
948 (1.65/day)
People KNOW that nVidia is a terrible company and still choose to support them, like you did.
I did not choose an AMD GPU because of the high idle and multi-monitor power draw. I hate that. The GPUs from the terrible company at least work as expected in this area. Even the 4070 itself is a good product, very efficient, etc. The price, and the position a customer is in when trying to decide what to buy, are not good at all.
 
Joined
Feb 24, 2023
Messages
3,126 (4.71/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
I don't want to repeat the 3070 Ti experience while the RX 6800 is still relevant; there are a lot of vids about how bad having 8GB is today
8 GB was uncharted territory till 2016. Then it shipped only on the highest-end GPUs, starting with the GTX 1070 (or the RX 470, but that GPU is a different story entirely).
Then we observed another two generations of upper-range 8 GB GPUs (the 2070 and 3070 also have 8 GB onboard). That's why 8 GB looked not so great (but acceptable) on the 3060 Ti, bad on the 3070, and was a complete insult on the 3070 Ti.

This effectively made 8 GB a 3-generation, or 6-year, thing: the first iteration (GTX 1070, 1080) was incapable of sensibly saturating that much VRAM, the second (2070, 2080) was perfectly balanced, and the third (3060 Ti, 3070) had more horsepower than 8 GB of VRAM lets them make full use of.

12 GB is 1.5 times as much, and the games where 8 GB is not enough are coincidentally the games where 12 GB is plenty. You realistically need about 9.5 GB for a smooth experience in any game whatsoever at 1080p, or 11.5 GB at 1440p. 12 GB is scuffed for some titles at 4K, and is also not particularly great if we enable ray tracing, but, y'know, heavy RT destroys all existing GPUs, and light RT doesn't consume enough VRAM to matter at this point.

Cranking EVERY SINGLE SETTING to Ultra is not mandatory for a proper gaming experience. Some games look marvellous even at Low presets. This makes the 4070 Super a 1080p beast, a great 1440p choice, and a 4K-ready device. Ready as in: you can play anything, but some games will require upscaling and/or lower settings.

Regarding the RX 470 and AMD GPUs in general: these GPUs handle VRAM differently. Recent reviews showcase about a 500 to 900 MB deficit on AMD's side at 1080p, meaning the 4060 with its 8 GB is just barely enough for some titles where the 7600, with the same 8 GB VRAM buffer, ends up eating dirt and stuttering like no tomorrow. In the grand scheme of things, the 4070 Super is definitely behind GPUs like the 7800 XT in VRAM terms, both speed-wise and capacity-wise. That shows at 4K and in very heavy 1440p presets. But let's be honest: even despite worse VRAM overall, the 4070 Super does look like the better buy. And at 600 bucks, these 12 GB are expensive, but 500 GB/s is even worse.

An 8 GB 256-bit GPU with exactly the same L2 cache and exactly the same die would have outperformed this 4070 Super in 90+% of games, especially at 4K. Just look at what the 3080 does to the 4070 series at 4K. VRAM bandwidth, baby.
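The bandwidth numbers being thrown around here fall straight out of bus width times effective memory speed. A quick sketch (the per-card specs below are the commonly cited ones, assumed here for illustration):

```python
# Peak VRAM bandwidth = (bus width in bits / 8 bits per byte) * data rate in Gbps
def vram_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Return theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

# Commonly cited specs (assumptions for this sketch):
cards = {
    "RTX 4070 Super (192-bit, 21 Gbps GDDR6X)": (192, 21.0),
    "RTX 3080 10GB (320-bit, 19 Gbps GDDR6X)": (320, 19.0),
    "RX 7800 XT (256-bit, 19.5 Gbps GDDR6)": (256, 19.5),
}
for name, (bus, rate) in cards.items():
    print(f"{name}: {vram_bandwidth_gbps(bus, rate):.0f} GB/s")
```

That arithmetic is why the thread keeps quoting roughly 500 GB/s for this card against the 3080's roughly 760 GB/s, and why a hypothetical 256-bit version would look so different on paper.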
 
Joined
Nov 27, 2023
Messages
2,482 (6.41/day)
Aaand right on cue here are the "12GB isn't enough" crowd. It is, for the simple reason that consoles don't even have 12GB.
To play devil's advocate in this particular case: I assume most people want to use settings quite beyond what the console versions run at, and thus assume more memory will be needed. Of course, then we get into the whole idea of diminishing returns and whether the minor visual improvements from Uber Epic Ultra settings are actually meaningful, let alone whether anyone at the dev studio bothered to optimize them at all.
But don’t get me wrong, I am with you on this.
 
Joined
Jan 8, 2017
Messages
9,499 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
This card is meant for 1080P.
$600 card for 1080p? :roll:

My brother in Christ, the year isn't 2010 anymore. Saying this is "meant for 1080p" is outright insane. If you said this as some kind of excuse for something, it isn't one; you should be able to play at 4K with a card this expensive at this point in time. That's not unreasonable at all.
 
Joined
Sep 17, 2014
Messages
22,642 (6.04/day)
8 GB was uncharted territory till 2016. Then it shipped only on the highest-end GPUs, starting with the GTX 1070 (or the RX 470, but that GPU is a different story entirely).
Then we observed another two generations of upper-range 8 GB GPUs (the 2070 and 3070 also have 8 GB onboard). That's why 8 GB looked not so great (but acceptable) on the 3060 Ti, bad on the 3070, and was a complete insult on the 3070 Ti.

This effectively made 8 GB a 3-generation, or 6-year, thing: the first iteration (GTX 1070, 1080) was incapable of sensibly saturating that much VRAM, the second (2070, 2080) was perfectly balanced, and the third (3060 Ti, 3070) had more horsepower than 8 GB of VRAM lets them make full use of.

12 GB is 1.5 times as much, and the games where 8 GB is not enough are coincidentally the games where 12 GB is plenty. You realistically need about 9.5 GB for a smooth experience in any game whatsoever at 1080p, or 11.5 GB at 1440p. 12 GB is scuffed for some titles at 4K, and is also not particularly great if we enable ray tracing, but, y'know, heavy RT destroys all existing GPUs, and light RT doesn't consume enough VRAM to matter at this point.

Cranking EVERY SINGLE SETTING to Ultra is not mandatory for a proper gaming experience. Some games look marvellous even at Low presets. This makes the 4070 Super a 1080p beast, a great 1440p choice, and a 4K-ready device. Ready as in: you can play anything, but some games will require upscaling and/or lower settings.

Regarding the RX 470 and AMD GPUs in general: these GPUs handle VRAM differently. Recent reviews showcase about a 500 to 900 MB deficit on AMD's side at 1080p, meaning the 4060 with its 8 GB is just barely enough for some titles where the 7600, with the same 8 GB VRAM buffer, ends up eating dirt and stuttering like no tomorrow. In the grand scheme of things, the 4070 Super is definitely behind GPUs like the 7800 XT in VRAM terms, both speed-wise and capacity-wise. That shows at 4K and in very heavy 1440p presets. But let's be honest: even despite worse VRAM overall, the 4070 Super does look like the better buy. And at 600 bucks, these 12 GB are expensive, but 500 GB/s is even worse.

An 8 GB 256-bit GPU with exactly the same L2 cache and exactly the same die would have outperformed this 4070 Super in 90+% of games, especially at 4K. Just look at what the 3080 does to the 4070 series at 4K. VRAM bandwidth, baby.
Consider this for a point of view on 'GPU VRAM requirements over time':

In 2012, we had 2GB on the high end, which evolved to 3GB peak.
In 2014, we had 4GB on the high end, 6GB peak. (+100%)
In 2016, it doubled to 8GB to keep track with consoles, 11GB peak. (+100%)
In 2018, we stalled at 8GB, 11GB peak. (+0%)
In 2020, we stalled at 8-10GB, with a few enthusiast cards above it (new segment). Almost the entire lineup later got double VRAM versions for god knows what reason. (+0%)
In 2023, we're looking at 12GB with 16GB 'peak' at the price point of the non enthusiast cards. (+50% across 3 gens)
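The growth percentages in that list are simple generation-over-generation ratios; a minimal sketch using the mainstream capacities listed above:

```python
# Mainstream high-end VRAM per generation, as listed above (year, GB)
timeline = [(2012, 2), (2014, 4), (2016, 8), (2018, 8), (2020, 8), (2023, 12)]

# Percent growth between successive entries
for (y0, v0), (y1, v1) in zip(timeline, timeline[1:]):
    growth = (v1 - v0) / v0 * 100
    print(f"{y0} -> {y1}: {v0}GB -> {v1}GB ({growth:+.0f}%)")
```

Which reproduces the pattern in the list: two +100% jumps, two generations of stagnation, then a +50% bump spread across three generations.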

Over all this time, everyone was playing their shit at 1080p, 1440p. Some single digit % in 4K or UW. Resolution is irrelevant.

Cache is supposed to make up for the abysmal bandwidth on this paltry VRAM. And it does! To an extent.
And you are defending that games will keep pace with this nonsense? Doubtful, as proven by the games that want more, either because they show lots of different assets and textures (Cities..), or because they run newer engines (UE5) and RT (tm* Nvidia), or because they're simply console ports from machines that have more than 10GB to work with and will very soon readily want more than 12 come the next update.

You really need to escape copium mode and look at the facts. VRAM relative to core power has been sharply reduced since 2018 and games are catching up to it fast. The console market has every reason not to wait on Nvidia's lazy ass overpriced crap.
 
Joined
Dec 29, 2020
Messages
210 (0.14/day)
Knowing AMD, they will probably let the price of the 7800 XT float down a bit after this.
As it stands, with lower efficiency, lower RT performance, and a worse upscaler, Nvidia can ask a moderate premium at the same performance. The current price difference is not enough; that gap needs to widen.

Generally AMD has done just that: let the market move the price down.
 
Joined
Nov 27, 2023
Messages
2,482 (6.41/day)
And you are defending that games will keep pace with this nonsense? Doubtful, as proven by the games that want more, either because they show lots of different assets and textures (Cities..), or because they run newer engines (UE5) and RT (tm* Nvidia)
Not… the best examples there my man. Cities 2 is fundamentally broken, not even poorly optimized, just broken. It has an abysmal engine. It chokes because it doesn’t cull or occlude anything. Not because it wants to show off some wonderful textures and assets. And UE5 is made from the inception with upscaling in mind. Doesn’t matter if we like it, that’s how the engine is designed. You are NOT supposed to run it at native. All of its flagship features like Nanite are developed with upscaling in mind.
 
Joined
Sep 17, 2014
Messages
22,642 (6.04/day)
Not… the best examples there my man. Cities 2 is fundamentally broken, not even poorly optimized, just broken. It has an abysmal engine. It chokes because it doesn't cull or occlude anything. Not because it wants to show off some wonderful textures and assets. And UE5 is made from the inception with upscaling in mind. Doesn't matter if we like it, that's how the engine is designed. You are NOT supposed to run it at native. All of its flagship features like Nanite are developed with upscaling in mind.

Here's an early campaign of TW WH3, if you need other examples. It will happily use 12GB and some bits north of it. Not even maxed, mind, I'm at 3440x1440.

And sure, utilized versus allocated. Or: potentially stuttery versus butter smooth. I've been in the game for too long not to know what to buy a GPU on ;)

Ironically you will find high VRAM usage in a lot of games that aren't necessarily very graphically intensive on the core. 4X, (grand) strategy, sim. Essentially also games that like to use mods to expand the experience.

Warhammer3_2023_04_15_15_34_57_177.jpg
 
Joined
Nov 27, 2023
Messages
2,482 (6.41/day)
Here's an early campaign of TW WH3, if you need other examples. It will happily use 12GB and some bits north of it. Not even maxed, mind, I'm at 3440x1440.

And sure, utilized versus allocated. Or: potentially stuttery versus butter smooth. I've been in the game for too long not to know what to buy a GPU on ;)

View attachment 329998
WH3 is also quite broken, which was the reason W1zz even removed it from the bench suite. And from all the benches I saw, there are no real benefits to smoothness (i.e. low-percentile frametimes) between, say, AMD cards with more VRAM and NV cards with less. Whenever it chokes, it chokes hard on both CPU and raw GPU power. Again, Warscape is an old and abused engine; drawing conclusions from it is silly.
 
Joined
Sep 17, 2014
Messages
22,642 (6.04/day)
WH3 is also quite broken, which was the reason W1zz even removed it from the bench suite. And from all the benches I saw, there are no real benefits to smoothness (i.e. low-percentile frametimes) between, say, AMD cards with more VRAM and NV cards with less. Whenever it chokes, it chokes hard on both CPU and raw GPU power. Again, Warscape is an old and abused engine; drawing conclusions from it is silly.
Warhammer 3 plays fine - or played, before CA broke it and then proceeded to not fix it.

I frankly don't care whether engines are ancient or not; I just care if shit runs properly, and I start upgrading when it doesn't. Every time I felt compelled to upgrade, I was short on something: CPU cores (3570K), RAM (8GB), or VRAM (2GB on 2x 660, then 2GB on a single 770, and later 3GB on a 780 Ti). It was only with Pascal's perfectly balanced 1080 that core and VRAM ran into their limits in tandem beautifully, and only because I was already limiting myself to get decent frames. On every GPU prior, it was VRAM killing me.
 
Joined
Nov 27, 2023
Messages
2,482 (6.41/day)
Warhammer 3 plays fine - or played, before CA broke it and then proceeded to not fix it.
And it played fine on 12 and even 10 and even 8 gig cards. As fine as it could with regards to the cards raw power, that is. Even the resolution scaling was quite linear.
 
Joined
Sep 17, 2014
Messages
22,642 (6.04/day)
And it played fine on 12 and even 10 and even 8 gig cards. As fine as it could with regards to the cards raw power, that is. Even the resolution scaling was quite linear.
It also depends what you bench and play, the hardest nut to crack is really the campaign map. Battles are easy on VRAM.

When I ran WH3 campaign map on my 1080, it was a stuttery mess even with lower settings that would still exceed or hit 8GB. 30-40 FPS, but stuttery.
 
Joined
Jan 20, 2019
Messages
1,589 (0.74/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
As always, GREAT reviews from the wand-waving W1zzard. The appreciation never gets old, and the catalogued data and how it's presented, certainly not forgetting the amount of work that goes into it, deserve Tina Turner's "SIMPLY THE BEST" :)

RE: 4070 SUPER: Looks great, but $600... nah! For this sort of money you'd at least expect some additional reward: 16GB VRAM and some of that wider/faster memory bandwidth for a more appealing ageing process (esp. hi-res gaming). I'm just not conditioned to accept $600 for a mid-segment card which should have replaced the original 4070 at $500 (yep, a further $50 reduction), similar to the 4080 SUPER coming in at $999 when the standard 4080 was extortionately selling at $1200+. I would have liked to see the 4070 Ti SUPER in the $600 range, but what can you do; it's Nvidia, with nice products, daylight-robbery MSRPs, and no doubt people willing to splurge (not me).

lol, funny thing is... I read a comment somewhere above from someone complaining about buying a standard 4070 and now feeling gutted. Yep, that's where I am, even without the purchase.
 
Joined
Nov 27, 2023
Messages
2,482 (6.41/day)
It also depends what you bench and play, the hardest nut to crack is really the campaign map. Battles are easy on VRAM.

When I ran WH3 campaign map on my 1080, it was a stuttery mess even with lower settings that would still exceed or hit 8GB.
Yup, but the fact that the goddamn map screen runs worse than battles with potentially hundreds, if not thousands, of units kinda speaks to the fact that the engine is not doing great. I heavily dislike making judgments on performance trends based on software that just doesn't function properly. I mean, I could make a scenario in SC2 that would choke even the fastest modern CPU and cause heavy frame drops. This stems from the engine being almost entirely single-threaded and caring basically only about the raw IPC and clocks of one core. Does this mean we should say that, for example, improving game performance through cache like AMD did is unviable? Of course not.
 
Joined
Jul 20, 2020
Messages
1,149 (0.71/day)
System Name Gamey #1 / #3
Processor Ryzen 7 5800X3D / Ryzen 7 5700X3D
Motherboard Asrock B450M P4 / MSi B450 ProVDH M
Cooling IDCool SE-226-XT / IDCool SE-224-XTS
Memory 32GB 3200 CL16 / 16GB 3200 CL16
Video Card(s) PColor 6800 XT / GByte RTX 3070
Storage 4TB Team MP34 / 2TB WD SN570
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / TT Versa H18
Power Supply EVGA 650 G3 / EVGA BQ 500
I see posts about this card and 4K. This card is meant for 1080p. If you want high-end gaming at 4K, then look at the 4090.


Explain the problem with this card and 1440p. One game falls just short at Ultra (oh no, turn a setting down to High) and the other has a broken game engine, as it sucks on every GPU available.

$600 for 1080p is asinine. That's what the 4060/Ti is for, or do you think that's only a 720p card for $300-400?
 
Joined
Sep 17, 2014
Messages
22,642 (6.04/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Yup, but the fact that the goddamn map screen runs worse than the battles with potentially hundreds, if not thousands, of units kinda speaks to the fact that the engine is not doing great. I heavily dislike making judgments on performance trends based on software that just doesn't function properly. I mean, I could make a scenario in SC2 that would choke even the fastest modern CPU and cause heavy frame drops. This stems from the engine being almost entirely single-threaded and caring basically only about the raw IPC and clocks of one core. Does this mean we should say that, for example, improving game performance through cache like AMD did is unviable? Of course not.
I don't really understand your analogy there. The scenario in SC2 (and SC2 in general) is that it will benefit from the fastest single core you could throw at it. I don't even know how it would run on X3Ds, but that is a development that feeds any number of cores; it's completely detached from what SC2 is limited by...? I would guess it also runs SC2 like a dream, as it can feed the core faster. So SC2 actually makes a case for cache improvements in CPUs.

At the same time, that campaign map does not stutter if you throw sufficient power at it. It's just a very full map with a lot of costly terrain on it. The amount of shit happening on there... god almighty. I much preferred the more toned-down maps of earlier versions. At the same time, even WH2's Mortal Empires map stutters as you hover over the big maelstrom in the middle, another such map element I highly despise.

So yeah, sure, you can cripple ANY engine with that amount of logic pushed, and with a very heavy map, any system. Point taken. VRAM isn't the only factor in play there, I agree on that too. The CPU is as pivotal, if not more so. Still, though, I did see notable differences when I moved from 8 to 20 GB worth of GPU in this game. On the same CPU, too.
 

Ainygma

New Member
Joined
Jan 16, 2024
Messages
3 (0.01/day)
I'm sure the Nvidia fanboys will defend this overpriced card. It should cost $499, which is what the 1070 would cost today adjusted for inflation. There is about a $100-110 difference between this card and the RX 7800 XT. $100 more for DLSS and 25-35% more RT performance is a bad purchase, IMO, since the RX 7800 XT has 50% more VRAM, which means it will not need to be replaced as soon in future games. Do not even come at me about frame generation. Frame generation is fake frames; I prefer real frames. I do not particularly like DLSS and FSR upscaling either, since native resolution is better.
 
Joined
Dec 28, 2013
Messages
151 (0.04/day)
This card should be €400 max, and that's for the over-the-top AIB models, like the Strix.
 
Joined
Nov 27, 2023
Messages
2,482 (6.41/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (24H2)
I don't really understand your analogy there. The scenario in SC2 (and SC2 in general) is that it will benefit from the fastest single core you could throw at it. I don't even know how it would run on X3Ds, but that is a development that feeds any number of cores; it's completely detached from what SC2 is limited by...? I would guess it also runs SC2 like a dream, as it can feed the core faster. So SC2 actually makes a case for cache improvements in CPUs.
Oh, it does run amazingly well, no question there. But even with the 7800X3D you can and will see drops below 60 in a regular 4v4 game, let alone my theoretical insane scenario. From what I heard, the 13900K/14900K(S) actually does do a bit better there, hence my clock point.
Another example of such behavior is Age of Empires 4. Same reasoning there, I imagine.
 
Joined
Sep 17, 2014
Messages
22,642 (6.04/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Oh, it does run amazingly well, no question there. But even with the 7800X3D you can and will see drops below 60 in a regular 4v4 game, let alone my theoretical insane scenario. From what I heard, the 13900K/14900K(S) actually does do a bit better there, hence my clock point.
Another example of such behavior is Age of Empires 4. Same reasoning there, I imagine.
So SC2 is still mostly clocks. Hah. Nice.
 
Top