
NVIDIA GeForce RTX 4090 Founders Edition

Joined
May 11, 2018
Messages
1,376 (0.56/day)
Yeah, comparing RTX 3080 with RTX 2080 Ti:

At 1080p it's 18% faster
At 1440p it's 23% faster
At 4K it's 27.5% faster.

MSRP of the RTX 2080 Ti was $999 ($1,199 for the Founders Edition). MSRP of the RTX 3080 was $699.

But now we should be glad the RTX 4080 12GB barely matches the RTX 3090 Ti - so we'll be able to buy an 1,100 EUR card that matches a 1,200 EUR card - wow, much price / performance increase?

And in some cases I predict the RTX 4080 12GB will be much slower, and we'll see an 1,100 EUR card barely match the 800 EUR RTX 3080.
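For reference, the price/performance arithmetic behind that comparison can be sketched like this - the 27.5% and MSRP figures come from the numbers above; everything else is plain arithmetic, not new data:

```python
# Rough 4K perf-per-dollar comparison from the launch-review numbers above.
cards = {
    "RTX 2080 Ti": {"relative_perf": 1.000, "msrp_usd": 1199},  # FE price
    "RTX 3080":    {"relative_perf": 1.275, "msrp_usd": 699},
}

baseline = cards["RTX 2080 Ti"]["relative_perf"] / cards["RTX 2080 Ti"]["msrp_usd"]
for name, c in cards.items():
    ppd = c["relative_perf"] / c["msrp_usd"]
    print(f"{name}: {ppd / baseline:.2f}x the 2080 Ti's perf per dollar")
```

By that math the 3080 delivered roughly 2.2x the perf per dollar of the 2080 Ti at 4K, which is the bar the 4080 cards are being measured against here.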
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
28,148 (3.72/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
@W1zzard we need Spider-Man Remastered and Overwatch in future benchmarks on the 7900 XT.. Borderlands and Witcher are old
Will definitely add Spider-Man even though it's super CPU limited. Not sure about Overwatch due to its always-online design, also it runs 4896748956 FPS and is CPU limited, too. Will probably kick Borderlands 3, but Witcher 3 stays, because it's a very important DX11 game
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.55/day)
Location
Ex-usa | slava the trolls
Will definitely add Spider-Man even though it's super CPU limited. Not sure about Overwatch due to its always-online design, also it runs 4896748956 FPS and is CPU limited, too. Will probably kick Borderlands 3, but Witcher 3 stays, because it's a very important DX11 game

Is it possible to add a review to compare different CPUs running with RTX 4090?
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
28,148 (3.72/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Is it possible to add a review to compare different CPUs running with RTX 4090?
Anything is possible ;)

5800X vs 12900K vs 7700X vs 13900K definitely sounds interesting, but that'll be A LOT of work
 
Joined
Apr 1, 2017
Messages
420 (0.15/day)
System Name The Cum Blaster
Processor R9 5900x
Motherboard Gigabyte X470 Aorus Gaming 7 Wifi
Cooling Alphacool Eisbaer LT360
Memory 4x8GB Crucial Ballistix @ 3800C16
Video Card(s) 7900 XTX Nitro+
Storage Lots
Display(s) 4k60hz, 4k144hz
Case Obsidian 750D Airflow Edition
Power Supply EVGA SuperNOVA G3 750W
Yeah, comparing RTX 3080 with RTX 2080 Ti:

At 1080p it's 18% faster
At 1440p it's 23% faster
At 4K it's 27.5% faster.

MSRP of the RTX 2080 Ti was $999 ($1,199 for the Founders Edition). MSRP of the RTX 3080 was $699.

But now we should be glad the RTX 4080 12GB barely matches the RTX 3090 Ti - so we'll be able to buy an 1,100 EUR card that matches a 1,200 EUR card - wow, much price / performance increase?

And in some cases I predict the RTX 4080 12GB will be much slower, and we'll see an 1,100 EUR card barely match the 800 EUR RTX 3080.
People pushing the narrative that we should be amazed at a generic generational performance leap and grateful for higher prices is really funny :)
 
Joined
Nov 18, 2020
Messages
39 (0.03/day)
Location
Arad, Romania
Processor i9-10850K @ 125W Power Limit
Motherboard ASUS TUF Gaming Z590-PLUS
Cooling Noctua NH-D15S
Memory Kingston KF432C16RBK2/64
Video Card(s) ASUS RTX 3070 TUF GAMING O8G @ 950mV / 2010MHz
Storage Samsung 970 EVO Plus 2TB + Kingston KC3000 2TB + Samsung 860 EVO 2TB + Samsung 870 EVO 4TB
Display(s) ASUS PB287Q + DELL S2719DGF
Case FRACTAL Define 7 Dark TG
Audio Device(s) integrated + Microlab FC330 / Audio-Technica ATH-M50s/LE
Power Supply Seasonic PRIME TX-650, 80+ Titanium, 650W
Mouse SteelSeries Rival 600
Keyboard Corsair K70 RGB TKL – CHERRY MX SPEED
DLAA is a real technology: it's NVIDIA's Deep Learning Anti-Aliasing, similar to DLSS but without rendering at a lower resolution and then upscaling. FG is frame generation.

DLAA is better than TAA, but doesn't offer the performance benefits of DLSS.
Yes, it was my mistake. I thought it should have been DLSS + FG, not DLAA + FG. :)
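For anyone mixing the two up, the render-resolution difference looks roughly like this - a minimal sketch assuming the commonly cited DLSS per-axis scale factors (approximations, not official spec):

```python
# Internal render resolution of DLSS modes vs DLAA at a 4K output.
OUTPUT_W, OUTPUT_H = 3840, 2160
modes = {
    "DLAA":             1.0,    # renders at native res, AI anti-aliasing only
    "DLSS Quality":     2 / 3,  # commonly cited per-axis scale factors;
    "DLSS Balanced":    0.58,   # treat these as approximations
    "DLSS Performance": 0.5,
}
for mode, scale in modes.items():
    w, h = int(OUTPUT_W * scale), int(OUTPUT_H * scale)
    share = (w * h) / (OUTPUT_W * OUTPUT_H)
    print(f"{mode:17s} renders {w}x{h} ({share:.0%} of output pixels)")
```

That's why DLAA improves image quality without the DLSS performance gain: it pays the full native pixel cost.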
 

3x0

Joined
Oct 6, 2022
Messages
974 (1.14/day)
Processor AMD Ryzen 7 5800X3D
Motherboard MSI MPG B550I Gaming Edge Wi-Fi ITX
Cooling Scythe Fuma 2 rev. B Noctua NF-A12x25 Edition
Memory 2x16GiB G.Skill TridentZ DDR4 3200Mb/s CL14 F4-3200C14D-32GTZKW
Video Card(s) PowerColor Radeon RX7800 XT Hellhound 16GiB
Storage Western Digital Black SN850 WDS100T1X0E-00AFY0 1TiB, Western Digital Blue 3D WDS200T2B0A 2TiB
Display(s) Dell G2724D 27" IPS 1440P 165Hz, ASUS VG259QM 25” IPS 1080P 240Hz
Case Cooler Master NR200P ITX
Audio Device(s) Altec Lansing 220, HyperX Cloud II
Power Supply Corsair SF750 Platinum 750W SFX
Mouse Lamzu Atlantis Mini Wireless
Keyboard HyperX Alloy Origins Aqua
AMD will have a REALLY tough time trying to come anywhere near this with the 7000 series, especially if their new CPUs are any indication, since they are literally less efficient than the previous gen:
You realize that the 7950X at 65 W has the same performance as the 5950X at stock? AMD, Intel and NVIDIA have all opted to increase their power consumption drastically, well outside the window of optimal energy efficiency, just for a few percent more performance.
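To put rough numbers on that diminishing-returns point, here is a minimal sketch - the scores and wattages below are assumptions for illustration, not measurements:

```python
# Hypothetical multithreaded scores vs package power, to show the shape
# of the efficiency curve; none of these numbers are measured values.
configs = {
    "5950X stock (~142 W PPT)":     {"score": 100, "watts": 142},
    "7950X @ 65 W TDP (~88 W PPT)": {"score": 100, "watts": 88},
    "7950X stock (~230 W PPT)":     {"score": 120, "watts": 230},
}
for name, c in configs.items():
    print(f"{name}: {c['score'] / c['watts']:.2f} pts/W")
```

If the 65 W-limited part really matches last gen at stock, the stock 7950X is spending roughly 2.6x the power for ~20% more performance - exactly the "outside the optimal window" behavior described above.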
 
Joined
May 11, 2018
Messages
1,376 (0.56/day)
People pushing the narrative that we should be amazed at a generic generational performance leap and grateful for higher prices is really funny :)

That's the thing - it seems like the "generic generational performance leap" will be present only in the RTX 4090; even the RTX 4080 16GB seems too cut down to manage RTX 3080 + 50-70%, and that's a 1,500 EUR card now!

Will we see the push of "but DLSS 3.0 does that, and more!", so who cares about rasterisation uplift? And get a Turing kind of release - perhaps even worse?
 
Joined
Apr 16, 2019
Messages
632 (0.30/day)
You realize that the 7950X at 65 W has the same performance as the 5950X at stock? AMD, Intel and NVIDIA have all opted to increase their power consumption drastically, well outside the window of optimal energy efficiency, just for a few percent more performance.
Ahh, so now that the ball is in the other court, it's fine to compare performance at a certain, limited power and not only at stock? Back when the 12900K was killing it in this metric, all that mattered was its "horrible stock consumption", hehe... And yes, I realize that, but 4090 just pushed the bar so high, there is no way 7000 series will have any hope of even coming within a class of its performance while still staying at least somewhat on the efficiency side of the curve.
 

3x0

Joined
Oct 6, 2022
Messages
974 (1.14/day)
Processor AMD Ryzen 7 5800X3D
Motherboard MSI MPG B550I Gaming Edge Wi-Fi ITX
Cooling Scythe Fuma 2 rev. B Noctua NF-A12x25 Edition
Memory 2x16GiB G.Skill TridentZ DDR4 3200Mb/s CL14 F4-3200C14D-32GTZKW
Video Card(s) PowerColor Radeon RX7800 XT Hellhound 16GiB
Storage Western Digital Black SN850 WDS100T1X0E-00AFY0 1TiB, Western Digital Blue 3D WDS200T2B0A 2TiB
Display(s) Dell G2724D 27" IPS 1440P 165Hz, ASUS VG259QM 25” IPS 1080P 240Hz
Case Cooler Master NR200P ITX
Audio Device(s) Altec Lansing 220, HyperX Cloud II
Power Supply Corsair SF750 Platinum 750W SFX
Mouse Lamzu Atlantis Mini Wireless
Keyboard HyperX Alloy Origins Aqua
there is no way 7000 series will have any hope of even coming within a class of its performance while still staying at least somewhat on the efficiency side of the curve.
Let's wait for reviews
 
Joined
May 11, 2018
Messages
1,376 (0.56/day)
And yes, I realize that, but 4090 just pushed the bar so high, there is no way 7000 series will have any hope of even coming within a class of its performance while still staying at least somewhat on the efficiency side of the curve.

That's not necessarily a given. Lower-end Ada cards are more severely cut down than was normal in the past - in terms of units and memory bandwidth - so will they be pushed harder to compensate, and thus operate less efficiently?

Very few people actually care about the RTX 4090. It's not a normal gaming card, no matter how much reviewers and YouTube influencers drool over its performance.
 
Joined
Sep 27, 2020
Messages
93 (0.06/day)
Will definitely add Spider-Man even though it's super CPU limited. Not sure about Overwatch due to its always-online design, also it runs 4896748956 FPS and is CPU limited, too. Will probably kick Borderlands 3, but Witcher 3 stays, because it's a very important DX11 game
Thanks a lot for your hard work!

I agree on Witcher 3; for me it's still a reference and one of the first games I look at in any GPU review. It's very balanced and its results are important!
 
Joined
Apr 1, 2017
Messages
420 (0.15/day)
System Name The Cum Blaster
Processor R9 5900x
Motherboard Gigabyte X470 Aorus Gaming 7 Wifi
Cooling Alphacool Eisbaer LT360
Memory 4x8GB Crucial Ballistix @ 3800C16
Video Card(s) 7900 XTX Nitro+
Storage Lots
Display(s) 4k60hz, 4k144hz
Case Obsidian 750D Airflow Edition
Power Supply EVGA SuperNOVA G3 750W
there is no way 7000 series will have any hope of even coming within a class of its performance while still staying at least somewhat on the efficiency side of the curve.
people said the exact same thing back when 6000 series was about to be launched xD
 

HTC

Joined
Apr 1, 2008
Messages
4,668 (0.76/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 5800X3D
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Pulse 6600 8 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 20.04.6 LTS
Yeah, comparing RTX 3080 with RTX 2080 Ti

You're comparing a flagship card with a non-flagship card: try comparing it to the 2080 instead (I'm referring to launch-day reviews).

Do that and then, when the 4080 releases, compare its % lead vs the 3080 with the % lead of the 3080 vs the 2080, and then factor in the prices of the cards.
 
Joined
Apr 16, 2019
Messages
632 (0.30/day)
people said the exact same thing back when 6000 series was about to be launched xD
No they didn't - at least those of us who can think didn't. Back then AMD had the superior process, due to Nvidia favoring volume and going with Samsung (which proved to be a great business move; they sold an order of magnitude more 3000 series than AMD did 6000). This time though, if anything, Nvidia will even have a small node edge (N4 vs N5) and it's not hard to predict the outcome. :cool:
 

3x0

Joined
Oct 6, 2022
Messages
974 (1.14/day)
Processor AMD Ryzen 7 5800X3D
Motherboard MSI MPG B550I Gaming Edge Wi-Fi ITX
Cooling Scythe Fuma 2 rev. B Noctua NF-A12x25 Edition
Memory 2x16GiB G.Skill TridentZ DDR4 3200Mb/s CL14 F4-3200C14D-32GTZKW
Video Card(s) PowerColor Radeon RX7800 XT Hellhound 16GiB
Storage Western Digital Black SN850 WDS100T1X0E-00AFY0 1TiB, Western Digital Blue 3D WDS200T2B0A 2TiB
Display(s) Dell G2724D 27" IPS 1440P 165Hz, ASUS VG259QM 25” IPS 1080P 240Hz
Case Cooler Master NR200P ITX
Audio Device(s) Altec Lansing 220, HyperX Cloud II
Power Supply Corsair SF750 Platinum 750W SFX
Mouse Lamzu Atlantis Mini Wireless
Keyboard HyperX Alloy Origins Aqua
This time though, if anything, Nvidia will even have a small node edge (N4 vs N5) and it's not hard to predict the outcome.
The difference in nodes could very well be in naming only, and effectively the same when it comes to perf/W and prices.
 
Joined
May 2, 2017
Messages
7,762 (2.74/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Just like I was saying back in the 3000 vs 6000 efficiency debate, Nvidia on a cutting-edge node is far above the rest, and it truly shows now:

And that's with the top-tier, no-holds-barred card, designed for maximum performance. Optimize it with a lower power limit and some undervolting and you get something that's leagues beyond anything else, which is clearly shown by the ridiculously low "60 Hz V-Sync" consumption.

AMD will have a REALLY tough time trying to come anywhere near this with the 7000 series, especially if their new CPUs are any indication, since they are literally less efficient than the previous gen:
The 4090 is quite clearly CPU limited in the TPU efficiency test scenario (not as hard as at 1080p, but with an average that close, it's CPU limited most of the time), rendering that comparison quite invalid, as the card is essentially running underclocked. Of course the performance is bonkers nonetheless, and the UV/UC/power limiting potential for this card is HUGE (as Der8auer has demonstrated), but these results are not representative. @W1zzard needs to get on this and find another efficiency test scenario.

Edit: autocorrect. Also, W1zzard has tested the 4090, 3090 and 6900 XT at 2160p with some interesting results - the AMD card stays at about the same relative efficiency as it lags behind at the higher resolution, but the 3090 looks much better compared to the 4090.
 
Joined
Jan 18, 2021
Messages
225 (0.15/day)
Processor Core i7-12700
Motherboard MSI B660 MAG Mortar
Cooling Noctua NH-D15
Memory G.Skill Ripjaws V 64GB (4x16) DDR4-3600 CL16 @ 3466 MT/s
Video Card(s) AMD RX 6800
Storage Too many to list, lol
Display(s) Gigabyte M27Q
Case Fractal Design Define R5
Power Supply Corsair RM750x
Mouse Too many to list, lol
Keyboard Keychron low profile
Software Fedora, Mint
"bullshit fake frames"

Tell me more...

After reading earlier pre-review speculation of a 4x increase in FPS with DLSS 3 enabled... I immediately fell into the "what if" pit of too-good-to-be-true skepticism (marketing gimmickry?). So kill the curiosity, tell me more!

Here are a couple of informative videos:

[embedded videos: Hardware Unboxed's DLSS 3 analysis, among others]

I don't expect you to watch all of that, though. The gist seems to be that DLSS 3 frames aren't quite "fake," but they are definitely "half-fake," or maybe even "three-quarters fake." Certainly NVIDIA's marketing around DLSS 3 trends towards fake. Why? Because the extra frames generated by DLSS 3 don't reduce input latency, at all, in contrast to normal extra framerate. In some cases DLSS 3 even makes latency marginally worse than it would be at a lower native framerate.
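A back-of-the-envelope model of why that is: frame generation interpolates between two real frames, so frame N+1 must already exist before the generated in-between frame can be displayed. A simplified sketch (illustrative numbers, not measurements):

```python
def latency_ms(real_fps: float, frame_gen: bool = False) -> float:
    """Toy model: input latency ~ one real frame time. With frame
    generation, frame N is held back until N+1 is rendered, so the
    displayed framerate doubles while latency stays tied to real frames
    (plus roughly half a real frame of extra hold-back)."""
    frame_time = 1000.0 / real_fps
    return frame_time * 1.5 if frame_gen else frame_time

for fps in (120, 60, 30):
    print(f"{fps} fps native: ~{latency_ms(fps):.1f} ms")
    print(f"{fps} fps real + FG ({fps * 2} fps shown): ~{latency_ms(fps, True):.1f} ms")
```

So in this toy model, 30 real fps with FG looks like 60 fps but feels worse than native 30, while native 60 both looks and feels better - matching the reviews' findings.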

At first this didn't sound so bad to me, but it turns out that the use case for this tech is a pretty small niche. For example, if you're already at or near your monitor's max refresh rate, then DLSS 3 is wasted, because the screen can't convey the visual smoothness benefits. Likewise, if you're looking to push stratospheric FPS for competitive gaming, DLSS 3 is completely pointless.

On the other side of the spectrum, at lower FPS numbers DLSS 3's visual artifacting is more noticeable, so the extra frames come at a higher visual cost without providing any benefit in terms of responsiveness. Plus, DLSS 3 disables V-Sync and FPS limiters by default, so there's tearing if you don't have Variable Refresh Rate, or if you're below/above your monitor's thresholds for VRR. These factors limit DLSS 3's appeal as an FPS booster on lower-end or mid-range hardware.

So FWIW, Tim says this tech is best for people who fit the following criteria:

- They're already capable of running the game at roughly 100-120 FPS without DLSS 3;
- They're running a (VRR-capable) monitor with a refresh rate significantly higher than 100-120 Hz, and
- They're playing games that aren't especially latency sensitive (e.g. graphically impressive single player stuff, like Cyberpunk 2077)

I don't believe this is an especially large market. People expecting DLSS 3 to be anywhere near as impactful as DLSS 2 are destined for disappointment.

EDIT: Here's the companion article to the HUB video linked above, for those who are more text-inclined: https://www.techspot.com/article/2546-dlss-3/
 
Joined
Apr 12, 2013
Messages
7,631 (1.77/day)
In a dynamic scene, frame generation is basically useless, especially in high-FPS scenarios. I can't see how the predicted or AI-generated frame can ever be as accurate as the real scene!
You would probably need 10x the computational power & 1,000-10,000x more AI training to get it working in an acceptable way - acceptable to me, at least.
 
Joined
Nov 26, 2021
Messages
1,804 (1.55/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
The 4090 is quite clearly CPU limited in the TPU efficiency test scenario (not as hard as at 1080p, but with an average that close, it's CPU limited most of the time), rendering that comparison quite invalid, as the card is essentially running underclocked. Of course the performance is bonkers nonetheless, and the UV/UC/power limiting potential for this card is HUGE (as Der8auer has demonstrated), but these results are not representative. @W1zzard needs to get on this and find another efficiency test scenario.
I think increasing the test scene's resolution to UHD for cards faster than 3090 Ti will be enough to resolve that.
 
Joined
May 2, 2017
Messages
7,762 (2.74/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I think increasing the test scene's resolution to UHD for cards faster than 3090 Ti will be enough to resolve that.
Possibly, though it also skews inter-architectural comparisons, as different architectures scale across resolutions differently. The ideal for a broadly representative test would be a demanding 1440p title that still scales to very high fps without becoming CPU limited.
 
Joined
Nov 26, 2021
Messages
1,804 (1.55/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Possibly, though it also skews inter-architectural comparisons, as different architectures scale across resolutions differently. The ideal for a broadly representative test would be a demanding 1440p title that still scales to very high fps without becoming CPU limited.
Funnily enough, for all the talk of DX12 decreasing CPU bottlenecks, the one game that doesn't seem CPU limited at 1440p is The Witcher 3.

I also excluded all the games from TPU's test suite that are clearly CPU limited and got somewhat better speedups for the 4090: 53% and 73% over the 3090 Ti and the 3090 respectively at 4K. The games that I excluded are:

  • Battlefield V
  • Borderlands 3
  • Civilization VI
  • Divinity Original Sin II
  • Elden Ring
  • F1 22
  • Far Cry 6
  • Forza Horizon 5
  • Guardians of the Galaxy
  • Halo Infinite
  • Hitman 3
  • Watch Dogs Legion
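For anyone wanting to reproduce that kind of filtering, a minimal sketch - the per-game fps values below are placeholders; the real figures are in the review charts:

```python
from math import prod

# Placeholder 4K fps per game; substitute the values from TPU's charts.
fps = {
    "Cyberpunk 2077": {"RTX 4090": 72.0, "RTX 3090 Ti": 46.0, "RTX 3090": 41.0},
    "Control":        {"RTX 4090": 110.0, "RTX 3090 Ti": 71.0, "RTX 3090": 63.0},
    "Borderlands 3":  {"RTX 4090": 160.0, "RTX 3090 Ti": 130.0, "RTX 3090": 118.0},
}
cpu_limited = {"Borderlands 3"}  # drop the games listed above

def speedup(card: str, baseline: str) -> float:
    """Geometric mean of per-game fps ratios, excluding CPU-limited titles."""
    ratios = [g[card] / g[baseline]
              for name, g in fps.items() if name not in cpu_limited]
    return prod(ratios) ** (1 / len(ratios)) - 1

print(f"4090 vs 3090 Ti: +{speedup('RTX 4090', 'RTX 3090 Ti'):.0%}")
print(f"4090 vs 3090:    +{speedup('RTX 4090', 'RTX 3090'):.0%}")
```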
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
28,148 (3.72/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
I think increasing the test scene's resolution to UHD for cards faster than 3090 Ti will be enough to resolve that.
All cards have to run the same scene + resolution, because "efficiency" = "fps / power" .. and it has to be a game that's fair to both vendors .. and something popular .. leaning towards switching to Doom Eternal 4K for all cards .. even very old cards get decent fps there and don't fall off a cliff due to VRAM limits
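Put differently, the metric only works if every card's fps comes from the same workload. A minimal sketch of the calculation, with made-up numbers:

```python
# "efficiency" = fps / power: all cards must run the same scene at the
# same resolution or the numerators aren't comparable. Numbers below are
# hypothetical, for illustration only.
results = {
    "RTX 4090":   {"fps": 140.0, "watts": 430.0},
    "RTX 3090":   {"fps": 80.0,  "watts": 355.0},
    "RX 6900 XT": {"fps": 70.0,  "watts": 300.0},
}
for card, r in results.items():
    print(f"{card}: {r['fps'] / r['watts']:.3f} fps/W")
```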
 
Joined
Nov 26, 2021
Messages
1,804 (1.55/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
All cards have to run the same scene + resolution, because "efficiency" = "fps / power" .. and it has to be a game that's fair to both vendors .. and something popular .. leaning towards switching to Doom Eternal 4K for all cards .. even very old cards get decent fps there and don't fall off a cliff due to VRAM limits
My bad; I forgot about the efficiency metric. I was only thinking of peak power.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
28,148 (3.72/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
My bad; I forgot about the efficiency metric. I was only thinking of peak power.
you mean "maximum" in my charts? that's furmark and definitely not cpu limited. but furmark is a totally unrealistic load, that's why I also have a real gaming load and the differences are huge
 