
NVIDIA GeForce RTX 4090 Founders Edition

Joined
Nov 26, 2021
Messages
1,804 (1.55/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
you mean "maximum" in my charts? that's furmark and definitely not cpu limited. but furmark is a totally unrealistic load, that's why I also have a real gaming load and the differences are huge
I phrased it badly. I meant the Gaming power usage; I consider that the maximum for most purposes as Furmark is unrealistic. I was thinking that the game and resolution were chosen for a near peak gaming power draw.
 
Joined
Jul 9, 2015
Messages
3,429 (0.98/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
How come a 40%-something perf boost is not a disappointment after the promised "2-4 times faster"?

3080Ti vs 2080Ti was a 56% bump, mind you.

How is the 4080 supposed to compete, given that it is a heavily cut-down version of the 4090?
 
Joined
Apr 30, 2020
Messages
1,051 (0.60/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Cappellx
Memory Corsair Vengeance pro RGB 3200mhz 32Gbs
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western digital Sata 6.0 SDD 500gb + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Cosair K55 Pro RGB
Funnily enough, for all the talk of DX12 decreasing CPU bottlenecks, the one game that doesn't seem CPU limited at 1440p is The Witcher 3.

I also excluded all the games from TPU's test suite that are clearly CPU limited and got somewhat better speedups for the 4090: 53% and 73% over the 3090 Ti and the 3090, respectively, at 4K. The games I excluded are listed below, with a sketch of the recalculation after the list:

  • Battlefield V
  • Borderlands 3
  • Civilization VI
  • Divinity Original Sin II
  • Elden Ring
  • F1 22
  • Far Cry 6
  • Forza Horizon 5
  • Guardians of the Galaxy
  • Halo Infinite
  • Hitman 3
  • Watch Dogs Legion
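
For anyone redoing this with their own exclusion list, here is a minimal sketch of the recalculation in Python. The per-game numbers are placeholders, not TPU's actual results, and I'm using a geometric mean of per-game ratios, which may differ from TPU's own averaging:

```python
# Hypothetical 4K results; substitute TPU's real per-game FPS numbers.
results = {
    "Cyberpunk 2077": {"RTX 4090": 72.0, "RTX 3090 Ti": 45.0, "RTX 3090": 40.0},
    "Control":        {"RTX 4090": 98.0, "RTX 3090 Ti": 64.0, "RTX 3090": 57.0},
    "Elden Ring":     {"RTX 4090": 59.0, "RTX 3090 Ti": 58.0, "RTX 3090": 57.0},
}
cpu_limited = {"Elden Ring"}  # games to exclude from the average

def geomean(xs):
    prod = 1.0
    for x in xs:
        prod *= x
    return prod ** (1.0 / len(xs))

for old in ("RTX 3090 Ti", "RTX 3090"):
    ratios = [fps["RTX 4090"] / fps[old]
              for name, fps in results.items() if name not in cpu_limited]
    print(f"4090 vs {old}: +{(geomean(ratios) - 1) * 100:.0f}% (CPU-limited games excluded)")
```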

According to PCGamingWiki's "List of DirectX 12 games", there are only 240 DX12 games vs 3,099 DX11 games.
Ray tracing is even smaller: per PCGamingWiki's "List of games that support ray tracing", only 141 games support it.
 
Joined
May 11, 2018
Messages
1,375 (0.56/day)
How come a 40%-something perf boost is not a disappointment after the promised "2-4 times faster"?

3080Ti vs 2080Ti was a 56% bump, mind you.

How is the 4080 supposed to compete, given that it is a heavily cut-down version of the 4090?

I think we'll have a Turing situation all over again. Higher price increase than performance increase.

In 2018 it was justified by RTX: ray tracing and DLSS. It took quite a long time for both technologies to become widely adopted, though, and most people never enjoyed ray tracing on the RTX 2080 - it was just too slow.

Now they'll say we have a revolution in frame doubling with DLSS 3.0. It increases latency and leaves clear artifacts for all to see with moving GUI elements and such, but it's your fault if you notice them - you should be playing the game, not looking for artifacts!
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
28,148 (3.72/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
"2-4 times faster"?
Base performance increase: +50%
Turn on DLSS Super Resolution: +50%-+100% depending on setting
Turn on DLSS Frame Generation: +100%
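
Treating those as multiplicative factors, a back-of-envelope sketch (which scenarios NVIDIA actually stacked for its "2-4x" slide is an assumption here):

```python
base = 1.5                   # +50% base performance
sr_low, sr_high = 1.5, 2.0   # DLSS Super Resolution, depending on setting
fg = 2.0                     # DLSS Frame Generation

print(f"base only:      {base:.2f}x")
print(f"base + SR:      {base * sr_low:.2f}x .. {base * sr_high:.2f}x")
print(f"base + SR + FG: {base * sr_low * fg:.2f}x .. {base * sr_high * fg:.2f}x")
```

So the "2x" end is roughly base plus Super Resolution, and anything beyond that needs Frame Generation stacked on top, measured against a baseline with neither enabled.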
 
Joined
May 11, 2018
Messages
1,375 (0.56/day)
I think Hardware Unboxed has done a good job of showing the visual artefacts of DLSS 3.0:


It's all hard to detect in many cases, but some things are very apparent. Moving GUI elements are really hard for frame generation to predict, so they are heavily garbled in AI-generated frames - which causes them to flicker.

Could this be repaired? Well, the game could render the GUI elements after frame generation, but that would require a completely different approach - and more involvement from developers.
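
A minimal sketch of that reordering (all function names here are hypothetical; real engines and the DLSS 3 pipeline are far more involved):

```python
# Today: the interpolator only sees finished frames with the HUD baked in,
# so HUD pixels get warped along with everything else.
def present_today(prev_frame, next_frame, interpolate):
    return interpolate(prev_frame, next_frame)   # HUD garbled in the output

# Proposed: interpolate scene-only buffers, then composite the HUD onto every
# frame, including generated ones. The engine must keep the HUD out of the
# frames handed to the interpolator - hence "more involvement from developers".
def present_hud_last(prev_scene, next_scene, hud, interpolate, composite):
    mid_scene = interpolate(prev_scene, next_scene)
    return composite(mid_scene, hud)             # HUD stays crisp
```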
 
Joined
Jul 15, 2020
Messages
1,036 (0.62/day)
System Name Dirt Sheep | Silent Sheep
Processor i5-2400 | 13900K (-0.02mV offset)
Motherboard Asus P8H67-M LE | Gigabyte AERO Z690-G, bios F29e Intel baseline
Cooling Scythe Katana Type 1 | Noctua NH-U12A chromax.black
Memory G-skill 2*8GB DDR3 | Corsair Vengeance 4*32GB DDR5 5200Mhz C40 @4000MHz
Video Card(s) Gigabyte 970GTX Mini | NV 1080TI FE (cap at 50%, 800mV)
Storage 2*SN850 1TB, 230S 4TB, 840EVO 128GB, WD green 2TB HDD, IronWolf 6TB, 2*HC550 18TB in RAID1
Display(s) LG 21` FHD W2261VP | Lenovo 27` 4K Qreator 27
Case Thermaltake V3 Black|Define 7 Solid, stock 3*14 fans+ 2*12 front&buttom+ out 1*8 (on expansion slot)
Audio Device(s) Beyerdynamic DT 990 (or the screen speakers when I'm too lazy)
Power Supply Enermax Pro82+ 525W | Corsair RM650x (2021)
Mouse Logitech Master 3
Keyboard Roccat Isku FX
VR HMD Nop.
Software WIN 10 | WIN 11
Benchmark Scores CB23 SC: i5-2400=641 | i9-13900k=2325-2281 MC: i5-2400=i9 13900k SC | i9-13900k=37240-35500
How come a 40%-something perf boost is not a disappointment after the promised "2-4 times faster"?

3080Ti vs 2080Ti was a 56% bump, mind you.

How is the 4080 supposed to compete, given that it is a heavily cut-down version of the 4090?
2x-4x in a very specific game/demo.
No one expected to see those increases in every game.
This repetitive mantra about 2-4x is sooo very boring.

If you're into the bashing business then bash on, mate, but know it looks somewhat pathetic.
I very much agree that this product is quite stupid (as is any >$1000 gaming GPU, imo), but the "you promised me 2-4x" mantra is not the reason.
There are many other, real things to criticize this GPU for.
 
Joined
May 11, 2018
Messages
1,375 (0.56/day)
But it is silly.

Even if you take into account that Nvidia chose the best-case scenario - it's pure bullshit when you notice the claim holds true (even in that very specific scenario) only because they compared it to a non-DLSS result. As if we haven't had DLSS for four years now.

It's not "pathetic" to call bullshit on such practices.
 
Joined
Feb 20, 2017
Messages
92 (0.03/day)
Location
Wild Coast, South Africa
When 150 watts was serious business... and for 2 PCBs :roll:

[attached image: image_2022-10-14_113650200.png]
 
Joined
Jul 15, 2020
Messages
1,036 (0.62/day)
System Name Dirt Sheep | Silent Sheep
Processor i5-2400 | 13900K (-0.02mV offset)
Motherboard Asus P8H67-M LE | Gigabyte AERO Z690-G, bios F29e Intel baseline
Cooling Scythe Katana Type 1 | Noctua NH-U12A chromax.black
Memory G-skill 2*8GB DDR3 | Corsair Vengeance 4*32GB DDR5 5200Mhz C40 @4000MHz
Video Card(s) Gigabyte 970GTX Mini | NV 1080TI FE (cap at 50%, 800mV)
Storage 2*SN850 1TB, 230S 4TB, 840EVO 128GB, WD green 2TB HDD, IronWolf 6TB, 2*HC550 18TB in RAID1
Display(s) LG 21` FHD W2261VP | Lenovo 27` 4K Qreator 27
Case Thermaltake V3 Black|Define 7 Solid, stock 3*14 fans+ 2*12 front&buttom+ out 1*8 (on expansion slot)
Audio Device(s) Beyerdynamic DT 990 (or the screen speakers when I'm too lazy)
Power Supply Enermax Pro82+ 525W | Corsair RM650x (2021)
Mouse Logitech Master 3
Keyboard Roccat Isku FX
VR HMD Nop.
Software WIN 10 | WIN 11
Benchmark Scores CB23 SC: i5-2400=641 | i9-13900k=2325-2281 MC: i5-2400=i9 13900k SC | i9-13900k=37240-35500
But it is silly.

Even if you take into account that Nvidia chose the best-case scenario - it's pure bullshit when you notice the claim holds true (even in that very specific scenario) only because they compared it to a non-DLSS result. As if we haven't had DLSS for four years now.

It's not "pathetic" to call bullshit on such practices.

I wrote:
"If you're into the bashing business then bash on, mate, but know it looks somewhat pathetic."
"Somewhat pathetically" naive, if you will, to think that you'd get "2-4x".
If NV was misleading in its presentation about the best-case-scenario "2-4x", then OK, fight them to the moon and I will join you, but as you said - it is a very much valid graph, no matter how sugar-coated it is.
So, to be led by an NV PR presentation about "2-4x" and to believe/expect that you will see that improvement in day-to-day usage, while knowing their way of doing business, is either (somewhat pathetically) naive or just a "somewhat pathetic" way of bashing, because it is not actual, proper bullshit.

What I will bash on (if I'm into that sort of practice, which I'm not): price, total power consumption, physical size and weight, and DLSS3 being practical only for very high-end usage (>120 FPS on 240 Hz screens) while increasing input lag, and so on.

Ranting about a point that doesn't really exist takes all the air from the other valid rants you (and others) might make.
 
Joined
Apr 1, 2017
Messages
420 (0.15/day)
System Name The Cum Blaster
Processor R9 5900x
Motherboard Gigabyte X470 Aorus Gaming 7 Wifi
Cooling Alphacool Eisbaer LT360
Memory 4x8GB Crucial Ballistix @ 3800C16
Video Card(s) 7900 XTX Nitro+
Storage Lots
Display(s) 4k60hz, 4k144hz
Case Obsidian 750D Airflow Edition
Power Supply EVGA SuperNOVA G3 750W
Base performance increase: +50%
Turn on DLSS Super Resolution: +50%-+100% depending on setting
Turn on DLSS Frame Generation: +100%
no way!... but what if you enable Deep Learning Super Sample Super Sample on, you know, the 3000 series gpu?
 
Joined
Jan 18, 2021
Messages
225 (0.15/day)
Processor Core i7-12700
Motherboard MSI B660 MAG Mortar
Cooling Noctua NH-D15
Memory G.Skill Ripjaws V 64GB (4x16) DDR4-3600 CL16 @ 3466 MT/s
Video Card(s) AMD RX 6800
Storage Too many to list, lol
Display(s) Gigabyte M27Q
Case Fractal Design Define R5
Power Supply Corsair RM750x
Mouse Too many to list, lol
Keyboard Keychron low profile
Software Fedora, Mint
"Somewhat pathetically" naive, if you will, to think that you get "2-4x".
If NV was misleading in its presentation about best case scenario "2-4x" than OK fight them to the moon and I will join but as you said- this is a very much valid graph, no matter how much sugar-coated it is.

We all expect first-party benchmark results to be sugar-coated. That NVIDIA would inflate its performance numbers in marketing is unremarkable, but in this case the inflation is especially dishonest, because the extra frames generated by DLSS 3.0 don't give you at least half the benefit of frames generated by other means (i.e. reduced input latency).

In retrospect, I think I was too kind earlier in discussing this feature. A commenter on the Techspot article I linked described DLSS 3.0 as a "motion fidelity" feature, rather than a performance boost, and that seems like the most sensible way to look at it. Imagine a feature that increased perceived smoothness without adding real frames. That's what DLSS 3.0 does, in effect. It's purely a visual enhancement, though one that comes with a trade-off to picture quality.

(One of the more interesting, and I think damning, passages in the Techspot article observes that DLSS 2.0 in Performance mode gave the same FPS and picture quality as DLSS 2.0 in Quality mode when combined with DLSS 3.0 in a particular game/scenario, and thus DLSS 3.0 was worse than pointless in that scenario, increasing latency in return for zero benefit.)

I think DLSS 3.0 is an impressive invention, and in time it could prove to be useful, but it isn't remotely comparable to extra GPU horsepower.

EDIT: Also I think it's somewhat annoying that NVIDIA chose to label its AI-frame-generation tech as "DLSS 3.0," implying that it's in some way not only linked to DLSS 2.0, but superior to it. In fact, the two features have basically nothing to do with one another. DLSS 2.0 increases real frame rate by rendering the scene in a lower resolution and then ingeniously scaling it up to look like you're running in native. (And in some cases, DLSS 2.0 can actually enhance the image, which is a neat trick.) DLSS 3.0 is a fancy interpolation technology that increases perceived motion smoothness. You can choose to enable one or both; they operate independently.

DLSS 2.0 will remain vastly more useful to the average gamer long after DLSS 3.0 proliferates to the masses. Vastly vastly more useful; it isn't a contest.
 
Joined
Jul 15, 2020
Messages
1,036 (0.62/day)
System Name Dirt Sheep | Silent Sheep
Processor i5-2400 | 13900K (-0.02mV offset)
Motherboard Asus P8H67-M LE | Gigabyte AERO Z690-G, bios F29e Intel baseline
Cooling Scythe Katana Type 1 | Noctua NH-U12A chromax.black
Memory G-skill 2*8GB DDR3 | Corsair Vengeance 4*32GB DDR5 5200Mhz C40 @4000MHz
Video Card(s) Gigabyte 970GTX Mini | NV 1080TI FE (cap at 50%, 800mV)
Storage 2*SN850 1TB, 230S 4TB, 840EVO 128GB, WD green 2TB HDD, IronWolf 6TB, 2*HC550 18TB in RAID1
Display(s) LG 21` FHD W2261VP | Lenovo 27` 4K Qreator 27
Case Thermaltake V3 Black|Define 7 Solid, stock 3*14 fans+ 2*12 front&buttom+ out 1*8 (on expansion slot)
Audio Device(s) Beyerdynamic DT 990 (or the screen speakers when I'm too lazy)
Power Supply Enermax Pro82+ 525W | Corsair RM650x (2021)
Mouse Logitech Master 3
Keyboard Roccat Isku FX
VR HMD Nop.
Software WIN 10 | WIN 11
Benchmark Scores CB23 SC: i5-2400=641 | i9-13900k=2325-2281 MC: i5-2400=i9 13900k SC | i9-13900k=37240-35500
We all expect first-party benchmark results to be sugar-coated. That NVIDIA would inflate its performance numbers in marketing is unremarkable, but in this case the inflation is especially dishonest, because the extra frames generated by DLSS 3.0 don't give you at least half the benefit of frames generated by other means (i.e. reduced input latency).

In retrospect, I think I was too kind earlier in discussing this feature. A commenter on the Techspot article I linked described DLSS 3.0 as a "motion fidelity" feature, rather than a performance boost, and that seems like the most sensible way to look at it. Imagine a feature that increased perceived smoothness without adding real frames. That's what DLSS 3.0 does, in effect. It's purely a visual enhancement, though one that comes with a trade-off to picture quality.

(One of the more interesting, and I think damning, passages in the Techspot article observes that DLSS 2.0 in Performance mode gave the same FPS and picture quality as DLSS 2.0 in Quality mode when combined with DLSS 3.0 in a particular game/scenario, and thus DLSS 3.0 was worse than pointless in that scenario, increasing latency in return for zero benefit.)

I think DLSS 3.0 is an impressive invention, and in time it could prove to be useful, but it isn't remotely comparable to extra GPU horsepower.

EDIT: Also I think it's somewhat annoying that NVIDIA chose to label its AI-frame-generation tech as "DLSS 3.0," implying that it's in some way not only linked to DLSS 2.0, but superior to it. In fact, the two features have basically nothing to do with one another. DLSS 2.0 increases real frame rate by rendering the scene in a lower resolution and then ingeniously scaling it up to look like you're running in native. (And in some cases, DLSS 2.0 can actually enhance the image, which is a neat trick.) DLSS 3.0 is a fancy interpolation technology that increases perceived motion smoothness. You can choose to enable one or both; they operate independently.

DLSS 2.0 will remain vastly more useful to the average gamer long after DLSS 3.0 proliferates to the masses. Vastly vastly more useful; it isn't a contest.
Agreed.
Unlike DLSS2/FSR2, DLSS3 is a gimmick feature, much like RTX, and the pros are considerably smaller than the cons.
In time, though, NV can improve DLSS3 and, except maybe for the increased input-lag problem, deal with all the visual artifacts.
I'm sure DLSS3.4 will be much better, just as DLSS2.4 is now.
 
Joined
Jan 18, 2021
Messages
225 (0.15/day)
Processor Core i7-12700
Motherboard MSI B660 MAG Mortar
Cooling Noctua NH-D15
Memory G.Skill Ripjaws V 64GB (4x16) DDR4-3600 CL16 @ 3466 MT/s
Video Card(s) AMD RX 6800
Storage Too many to list, lol
Display(s) Gigabyte M27Q
Case Fractal Design Define R5
Power Supply Corsair RM750x
Mouse Too many to list, lol
Keyboard Keychron low profile
Software Fedora, Mint
Agreed.
Unlike DLSS2/FSR2, DLSS3 is a gimmick feature, much like RTX, and the pros are considerably smaller than the cons.

In time, though, NV can improve DLSS3 and, except maybe for the increased input-lag problem, deal with all the visual artifacts.
I'm sure DLSS3.4 will be much better, just as DLSS2.4 is now.

As I understand it, the latency problem can't ever really be "fixed." NVIDIA might reduce the latency cost of enabling the feature (relative to not enabling it), but the extra frames generated by DLSS 3.0 will always be "fake," or dumb if you prefer, with respect to the player's inputs. That's just the nature of interpolation.
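
A back-of-envelope for the unavoidable part, assuming the interpolator must hold back frame N until frame N+1 exists (a simplification of the real pipeline, but the direction is right):

```python
# A generated frame between real frames N and N+1 can only be shown after
# N+1 has been rendered, so display lags by roughly one real frame time.
for real_fps in (30, 60, 120):
    frame_time_ms = 1000 / real_fps
    print(f"{real_fps:3d} real FPS -> at least ~{frame_time_ms:.1f} ms added latency")
```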

Image quality might be improved, though, and hopefully at some point NVIDIA will find a way to allow V-Sync and framerate caps with DLSS 3.0 on. The Digital Foundry guy actually already forced V-Sync to work in certain situations, which is encouraging.

But again there are certain fundamental limitations here that can't be eliminated--e.g. frames generated by DLSS 3.0 will always be pointless above the maximum refresh rate of your monitor, and DLSS 3.0's picture quality will always be worse at lower FPS, because a longer delay between "real" frames requires the AI to guess more in constructing the mid-point image between them. (And if the frames are displayed longer, obviously, the human eye is more likely to notice errors.) Thus, DLSS 3.0 will tend to skew against the very people you'd expect to want it most (i.e. those without high-refresh monitors or strong rendering hardware, or on the other end of the spectrum, competitive gamers who want stratospheric FPS to improve latency).
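
Rough numbers for that last point, assuming one generated frame between each pair of real frames:

```python
# The lower the real frame rate, the longer the gap the AI must guess across,
# and the longer each generated frame stays on screen for the eye to inspect.
for real_fps in (30, 60, 120):
    gap_ms = 1000 / real_fps
    shown_ms = gap_ms / 2
    print(f"{real_fps:3d} real FPS: {gap_ms:5.1f} ms gap to guess across, "
          f"generated frame visible for ~{shown_ms:4.1f} ms")
```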

I'm sure there's room for improvement, but it's a niche feature now and I doubt that niche will ever dramatically expand. And even if DLSS 3.0 were 100% perfected, it still wouldn't be analogous to adding extra FPS in the traditional way. DLSS 2.0 and its analogues, on the other hand, are and will continue to be extremely useful to huge swathes of the user base, precisely because they do provide real performance boosts analogous to traditional frame rate improvements.

EDIT: lol, it looks like I misread your post. I thought you said NVIDIA might improve the input lag. My fault, man.
 
Joined
Jan 20, 2019
Messages
1,631 (0.74/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
Agreed.
Unlike DLSS2/FSR2, DLSS3 is a gimmick feature, much like RTX, and the pros are considerably smaller than the cons.

In time, though, NV can improve DLSS3 and, except maybe for the increased input-lag problem, deal with all the visual artifacts.
I'm sure DLSS3.4 will be much better, just as DLSS2.4 is now.

After seeing a couple of reviews from tech-tubers, I'm of the same opinion. Perhaps nV is rattled, with AMD trailing closely behind and Intel now enlisted, with hopefully stronger competition ahead. Rather than offering raw performance at a reasonable cost, it appears the king of the hill decided to milk the hill further with the "perception" of wider performance gains. I wonder if it's desperate times, or just another nV swindling strategy to rob the less informed/insensible rich?
 
Joined
Jul 9, 2015
Messages
3,429 (0.98/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Base performance increase: +50%
Turn on DLSS Super Resolution: +50%-+100% depending on setting
Turn on DLSS Frame Generation: +100%
Oh, so that is how it works.
My TV can do "frame generation".
I guess that's another 200% on top, if base FPS is low enough.

The "2-4x" mantra

If you are fine with "2-4 times" claims with regard to a card that, per TPU's tests, is about 45% faster, that's cool, I guess.
But it's fairly pathetic in my book.



Even if one zooms in on what was claimed:

"in games, it's 2 times"
 
Joined
Apr 30, 2020
Messages
1,051 (0.60/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Cappellx
Memory Corsair Vengeance pro RGB 3200mhz 32Gbs
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western digital Sata 6.0 SDD 500gb + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Cosair K55 Pro RGB
So, W1zzard?
Tom's Hardware has a link comparing two RTX 3090 Tis in SLI vs an RTX 4090 running DLSS in Microsoft Flight Simulator.
W1zzard, any plans later on to compare two RTX 3090 Tis in SLI/mGPU vs one RTX 4090?

Most likely going to be a no, because you focus on triple-A games that everyone has to be playing, for a mass audience. It makes the forum here feel like a gaming forum more than a PC enthusiast forum.

I do have a list of supposedly mGPU games; they need confirmation, but it might help.
RTX 4090 Beats Two RTX 3090s in SLI — Barely | Tom's Hardware (tomshardware.com)

1. Why would you use DLSS on SLI and then complain about the second card not being loaded enough? :confused:

2. Can someone tell me why or how the hell they got Cyberpunk 2077 supporting SLI or mGPU, when CD Projekt stated they had no plans to implement it? :confused:
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
28,148 (3.72/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Joined
Jul 9, 2015
Messages
3,429 (0.98/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
2x-4x in a very specific game/demo.
I've come across where I got that from - ResetEra:

RTX 4080
  • Starts at $900 for 12GB G6X, $1200 for 16GB G6X
  • 2-4x faster than 3080 Ti
  • Launching in November

RTX 4090
  • $1600
  • 24GB G6X
  • 2-4x faster than 3090 Ti
  • Launching on October 12th

  • Ada Lovelace is 2x faster in rasterization and 4x faster in ray-tracing compared to Ampere
  • Ada Lovelace GPUs are significantly more energy efficient compared to Ampere


So, reality is quite far from it, ain't it? (Oh, I mean besides the pricing, although in DE, AIBs want 25% on top... I guess I know why EVGA quit.)
 
Joined
Apr 30, 2020
Messages
1,051 (0.60/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Cappellx
Memory Corsair Vengeance pro RGB 3200mhz 32Gbs
Video Card(s) 2x Dell RTX 2080 Ti in S.L.I
Storage Western digital Sata 6.0 SDD 500gb + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Cosair K55 Pro RGB
MSFS is a CPU-limited POS. The only way to get higher FPS is by using frame doubling in DLSS.
You mean it's single-threaded even on DX12. Stop calling it CPU limited when it barely even uses the CPU.
How about calling it CPU limited when a CPU like the 5600X or 12400K is at 80% load, and going to something like a 5900X or a 12700K increases frame rates and keeps a decent load on the CPU?
It's a crappily built engine that wasn't properly designed for multi-threading on DX12.
 
Joined
Apr 16, 2019
Messages
632 (0.30/day)
You mean it's single-threaded even on DX12. Stop calling it CPU limited when it barely even uses the CPU.
How about calling it CPU limited when a CPU like the 5600X or 12400K is at 80% load, and going to something like a 5900X or a 12700K increases frame rates and keeps a decent load on the CPU?
It's a crappily built engine that wasn't properly designed for multi-threading on DX12.
Single-thread limited is still, well, CPU limited, pal...
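
A quick illustration of why overall CPU usage can look low while the game is fully CPU-bound (the core count is an arbitrary assumption):

```python
# One saturated render thread on a many-core CPU barely moves the aggregate
# usage figure, yet it still caps the frame rate.
cores = 12
saturated_threads = 1
aggregate_pct = saturated_threads / cores * 100
print(f"Task manager shows ~{aggregate_pct:.0f}% total CPU usage, "
      f"but the render thread is pegged at 100% and limits FPS")
```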
 
Joined
Jul 15, 2020
Messages
1,036 (0.62/day)
System Name Dirt Sheep | Silent Sheep
Processor i5-2400 | 13900K (-0.02mV offset)
Motherboard Asus P8H67-M LE | Gigabyte AERO Z690-G, bios F29e Intel baseline
Cooling Scythe Katana Type 1 | Noctua NH-U12A chromax.black
Memory G-skill 2*8GB DDR3 | Corsair Vengeance 4*32GB DDR5 5200Mhz C40 @4000MHz
Video Card(s) Gigabyte 970GTX Mini | NV 1080TI FE (cap at 50%, 800mV)
Storage 2*SN850 1TB, 230S 4TB, 840EVO 128GB, WD green 2TB HDD, IronWolf 6TB, 2*HC550 18TB in RAID1
Display(s) LG 21` FHD W2261VP | Lenovo 27` 4K Qreator 27
Case Thermaltake V3 Black|Define 7 Solid, stock 3*14 fans+ 2*12 front&buttom+ out 1*8 (on expansion slot)
Audio Device(s) Beyerdynamic DT 990 (or the screen speakers when I'm too lazy)
Power Supply Enermax Pro82+ 525W | Corsair RM650x (2021)
Mouse Logitech Master 3
Keyboard Roccat Isku FX
VR HMD Nop.
Software WIN 10 | WIN 11
Benchmark Scores CB23 SC: i5-2400=641 | i9-13900k=2325-2281 MC: i5-2400=i9 13900k SC | i9-13900k=37240-35500
I've come across where I got that from - ResetEra:

RTX 4080
  • Starts at $900 for 12GB G6X, $1200 for 16GB G6X
  • 2-4x faster than 3080 Ti
  • Launching in November

RTX 4090
  • $1600
  • 24GB G6X
  • 2-4x faster than 3090 Ti
  • Launching on October 12th

  • Ada Lovelace is 2x faster in rasterization and 4x faster in ray-tracing compared to Ampere
  • Ada Lovelace GPUs are significantly more energy efficient compared to Ampere


So, reality is quite far from it, ain't it? (Oh, I mean besides the pricing, although in DE, AIBs want 25% on top... I guess I know why EVGA quit.)
Yes, reality is very different from a sugar-coated PR presentation (yet they are right according to the specific details of the graph/test). What of it? Something new?
 
Joined
Aug 10, 2021
Messages
166 (0.13/day)
System Name Main
Processor 5900X
Motherboard Asrock 570X Taichi
Memory 32GB
Video Card(s) 6800XT
Display(s) Odyssey C49G95T - 5120 x 1440
Anyone know how DLSS 3.0 does in "slower gameplay" games that are CPU limited?
Like late-game Stellaris or other Paradox map games, Cities: Skylines, and similar games.
 
Joined
Jun 6, 2022
Messages
622 (0.64/day)
People pushing the narrative to be amazed at a generic generational performance leap and to be grateful for higher prices is really funny :)
It is the beast of the moment, and no one is forcing you to buy it. You want pure performance at 4K? You buy it... if you can afford it. You want to play decently at 8K? Buy it. Play WoT at 1080p? Don't buy it!
For content creation, this price is really low. It takes two 3090 Tis to beat it in this segment.
 