
New Leak Reveals NVIDIA RTX 5080 Is Slower Than RTX 4090

Joined
Mar 28, 2020
Messages
1,794 (1.00/day)
I would think the RTX 5080 will struggle to produce any meaningful performance uplift in games compared to the RTX 4080 Super. Consider that the RTX 5090 needs a roughly 30% bump in hardware specs (CUDA cores and memory bandwidth), plus a ~30% power increase, to get close to a 30% average rasterization gain; the spec difference between the RTX 5080 and 4080 is much smaller. So without multi frame generation, this is more like an RTX 4080 Ti.
 
Joined
Oct 19, 2022
Messages
365 (0.42/day)
Location
Los Angeles, CA
Processor AMD Ryzen 7 9800X3D (+PBO 5.4GHz)
Motherboard MSI MPG X870E Carbon Wifi
Cooling ARCTIC Liquid Freezer II 280 A-RGB
Memory 2x32GB (64GB) G.Skill Trident Z Royal @ 6200MHz 1:1 (30-38-38-30)
Video Card(s) MSI GeForce RTX 4090 SUPRIM Liquid X
Storage Crucial T705 4TB (PCIe 5.0) w/ Heatsink + Samsung 990 PRO 2TB (PCIe 4.0) w/ Heatsink
Display(s) AORUS FO32U2P 4K QD-OLED 240Hz (DP 2.1 UHBR20 80Gbps)
Case CoolerMaster H500M (Mesh)
Audio Device(s) AKG N90Q w/ AudioQuest DragonFly Red (USB DAC)
Power Supply Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1
Mouse Logitech G PRO X SUPERLIGHT
Keyboard Razer BlackWidow V3 Pro
Software Windows 10 64-bit
The driver overhead IS de facto a CPU bottleneck...
The driver overhead causes the "CPU bottleneck", but it's not that the CPU itself isn't powerful or fast enough and is therefore the bottleneck. It's Nvidia's fault if their architectures and drivers don't scale well at low resolutions. AMD doesn't have that issue.

I would think the RTX 5080 will struggle to produce any meaningful performance uplift in games compared to the RTX 4080 Super. Consider that the RTX 5090 needs a roughly 30% bump in hardware specs (CUDA cores and memory bandwidth), plus a ~30% power increase, to get close to a 30% average rasterization gain; the spec difference between the RTX 5080 and 4080 is much smaller. So without multi frame generation, this is more like an RTX 4080 Ti.
Yup definitely. Nvidia cheaped out a lot this generation. At those prices & performance we should have had a TSMC 3nm node at least!
 
Joined
Jan 6, 2014
Messages
600 (0.15/day)
Location
Germany
System Name Main Machine
Processor Intel i9-14900K
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Water cooling, 2x EK-DDC 3.2 PWM, 1x360mm+1x420mm EK, Mora 420 Pro, Thermal Grizzly Mycro Pro DD
Memory G.SKILL 32GB DDR5-8000
Video Card(s) MSI 4090 Suprim Liquid with Water Block
Storage 2x WD_BLACK SN850X 2TB und 4TB, 2x8TB Seagate Ironwolf
Display(s) ASUS ROG Swift PG27AQDM OLED, 240Hz
Case ASUS GR701 Hyperion
Audio Device(s) Turtle Beach Stealth Pro
Power Supply Seasonic Prime TX-1600 1600W ATX 3.0
Mouse Logitech G903 LIGHTSPEED Wireless
Keyboard SteelSeries Apex Pro
Software Win 11
Obviously Mr. Leather Jacket wanted to fool us by saying that the 5070 is as fast as my 4090. :wtf:
Maybe some people fell for it, but I didn't believe it when he said that at CES. :p
 
Joined
Feb 18, 2012
Messages
50 (0.01/day)
System Name Skynet
Processor Core i7 14700K
Motherboard ASUS TUF Gaming Z790-PLUS WIFI
Cooling ARCTIC Liquid Freezer III 420
Memory DDR5 Kingston 32GB
Video Card(s) RTX 4080
Storage SSD Nvme 1TB
Display(s) LG 45GR95QE-B
Case Lian Li III
Audio Device(s) Sound Blaster Recon3D Fatal1ty Champion
Power Supply ROG Strix 1000W
Mouse Logitech G502 Hero
Keyboard Corsair RGB Strafe
Software Windows 11 Pro
It's a kind of tick-tock:

Tick - the RTX 2000 series was good
Tock - the RTX 3000 series was not so good
Tick - the RTX 4000 series is so much better
Tock - the RTX 5000 series... well, at least they're trying

I've been in the computer segment a lot of years (since around 1995 at least) and I can say for sure that most of the time the performance gain from each generation is more or less 25%; maybe a very few gens have improved a bit more than that, but nehh...

I still remember my first MSI FX5200 and my ASUS ATI 9600 XT; now, with more than 600 W just for the GPU... well, things do not look very good hehehe
 
Joined
Mar 29, 2023
Messages
1,288 (1.82/day)
Processor Ryzen 7800x3d
Motherboard Asus B650e-F Strix
Cooling Corsair H150i Pro
Memory Gskill 32gb 6000 mhz cl30
Video Card(s) RTX 4090 Gaming OC
Storage Samsung 980 pro 2tb, Samsung 860 evo 500gb, Samsung 850 evo 1tb, Samsung 860 evo 4tb
Display(s) Acer XB321HK
Case Coolermaster Cosmos 2
Audio Device(s) Creative SB X-Fi 5.1 Pro + Logitech Z560
Power Supply Corsair AX1200i
Mouse Logitech G700s
Keyboard Logitech G710+
Software Win10 pro
The driver overhead causes the "CPU bottleneck", but it's not that the CPU itself isn't powerful or fast enough and is therefore the bottleneck. It's Nvidia's fault if their architectures and drivers don't scale well at low resolutions. AMD doesn't have that issue.


Yup definitely. Nvidia cheaped out a lot this generation. At those prices & performance we should have had a TSMC 3nm node at least!

AMD doesn't have that issue because they have dedicated hardware to handle much of it, whereas Nvidia does it all in software. Unless Nvidia changes that (they won't), they will always have more driver overhead than AMD.

Bro, I think you are either confused or trolling me, one or the other. Or let me know if you are just looking at outliers; I'm looking again at "relative performance". Makes sense so far? Also, do you understand what the word "parity" means? Here's the link I'm using, but if you are looking at some other review then let me know.


The 5090 is 10% faster at 1080p than a 4090 - not EQUAL. Okay, moving on.
The 5090 is 17% faster at 1440p than a 4090 - not EQUAL. Okay, moving on.
The 5090 is 26% faster at 4K than a 4090 - not EQUAL. That's all the resolutions they tested. Hope you're following so far.
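The arithmetic behind those percentages is just an FPS ratio; a quick sketch of it (the FPS numbers below are illustrative placeholders, not TPU's exact averages):

```python
# "Percent faster" is just the ratio of two average-FPS numbers, minus one.
# The FPS figures below are illustrative placeholders, not TPU's exact averages.
def percent_faster(new_fps: float, old_fps: float) -> float:
    return (new_fps / old_fps - 1.0) * 100.0

for res, fps_5090, fps_4090 in [("1080p", 110.0, 100.0),
                                ("1440p", 117.0, 100.0),
                                ("4K", 126.0, 100.0)]:
    print(f"{res}: 5090 is {percent_faster(fps_5090, fps_4090):.0f}% faster")
```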

If you're saying that, say, 10% at 1080p is parity, well: 1) that's wrong, because that's not what parity means; 2) realistically, a comparison between different resolutions is meaningless anyway - each is just another data point. Most gamers play at native resolution, high refresh. Just because this card scales better at 4K doesn't mean anyone playing at 1440p or below won't or shouldn't buy it.

Now explain, without using big-boy words, what exactly you are trying to convince me of?

Dude still hasn't understood why comparing below 4K is pointless...

It's a kind of tick-tock:

Tick - the RTX 2000 series was good
Tock - the RTX 3000 series was not so good
Tick - the RTX 4000 series is so much better
Tock - the RTX 5000 series... well, at least they're trying

I've been in the computer segment a lot of years (since around 1995 at least) and I can say for sure that most of the time the performance gain from each generation is more or less 25%; maybe a very few gens have improved a bit more than that, but nehh...

I still remember my first MSI FX5200 and my ASUS ATI 9600 XT; now, with more than 600 W just for the GPU... well, things do not look very good hehehe

Lol, what... 2000 was good, and 3000 not so good? Tell me you don't know what the hell you are talking about without telling me you don't know what the hell you are talking about...
 
Joined
Sep 17, 2014
Messages
23,428 (6.13/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
It's a kind of tick-tock:

Tick - the RTX 2000 series was good
Tock - the RTX 3000 series was not so good
Tick - the RTX 4000 series is so much better
Tock - the RTX 5000 series... well, at least they're trying

I've been in the computer segment a lot of years (since around 1995 at least) and I can say for sure that most of the time the performance gain from each generation is more or less 25%; maybe a very few gens have improved a bit more than that, but nehh...

I still remember my first MSI FX5200 and my ASUS ATI 9600 XT; now, with more than 600 W just for the GPU... well, things do not look very good hehehe
The 2000 series was a pile of nothing, with RT you couldn't properly use and an overpriced 2080 & 2080 Ti... What's good about that? Perhaps the only saving grace was GTX 1080 performance at $350 with the 2060. Minus 2 GB, so a planned-obsolescence card nonetheless.
 

Hxx

Joined
Dec 5, 2013
Messages
350 (0.09/day)
Dude still hasn't understood why comparing below 4K is pointless...
Ah, duh, I forgot no one owns 1440p/1080p high-refresh displays nowadays, my bad. In fact, if I were you I'd just email TPU to exclude those benches from their review; I mean, you seem convinced, so why should they waste their time?
 
Joined
Mar 29, 2023
Messages
1,288 (1.82/day)
Ah, duh, I forgot no one owns 1440p high-refresh displays nowadays. If I were you I'd just email TPU to exclude those benches from their review; I mean, you seem convinced, so why should they waste their time?

GPUs should be tested at high res, just like CPUs should be tested at low res - can your immense intellect come to the conclusion why that might be?
 

Hxx

Joined
Dec 5, 2013
Messages
350 (0.09/day)
GPUs should be tested at high res, just like CPUs should be tested at low res - can your immense intellect come to the conclusion why that might be?
Well, I appreciate the compliment, thanks!! What you think "should" be done doesn't make it reality; pretty cool concept, right?

Second mental exercise for you: what's high res? And if it's 4K, then why are there gains at 1440p? Double-digit gains, I might add. But you seem convinced this test shouldn't be done, so I dunno, curious to get your thoughts here.

Keep in mind that even at 4K, TPU's bench suite holds some older games, and they just run into walls that aren't GPU walls. That's why those numbers are to be taken with a grain of salt, and it's also part of the reason why you see bigger gaps in tests elsewhere - a smaller games suite of more recent titles. I'm just looking at shaders right now, and there's no way I'm seeing the 5080 bridge a nearly 6k-shader gap with clocks.

On TPU's testing, we will see the 5090 extend its lead as the bench suite gets newer titles over time.
I wonder what displays TPU is using. 4K is probably 240 Hz, but what about 1440p/1080p? What's the max nowadays in the consumer space - 500+ Hz? I wonder how big the delta would be at lower res, high refresh, between the cards, especially in those pro titles.
 
Joined
Dec 17, 2024
Messages
116 (1.51/day)
Location
CO
System Name Zen 3 Daily Rig
Processor AMD Ryzen 9 5900X with Optimus Foundation block
Motherboard ASUS Crosshair VIII Dark Hero
Cooling Hardware Labs 360GTX and 360GTS custom loop, Aquacomputer HighFlow NEXT, Aquacomputer Octo
Memory G.Skill Trident Z Neo 32GB DDR4-3600 (@ 3733 CL14)
Video Card(s) Nvidia RTX 3080 Ti Founders Edition with Alphacool Eisblock
Storage x2 Samsung 970 Evo Plus 2TB, Crucial MX500 1TB
Display(s) LG 42" C4 OLED
Case Lian Li O11 Dynamic
Power Supply be Quiet! Straight Power 12 1500W
Mouse Corsair Scimitar RGB Elite Wireless
Keyboard Keychron Q1 Pro
Software Windows 11 Pro
It's a kind of tick-tock:

Tick - the RTX 2000 series was good
Tock - the RTX 3000 series was not so good
Tick - the RTX 4000 series is so much better
Tock - the RTX 5000 series... well, at least they're trying

I've been in the computer segment a lot of years (since around 1995 at least) and I can say for sure that most of the time the performance gain from each generation is more or less 25%; maybe a very few gens have improved a bit more than that, but nehh...

I still remember my first MSI FX5200 and my ASUS ATI 9600 XT; now, with more than 600 W just for the GPU... well, things do not look very good hehehe
Lol... the RTX 20-series was good? I don't think so. It had marginal uplift vs Pascal, giant chips due to no gain in transistor density going to a half-node of 16 nm, and it cost more with the addition of Tensor and RT cores. If anything, the RTX 50-series looks like a repeat of the 20-series.

The 30-series was on inferior tech (Samsung 8nm vs TSMC 7nm), but it was in fact a generational gain in performance over both Pascal and Turing. The only things that really sucked with the 30-series were availability, due to a multitude of factors (Ethereum mining, the pandemic, scalping, etc.), and the power draw / transient spikes.

The 40-series' pricing sucked, but it was certainly a huge improvement in efficiency over the 30-series, and the performance-per-watt improvement was crazy.
 
Joined
Mar 29, 2023
Messages
1,288 (1.82/day)
Lol... the RTX 20-series was good? I don't think so. It had marginal uplift vs Pascal, giant chips due to no gain in transistor density going to a half-node of 16 nm, and it cost more with the addition of Tensor and RT cores. If anything, the RTX 50-series looks like a repeat of the 20-series.

The 30-series was on inferior tech (Samsung 8nm vs TSMC 7nm), but it was in fact a generational gain in performance over both Pascal and Turing. The only things that really sucked with the 30-series were availability, due to a multitude of factors (Ethereum mining, the pandemic, scalping, etc.), and the power draw / transient spikes.

The 40-series' pricing sucked, but it was certainly a huge improvement in efficiency over the 30-series, and the performance-per-watt improvement was crazy.

Indeed, the 50 series looks exactly like the 20 series, which was also a SKIP gen.

Well, I appreciate the compliment, thanks!! What you think "should" be done doesn't make it reality; pretty cool concept, right?

Second mental exercise for you: what's high res? And if it's 4K, then why are there gains at 1440p? Double-digit gains, I might add. But you seem convinced this test shouldn't be done, so I dunno, curious to get your thoughts here.


I wonder what displays TPU is using. 4K is probably 240 Hz, but what about 1440p/1080p? What's the max nowadays in the consumer space - 500+ Hz? I wonder how big the delta would be at lower res, high refresh, between the cards, especially in those pro titles.

Lol /SWOOOSH

You seem about as sharp as a wooden spoon, so let me spell it out for you: at low res you are bottlenecked by CPU performance, and at high res you are bottlenecked by GPU performance. Therefore there is zero point in benchmarking CPUs at high res and GPUs at low res. The performance increase you see at high res with GPUs will also be there at lower res once you get a faster CPU, and vice versa with CPUs.
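That argument can be put as a toy model: treat the CPU and GPU as two stages, each with its own FPS ceiling, where the slower stage caps the result. A minimal sketch (deliberately simplified; the caps below are made-up numbers, not measurements):

```python
# Toy two-stage model: each frame needs CPU work (simulation, draw-call
# submission, driver) and GPU work (rendering); the slower stage sets the
# frame rate. All numbers here are invented for illustration.
def effective_fps(cpu_cap: float, gpu_cap: float) -> float:
    return min(cpu_cap, gpu_cap)

CPU_CAP = 300.0  # made-up: the CPU can prepare 300 frames/s at any resolution
for res, gpu_cap in [("1080p", 500.0), ("1440p", 350.0), ("4K", 180.0)]:
    bound = "CPU" if CPU_CAP < gpu_cap else "GPU"
    print(f"{res}: {effective_fps(CPU_CAP, gpu_cap):.0f} fps ({bound}-bound)")
```

In this sketch the CPU cap hides the GPU's headroom at 1080p and 1440p; only at 4K does the GPU become the limit, which is the reason given above for benchmarking GPUs at high res.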
 

Hxx

Joined
Dec 5, 2013
Messages
350 (0.09/day)
at low res you are bottlenecked by CPU performance
Nope, lol - at least not across the test suite used by TPU... for example:


Please don't make me explain to you what a CPU bottleneck is. Don't do it, bro.

TPU's note in the conclusion: "you could run the card at 1440p [...] the only reason why you would want to do that is if you really want the lowest latency with the highest FPS"
 
Joined
Sep 17, 2014
Messages
23,428 (6.13/day)
Location
The Washing Machine
I wonder what displays TPU is using. 4K is probably 240 Hz, but what about 1440p/1080p? What's the max nowadays in the consumer space - 500+ Hz? I wonder how big the delta would be at lower res, high refresh, between the cards, especially in those pro titles.
It's completely irrelevant? Benchmarks run at uncapped FPS... The delta is right there, in each review. Are you for real? Do you really think the monitor hard-caps the framerate?

Nope, lol - at least not across the test suite used by TPU... for example:


Please don't make me explain to you what a CPU bottleneck is. Don't do it, bro.

TPU's note in the conclusion: "you could run the card at 1440p [...] the only reason why you would want to do that is if you really want the lowest latency with the highest FPS"
Wha....??
OK, this confirms you really have zero clue whatsoever. Do explain what a CPU bottleneck is; this might be fun...
 
Joined
Mar 29, 2023
Messages
1,288 (1.82/day)
Nope, lol - at least not across the test suite used by TPU... for example:


Man, it would help if you understood sarcasm, or that writing in caps doesn't make it any better, lol. Or if you at least read the review, wouldn't it? Too bad you didn't; you're just wasting our time.

TPU's note: "you could run the card at 1440p [...] the only reason why you would want to do that is if you really want the lowest latency with the highest FPS"

Funny, because it seems you didn't actually read it yourself - at the very least, you haven't understood anything.

A few games aren't CPU-bottlenecked, which means the "test suite" shows a small increase at low res, but the fact is that the vast majority of games are CPU-bottlenecked at low res, making it utterly pointless to evaluate high-end GPU performance from low-res benchmarks.


But sure, continue to argue that the 5080 will be faster than the 4090... shows your amazing level of knowledge...
 

Hxx

Joined
Dec 5, 2013
Messages
350 (0.09/day)
Wha....??
OK, this confirms you really have zero clue whatsoever. Do explain what a CPU bottleneck is; this might be fun.
Sure, I'll entertain. Let's look at Counter-Strike. 1440p benches show 578 fps. At 4K, benches show 347 fps. If the 9800X3D used here bottlenecked this card, I would expect to see the same fps, or very close to 347 fps, at 1440p. How did I do?
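The test being applied here can be sketched in a few lines: a fully CPU-bound game would post (near-)identical FPS at both resolutions. The CS2 figures are the ones quoted above; the 5% tolerance is an arbitrary choice for illustration:

```python
# If the CPU were the hard ceiling, FPS would barely move between resolutions.
# The 5% tolerance is an arbitrary illustrative threshold, not a standard.
def looks_fully_cpu_bound(low_res_fps: float, high_res_fps: float,
                          tolerance: float = 0.05) -> bool:
    return abs(low_res_fps - high_res_fps) / high_res_fps <= tolerance

print(looks_fully_cpu_bound(578.0, 347.0))  # 578 vs 347 differ by ~67%: False
```

Note this only rules out a *total* CPU bottleneck; as the replies below it argue, a partial one can still shave FPS off the lower resolution.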
 
Joined
Mar 29, 2023
Messages
1,288 (1.82/day)
Sure, I'll entertain. Let's look at Counter-Strike. 1440p benches show 578 fps. At 4K, benches show 347 fps. If the 9800X3D used here bottlenecked this card, I would expect to see the same fps, or very close to 347 fps, at 1440p. How did I do?

Do you have someone you can call to help you tie your shoelaces?
 

Hxx

Joined
Dec 5, 2013
Messages
350 (0.09/day)
Do you have someone you can call to help you tie your shoelaces?
All me, bro. Better than you wearing Crocs with socks, lol.

Funny, because it seems you didn't actually read it yourself - at the very least, you haven't understood anything.

A few games aren't CPU-bottlenecked, which means the "test suite" shows a small increase at low res, but the fact is that the vast majority of games are CPU-bottlenecked at low res, making it utterly pointless to evaluate high-end GPU performance from low-res benchmarks.


But sure, continue to argue that the 5080 will be faster than the 4090... shows your amazing level of

Lol, bro, move on, it's over. Yes, bottlenecks, GPUs and CPUs are discussed here; you're good.
 
Joined
Sep 17, 2014
Messages
23,428 (6.13/day)
Location
The Washing Machine
Sure, I'll entertain. Let's look at Counter-Strike. 1440p benches show 578 fps. At 4K, benches show 347 fps. If the 9800X3D used here bottlenecked this card, I would expect to see the same fps, or very close to 347 fps, at 1440p. How did I do?
The 9800X3D could still bottleneck this GPU on both resolutions even if the FPS is different. The only issue is, you can barely test it, because there's hardly a faster CPU out there. But the fact that these GPUs (4090, 5090) end up closer together at lower resolutions indicates they are bottlenecked by something in the pipeline, and you're not seeing the full grunt the faster GPU has on offer.

I'm not here to insult you - but perhaps you have a few things to learn.

Let's look a bit longer at CS2 and do some math. 3090 vs 5090 this time?

1080p
726 / 403 = 1.80, i.e. the 5090 delivers 180% of the 3090's performance
1440p
578 / 289 = 2.00, i.e. 200%
4K
347 / 152 = 2.28, i.e. 228%

Neither of these GPUs struggles for resources in this game; they all produce immense FPS.
At the lower resolutions though, EVEN at 1440p and 4K, there is a CPU impact on the 5090, because its lead grows leaps and bounds (228% vs 180%, a 48-point swing) as the resolution rises. I bet at 8K you would see an even bigger gap, moving even more load onto the GPU and removing the CPU further as a limiting factor.

You see, a CPU bottleneck isn't just 'CPU too slow'... it loses a fraction of a second on every frame, and when frames are produced at such high frequencies, every millisecond matters and returns as lost GPU performance. In heavier titles, with lower FPS, this effect is less pronounced, because now you've got a generally higher average time to produce a frame - a lot more leeway for the CPU to prepare data for said frame.
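The millisecond point is easy to put in numbers: a fixed per-frame CPU cost shaves far more FPS off a short frame than a long one. A small sketch (the 0.5 ms overhead is a made-up figure for illustration; the 578 fps is the CS2 1440p number quoted earlier):

```python
# Frame budget in milliseconds, and what a fixed per-frame CPU cost does to FPS.
# The 0.5 ms overhead is an invented illustrative figure.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

def fps_with_overhead(fps: float, overhead_ms: float) -> float:
    return 1000.0 / (frame_time_ms(fps) + overhead_ms)

for fps in (578.0, 60.0):
    after = fps_with_overhead(fps, 0.5)
    print(f"{fps:.0f} fps ({frame_time_ms(fps):.2f} ms/frame): "
          f"+0.5 ms CPU cost -> {after:.0f} fps ({fps - after:.0f} fps lost)")
```

The same half millisecond costs well over a hundred FPS at ~1.7 ms/frame but only a couple of FPS at ~16.7 ms/frame, which is the "leeway" point above.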

 
Joined
Mar 29, 2023
Messages
1,288 (1.82/day)
All me, bro. Better than you wearing Crocs with socks, lol.



Lol, bro, move on, it's over. Yes, bottlenecks, GPUs and CPUs are discussed here; you're good.

I suppose I couldn't expect a more intelligent answer from someone claiming that the 5080 is faster than the 4090.

The 9800X3D could still bottleneck this GPU on both resolutions even if the FPS is different. The only issue is, you can barely test it, because there's hardly a faster CPU out there. But the fact that these GPUs (4090, 5090) end up closer together at lower resolutions indicates they are bottlenecked by something in the pipeline, and you're not seeing the full grunt the GPU has on offer.

I'm not here to insult you - but perhaps you have a few things to learn.

"A few things to learn." That's putting it mildly.
 

Hxx

Joined
Dec 5, 2013
Messages
350 (0.09/day)
The 9800X3D could still bottleneck this GPU on both resolutions even if the FPS is different. The only issue is, you can barely test it, because there's hardly a faster CPU out there. But the fact that these GPUs (4090, 5090) end up closer together at lower resolutions indicates they are bottlenecked by something in the pipeline, and you're not seeing the full grunt the faster GPU has on offer.

I'm not here to insult you - but perhaps you have a few things to learn.
But you are insulting me; you said I have no clue. That's an insult, FYI. But whatever; unlike the snowflake Dragam above me, I don't get up in arms about it, lmao.

I don't know if I agree with your statement - not because it's wrong, it's not, but because it applies to every single generation. You will always have this scenario of the top-dog GPU potentially being bottlenecked by future CPU releases more so than current ones, so I'm not sure this makes sense in what we are trying to debate. The fact of the matter is there are games that can be played at lower-than-4K resolutions that can benefit from a 5090 in the right conditions, and that's that; that's my point.
We can argue value, low gains, etc. Now explain that to snowflake Dragam; hopefully he gets it.
 
Joined
Sep 17, 2014
Messages
23,428 (6.13/day)
Location
The Washing Machine
But you are insulting me; you said I have no clue. That's an insult, FYI. But whatever; unlike the snowflake Dragam above me, I don't get up in arms about it, lmao.
I don't know if I agree with your statement - not because it's wrong, but because it applies to every single generation. You will always have this scenario of the top-dog GPU potentially being bottlenecked by future CPU releases, so I'm not sure this makes sense in what we are trying to debate. The fact of the matter is there are games that can be played at lower-than-4K resolutions that can benefit from a 5090 in the right conditions, and that's that; that's my point. Now explain that to snowflake Dragam; hopefully he gets it.
Look at the above example on CS2. I'm honestly trying to bring some insight to this otherwise pointless back-and-forth on 'who's right'. That's not my game; I like to educate.

At the same time I call things as I see them. What should I have said...

And yeah, sure you can use a 5090 to play at 1440p. Two or three years down the line that GPU will probably struggle at that res too :) I don't really subscribe to the idea that there is a '4K card'; there's just performance that ages. OTOH, this does not make it true that a 5080 will match a 4090 just because the numbers get close at some lower resolution, which I think was the point others were trying to make ;)
 
Joined
Oct 19, 2022
Messages
365 (0.42/day)
Location
Los Angeles, CA
Tick - the RTX 2000 series was good
Tock - the RTX 3000 series was not so good
Tick - the RTX 4000 series is so much better
Tock - the RTX 5000 series... well, at least they're trying

I've been in the computer segment a lot of years (since around 1995 at least) and I can say for sure that most of the time the performance gain from each generation is more or less 25%; maybe a very few gens have improved a bit more than that, but nehh...
The RTX 20s were not that good compared to the GTX 10s back then... they just had DLSS, which is a pretty cool feature today, but thankfully the GTX 10s can use AMD's FSR to help them too. RT cores were cool, but the RT performance was pretty terrible... Also, the real problem with the RTX 20s was their pricing: the 1080 Ti was $700, whereas the 2080 Ti was $1,000 (and even $1,200 later on), when the 2080 Ti was only 30-40% faster for 40% more money, and DLSS was really bad too.

I agree about gen-over-gen improvements though; it used to be between 30 and 50% each generation, which is exactly where the 5090 is now (at least at 4K, which is what this GPU is meant for - a 4K GPU). Sure, we've seen extremely impressive generations, like the 8800 GTX being 2x the 7900 GTX, and even the 4090 being 70-80% faster than the 3090 and up to 2x in RT/PT, but yeah, those are exceptions and should not be considered normal, especially with the death of Moore's Law!
 
Joined
Mar 29, 2023
Messages
1,288 (1.82/day)
The RTX 20s were not that good compared to the GTX 10s back then... they just had DLSS, which is a pretty cool feature today, but thankfully the GTX 10s can use AMD's FSR to help them too. RT cores were cool, but the RT performance was pretty terrible... Also, the real problem with the RTX 20s was their pricing: the 1080 Ti was $700, whereas the 2080 Ti was $1,000 (and even $1,200 later on), when the 2080 Ti was only 30-40% faster for 40% more money, and DLSS was really bad too.

I agree about gen-over-gen improvements though; it used to be between 30 and 50% each generation, which is exactly where the 5090 is now (at least at 4K, which is what this GPU is meant for - a 4K GPU). Sure, we've seen extremely impressive generations, like the 8800 GTX being 2x the 7900 GTX, and even the 4090 being 70-80% faster than the 3090 and up to 2x in RT/PT, but yeah, those are exceptions and should not be considered normal, especially with the death of Moore's Law!

I agree with all your points, though 30% is at the very low end of the scale and will (understandably) make a lot of people skip the gen :)
 

Hxx

Joined
Dec 5, 2013
Messages
350 (0.09/day)
And yeah, sure you can use a 5090 to play at 1440p. Two or three years down the line that GPU will probably struggle at that res too :) I don't really subscribe to the idea that there is a '4K card'; there's just performance that ages. OTOH, this does not make it true that a 5080 will match a 4090 just because the numbers get close at some lower resolution, which I think was the point others were trying to make ;)
We may need to agree to disagree; I follow your point though.
However, I am focused on specific cases in the present, with the currently available hardware. Two individuals with two different sets of hardware, getting different results out of both cards, can both be right at the same time even if they reach different conclusions.
 