
New Leak Reveals NVIDIA RTX 5080 Is Slower Than RTX 4090

Joined
Jan 14, 2019
Messages
15,559 (6.79/day)
Location
Midlands, UK
System Name My second and third PCs are Intel + Nvidia
Processor AMD Ryzen 7 7800X3D @ 45 W TDP Eco Mode
Motherboard MSi Pro B650M-A Wifi
Cooling be quiet! Shadow Rock LP
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36
Video Card(s) PowerColor Reaper Radeon RX 9070 XT
Storage 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda
Display(s) Dell S3422DWG 34" 1440 UW 144 Hz
Case Corsair Crystal 280X
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply 750 W Seasonic Prime GX
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
Yes, 40% of the card is completely unused. When I updated my 5080 and started Microsoft Flight Simulator, the loading screen showed 4,000 FPS, and during gameplay I am between 200 and 300 FPS on ultra at 67% utilization with a stable 60 °C.
Why would they do that? What's the point?
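For what it's worth, claims like "67% utilization at 60 °C" are easy to log rather than take on faith. A minimal sketch, assuming an NVIDIA GPU with the `nvidia-smi` CLI on PATH (the helper names are mine; the parsing is split out so it works without a GPU):

```python
# Hypothetical helpers to sample GPU utilization and temperature so
# utilization claims can be checked. --query-gpu and --format are real
# nvidia-smi options; everything else here is an illustrative sketch.
import subprocess

def parse_sample(csv_line: str) -> tuple[int, int]:
    """Parse one 'utilization.gpu, temperature.gpu' CSV row, e.g. '67, 60'."""
    util, temp = (int(field.strip().rstrip(" %")) for field in csv_line.split(","))
    return util, temp

def read_gpu_sample() -> tuple[int, int]:
    # With csv,noheader,nounits the output is just bare numbers per GPU.
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,temperature.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    return parse_sample(out.splitlines()[0])
```

Sampling this in a loop during gameplay would show whether the card really sits at 67% under load.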
 
Joined
Jul 5, 2013
Messages
30,425 (7.06/day)
Edit: The only thing that matters is your card vs your needs vs the money you spent on it.
Exactly this.

You mocked it as an e-penis contest?
That's the way it came off to me as well. Then you followed with...
Cool. I brought the guillotine.
...this and...
I reverse-engineered it.
I patched it.
And I unleashed it.
...this.

So really?

Why would they do that? What's the point?
They're not; that user seems to be yanking chains.
 

The-Architect

New Member
Joined
Mar 24, 2025
Messages
22 (0.73/day)
Exactly this.


That's the way it came off to me as well. Then you followed with...

...this and...

...this.

So really?


They're not; that user seems to be yanking chains.
Yanking chains, with full proof of a top-99% GPU in the world. Please don't hurt yourself eating crayons.
 

Attachments

  • RTX5080_Performance_Report_The_Architect_UTF8.pdf
    2.6 KB · Views: 60

The-Architect

New Member
Joined
Mar 24, 2025
Messages
22 (0.73/day)
Were these independent tests conducted by people other than yourself? Yes, conducted by NVIDIA today at noon via remote session. How's that? Did I meet all your qualifications? Observed by the VP of GPU software and three people from engineering. How about that? Does that satisfy you? Oh wait, you don't matter, because they were.

Hey, check out that A1 revision; must be an engineer with a maxed-out card. Wow. Who would have ever believed it.
 

Attachments

  • 1000025412.jpg
    1000025412.jpg
    5.8 MB · Views: 49
Joined
Jul 5, 2013
Messages
30,425 (7.06/day)
You claim to be a tech expert and yet can't use the reply button properly? Sorry, not buying your act or your "facts".
 
Joined
Jan 19, 2023
Messages
462 (0.56/day)
Now that is an interesting thread! Plot twist and comedy!

I have a 5080 as well; how are your 3DMark scores? Port Royal? Or Speed Way? You'll be in the Hall of Fame for sure!
 
Joined
Dec 25, 2020
Messages
8,250 (5.22/day)
Location
São Paulo, Brazil
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Pichau Lunara ARGB 360 + Honeywell PTM7950
Memory 32 GB G.Skill Trident Z5 RGB @ 7600 MT/s
Video Card(s) Palit GameRock GeForce RTX 5090 32 GB
Storage 500 GB WD Black SN750 + 4x 300 GB WD VelociRaptor WD3000HLFS HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700 benchtable
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Joined
Jun 14, 2020
Messages
4,838 (2.73/day)
System Name Mean machine
Processor AMD 6900HS
Memory 2x16 GB 4800C40
Video Card(s) AMD Radeon 6700S
That's interesting. I wonder if there's ever going to be anything official from Nvidia about this. An artificial lock on a GPU just will not do. Why would they do that?

So what does that lock do? Does it limit clocks, power consumption, etc.? Or is it a feature lock?
LOL, do you actually believe him?

Let me shed some light on the case. Passmark's GPU tests are completely CPU-bound. For example, in the DX9/10/11/12 tests my 4090 was chilling at 30 to 60% utilization, so he probably has a 9800X3D. The only test that actually stretches the GPU is Compute, and in that one I scored over 30k vs his 23k; that's a 30% difference, by the way. So no, he hasn't "unlocked" 100% of his 5080's brainpower.
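The 30k-vs-23k comparison is easy to sanity-check; a quick sketch using the rounded scores quoted above:

```python
def percent_gain(higher: float, lower: float) -> float:
    """Relative advantage of `higher` over `lower`, in percent."""
    return (higher - lower) / lower * 100

# Rounded Passmark Compute scores from the post: 4090 ~30k vs the 5080's ~23k.
gain = percent_gain(30_000, 23_000)
print(f"{gain:.0f}%")  # ~30%
```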
 
Joined
Jan 19, 2023
Messages
462 (0.56/day)
LOL, do you actually believe him?

Let me shed some light on the case. Passmark's GPU tests are completely CPU-bound. For example, in the DX9/10/11/12 tests my 4090 was chilling at 30 to 60% utilization, so he probably has a 9800X3D. The only test that actually stretches the GPU is Compute, and in that one I scored over 30k vs his 23k; that's a 30% difference, by the way. So no, he hasn't "unlocked" 100% of his 5080's brainpower.
You dare to defy The Chainbreaker?
 
Joined
Jan 14, 2019
Messages
15,559 (6.79/day)
Location
Midlands, UK
System Name My second and third PCs are Intel + Nvidia
Processor AMD Ryzen 7 7800X3D @ 45 W TDP Eco Mode
Motherboard MSi Pro B650M-A Wifi
Cooling be quiet! Shadow Rock LP
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36
Video Card(s) PowerColor Reaper Radeon RX 9070 XT
Storage 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda
Display(s) Dell S3422DWG 34" 1440 UW 144 Hz
Case Corsair Crystal 280X
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply 750 W Seasonic Prime GX
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
LOL, do you actually believe him?
If I believed him, would I ask for clarification? ;)

I'm not here to believe. I'm here to learn.

An artificial driver limit to disable parts of a perfectly working chip doesn't make any sense from any standpoint, imo, but if he can support his claim with actual evidence, I'm willing to listen.
 
Joined
Dec 25, 2020
Messages
8,250 (5.22/day)
Location
São Paulo, Brazil
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Pichau Lunara ARGB 360 + Honeywell PTM7950
Memory 32 GB G.Skill Trident Z5 RGB @ 7600 MT/s
Video Card(s) Palit GameRock GeForce RTX 5090 32 GB
Storage 500 GB WD Black SN750 + 4x 300 GB WD VelociRaptor WD3000HLFS HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700 benchtable
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
If I believed him, would I ask for clarification? ;)

I'm not here to believe. I'm here to learn.

An artificial driver limit to disable parts of a perfectly working chip doesn't make any sense from any standpoint, imo, but if he can support his claim with actual evidence, I'm willing to listen.

Even back in the Fermi days, when there was little hardware security, you could effectively convert a GTX 480 into some sort of franken 1.5 GB "Quadro 6000" just by changing the soft straps configuration. Of course, it was still a GTX 480 in every regard (core count, memory; ECC did not work, etc.), but that at least allowed you to install the Quadro driver.

What NV does do to segment GeForce from their professional cards is selectively disable optimizations that target certain professional suites, along with certain esoteric features like 30-bit color SDR. The pro-viz optimizations that target things like SPECviewperf, the Autodesk suites, CATIA, etc. were famously enabled specifically on the Titan X Pascal, Xp, V and RTX, with all other professional features disabled. NVIDIA did this as an answer to the Vega Frontier Edition, which initially supported both Radeon Pro Software and Adrenalin drivers explicitly. This still works today, but it's a registry leftover and has to be toggled manually by the user.

The Vega FE likewise didn't enable everything the WX 9100 supported: stereoscopic 3D, ECC, deep color SDR, genlock, etc. are all disabled and hidden. If you flash a WX 9100 BIOS onto that GPU, all of these features are restored and fully functional, as the core and the HBM memory are exactly identical; the exception is genlock, since the Vega FE board physically lacks the syncing connector. The only other catch is that the WX 9100 has 6 mDP outputs while the Vega FE has 3 DP + 1 HDMI, so the HDMI port gets knocked out and DPs 1-3 get detected as the first three ports, with nothing to connect to ports 4, 5 and 6, which physically don't exist on the FE board. Since AMD bailed out of the "prosumer" game with the Radeon VII, NV just released the 3090 as a pure gaming card, buried the Titan line, and kept it that way until now. The RTX 5090 is... a purebred gaming card. No extra features extended to it.

IF, and only IF, this dude is telling the tiniest bit of truth, what he came across is likely the lock on pro-viz optimizations, which, to the best of my knowledge, do not affect Cinebench; you should, however, see significant gains in the SPECviewperf benchmarks.

 
Joined
Jan 14, 2019
Messages
15,559 (6.79/day)
Location
Midlands, UK
System Name My second and third PCs are Intel + Nvidia
Processor AMD Ryzen 7 7800X3D @ 45 W TDP Eco Mode
Motherboard MSi Pro B650M-A Wifi
Cooling be quiet! Shadow Rock LP
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36
Video Card(s) PowerColor Reaper Radeon RX 9070 XT
Storage 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda
Display(s) Dell S3422DWG 34" 1440 UW 144 Hz
Case Corsair Crystal 280X
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply 750 W Seasonic Prime GX
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
Even back in the Fermi days, when there was little hardware security, you could effectively convert a GTX 480 into some sort of franken 1.5 GB "Quadro 6000" just by changing the soft straps configuration. Of course, it was still a GTX 480 in every regard (core count, memory; ECC did not work, etc.), but that at least allowed you to install the Quadro driver.

What NV does do to segment GeForce from their professional cards is selectively disable optimizations that target certain professional suites, along with certain esoteric features like 30-bit color SDR. The pro-viz optimizations that target things like SPECviewperf, the Autodesk suites, CATIA, etc. were famously enabled specifically on the Titan X Pascal, Xp, V and RTX, with all other professional features disabled. NVIDIA did this as an answer to the Vega Frontier Edition, which initially supported both Radeon Pro Software and Adrenalin drivers explicitly. This still works today, but it's a registry leftover and has to be toggled manually by the user.

The Vega FE likewise didn't enable everything the WX 9100 supported: stereoscopic 3D, ECC, deep color SDR, genlock, etc. are all disabled and hidden. If you flash a WX 9100 BIOS onto that GPU, all of these features are restored and fully functional, as the core and the HBM memory are exactly identical; the exception is genlock, since the Vega FE board physically lacks the syncing connector. The only other catch is that the WX 9100 has 6 mDP outputs while the Vega FE has 3 DP + 1 HDMI, so the HDMI port gets knocked out and DPs 1-3 get detected as the first three ports, with nothing to connect to ports 4, 5 and 6, which physically don't exist on the FE board. Since AMD bailed out of the "prosumer" game with the Radeon VII, NV just released the 3090 as a pure gaming card, buried the Titan line, and kept it that way until now. The RTX 5090 is... a purebred gaming card. No extra features extended to it.

IF, and only IF, this dude is telling the tiniest bit of truth, what he came across is likely the lock on pro-viz optimizations, which, to the best of my knowledge, do not affect Cinebench; you should, however, see significant gains in the SPECviewperf benchmarks.

Could be... but he's claiming that 40% of the chip on the 5080 sits idle under load, which I find bonkers. Why build a large and expensive chip only to disable nearly half of it (which is otherwise fully operational) in software, so that half of the internet community hates it for being overpriced and stagnant compared to last gen? Why not just let it run wild and obliterate the competition? Or why not design a much smaller and cheaper chip for higher profit margins? It's like Bugatti releasing their newest supercar with a 16-cylinder engine, 6 of which are disabled, making it slower than last gen because of... ehm... reasons. :kookoo:
 
Joined
Dec 25, 2020
Messages
8,250 (5.22/day)
Location
São Paulo, Brazil
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Pichau Lunara ARGB 360 + Honeywell PTM7950
Memory 32 GB G.Skill Trident Z5 RGB @ 7600 MT/s
Video Card(s) Palit GameRock GeForce RTX 5090 32 GB
Storage 500 GB WD Black SN750 + 4x 300 GB WD VelociRaptor WD3000HLFS HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700 benchtable
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
That's why I question him: the RTX 5080 is a fully enabled GB203 chip. There is nothing to unlock in it. The only cards that aren't a full die are the 5070 Ti and the 5090.
 
Joined
Jan 14, 2019
Messages
15,559 (6.79/day)
Location
Midlands, UK
System Name My second and third PCs are Intel + Nvidia
Processor AMD Ryzen 7 7800X3D @ 45 W TDP Eco Mode
Motherboard MSi Pro B650M-A Wifi
Cooling be quiet! Shadow Rock LP
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36
Video Card(s) PowerColor Reaper Radeon RX 9070 XT
Storage 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda
Display(s) Dell S3422DWG 34" 1440 UW 144 Hz
Case Corsair Crystal 280X
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply 750 W Seasonic Prime GX
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
That's why I question him: the RTX 5080 is a fully enabled GB203 chip. There is nothing to unlock in it. The only cards that aren't a full die are the 5070 Ti and the 5090.
He's claiming that it's a physically fully enabled chip, with parts only disabled by the driver, which makes even less sense to me.
 
Joined
Jul 5, 2013
Messages
30,425 (7.06/day)
lol, always funny to see late night drunkposting.
And from a new user who seems to have created an account just to crap-post.

He's claiming that it's a physically fully enabled chip, with parts only disabled by the driver, which makes even less sense to me.
I'm not buying it either. It is technically possible and with all of NVidia's shenanigans I'm not willing to completely rule it out, but this kind of thing needs more evidence than just "Hey look what I can do!" kinds of claims.
 
Last edited:
Joined
Feb 2, 2025
Messages
72 (0.90/day)
Even back in the Fermi days, when there was little hardware security, you could effectively convert a GTX 480 into some sort of franken 1.5GB "Quadro 6000" just by changing the soft strips configuration.
Or unlock the GTX 465 into GTX 470. Likewise the HD 6950 into HD 6970. And even unlock Phenom II CPUs. Those were the DAYS!
The only cards that aren't a full die are the 5070 Ti and the 5090.
And the 5070.
Given the poor price-to-performance ratio of the 5070 compared to 9070/9070 XT, there's a good chance that next year we'll see the full-die 5070 Super replacing the 5070 at $550 MSRP.
 
Joined
Sep 17, 2014
Messages
23,801 (6.15/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
He's claiming that it's a physically fully enabled chip, with parts only disabled by the driver, which makes even less sense to me.
This fella was a clear-as-day troll from post #1 onwards, but I liked reading your investigation there :)

It's the internet. The baseline response I have when someone says anything that isn't common sense is "yeah, whatever". That turns out to be the correct response in 99.99% of the exchanges you have on this medium. One needs only a brief look at social media discourse for proof of that.
 
Joined
Jan 14, 2019
Messages
15,559 (6.79/day)
Location
Midlands, UK
System Name My second and third PCs are Intel + Nvidia
Processor AMD Ryzen 7 7800X3D @ 45 W TDP Eco Mode
Motherboard MSi Pro B650M-A Wifi
Cooling be quiet! Shadow Rock LP
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36
Video Card(s) PowerColor Reaper Radeon RX 9070 XT
Storage 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda
Display(s) Dell S3422DWG 34" 1440 UW 144 Hz
Case Corsair Crystal 280X
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply 750 W Seasonic Prime GX
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
This fella was a clear-as-day troll from post #1 onwards, but I liked reading your investigation there :)

It's the internet. The baseline response I have when someone says anything that isn't common sense is "yeah, whatever". That turns out to be the correct response in 99.99% of the exchanges you have on this medium. One needs only a brief look at social media discourse for proof of that.
"Innocent until proven guilty", aka. "not a troll until proven to be one" is my motto here. Objective, scientific proof always decides whether you're one or not. :)

I'm not buying it either. It is technically possible and with all of NVidia's shenanigans I'm not willing to completely rule it out, but this kind of thing needs more evidence than just "Hey look what I can do!" kinds of claims.
1. I can't imagine any motivating factor to disable parts on a fully working chip by software and make it a worse product than it could be.
2. If we assume that the 5080 works with 40% of its parts disabled by default, that means that the chip is capable of performing 40% better than the 4080 Super with a similar number of components running at similar clock speeds, or it has a 12% higher power consumption while using 40% fewer components. Neither of these is possible.

I'm still willing to listen to anyone who wants to prove these points wrong simply on the basis of the above.
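Point 2 can be put into rough numbers. A sanity check under the simplifying assumption that performance scales linearly with active units at fixed clocks (my framing, not the original poster's):

```python
def implied_per_unit_speedup(active_fraction: float) -> float:
    """If a chip matches a similarly sized predecessor while only
    `active_fraction` of its units are enabled, each active unit
    would have to be this many times faster than a predecessor unit."""
    return 1.0 / active_fraction

# The claim: 40% of the 5080 is disabled (60% active), yet the card roughly
# matches the 4080 Super, which has a similar unit count at similar clocks.
# That would require each active Blackwell unit to do the work of ~1.67
# Ada units, which no independent measurement supports.
print(f"{implied_per_unit_speedup(0.60):.2f}x")  # 1.67x
```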
 
Joined
Sep 17, 2014
Messages
23,801 (6.15/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
"Innocent until proven guilty", aka. "not a troll until proven to be one" is my motto here. Objective, scientific proof always decides whether you're one or not. :)
Yeah, I tried that for a few years but concluded I ain't got time for that, and it's exactly what the current discourse around disinformation is all about: flooding the sane with so much bullshit that there's just not enough time to sift through it all.

I ain't falling for that trap anymore. I like to use history as the biggest teacher: everything we see has been done before and got its reality check before. Miracles don't really happen anymore. Boring, but true. It's a bit like an adblocker; the blacklist keeps growing, and the internet keeps getting better that way. Less is more.
 
Joined
Jun 14, 2020
Messages
4,838 (2.73/day)
System Name Mean machine
Processor AMD 6900HS
Memory 2x16 GB 4800C40
Video Card(s) AMD Radeon 6700S
Yeah, I tried that for a few years but concluded I ain't got time for that, and it's exactly what the current discourse around disinformation is all about: flooding the sane with so much bullshit that there's just not enough time to sift through it all.

I ain't falling for that trap anymore. I like to use history as the biggest teacher: everything we see has been done before and got its reality check before. Miracles don't really happen anymore. Boring, but true. It's a bit like an adblocker; the blacklist keeps growing, and the internet keeps getting better that way. Less is more.
I always give the benefit of the doubt, but any benefit went out the window when he used Passmark. Then he provided some Cinebench R24 numbers, and yeah, those weren't good for a "40% extra unlocked performance" 5080, since it was a lot slower than my 4090. We'd have seen a 4K game run by now if there was anything to it.
 
Joined
Jan 14, 2019
Messages
15,559 (6.79/day)
Location
Midlands, UK
System Name My second and third PCs are Intel + Nvidia
Processor AMD Ryzen 7 7800X3D @ 45 W TDP Eco Mode
Motherboard MSi Pro B650M-A Wifi
Cooling be quiet! Shadow Rock LP
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36
Video Card(s) PowerColor Reaper Radeon RX 9070 XT
Storage 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda
Display(s) Dell S3422DWG 34" 1440 UW 144 Hz
Case Corsair Crystal 280X
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply 750 W Seasonic Prime GX
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
Yeah, I tried that for a few years but concluded I ain't got time for that, and it's exactly what the current discourse around disinformation is all about: flooding the sane with so much bullshit that there's just not enough time to sift through it all.
That's how the internet dies: idiots flooding it with misinformation, and decent people not having the time to sift through it all. We don't even need AI to make it happen.

For a while, I used to think that TPU was different, that this was a place for people who really understand tech, but I have to admit sadly that it isn't.

I ain't falling for that trap anymore. I like to use history as the biggest teacher: everything we see has been done before and got its reality check before. Miracles don't really happen anymore. Boring, but true. It's a bit like an adblocker; the blacklist keeps growing, and the internet keeps getting better that way. Less is more.
Very true.
 
Joined
Sep 17, 2014
Messages
23,801 (6.15/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
That's how the internet dies: idiots flooding it with misinformation, and decent people not having the time to sift through it all. We don't even need AI to make it happen.

For a while, I used to think that TPU was different, that this was a place for people who really understand tech, but I have to admit sadly that it isn't.
TPU is different. Plenty of people here act as stabilizers, and there's no algorithm that makes the idiocy run wild and escalates it further. Does that weed out all the nonsense? No. But it certainly helps a lot.

I always give the benefit of the doubt, but any benefit went out the window when he used Passmark. Then he provided some Cinebench R24 numbers, and yeah, those weren't good for a "40% extra unlocked performance" 5080, since it was a lot slower than my 4090. We'd have seen a 4K game run by now if there was anything to it.
I already disconnected at the clear and obvious keyboard heroism in post #1
 