Friday, January 24th 2025

New Leak Reveals NVIDIA RTX 5080 Is Slower Than RTX 4090

A set of newly leaked benchmarks has revealed the performance capabilities of NVIDIA's upcoming RTX 5080 GPU. Scheduled to launch alongside the RTX 5090 on January 30, the GPU was spotted on Geekbench in OpenCL and Vulkan tests, and based on the numbers, it might not earn a place among the best graphics cards. The card tested was an MSI-branded RTX 5080 in a system identified as model MS-7E62. The setup paired it with AMD's Ryzen 7 9800X3D processor, widely considered one of the best CPUs for gaming, an MSI MPG 850 Edge Ti Wi-Fi motherboard, and 32 GB of DDR5-6000 memory.

The benchmark results show the RTX 5080 scoring 261,836 points in Vulkan and 256,138 points in OpenCL. Compared to the RTX 4080, its direct predecessor, that is a 22% boost in Vulkan performance but only a modest 6.7% gain in OpenCL. Reddit user TruthPhoenixV also found that, on the Blender Open Data platform, the GPU posted a median score of 9,063.77, which is 9.4% higher than the RTX 4080 and 8.2% better than the RTX 4080 Super. Even with these improvements, the RTX 5080 might not outperform the current-generation flagship, the RTX 4090. In past generations, NVIDIA's 80-class GPUs have beaten the previous generation's 90-class cards, but these early numbers suggest that trend might not continue with the RTX 5080.
The RTX 5080 uses NVIDIA's latest Blackwell architecture, with 10,752 CUDA cores spread across 84 Streaming Multiprocessors (SMs) versus the 9,728 cores in the RTX 4080. It has 16 GB of GDDR7 memory on a 256-bit bus. NVIDIA says it can deliver 1,801 TOPS in AI performance through Tensor Cores and 171 TeraFLOPS of ray tracing performance using its RT Cores.
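As a quick sanity check on the percentages quoted above, the implied last-generation baseline scores can be back-derived from the leaked RTX 5080 results. A minimal sketch in Python; the derived baselines are estimates implied by the quoted gains, not measured figures:

```python
# Back-derive the RTX 4080 scores implied by the leaked RTX 5080 results
# and the percentage gains quoted above. The derived baselines are
# estimates implied by those percentages, not measured figures.
def implied_baseline(new_score: float, gain_pct: float) -> float:
    """Given a new score and its claimed gain over the old part,
    return the implied old score."""
    return new_score / (1 + gain_pct / 100)

vulkan_5080 = 261_836    # Geekbench Vulkan
opencl_5080 = 256_138    # Geekbench OpenCL
blender_5080 = 9_063.77  # Blender Open Data median

print(round(implied_baseline(vulkan_5080, 22)))    # implied 4080 Vulkan score
print(round(implied_baseline(opencl_5080, 6.7)))   # implied 4080 OpenCL score
print(round(implied_baseline(blender_5080, 9.4)))  # implied 4080 Blender median
```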

That said, it's important to note that these benchmark results have not been independently verified, so it's best to wait for the review embargo to lift before drawing conclusions.
Sources: DigitalTrends, TruthPhoenixV

260 Comments on New Leak Reveals NVIDIA RTX 5080 Is Slower Than RTX 4090

#226
lexluthermiester
AusWolf: Edit: The only thing that matters is your card vs your needs vs the money you spent on it.
Exactly this.
The-Architect: You mocked it as an e-penis contest?
That's the way it came off to me as well. Then you followed with...
The-Architect: Cool. I brought the guillotine.
...this and...
The-Architect: I reverse-engineered it.
I patched it.
And I unleashed it.
...this.

So really?
AusWolf: Why would they do that? What's the point?
They're not; that user just seems to be trying to yank some chains.
Posted on Reply
#227
The-Architect
lexluthermiester: Exactly this. That's the way it came off to me as well. [...] So really? They're not; that user just seems to be trying to yank some chains.
yanking at chains with full proof of top 99% GPU in the world. Please don't hurt yourself eating crayons.
Posted on Reply
#228
lexluthermiester
The-Architect: with full proof
When others can confirm your results, you may then call it proof. Until then it is only your evidence.
The-Architect: Please don't hurt yourself eating crayons.
And then you come in with that. Feel better?
Posted on Reply
#229
The-Architect
lexluthermiester: When others can confirm your results, you may then call it proof. Until then it is only your evidence.

And then you come in with that. Feel better?
MY results have been proven on 3 different machines all showing the same increase. Thanks dickhead.
Posted on Reply
#230
lexluthermiester
The-Architect: MY results have been proven on 3 different machines all showing the same increase.
Were these independent tests conducted by people other than yourself? If not, it's not proof. Sooo...
The-Architect: Thanks dickhead.
Oh another iteration! How delightful.
Posted on Reply
#231
The-Architect
lexluthermiester: Were these independent tests conducted by people other than yourself?
Yes, conducted by NVIDIA today at noon via remote. How's that? Did I meet all your qualifications? Observed by the VP of GPU software and 3 people from engineering. How about that? Does that satisfy you? Oh wait, you don't matter, because they were.
Hey check out that revision of A1, must be an engineer with a maxed out card. Wow. Who would have ever believed it.
Posted on Reply
#232
lexluthermiester
You claim to be a tech expert and yet can't use the reply button properly? Sorry, not buying your act or your "facts".
Posted on Reply
#233
remekra
Now that is an interesting thread! Plot twist and a comedy!

I have 5080 as well, how's your 3d mark scores? Port Royal? Or speedway? You will be in Hall of Fame for sure!
Posted on Reply
#234
Dr. Dro
remekra: Now that is an interesting thread! Plot twist and a comedy!
lol, always funny to see late night drunkposting.
Posted on Reply
#235
JustBenching
AusWolf: That's interesting. I'm wondering if there's ever gonna be anything official coming from Nvidia about this. An artificial lock on a GPU just will not do. Why would they do that?

So what does that lock do? Does it limit clocks, power consumption, etc? Or is it a feature lock?
LOL, do you actually believe him?

Let me shed some light on the case: Passmark's GPU tests are almost completely CPU-bound. In the DX9/10/11/12 tests, for example, my 4090 was chilling at 30 to 60% utilization, and he probably has a 9800X3D. The only test that actually stretches the GPU is Compute, and in that one I scored over 30k versus his 23k. That's a 30% difference, btw. So yeah, he hasn't "unlocked" 100% of his 5080's brainpower.
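For reference, the Compute gap cited here works out as follows; a quick sketch assuming the round figures from the post ("over 30k" vs 23k):

```python
# Percentage gap between the two Passmark Compute scores quoted above
# (rounded "over 30k" vs "23k" figures; illustrative only).
score_4090 = 30_000
score_5080 = 23_000
gap_pct = (score_4090 - score_5080) / score_5080 * 100
print(f"{gap_pct:.1f}%")  # roughly 30%, consistent with the claim
```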
Posted on Reply
#236
remekra
JustBenching: LOL, do you actually believe him?
You dare to defy The Chainbreaker?
Posted on Reply
#237
AusWolf
JustBenching: LOL, do you actually believe him?
If I believed him, would I ask for clarification? ;)

I'm not here to believe. I'm here to learn.

An artificial driver limit to disable parts of a perfectly working chip doesn't make any sense from any standpoint, imo, but if he can support his claim with actual evidence, I'm willing to listen.
Posted on Reply
#238
Dr. Dro
AusWolf: If I believed him, would I ask for clarification? ;)

I'm not here to believe. I'm here to learn.

An artificial driver limit to disable parts of a perfectly working chip doesn't make any sense from any standpoint, imo, but if he can support his claim with actual evidence, I'm willing to listen.
Even back in the Fermi days, when there was little hardware security, you could effectively convert a GTX 480 into some sort of franken 1.5GB "Quadro 6000" just by changing the soft straps configuration. Of course, it was still a GTX 480 in every regard (core count, memory; ECC did not work, etc.), but it allowed you to install the Quadro driver, at least.

What NV does do to segment GeForce from their professional cards is selectively disable optimizations that target certain professional suites, along with certain esoteric features like 30-bit color SDR. The pro-viz optimizations that target things like SPECviewperf, the Autodesk suites, CATIA, etc. were famously enabled specifically on the Titan X Pascal, Xp, V and RTX, with all other professional features disabled. NVIDIA did this as an answer to the Vega Frontier Edition, which initially and explicitly supported both Radeon Pro Software and Adrenalin. Nowadays this still works, but it's a registry leftover and has to be toggled manually by the user.

The Vega FE likewise didn't enable everything the WX 9100 supported: stereoscopic 3D, ECC, deep color SDR, genlock, etc. are all disabled and hidden. If you flash a WX 9100 BIOS onto that GPU, all of these features are restored and fully functional, since the core and the HBM memory are exactly identical; the exception is genlock, as the Vega FE board physically lacks the syncing connector. The only other catch is that the WX 9100 has 6 mDP outputs while the Vega FE has 3 DP + 1 HDMI, so the HDMI port gets knocked out and DPs 1-3 get detected as the first three ports, with no way to connect anything to ports 4, 5 and 6, which don't physically exist on the FE board. Since AMD bailed out of the "prosumer" segment with the Radeon VII, NV just released the 3090 as a pure gaming card, buried the Titan line, and kept it that way until now. The RTX 5090 is a purebred gaming card; no extra features are extended to it.

IF, and only IF, this dude is telling the tiniest bit of truth, what he came across is likely the lock on pro-viz optimizations, which, to the best of my knowledge, do not affect Cinebench; you should, however, see significant gains in the SPECviewperf benchmarks.

gwpg.spec.org/benchmarks/benchmark/specviewperf-2020-v3-1/
Posted on Reply
#239
AusWolf
Dr. Dro: IF, and only IF, this dude is telling the tiniest bit of truth, what he came across is likely the lock on pro-viz optimizations, which, to the best of my knowledge, do not affect Cinebench; you should, however, see significant gains in the SPECviewperf benchmarks.
Could be... but he's claiming that 40% of the chip on the 5080 is running idle during load, which I find bonkers. Why build a large and expensive chip only to disable half of it (which is fully operational otherwise) by software so that half of the internet community would hate it for being overpriced and stagnant compared to last gen? Why not just let it run wild and obliterate the competition? Or why not design a much smaller and cheaper chip for higher profit margins? It's like Bugatti releasing their newest supercar with a 16-cylinder engine, 6 of which are disabled, which makes it slower than last gen because of... ehm... reasons. :kookoo:
Posted on Reply
#240
Dr. Dro
That's why I question him, the RTX 5080 is a fully enabled GB203 chip. There is nothing to unlock in it. The only cards that aren't a full die are the 5070 Ti and the 5090.
Posted on Reply
#241
AusWolf
Dr. Dro: That's why I question him, the RTX 5080 is a fully enabled GB203 chip. There is nothing to unlock in it. The only cards that aren't a full die are the 5070 Ti and the 5090.
He's claiming that it's a physically fully enabled chip, with parts only disabled by the driver, which makes even less sense to me.
Posted on Reply
#242
lexluthermiester
Dr. Dro: lol, always funny to see late night drunkposting.
And from a new user who seems to have created an account just to crap-post.
AusWolf: He's claiming that it's a physically fully enabled chip, with parts only disabled by the driver, which makes even less sense to me.
I'm not buying it either. It is technically possible and with all of NVidia's shenanigans I'm not willing to completely rule it out, but this kind of thing needs more evidence than just "Hey look what I can do!" kinds of claims.
Posted on Reply
#243
Sol_Badguy
Dr. Dro: Even back in the Fermi days, when there was little hardware security, you could effectively convert a GTX 480 into some sort of franken 1.5GB "Quadro 6000" just by changing the soft straps configuration.
Or unlock the GTX 465 into GTX 470. Likewise the HD 6950 into HD 6970. And even unlock Phenom II CPUs. Those were the DAYS!
Dr. Dro: The only cards that aren't a full die are the 5070 Ti and the 5090.
And the 5070.
www.techpowerup.com/gpu-specs/nvidia-gb205.g1074
Given the poor price-to-performance ratio of the 5070 compared to 9070/9070 XT, there's a good chance that next year we'll see the full-die 5070 Super replacing the 5070 at $550 MSRP.
Posted on Reply
#244
Vayra86
AusWolf: He's claiming that it's a physically fully enabled chip, with parts only disabled by the driver, which makes even less sense to me.
This fella was a clear-as-day troll from post #1 onwards, but I liked reading your investigation there :)

It's the internet. The baseline response I have when someone says anything that isn't common sense is 'yeah, whatever'. That turns out to be the correct response in 99.99% of the exchanges you have on this medium. One needs only a brief look at social media discourse for proof of that.
Posted on Reply
#245
AusWolf
Vayra86: This fella was a clear-as-day troll from post #1 onwards, but I liked reading your investigation there :)
"Innocent until proven guilty", aka. "not a troll until proven to be one" is my motto here. Objective, scientific proof always decides whether you're one or not. :)
lexluthermiester: I'm not buying it either. It is technically possible and with all of NVidia's shenanigans I'm not willing to completely rule it out, but this kind of thing needs more evidence than just "Hey look what I can do!" kinds of claims.
1. I can't imagine any motivating factor to disable parts on a fully working chip by software and make it a worse product than it could be.
2. If we assume that the 5080 works with 40% of its parts disabled by default, that means that the chip is capable of performing 40% better than the 4080 Super with a similar number of components running at similar clock speeds, or it has a 12% higher power consumption while using 40% fewer components. Neither of these is possible.

I'm still willing to listen to anyone who wants to prove these points wrong simply on the basis of the above.
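The core counts in the article above allow a back-of-envelope version of point 2. A rough sketch, taking the 10,752-core 5080, the 9,728-core 4080, and the ~9.4% Blender lead from the article, and assuming (purely for the sake of argument) that 40% of the chip were disabled:

```python
# If 40% of the 5080's 10,752 CUDA cores were really disabled, only
# ~6,451 would be active, yet the card still leads the 9,728-core
# RTX 4080 by ~9.4% in Blender. The per-core throughput that would
# imply (at broadly similar clocks) is wildly implausible.
cores_5080 = 10_752
cores_4080 = 9_728
active_cores = cores_5080 * (1 - 0.40)
blender_lead = 1.094  # 5080 vs 4080, from the article

per_core_ratio = blender_lead / (active_cores / cores_4080)
print(round(active_cores), round(per_core_ratio, 2))
# Each active core would need to be ~65% faster than a 4080 core.
```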
Posted on Reply
#246
Vayra86
AusWolf: "Innocent until proven guilty", aka. "not a troll until proven to be one" is my motto here. Objective, scientific proof always decides whether you're one or not. :)
Yeah, I tried that for a few years but concluded I ain't got time for that, and it's exactly what the current discourse around disinformation is all about: flooding the sane with so much bullshit that there's just not enough time to sift through it all.

I ain't fallin' for that trap anymore. I like to use history as the biggest teacher: everything we see has been done before and got its reality check before. Miracles don't really happen anymore. Boring, but true. It's a bit like an adblocker; the blacklist keeps growing, and the internet keeps getting better that way. Less is more.
Posted on Reply
#247
JustBenching
Vayra86: Yeah, I tried that for a few years but concluded I ain't got time for that, and it's exactly what the current discourse around disinformation is all about: flooding the sane with so much bullshit that there's just not enough time to sift through it all.
I always give the benefit of the doubt, but any benefit went out of the window when he used Passmark. Then he provided some CBR24 numbers, and yeah, those weren't good for a "40% extra unlocked performance" 5080, since it was a lot slower than my 4090. We'd have seen a game at 4K by now if there was anything to it.
Posted on Reply
#248
AusWolf
Vayra86: Yeah, I tried that for a few years but concluded I ain't got time for that, and it's exactly what the current discourse around disinformation is all about: flooding the sane with so much bullshit that there's just not enough time to sift through it all.
That's how the internet dies - by idiots flooding it with misinformation, and decent people not having the time to sift through it all. We don't even need AI to make it happen.

For a while, I used to think that TPU was different, that this was a place for people who really understand tech, but I have to admit sadly that it isn't.
Vayra86: I ain't fallin' for that trap anymore. I like to use history as the biggest teacher: everything we see has been done before and got its reality check before. Miracles don't really happen anymore. Boring, but true. It's a bit like an adblocker; the blacklist keeps growing, and the internet keeps getting better that way. Less is more.
Very true.
Posted on Reply
#249
Vayra86
AusWolf: That's how the internet dies - by idiots flooding it with misinformation, and decent people not having the time to sift through it all. We don't even need AI to make it happen.

For a while, I used to think that TPU was different, that this was a place for people who really understand tech, but I have to admit sadly that it isn't.
TPU is different. Plenty of people here work as stabilizers, and there's no algorithm that makes the idiocy run wild and escalates it further. Does it weed out all the nonsense? No. But it certainly helps a lot.
JustBenching: I always give the benefit of the doubt, but any benefit went out of the window when he used Passmark. Then he provided some CBR24 numbers, and yeah, those weren't good for a "40% extra unlocked performance" 5080, since it was a lot slower than my 4090. We'd have seen a game at 4K by now if there was anything to it.
I already disconnected at the clear and obvious keyboard heroism in post #1.
Posted on Reply
#250
Raffles
The-Architect: Absolutely and I met with senior leadership at NVIDIA today at noon
Is your next meeting at sundown?
Posted on Reply