
New Leak Reveals NVIDIA RTX 5080 Is Slower Than RTX 4090

Yes, 40% of the card is completely unused. When I updated my 5080 and started Microsoft Flight Simulator, the loading screen showed 4,000 fps, and during gameplay I'm between 200 and 300 fps on Ultra at 67% utilization and a stable 60 °C.
Why would they do that? What's the point?
 
Edit: The only thing that matters is your card vs your needs vs the money you spent on it.
Exactly this.

You mocked it as an e-penis contest?
That's the way it came off to me as well. Then you followed with...
Cool. I brought the guillotine.
...this and...
I reverse-engineered it.
I patched it.
And I unleashed it.
...this.

So really?

Why would they do that? What's the point?
They're not, that user just seems to be yanking chains.
 
They're not, that user just seems to be yanking chains.
Yanking chains, with full proof of a top-99% GPU in the world. Please don't hurt yourself eating crayons.
 


Were these independent tests conducted by people other than yourself? Yes, conducted by NVIDIA today at noon via remote. How's that? Did I meet all your qualifications? Observed by the VP of GPU software and three people from engineering. How about that? Does that satisfy you? Oh wait, you don't matter, because they were.

Hey, check out that A1 revision; must be an engineer with a maxed-out card. Wow. Who would have ever believed it.
 

You claim to be a tech expert and yet can't use the reply button properly? Sorry, not buying your act or your "facts".
 
Now that is an interesting thread! Plot twist and a comedy!

I have a 5080 as well. How are your 3DMark scores? Port Royal? Or Speed Way? You'll be in the Hall of Fame for sure!
 
That's interesting. I'm wondering if there's ever gonna be anything official coming from Nvidia about this. An artificial lock on a GPU just will not do. Why would they do that?

So what does that lock do? Does it limit clocks, power consumption, etc? Or is it a feature lock?
LOL, do you actually believe him?

Let me lay out the case. PassMark's GPU tests are completely CPU-bound. For example, in the DX9/10/11/12 tests my 4090 was chilling at 30 to 60%. So he probably has a 9800X3D. The only test that actually stretches the GPU is Compute, and in that one I scored over 30k vs his 23k. That's a 30% difference, btw. So yeah, he hasn't "unlocked" 100% of his 5080's brainpower.
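
A quick sanity check on those Compute numbers (scores rounded, purely illustrative):

    # Rough percent difference between the two PassMark Compute scores quoted above
    mine, his = 30_000, 23_000
    print(f"{(mine - his) / his:.0%}")   # -> 30%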
 
LOL, do you actually believe him?
You dare to defy The Chainbreaker?
 
LOL, do you actually believe him?
If I believed him, would I ask for clarification? ;)

I'm not here to believe. I'm here to learn.

An artificial driver limit to disable parts of a perfectly working chip doesn't make any sense from any standpoint, imo, but if he can support his claim with actual evidence, I'm willing to listen.
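
And the evidence wouldn't even have to be fancy; a utilization log captured during a benchmark run would already settle the "40% of the chip is idle" part. A minimal sketch, assuming nvidia-smi is on the PATH (file name, sample count and the queried fields are just my picks):

    # Log GPU utilization, SM clock and power draw once per second for ~5 minutes.
    import csv
    import subprocess
    import time

    rows = []
    for _ in range(300):
        out = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=timestamp,utilization.gpu,clocks.sm,power.draw",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        rows.append(out.split(", "))
        time.sleep(1)

    with open("gpu_log.csv", "w", newline="") as f:
        csv.writer(f).writerows(rows)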
 
An artificial driver limit to disable parts of a perfectly working chip doesn't make any sense from any standpoint, imo, but if he can support his claim with actual evidence, I'm willing to listen.

Even back in the Fermi days, when there was little hardware security, you could effectively convert a GTX 480 into some sort of franken 1.5 GB "Quadro 6000" just by changing the soft straps configuration. Of course, it was still a GTX 480 in every regard (core count, memory, ECC did not work, etc.), but it allowed you to install the Quadro driver, at least.

What NV does do to segment GeForce from their professional cards is selectively disable optimizations that target certain professional suites, and disable certain esoteric features like 30-bit color in SDR. The pro-viz optimizations that target things like SPECviewperf, the Autodesk suites, CATIA, etc. were famously enabled specifically on the Titan X Pascal, Xp, V and RTX, with all other professional features still disabled. NVIDIA did this as an answer to the Vega Frontier Edition, which initially explicitly supported both Radeon Pro Software and Adrenalin. Nowadays this still works, but it's a registry leftover and has to be toggled manually by the user.

The Vega FE likewise didn't really enable everything the WX 9100 supported: stereoscopic 3D, ECC, deep color SDR, genlock etc. are all disabled and hidden. If you flash a WX 9100 BIOS onto that GPU, all of these features are restored and fully functional, since the core is exactly identical and so is the HBM memory used, with the exception of genlock, because the Vega FE board physically doesn't have the sync connector. The only other catch is that the WX 9100 has 6 mDP while the Vega FE has 3 DP + 1 HDMI, so the HDMI port gets knocked out, DPs 1-3 get detected as the first three ports, and there's no way to connect anything to ports 4, 5 and 6, which physically don't exist on the FE board. Since AMD bailed out of the "prosumer" deal with the Radeon VII, NV just released the 3090 as a pure gaming card, buried the Titan line and kept it that way until now. The RTX 5090 is... a purebred gaming card. No extra features extended to it.

IF, and only IF, this dude is telling the tiniest bit of truth, what he came across is likely the lock on the pro-viz optimizations, which, to the best of my knowledge, do not affect Cinebench, but should show significant gains in the SPECviewperf benchmarks.

 
IF, and only IF, this dude is telling the tiniest bit of truth, what he came across is likely the lock on the pro-viz optimizations, which, to the best of my knowledge, do not affect Cinebench, but should show significant gains in the SPECviewperf benchmarks.

Could be... but he's claiming that 40% of the chip on the 5080 is running idle during load, which I find bonkers. Why build a large and expensive chip only to disable half of it (which is fully operational otherwise) by software so that half of the internet community would hate it for being overpriced and stagnant compared to last gen? Why not just let it run wild and obliterate the competition? Or why not design a much smaller and cheaper chip for higher profit margins? It's like Bugatti releasing their newest supercar with a 16-cylinder engine, 6 of which are disabled, which makes it slower than last gen because of... ehm... reasons. :kookoo:
 
That's why I question him, the RTX 5080 is a fully enabled GB203 chip. There is nothing to unlock in it. The only cards that aren't a full die are the 5070 Ti and the 5090.
 
That's why I question him, the RTX 5080 is a fully enabled GB203 chip. There is nothing to unlock in it. The only cards that aren't a full die are the 5070 Ti and the 5090.
He's claiming that it's a physically fully enabled chip, with parts only disabled by the driver, which makes even less sense to me.
 
lol, always funny to see late night drunkposting.
And from a new user who seems to have created an account just to crap-post.

He's claiming that it's a physically fully enabled chip, with parts only disabled by the driver, which makes even less sense to me.
I'm not buying it either. It is technically possible and with all of NVidia's shenanigans I'm not willing to completely rule it out, but this kind of thing needs more evidence than just "Hey look what I can do!" kinds of claims.
 
Even back in the Fermi days, when there was little hardware security, you could effectively convert a GTX 480 into some sort of franken 1.5 GB "Quadro 6000" just by changing the soft straps configuration.
Or unlock the GTX 465 into GTX 470. Likewise the HD 6950 into HD 6970. And even unlock Phenom II CPUs. Those were the DAYS!
The only cards that aren't a full die are the 5070 Ti and the 5090.
And the 5070.
Given the poor price-to-performance ratio of the 5070 compared to 9070/9070 XT, there's a good chance that next year we'll see the full-die 5070 Super replacing the 5070 at $550 MSRP.
 
He's claiming that it's a physically fully enabled chip, with parts only disabled by the driver, which makes even less sense to me.
This fella was a clear-as-day troll from post #1 onwards, but I liked reading your investigation there :)

It's the internet. The baseline response I have when someone says anything that isn't common sense is 'yeah, whatever'. That turns out to be the correct response in 99.99% of the exchanges you have on this medium. One only needs a brief look at social media discourse to get proof of that.
 
It's the internet. The baseline response I have when someone says anything that isn't common sense is 'yeah, whatever'. That turns out to be the correct response in 99.99% of the exchanges you have on this medium. One only needs a brief look at social media discourse to get proof of that.
"Innocent until proven guilty", aka. "not a troll until proven to be one" is my motto here. Objective, scientific proof always decides whether you're one or not. :)

I'm not buying it either. It is technically possible and with all of NVidia's shenanigans I'm not willing to completely rule it out, but this kind of thing needs more evidence than just "Hey look what I can do!" kinds of claims.
1. I can't imagine any motivating factor to disable parts on a fully working chip by software and make it a worse product than it could be.
2. If we assume that the 5080 works with 40% of its parts disabled by default, that means the chip is capable of performing 40% better than the 4080 Super with a similar number of components running at similar clock speeds, or it has 12% higher power consumption while using 40% fewer components. Neither of these is possible (rough numbers in the sketch below).

I'm still willing to listen to anyone who wants to prove these points wrong simply on the basis of the above.
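
A back-of-the-envelope check on point 2, using the public shader counts (treating the 5080 as merely matching the 4080 Super, which is generous to the claim):

    # If only 60% of the 5080's shaders were active, matching a full 4080 Super
    # at similar clocks would require each active unit to be ~59% faster.
    cores_5080 = 10752          # full GB203
    cores_4080_super = 10240    # full AD103
    active = cores_5080 * 0.60  # the "40% disabled" claim -> ~6451 active cores

    implied_per_core_gain = cores_4080_super / active - 1
    print(f"{implied_per_core_gain:.0%}")   # -> 59%, far beyond any Ada-to-Blackwell uplift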
 
"Innocent until proven guilty", aka. "not a troll until proven to be one" is my motto here. Objective, scientific proof always decides whether you're one or not. :)
Yeah, I tried that for a few years, but concluded I ain't got time for that, and it's exactly what the current discourse around disinformation is all about: flooding the sane with so much bullshit that there's just not enough time to sift through it all.

I ain't fallin' for that trap anymore. I like to use history as the biggest teacher. Everything we see has been done before and got its reality check before. Miracles don't really happen anymore. Boring, but true. It's a bit like an adblocker: the blacklist keeps growing, and the internet keeps getting better that way. Less is more.
 
Yeah, I tried that for a few years, but concluded I ain't got time for that, and it's exactly what the current discourse around disinformation is all about: flooding the sane with so much bullshit that there's just not enough time to sift through it all.
I always give the benefit of the doubt, but any benefit went out of the window when he used PassMark. Then he provided some CBR24 numbers and yeah, those weren't good for a "40% extra unlocked performance" 5080, since it was a lot slower than my 4090. We'd have seen a game in 4K by now if there was anything to it.
 
Yeah, I tried that for a few years, but concluded I ain't got time for that, and it's exactly what the current discourse around disinformation is all about: flooding the sane with so much bullshit that there's just not enough time to sift through it all.
That's how the internet dies - by idiots flooding it with misinformation, and decent people not having the time to sift through it all. We don't even need AI to make it happen.

For a while, I used to think that TPU was different, that this was a place for people who really understand tech, but I have to admit sadly that it isn't.

I ain't fallin' for that trap anymore. I like to use history as the biggest teacher. Everything we see has been done before and got its reality check before. Miracles don't really happen anymore. Boring, but true. It's a bit like an adblocker: the blacklist keeps growing, and the internet keeps getting better that way. Less is more.
Very true.
 
For a while, I used to think that TPU was different, that this was a place for people who really understand tech, but I have to admit sadly that it isn't.
TPU is different. Plenty of people here work as stabilizers, and there's no algorithm that makes the idiocy run wild and escalates it further. Does it weed out the nonsense? No. But it certainly helps a lot.

I always give the benefit of the doubt, but any benefit went out of the window when he used PassMark. Then he provided some CBR24 numbers and yeah, those weren't good for a "40% extra unlocked performance" 5080, since it was a lot slower than my 4090. We'd have seen a game in 4K by now if there was anything to it.
I already disconnected at the clear and obvious keyboard heroism in post #1.
 