
NVIDIA GeForce RTX 4080 Super Founders Edition

I don't need to make any argument, because it's you claiming things, not me.

Says the one who tried to make some obtuse comparison between upscaling a game and creating an AI image or doing scientific database work. They're not even remotely the same thing, and I have no idea what that tangent was supposed to prove aside from moving the goalposts.
 
@W1zzard I went back to your RTX 4080 Strix review to compare against this RTX 4080 Super Strix review. I was hoping to compare OC performance between the two, since they both have especially high max power limits. I can see that the regular 4080 seems to overclock higher, but I cannot compare OC FPS between the two reviews because the 4080 review used Unigine Heaven and the Super review uses 3DMark Time Spy GT1.

With your hands-on experience with both cards, is it safe to say that the 4080 Super is not being held back by its max power limit? The difference in overclock is at most ~120 MHz in favor of the non-Super 4080, which shouldn't do much.


It is not just 5-6 games. Stable Diffusion and similar compute software is growing popular in the enthusiast space. I see too many enthusiasts, particularly on Reddit, who claim everyone must be capable of running compute software as if their jobs depended on it. They also claim anything less than a 4090 is not worth buying because you won't get reasonable FPS at 4K with heavy ray tracing.

The truth is the vast majority of PC gamers don't use most of Nvidia's fancy features.
If I had serious compute work to do, I would be looking at Nvidia's professional products. And by that I mean my company would be paying for the hardware at work.
 
is it safe to say that the 4080 Super is not being held back by its max power limit?
Depends on your exact definition. I'd say it is not held back significantly by the power limit. I tested the FE at max power limit vs. the stock FE, and the performance gain was just +1%.
 
I know I'm supposed to root for the "underdog" AMD and tell people not to spend so much on Nvidia stuff, but I totally understand why people are flocking to buy these. 4080 Super models priced between $1,000 and $1,100 sold out instantly, and even the $1,200 models sold out soon after, all while a 7900 XTX can be bought new for $900 in the US. I tried going team red this gen and went through all sorts of dumb issues with two different 7900 XTXs. I ended up getting rid of it and getting a 4090, and I haven't had a single crash or glitch since I installed it.

Don't fall for the illusion of high demand at launch when deliberately limited quantities are at work. It's not uncommon for these kinds of marketing strategies to take shape. Limiting availability creates a sense of urgency to pull the trigger and generally builds the illusion of a high-demand, desirable product. Can you imagine the headlines, "RTX 4080 SUPER SOLD OUT WORLDWIDE," and their effect? Now add, e.g., "only 1,000 units were supplied at launch for the ~8 billion population," and all of that excitement just evaporates.

I know I'm supposed to root for the "underdog" AMD

$950 for an XTX, hardly an underdog. It deserves all the criticism it gets. At this sort of price you'd expect nothing short of perfection!
 
This right here.

When you can run RT on a midrange GPU and still get playable FPS, that is when it will be time to go all in. Once the majority of the market can afford GPUs that can use it, then we're cooking.

We are not there yet, and I personally wouldn't spend $2K to use it on the 5-6 games that currently support it.

You're eyeballing the wrong target. The issue is when the majority of devices can properly run ray tracing. That includes consoles.

We're waiting on APUs, IGPs, and SoCs that can do it reliably before it becomes common. But that's also what's really stalling out 4K right now as well.
 
Says the one who tried to make some obtuse comparison between upscaling a game and creating an AI image or doing scientific database work. They're not even remotely the same thing, and I have no idea what that tangent was supposed to prove aside from moving the goalposts.
DLSS is an AI-driven upscaling tech that mitigates its inherent design flaw by applying corrections from its AI component, which is constantly being updated. DLSS Quality is already superior to native rendering in some games (mostly thanks to poor native TAA, but still). Time flies; engineers at NVIDIA and other companies keep finding ways to optimise the code, the library, and the tensor cores themselves. This means DLSS gets better year by year, and it is not unlikely to eventually become consistently superior to native rendering.

You are just yelling at clouds, "This can't be real."

Perhaps you're right and it can't be. But you only call my logic a cesspool of flaws without contributing any explanation of your POV.

What I see is that upscalers have become decent at 1440p and good at 2160p. At the Quality preset (1440p from 960p), you are totally capable of telling native and upscaled apart in stills, and it takes some attention to tell them apart in motion. At 2160p from 1440p, even stills are identical or near identical. From that I drew the conclusion that it is reasonable to expect upscaling to be better than native a dozen or two years from now.

This "obtuse" comparison was only used to list things that were at some point considered a junkie's fantasy rather than something that makes sense. From your POV, my prediction on DLSS and upscalers in general belongs on this list, too.

when the majority of devices can properly run ray tracing. That includes consoles.
And these are AMD GPUs. That's why RT sucks as of now.
 
You're eyeballing the wrong target. The issue is when the majority of devices can properly run ray tracing. That includes consoles.

We're waiting on APUs, IGPs, and SoCs that can do it reliably before it becomes common. But that's also what's really stalling out 4K right now as well.
You are not wrong; consoles are a big part of it.
 
It's ridiculous... just like the 4070 Ti Super. Why on earth doesn't Nvidia simply reduce the price of the original versions? So far, only the 4070 Super has shown any noteworthy gains. Those were good times, when the Super suffix really made a difference, like on the RTX 2060 / 2070.
It's better for PR to say "Hey, we launched a new card at a lower price!" than it is to say they dropped the price of an existing card.

Lame as the improvements are with the 4080 Super over the regular 4080, it is still technically faster and cheaper.

AMD will need to respond with price cuts to the 7900 XTX.
 
DLSS is an AI-driven upscaling tech that mitigates its inherent design flaw by applying corrections from its AI component, which is constantly being updated. DLSS Quality is already superior to native rendering in some games (mostly thanks to poor native TAA, but still). Time flies; engineers at NVIDIA and other companies keep finding ways to optimise the code, the library, and the tensor cores themselves. This means DLSS gets better year by year, and it is not unlikely to eventually become consistently superior to native rendering.

You are just yelling at clouds, "This can't be real."

Perhaps you're right and it can't be. But you only call my logic a cesspool of flaws without contributing any explanation of your POV.

What I see is that upscalers have become decent at 1440p and good at 2160p. At the Quality preset (1440p from 960p), you are totally capable of telling native and upscaled apart in stills, and it takes some attention to tell them apart in motion. At 2160p from 1440p, even stills are identical or near identical. From that I drew the conclusion that it is reasonable to expect upscaling to be better than native a dozen or two years from now.

This "obtuse" comparison was only used to list things that were at some point considered a junkie's fantasy rather than something that makes sense. From your POV, my prediction on DLSS and upscalers in general belongs on this list, too.


And these are AMD GPUs. That's why RT sucks as of now.

There's still no equivalent comparison to be drawn between the two, so it's irrelevant.

You physically cannot create more information (more pixels) from less information (a lower resolution) in a way that is a 1:1 recreation of the native resolution. It is, and always will be, an approximation with inherent flaws (which we see as the many different artifacts that are plentiful in many games). A quick sketch of the idea is below.

It can certainly get better, but comparing it to an AI-generated image (built from a multitude of sources, not done in real time / on the fly) is an invalid comparison. Let's just excuse the reversed feet, extra fingers, and otherwise inaccurate AI images, right? Not that it's even the same thing, but generation at that level simply isn't achievable in real time.
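To illustrate the information argument concretely, here's a minimal sketch (assuming Python with Pillow and NumPy; the checkerboard is just a stand-in for pixel-level detail like hair, foliage, or fences). Downscale a frame to the rough 2/3-per-axis "Quality" ratio and upscale it back, and the original pixels are not recovered:

Code:
# Minimal sketch: downscale-then-upscale is an approximation, not a 1:1 recovery.
# Assumes Pillow + NumPy; the checkerboard stands in for fine single-pixel detail.
import numpy as np
from PIL import Image

yy, xx = np.indices((1440, 2560))
native = ((xx + yy) % 2 * 255).astype(np.uint8)  # "native" 1440p frame, pixel-level detail

img = Image.fromarray(native)
low = img.resize((1707, 960), Image.LANCZOS)   # render at ~2/3 scale per axis
up = low.resize((2560, 1440), Image.LANCZOS)   # upscale back to native resolution

diff = np.abs(native.astype(int) - np.asarray(up).astype(int))
print(f"mean per-pixel error: {diff.mean():.1f} (0 would mean perfect recovery)")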
 
$950 for an XTX, hardly an underdog. It deserves all the criticism it gets. At this sort of price you'd expect nothing short of perfection!
Reminder that the 4090 exists; if you want "perfection," buy that one for twice the money.

When you can run RT on a midrange GPU and still get playable FPS
You can; the problem is that Nvidia is trying hard to move the goalposts to path tracing, which will categorically prevent that from happening for many years to come.
 
You can; the problem is that Nvidia is trying hard to move the goalposts to path tracing, which will categorically prevent that from happening for many years to come.
True, but I mean something like Cyberpunk's RT (non-path-traced) version at native resolution, without upscaling. Path tracing is, as you put it, moving the goalposts.
 
I've been loving my 4080, and now they've lowered the price by $200 and given more performance. What a great deal. I forgot how bad AMD's RT was; this card really is the nail in the coffin for them.
 
I've been loving my 4080, and now they've lowered the price by $200 and given more performance. What a great deal. I forgot how bad AMD's RT was; this card really is the nail in the coffin for them.
It's bad RT performance in the relative sense (relative to its current competition), but in the absolute sense, the 7900 cards aren't all that bad. To say that ballpark of performance is bad would be indirectly telling RTX 3090 owners that the card they bought for "future proofing" isn't proofed now that the future is here, and that's just mean.
 
the 7900 cards aren't all that bad
Yeah, they just get absolutely washed out in path tracing by anything green at $500+ from the current gen, or a 3080 Ti.
Nvidia is trying hard to move the goalposts to path tracing, which will categorically prevent that from happening for many years to come.
How? If you mean achieving maxed-out PT on mid-range, then yes, that's not going to happen until they come up with something even more ridiculously demanding, also calculated by the same units as RT, and it has existed for at least a couple of generations. If you mean achieving a noticeable image improvement by adding RT on top, then that makes no sense; we get more and more RT performance with every new generation. Compare the 2060 to the 4060 Ti (roughly the same MSRP): the latter can do some ray tracing even in demanding titles like CP2077, whereas the former is only good for the memes.
 
True, but I mean something like Cyberpunk's RT (non-path-traced) version at native resolution, without upscaling.
But no one is running even non-path-traced RT games without upscaling on the most expensive GPUs, because RT performance is still pretty horrid regardless.
 
The RTX 4000 series is terrible. There are no RT improvements versus RTX 3000 to speak of when you compare cards with the same raster performance. The RTX 4090 is substantially worse value than the RTX 3080; it is like going back a generation by 26 percent, not forward.

The only good RTX 4000 card is the RTX 4070 Super, which is quite decent. Anyone with an older card should put the 4070 Super at the top of their list. Don't even consider the others, or wait for RTX 5000.

As for AMD, I just think they need a small price cut. The 7800 XT at $450 would be a crazy good deal: RTX 3080 performance with 16 GB. That would make it the best card ever released, imo. That's why Nvidia improved the 4070 Super so much: the 7800 XT is that strong, and still is even at $500. If there weren't so many used RTX 3080s out in the wild, the 7800 XT would sell even better. The 7900 XT at $700 is a lot faster than the 4070 Super. The XTX needs to drop to $875, imo. It would be unbeatable there.

[Attachment: rtx 3080 comparison.jpg]
 
What's more appealing, an XTX for $799 or a 4080 Super for $999?
It depends on what you are about to do with the GPU. Just gaming, or gaming plus enjoying AI?
I would just say: keep your money.
If you can survive with your current GPU right now, go on; the XTX is bad at AI and the 4080 is a GPU from 2022.
Wait for the end of the year, or the end of 2025; go travel and use your money wisely, or not, go play at a casino; it would be less of a hassle than buying old stuff at a very high price. Even a PS5 or a stupid OLED Steam Deck would be a better investment than those GPUs right now xD
 
Yeah, they just get absolutely washed out in path tracing by anything green at $500+ from the current gen, or a 3080 Ti.

How? If you mean achieving maxed-out PT on mid-range, then yes, that's not going to happen until they come up with something even more ridiculously demanding, also calculated by the same units as RT, and it has existed for at least a couple of generations. If you mean achieving a noticeable image improvement by adding RT on top, then that makes no sense; we get more and more RT performance with every new generation. Compare the 2060 to the 4060 Ti (roughly the same MSRP): the latter can do some ray tracing even in demanding titles like CP2077, whereas the former is only good for the memes.
PT is very nice (sincerely), but IMO it's more of a tech-demo feature at the moment, not something most people would really use outside of making pretty screenshots. It's only present in a handful of games and carries an unreasonably large performance penalty on current cards without laying on some thick upscaling.

Nvidia's next-gen high-end cards will do it reasonably, I expect, and there will be more games with PT by then.
 
4080 Super: 10240 cores, 2550 MHz, 1438 MHz
4080: 9728 cores, 2505 MHz, 1400 MHz
For 5% more cores, you get 1-2% more FPS at 4K settings. Something in the GPU is a bottleneck, or maybe it is what it is.
Efficiency. The 4080 has 2 GPCs with 1024 CUDA cores enabled and 5 GPCs with 1536.
Those with 1536 lose performance in any shader above 1024.
Something similar can be seen with the 2080 vs. the 2070/2080 Ti: the former had 1024, the latter 1536.
Ideally, the 4080 should have been organized entirely as 10x1024 GPCs; it could even have beaten the 4090.
Yes, it's that bad. Hopefully GB203 will fix this with fewer shaders per GPC.
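For reference, a quick back-of-the-envelope check of the spec numbers quoted above (a rough sketch that assumes FPS scales linearly with cores x clock and ignores memory bandwidth, which is exactly the assumption the measured results break):

Code:
# Back-of-the-envelope shader throughput from the specs quoted above.
cores_super, clock_super = 10240, 2550  # RTX 4080 Super (cores, boost MHz)
cores_base, clock_base = 9728, 2505     # RTX 4080

gain = (cores_super * clock_super) / (cores_base * clock_base) - 1
print(f"theoretical throughput gain: {gain:.1%}")  # ~7.2%
# Observed gain at 4K is only 1-2%, so raw shader throughput isn't the limiter.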
 
PT is very nice (sincerely), but IMO it's more of a tech-demo feature at the moment, not something most people would really use outside of making pretty screenshots. It's only present in a handful of games and carries an unreasonably large performance penalty on current cards without laying on some thick upscaling.

Nvidia's next-gen high-end cards will do it reasonably, I expect, and there will be more games with PT by then.
I mean, you can play path-traced Cyberpunk 2077 and Alan Wake 2 at 1440p with DLSS Quality on a 4070 Ti without being too annoyed. It's just shy of 60 FPS. An RTX 3080 Ti is also capable of that. It is upscaling, I agree, but it's not thick.
Heck, even a non-Super 4070 is enough if you enable DLSS Balanced. It will be a little bit ugly, but very much playable.

And that's already pretty much it. Another +100% performance per dollar (probably the case for the RTX 6000 series) and we'll have path tracing as the norm for those who can afford $400-ish GPUs.
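For context, here's a quick sketch of the internal render resolutions behind those presets, using DLSS's standard per-axis scale factors (0.667 for Quality, 0.58 for Balanced):

Code:
# Internal render resolutions behind the DLSS presets mentioned above.
# Standard per-axis scale factors: Quality = 0.667, Balanced = 0.58.
target_w, target_h = 2560, 1440
for preset, scale in [("Quality", 0.667), ("Balanced", 0.58)]:
    w, h = round(target_w * scale), round(target_h * scale)
    print(f"DLSS {preset} at 1440p renders internally at {w}x{h}")
# -> Quality: 1708x960, Balanced: 1485x835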
 
It depends on what you are about to do with the GPU. Just gaming, or gaming plus enjoying AI?
I would just say: keep your money.
If you can survive with your current GPU right now, go on; the XTX is bad at AI and the 4080 is a GPU from 2022.
Wait for the end of the year, or the end of 2025; go travel and use your money wisely, or not, go play at a casino; it would be less of a hassle than buying old stuff at a very high price. Even a PS5 or a stupid OLED Steam Deck would be a better investment than those GPUs right now xD
Just gaming. I held out with my 1070 Ti as long as possible, haha. I got a 4K monitor, so an upgrade was needed. I was able to get a new XTX for $799, which I've enjoyed a lot, but the Nvidia bells and whistles do give me a bit of FOMO.
 
I mean, you can play path-traced Cyberpunk 2077 and Alan Wake 2 at 1440p with DLSS Quality on a 4070 Ti without being too annoyed. It's just shy of 60 FPS. An RTX 3080 Ti is also capable of that. It is upscaling, I agree, but it's not thick.
Heck, even a non-Super 4070 is enough if you enable DLSS Balanced. It will be a little bit ugly, but very much playable.

And that's already pretty much it. Another +100% performance per dollar (probably the case for the RTX 6000 series) and we'll have path tracing as the norm for those who can afford $400-ish GPUs.
Apart from stopping to take pretty screenshots, why play at "just shy of 60 FPS" when you could take better advantage of your high-refresh-rate screen with PT off? A much higher frame rate is preferable for actual gameplay, IMO.
 
Reminder that the 4090 exists; if you want "perfection," buy that one for twice the money.

Eww. I wouldn't touch a 4090 with a 10-foot (20x telescopic) pole. That's not perfection; that's selling yer soul to the devil and being shafted in the bum with a hot three-pointed pitchfork :eek:

It's got to be a balance of good value, great performance, and stability. Unfortunately, the 4090 never got its booster injection to come out of the pandemic... so nah, too high a risk!

What's more appealing, an XTX for $799 or a 4080 Super for $999?

$200 less, easily the XTX!!

But that's just me... I'm not entirely enthused by all this RT/PT stuff when it's "soo" taxing and not so widely available. When we start seeing graphically intense titles comfortably hitting 120+ FPS at 4K with RT enabled, without having to spend ridiculous sums of money, I'm in!! I do like Nvidia's upscaling tech; it's simply better. But I'm an optimist... FSR will eventually see further improvements down the line. Personally, I still render at native settings even with an RTX 3080 on the go. At 1440p, it easily meets my performance/visual-quality goals.
 
Well done review. I could do without the efforts to tell me which GPU I should prefer based on your criteria. RT means absolutely nothing to me personally. $100 less, 24 GB vs. 16 GB, and better performance? Give me the cheaper option.

Better yet, just wait a couple of months for AMD's new "7900 XTX for half price" with RDNA4. ;)
 