
NVIDIA GeForce RTX 4080 Super Founders Edition

High risk of a shortage.
 
This is what happens when AMD fails to compete.
 
Sooooo, the only noticeable difference seems to be the price... I mean, it looks like roughly a 20-watt power difference and 1% more performance. This has got to be the weirdest Super launch I have ever seen. Might as well have just swapped it out with this revision, not used the name, and dropped the price.

I mean, unless I am missing something.
 
Whenever you find yourself needing upscaling, the NV card wins. DLSS isn't the kindest to image quality, but it's still better than FSR.

I personally enjoy 4K + DLSS at both Quality and Balanced. At Performance, it's no longer ideal, but it still greatly exceeds what you can get from a 1440p display (tested a friend's 3060 Ti last week). FSR shows noticeable artifacts at Balanced; and at Performance, it loses a bit more fidelity, ending up somewhat inferior to 1440p + DLAA and on par with 1440p + TAA.

So if you only compare pure raster speed at native resolution, then yes, the 7900 XTX screams "why do you exist?" in the face of 4080-series GPUs. But when it comes to RT (I can bet there are enough people willing to turn RT on after paying a thousand for a GPU) and upscaling (imagine having such a GPU 4 years later when games become even more demanding), the 4080 is clearly ahead.

NGreedia have been in need of a reality check for a long while, but that check can't arrive because AMD need it even harder and Intel are currently too new to this market. Can't blame the latter, though. It's delusional to expect first-generation GPUs to compete with any success at the 300+ USD mark. Intel are doing a good job all things considered. AMD are not.

The 4080 outsold the 7900 XTX despite being much more expensive. The 4080 Super will outsell it even harder because it's not only an NV GPU, it's also a better value overall (the OG 4080 at 1200 USD vs. the 7900 XTX at 1000 USD was a worse value overall, but the NV fanbase is not to be underestimated).

I rate this release... "I didn't expect things to go better but they lowkey did. Hurray, I guess."

To each their own, but personally, I’m not buying $1000+ cards to degrade visuals by introducing ghosting, shimmering, texture smearing and other such upscaling artifacts that are present in both DLSS and FSR. Not to mention that when you need either FG tech to stay above 60 FPS, you’re already in the worst-case scenario, generally at 4K, where the 7900XTX has its biggest lead over the 4080.

It’d be interesting to see data covering games played with and without RT, and additionally games with RT and how many people disable or enable the features. I’d be willing to bet it's single-digit percentages of people actually utilizing RT in games. It’s astounding how poor a choice the average Joe will make based on what their needs actually are.
 
Sooooo, the only noticeable difference seems to be the price... I mean, it looks like roughly a 20-watt power difference and 1% more performance. This has got to be the weirdest Super launch I have ever seen. Might as well have just swapped it out with this revision, not used the name, and dropped the price.

I mean, unless I am missing something.
It's marketing through and through at this point: more coverage, become the talk of the town even if only for a brief period, spark debate.

All publicity is good publicity, and with the changes they made, they can only get good publicity.
 
It's pretty crazy that it has almost identical performance. I remember arguing with people on here not too long ago who were convinced it would be like 7-8% faster, lol.

The 4080S is a good card, but $1000 is still a hard sell.
This thing will never be $1000 in places where you can't buy an FE.
 
$1000+ cards to degrade visuals by introducing ghosting, shimmering, texture smearing and other such upscaling artifacts that are present in both DLSS and FSR
It's inevitable for aged GPUs. Does the 1080 Ti deliver any sensible performance at 1440p? Perhaps it does, but you can't max modern games out. The 4080 and 7900 XTX will meet the same fate in a number of years. Their usefulness will be extended by enabling DLSS/FSR, and at this point in time, DLSS is vastly superior. I don't see any way for FSR to become better than DLSS. More widely available? Could be. Better image quality and performance? I highly doubt it. And this contributes to the device's longevity, love it or hate it.
It’d be interesting to see data covering games played with and without RT, and additionally games with RT and people disabling or enabling the features.
Titles like RE4 Remake use light software RT, and neither of the two has an edge there. AMD and NV SKUs of similar price offer framerates within the margin of error of each other. AMD GPUs are slightly better, but it's still marginal.
Titles like Avatar: FoP use semi-decent hardware (or is it software? Not entirely sure) RT, and AMD are behind. The 7900 XTX is about 10 to 20 percent slower than the 4080. Not a big deal, but still a win for NV.
Titles like Alan Wake 2 and Cyberpunk 2077 are virtually unplayable with path tracing if you are rocking an AMD GPU, no matter how expensive it is. However, in Cyberpunk, even a 4070 Super offers a playable experience at 1080p and decent framerates at 1440p + DLSS Quality with PT enabled. "Vanilla" RT is not just playable, it's enjoyable on 4070-series GPUs at both 1080p and 1440p. Also feasible at 2160p + DLSS Q. With AMD, you need at least an RX 7900 XTX for a comparable (yet marginally worse) experience. In AW2, AMD GPUs are basically unplayable with RT enabled. NV GPUs suck too, but you can get away with a 4080-level GPU at 1440p + RT.
 
The 7900XTX has had $900 options for a while. Even with the MSRP drop this is an odd take; for 90-95% of the games you’re gonna play, the 7900XTX is a way better value. Even then, there are only a few notable games where RT provides real visual benefit and has a large delta in performance (CP2077).

With most of these models having minimal PL adjustments, it seems you don't lose out by getting an MSRP ("cheap") built card, so there's that.

Surprised we didn't see more cards at the ASUS Strix level of brand tax though; basically charging $250 for a higher PL and some extra VRM components.

That funky MSI all-metal card seems to be a flop too: worse thermal performance and identical FPS performance for an extra $150.
How do you figure the 7900XTX is 'way better value'? Look, I typically prefer AMD options, because they are almost always better value. In this case, the 7900XTX ($970) gives us a meager 7% better value at 4K while consuming 50 W more in gaming (which is ~17% higher). $900 7900XTX options are all exclusively open-box. The cheapest 7900XTX you can buy online in the US is currently $940, increasing the value slightly, but not enough to offset its disadvantages (DLSS/DLAA/RT) in my opinion.

I typically prefer AMD because of the value add, and I considered the $1000 7900XTX low enough to be good value against the original 4080 at $1200, as I don't consider DLSS/DLAA/RT worth a $200 increase in price. However, with the 4080 Super now at $1000, for the same performance as the original 4080, and now less than $100 off the 7900XTX's price, I think it's extremely hard to recommend a 7900XTX unless it drops below $900, at the very least.
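The perf-per-dollar math in the post above can be sketched in a few lines of Python. Note the 7900 XTX's relative 4K performance figure here is an assumption, back-solved from the quoted "7% better value at $970" against a $1000 4080 Super, not a measured number:

```python
# Perf-per-dollar sketch. XTX_PERF is an assumed figure back-solved from
# "7% better value at $970" vs. a $1000 4080 Super; not a measured number.
XTX_PERF = 1.038    # assumed 7900 XTX 4K performance relative to the 4080 Super
S4080_PERF = 1.000  # 4080 Super as the baseline

def value_edge(xtx_price, s4080_price=1000):
    """7900 XTX performance-per-dollar advantage over the 4080 Super, in percent."""
    return ((XTX_PERF / xtx_price) / (S4080_PERF / s4080_price) - 1) * 100

print(f"At $970: {value_edge(970):.1f}%")  # the post's ~7% figure
print(f"At $940: {value_edge(940):.1f}%")  # the cheapest current US listing
```

At $940 the edge only grows to roughly 10%, which is why the poster still finds it "not enough" once DLSS/DLAA/RT are weighed in.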
 
Editor's Choice, lolz. A 1% increase over the regular 4080, and it costs a grand.

This is progress?

Nvidia has got to be joking.
 
This is progress
101% of the performance at 83.(3)% of the price: +21.2% better value than a year ago. Not much, but NV didn't need to shake the market in the first place.
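For anyone double-checking the "+21.2%" figure, here is a one-function sketch using only the numbers quoted in this post (1% faster, $1000 vs. the original $1200 MSRP):

```python
# Relative performance-per-dollar of the 4080 Super vs. the launch 4080,
# using the figures quoted above (101% performance at 83.3% of the price).
def value_ratio(perf_new, price_new, perf_old, price_old):
    """How much more performance per dollar the new card delivers."""
    return (perf_new / price_new) / (perf_old / price_old)

gain = value_ratio(perf_new=1.01, price_new=1000, perf_old=1.00, price_old=1200)
print(f"{(gain - 1) * 100:+.1f}% better value")  # -> +21.2% better value
```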
 
It's inevitable for aged GPUs. Does the 1080 Ti deliver any sensible performance at 1440p? Perhaps it does, but you can't max modern games out. The 4080 and 7900 XTX will meet the same fate in a number of years. Their usefulness will be extended by enabling DLSS/FSR, and at this point in time, DLSS is vastly superior. I don't see any way for FSR to become better than DLSS. More widely available? Could be. Better image quality and performance? I highly doubt it. And this contributes to the device's longevity, love it or hate it.

Titles like RE4 Remake use light software RT, and neither of the two has an edge there. AMD and NV SKUs of similar price offer framerates within the margin of error of each other. AMD GPUs are slightly better, but it's still marginal.
Titles like Avatar: FoP use semi-decent hardware RT, and AMD are behind. The 7900 XTX is about 10 to 20 percent slower than the 4080. Not a big deal, but still a win for NV.
Titles like Alan Wake 2 and Cyberpunk 2077 are virtually unplayable with path tracing if you are rocking an AMD GPU, no matter how expensive it is. However, in Cyberpunk, even a 4070 Super offers a playable experience at 1080p and decent framerates at 1440p + DLSS Quality with PT enabled. "Vanilla" RT is not just playable, it's enjoyable on 4080-series GPUs at both 1080p and 1440p. Also feasible at 2160p + DLSS Q. With AMD, you need at least an RX 7900 XTX for a comparable (yet marginally worse) experience. In AW2, AMD GPUs are basically unplayable with RT enabled. NV GPUs suck too, but you can get away with a 4080-level GPU at 1440p + RT.

You’re completely missing the point and making it entirely about AMD vs. Nvidia. Upscaling to native vs native res is always going to look worse, and outside of a handful of RT games, a 4080 or 7900XTX is not struggling at 1080p/1440p, and largely not at 4K in most to all modern titles (excluding the small minority of worthwhile PT/RT games).

It’s an oxymoron to increase visual fidelity by enabling RT and then, at the same time, reduce said quality by degrading the image and introducing artifacts via upscaling or frame gen.

I’m not arguing that Nvidia isn’t superior when it comes to RT, more that it’s still a niche within a niche (please remember that tech enthusiasts are a niche within that niche as well). Buying a $1200-$2000 card just to play 3-4 games with RT or PT above 80 FPS at 1080-1440p is such a small and irrelevant data point. Anyone is free to choose, but despite the massive improvements Nvidia has made, including title support, it’s just not that big a selling point in my book unless you exclusively play a game like CP 2077 and that’s it.

*edit for link (it was the Revenant DLSS, FSR, XeSS scaling comparison). Look at the left building wall in the 1080p comparison, where it's smearing the brick texture and also seems to have altered the lighting and shadows on that face of the building. Even the shoulder armor looks like more of a smeary mess vs. FSR in that comparison.

 
@W1zzard

I assume the review for the card will come later, but the link for the ASUS RTX 4080 Super STRIX OC review shows a broken image link leading to a 400 (Bad Request) instead of text on every "Value and Conclusion" page:

NVIDIA GeForce RTX 4080 Super Founders Edition Review - Helping you Save $200
ASUS GeForce RTX 4080 Super TUF OC Review
Gigabyte GeForce RTX 4080 Super Gaming OC Review
PNY GeForce RTX 4080 Super Verto OC Review
ASUS GeForce RTX 4080 Super STRIX OC Review
Galax GeForce RTX 4080 Super SG Review
Zotac GeForce RTX 4080 Super AMP Extreme Airo Review
Palit GeForce RTX 4080 Super GamingPro OC Review
MSI GeForce RTX 4080 Super Expert Review

Regardless, thank you very much for taking the time and effort to review the 4070/4070 Ti/4080 Super series cards.

Edit: I did not realize there is a review for the card. The link for the card review in the "Value and Conclusion" page of the ASUS GeForce RTX 4080 Super STRIX OC is wrong too.
 
Sooooo, the only noticeable difference seems to be the price... I mean, it looks like roughly a 20-watt power difference and 1% more performance. This has got to be the weirdest Super launch I have ever seen. Might as well have just swapped it out with this revision, not used the name, and dropped the price.

I mean, unless I am missing something.
We're all scratching our heads.
 
You’re completely missing the point and making it entirely about AMD vs. Nvidia. Upscaling to native vs native res is always going to look worse, and outside of a handful of RT games, a 4080 or 7900XTX is not struggling at 1080p/1440p, and largely not at 4K in most to all modern titles (excluding the small minority of worthwhile PT/RT games).
Wrong.

A good upscaler like DLSS will increase fidelity in many games. Tarkov, for example, has much better anti-aliasing with DLSS enabled, and it's easier to see enemies amidst clutter due to the clearer image.

There are some downsides, but they're situational. For example, some devs can't code properly and there are UI issues, or scope issues, etc.


DLAA is better than native most of the time.
 
The AIB 4080S cards have a bigger performance gap vs. the FE 4080S than the FE 4080S has vs. the FE 4080. How embarrassing.
 
Upscaling to native vs native res is always going to look worse

Its an oxymoron to increase visual fidelity by enabling RT, then at the same time reduce said quality by degrading the image and introducing artifacts via upscaling or frame gen.
Only partially so. You enable RT = shadows, reflections and other light behavior now make more sense. You enable upscaling = there's a chance (currently roughly 95%) your image will get more artifacts and less fidelity in fine details. Of course the ray-traced part of the image gets scaled too, yet you still have more realism in your gameplay. At 1080p, even the DLSS experience is atrocious, but at 4K? Unless you're observing stills or the devs did something wrong with the particular implementation, the only difference you can tell is the +50% FPS count.

I won't discuss frame generation because I have no clue how feasible it is in real life. On a 60 Hz display, it's doomed to be a gimmick. On a 144 Hz one, it's theoretically awesome, but I've never had such a display to test it.
to play 3-4 games with RT or PT above 80 fps at 1080-1440p is such a small and irrelevant data point.
We had no RT games before 2018. Now it's 2024 and we have a double-digit number of them. And the list is growing, as is the number of people who want something better than SSR and, especially, baked lighting. Especially if we talk game developers. Upcoming title is RT-only? Well, damn shame to own an AMD GPU then.
Tarkov, for example, has much better anti-aliasing with DLSS enabled
UNTRUE at 1080p.
 
is it a bird, is it a plane, NO ITS SUPER PANTS!

Was I the only village idiot looking for some wider performance gains? 10%+?

A suffix showboating "SUPER" would generally imply "above and beyond" by a nicely measurable level of superiority. I don't get out of bed for 1%.

The absolute least I was expecting was SUPER PANTS beating the XTX in raw raster. FAIL!

Also let's not applaud Nvidia for lowering the price by $200. $1200 for the 4080 was actually delusional. The 4080S is a good card, but $1000 is still a hard sell.

Agreed!

You could slap on a 15% performance increase and drop the card by another $100 and I still wouldn't touch it. Same goes for the XTX. $800 is where I'd surrender, and nothing at this tolerance level is appealing enough to consider an upgrade.

I'm in a weird place... nowadays I enjoy W1zzard's reviews and the comments section more than the products themselves.
 
The 4K average should be 78 FPS. Battlefield V, CS2, Doom, and The Witcher 3 should be removed, I think; they're easy to run on most GPUs.
 
The only impressive feat of this card is the $200 price reduction. A little strange that it isn't any faster than the non-Super it replaces. Memory-bandwidth limited or ROP limited?

With the new price it's now much more attractive than the 7900XTX. The only real advantage the 7900XTX has now is that it comes with a better DisplayPort, which only makes a difference if you're using something like the Samsung G9 57", which can't get to 240 Hz on DP 1.4a (HDMI is not supported at 240 Hz). I'm not sure how much AMD can lower the cost of the 7900XTX without losing money, but it needs to be at least $100 cheaper to be an attractive purchase over this.
A few weeks ago, the PowerColor version of the reference 7900 XTX was being sold for $640 at Micro Center. So I suspect they can sell them at $600 and still make a profit.

Considering the terrible yields for the maxed-out Ada GPUs, I'm inclined to imagine there's no room to fit any additional speed into the same wattage.

The 4080 Super has a lower core frequency compared to the non-Super. Not much lower, but still. This GPU's performance is crippled by the power limit, just like it happened to the RTX 3090 non-Ti, which spent about 40% of its power budget on VRAM, and the RTX 4070 Super, which was much closer to the Ti on paper but, with a much more limited TDP, behaved more like a regular 4070.
The 4070 Super has the greatest increase amongst the Ada refresh SKUs; it is much closer to the 4070 Ti than the 4070.

 
A 1% relative performance difference vs. the regular 4080; that's not so Super...
Yeah, it's cheaper, but it's a complete joke of a refresh at the same time.
 

Only partially so. You enable RT = shadows, reflections and other light behavior now make more sense. You enable upscaling = there's a chance (currently roughly 95%) your image will get more artifacts and less fidelity in fine details. Of course the ray-traced part of the image gets scaled too, yet you still have more realism in your gameplay. At 1080p, even the DLSS experience is atrocious, but at 4K? Unless you're observing stills or the devs did something wrong with the particular implementation, the only difference you can tell is the +50% FPS count.

I won't discuss frame generation because I have no clue how feasible it is in real life. On a 60 Hz display, it's doomed to be a gimmick. On a 144 Hz one, it's theoretically awesome, but I've never had such a display to test it.

We had no RT games before 2018. Now it's 2024 and we have a double-digit number of them. And the list is growing, as is the number of people who want something better than SSR and, especially, baked lighting. Especially if we talk game developers. Upcoming title is RT-only? Well, damn shame to own an AMD GPU then.

UNTRUE at 1080p.

Several DLSS reviews on this website have shown texture degradation or straight-up missing geometry (there was a specific game reviewed here in the past 3-4 months where a truck was straight up missing a portion of its wheel hub or an extraneous metal object in front of the truck), not to mention the notes of ghosting or shimmering on a game-to-game basis. I’ll provide the link for that specific one when I have time to look for it. Tim from HWUB also has lots of good content on the subject. Again, I'm not saying DLSS isn't better, simply that all upscaling can have some pretty horrendous artifacts, to the point where you are absolutely degrading visual quality.

On the subject of FG, if you’re struggling to get 60 FPS, it’s going to be a horrible experience. So if you wanna stack upscaling artifacts on top of FG artifacts for the sake of minimal RT effects in the majority of games, that's highly questionable. You seriously have to ask yourself: in the pursuit of higher FPS for the upgraded visuals, are you doing yourself any favors by hurting latency, introducing potential ghosting, and otherwise canceling out the motion-clarity benefit of higher FPS?

The number of quality titles (subjective, yes) in which enabling RT would be beneficial to the experience is still quite small, to the point where your argument isn’t really in good faith. Just because they tack RT effects onto dozens of FPS or competitive-style games, where framerate is significantly more important, doesn’t mean having 100+ titles with “RT” makes it any less niche than it still is.
 