
NVIDIA GeForce RTX 4080 Super Founders Edition

I disagree with this review's comments regarding the 7900XTX.
AMD's offerings this generation perform very well in rasterization, which is still the norm in 98% of games.
Rasterization performance was considered the normal way to measure a card up until ray tracing was released, and it feels like everyone pretends it's just irrelevant now.
Honestly, I am not impressed with Nvidia's RTX 4000 series. I believe ray tracing is still in its infancy and has only become as important as it is because Nvidia's marketing team does a great job of making it the new standard.
I think video cards are all overpriced currently, but I don't think AMD's price is far off from where it should be based on its performance. That said, I would love to see them lower it and maybe attack Nvidia the way they did back in the HD4870/4850 good old days!
 
Nvidia is trying so hard not to hurt their "precious" margins; Nvidia fans, on the other hand... :D
 
It's pretty crazy that it has almost identical performance. I remember arguing with people on here not too long ago who were convinced it would be like 7-8% faster, lol.


This thing will never be $1000 in places where you can't buy a FE.
Given the extra SMs, one would expect more than a 1% increase in performance, which might as well be within the margin of error. I suspect the power limit is holding it back. Comparing it to the 4080 Founders Edition review, the vanilla 4080 clocks about 2% higher in Cyberpunk than its successor. Still, I have to agree with the last couple of lines of this review:

While I'm sure there will be a lot of drama about the minimal gains, what the GPU market really needs is lower prices, not marginally better performance for the same price. In this regard, the RTX 4080 Super can be considered a success.
 
Ideally it should've been both: (a lot) higher performance and a lower price, given the 4080 was probably the worst-priced non-"flagship" card in living memory!
 
Given the extra SMs, one would expect more than a 1% increase in performance, which might as well be within the margin of error. I suspect the power limit is holding it back. Comparing it to the 4080 Founders Edition review, the vanilla 4080 clocks about 2% higher in Cyberpunk than its successor. Still, I have to agree with the last couple of lines of this review:

The Strix does seem to fare much better; probably limitations elsewhere, though it's not even a notable enough increase in actual hardware specs to warrant much additional performance.
 
I'm getting tired of graphics cards in general. I've been in the DIY scene since the ATi Radeon and 3dfx days, and I now get the feeling that at no other time in all these years has the dGPU market been so out of touch with the enthusiast PC customer. I mean, at least half of PC nerds are getting priced out of it. What used to cost 600-700 bucks now costs 2000 bucks, and what was 400 bucks now costs 800-900. And this all happened in three GPU generations. I believe the golden years of PC DIY building might be nearing their end, mainly due to the uncontrolled greed of GPU manufacturers.
 
Several DLSS reviews on this website have shown texture degradation or straight-up missing geometry (there was a specific game reviewed here in the past 3-4 months where a truck was straight-up missing a portion of its wheel hub or an extraneous metal object in front of the truck), not to mention the notes of ghosting or shimmering on a game-to-game basis. I'll provide the link for that specific one when I have time to look for it. Tim from HWUB also has lots of good content on the subject. Again, I'm not saying DLSS isn't better, simply that all upscaling can have some pretty horrendous artifacts, to the point where you are absolutely degrading visual quality.
I'm not saying it's all sunshine and pink unicorns. Of course there are a lot of cases where upscaling introduces unwanted or even immersion-breaking glitches. It was worse before, and it will get better. Algorithms are improving and game engines are being made with upscaling in mind, so we will see fewer and fewer issues with any upscaler, probably to the point where Quality (66% scale ratio) mode will always be superior to the native image. It will be one hell of a ride, but it doesn't scream utter nonsense to me.
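To put that "66% scale ratio" in pixel terms, here's a quick Python sketch; the per-axis ratios are the commonly cited figures for DLSS-style presets, so treat them as approximate:

```python
# Per-axis scale ratios commonly cited for DLSS-style quality presets
# (approximate; exact values vary by upscaler and version).
MODES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def render_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal resolution the GPU actually renders before upscaling."""
    return round(out_w * scale), round(out_h * scale)

for mode, scale in MODES.items():
    w, h = render_resolution(3840, 2160, scale)
    # The ratio is per axis, so the rendered pixel count is scale squared.
    print(f"{mode:>17}: {w}x{h} ({scale ** 2:.0%} of native 4K pixels)")
```

So "Quality" at 4K really means rendering roughly 1440p (about 44% of the pixels) and reconstructing the rest, which is why the gap to native keeps narrowing as the reconstruction improves.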
are you doing yourself any favors by hurting latency, introducing potential ghosting, and otherwise canceling out the motion clarity benefit of higher FPS?
It depends on the exact numbers. If the latency penalty is only noticeable if you're a cyber athlete and the artifacting is minimal, then it's totally worth it. If the game exhibits a 50+ ms latency penalty, then it's already a questionable move, especially if frames are generated correctly less than 95% of the time. The latter is closer to the current reality than the former. But yesterday's FG was even worse. Maybe tomorrow's FG will be closer to heaven, who knows?
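For a sense of scale on the latency side, a minimal sketch; it assumes interpolation-style frame generation has to hold back one real frame before it can insert a generated one, which is a simplification of real pipelines:

```python
# Simplified model: interpolation-style frame generation buffers the next
# real frame, so input-to-photon latency grows by roughly one render-frame
# time. Real pipelines (Reflex, queue depths, etc.) shift these numbers.
def fg_latency_penalty_ms(render_fps: float) -> float:
    return 1000.0 / render_fps  # ~one extra rendered frame held back

for fps in (30, 60, 120):
    print(f"{fps} FPS base render -> ~{fg_latency_penalty_ms(fps):.1f} ms added latency")
```

In other words, the lower your base framerate (exactly when you'd want FG most), the bigger the penalty.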
The number of quality implementations (subjective, yes) in which enabling RT would be beneficial to the experience is still quite small, to the point where your argument isn't really in good faith. Just because they tack RT effects onto dozens of FPS or competitive-style games, where FPS is significantly more important, doesn't mean having 100+ titles with "RT" makes it any less niche than it still is.
The most limiting and game-breaking factor is lackluster RT performance in hardware. It makes very limited sense to release RT games when nothing can run them. And NV, unlike AMD, is doing a far better job of improving RT performance. If AMD GPUs caught up, we'd have twice as many RT-ready GPUs and thus many more game devs interested in adding this feature. It also would've made NV less enthusiastic about planned obsolescence. Of course it's nonsensical to ray trace the snot out of games that are supposed to run at hundreds of FPS. But everyone will welcome more realistic graphics in games where graphics matter, to the point that you won't feel offended if the game runs at a "meager" 59 FPS.

When RT was initially introduced, even the 2080 Ti was hardly enough to deliver sensible gameplay at 1080p, let alone 1440p.
Now we're talking about GPUs almost 3 times cheaper being capable of more than the 2080 Ti can do. Halo GPUs offer you some path tracing even at 4K.
On top of that, NV just made the RT experience almost 20% cheaper with this release. I dig it.
 
Now AMD has an opportunity here to launch the 7950XTX at $1049 or $1099 with performance that sits between the 4080 Super and the 4090.
 
I have to agree with the last couple of lines of this review:
Like I said, this thing is not going to be $1000 in most parts of the world; it will end up being pretty much as expensive as the outgoing 4080. Nvidia isn't dumb, they don't give anything away for free. This is just a little trick to make it appear a better deal, when in reality it won't be in most cases.
 
Like I said, this thing is not going to be $1000 in most parts of the world; it will end up being pretty much as expensive as the outgoing 4080. Nvidia isn't dumb, they don't give anything away for free. This is just a little trick to make it appear a better deal, when in reality it won't be in most cases.
It already is. It's selling for between €1150 and €1300 in Europe atm, just like the 4080 did. So for us, the 4080S means basically nothing.
 
Wrong.

A good upscaler like DLSS will increase fidelity in many games. Tarkov, for example, has much better anti-aliasing with DLSS enabled, and it's easier to see enemies amidst clutter due to the clearer image.

There are some downsides, but they're situational. For example, some devs can't code properly and there are UI issues, scope issues, etc.


DLAA is better than native most of the time.

What you are referring to is the AA implemented in DLSS, also known as DLAA when standalone. The upscaler isn't responsible for that; it's a separate technology integrated into the DLSS pipeline. It would be misleading to say upscaling is doing that, because the actual AA part has nothing to do with upscaling. It's just removing a graphics artifact, hence why it's possible to run it separately without upscaling.

I'd also argue that tackling artifacts present in the native presentation of the game, like aliasing, can be done via any AA algorithm. That's not something you need upscaling for.

DLAA is better than native AA algorithms most of the time, but that's got nothing to do with upscaling, other than the fact that some games don't include a separate DLAA option and bundle it with DLSS instead.
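A toy sketch of that separation (function names are mine, not NVIDIA's API, and the real network is temporal and far more sophisticated): DLSS is resize-plus-AA, while DLAA is the same AA stage run alone at native resolution.

```python
import numpy as np

def toy_resize(img: np.ndarray, scale: float) -> np.ndarray:
    """Nearest-neighbour stand-in for the upscaling stage."""
    h, w = img.shape
    ys = (np.arange(int(h * scale)) / scale).astype(int)
    xs = (np.arange(int(w * scale)) / scale).astype(int)
    return img[np.ix_(ys, xs)]

def toy_aa(img: np.ndarray) -> np.ndarray:
    """3x3 box blur standing in for the AA network."""
    p = np.pad(img, 1, mode="edge")
    return sum(p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
               for dy in range(3) for dx in range(3)) / 9.0

def toy_dlss(low_res: np.ndarray, upscale: float) -> np.ndarray:
    return toy_aa(toy_resize(low_res, upscale))  # upscale, then AA

def toy_dlaa(native: np.ndarray) -> np.ndarray:
    return toy_aa(native)  # same AA stage, no resolution change
```

The point being: the AA stage takes whatever frame it's handed, which is why it can exist with or without the upscaling step in front of it.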
 
I'm not saying it's all sunshine and pink unicorns. Of course there are a lot of cases where upscaling introduces unwanted or even immersion-breaking glitches. It was worse before, and it will get better. Algorithms are improving and game engines are being made with upscaling in mind, so we will see fewer and fewer issues with any upscaler, probably to the point where Quality (66% scale ratio) mode will always be superior to the native image. It will be one hell of a ride, but it doesn't scream utter nonsense to me.

It depends on the exact numbers. If the latency penalty is only noticeable if you're a cyber athlete and the artifacting is minimal, then it's totally worth it. If the game exhibits a 50+ ms latency penalty, then it's already a questionable move, especially if frames are generated correctly less than 95% of the time. The latter is closer to the current reality than the former. But yesterday's FG was even worse. Maybe tomorrow's FG will be closer to heaven, who knows?

The most limiting and game-breaking factor is lackluster RT performance in hardware. It makes very limited sense to release RT games when nothing can run them. And NV, unlike AMD, is doing a far better job of improving RT performance. If AMD GPUs caught up, we'd have twice as many RT-ready GPUs and thus many more game devs interested in adding this feature. It also would've made NV less enthusiastic about planned obsolescence. Of course it's nonsensical to ray trace the snot out of games that are supposed to run at hundreds of FPS. But everyone will welcome more realistic graphics in games where graphics matter, to the point that you won't feel offended if the game runs at a "meager" 59 FPS.

When RT was initially introduced, even the 2080 Ti was hardly enough to deliver sensible gameplay at 1080p, let alone 1440p.
Now we're talking about GPUs almost 3 times cheaper being capable of more than the 2080 Ti can do. Halo GPUs offer you some path tracing even at 4K.
On top of that, NV just made the RT experience almost 20% cheaper with this release. I dig it.

I'll never understand this first ideal. You will never be able to match native res when upscaling from a lower resolution. You literally cannot create more data from less, especially when there is artist intent in designing games (geometry, textures, coloring, lighting, etc.). It will always be an approximation with some (varying) form of visual degradation.

Simply put, there'd be more "RT" if path tracing were viable on all hardware; traditional rendering methods still far outstrip it in visual benefit/performance cost ratio on any hardware. Not to mention the industry is limited by consoles, which are very late into their generation and comparatively woefully underpowered. The hardware isn't there yet, Nvidia and AMD alike. Upscaling and frame generation are just an unfortunate byproduct of that truth.
 
I'm getting tired of graphics cards in general. I've been in the DIY scene since the ATi Radeon and 3dfx days, and I now get the feeling that at no other time in all these years has the dGPU market been so out of touch with the enthusiast PC customer. I mean, at least half of PC nerds are getting priced out of it. What used to cost 600-700 bucks now costs 2000 bucks, and what was 400 bucks now costs 800-900. And this all happened in three GPU generations. I believe the golden years of PC DIY building might be nearing their end, mainly due to the uncontrolled greed of GPU manufacturers.
If you are for some reason stuck with a 1080p display and you can't tell RT from non-RT apart, it's 450 dollars (RX 7700 XT or RX 6800) to max everything out.

This is cheaper than the MSRP of the GTX 1070 ($379) adjusted for inflation.
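The inflation math, as a quick sanity check (the ~29% cumulative US CPI figure for mid-2016 to early 2024 is my assumption; substitute your own):

```python
# Cumulative US CPI inflation, mid-2016 -> early 2024 (assumed, ~29%).
CUMULATIVE_INFLATION = 0.29

gtx1070_msrp_2016 = 379.0
adjusted_2024 = gtx1070_msrp_2016 * (1 + CUMULATIVE_INFLATION)
print(f"GTX 1070 MSRP in 2024 dollars: ~${adjusted_2024:.0f}")
# ~$489, so a $450 RX 7700 XT / RX 6800 does undercut it in real terms.
```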
 
Ready for the 7900XTX price drop
 
If you are for some reason stuck with a 1080p display and you can't tell RT from non-RT apart, it's 450 dollars (RX 7700 XT or RX 6800) to max everything out.

This is cheaper than the MSRP of the GTX 1070 ($379) adjusted for inflation.
I got an ASUS GTX 1070 for 370€; now the cheapest RTX 4070 costs 600€ and a 4070S 670€ here in Germany. The only new GPU that would be worth buying adjusted for inflation is the 7800XT at 525€. Prices have gone up a lot, even when adjusted for inflation and the currency exchange rate, at least for us in Europe.
 
You will never be able to match native res when upscaling from a lower resolution. You literally cannot create more data from less, especially when there is artist intent in designing games (geometry, textures, coloring, lighting, etc.). It will always be an approximation with some (varying) form of visual degradation.
Memory compression, prefetching, AI, growing databases, etc.: is that all a joke to you? I'm not sure it's possible, but nor am I sold on it being outright nonsensical. Ask anyone stuck in 2010 about AI being capable of almost-realistic deepfakes and they'll tell you, "This is a narcotics rehabilitation centre's number. Call them immediately, please."
 
If AMD GPUs caught up, we'd have twice as many RT-ready GPUs and thus many more game devs interested in adding this feature.
Would you say the RTX 3090 is RT ready? What is the minimum viable RT performance in your opinion?
 
now the cheapest RTX 4070 costs 600€ and a 4070S 670€ here in Germany.
The 4070 is not the same class as the 1070. You don't compare GTX to RTX, and you don't compare **70 to **70. You compare $X SKU to $X SKU. What we had for 500 dollars in 2016 was maxed-out 1080p without ray tracing in mind. What we have for 500 (+inflation) dollars in 2024 is maxed-out 1080p with ray tracing, and either maxed-out 1440p or 1440p with ray tracing and DLSS/FSR. And plenty of decent 4K gaming as well.

Would you say the RTX 3090 is RT ready? What is the minimum viable RT performance in your opinion?
Stable 60 FPS at 1080p (no upscaling or frame generation whatsoever) with "vanilla" ray-traced reflections enabled; the reference game is Cyberpunk 2077. This is what I call RT-ready. Anything starting with the RTX 4070/3080 or RX 7900 XT is more than capable of that, so yes, the RTX 3090 is valid by me. NB: this is my opinion.
 
I disagree with this review's comments regarding the 7900XTX.
AMD's offerings this generation perform very well in rasterization, which is still the norm in 98% of games.
Rasterization performance was considered the normal way to measure a card up until ray tracing was released, and it feels like everyone pretends it's just irrelevant now.
Honestly, I am not impressed with Nvidia's RTX 4000 series. I believe ray tracing is still in its infancy and has only become as important as it is because Nvidia's marketing team does a great job of making it the new standard.
I think video cards are all overpriced currently, but I don't think AMD's price is far off from where it should be based on its performance. That said, I would love to see them lower it and maybe attack Nvidia the way they did back in the HD4870/4850 good old days!
Here's the thing: with the 7900XTX you get a tiny 4% performance increase on average across all games, and you save, at the time of me typing this, a maximum of $60. What do you get out of going with the 7900XTX instead of the 4080 Super? Inferior upscaling, 20% worse ray tracing performance, and more power consumption. Explain the point of the 7900XTX unless its price drops to under $900, besides AMD being a less shitty company than Nvidia.

We're at a point where ray tracing is possible at 4K in some lighter implementations, even without upscaling. A Plague Tale: Requiem is getting 59 FPS at 4K on the 4080 Super, Alan Wake 2 33 FPS, and Cyberpunk 28.8 FPS; the latter two would be playable with Quality DLSS. Because the 7900XTX lags so far behind in ray tracing, it cannot achieve playable FPS, or can barely achieve it, in Alan Wake 2 or Cyberpunk at 4K, even with upscaling.
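As a rough illustration of why those numbers become playable (the ~1.5x Quality-mode uplift is a ballpark assumption; actual gains vary per game and with how GPU-bound it is):

```python
# Ballpark what Quality-mode upscaling does to those native-4K results.
# The 1.5x uplift is an assumed typical figure, not a measurement.
QUALITY_UPLIFT = 1.5

for game, native_fps in (("Alan Wake 2", 33.0), ("Cyberpunk 2077", 28.8)):
    est = native_fps * QUALITY_UPLIFT
    print(f"{game}: {native_fps:.1f} FPS native 4K -> ~{est:.0f} FPS with Quality upscaling")
```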

Look, I don't even want to defend Nvidia, I think they're a shit company, but the 7900XTX is currently priced poorly for what it offers, with this 4080 Super coming in at a $200 lower MSRP than the 4080. It's another story if $1000 4080 Supers are just a unicorn and they all end up sitting back at the $1200 price point of the original 4080. If that ends up being the case, I think the 7900XTX still holds its value at its current prices, but if people can buy 4080 Supers for $1000, there's little reason to get a 7900XTX.
 
The 4070 is not the same class as the 1070. You don't compare GTX to RTX, and you don't compare **70 to **70. You compare $X SKU to $X SKU. What we had for 500 dollars in 2016 was maxed-out 1080p without ray tracing in mind. What we have for 500 (+inflation) dollars in 2024 is maxed-out 1080p with ray tracing, and either maxed-out 1440p or 1440p with ray tracing and DLSS/FSR. And plenty of decent 4K gaming as well.
The GTX 1070 was a 314 mm² GP104 die and the 4070 is a 294 mm² AD104 die. So practically the same die size, no matter what Nvidia's marketing team wants you to believe. In fact, the 1070's memory bus was wider than the 4070's (256-bit vs 192-bit).
 
Now AMD has an opportunity here to launch the 7950XTX at $1049 or $1099 with performance that sits between the 4080 Super and the 4090.

Prices are being slashed left, right, and center in the high end, in Europe at least. We can already get the current 7900XTX for £870, and meanwhile the 4080 Super has launched at £999 at OCUK and is in stock at Scan as well.

Seriously tone-deaf launch from Nvidia here; the market for £1k+ GPUs will totally disappear this year, I think.
 
4080 Sub-par...
Thanks Nvidia, that just instantly knocked 200 off my card's value.
I keep getting shafted by buying 80-series cards.
Getting a bit sick of this.
 