
NVIDIA GeForce RTX 4080 SUPER to Feature 20GB Memory, Based on AD102

It's basically Ada's saving grace: if it can hit 60fps@QHD, you enable DLSS FG and you are into playable territory at 4K, with very minor IQ losses.
But again, pricing...
They will need it for the likes of Cities: Skylines II, where it will give them a jump to about 50 FPS; late game, FG will give them around 40.

Forza Motorsport and its drops to 40 FPS at 1080p.
 
Huh....

I can't honestly say I know of anyone who's wandering around complaining about the lack of GPUs priced at over $1k and would be excited to have another one added to the list.

As of right now, locally, the cheapest 4080 I can get is $1200 and the cheapest 4090 is $1650. Probably going to see the current 4080 drop to $1100, the 4080 Super come in at $1399 while the 4090 maintains its even more outlandish price.

I miss the pricing of the good old days; I'll just have to keep those fond memories alive and avoid paying for this crap... the 980 Ti was only $650, I'm just saying.
You forget the bright side: now Nvidia can charge $25 less for the vanilla 4080!

Oh man. This is Turing all over again, innit. Solid GPUs but grossly overpriced. Repairs done mid-gen to make the lineup something you might actually want, and even the identical top part everyone wants (the 2080 Ti, the only sensible initial Turing release, vs. the 4090, which is the same thing for Ada).

Huang, you are getting boring. Please devise a new strategy, we see through this too easily now.

OTOH, if you look at the whole Super rumor mill, a 4080 20GB slotting in under the 4090 makes a lot of sense; that gap is huge. The rest, though...

It's basically Ada's saving grace: if it can hit 60fps@QHD, you enable DLSS FG and you are into playable territory at 4K, with very minor IQ losses.
But again, pricing...
You optimist, you
 
You optimist, you
Nothing optimistic about that. It's been repeatedly tested here on TPU that you can get good frame rates at 4k via DLSS FG. But it's contingent on getting 60fps@QHD in the first place, which isn't always achievable.
 
Nothing optimistic about that. It's been repeatedly tested here on TPU that you can get good frame rates at 4k via DLSS FG. But it's contingent on getting 60fps@QHD in the first place, which isn't always achievable.
Frame times?
 
Don't forget whatever they release next year is probably going to crap all over these too.
This has the potential to be just like when they released the 3000 series.

 
I think Nvidia and AMD should have a word with game devs.
Their top-tier cards aren't able to play most recent games at decent framerates.
That's more of a dev problem, and it's making their top-tier cards look like crap.

Every game wants to be the next 'Crysis'....
 
Sorry... this generation is dead in the water because there is no hope for any longevity. I've seen what UE5 does to the 4090, and this Super will not be stronger, so what is the point? No refresh from either (any) team will be worth the silicon it's made on. Holla at me during the 50/8xxx series!

I think Nvidia and AMD should have a word with game devs.
Their top-tier cards aren't able to play most recent games at decent framerates.
That's more of a dev problem, and it's making their top-tier cards look like crap.

Every game wants to be the next 'Crysis'....
It's the other way around... Game devs are supposed to push the envelope. It's hardware that needs to get moving; this generation didn't do much of anything except for the 4090 and 7900 XTX (w/o ray tracing). Everything else is like Intel's 14th gen.
 
Sorry... this generation is dead in the water because there is no hope for any longevity. I've seen what UE5 does to the 4090, and this Super will not be stronger, so what is the point? No refresh from either (any) team will be worth the silicon it's made on. Holla at me during the 50/8xxx series!


It's the other way around... Game devs are supposed to push the envelope. It's hardware that needs to get moving; this generation didn't do much of anything except for the 4090 and 7900 XTX (w/o ray tracing). Everything else is like Intel's 14th gen.
Please, show me an example of this graphical advance, because I haven't seen anything impressive enough compared to what was around 4 years ago, running smoothly on GPUs with 1/3 the performance of current high-end GPUs... You know well that they are bringing nothing to the table but broken games.
 
It's the other way around... Game devs are supposed to push the envelope. It's hardware that needs to get moving; this generation didn't do much of anything except for the 4090 and 7900 XTX (w/o ray tracing). Everything else is like Intel's 14th gen.
If the fastest gaming GPU in the world cannot handle the game, then it's on the game developers and not the GPU manufacturers for poor performance. The first rule in making a game is to make a game people can actually play.
 
Not a terrible idea for between the 4080 and 4090, but the rest is rather pointless, and pricing will just continue to suck. I just got a 7900 XTX Liquid Devil and I'm hoping that's the best bang for the buck atm!
 
so, the 4050 super is coming eventually
 
I'm happy to wait for 2nd hand market. Not paying these asinine prices.
 
If the fastest gaming GPU in the world cannot handle the game, then it's on the game developers and not the GPU manufacturers for poor performance. The first rule in making a game is to make a game people can actually play.

There are three parties to blame imo: investors, devs, and Nvidia/AMD. Investors for rushing products out the door with no polish. Devs for not putting their foot down, taking the path of least resistance and basing playability on upscaling/FG tech. And Nvidia, to a lesser degree AMD, for pushing upscaling for the last 3 gens and now FG.

I guess there's a fourth if you consider the people who just blindly pre-order everything under the sun. On the bright side, I personally don't think there are many recently released titles worth buying, or that run badly if they are; just a bunch of formulaic open-world garbage.

OT:

4080 super duper for $1499 with just enough of a performance gap to continue to upsell people to a 4090 so they can maximize margins.
 
It's the other way around... Game devs are supposed to push the envelope. It's hardware that needs to get moving; this generation didn't do much of anything except for the 4090 and 7900 XTX (w/o ray tracing). Everything else is like Intel's 14th gen.

The performance gap between the RTX 3090 Ti and the RTX 4090 is huge, as is the one between the RX 6950 XT and the RX 7900 XTX, in both older and newer games. It's mostly ray tracing and path tracing that are bringing GPUs to their knees, and these are just halo, show-it-off graphics technologies.

Game developers are supposed to push advancements in gameplay while optimizing the engine/graphics renderer they are building their games in. Either Unreal Engine 5 is unoptimized itself, or AAA game developers are lazy AF. I tend to think the latter, considering Fortnite still runs just as it did on UE4 at competitive graphics settings.
 
Game developers are supposed to push advancements in gameplay while optimizing the engine/graphics renderer they are building their games in. Either Unreal Engine 5 is unoptimized itself, or AAA game developers are lazy AF. I tend to think the latter, considering Fortnite still runs just as it did on UE4 at competitive graphics settings.
Fortnite is the prime example of what I see as wrong with a large chunk of the industry using UE. Of course Epic gets to optimize Fortnite on UE; it's their own engine. Everyone else should either get on their toes and optimize their games on a third party's engine (which they don't) or develop their own in-house engines (which they won't, as it's too costly in time and money, and investors don't want any of that).
 
It depends on what "decent" means to you. Which res? Which refresh rate are you targeting?
My guess is next-gen will be at least halfway there, and the gen after that may finally democratize RT. The only problem is, next gen isn't coming for at least another year :(

Yeah. I agree.

I think Navi 5 *could* bring decent RT to the mainstream, while nvidia will respond with either a new chip and/or series... but that's 2025(/2026?). Regardless, it will probably happen by the time of (and for) the chip design used inside the new consoles, but it will probably remain a premium/unusable feature until then.

I think there's a pretty good chance Blackwell will bring the same divide as Ada: sure, the 384-bit/512-bit chips will probably be nice (and expensive), but there's a fair chance the chip below them will be 192-bit/32gbps and compete with (and probably slightly beat) N4x/BM(/AD103). RT on all those parts may still (relatively) suck. I still think we're looking at/waiting for a 3nm 256-bit chip for RT to be realistic for how most people want to use it, and that will come when it's cost-effective for AMD, and then nVIDIA is forced to relegate their crown jewel to the mainstream. I imagine this might be the 'X' generation that's spoken of in that slide deck, and that it might be N3P (which AMD may wait to use), but I don't know.

I think we're waiting for a few things to line up. That could be a newer/better-yielding/cheaper process (like N3P) as well as 4GB (32Gb) GDDR7. I have to imagine AMD is itching to have a 256-bit/32GB chip to put in a new PS/Xbox, and they might just be waiting for those stars to align before releasing Navi 5. While I could absolutely see nVIDIA trying to get away with using 12GB for a native 3nm chip, I just don't think AMD will; I think even 128-bit will be 16GB. That said, there's always the chance GB205 (et al.) may use GDDR6(X), which then makes 256-bit/16GB more likely, along with the possibility that a somewhat mainstream GPU may have a chance at using the feature adequately.
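To make the capacity math behind those guesses explicit, here's a quick sketch (Python). The formula is just channels times chip density; the bus widths are the ones discussed above, and 4GB GDDR7 modules are still a rumor, not a shipping part:

```python
# VRAM capacity = (bus width / 32 bits per chip) * chip density.
# 2 GB chips = today's GDDR6/6X; 4 GB (32 Gb) chips = the rumored GDDR7 option.
for bus_bits in (128, 192, 256):
    chips = bus_bits // 32
    print(f"{bus_bits}-bit: {chips * 2} GB with 2 GB chips, "
          f"{chips * 4} GB with 4 GB chips")
```

Which is exactly why 32Gb GDDR7 makes 128-bit/16GB and 256-bit/32GB plausible.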

It's just one of those things: once Navi 4/BM release, I don't see why you'd wait for GB205, which will probably be similar-ish performance, the same or less RAM, and likely more expensive (given how nVIDIA refuses to drop the price on the 4070 Ti/4080).

While I can't speak to everyone's disposition, I look at this chart as a hint toward the future:

[chart: relative GPU performance, with the 7800 XT at 1440p/60 FPS as the baseline]


First of all, this uses the 7800 XT at 1440p/60 as a baseline, which I think is realistic for what to expect in the future (PS5 Pro). I think the 7900 XT/4080 is realistic future-proofing until the end of the generation (when games fall to FSR/DLSS 4K Balanced 30/60 fps, if not, more rarely, 1080p on the OG PS5).

While this is speculation, I think it's fair to guess the 4070 Super would be ~61 fps and the 4070 Ti Super (still weird) roughly ~76 fps, as that would be splitting the difference between 4070/4070 Ti and 4070 Ti/4080. The 4080S would be ~90 fps.
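If it helps to see the "splitting the difference" guess as arithmetic, here's a trivial sketch; the input fps values are purely illustrative placeholders, not the actual chart numbers:

```python
# Hypothetical chart readings (illustrative only), on the same 7800 XT = 60 fps scale.
chart_fps = {"4070": 54.0, "4070 Ti": 68.0, "4080": 84.0}

def split_difference(lo: float, hi: float) -> float:
    """Assume a Super card lands halfway between the two adjacent SKUs."""
    return (lo + hi) / 2

print(split_difference(chart_fps["4070"], chart_fps["4070 Ti"]))  # 61.0
print(split_difference(chart_fps["4070 Ti"], chart_fps["4080"]))  # 76.0
```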

I truly believe (BM?/)N4xpro will be ~7168 SP/2800 MHz, similar to the PS5 Pro (7680/2600 MHz?), with 12 GB, and slightly faster than a hypothetical 4070 Super. I also think it (they?) will OC to the realm of the 4070 Ti, just like the 7800 XT, while the 4070S will not (because market segmentation and proven nvidia overclocking suckage). My guess is ~$450, while the 7800 XT drops to ~$400.

N4XXT (okay, that's pretty horrible naming too) is a wildcard, but it's important to realize the 7800 XT is clocked at a paltry 2425 MHz and uses a max of 250 W, which is somewhere between a travesty and extremely foreboding for what they plan for the future, granted the price/perf is good. We see the 7700 XT clock up to 3113 MHz (artificial limit?), and the computer hardware community's own Freddie Mercury stand-in reminds us the 7900 XTX can do ~3200-3300 MHz.

As I've kind of spoken about before, with 20 Gbps RAM the 7800 XT could've clocked up to 2900 MHz and been fine... but AMD distinctly chose not to do that (or even really allow it, because of the power limit), clearly favoring lower prices and pushing sales of the 7900 XT at that moment in time. With 24 Gbps we could be talking up to almost 3500 MHz as a possibility, if even a slight one. When you take what Apple achieved on N4P/N5P (the A16 is 3.46 GHz, the M2 is 3-3.66 GHz), or even the 11% TSMC promises for N4P over N5 (and 4% for N4X on top of that, if AMD chose to use it), this is actually hilariously realistic. All they really need to do is give it a max PL of 375 W (the max of two 8-pin connectors) and let people go ham.

I think a more realistic scenario is 8192 SP and a stock clock of 3-3.25 GHz, depending on how much they care about how the 7900 XT looks comparatively, but like I said, wild card. The important thing to realize is that this card (and conceivably a BM alternative) has to fight against the 4070 Ti/4070tititititi in pricing. I think it's a foregone conclusion that it'll be ~$600, but we shall see.

Point is, unless AMD completely screws the pooch, the chip could/should literally be up to 40% faster (a typical generational hike) than a 7800 XT at stock, or 20% faster than its absolute performance or a stock 4070 Ti; conceivably right on top of the 7900 XT/4080, if not at stock, then certainly with overclocking. I could see AMD locking memory down to 3200 MHz (25.6 Gbps, +6.67% over 24 Gbps), but that still gives the GPU room to ~3500 MHz with 8192 SP, or 3700+ if 7680 SP. This would make anything below a 4090 essentially moot wrt typical playable settings, barring some major price adjustments and/or amazing value SKUs from nVIDIA.
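For anyone who wants to sanity-check the bandwidth and clock arithmetic above, here's a minimal back-of-the-envelope sketch (Python). The bandwidth formula is the standard one; the pin speeds and node-uplift percentages are the figures cited in this post, not confirmed specs:

```python
# Standard GDDR bandwidth math: bus width (bits) * pin speed (Gbps) / 8 = GB/s.
def bandwidth_gb_s(bus_bits: int, pin_speed_gbps: float) -> float:
    return bus_bits * pin_speed_gbps / 8

stock = bandwidth_gb_s(256, 19.5)   # 7800 XT as shipped: 624 GB/s
fast = bandwidth_gb_s(256, 24.0)    # hypothetical 24 Gbps refresh: 768 GB/s
print(f"{stock:.0f} -> {fast:.0f} GB/s (+{(fast / stock - 1) * 100:.0f}%)")

# Node headroom, using the uplifts cited above (TSMC: ~+11% for N4P over N5,
# ~+4% more for N4X). Starting from the 7800 XT's 2425 MHz stock clock:
print(f"~{2425 * 1.11 * 1.04:.0f} MHz")  # ~2799 MHz, before any power-limit change
```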

I remain incredibly curious what nVIDIA will do in the face of this, because it ain't a pipe dream. It's a very realistic scenario that's likely going to make nVIDIA look like fools wrt pricing, more so than the 7800 XT already does. Even if they slot in a 4080 Super, who cares? You *might* get 4K60 DLSS Balanced with RT in AW2, but I still think it's a tough sell imho, because the tangible real-world performance benefit just isn't going to be there, especially at the likely price premium ($1000?).
 
It's the other way around... Game devs are supposed to push the envelope. It's hardware that needs to get moving; this generation didn't do much of anything except for the 4090 and 7900 XTX (w/o ray tracing). Everything else is like Intel's 14th gen.
Yes, but you push the envelope with high/very high settings. You don't come out with system requirements for 30fps@720p medium settings and call that "pushing the envelope".

I still remember the words of a teacher in college: every now and then I get a student saying they wrote a fantastic program for their diploma, only they can't run it because they need a computer only NASA has. That's how most devs sound today.
 
And Nvidia, to a lesser degree AMD, for pushing upscaling for the last 3 gens and now FG.
Upscaling is the future of gaming. TVs increase their pixel count by a factor of 4 with each jump in definition. There is no silicon manufacturing process in the world that can keep up with that exponential increase; the only way to do it is through upscaling. We were stuck with SD for a long time before HD came along, and now you really have to look for a 1080p TV; they are all 4K.
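The factor-of-4 claim is easy to verify for the standard resolution ladder; a quick sketch (Python):

```python
# Each resolution jump roughly quadruples the pixels the GPU has to shade.
resolutions = {
    "1080p (FHD)": (1920, 1080),
    "2160p (4K)": (3840, 2160),
    "4320p (8K)": (7680, 4320),
}
prev = None
for name, (w, h) in resolutions.items():
    pixels = w * h
    factor = f" ({pixels / prev:.1f}x the previous step)" if prev else ""
    print(f"{name}: {pixels:,} pixels{factor}")
    prev = pixels
```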
 
Upscaling is the future of gaming. TVs increase their pixel count by a factor of 4 with each jump in definition. There is no silicon manufacturing process in the world that can keep up with that exponential increase; the only way to do it is through upscaling. We were stuck with SD for a long time before HD came along, and now you really have to look for a 1080p TV; they are all 4K.
Marketing people have sold you 2160p screens because they want you to buy them.
Great for TV content.

Back in the day, not many people had 1600x1200 monitors (CRTs), or if they did, they ran them at lower resolutions, and higher frame rates.
800x600 at 75 or 85 Hz was a lot better than 1024x768 at 60Hz.
1280x960 at 75Hz was a lot better than 1600x1200 at 60Hz.

I don't want a 2160p monitor until games can run at it properly; I only recently moved up to 1440p 144Hz.
I'd like to be able to play at that with everything turned on someday before I retire...

edit: I'd also like to see monitor pixel density keep going up.
1600x1200 on 19" was gorgeous.
Less jaggies visible onscreen...
 
Making the 4080 with the 102 die is a bit of a strange step; it would make more sense if it were a full-die AD103 with more RAM, and it would fit in a better TDP envelope.
It would be a 32GB card then, as AD103 has a 256-bit bus. As we remember from the 500/600 series, using mixed-density chips wasn't a wise solution.
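For reference, the channel math behind that, as a minimal sketch (Python) assuming uniform 2GB GDDR6X chips, one per 32-bit channel, doubled in a clamshell layout:

```python
# Why a 20 GB 4080 Super points at AD102: with uniform 2 GB GDDR6X chips,
# capacity is fixed by the bus width.
def vram_gb(bus_bits: int, chip_gb: int = 2, clamshell: bool = False) -> int:
    chips = bus_bits // 32                   # one chip per 32-bit channel
    return chips * chip_gb * (2 if clamshell else 1)

print(vram_gb(256))                  # 16 GB: AD103 as on the current 4080
print(vram_gb(256, clamshell=True))  # 32 GB: AD103's only uniform step up
print(vram_gb(320))                  # 20 GB: needs a 320-bit cut, i.e. AD102
```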
 
They should just drop the 4080 and replace it with this.

That's their plan, isn't it? That's what they did last time, anyway.

I'd bet on this: you don't play games on a raw lithography process, but on a GPU architecture.

Process and architecture are kinda joined at the hip. To better the architecture, they need more transistors, and to fit more transistors, they need better processes (to keep costs reasonable, at least).
 