Monday, October 23rd 2023

NVIDIA GeForce RTX 4080 SUPER to Feature 20GB Memory, Based on AD102

NVIDIA's upcoming mid-life refresh of its GeForce RTX 40-series "Ada" product stack introduces three new SKUs, led by the GeForce RTX 4080 SUPER, as was reported last week. In that report, we speculated on how NVIDIA could go about creating the RTX 4080 SUPER. BenchLife reports that the RTX 4080 SUPER will come with 20 GB as its standard memory size, and will be based on the larger "AD102" silicon. The SKU will use a 320-bit wide memory interface carved out of the 384 bits available on the silicon. "AD102" has 144 streaming multiprocessors (SM) on die, of which the flagship RTX 4090 enables 128, so NVIDIA could pick an SM count that is lower than that of the RTX 4090 but higher than the 76 of the current RTX 4080.
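For rough context, here is a minimal bandwidth sketch, assuming the rumored card keeps GDDR6X at data rates similar to the current RTX 4080 and RTX 4090 (the report does not mention memory speed, so the rates below are assumptions):

```python
# Rough theoretical-bandwidth sketch for the rumored configuration.
# Assumption: GDDR6X data rates similar to the current RTX 4080/4090;
# the actual memory speed is not part of the report.

def bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s = bus width (bits) x per-pin rate (Gbps) / 8."""
    return bus_width_bits * data_rate_gbps / 8

configs = {
    "RTX 4080 (256-bit, 22.4 Gbps)": (256, 22.4),
    "Rumored 4080 SUPER (320-bit, 22.4 Gbps)": (320, 22.4),
    "RTX 4090 (384-bit, 21 Gbps)": (384, 21.0),
}

for name, (bus, rate) in configs.items():
    print(f"{name}: {bandwidth_gbps(bus, rate):.0f} GB/s")

# A 320-bit bus at 22.4 Gbps would land around 896 GB/s, between the
# 4080's ~717 GB/s and the 4090's ~1008 GB/s, and 10 memory modules of
# 2 GB each give the reported 20 GB.
```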
Sources: Wccftech, BenchLife.info

145 Comments on NVIDIA GeForce RTX 4080 SUPER to Feature 20GB Memory, Based on AD102

#26
Vayra86
neatfeatguyHuh....

I can't honestly say that I know of anyone that's wandering around complaining of the lack of GPUs that are priced at over $1k and would be excited to have another one added to the list.

As of right now, locally, the cheapest 4080 I can get is $1200 and the cheapest 4090 is $1650. Probably going to see the current 4080 drop to $1100, the 4080 Super come in at $1399 while the 4090 maintains its even more outlandish price.

I miss the pricing of the good old days, I'll just have to keep those fond memories alive and avoid paying for this crap.....980Ti was only $650, I'm just saying.
You forget the bright side: now Nvidia can charge $25 less for the vanilla 4080!

Oh man. This is Turing all over again, innit. Solid GPUs but grossly overpriced. Repairs done mid-gen to make the lineup something you might actually want, and even the identical top part everyone wants (the 2080ti, the only sensible Turing release at launch, vs the 4090, which is the same thing for Ada).

Huang, you are getting boring. Please devise a new strategy, we see through this too easily now.

OTOH, if you look at the whole Super rumor mill, a 4080 20GB slotting in under the 4090 makes a lot of sense, that gap is huge. The rest though...
bugIt's basically Ada's saving grace: if it can hit 60fps@QHD, you enable DLSS FG and you are into playable territory at 4k, with very minor IQ losses.
But again, pricing...
You optimist, you
Posted on Reply
#27
bug
Vayra86You optimist, you
Nothing optimistic about that. It's been repeatedly tested here on TPU that you can get good frame rates at 4k via DLSS FG. But it's contingent on getting 60fps@QHD in the first place, which isn't always achievable.
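As a rough back-of-the-envelope illustration of why that 60 fps base matters, here is a hedged sketch; the doubling and the 0.9 overhead factor are illustrative assumptions, not measured values:

```python
# Illustrative only: frame generation inserts roughly one generated frame
# per rendered frame, so presented fps is close to 2x the base render rate
# minus some overhead. The 0.9 efficiency factor is an assumption for
# illustration, not a measured value.

def fg_presented_fps(base_fps: float, efficiency: float = 0.9) -> float:
    return base_fps * 2 * efficiency

for base in (30, 45, 60):
    print(f"{base} fps base -> ~{fg_presented_fps(base):.0f} fps presented, "
          f"while responsiveness still tracks the {1000 / base:.1f} ms base frame time")
```

The point being: a 30 fps base still feels like 30 fps even if the counter reads higher, which is why the 60fps@QHD starting point matters.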
Posted on Reply
#28
Deleted member 234997
bugNothing optimistic about that. It's been repeatedly tested here on TPU that you can get good frame rates at 4k via DLSS FG. But it's contingent on getting 60fps@QHD in the first place, which isn't always achievable.
Frame times?
Posted on Reply
#29
Unregistered
Don't forget whatever they release next year is probably going to crap all over these too.
This has the potential to be just like when they released the 3000 series.

Posted on Reply
#30
Evildead666
I think Nvidia and AMD should have a word with game devs.
Their top tier cards aren't able to play most recent games at decent framerates.
That's more of a Dev problem, and it's making their top tier cards look like crap.

Every game wants to be the next 'Crysis'....
Posted on Reply
#31
Guwapo77
Sorry...this generation is dead in the water because there is no hope for any longevity. I've seen what UE5 does to the 4090 and this Super will not be stronger, so what is the point? No refresh from either (any) team will be worth the silicon it's made on. Holla at me during the 50/8xxx series!
Evildead666I think Nvidia and AMD should have a word with game devs.
Their top tier cards aren't able to play most recent games at decent framerates.
That's more of a Dev problem, and it's making their top tier cards look like crap.

Every game wants to be the next 'Crysis'....
It's the other way around... Game devs are supposed to push the envelope. It's hardware that needs to get moving; this generation didn't do much of anything except for the 4090 and 7900XTX (w/o ray tracing). Everything else is like Intel's 14th gen.
Posted on Reply
#32
Denver
Guwapo77Sorry...this generation is dead in the water because there is no hope for any longevity. I've seen what UE5 does to the 4090 and this Super will not be stronger, so what is the point? No refresh from either (any) team will be worth the silicon it's made on. Holla at me during the 50/8xxx series!


It's the other way around... Game devs are supposed to push the envelope. It's hardware that needs to get moving; this generation didn't do much of anything except for the 4090 and 7900XTX (w/o ray tracing). Everything else is like Intel's 14th gen.
Please, show me an example of this graphical advance because I haven't seen anything impressive enough compared to what was around 4 years ago running smoothly on GPUs with 1/3 the performance of current high-end GPUs... You know well that they are bringing nothing to the table but broken games.
Posted on Reply
#33
FeelinFroggy
Guwapo77It's the other way around... Game devs are supposed to push the envelope. It's hardware that needs to get moving; this generation didn't do much of anything except for the 4090 and 7900XTX (w/o ray tracing). Everything else is like Intel's 14th gen.
If the fastest gaming GPU in the world cannot handle the game, then it's on the game developers and not the GPU manufacturers for poor performance. The first rule in making a game is to make a game people can actually play.
Posted on Reply
#34
A&P211
b1k3rdudeThe only 'refresh' to keep an eye on is the 4070/ti super w/extra vram.
Give another "ti" to the 4070ti, name it 4070titi
Posted on Reply
#35
dalekdukesboy
Not a terrible idea for something between the 4080 and 4090, but the rest seems rather pointless, and pricing will just continue to suck. I just got a 7900 XTX Liquid Devil and am hoping that's the best bang for the buck atm!
Posted on Reply
#37
Unregistered
I'm happy to wait for 2nd hand market. Not paying these asinine prices.
Posted on Reply
#39
rv8000
FeelinFroggyIf the fastest gaming GPU in the world cannot handle the game, then it's on the game developers and not the GPU manufacturers for poor performance. The first rule in making a game is to make a game people can actually play.
There are three parties to blame imo: investors, devs, and Nvidia/AMD. Investors for rushing products out the door with no polish. Devs for not putting their foot down and for taking the path of least resistance by basing playability on upscaling/FG tech. And Nvidia, to a lesser degree AMD, for pushing upscaling for the last 3 gens and now FG.

I guess there's a fourth if you consider the people who just blindly pre-order everything under the sun. On the bright side, I personally don't think there are many titles released lately that are worth buying, or that don't run well if they are; just a bunch of formulaic open-world garbage.

OT:

4080 super duper for $1499 with just enough of a performance gap to continue to upsell people to a 4090 so they can maximize margins.
Posted on Reply
#40
Cheeseball
Not a Potato
Guwapo77It's the other way around... Game devs are supposed to push the envelope. It's hardware that needs to get moving; this generation didn't do much of anything except for the 4090 and 7900XTX (w/o ray tracing). Everything else is like Intel's 14th gen.
The performance gap between the RTX 3090 Ti and the RTX 4090 is huge, as is the one between the RX 6950 XT and the RX 7900 XTX, in both older and newer games. It's mostly ray tracing and path tracing that are bringing GPUs to their knees, and those are just halo, show-it-off graphics technologies.

Game developers are supposed to push advancements in gameplay while optimizing the engine/graphics renderer they are building their games in. Either Unreal Engine 5 is unoptimized itself or AAA game developers are lazy AF. I tend to think the latter, considering Fortnite still runs just as it did in UE4 at competitive graphics settings.
Posted on Reply
#41
wNotyarD
CheeseballGame developers are supposed to push advancements in gameplay while optimizing the engine/graphics renderer they are building their games in. Either Unreal Engine 5 is unoptimized itself or AAA game developers are lazy AF. I tend to think the latter, considering Fortnite still runs just as it did in UE4 at competitive graphics settings.
Fortnite is the prime example of what I see as wrong with a large chunk of the industry using UE. Of course Epic gets to optimize Fortnite on UE, it's their own engine. Everyone else should either get on their toes to optimize their games on a third party's engine (which they don't) or develop their own in-house engines (which they won't, as it's too costly in time and money, and investors don't want any of that).
Posted on Reply
#42
alwayssts
bugIt depends on what "decent" means to you. Which res? Which refresh rate are you targeting?
My guess is next-gen will be at least halfway there and the gen after that may finally democratize RT. The only problem, next gen isn't coming for at least another year :(
Yeah. I agree.

I think Navi 5 *could* bring decent RT to the mainstream, while nvidia will respond with either a new chip and/or series....but that's 2025(/2026?). Regardless, it will probably happen by the time/for the chip design used inside of new consoles, but probably be a premium/unusable feature until then.

I think there's a pretty good chance Blackwell will bring the same divide as Ada; Sure, the 384-bit/512-bit chips will probably be nice (and expensive), but there's a fair chance the chip below them will be 192-bit/32gbps and compete with (and probably slightly beat) N4x/BM(/ad103). RT on all those parts may still (relatively) suck. I still think we're looking at/waiting for a 3nm 256-bit chip for it to be realistic for how most people want to use it, and that will come when it's cost-effective for AMD and then nVIDIA is forced to relegate their crown jewel to the mainstream. I imagine this might be the 'X' generation that's spoken of in that slide deck, and that it might be N3P (which AMD may wait to use), but I don't know.

I think we're waiting for a few things to line up. That could be a newer/better-yielding/cheaper process (like N3P) as well as 4GB (32Gb) GDDR7. I have to imagine AMD is itching to have a 256-bit/32GB chip to put in a new PS/xbox, and they might just be waiting for those stars to align before releasing Navi 5. While I could absolutely see nVIDIA trying to get away with using 12GB for a native 3nm chip, I just don't think AMD will; I think even 128-bit will be 16GB. That said, there's always the chance GB205 (et al) may use GDDR6(x), which then makes 256-bit/16GB more likely, and the possibility that a somewhat mainstream GPU may have a chance at using the feature adequately.

It's just one of those things that once Navi 4/BM release, I don't see why you'd wait for GB205 which will probably be similarish performance, the same or less ram, and likely more expensive (given how nVIDIA refuses to drop the price on 4070ti/4080).

While I can't speak to everyone's disposition, I look at this chart as a hint toward the future:



First of all, this uses the 7800xt at 1440p/60 as a baseline, which I think is realistic for what to expect in the future (ps5 pro). I think 7900xt/4080 is realistic future-proofing until the end of the generation (and games fall to FSR/DLSS 4k balanced 30/60fps, if not rarely 1080p on the OG PS5).

While speculation, I think it's fair to guess 4070 super would be ~61fps and 4070ti Super (still weird) roughly ~76fps, as that would be splitting the difference between 4070/4070ti and 4070ti/4080. 4080S ~90fps.

I truly believe (BM?/)N4xpro will be ~7168sp/2800mhz, similar to ps5pro (7680/2600mhz?), 12gb, and slightly faster than a hypothetical 4070 super. I also think it (they?) will OC to the realm of 4070ti, just like 7800xt, while 4070S will not (because market segmentation and proven nvidia overclocking suckage). My guess is ~$450, while 7800xt drops to ~$400.

N4XXT (okay, that's pretty horrible naming too) is a wildcard, but it's important to realize the 7800xt is clocked at a paltry 2425mhz and uses a max of 250w, which is somewhere between a travesty and extremely foreboding for what they plan for the future, granted the price/perf is good. We see the 7700xt clock up to 3113mhz (artificial limit?), and the computer hardware community's own Freddie Mercury stand-in reminds us the 7900xtx can do ~3200-3300mhz. As I've kind of spoken about before, with 20gbps ram the 7800xt could've clocked up to 2900mhz and been fine...but AMD distinctly chose not to do that (or even really allow it, because of the power limit), clearly favoring lower prices and pushing sales of the 7900xt at this/that moment in time. With 24gbps we could be talking up to almost 3500mhz as a possibility, if even a slight one.

When you take what Apple achieved on N4P/N5P (A16 is 3.46ghz, M2 is 3-3.66ghz), or even the 11% TSMC promises N4P over N5 (and 4% for n4x on top of that if AMD chose to use it), this is actually hilariously realistic. All they really need to do is give it a max PL of 375W (the max of two 8-pin connectors) and let people go ham. I think a more realistic scenario is 8192sp and a stock clock of 3-3.25ghz, depending on how much they care about how the 7900xt looks comparatively, but like I said, wild card.

The important thing to realize is that this card (and conceivably a BM alternative) has to fight against the 4070ti/4070tititititi in pricing. I think it's a foregone conclusion to be ~$600, but we shall see. Point is, unless AMD completely screws the pooch, the chip could/should literally be up to 40% faster (a typical generational hike) than a 7800xt at stock, or 20% faster than its absolute performance or a stock 4070ti; conceivably right on top of the 7900xt/4080. If not at stock, certainly with overclocking. I could see AMD locking down memory to 3200mhz (25.6gbps, 6.67%), but that still gives the GPU room to ~3500mhz with 8192sp, or 3700+ if 7680sp. This would make anything below a 4090 essentially moot wrt typical playable settings, barring some major price adjustments and/or amazing value skus from nVIDIA.
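For anyone checking the memory arithmetic above, a quick sketch treating the GDDR6 effective rate as 8x the memory clock (which is how the 3200 MHz / 25.6 Gbps figure works out), with the 7800 XT's stock 19.5 Gbps as the reference point; the clock values are just sample points, not product claims:

```python
# Sanity check on the memory-speed arithmetic above. GDDR6 "effective"
# data rate is treated here as 8x the memory clock, which is how the
# 3200 MHz -> 25.6 Gbps figure in the post works out.

def effective_gbps(mem_clock_mhz: float) -> float:
    return mem_clock_mhz * 8 / 1000

for clock in (2438, 2500, 3000, 3200):
    rate = effective_gbps(clock)
    print(f"{clock} MHz -> {rate:.1f} Gbps "
          f"({(rate / 19.5 - 1) * 100:+.1f}% vs the 7800 XT's 19.5 Gbps)")

# 3200 MHz works out to 25.6 Gbps, i.e. roughly 6.7% above 24 Gbps parts
# and about 31% above the 7800 XT's stock memory.
```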

I remain incredibly curious what nVIDIA will do in the face of this, because it ain't a pipe dream. It's a very realistic scenario that's likely going to make nVIDIA look like fools wrt pricing, more so than the 7800xt already does. Even if they slot in a 4080 Super, who cares? You *might* get 4K60 DLSS balanced with RT in AW2, but I still think it's a tough sell imho because the tangible real-world performance benefit just isn't going to be there, especially for the likely price premium ($1000?).
Posted on Reply
#43
bug
Guwapo77It's the other way around... Game devs are supposed to push the envelope. It's hardware that needs to get moving; this generation didn't do much of anything except for the 4090 and 7900XTX (w/o ray tracing). Everything else is like Intel's 14th gen.
Yes, but you push the envelope with high/very high settings. You don't come out with system requirements for 30fps@720p medium settings and call that "pushing the envelope".

I still remember the words of a teacher in college: every now and then I get a student saying they wrote a fantastic program for their diploma, only they can't run it because they need a computer only NASA has. That's how most devs sound today.
Posted on Reply
#44
FeelinFroggy
rv8000And Nvidia, to a lesser degree AMD, for pushing upscaling for the last 3 gens and now FG.
Upscaling is the future of gaming. TVs increase their pixel count by a factor of 4 with each jump in definition. There is no silicon manufacturing process in the world that can keep up with that exponential increase. The only way to do it is through upscaling. We were stuck with SD for a long time before HD came along, and now you really have to look for a 1080p TV; they are all 4K.
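To put numbers on that factor of 4, a quick pixel-count sketch; the upscaling figure at the end assumes the common "Quality" preset that renders at roughly 2/3 of the output resolution per axis, which is an illustrative assumption rather than a universal rule:

```python
# Pixel counts behind the "factor of 4": each TV resolution tier doubles
# both axes, so the pixel count quadruples.

tv_tiers = {
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

prev = None
for name, (w, h) in tv_tiers.items():
    pixels = w * h
    note = f" ({pixels / prev:.1f}x the previous tier)" if prev else ""
    print(f"{name}: {pixels / 1e6:.2f} MP{note}")
    prev = pixels

# Rendering 4K output from a ~1440p internal resolution shades only ~44%
# of the output pixels; that gap is what the upscaler is asked to bridge.
print(f"'Quality' upscale to 4K renders {(2560 * 1440) / (3840 * 2160):.0%} of native pixels")
```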
Posted on Reply
#45
Evildead666
FeelinFroggyUpscaling is the future of gaming. TVs increase their pixel count by a factor of 4 with each jump in definition. There is no silicon manufacturing process in the world that can keep up with that exponential increase. The only way to do it is through upscaling. We were stuck with SD for a long time before HD came along, and now you really have to look for a 1080p TV; they are all 4K.
Marketing people have sold you 2160p screens because they want you to buy them.
Great for TV content.

Back in the day, not many people had 1600x1200 monitors (CRTs), or if they did, they ran them at lower resolutions, and higher frame rates.
800x600 at 75 or 85 Hz was a lot better than 1024x768 at 60Hz.
1280x960 at 75Hz was a lot better than 1600x1200 at 60Hz.

I don't want a 2160p monitor until games can run at it properly; I only recently moved up to 1440p 144Hz.
I'd like to be able to play at that with everything turned on someday before I retire...

edit: I'd also like to see pixel density go up more on monitors.
1600x1200 on 19" was gorgeous.
Fewer jaggies visible onscreen...
Posted on Reply
#46
Ruru
S.T.A.R.S.
Dr_b_Making the 4080 with 102 die is a bit of a strange step, it would make more sense if it was full die AD103, with more RAM, and would fit in a better TDP envelope
It would have to be a 32GB card then, as AD103 has a 256-bit bus. As we remember from the 500/600 series, using mixed-density chips wasn't a wise solution.
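For reference, a short sketch of why bus width pins the capacity options down the way it does, assuming today's 2 GB GDDR6X modules at 32 bits per module; the helper below is purely illustrative:

```python
# Why bus width pins down capacity: each GDDR6X module is 32 bits wide,
# so a 256-bit bus takes 8 modules and a 384-bit bus takes 12. With 2 GB
# modules that means the listed "normal" capacity, or double in clamshell
# mode (two modules per 32-bit channel). Mixed densities are the awkward
# in-between the post refers to.

MODULE_BITS = 32
MODULE_GB = 2  # current GDDR6X density

def capacities(bus_width_bits: int) -> tuple[int, int]:
    modules = bus_width_bits // MODULE_BITS
    return modules * MODULE_GB, modules * MODULE_GB * 2  # normal, clamshell

for bus in (256, 320, 384):
    normal, clamshell = capacities(bus)
    print(f"{bus}-bit: {bus // MODULE_BITS} modules -> {normal} GB, or {clamshell} GB clamshell")

# AD103's 256-bit bus lands on 16 GB or 32 GB; a 320-bit cut of AD102
# gives the reported 20 GB without mixing densities.
```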
Posted on Reply
#47
TumbleGeorge
FeelinFroggyThere is no silicon manufacturing process in the world that can keep up with that exponential increase.
I'd bet on this: you don't play games on a raw lithography process, but on a GPU architecture.
Posted on Reply
#48
Totally
Xex360They should just drop the 4080 and replace it with this.
That's their plan, isn't it? That's what they did last time anyway.
TumbleGeorgeI'd bet on this: you don't play games on a raw lithography process, but on a GPU architecture.
Process and architecture are kinda tied at the hip. To better architecture, they need more transistors, and to fit more transistors, they need better processes (to keep costs reasonable at least).
Posted on Reply
#49
TumbleGeorge
TotallyTo better architecture, they need more transistors, and to fit more transistors
I don't agree. GPU architectures aren't bodybuilders striking poses with their muscles.
Posted on Reply
#50
Random_User
Excuse me, please, for another long and rambling post.
remekraYes but will it run new games at over 30fps? Because it seems that with newest ones, you can spend 1000$ on a GPU and it still does not guarantee max settings and 60fps.
Exactly! Remember this? The "holy grail" of Cyberpunk and ray tracing? Years have passed, and top nvidia cards still struggle with RT in that game.
Each generation, they say the very same marketing bullsh*t all over again. Each is just to be replaced with yet another "the very best ever" in a couple of years. And each gen is somehow still not enough for native performance without frame "generation".

remekraAs someone mentioned before the good old days when you spent 600-700$ on a top of the line GPU and you actually feel like you have top of the line hardware.
I can't agree with that. Once I bought myself the mighty 4870X2, the most powerful card at the time, for $630 while the rest of the world had it for $550, a couple of months before the crisis hit and I got fired. The worst VGA card I ever used. Not to mention that it was very faulty: it was RMAed, and the replacement ended up being defective as well.

Thus I was taught the hard way how to pay for high-end/premium products. Now even the lowest-end ones are priced as premium.
neatfeatguyProbably going to see the current 4080 drop to $1100, the 4080 Super come in at $1399 while the 4090 maintains its even more outlandish price.
Well. Yeah. Why should they lower prices if they can just stack their prices above the older generation?

If I'm not mistaken, this was even said during the announcement of the RTX lineup. Sorry, can't find the direct quote.
There's huge concern that nVidia will repeat the same move as with the original "Super" series. With the H100 selling like hot cakes, there is no incentive to drop prices. This Super stuff is just testing the waters. A desperate push of greed.

I'm even somewhat sure that NV will artificially split classes and add as much artificial segmentation as they can, for as long as they can. Who can stop them? Same goes for AMD.

This is the result of people buying into this hostile pricing pattern. People agreed to pay more for a newer generation that used to replace the previous one at the same price. Yes, newer nodes and R&D are expensive. But they have always been expensive; these companies have always poured millions, if not billions, into development.

And one interesting thing to note: when everything becomes more expensive (R&D, materials, logistics, etc.), it should eat into profits and margins, not increase them by 60-70%-plus. No?

If it were really that expensive, Nvidia wouldn't have become a trillion-dollar company. And AMD went MCM for GPUs to make production cheaper; it should have lowered the cost. They even held conferences on how this would make GPUs more affordable. Instead they just hiked prices, because they can.

The saddest thing is that in many countries sellers price AMD products the same as Nvidia ones, despite the former being much cheaper than the latter.

"Welcome to RTX family"
Evildead666I think Nvidia and AMD should have a word with game devs.
Their top tier cards aren't able to play most recent games at decent framerates.
That's more of a Dev problem, and it's making their top tier cards look like crap.

Every game wants to be the next 'Crysis'....
Yes. They should have, like one and a half decades ago. Another reason, beyond the bad drivers ATI/AMD has been accused of for ages, is "lack of game support".
People confused developer support with nVidia's backing, basically bribes behind the curtains, and took it for granted. But... instead of finding out what the problems were and demanding fairness, they joined the game.

Surely, development costs a huge amount of money, and optimising for lots of different HW is a pain. However, none of the GPU makers should ever have any influence on developers; that should be prohibited altogether. They shouldn't even be in contact, unless the software/game developer asks for assistance or reports a problem. Developers should make sure that games work equally well, and aren't hindered, for all vendors and their HW that support the technologies the games are based on.

I'm not a developer myself, but I still consider it a crime when developers can "prefer" one vendor over another, as this directly affects sales and brand recognition. It doesn't matter whether it's Nvidia, AMD, or Intel. Devs should instead use the universal technologies and instruction sets available in the game industry.
The only interest a vendor should have is for their HW to work as intended. The only time performance should vary is indeed due to the lack of certain features/instructions/technologies.
It's worth mentioning that it was AMD which first presented RT back in 2016, and it's AMD that pushed the tech that is called Vulkan and HBM nowadays.

And yes, game devs want their game to be another Crysis, because that grants them "incentives" for pushing GPU sales. Except this time they are utterly out of touch, and people won't run out to buy the latest and greatest, simply because they cannot. The udders are dry.
DenverPlease, show me an example of this graphical advance because I haven't seen anything impressive enough compared to what was around 4 years ago running smoothly on GPUs with 1/3 the performance of current high-end GPUs... You know well that they are bringing nothing to the table but broken games.
Yeah. Either they have reached their limits, or, considering the power and performance headroom of current hardware, the publishers have just gotten lazy and don't want devs to fix and optimize games. Everyone wants to be Google Chromium.
Posted on Reply