
NVIDIA's Next-Gen Big GPU AD102 Features 18,432 Shaders

Honestly, I don't think they care one iota what the end price and availability are in the market. Currently they can sell everything they can produce and are still posting record revenues quarter after quarter. And the more different SKUs they make, the more new cards they can sell...

Sure, from the consumer's POV the current market sucks big time. But that's more because of the absence of true new-gen mid-range cards than because of the pricier high end (I don't count the 3060 Ti as mid-range). Ancient Polaris and the lackluster GTX 16-series Turings just won't cut it anymore. Sure, you can find good deals on first-gen Navi, but those aren't really mid-range either, and I think they're being EOL'd anyway to make room for other 7nm products. So what the market really needs is a new $200-$250 mid-range champ, a perfect 1080p and capable 1440p card, like the RX 480 or GTX 1060 were at launch.

I have that dream too :(
 
Apple will free up the 5nm production lines around the end of H2 2021, plus time to adapt to the big cores, plus time to get the GPUs into ready-to-market graphics cards... autumn 2022, or a little earlier?
 
That is a first for NVIDIA, methinks! The RTX 3000 cards aren't good enough in efficiency and cost, and are still absent from the shelves, yet they felt forced to intentionally leak critical details of their next arch.
 
Honestly, I don't think they care one iota what the end price and availability are in the market. Currently they can sell everything they can produce and are still posting record revenues quarter after quarter. And the more different SKUs they make, the more new cards they can sell...

Sure, from the consumer's POV the current market sucks big time. But that's more because of the absence of true new-gen mid-range cards than because of the pricier high end (I don't count the 3060 Ti as mid-range). Ancient Polaris and the lackluster GTX 16-series Turings just won't cut it anymore. Sure, you can find good deals on first-gen Navi, but those aren't really mid-range either, and I think they're being EOL'd anyway to make room for other 7nm products. So what the market really needs is a new $200-$250 mid-range champ, a perfect 1080p and capable 1440p card, like the RX 480 or GTX 1060 were at launch.

I don't agree. NVIDIA, like any other company, decides on a plan, and that plan says you have to sell a minimum number of units at a certain price (MSRP) to make enough profit and close the year positively.
If the market is so distorted that you can see GPUs on sale at double their MSRP, things are not good for NVIDIA/AMD, for at least two reasons:
1) Those prices are what the final customer pays, due to scalping; whether a custom GPU reaches the customer at MSRP or at double, it's the same for NVIDIA/AMD (see the quick sketch below).

2) If volumes are so low that such a price distortion can happen, I can bet whatever you want that AMD and NVIDIA are doing whatever they can to maximize production and reach break-even. I think that almost three months after the presentation they aren't even near that.
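To put rough numbers on point 1 (a minimal sketch; the unit counts and prices are made up purely for illustration, not real figures):

```python
# Illustration of point 1: whatever the street price does, the GPU vendor
# only books its own selling price per unit. All numbers are invented.

units = 100_000
vendor_price = 500.0   # hypothetical price NVIDIA/AMD collects per GPU sold into the channel

for street_price in (700.0, 1_400.0):   # card at MSRP vs. scalped at double MSRP
    vendor_revenue = units * vendor_price                        # unchanged by the street price
    channel_and_scalper_cut = units * (street_price - vendor_price)
    print(f"street ${street_price:,.0f}: vendor books ${vendor_revenue:,.0f}, "
          f"middlemen capture ${channel_and_scalper_cut:,.0f}")
```

Whether the card ends up at MSRP or at double, the vendor's line stays the same; only the middlemen's cut changes.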
 
Dumpster-tier rumor. While there is no question NVIDIA is working on the next gen, and has probably been working on it since before the Ampere launch, things like shader count, SMs, clock speed, VRAM, etc. probably aren't even known to the engineers and designers themselves yet, as they are probably still testing things out and figuring it out. It takes anywhere from 6 to 18 months to finalize a GPU design.

Plus, Ampere cards started selling barely two months ago, they aren't even in stock, and the mid- and low-tier cards haven't even been released yet. So any new generation would wait at least a good year!
 
And AMD will quadruple the performance, and none of this matters as we can't buy any of these damn cards anyway.
 
This is WccfTech-level article material.
And the name for the GPU following Ada will be Linda (Lovelace).
The source for this is, of course >> Deep Throat.
 
More and more it seems that Ampere in GeForce is just an opportunity not missed. How do you utilize 10k CUDA cores efficiently for gaming? They will learn how, then utilize 18k cores, make it work, and manufacture it on far better lithography.
 
But if the numbers are true, AMD doesn't have any chance to compete. With only up to +50% perf/watt for RDNA3, it will be a joke to compare with NVIDIA's progress.

The fact that you think this will scale 100% is a joke in itself. NVIDIA will likely get a 50-60% performance bump over Ampere. RDNA3 is not only rumored to carry the same efficiency gain as RDNA2, it is also rumored to be a large architectural upgrade, not just more shaders: a big upgrade to the geometry engine and so on. NVIDIA was rumored to go MCM after Ampere; the fact that they had to delay it is not a great sign if RDNA3 is such a large leap and could possibly be MCM.
 
We will see the next flagship battle in the 8K arena. Only middle-class graphics cards will be battling it out at 4K.
 
If RDNA3 is from the team that brought us RDNA1, then we should just wait for the even-numbered one.
 
The fact that you think this will scale 100% is a joke in itself. NVIDIA will likely get a 50-60% performance bump over Ampere. RDNA3 is not only rumored to carry the same efficiency gain as RDNA2, it is also rumored to be a large architectural upgrade, not just more shaders: a big upgrade to the geometry engine and so on. NVIDIA was rumored to go MCM after Ampere; the fact that they had to delay it is not a great sign if RDNA3 is such a large leap and could possibly be MCM.

This is weird. Ampere has approximately 70% more performance than Turing; if the next gen has 70% more cores, that's more like 50% more performance, since core counts never scale linearly. So a smaller jump than from Turing to Ampere.
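A back-of-the-envelope version of that argument (a rough sketch only: 10,752 is the full GA102 shader count, 18,432 is the rumored AD102 count from the article, and the 0.8 scaling exponent is purely an assumed stand-in for sub-linear scaling, not a measured value):

```python
# Rough estimate of how a ~70% core-count increase might translate into performance,
# assuming cores scale sub-linearly (memory bandwidth, scheduling, etc. eat some of it).

GA102_SHADERS = 10_752   # full GA102 (Ampere) shader count
AD102_SHADERS = 18_432   # rumored full AD102 shader count

core_ratio = AD102_SHADERS / GA102_SHADERS   # ~1.71x the cores

SCALING_EXPONENT = 0.8   # assumed: 1.0 would be perfect scaling
est_perf_ratio = core_ratio ** SCALING_EXPONENT   # ~1.54x

print(f"~{core_ratio - 1:.0%} more cores -> roughly ~{est_perf_ratio - 1:.0%} more performance")
```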
 
This is weird. Ampere has approximately 70% more performance than Turing, so why is the 50% from this news any good?
Is it always possible to push +50% IPC improvements from gen to gen?
 
Is it always possible to push +50% IPC improvements from gen to gen?

As I understand it, they aren't talking about IPC but just an outright increase in cores.
 
As I understand it, they aren't talking about IPC but just an outright increase in cores.
Ah, sorry, I completely misunderstood.
 
Is this WccfTech? Last I checked, 99% of all RTX 3080 buyers still cannot order their graphics cards. Not to mention 5% of those buyers will die from Covid before even getting their hands on one. Yet someone finds time to speculate that the RTX 4080 is going to debut in 1H 2021 -- to what end?

What's unsettling is not that AMD's cards are doing well (you still can't buy either), but that the only people who need next-gen cards are those strictly playing at true 4K 120 Hz+ (or higher) with an additional monitor. Both the green and red teams will hit 100+ FPS at 1440p with high graphics settings and not even struggle. Where is the artificial demand? That is what we should be afraid of.
 
Why stop at 18K, why not 24K or 48K? If they really roll out a new GPU in a few months, I can't wait to hear the justifications from Nvidiots paying through the nose for another GPU, or suffering reduced e-peen from not paying through the nose for the latest again.

Also, NVIDIA isn't going to TSMC; they are staying with Samsung. So unless Samsung improves their node, they are still going to be limited by the process more than the architecture again, and end up with an even more power-hungry chip which, if larger, will mean more defective dies and thus higher prices... Competition is glorious: AMD is held back by wafer supply, NVIDIA is held back by wafer supply, and we the consumers have to pay to play.

10 Gigarays did sound better to me; at least you could readily see the nonsense and laugh about it. With Ampere we never heard of those again, but shouldn't that be at least 19 Gigarays now? Wait, no... 38? (1.9x the perf and double the shader count, after all.)
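Taking the joke's arithmetic at face value (using only the figures quoted above, not any official spec):

```python
# Reproducing the post's napkin math for a hypothetical new marketing number.
turing_gigarays = 10        # the old "10 Gigarays" Turing marketing figure
quoted_perf_gain = 1.9      # the ~1.9x uplift figure mentioned in the post
shader_count_factor = 2     # "double shader count"

print(turing_gigarays * quoted_perf_gain)                        # 19.0
print(turing_gigarays * quoted_perf_gain * shader_count_factor)  # 38.0
```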

That is a first for NVIDIA, methinks! The RTX 3000 cards aren't good enough in efficiency and cost, and are still absent from the shelves, yet they felt forced to intentionally leak critical details of their next arch.

This man gets it. I damn well knew it the moment I saw those die shots/sizes and the accompanying specs + TDP. NVIDIA hoped to sell a lot of pre-orders and pre-empt everything, because once the dust truly settles, this whole gen will be obsolete faster than Turing was. Inb4 the refresh with a supposedly better Samsung node underneath, or even a last-minute change to TSMC. All options are open at this time really, but it's clear NVIDIA will move one way or another. They already priced themselves back to Pascal levels, which speaks volumes.
 
Whether or not it is feasible to put that many shaders on a GPU a year from now, it seems perfectly logical to me that, since NVIDIA has found that beating AMD in the ray-tracing department hasn't generated the expected enthusiasm among consumers, for their next generation they will emphasize conventional muscle in things like shaders.
 