Friday, June 19th 2020
Possible NVIDIA GeForce RTX 3090, RTX 3080, and "TITAN Ampere" Specs Surface
Alleged specifications of NVIDIA's upcoming GeForce RTX 3090, RTX 3080, and next-generation TITAN graphics cards, based on the "Ampere" graphics architecture, surfaced in tweets by KatCorgi, mirroring an early-June tweet by kopite7kimi; both sources have a high hit-rate on NVIDIA rumors. All three SKUs are said to be based on the 7 nm "GA102" silicon, but with varying memory and core configurations targeting three vastly different price points. The RTX 3080 succeeds the current RTX 2080/Super and allegedly features 4,352 CUDA cores, along with a 320-bit GDDR6X memory interface with its memory ticking at 19 Gbps.
The RTX 3090 is heir apparent to the RTX 2080 Ti, and is endowed with 5,248 CUDA cores and 12 GB of GDDR6X memory across a 384-bit wide memory bus clocked at 21 Gbps. The king of the hill is the TITAN Ampere, succeeding the TITAN RTX. It probably maxes out the GA102 ASIC with 5,376 CUDA cores, and offers double the memory of the RTX 3090, at 24 GB, but at a lower memory clock speed of 17 Gbps. NVIDIA is expected to announce these cards in September 2020.
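As a quick sanity check on these rumored figures, peak memory bandwidth follows directly from bus width and per-pin data rate (GB/s = bus width in bits ÷ 8 × Gbps). A minimal sketch, using only the numbers claimed in the leak:

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s from bus width and per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Rumored configurations from the article
print(peak_bandwidth_gbs(320, 19))  # RTX 3080:      760.0 GB/s
print(peak_bandwidth_gbs(384, 21))  # RTX 3090:     1008.0 GB/s
print(peak_bandwidth_gbs(384, 17))  # TITAN Ampere:  816.0 GB/s
```

Note that under these rumored specs the TITAN's lower 17 Gbps data rate would leave it with less bandwidth than the RTX 3090, despite its larger memory pool.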
Sources:
KatCorgi (Twitter), VideoCardz
Comments
I won't speculate on precise clock speeds or SM counts. I believe this is a major new architecture, with totally rebalanced and changed caches, so its performance characteristics will be hard to predict.
But I do think memory bandwidth will be a deciding factor of how well these new cards scale. I saw some of the latest speculation about Navi 21 featuring 448 bit memory at 21 Gbps… It will be interesting to see how well that could be implemented.
@topic 4 GB should be enough for low-end GPUs, but the 4K+ market will require considerable VRAM upgrades.
The switch from 12/14 nm to 7 nm should allow twice the number of CUDA cores at the same clock speed and TDP, and roughly the same die size. 20% extra doesn't seem like much. If this is true, we will get significantly smaller chips than last gen, or maybe NVIDIA used the extra space for more RT cores. I am not sure how I feel about that when we haven't even reached 4K at 120 Hz. Even 3440x1440 at 120 Hz can't be done at max settings in most modern games on an RTX 2080 Ti.
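For context on the "20% extra" figure above, the rumored RTX 3090 core count can be compared against the RTX 2080 Ti directly (a quick sketch; the 2x node-shrink expectation is the commenter's own assumption, not an established fact):

```python
rtx_2080_ti_cores = 4352  # current-gen flagship (Turing)
rtx_3090_cores = 5248     # rumored GA102-based successor

# Relative increase in CUDA core count
increase = (rtx_3090_cores - rtx_2080_ti_cores) / rtx_2080_ti_cores
print(f"{increase:.1%}")  # about 20.6%, far short of a 2x doubling
```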
4K 120 Hz? There are literally only a handful of those monitors around, and they are cost-prohibitive to most people in the first place. I think more would rather see a performance improvement in RT (which everyone can use) than see 4K/120 come to fruition in this generation.
Also, Intel can cross-license that IP with AMD, which they have already done.
As for who needs 4k at 120Hz - plenty of tvs support it, 27” 4k monitor prices are getting lower and lower too (looking at the EVE Spectrum - I paid a similar amount for a 27” 1440p 60Hz IPS monitor 9 or 10 years ago). I don’t think it is outrageous to think that in 2020 someone who pays $1000+ for a graphics card will want to pair it with a high refresh rate 4k or ultrawide monitor. Why else would you spend so much on a graphics card?
4K TVs have been invading our living rooms for quite a while now, with no juice to feed them.
The competition next gen is going to be so fierce that neither one wants to show its hand, and I have no doubt both are leaking fake specs. Even the decision of who will release first will be causing great angst at AMD and NVIDIA.
ark.intel.com/content/www/us/en/ark/products/94456/intel-core-i7-6950x-processor-extreme-edition-25m-cache-up-to-3-50-ghz.html
I doubt they'll want to repeat that
But yes, this 'leak' is full of nonsense. I didn't even bother to dive into it much, actually; when I read "3090" my mind goes elsewhere. Also, a TITAN Ampere on a hard launch along with the rest? Total BS; makes zero sense.
What WILL happen is a 3080 launch on a GA104 die, or something along those lines. NVIDIA will NEVER launch the biggest SKU first this gen; it's a new node, so yields probably won't even allow it. Was gonna type that yesterday... yeah, there is ALWAYS a product with an X. Always. Stop following the news, wait for cards on shelves, read reviews, and buy a card when availability is good.
It's never good to anxiously await new GPUs; it's much more comfortable to buy mid-gen. You avoid early-adopter woes, drivers are solid, and prices have normalized. This applies to... almost all purchases ever ;)