Friday, June 19th 2020

Possible NVIDIA GeForce RTX 3090, RTX 3080, and "TITAN Ampere" Specs Surface

Alleged specifications of NVIDIA's upcoming GeForce RTX 3090, RTX 3080, and next-generation TITAN graphics cards, based on the "Ampere" graphics architecture, surfaced in tweets by KatCorgi, mirroring an early-June tweet by kopite7kimi; both sources have a high hit-rate on NVIDIA rumors. All three SKUs are said to be based on the 7 nm "GA102" silicon, but with varying memory and core configurations, targeting three vastly different price points. The RTX 3080 succeeds the current RTX 2080/Super and allegedly features 4,352 CUDA cores, along with a 320-bit GDDR6X memory interface with its memory ticking at 19 Gbps.

The RTX 3090 is heir apparent to the RTX 2080 Ti, and is endowed with 5,248 CUDA cores and 12 GB of GDDR6X memory across a 384-bit wide memory bus clocked at 21 Gbps. The king of the hill is the TITAN Ampere, succeeding the TITAN RTX. It probably maxes out the GA102 ASIC with 5,376 CUDA cores, and offers double the memory amount of the RTX 3090, at 24 GB, but at a lower memory clock speed of 17 Gbps. NVIDIA is expected to announce these cards in September 2020.
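For a quick sanity check, the peak memory bandwidth implied by these figures follows directly from bus width and per-pin data rate: bandwidth in GB/s equals bus width in bits, divided by 8, multiplied by the data rate in Gbps. A minimal Python sketch, using only the unconfirmed numbers from the rumor above:

```python
# Peak memory bandwidth implied by the rumored (unconfirmed) specs.
# bandwidth (GB/s) = bus width (bits) / 8 * data rate (Gbps per pin)
rumored_skus = {
    "RTX 3080":     (320, 19),  # 320-bit bus, 19 Gbps GDDR6X
    "RTX 3090":     (384, 21),  # 384-bit bus, 21 Gbps GDDR6X
    "TITAN Ampere": (384, 17),  # 384-bit bus, 17 Gbps GDDR6X
}

for name, (bus_bits, gbps) in rumored_skus.items():
    print(f"{name}: {bus_bits / 8 * gbps:.0f} GB/s")

# RTX 3080: 760 GB/s
# RTX 3090: 1008 GB/s
# TITAN Ampere: 816 GB/s
```

If accurate, even the slowest of the three would comfortably exceed the RTX 2080 Ti's 616 GB/s (352-bit at 14 Gbps).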
Sources: KatCorgi (Twitter), VideoCardz

58 Comments on Possible NVIDIA GeForce RTX 3090, RTX 3080, and "TITAN Ampere" Specs Surface

#26
efikkan
If Ampere features "GDDR6X", we should see these memory chips launching sometime before the graphics cards.

I will not speculate on precise clock speeds or SM counts; I believe this is a major new architecture, with totally rebalanced and changed caches, so it will be hard to predict its performance characteristics.

But I do think memory bandwidth will be a deciding factor in how well these new cards scale. I saw some of the latest speculation about Navi 21 featuring a 448-bit memory bus at 21 Gbps… It will be interesting to see how well that could be implemented.
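For reference, the same back-of-the-envelope formula applied to that rumored Navi 21 configuration (448-bit at 21 Gbps, pure speculation at this point) gives:

```python
# Rumored Navi 21 memory config from the speculation above (unconfirmed).
bus_bits, gbps = 448, 21
print(f"Navi 21: {bus_bits / 8 * gbps:.0f} GB/s")  # 1176 GB/s
```

That would exceed the 1008 GB/s implied by the rumored RTX 3090 configuration in the article.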
Posted on Reply
#27
Dante Uchiha
cuemanOh, huge hardware scores, a lot. When we remember Ampere is 7 nm, clocks should be much higher than usual... I'd say easily add 250 MHz more base clock than usual, and then a lot more overclocking capacity on top.
Also, it seems NVIDIA is leaving a lot of cores in the bank, nearly 3,000! The maximum core counts are over 8,000, yes, you read that right.


Big Navi... I've seen it again... ergh, and it can't get even near the RTX 3090. What dream world do some people live in?

AMD can't currently beat even the 12 nm RTX 2070 Super OC GPU, and then it still must beat its four bigger brothers... the legendary Big Navi has plenty of work ahead of it.

Also, we all KNOW for a fact that Big Navi is still a 7 nm GPU with nothing special or new about it... remember this, I bet Big Navi will be a water-cooled GPU... I still can't believe it, these days.
Sure, I'm sure it should beat the now-old 16 nm GTX 1080 Ti OC GPU... hmm, it must... but remember to check its TDP and efficiency.


Well, 'my' leak is small, and I warn AMD fans: don't expect anything it CAN'T be; the RTX 3070 is already too much for it.


Anyway, competition is always good; you will see both in September and October, NVIDIA and AMD.

By the way, I'm also waiting for Intel's high-end GPU... exciting...

Great weekend, all!
If you're not a bot, you should try to organize your ideas better. Expecting any competition from Intel in the GPU market is, at the very least, misinformed. AMD and Nvidia hold tons of patents on everything GPU-related, 3D rendering, etc.

@topic: 4 GB should be enough for low-end GPUs, but the 4K+ market will require considerable VRAM upgrades.
Posted on Reply
#28
ppn
If there is GDDR6X, we have to wait for the real GDDR7, not this half-baked imitation.
Posted on Reply
#29
FordGT90Concept
"I go fast!1!11!1!"
If they really do come out the door with an RTX 3090, NVIDIA is concerned about Sienna.
Posted on Reply
#30
ObiFrost
P4-630www.techpowerup.com/forums/threads/amd-recommending-graphics-cards-with-6gb-vram-or-more-for-aaa-games.268158/
Yeah, but it's coming from AMD, not Nvidia. Even the 5500 XT had 4 GB. I'm not purposefully defending 4 GB, as games will continue to climb in minimum requirements alongside bigger VRAM capacities, but for 1080p 60 FPS, 4 GB will remain a standard till Hopper/RDNA3 or RTX 5000/RDNA4. The simple reason is that most low-end users, whom the 4 GB cards are aimed at, won't really stand a chance at acquiring 6 GB or 8 GB midrange cards (sure, we could say the 5500 XT is low-end with 8 GB, but other specs bottleneck its performance), and unless AMD and Nvidia produce a 1650 Super-like card with 6 GB VRAM, I don't see a viable reason for 4 GB to vanish yet.
Posted on Reply
#31
cueman
The king of the hill
Posted on Reply
#32
chris.london
I hoped for something better.
The switch from 12/14 nm to 7 nm should allow twice the number of CUDA cores at the same clock speed and TDP, and roughly the same size. 20% extra doesn't seem like much. If this is true, we will get significantly smaller chips than last gen, or maybe nVidia used the extra space for more RT cores. I am not sure how I feel about that, when we haven't even reached 4K at 120 Hz. Even 3440x1440 at 120 Hz can't be done at max settings in most modern games on an RTX 2080 Ti.
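As a rough illustration of the density argument, here is a sketch only: the ~25 and ~65.6 MTr/mm² density figures are the ones ppn quotes later in this thread, and the TU102 transistor count (~18.6 billion on a ~754 mm² die) is an assumption taken from public Turing specs, not from this leak.

```python
# Back-of-the-envelope die-shrink estimate. The density figures are the
# ones ppn quotes later in this thread; the TU102 numbers are assumed
# from public specs, not part of this leak.
density_14nm = 25.0           # ~25 MTr/mm^2 (12/14 nm class)
density_7nm = 65.6            # ~65.6 MTr/mm^2 (7 nm)
tu102_mtransistors = 18_600   # ~18.6 billion transistors on ~754 mm^2

area_now = tu102_mtransistors / density_14nm  # ~744 mm^2, close to the real ~754
area_7nm = tu102_mtransistors / density_7nm   # ~284 mm^2
print(f"TU102-class design at 12/14 nm: ~{area_now:.0f} mm^2")
print(f"Same transistor budget at 7 nm:  ~{area_7nm:.0f} mm^2")
print(f"Density ratio: ~{density_7nm / density_14nm:.1f}x")  # ~2.6x
```

On paper, a same-size 7 nm die could hold roughly 2.6x the transistors, which is why a ~20% core-count increase reads as conservative; in practice, power delivery, analog and SRAM scaling, and yields eat into that headroom.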
Posted on Reply
#33
EarthDog
chris.londonI hoped for something better.
The switch from 12/14 nm to 7 nm should allow twice the number of CUDA cores at the same clock speed and TDP, and roughly the same size. 20% extra doesn't seem like much. If this is true, we will get significantly smaller chips than last gen, or maybe nVidia used the extra space for more RT cores. I am not sure how I feel about that, when we haven't even reached 4K at 120 Hz. Even 3440x1440 at 120 Hz can't be done at max settings in most modern games on an RTX 2080 Ti.
There's also 'IPC' improvements to be had with the move to a new architecture.

4K 120 Hz? There are literally only a handful of those monitors around, and they are cost-prohibitive to most in the first place. I think more people would rather see a performance improvement in RT (which everyone can use) than see 4K/120 come to fruition in this generation.
Posted on Reply
#34
Vayra86
ppnIf there is GDDR6X, we have to wait for the real GDDR7, not this half-baked imitation.
Nah man more X is always more better.
Posted on Reply
#35
AusWolf
EarthDogThere's also 'IPC' improvements to be had with the move to a new architecture.

4K 120 Hz? There are literally only a handful of those monitors around, and they are cost-prohibitive to most in the first place. I think more people would rather see a performance improvement in RT (which everyone can use) than see 4K/120 come to fruition in this generation.
Who needs 4K/120 Hz anyway? Even 120 Hz alone, at any resolution, is something only a handful of gamers will ever need. I personally think the advantage over 60 Hz is just placebo.
Vayra86Nah man more X is always more better.
That's why Intel processors are not as interesting as they used to be. They're still stuck with K, while AMD switched to X years ago.
Posted on Reply
#36
ymbaja
neatfeatguyIf the 3070 gives 2080 Ti performance and doesn't cost over $600, I may actually upgrade from my 980 Ti... I'm thinking I won't be that lucky, on the price, that is.
It's a nice thought, but never gonna happen. The only thing that will beat a 2080 Ti in their new lineup is a 3080 Ti. It's the Nvidia way...
Posted on Reply
#37
ARF
Dante UchihaIf you're not a bot, you should try to organize your ideas better. Expecting any competition from Intel in the GPU market is, at the very least, misinformed. AMD and Nvidia hold tons of patents on everything GPU-related, 3D rendering, etc.

@topic: 4 GB should be enough for low-end GPUs, but the 4K+ market will require considerable VRAM upgrades.
Intel can pay royalties to use the aforementioned patents; that's never an obstacle. Also, Intel has such a large workforce that they can literally develop their own solutions and still be very competitive. It hasn't been their priority, though.

Also, Intel can cross-license that IP with AMD, which they have already done.
Posted on Reply
#38
chris.london
EarthDogThere's also 'IPC' improvements to be had with the move to a new architecture.

4K 120 Hz? There are literally only a handful of those monitors around, and they are cost-prohibitive to most in the first place. I think more people would rather see a performance improvement in RT (which everyone can use) than see 4K/120 come to fruition in this generation.
Yes, and there will be some clock speed improvements as well, but even if we can get to 120 Hz at 4K in current-gen titles, it may not be enough next year (or in CP 2077 - edit: or in any Assassin's Creed game).

As for who needs 4K at 120 Hz - plenty of TVs support it, and 27" 4K monitor prices are getting lower and lower too (looking at the EVE Spectrum - I paid a similar amount for a 27" 1440p 60 Hz IPS monitor 9 or 10 years ago). I don't think it is outrageous to think that in 2020, someone who pays $1,000+ for a graphics card will want to pair it with a high-refresh-rate 4K or ultrawide monitor. Why else would you spend so much on a graphics card?
Posted on Reply
#39
ARF
chris.londonYes, and there will be some clock speed improvements as well, but even if we can get to 120 Hz at 4K in current-gen titles, it may not be enough next year (or in CP 2077 - edit: or in any Assassin's Creed game).

As for who needs 4K at 120 Hz - plenty of TVs support it, and 27" 4K monitor prices are getting lower and lower too (looking at the EVE Spectrum - I paid a similar amount for a 27" 1440p 60 Hz IPS monitor 9 or 10 years ago). I don't think it is outrageous to think that in 2020, someone who pays $1,000+ for a graphics card will want to pair it with a high-refresh-rate 4K or ultrawide monitor. Why else would you spend so much on a graphics card?
Unfortunately, 4K monitor prices are not getting lower; they have held the same level for dozens of months. The cheapest 4K monitor is around 230 euros, while the cheapest 1080p monitor is around 60-70 euros. This is extremely aggressive price dumping by the 1080p monitor manufacturers, and it effectively distorts the whole market and stalls innovation and progress.
Posted on Reply
#40
ZoneDymo
MetroidI still think high-end GPUs like the 3080 and up should have a minimum of 16 GB of memory. If this is true and it's 10 GB, I'm disappointed.
I just wish HBM was more of a thing, maybe in some sort of combo arrangement to keep costs down, but with games etc. then being programmed with the faster memory in mind.
Posted on Reply
#41
nikoya
ppnSave your wallet for 5 nm. As amazing as this is, I mean the jump from 14 nm at 25 MTr/mm² to 7 nm at 65.6 MTr/mm² shrinks the 2080 Ti under 300 mm². FFS, it is just the beginning. And you don't want to be unprepared when it happens. Yeah, there will be a 3080, and a 3080 Super, and then a 4080; the right time to buy is with the 4080 Super, or even the 5080, or 3-4 years from now. For now, you can take the 3070 or something under $399 if it performs like a 2080 Ti.
Problem is, humanity hasn't invented cryogenic sleep yet... so what are you going to do during those 4 years? ;)

4K TVs have been invading our living rooms for quite a while now, with no juice to feed them.
Posted on Reply
#42
bug
chris.londonI hoped for something better.
The switch from 12/14 nm to 7 nm should allow twice the number of CUDA cores at the same clock speed and TDP, and roughly the same size. 20% extra doesn't seem like much. If this is true, we will get significantly smaller chips than last gen, or maybe nVidia used the extra space for more RT cores. I am not sure how I feel about that, when we haven't even reached 4K at 120 Hz. Even 3440x1440 at 120 Hz can't be done at max settings in most modern games on an RTX 2080 Ti.
You probably forgot Turing dies are at the size limit of what 12 nm manufacturing allows. That's really not where you want to be, if you can help it ;)
Posted on Reply
#43
Decryptor009
AusWolfWho needs 4K/120 Hz anyway? Even 120 Hz alone, at any resolution, is something only a handful of gamers will ever need. I personally think the advantage over 60 Hz is just placebo.


That's why Intel processors are not as interesting as they used to be. They're still stuck with K, while AMD switched to X years ago.
You would actually have to be blind not to notice a difference between 60 and 120 Hz.
Posted on Reply
#44
moproblems99
AusWolfI personally think the advantage over 60 Hz is just placebo.
Have you spent any serious time with 120+ Hz? I used to say the same thing until I bought one. As for 165+ Hz, I'm not really worried about that.
Posted on Reply
#45
Minus Infinity
BoboOOZSo much speculation going around these days. It feels like AMD and Nvidia themselves are participating in the misinformation.

Anyways, these are some of the least credible specs that I've seen.
This.

The competition for next gen is going to be so fierce that neither one wants to play its hand, and I have no doubt both are leaking fake specs. Even the decision of who will release first will be causing great angst at AMD and Nvidia.
Posted on Reply
#46
MxPhenom 216
ASIC Engineer
AusWolfWho needs 4K/120 Hz anyway? Even 120 Hz alone, at any resolution, is something only a handful of gamers will ever need. I personally think the advantage over 60 Hz is just placebo.


That's why Intel processors are not as interesting as they used to be. They're still stuck with K, while AMD switched to X years ago.
:laugh: You better sit this one out.
Posted on Reply
#47
jeremyshaw
ARFIntel can pay royalties to use the aforementioned patents; that's never an obstacle. Also, Intel has such a large workforce that they can literally develop their own solutions and still be very competitive. It hasn't been their priority, though.

Also, Intel can cross-license that IP with AMD, which they have already done.
Just like anyone can just pay royalties for x86... wait! They can pay for lawyers as Intel sues the living daylights out of any company silly enough to even consider that.
Posted on Reply
#48
1d10t
What about Ti, SUPER, and Ti SUPER?
Posted on Reply
#50
Vayra86
1d10tWhat about Ti, SUPER and Ti SUPER ?
Super only existed to tell us Turing went from shit to not all too shit.

I doubt they'll want to repeat that.

But yes, this 'leak' is full of nonsense. I didn't even bother much to dive into it, actually; when I read 3090, my mind goes elsewhere. Also, a Titan Ampere on a hard launch along with the rest? Total BS, makes zero sense.

What WILL happen is a 3080 launch on a GA104 die, or something along those lines. Nvidia will NEVER launch the biggest SKU first in this gen; it's a new node, so yields will probably not even allow it.
TranceHead2016 calling
ark.intel.com/content/www/us/en/ark/products/94456/intel-core-i7-6950x-processor-extreme-edition-25m-cache-up-to-3-50-ghz.html
Was gonna type that yesterday, yeah... There is ALWAYS a product with an X. Always.
Aceman.auAs someone who is finally looking to upgrade after 5-6 years, the amount of blue-balling going on here is really getting on my nerves. Just release the friggin' specs and an expected shipping date. Absolutely ridiculous, this whole charade.
Stop following the news, wait for cards on shelves, read reviews, and buy a card when availability is good.

It's never good to anxiously await new GPUs, and it's much more comfortable to buy mid-gen. You avoid early adopter woes, drivers are solid, and prices are normalized. This applies to... almost all purchases ever ;)
Posted on Reply