Thursday, April 22nd 2021

NVIDIA GeForce RTX 3080 Ti GA102-225 GPU Pictured and Detailed

The launch of NVIDIA's upcoming GeForce RTX 3080 Ti graphics card is upon us. Rumors circulating the web are multiplying, and we have just received die pictures of the GA102 silicon along with the specifications of this particular SKU. Sources over at VideoCardz have provided the website with the first die picture of the GA102-225 silicon, which powers the NVIDIA GeForce RTX 3080 Ti graphics card. Pictured below, it doesn't appear much different from the GA102-300 SKU found inside the RTX 3090, with the only obvious differentiator being the SKU ID. The real differences are under the hood: the GA102-225 SKU carries 10240 CUDA cores instead of the 10752 CUDA cores found inside the GA102-300 of the RTX 3090.

Paired with 12 GB of GDDR6X memory on a 384-bit bus, the memory will run at around 19 Gbps. That results in a bandwidth of 912 GB/s. If you are wondering about the performance of the card, it should remain within a few percent of its bigger brother, the RTX 3090. We also have a first leak showing Ethereum mining performance: the GA102-225 silicon achieved a mining hash rate of 118.9 MH/s with some tuning. The memory was overclocked to 21.5 Gbps, while the GPU TDP was limited to 278 Watts. The leak shows that the card managed a 1365 MHz base and 1665 MHz boost frequency. While we don't have the exact launch date, the supposed MSRP will be anywhere from $999 to $1099, assuming you can get it at all at any price.
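The figures above can be sanity-checked with a back-of-the-envelope calculation (purely illustrative, not an official spec sheet):

```python
# Back-of-the-envelope check of the figures quoted above.
bus_width_bits = 384
data_rate_gbps = 19            # per-pin data rate for GDDR6X, Gbps

# Theoretical bandwidth = bus width in bytes x per-pin data rate
bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
print(bandwidth_gb_s)          # 912.0 GB/s, matching the article

# Mining efficiency implied by the leak: 118.9 MH/s at a 278 W cap
print(round(118.9 / 278, 3))   # ~0.428 MH/s per watt
```

The 48 bytes per transfer (384 bits / 8) times 19 GT/s is where the 912 GB/s headline number comes from.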
Source: VideoCardz

62 Comments on NVIDIA GeForce RTX 3080 Ti GA102-225 GPU Pictured and Detailed

#27
BArms
This could be a good thing for supply, if a shortage of GDDR6X is what prevents them from shipping more 3090s. Instead of selling more 3080s, they could have calculated that they can make an extra $300 per card with a new SKU that has half the RAM of the 3090, while simultaneously shipping more cards in total. So I guess, optimistically, this could be a win-win, assuming one is in the market for a >$1000 card.
Posted on Reply
#28
ComedicHistorian
moproblems99: I'll take either.
Hilarious!
gdallsk: Availability:
Wait wait wait.....nobody said anything about it coming with cookies. This pandemic has taken me from an A cup to a solid B - MAYBE even C - cup so why tf not throw in some dessert with my overpriced graphics card?
Posted on Reply
#29
Atnevon
Maybe we should take ads on Fox News to advertise crypto to the crowds of easily swayed old folks. It'd be the chinpokomon effect, and then it won't be cool or profitable to mine anymore! The monopoly money scheme will fall flat for what it is, the miners will tire of grandma wanting tech support for her "nintendo money box-a-machine", and then we'll have a market back to normal.

EDIT: call it.....Matlock coin!
Posted on Reply
#30
ComedicHistorian
Atnevon: Maybe we should take ads on Fox News to advertise crypto to the crowds of easily swayed old folks. It'd be the chinpokomon effect, and then it won't be cool or profitable to mine anymore! The monopoly money scheme will fall flat for what it is, the miners will tire of grandma wanting tech support for her "nintendo money box-a-machine", and then we'll have a market back to normal.

EDIT: call it.....Matlock coin!
YES. And have Tom Selleck hawk some reverse mortgage scheme involving crypto. They all trust Selleck, every last one of 'em
Posted on Reply
#31
watzupken
Have a good look at them before they are snatched up by scalpers. :laugh:
Posted on Reply
#32
Bwaze
Good thing we'll still be able to buy a picture of a graphics card on eBay.
Posted on Reply
#33
N3M3515
Great.......another freaking $1k gpu...:roll:
Posted on Reply
#34
evernessince
N3M3515: Great.......another freaking $1k gpu...:roll:
Yep, even if this pandemic weren't around, GPUs would still be overpriced.

AMD and Nvidia are less competing and more pricing around each other without disrupting profit margins.
Posted on Reply
#35
owen10578
So this will perform almost exactly like an RTX 3090, considering the nearly identical core count and the same 384-bit bus. Just with half the memory.
Posted on Reply
#36
watzupken
N3M3515: Great.......another freaking $1k gpu...:roll:
It's $1k on paper. Consider that the RTX 3080 is $699 on paper, but its actual price is currently around $900 or more, depending on the model. So I am not surprised if this ends up replacing the RTX 3090's price bracket in the current situation.
Posted on Reply
#37
sepheronx
watzupken: It's $1k on paper. Consider that the RTX 3080 is $699 on paper, but its actual price is currently around $900 or more, depending on the model. So I am not surprised if this ends up replacing the RTX 3090's price bracket in the current situation.
Where is it $900 USD for a 3080? I've seen about $1500-$2200 USD for a 3080.

Unless you mean what MSRP is....
Posted on Reply
#38
Radi_SVK
moproblems99: The problem is that two 6800xts draw a lot more power than one of these.
This... and also 2 x 6800 XT is definitely nowhere near the $1000 range :D
Also, looking at it that way, the 60-70 MH/s that the guy stated for the 6800 XT doesn't look so good anymore either... when the RTX 3060 does 50 MH/s and the RTX 3060 Ti crosses 60 MH/s.
Posted on Reply
#39
las
lynx29: 118 megahash doesn't seem all that great imo for a 3080 Ti level card. I mean I think I read 6800 XTs get like 60 or 70... and they are like 1/3 to 1/4 the price of what this will be scalped for.
Ampere easily beats RDNA2 in mining, especially when tweaked. Go look up calculators if you want proof.
owen10578: So this will perform almost exactly like an RTX 3090, considering the nearly identical core count and the same 384-bit bus. Just with half the memory.
Probably, since 24 GB of VRAM is overkill for gaming.

The problem with the 3080 Ti is that its price will be higher than the 3090's was on release - so it's pointless. The 3090 came out 8 months ago, and I bet it will be another 2-4 months until the 3080 Ti can be bought, so people pretty much waited a full extra year to buy a lesser version of the 3090 for a higher price.

By Sep 2022, the 4000 series launches, and hopefully GPU availability is back to normal by then.
Posted on Reply
#40
64K
They will show up on sites like eBay for between $2,500 and $3,000, and some people will buy them anyway.
Posted on Reply
#41
JcRabbit
las: Probably, since 24 GB of VRAM is overkill for gaming.
Directly, yes, but not if you are anything like me, with a PC running 24/7 and tens of Firefox windows and tabs open simultaneously (yes, those people do exist lol) with GPU hardware acceleration enabled.

In such a case (especially if you spend a lot of time on Youtube) VRAM usage from the browser alone can reach as high as 8 GB - on a 12 GB card that would leave only 4 GB available for games. Definitely not enough if you are running at 4K.

Of course, you could always exit Firefox before running a game to reclaim all that memory, but with 24GB VRAM you really don't need to - convenience wins!
Posted on Reply
#42
Radi_SVK
I don't think there are enough facepalms left for this... Let's have a $2500 GPU so we don't need to close those tabs... but still go on and express our anxiety about VRAM never being quite enough.
Posted on Reply
#43
JcRabbit
Radi_SVK: I don't think there are enough facepalms left for this... Let's have a $2500 GPU so we don't need to close those tabs...
That is actually only ONE of the reasons, obviously. But facepalm away! :)
Posted on Reply
#44
Radi_SVK
JcRabbit: That is actually only ONE of the reasons, obviously. But facepalm away! :)
...and facepalm away I will... oh, don't forget, you can splash another $2500 on a second 3090, SLI them, and kill your anxiety once and for all!
Posted on Reply
#45
las
JcRabbit: Directly, yes, but not if you are anything like me, with a PC running 24/7 and tens of Firefox windows and tabs open simultaneously (yes, those people do exist lol) with GPU hardware acceleration enabled.

In such a case (especially if you spend a lot of time on Youtube) VRAM usage from the browser alone can reach as high as 8 GB - on a 12 GB card that would leave only 4 GB available for games. Definitely not enough if you are running at 4K.

Of course, you could always exit Firefox before running a game to reclaim all that memory, but with 24GB VRAM you really don't need to - convenience wins!
I use Chrome and have 25+, often 50+, and sometimes 100+ tabs open with GPU hw accel enabled, and I'm not using anywhere near that. It makes no sense; your VRAM usage is allocation, not the required amount.

More VRAM, more usage (higher allocation) - this is nothing new. You would be able to do the exact same thing on an RTX 2060 with 6GB.
Posted on Reply
#46
64K
las: More VRAM, more usage (higher allocation) - this is nothing new. You would be able to do the exact same thing on an RTX 2060 with 6GB.
This confusion is common, especially among gamers. As you said, VRAM needed isn't the same as VRAM allocated.

Gamers will definitely know when they actually need more VRAM, because the GPU will start using system RAM, which is much slower than VRAM.
Posted on Reply
#47
JcRabbit
las: I use Chrome and have 25+, often 50+, and sometimes 100+ tabs open with GPU hw accel enabled, and I'm not using anywhere near that. It makes no sense; your VRAM usage is allocation, not the required amount.
Tell that to Mozilla then. :)

Actually, I think the problem is related to a combination of Firefox and YouTube: it looks like a memory leak of some kind, but in VRAM. It's not just the number of tabs you have open, but also the amount of time you have Firefox running while browsing the Internet and continuously watching tons of YouTube videos.

As I mentioned, my PC is on 24/7 and I only tend to reboot it whenever installing an update - so that can mean a month of runtime or more, for instance. Leaks eventually pile up - current VRAM usage is 4455 MB but I have seen it eventually go as high as 8 GB.

First time I noticed the issue was when playing Wolfenstein II The New Colossus (a game that likes to load a ton of textures onto VRAM, IIRC it could actually use up to 8-9GB?) on my old 2080 Ti (with 'only' 11GB VRAM): I launched the game and it was running slow as molasses, but actual GPU usage as measured by Afterburner on my secondary monitor was very low. That's when I noticed that *PCIe bus usage* was peaking to 100% (normally it's negligible) and that VRAM usage was maxed out. Basically the game could not load all the textures it needed into VRAM and so it was swapping them out 'on the fly'. Exiting the game showed why: Firefox was using nearly all of the available video memory.

To prevent other users from hurting themselves while face palming (eheh), the MAJOR reason I upgraded from a 2080 TI to a 3090 was actually the HDMI 2.1 support. This, together with a 48" LG CX OLED meant I could FINALLY experience my games at 4K 120Hz HDR with full chroma sampling, all at the same time. I had been waiting for HDMI 2.1 support for a very long time, as up until then I was limited to 60 Hz on my LG 43" 4K non-HDR IPS monitor.

The 3090 was also faster than a 3080 (I do like to run games with ray tracing enabled, when available) and the huge amount of VRAM was (to me) a big bonus given the above. The fact that I got my 3090 for basically MSRP, thus much less than people are paying for a 3080 these days, makes this a win-win, sorry. :)
las: More VRAM, more usage (higher allocation) - this is nothing new. You would be able to do the exact same thing on an RTX 2060 with 6GB.
Not sure what led you to say something like this? Games won't normally allocate more than they actually require, and Firefox usage will increase over time because this is likely a memory leak (not sure if memory fragmentation can occur on VRAM, but it's also a possibility).
Posted on Reply
#48
las
64K: This confusion is common, especially among gamers. As you said, VRAM needed isn't the same as VRAM allocated.

Gamers will definitely know when they actually need more VRAM, because the GPU will start using system RAM, which is much slower than VRAM.
For sure, when you run out of VRAM you will know. Very noticeable stutter will occur, with very low fps dips, often to 0 and back up.

The last time I personally experienced this was in Bad Company 2, with a GTX 570 I think it was. 1.25 GB of VRAM maxed out.
JcRabbit: Not sure what led you to say something like this? Games won't normally allocate more than they actually require, and Firefox usage will increase over time because this is likely a memory leak (not sure if memory fragmentation can occur on VRAM, but it's also a possibility).
Tons of game engines allocate all VRAM (or 80-90%). COD games usually do, for example.

Generally you can't really trust VRAM usage numbers; they don't tell you much. If you are not stuttering, you have enough.
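The allocation-versus-usage distinction being debated here can be illustrated with a toy model (purely illustrative Python; no real engine or monitoring tool works this way, and the 90% reservation figure just echoes the "80-90%" remark above):

```python
class ToyEngine:
    """Toy model of allocation vs. usage: the engine reserves most
    of VRAM up front, but only part of that reservation is ever
    actually needed by the game."""

    def __init__(self, total_vram_mb):
        # Monitoring overlays report this reservation as "VRAM used"
        self.reserved_mb = int(total_vram_mb * 0.9)
        self.touched_mb = 0  # what the game genuinely requires

    def load_textures(self, mb):
        # Only now does the game actually need the memory
        self.touched_mb = min(self.reserved_mb, self.touched_mb + mb)

engine = ToyEngine(12_288)   # hypothetical 12 GB card
engine.load_textures(5_000)
print(engine.reserved_mb)    # 11059 -> looks nearly full in an overlay
print(engine.touched_mb)     # 5000  -> the real requirement
```

The overlay number (11059 MB) vastly overstates what the game would stutter without (5000 MB), which is why "no stutter = enough VRAM" is a more reliable test than any usage readout.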
Posted on Reply
#49
JcRabbit
las: Tons of game engines allocate all VRAM (or 80-90%). COD games usually do, for example.

Generally you can't really trust VRAM usage numbers; they don't tell you much. If you are not stuttering, you have enough.
Ah, sure, VRAM allocated by a game is not the same as actual VRAM usage, just the 'maximum' the game thinks it might need - I understand that. That's not what is happening with Firefox though, as it allocates VRAM on an 'as needed' basis.

So far I haven't seen any game max out VRAM usage on my 3090 though, even with Firefox gobbling up tons of it. :) Games cannot simply allocate ALL of the VRAM to themselves, as the Windows DWM also uses it for desktop composition (plus modern browsers), etc.
Posted on Reply
#50
las
JcRabbit: Ah, sure, VRAM allocated by a game is not the same as actual VRAM usage, just the 'maximum' the game thinks it might need - I understand that. That's not what is happening with Firefox though, as it allocates VRAM on an 'as needed' basis.

So far I haven't seen any game max out VRAM usage on my 3090 though, even with Firefox gobbling up tons of it. :) Games cannot simply allocate ALL of the VRAM to themselves, as the Windows DWM also uses it for desktop composition (plus modern browsers), etc.
Yeah, memory leak maybe :D
Posted on Reply