Sunday, May 14th 2023
NVIDIA RTX 4060 Ti 16GB Model Features 5W Higher TDP, Slightly Different ASIC Code
NVIDIA is launching 8 GB and 16 GB variants of its upcoming GeForce RTX 4060 Ti graphics card, with the 8 GB model debuting later this month, and the 16 GB model slated for July, as we learned in an older article. We are now learning what else sets the two apart. Both are based on the 5 nm "AD106" silicon, with 34 out of the 36 SM physically present on the silicon enabled, which works out to 4,352 out of 4,608 CUDA cores. While the 8 GB model has the ASIC code "AD106-350," the 16 GB model gets the ASIC code "AD106-351."
The 16 GB model of the RTX 4060 Ti also has a slightly higher TDP, rated at 165 W, compared to 160 W of the 8 GB model. This is the TDP of the silicon, and not the TGP (typical graphics power), which takes into account power drawn by the entire board. The 16 GB model is sure to have a higher TGP on account of its higher-density memory. NVIDIA is likely to use four 32 Gbit (4 GB) GDDR6 memory chips to achieve 16 GB (as opposed to eight 16 Gbit ones with two chips piggybacked per 32-bit path).
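Both memory layouts arrive at the same 16 GB over the chip's 128-bit memory bus. A minimal sketch of the arithmetic, mirroring the two configurations described above (function and variable names are illustrative):

```python
# Two ways to reach 16 GB on a 128-bit GDDR6 bus, per the article:
# high-density chips vs. clamshell (two chips piggybacked per 32-bit path).

BUS_WIDTH_BITS = 128
CHANNEL_BITS = 32                             # each GDDR6 chip sits on a 32-bit path
channels = BUS_WIDTH_BITS // CHANNEL_BITS     # 4 channels

def capacity_gb(chip_gbit: int, chips_per_channel: int) -> float:
    """Total VRAM in GB for a given chip density and chips per 32-bit path."""
    total_gbit = chip_gbit * chips_per_channel * channels
    return total_gbit / 8  # 8 Gbit per GB

# Option 1: four 32 Gbit (4 GB) chips, one per channel
print(capacity_gb(32, 1))  # -> 16.0

# Option 2 (clamshell): eight 16 Gbit chips, two piggybacked per channel
print(capacity_gb(16, 2))  # -> 16.0
```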
Source:
VideoCardz
59 Comments on NVIDIA RTX 4060 Ti 16GB Model Features 5W Higher TDP, Slightly Different ASIC Code
Since the 3070 is already at around 450€, I really want the 4060 Ti to be better than the 3070, but at the same time I don't want to set very high expectations.
Regarding Cyberpunk: Understandable, playing an atrocious game at an atrocious frame rate just isn't a good thing :laugh:
Real talk though, this is likely due to the crazy blur (even with motion blur setting off) from the imperfect path tracing and motion imperfections due to lower than intended frame rate added to the general engine jankiness, it's definitely not going to be a great experience.
Path traced the game runs at 15 FPS :D And then considering the visual upgrade, that is in fact more of a sidegrade, I nearly fell off my chair laughing at the ridiculousness and pointlessness of it. Seriously.
But then the things that truly matter... the game isn't fun. I genuinely don't care how it runs anymore, I played it on a GTX 1080... at 50-55FPS. With FSR.
It's a neat tech demo to me, with a decent story and an otherwise entirely forgettable experience. I don't even feel the urge to play it again with a new GPU that runs it ten times better and adds RT, go figure. The 4060 Ti has a stronger core, that's for damn sure, so with 8GB it'll be starved, there is no doubt whatsoever.
Of course it damn well matters that the 3070 would have been better with more VRAM. That is precisely the damn point that is being made wrt your average Nvidia release. Again: are you going to keep parroting the marketing story, or can we just concede on the fact Nvidia is anal probing us all? This is no you-vs-me debate... It's us vs them.
I agree the sensationalist Youtuber tone of voice is annoying as fuck, but the facts just don't lie, and the 16GB-endowed 3070 shows the facts.
People need to stop the denial, the proof is everywhere, even if that doesn't directly affect your personal use case - in the same way I tweak some settings to get Cyberpunk playable. But when people start saying 'lower quality textures do look better' to somehow defend the absence of VRAM... wow, just wow. It's of the same category of selective blindness as someone up here stating a 4070 Ti is unfit for 4K. It is absolute nonsense.
It reminds me of Al Gore's 'Inconvenient Truth'. Look where we are today wrt climate. We humans are exceptionally good at denial if it doesn't fit our agenda. Recognize. I'm using a rhetorical sledgehammer to keep reminding people here.
That's why I bought the 3090 (and would have bought the Titan RTX even earlier had it been available in my country at all), actually. The 24 GB lets me use and abuse high-resolution assets, even in the most unoptimized formats and engines you can imagine.
I'm an avid fan of Bethesda's RPGs, I promise you haven't seen a game chug VRAM until you run Fallout 4 or 76 with a proper UHD texture pack :laugh:
I don't care which card runs higher presets, I care about which card offers higher image quality, and none of his testing shows us which is which. That is assuming games on ultra textures use 4K texture resolutions, which they do not. Even TLOU uses a combination of 256 and 512. All I'm saying is, until DLSS + high textures vs FSR + ultra textures is actually tested, I can't say one card is better than the other because of the VRAM.
I'm curious why AMD, Intel and Nvidia didn't just double all their VRAM this generation. 8GB has been a problem for something approaching a year now, and in the professional space, even 12-24GB has been crippling for "mainstream" users trying to GPU-accelerate things that used to run in 64-128GB of system RAM. We've never bought a 48GB Quadro RTX 8000. I demoed one and it was great, but the markup Nvidia put on it was so high that it was vastly cheaper for us to just farm those jobs out to Amazon/Deadline in the cloud. You'd need to be using your RTX 8000 24/7 on high-value projects to justify it
Sadly DramExchange only lists pricing for 1 GB chips, although the average price for 1 GB (8 Gbit) of GDDR6 on there is listed at US$3.4, so 8 GB would be just over $27, and 16 GB, if we assume the same cost per gigabyte, would be around $54. Keep in mind that these are spot prices; contract prices are negotiated months ahead of any production and can as such be both higher and lower.
www.dramexchange.com/
The actual cost at the fab that makes the memory ICs, I really don't know, but again, it's hardly going to be twice the price.
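Working the quoted numbers out explicitly - a back-of-the-envelope sketch, where the $3.4-per-GB figure is the DRAMeXchange spot price mentioned above and actual contract pricing may differ in either direction:

```python
# Rough VRAM bill-of-materials estimate from GDDR6 spot pricing.
# Assumes a flat price per GB, which is a simplification: higher-density
# chips do not necessarily cost the same per gigabyte.

SPOT_PRICE_PER_GB = 3.4  # USD per 1 GB (8 Gbit) GDDR6 chip, spot price

def vram_cost(capacity_gb: int) -> float:
    """Estimated memory cost in USD for a given VRAM capacity."""
    return capacity_gb * SPOT_PRICE_PER_GB

print(f"8 GB:  ${vram_cost(8):.2f}")   # -> 8 GB:  $27.20
print(f"16 GB: ${vram_cost(16):.2f}")  # -> 16 GB: $54.40
```

At spot prices the 16 GB option adds roughly $27 of memory cost over the 8 GB one - far less than doubling the card's price.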
The cycle will continue until the problem is fixed, one way or another, and I highly doubt that game devs are going to take two steps backwards just to accommodate GPU manufacturers being cheap.
The problem hasn't changed and won't change; your argument only exists because Nvidia is stingy on VRAM, not because AMD has 'lower RT perf' nor because FSR differs from DLSS. The two are not the same thing and never will be. Especially if you consider that the higher RT perf will also eat a chunk of VRAM. You might end up having to choose between those sweet sweet High textures and RT, again because of VRAM constraints. Your comparison makes no sense. Another easy way to figure that out is by turning the argument around: what if high + DLSS actually looks worse than ultra? Still picking that 'better image quality' now... or are you actually left with no choice but to compromise after all?
And on top of that, you're fully reliant on Nvidia's DLSS support to get that supposed higher image quality in the odd title. If this isn't grasping at straws on your end, I don't know what is. You're comparing a per-game proprietary implementation quality with the presence of hardware chips to make any kind of graphics happen. Da. Fuq. It takes AMD one copy/paste iteration of FSR to get that done. That's a driver update. Did you download additional RAM yet? As much as you're still in denial that the world is actually moving on wrt VRAM ;) After all the majority market has 8GB you say. But then you keep forgetting the console market.
You can comfort yourself that you're not the only one in denial. See above subject.
The PS4 was the era when 8GB was enough. It has ended, and PC GPUs then suffered a pandemic and crypto, several times over, while at the end of that cycle we noticed raster perf is actually stalling. All those factors weigh in on the fact that 8GB is history. To get gains, yes, you will see more stuff moved to VRAM. It's the whole reason consoles have that special memory subsystem and it's (part of) the whole reason AMD is also moving to chiplets for GPUs. This train has been coming for a decade, and it has now arrived.
Did anyone REALLY think the consoles were given 16GB of GDDR to let it sit idle or to keep 2GB unused because of the poor PC market? If you ever did, you clearly don't understand consoles at all.
My case is settled, so not really a choice I have to make. Going out stupid on VRAM will only damn the industry, because like I said, the vast majority of gamers don't really have more than 8GB, and that won't change anytime soon, especially with stupid prices like we have now.
I don't live in a bubble and think everyone is buying $600, 32GB VRAM cards. PC will not survive on the small number of sales from the few deep-pocketed gamers. And no, the majority is not slowing down progress; they are the ones that keep the industry alive, not the infinitesimal minority.
But nobody is going out 'stupid on VRAM', that is a figment of your imagination, and that is what I'm calling out.
What will really happen to the PC market though is that the baseline of 8GB will slowly die off and get replaced with 12, or 16. You don't need deep pockets to buy a 12 or 16GB GPU. They're readily available and they are not expensive. And commerce also wants you to buy them - both publishers and camps red&green. You will get those killer apps where you can't do without and then you'll empty that wallet, or you've just switched off from gaming entirely and you would have regardless.
Just because Nvidia chooses to adopt a market strategy of progressive planned obsolescence over the last three generations of cards, and 'because they have the larger market share' (news flash: they don't, on x86 gaming as a whole), that does not mean they get to decide what the bottom line of mainstream happens to be. Just as they don't dictate when 'the industry adopts RT' - as we can clearly see. The industry is ready when it's ready, and when it moves, it moves. Despite what Nvidia loves to tell us, they're not dictating shit. Consoles dictate where gaming goes.
History repeats...
Here's PC gaming over the years. Pandemic, war, inflation, crypto... irrelevant! It is a growth market. It's almost as certain as the revenue of bread or milk. Even the 'pandemic effect' - major growth in 2020 - didn't get corrected to this day - people started gaming and they keep doing it. Sorry about the weird thing over the title, I had to grab the screen fast or it would get covered in popups. Link below
www.statista.com/statistics/274806/pc-gaming-software-sales-revenue-worldwide/
All tech companies are down, sales are down, shares are down, profits are down, layoffs, cutting production, etc... only in your dreams would you say things are going well.
My argument exists because Nvidia is stingy on VRAM and AMD is stingy on RT and upscaling. If high textures + DLSS looks worse than FSR + ultra textures, then obviously AMD cards are the better choice. That's why I want someone to test this first before concluding one way or another.
Until FSR matches DLSS my point stands. After all, Ampere cards are already approaching the 3-year-old mark. If the AMD user has to wait 4-5 years for his midrange card to match or beat the Nvidia card in image quality... then I'd call that a bad purchase.
Of course it comes down to preferences after all, but personally I can't stand FSR, because I dislike sharpening with a passion and FSR adds a ton of it. I don't use FSR even on my 6700S laptop, while I'm always using DLSS on my 4090 desktop, because it is that good.
As for things going 'good' - there is a big gap between PC gaming dying and 'good'. What I'm saying is that despite all the crap we've endured, it remains a growth market - even despite all the shit you too describe. Tech down... gaming up. And that explains why I have confidence the baseline will be pushed up and you will be forced to move up from 8GB. 8GB has been live in the midrange since 2016. It's 2023. Let's leave it there, because on that point we can indeed agree.
People buy software, it just remains to be seen what; if hardware sales are down, it's not mega hyper super AAA releases that demand 120GB of VRAM, that's for sure
arstechnica.com/gaming/2023/04/pc-gaming-market-is-set-to-grow-again-after-pandemic-and-overstock-corrections/
But consider for a moment that usually the very same people do applaud the RT push, the new tech, etc. ;) They already blinked.
The shitty games will not sell and will blame the weather and stuff. In this case, that's most AAA games these days.