Sunday, May 14th 2023

NVIDIA RTX 4060 Ti 16GB Model Features 5W Higher TDP, Slightly Different ASIC Code

NVIDIA is launching 8 GB and 16 GB variants of its upcoming GeForce RTX 4060 Ti graphics card, with the 8 GB model debuting later this month and the 16 GB model slated for July, as we learned in an older article. We are now learning what else sets the two apart. Both are based on the 5 nm "AD106" silicon, with 34 of the 36 SMs physically present on the chip enabled, which works out to 4,352 of the 4,608 CUDA cores. While the 8 GB model has the ASIC code "AD106-350," the 16 GB model gets the ASIC code "AD106-351."

The 16 GB model of the RTX 4060 Ti also has a slightly higher TDP, rated at 165 W, compared to 160 W for the 8 GB model. This is the TDP of the silicon, and not the TGP (typical graphics power), which takes into account power drawn by the entire board. The 16 GB model is sure to have a higher TGP on account of its higher-density memory. NVIDIA is likely to use four 32 Gbit (4 GB) GDDR6 memory chips to achieve 16 GB (as opposed to eight 16 Gbit ones with two chips piggybacked per 32-bit path).
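For reference, here is a minimal Python sketch of the arithmetic behind these figures. The 128 CUDA cores per SM is NVIDIA's published Ada Lovelace figure; the four-chip 32 Gbit layout is the speculation above, not a confirmed specification.

```python
# Rough sanity check of the figures above (not an official spec sheet).
CUDA_CORES_PER_SM = 128          # published Ada Lovelace figure

print(36 * CUDA_CORES_PER_SM)    # 4608 cores physically present on AD106
print(34 * CUDA_CORES_PER_SM)    # 4352 cores enabled on the RTX 4060 Ti

def memory_config(chips, gbit_per_chip, chips_per_32bit_path=1):
    """Return (capacity in GB, bus width in bits) for a GDDR6 layout."""
    capacity_gb = chips * gbit_per_chip / 8
    bus_width = (chips // chips_per_32bit_path) * 32
    return capacity_gb, bus_width

# Speculated layout: four 32 Gbit (4 GB) chips, one per 32-bit path
print(memory_config(4, 32))                          # (16.0, 128)

# Alternative: eight 16 Gbit chips, two piggybacked per 32-bit path (clamshell)
print(memory_config(8, 16, chips_per_32bit_path=2))  # (16.0, 128)
```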
Source: VideoCardz

59 Comments on NVIDIA RTX 4060 Ti 16GB Model Features 5W Higher TDP, Slightly Different ASIC Code

#26
tvshacker
Dr. DroThey've somewhat already done this, comparing the 3070 to its pro counterpart, if you missed it you might want to watch it:


It's quite the terrifying difference and I bet it's had a lot to do with people's sudden change of heart on 16 GB RAM + 8 GB GPU PCs in recent months.
Yes I saw it. But we're still not 100% sure where the 4060TI will sit on the stack.
Since the 3070 is already at around 450€, I really want the 4060TI to be better than the 3070, but at the same time I don't want to set very high expectations.
Posted on Reply
#27
Dr. Dro
tvshackerYes I saw it. But we're still not 100% sure where the 4060TI will sit on the stack.
Since the 3070 is already at around 450€, I really want the 4060TI to be better than the 3070, but at the same time I don't want to set very high expectations.
It will 100% undoubtedly suffer from the same problems the 3070 and 3070 Ti do. To a lesser degree, thanks to Ada's special sauce/highly optimized BVH traversal resolution, which saves some memory and a significant amount of memory bandwidth on top of the large on-die cache, but I do not think it's going to be enough to save these cards' proverbial bacon :laugh:
fevgatosHis videos lately are just BS. Chooses settings to hog the VRAM of the 3070, and then acts surprised it stutters like crazy. The actual question is, what is the image quality impact in those games if you drop textures to high instead of ultra? Not a lot I'd imagine, that's why he is not testing it. Won't generate as many clicks as just pooping on nvidia will.

Why isn't he for example testing amd vs nvidia on the new PT cyberpunk upgrade? I really wonder


Generally speaking, sure, but cyberpunk specifically, for some freaking reason 40 fps is a horrible experience to me. I've played games at 30 fps and it doesn't bother me that much, but 40 on cyberpunk feels atrocious.
I believe you missed the point, that the RTX 3070 would be a significantly more capable graphics card if it had the memory to deal with the workload it was presented with. That is not a flaw that the similarly performing RX 6750 XT and 6800 share, which makes them a generally better product for the same money, especially as the cards age and VRAM requirements rise. Which they are.

Regarding Cyberpunk: Understandable, playing an atrocious game at an atrocious frame rate just isn't a good thing :laugh:

Real talk though, this is likely due to the crazy blur (even with the motion blur setting off) from the imperfect path tracing, plus motion imperfections from the lower-than-intended frame rate, added to the general engine jankiness; it's definitely not going to be a great experience.
Posted on Reply
#28
Vayra86
fevgatosHow is it playable on a 7900xt? It averages 70-80 fps on my 4090 with dlss on.
I had indistinguishable IQ and 60+ FPS with FSR quality and some minor tweaks, compared to native maxed out. And I've been staring at several scenes for about a half hour swapping between options there. I was genuinely curious. The only caveat is that FSR will have its typical ghosting thingy in places.

Path traced the game runs at 15 FPS :D And then considering the visual upgrade, that is in fact more of a sidegrade, I nearly fell off my chair laughing at the ridiculousness and pointlessness of it. Seriously.

But then the things that truly matter... the game isn't fun. I genuinely don't care how it runs anymore, I played it on a GTX 1080... at 50-55FPS. With FSR.
It's a neat tech demo to me, with a decent story and an otherwise entirely forgettable experience. I don't even feel the urge to play it again with a new GPU that runs it ten times better and adds RT, go figure.
tvshackerYes I saw it. But we're still not 100% sure where the 4060TI will sit on the stack.
Since the 3070 is already at around 450€, I really want the 4060TI to be better than the 3070, but at the same time I don't want to set very high expectations.
The 4060ti has a stronger core, that is for damn sure, so with 8GB it'll be starved, there is no doubt whatsoever.
Posted on Reply
#29
fevgatos
Dr. DroIt will 100% undoubtedly suffer from the same problems the 3070 and 3070 Ti do. To a lesser degree, thanks to Ada's special sauce/highly optimized BVH traversal resolution, which saves some memory and a significant amount of memory bandwidth on top of the large on-die cache, but I do not think it's going to be enough to save these cards' proverbial bacon :laugh:



I believe you missed the point, that the RTX 3070 would be a significantly more capable graphics card if it had the memory to deal with the workload it was presented with. That is not a flaw that the similarly performing RX 6750 XT and 6800 share, which makes them a generally better product for the same money, especially as the cards age and VRAM requirements rise. Which they are.

Regarding Cyberpunk: Understandable, playing an atrocious game at an atrocious frame rate just isn't a good thing :laugh:

Real talk though, this is likely due to the crazy blur (even with the motion blur setting off) from the imperfect path tracing, plus motion imperfections from the lower-than-intended frame rate, added to the general engine jankiness; it's definitely not going to be a great experience.
Of course the 3070 would have been better with more vram, but does it matter? The question is, does the current 3070 offer better image quality than its competitor, the 6700xt? I'd argue probably, because dlss + high textures is better than fsr + ultra textures, at least in some of the games, hogwarts being the prime example.
Posted on Reply
#30
Vayra86
fevgatosOf course the 3070 would have been better with more vram, but does it matter? The question is, does the current 3070 offer better image quality than its competitor, the 6700xt? I'd argue probably, because dlss + high textures is better than fsr + ultra textures, at least in some of the games, hogwarts being the prime example.
Now you're just grasping at straws buddy, please don't continue, this is turning into sad pixel peeper DLSS/FSR comparison topic territory. I'm sure Hogwarts has fantastic unforgettable generic walls to look at.

Of course it damn well matters that the 3070 would have been better with more VRAM. That is precisely the damn point that is being made wrt your average Nvidia release. Again: are you going to keep parroting the marketing story, or can we just concede on the fact Nvidia is anal probing us all? This is no you vs me debate... It's us vs them.

I agree the sensationalist Youtuber tone of voice is annoying as fuck, but the facts just don't lie, and the 16GB-endowed 3070 shows the facts.
People need to stop the denial, the proof is everywhere, even if that doesn't directly affect your personal use case - in the same way I tweak some settings to get Cyberpunk playable. But when people start saying 'lower quality textures do look better' to somehow defend the absence of VRAM... wow, just wow. It's of the same category of selective blindness as someone up here stating a 4070ti is unfit for 4K. It is absolute nonsense.

It reminds me of Al Gore's 'An Inconvenient Truth'. Look where we are today wrt climate. We humans are exceptionally good at denial if it doesn't fit our agenda. Recognize. I'm using a rhetorical sledgehammer to keep reminding people here.
Posted on Reply
#31
Dr. Dro
fevgatosOf course the 3070 would have been better with more vram, but does it matter? The question is, does the current 3070 offer better image quality than its competitor, the 6700xt? I'd argue probably, because dlss + high textures is better than fsr + ultra textures, at least in some of the games, hogwarts being the prime example.
Sorry, I'm gonna have to disagree. Despite DLSS indeed being generally superior to FSR (especially with the DLSS 2.5 series DLLs installed), IMHO the single biggest improvement in image quality you can have on a game, ranking above even rendering resolution (thanks to excellent upscaling tech from both vendors) is high resolution assets and textures.

That's why I bought the 3090 (and would have bought the Titan RTX even earlier had it been available in my country at all), actually. The 24 GB lets me use and abuse high resolution assets, even in the most unoptimized formats and engines you can imagine.

I'm an avid fan of Bethesda's RPGs, I promise you haven't seen a game chug VRAM until you run Fallout 4 or 76 with a proper UHD texture pack :laugh:
Posted on Reply
#32
fevgatos
Vayra86Now you're just grasping at straws buddy, please don't continue, this is turning into sad pixel peeper DLSS/FSR comparison topic territory. I'm sure Hogwarts has fantastic unforgettable generic walls to look at.

Of course it damn well matters that the 3070 would have been better with more VRAM. That is precisely the damn point that is being made wrt your average Nvidia release. Again: are you going to keep parroting the marketing story, or can we just concede on the fact Nvidia is anal probing us all? This is no you vs me debate... It's us vs them.

I agree the sensationalist Youtuber tone of voice is annoying as fuck, but the facts just don't lie, and the 16GB endowed 3070 shows facts.
People need to stop the denial, the proof is everywhere, even if that doesn't directly affect your personal use case - in the same way I tweak some settings to get
The same argument can be used about textures. If you can't tell the difference between dlss and fsr, can you tell the difference between ultra and high textures? I've done some tests on hogwarts and it is the case that high textures + Dlss looks better than ultra + fsr. Don't know if it's the case with other games as well, but that's what hwunboxed should be testing imo.

I don't care which card runs higher presets, I care about which card offers higher image quality, and none of his testing shows us which is which.
Vayra86Cyberpunk playable. But when people start saying 'lower quality textures do look better' to somehow defend the absence of VRAM... wow, just wow. It's of the same category of selective blindness as someone up here stating a 4070ti is unfit for 4K. It is absolute nonsense.

It reminds me of Al Gore's 'An Inconvenient Truth'. Look where we are today wrt climate. We humans are exceptionally good at denial if it doesn't fit our agenda. Recognize. I'm using a rhetorical sledgehammer to keep reminding people here.
Dr. DroSorry, I'm gonna have to disagree. Despite DLSS indeed being generally superior to FSR (especially with the DLSS 2.5 series DLLs installed), IMHO the single biggest improvement in image quality you can have on a game, ranking above even rendering resolution (thanks to excellent upscaling tech from both vendors) is high resolution assets and textures.

That's why I bought the 3090 (and would have bought the Titan RTX even earlier had it been available in my country at all), actually. The 24 GB lets me use and abuse high resolution assets, even in the most unoptimized formats and engines you can imagine.

I'm an avid fan of Bethesda's RPGs, I promise you haven't seen a game chug VRAM until you run Fallout 4 or 76 with a proper UHD texture pack :laugh:
That is assuming games on ultra textures have 4k resolutions, which they do not. Even tlou uses a combination of 256 and 512. All I'm saying is, until dlss + high textures vs fsr + ultra textures is actually tested, I can't say one card is better than the other cause of the Vram.
Posted on Reply
#33
Chrispy_
How expensive is it to double-stack or double-density GDDR6? That's probably a question aimed @TheLostSwede

I'm curious why AMD, Intel and Nvidia didn't just double all their VRAM this generation. 8GB has been a problem for something approaching a year now, and in the professional space, the 12-24GB on "mainstream" cards has been crippling for people trying to GPU-accelerate things that used to run in 64-128GB of system RAM. We've never bought a 48GB Quadro RTX 8000. I demoed one and it was great, but the markup Nvidia put on it was so high that it was vastly cheaper for us to just farm those jobs out to Amazon/Deadline in the cloud. You'd need to be using your RTX 8000 24/7 on high-value projects to justify it.
Posted on Reply
#34
TheLostSwede
News Editor
Chrispy_How expensive is it to double-stack or double-density GDDR6? That's probably a question aimed @TheLostSwede

I'm curious why AMD, Intel and Nvidia didn't just double all their VRAM this generation. 8GB has been a problem for something approaching a year now, and in the professional space, the 12-24GB on "mainstream" cards has been crippling for people trying to GPU-accelerate things that used to run in 64-128GB of system RAM. We've never bought a 48GB Quadro RTX 8000. I demoed one and it was great, but the markup Nvidia put on it was so high that it was vastly cheaper for us to just farm those jobs out to Amazon/Deadline in the cloud. You'd need to be using your RTX 8000 24/7 on high-value projects to justify it.
Sorry, not a graphics card person, but twice the memory in any application is rarely twice the price.
Sadly, DRAMeXchange only lists pricing for 1 GB chips; the average price for 1 GB (8 Gbit) of GDDR6 there is listed at US$3.40, so 8 GB would be just over $27 and 16 GB, if we assume the same cost per chip, would be around $54 (quick math below). Keep in mind that these are spot prices, and contract prices are negotiated months ahead of any production and can as such be both higher and lower.
www.dramexchange.com/

The actual cost at the fab that makes the memory ICs, I really don't know, but again, it's hardly going to be twice the price.
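A quick back-of-the-envelope version of that estimate; the US$3.40 figure is the spot price quoted above, so treat the totals as ballpark numbers only.

```python
# Rough memory-cost estimate from the quoted spot price; contract pricing differs.
SPOT_PRICE_PER_GB = 3.40   # US$ per 1 GB (8 Gbit) GDDR6 package

for capacity_gb in (8, 16):
    cost = capacity_gb * SPOT_PRICE_PER_GB
    print(f"{capacity_gb} GB of GDDR6 ≈ ${cost:.2f}")
# 8 GB of GDDR6 ≈ $27.20  ("just over $27")
# 16 GB of GDDR6 ≈ $54.40 ("around $54")
```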
Posted on Reply
#35
Dr. Dro
Chrispy_How expensive is it to double-stack or double-density GDDR6? That's probably a question aimed @TheLostSwede

I'm curious why AMD, Intel and Nvidia didn't just double all their VRAM this generation. 8GB has been a problem for something approaching a year now, and in the professional space, the 12-24GB on "mainstream" cards has been crippling for people trying to GPU-accelerate things that used to run in 64-128GB of system RAM. We've never bought a 48GB Quadro RTX 8000. I demoed one and it was great, but the markup Nvidia put on it was so high that it was vastly cheaper for us to just farm those jobs out to Amazon/Deadline in the cloud. You'd need to be using your RTX 8000 24/7 on high-value projects to justify it.
It's pricier, but something customers would accept given recent price increases. However, they do not want to eat their own lunch, so to speak; they need to give people a reason to buy their workstation-grade hardware. Quite artificial reasons. Also, you don't want clamshell fast memory if it can be helped: see the RTX 3090's extreme VRAM energy consumption.
fevgatosThat is assuming games on ultra textures have 4k resolutions, which they do not. Even tlou uses a combination of 256 and 512. All I'm saying is, until dlss + high textures vs fsr + ultra textures is actually tested, I can't say one card is better than the other cause of the Vram.
Like I said, try some modded Bethesda RPGs with actually handmade, high resolution assets, you'll find that it's well beyond a midrange GPU's capabilities :)

Posted on Reply
#36
Bomby569
the forum is in a loop at this point with this subject.
Posted on Reply
#37
Chrispy_
Bomby569the forum is in a loop at this point with this subject.
Yes, because games need more VRAM and many new cards don't have enough VRAM.
The cycle will continue until the problem is fixed, one way or another, and I highly doubt that game devs are going to take two steps backwards just to accommodate GPU manufacturers being cheap.
Posted on Reply
#38
Bomby569
Chrispy_Yes, because games need more VRAM and many new cards don't have enough VRAM.
The cycle will continue until the problem is fixed, one way or another, and I highly doubt that game devs are going to take two steps backwards just to accommodate GPU manufacturers being cheap.
Considering that 80% or more of gamers don't have more than 8GB of vram, I would say they are just dumb by doing so. But these are the same geniuses that can't release a finished game to save their lives, so I guess it checks out.
Posted on Reply
#39
Vayra86
fevgatosThe same argument can be used about textures. If you can't tell the difference between dlss and fsr, can you tell the difference between ultra and high textures? I've done some tests on hogwarts and it is the case that high textures + Dlss looks better than ultra + fsr. Don't know if it's the case with other games as well, but that's what hwunboxed should be testing imo.

I don't care which card runs higher presets, I care about which card offers higher image quality, and none of his testing shows us which is which.



That is assuming games on ultra textures have 4k resolutions, which they do not. Even tlou uses a combination of 256 and 512. All I'm saying is, until dlss + high textures vs fsr + ultra textures is actually tested, I can't say one card is better than the other cause of the Vram.
You are totally glossing over the fact that even high textures might go out of reach for low VRAM GPUs, while they will still be in reach for higher VRAM GPUs. After all if the fidelity of said texturing is SO incredible, you're talking about big chunks of data. Case in point, because Hogwarts eats VRAM like candy.
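To put rough numbers on "big chunks of data", here is a minimal per-texture VRAM sketch; the 4 bytes/pixel uncompressed and 1 byte/pixel BC7 figures are the standard formats, the ~1.33x mip-chain overhead is the usual approximation, and real engines stream and pack assets differently.

```python
# Back-of-the-envelope VRAM cost of a single texture with a full mip chain.
MIP_OVERHEAD = 4 / 3   # a full mip chain adds roughly one third

def texture_mib(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel * MIP_OVERHEAD / 2**20

print(texture_mib(2048, 2048, 1))   # ~5.3 MiB  (2K, BC7-compressed)
print(texture_mib(4096, 4096, 1))   # ~21.3 MiB (4K, BC7-compressed)
print(texture_mib(4096, 4096, 4))   # ~85.3 MiB (4K, uncompressed RGBA8)
# A scene with a few hundred unique high-resolution materials lands in the GBs.
```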

The problem hasn't changed and won't change, your argument only exists because Nvidia is stingy on VRAM, not because AMD has 'lower RT perf' nor because FSR differs from DLSS. The two are not the same thing and never will be. Especially if you consider that the higher RT perf will also eat a chunk of VRAM. You might end up having to choose between those sweet sweet High textures and RT, again because of VRAM constraints. Your comparison makes no sense. Another easy way to figure that out is by turning the argument around: what if high + DLSS actually looks worse than ultra? Still picking that 'better image quality' now... or are you actually left with no choice but to compromise after all.

And on top of that, you're fully reliant on Nvidia's DLSS support to get that supposed higher image quality in the odd title. If this isn't grasping at straws on your end, I don't know what is. You're comparing a per-game proprietary implementation quality with the presence of hardware chips to make any kind of graphics happen. Da. Fuq. It takes AMD one copy/paste iteration of FSR to get that done. That's a driver update. Did you download additional RAM yet?
Bomby569the forum is in a loop at this point with this subject.
As much as you're still in denial that the world is actually moving on wrt VRAM ;) After all the majority market has 8GB you say. But then you keep forgetting the console market.
You can comfort yourself that you're not the only one in denial. See above subject.

The PS4 was the era when 8GB was enough. It has ended, and PC GPUs then suffered a pandemic and crypto, several times over, while at the end of that cycle we noticed raster perf is actually stalling. All those factors weigh in on the fact that 8GB is history. To get gains, yes, you will see more stuff moved to VRAM. It's the whole reason consoles have that special memory subsystem and it's (part of) the whole reason AMD is also moving to chiplets for GPUs. This train has been coming for a decade, and it has now arrived.

Did anyone REALLY think the consoles were given 16GB of GDDR to let it sit idle or to keep 2GB unused because of the poor PC market? If you ever did, you clearly don't understand consoles at all.
Posted on Reply
#40
Bomby569
Vayra86As much as you're still in denial that the world is actually moving on wrt VRAM ;) After all the majority market has 8GB you say. But then you keep forgetting the console market.
You can comfort yourself that you're not the only one in denial. See above subject.
I've been lowering settings since before you were born, probably, and will do it again. Like someone said before, I don't care about ultra, or ever cared about it; I doubt anyone can tell the difference in gameplay, and I can't and won't be examining power lines in a game or the pimple on an NPC's face.

My case is settled, so not really a choice I have to make. Going out stupid on VRAM will only damn the industry, because like I said, the overwhelming majority of gamers don't really have more than 8GB and that won't change anytime soon, especially with stupid prices like we have now.

I don't live in a bubble and think everyone is buying $600, 32GB VRAM cards. PC will not survive on the small number of sales from the few deep-pocketed gamers. And no, the majority is not slowing down progress; they are the ones that keep the industry alive, not the infinitesimal minority.
Posted on Reply
#41
Vayra86
Bomby569I've been lowering settings since before you were born, probably, and will do it again. Like someone said before, I don't care about ultra, or ever cared about it; I doubt anyone can tell the difference in gameplay, and I can't and won't be examining power lines in a game or the pimple on an NPC's face.

My case is settled, so not really a choice I have to make. Going out stupid on VRAM will only damn the industry, because like I said, the overwhelming majority of gamers don't really have more than 8GB and that won't change anytime soon, especially with stupid prices like we have now.

I don't live in a bubble and think everyone is buying $600, 32GB VRAM cards. PC will not survive on the small number of sales from the few deep-pocketed gamers. And no, the majority is not slowing down progress; they are the ones that keep the industry alive, not the infinitesimal minority.
I think we fully agree on those points, honestly.
But nobody is going out 'stupid on VRAM', that is a figment of your imagination, and thát is what I'm calling out.
What will really happen to the PC market though is that the baseline of 8GB will slowly die off and get replaced with 12, or 16. You don't need deep pockets to buy a 12 or 16GB GPU. They're readily available and they are not expensive. And commerce also wants you to buy them - both publishers and camps red&green. You will get those killer apps where you can't do without and then you'll empty that wallet, or you've just switched off from gaming entirely and you would have regardless.

Just because Nvidia chooses to adopt a market strategy of progressive planned obsolescence over the last three generations of cards, and 'because they have the larger market share' (news flash: they don't, on x86 gaming as a whole), that does not mean they get to decide what the bottomline of mainstream happens to be. Just as they don't dictate when 'the industry adopts RT' - as we can clearly see. The industry is ready when its ready, and when it moves, it moves. Despite what Nvidia loves to tell us, they're not dictating shit. Consoles dictate where gaming goes.

History repeats...

Here's PC gaming over the years. Pandemic, War, Inflation, crypto... irrelevant! It is a growth market. It's almost as certain as the revenue of bread or milk. Even the 'pandemic effect' - major growth in 2020 - didn't get corrected to this day - people started gaming and they keep doing it. Sorry about the weird thing over the title, I had to grab the screen fast or it would get covered in popups. Link below


www.statista.com/statistics/274806/pc-gaming-software-sales-revenue-worldwide/
Posted on Reply
#42
Speedyblupi
ArkzYeah pretty sure the biggest package GDDR6 chips come in is 16Gb. Dunno where OP pulled 32Gb from... AFAIK the only 32Gb ones are the GDDR6W Samsung made, but I'm not even aware of any consumer card having them.
GDDR6W is also 64-bit per package instead of 32-bit like normal GDDR6 and GDDR6X, so it doesn't increase the amount of VRAM you can install for a given bus width. All 3 types are only up to 4GB per 64-bit of bus width, unless you use a clamshell topology.
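A tiny sketch of that capacity-per-bus-width point, assuming the package specs mentioned in this thread (16 Gbit per 32-bit GDDR6/GDDR6X package, 32 Gbit per 64-bit GDDR6W package):

```python
# Max VRAM per 64 bits of bus width for the package types discussed above.
def gb_per_64bit(package_gbit, package_bus_bits, clamshell=False):
    gb_per_package = package_gbit / 8
    packages = (64 // package_bus_bits) * (2 if clamshell else 1)
    return gb_per_package * packages

print(gb_per_64bit(16, 32))                  # GDDR6/GDDR6X:    4.0 GB
print(gb_per_64bit(32, 64))                  # GDDR6W:          4.0 GB
print(gb_per_64bit(16, 32, clamshell=True))  # GDDR6 clamshell: 8.0 GB
```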
Posted on Reply
#43
64K
Dr. DroContrary to popular belief, 80fps at native settings isn't necessary for it to be considered playable :laugh:

I'm of the "60 fps or go home" school of thought, but a lot of people may be happy with 40ish in games like that. FSR would sacrifice some image quality to keep that on the upper end of that range, closer to 60 really.
I want 60 FPS in shooters. I will be alright with a little less in other genres.
Posted on Reply
#44
Bomby569
Vayra86I think we fully agree on those points, honestly.
But nobody is going out 'stupid on VRAM', that is a figment of your imagination, and thát is what I'm calling out.
What will really happen to the PC market though is that the baseline of 8GB will slowly die off and get replaced with 12, or 16. You don't need deep pockets to buy a 12 or 16GB GPU. They're readily available and they are not expensive. And commerce also wants you to buy them - both publishers and camps red&green. You will get those killer apps where you can't do without and then you'll empty that wallet, or you've just switched off from gaming entirely and you would have regardless.

Just because Nvidia chooses to adopt a market strategy of progressive planned obsolescence over the last three generations of cards, and 'because they have the larger market share' (news flash: they don't, on x86 gaming as a whole), that does not mean they get to decide what the bottomline of mainstream happens to be. Just as they don't dictate when 'the industry adopts RT' - as we can clearly see. The industry is ready when its ready, and when it moves, it moves. Despite what Nvidia loves to tell us, they're not dictating shit. Consoles dictate where gaming goes.

History repeats...

Here's PC gaming over the years. Pandemic, War, Inflation, crypto... irrelevant! It is a growth market. It's almost as certain as the revenue of bread or milk. Sorry about the weird thing over the title, I had to grab the screen fast or it would get covered in popups. Link below

www.statista.com/statistics/274806/pc-gaming-software-sales-revenue-worldwide/
AMD is just about to release a new 8GB card, this fixation with nvidia is idiotic at this point, a case for your psychologist to diagnose

All tech companies are down, sales are down, shares are down, profits are down, lay offs, cutting production, etc... only in your dreams would you say things are going good.
Posted on Reply
#45
fevgatos
Vayra86You are totally glossing over the fact that even high textures might go out of reach for low VRAM GPUs, while they will still be in reach for higher VRAM GPUs. After all if the fidelity of said texturing is SO incredible, you're talking about big chunks of data. Case in point, because Hogwarts eats VRAM like candy.

The problem hasn't changed and won't change, your argument only exists because Nvidia is stingy on VRAM, not because AMD has 'lower RT perf' nor because FSR differs from DLSS. The two are not the same thing and never will be. Especially if you consider that the higher RT perf will also eat a chunk of VRAM. You might end up having to choose between those sweet sweet High textures and RT, again because of VRAM constraints. Your comparison makes no sense. Another easy way to figure that out is by turning the argument around: what if high + DLSS actually looks worse than ultra? Still picking that 'better image quality' now... or are you actually left with no choice but to compromise after all.

And on top of that, you're fully reliant on Nvidia's DLSS support to get that supposed higher image quality in the odd title. If this isn't grasping at straws on your end, I don't know what is. You're comparing a per-game proprietary implementation quality with the presence of hardware chips to make any kind of graphics happen. Da. Fuq. It takes AMD one copy/paste iteration of FSR to get that done. That's a driver update. Did you download additional RAM yet?
If high textures go out of reach on an 8gb 3070 or 3060ti then ultra will probably go out of reach for the 10 / 12 gb 6700 and 6700xt as well. Up until now it hasn't happened. I know cause I have a 3060ti, it works great on hogwarts for example with textures at high.

My argument exists because nvidia is stingy on vram and amd is stingy on rt and upscaling. If high textures + Dlss looks worse than fsr + ultra textures then obviously amd cards are the better choice. That's why I want someone to test this first before concluding one way or another.

Until fsr matches dlss my point stands. After all ampere cards are already approaching the 3 year old mark. If the amd user has to wait 4-5 years for his midrange card to match or beat the Nvidia card in image quality... then I'd call that a bad purchase.

Of course it comes down to preferences after all, but personally I can't stand FSR, cause I dislike sharpening with a passion and fsr adds a ton of it. I don't use fsr even on my 6700s laptop, while I'm always using dlss on my 4090 desktop, cause it is that good.
Posted on Reply
#46
Vayra86
Bomby569AMD is just about to release a new 8GB card, this fixation with nvidia is idiotic at this point, a case for your psychologist to diagnose

All tech companies are down, sales are down, shares are down, profits are down, lay offs, cutting production, etc... only in your dreams would you say things are going good.
Yes, AMD will too, and its price will make or break it - that is clearly the sentiment. I agree on that. 8GB is fine, it just depends where on the ladder it sits. And I reckon there is a good chance AMD will also price it too high. The fixation with 'Nvidia' is focused on pretty much the 4060ti and up.

As for things going 'good' - there is a big gap between PC gaming dying and 'good'. What I'm saying is that despite all the crap we've endured, it remains a growth market - even despite all the shit you too describe. Tech down... gaming up. And that explains why I have confidence the baseline will be pushed up and you will be forced to move up from 8GB. 8GB has been live in the midrange since 2016. It's 2023.
fevgatosOf course it comes down to preferences
Let's leave it there, because on that point we can indeed agree.
Posted on Reply
#47
Bomby569
Vayra86Yes, AMD will too, and its price will make or break it - that is clearly the sentiment. I agree on that. 8GB is fine, it just depends where on the ladder it sits. And I reckon there is a good chance AMD will also price it too high. The fixation with 'Nvidia' is focused on pretty much the 4060ti and up.

As for things going 'good' - there is a big gap between PC gaming dying and 'good'. What I'm saying is that despite all the crap we've endured, it remains a growth market - even despite all the shit you too describe. Tech down... gaming up. And that explains why I have confidence the baseline will be pushed up and you will be forced to move up from 8GB. 8GB has been live in the midrange since 2016. It's 2023.


Let's leave it there, because on that point we can indeed agree.
pc gaming is a lot of things: it is csgo, pubg, dota, sims 4, microtransactions - a lot of things. I doubt this is a broad discussion when we are talking about the 8GB VRAM issue. Your statement is more 'vram is fine, people will just buy new gpus', so coming with a general pc gaming revenue chart is disingenuous at best. Specifically, and that's what matters, gpu sales are down, ssd sales are down, chip production is being cut down, motherboard sales are down, etc.. etc... etc...

people buy software, it just remains to be seen what; if hardware sales are down, it's not mega hyper super AAA releases that demand 120GB of vram, that's for sure
Posted on Reply
#48
64K
Vayra86As for things going 'good' - there is a big gap between PC gaming dying and 'good'. What I'm saying is that despite all the crap we've endured, it remains a growth market
True. The doom-and-gloomers have been around for a long, long time saying PC gaming is dying. They have cut down on their predictions of doom on forums a good bit, but they are still out there. The fact is that PC gaming hardware growth has shrunk since the pandemic and mining boom, when growth in PC hardware sales ballooned enormously and prices did as well, but it is still positive growth and the future looks bright.



arstechnica.com/gaming/2023/04/pc-gaming-market-is-set-to-grow-again-after-pandemic-and-overstock-corrections/
Posted on Reply
#49
Vayra86
Bomby569pc gaming is a lot of things: it is csgo, pubg, dota, sims 4, microtransactions - a lot of things. I doubt this is a broad discussion when we are talking about the 8GB VRAM issue. Your statement is more 'vram is fine, people will just buy new gpus', so coming with a general pc gaming revenue chart is disingenuous at best. Specifically, and that's what matters, gpu sales are down, ssd sales are down, chip production is being cut down, motherboard sales are down, etc.. etc... etc...

people buy software, it just remains to be seen what; if hardware sales are down, it's not mega hyper super AAA releases that demand 120GB of vram, that's for sure
That is a strong argument, indeed. People will simply not play things they can't run. Time will tell who blinks first, consumer or industry push.

But consider for a moment that usually the very same people dó applaud the RT push, the new tech, etc. ;) They already blinked.
Posted on Reply
#50
Bomby569
Vayra86That is a strong argument, indeed. People will simply not play things they can't run. Time will tell who blinks first, consumer or industry push.

But consider for a moment that usually the very same people dó applaud the RT push, the new tech, etc. ;) They already blinked.
what will save them is that gamers will buy them because they want to play them if the game is really good, and just play at low until they can upgrade, like everyone has done since PC gaming started. People play games on toasters and it never stopped anyone.

The shitty games will not sell and will blame the weather and stuff. In this case, most AAA games these days.
Posted on Reply