Thursday, January 19th 2023

NVIDIA GeForce RTX 4060 Ti Possible Specs Surface—160 W Power, Debuts AD106 Silicon

NVIDIA's next GeForce RTX 40-series "Ada" graphics card launch is widely expected to be the GeForce RTX 4070 (non-Ti), and as we approach Spring 2023, the company is expected to ramp up to the meat of its new generation with the xx60 segment, beginning with the GeForce RTX 4060 Ti. This new performance-segment SKU debuts the 4 nm "AD106" silicon. A set of leaks by kopite7kimi, a reliable source for NVIDIA leaks, sheds light on possible specifications.

The RTX 4060 Ti is based on the AD106 silicon, which is expected to be much smaller than the AD104 powering the RTX 4070 series. NVIDIA's reference board, codenamed PG190, is reportedly tiny, yet it features the 16-pin ATX 12VHPWR connector. The connector is probably configured for 300 W via its sense pins, and adapters included with graphics cards could convert two 8-pin PCIe connectors into one 300 W 16-pin connector. The RTX 4060 Ti is expected to come with a typical graphics power value of 160 W.
At this point we don't know whether the RTX 4060 Ti maxes out the AD106, but its rumored specs read as follows: 4,352 CUDA cores across 34 streaming multiprocessors (SM), 34 RT cores, 136 Tensor cores, 136 TMUs, and an unknown ROP count. The GPU is expected to feature a 128-bit wide GDDR6/X memory interface, and 8 GB could remain the standard memory size. NVIDIA is expected to use JEDEC-standard 18 Gbps GDDR6 memory, which should yield 288 GB/s of memory bandwidth. It will be very interesting to see how much faster the RTX 4060 Ti is than its predecessor, the RTX 3060 Ti, given that it has barely two-thirds the memory bandwidth. NVIDIA has made several architectural improvements to the memory sub-system with "Ada," and the AD106 is expected to get a large 32 MB L2 cache.
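For reference, the quoted bandwidth follows directly from the data rate and bus width. A minimal Python sketch of the arithmetic, using the rumored 18 Gbps / 128-bit configuration above and the RTX 3060 Ti's published 14 Gbps / 256-bit setup for comparison:

def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    # Peak bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8
    return data_rate_gbps * bus_width_bits / 8

rtx_4060_ti = bandwidth_gb_s(18, 128)  # 288.0 GB/s (rumored)
rtx_3060_ti = bandwidth_gb_s(14, 256)  # 448.0 GB/s (published)
print(rtx_4060_ti / rtx_3060_ti)       # ~0.64, i.e. roughly two-thirds of the predecessor's bandwidth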
Sources: kopite7kimi (Twitter), VideoCardz

164 Comments on NVIDIA GeForce RTX 4060 Ti Possible Specs Surface—160 W Power, Debuts AD106 Silicon

#76
HisDivineOrder
I'm guessing $750 for the 4060 Ti, $700 for the 4060, $650 for the 4050 Ti, and $600 for the 4050. Then they'll introduce an x40 series that will slot in $50 less with a Ti and so on and so forth. They'll keep the 40 series above $500 because Jensen heard somewhere that $500 is the new $100.

That's what he heard!
Posted on Reply
#77
ARF
Bet0nI think there is one thing which is constant across generations: power consumption. AIBs buy graphics cards based on price and TDP, not codenames or memory bandwidth and such.
Based on this metric you can compare the cards and tell which one is which one's successor in Nvidia's mind, so to speak. Whoever said the GTX 680 was supposed to be a midrange card (at least based on the TDPs of previous generations) is IMO right, but market circumstances "promoted" it into the 80 class (AMD was slower).
As it happens, I made an Excel table based on TDP. I have another one with AMD cards if you're interested.
I think this table shows that the cards above 250 W are more expensive now exactly because they perform relatively better than the high-end SKUs of the old days (those had 250 W TDPs).
As you can see, Nvidia moved the numbers up, so at the same TDP you either get a "lower class" GPU for the same price or you have to pay more for the "same" class with the same naming. People usually upgrade within the same class, so when they see the lower-class card for the same money they'd rather buy the more expensive card. Brilliant and very simple marketing by Nvidia.
But this is just my thinking.

I think this is a natural trend because we are fast reaching the limits of semiconductor manufacturing node progress.
Today there is news that Apple, the first and exclusive customer on the TSMC N3 optical shrink, will begin shipping M3 products as late as late 2023.
3nm M3 Chips To Launch With MacBook Air In 2H23, Will Bring Major Performance And Battery Life Gains (wccftech.com)

This means that AMD will move to N3 in 2025 or 2026.

Goodbye, PC business! You won't be missed.
Posted on Reply
#78
TumbleGeorge
H2 starts on 1st July. Relax, Zen 5 and the RX 8000 series will be on 3 nm from around the end of 2024.
Posted on Reply
#79
Bet0n
ARFI think this is a natural trend because we are fast reaching the limits of semiconductor manufacturing node progress.
Today there is news that Apple, the first and exclusive customer on the TSMC N3 optical shrink, will begin shipping M3 products as late as late 2023.
3nm M3 Chips To Launch With MacBook Air In 2H23, Will Bring Major Performance And Battery Life Gains (wccftech.com)

This means that AMD will move to N3 in 2025 or 2026.

Goodbye, PC business! You won't be missed.
What is natural? That we have 300+ W cards that didn't exist before Turing (besides the dual-GPU cards)?
In normal times the 3070 Ti (not even that, because the TDP of the 3070 Ti is 290 W) would've been the top card (I mean that kind of performance, not the name) in the Ampere generation.
I think two things happened.
Firstly, NV saw the first big crypto mining craze in 2016-17 (they started working on Ampere around that time) and realized people want more compute power and more cards.
Secondly, since they were in contract with Samsung on a crappy node with high power consumption, a new top card with a 250 W TDP would've been at most 10% faster than the previous flagship (2080 Super/Ti), which is a very bad generational uplift, worse than Pascal -> Turing.
They were cocky by going with Samsung because AMD was nowhere to be found (in 2016 AMD had the RX 480 and Fury X and that's it).
So they went all-in with Ampere, not caring about TDP, and we got these 300 W+ cards. They're much faster than Turing but they also eat much more than Turing. Or should I say they're faster only because they eat much more.
Look at the TPU database: the 3060 is barely faster than the 2060/Super or the 2070, the 3060 Ti is barely faster than the 2070 Super/2080, and the same is true for the 1660 Ti vs the 3050. These are the same TDP tiers. At least they lowered prices by moving these performance levels down a tier.
They got lucky with the second big crypto mining boom and the Ampere cards sold out easily despite the high TDPs.
Since people want these cards they're going to make them, but I think without crypto mining history would've been completely different, with an Ampere generation barely faster than Turing showing NV's dominance and monopolistic behavior in the GPU space.
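To make the "same TDP tier" comparison above concrete, here is a minimal illustrative Python sketch that buckets cards by approximate reference board power. The wattages are NVIDIA's published figures (rounded); the grouping is only meant to mirror the pairings named in the post, not anyone's actual spreadsheet.

from collections import defaultdict

# Approximate reference board power (W), rounded from NVIDIA spec pages.
cards = {
    "GTX 1660 Ti": 120, "RTX 3050": 130,
    "RTX 2060": 160, "RTX 2070": 175, "RTX 3060": 170,
    "RTX 2080": 215, "RTX 3060 Ti": 200,
}

def tier(watts: int, step: int = 50) -> int:
    # Bucket cards into 50 W tiers so successors line up by power draw rather than by name.
    return watts // step * step

tiers = defaultdict(list)
for name, watts in cards.items():
    tiers[tier(watts)].append(name)

for t in sorted(tiers):
    print(f"{t}-{t + 49} W: {', '.join(tiers[t])}")
# 100-149 W: GTX 1660 Ti, RTX 3050
# 150-199 W: RTX 2060, RTX 2070, RTX 3060
# 200-249 W: RTX 2080, RTX 3060 Ti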
Posted on Reply
#80
Minus Infinity
lasYour 6800XT is not even close. The 4070 Ti is ~20% faster than the 6800XT and 3080 overall at 1440p.

In newer games it's more like 25-30%.

Also, the 4070 Ti is doing that with 50-75 W less, plus DLSS, DLDSR, NVENC and way more features in general, plus better RT.

Yes, you can probably cherry-pick a game or two where the 6800XT wins slightly, but overall it does not, and in some other games the 4070 Ti is over 30% faster than the 6800 XT as well.

In most new games the 4070 Ti is 25% faster than the 6800XT stock vs stock. Old games are dragging down the perf of the 4000 series in comparisons because of a lack of optimizations. That's why performance is better in newer/popular titles in general.

www.techpowerup.com/review/asus-geforce-rtx-4070-ti-tuf/32.html

4070 Ti is 30-35% faster than 6800XT in these newer games

www.techpowerup.com/review/asus-geforce-rtx-4070-ti-tuf/12.html
www.techpowerup.com/review/asus-geforce-rtx-4070-ti-tuf/10.html
www.techpowerup.com/review/asus-geforce-rtx-4070-ti-tuf/17.html

So unless you have overclocked your 6800XT to the absolute limit (and are looking at 400-500 W usage), your 6800XT is not even close.

The 4070 Ti beats even the 6900XT by like 15%...

You sound like people with a 3080 who think their 3080 also performs just like a 4070 Ti; it doesn't.

Both the 3080 and 6800XT are 20-30% behind the 4070 Ti depending on the title, in GPU-limited games.
Wow, 20% faster and I get to pay AU$1,500, or I get a near-new second-hand 6800XT for $800. Your argument only makes sense if the price/performance ratio is similar. It's pathetic value. At $599 it would make sense.
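To put that value argument in numbers, a quick sketch using the figures quoted in this post (the ~20% performance delta and the AU$1,500 / $800 prices are the poster's, not official pricing):

# Relative performance and street prices as quoted in the post above (illustrative only).
perf_4070_ti, price_4070_ti = 1.20, 1500   # ~20% faster, AU$1,500 new
perf_6800_xt, price_6800_xt = 1.00, 800    # baseline, $800 near-new second-hand

value_4070_ti = perf_4070_ti / price_4070_ti    # performance per dollar
value_6800_xt = perf_6800_xt / price_6800_xt    # performance per dollar
print(round(value_6800_xt / value_4070_ti, 2))  # 1.56 -> the used 6800 XT offers ~56% more performance per dollar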
Posted on Reply
#81
johnspack
Here For Good!
Thank god I've given up on gaming on a PC. My 980 Ti will last me for years more now. Entry-level cards at $1k CAD... nope, they can keep them.
So many other things you can do on a pc besides game.....
Posted on Reply
#82
bonehead123
Can we getz some cookies with this (lukewarm) milk, please?

Figures that after they've completely milked the masses with their overpriced, power-suckin' crap, they would bring out this low-end thing that probably can't even run the "can it run Crysis" utility without stuttering or overheating...
Posted on Reply
#83
Chrispy_
Vya DomusOver a decade and 6 consecutive architectures with almost the same naming schemes, with a few exceptions here and there, seems pretty consistent to me.

What's your timeframe at which point you consider this consistent? 20, 30 years? Centuries?
Your argument was that xx60 Ti models should use 104 dies:

There is no consistency over the last decade with Ti models. I went back so many generations solely to give your argument the benefit of the doubt, because prior to Ampere there was another xx60 Ti model using a 104 die, the 660 Ti - but that was over a decade ago!

The 4060 Ti isn't using a 104 die
Ampere 3060 Ti did use a 104 die. We've established that - but this ONE-generation occurrence isn't enough to call it a concrete pattern like you say it is.
Turing 2060 Ti didn't exist at all, and if you say "super=Ti" then it still didn't use a 104 die
Turing 1660 Ti didn't use a 104 die (it used TU116)
Pascal 1060 Ti didn't exist.
Maxwell 960 Ti didn't exist
Kepler 760 Ti didn't exist
Kepler 660 Ti (finally, we get there) used a 104 die.

Please, explain to me why you're so adamant that the 4060 Ti should be a 104 die again. I honestly don't understand the point you're trying to make.
Posted on Reply
#84
N/A
Chrispy_Geforce GTX 960 = GM204
This is GM206.

This time around the 4070 is a rebranded 60 Ti. Just look at the 4070 Ti versus the 4070: they cut more than 20% of the cores, which is precisely what the 3060 Ti was relative to the 3070 Ti.
And the 60 Ti falls into the abyss of a 128-bit bus.
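For context on the "cut more than 20%" point, a small sketch of the shader-count arithmetic. The 40-series core counts below were still leaks/announcements at the time, so treat them as assumptions; the 30-series figures are official.

# CUDA core counts: 40-series figures were leaked/announced at the time, 30-series are official.
cores = {"RTX 4070 Ti": 7680, "RTX 4070": 5888, "RTX 3070 Ti": 6144, "RTX 3060 Ti": 4864}

def cut(bigger: str, smaller: str) -> float:
    # Percentage of shaders removed going from the bigger SKU to the smaller one.
    return (1 - cores[smaller] / cores[bigger]) * 100

print(f"4070 Ti -> 4070:    {cut('RTX 4070 Ti', 'RTX 4070'):.1f}% fewer cores")     # ~23.3%
print(f"3070 Ti -> 3060 Ti: {cut('RTX 3070 Ti', 'RTX 3060 Ti'):.1f}% fewer cores")  # ~20.8%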
Posted on Reply
#85
Tsukiyomi91
Y'all do know the Ada Lovelace core has a larger L2 cache than Ampere, right? What if this 4060 Ti keeps up with or even beats a 3080 despite having the "inferior" AD106 silicon, a smaller bus, a lower CUDA count and whatnot? If it is priced at $600 or more, then go find a known-good used 3080 (or a Ti variant) and be happy with it. No one is forcing you to buy or even consider getting this not-releasing-so-soon GPU.
Posted on Reply
#86
Dirt Chip
Vya DomusDie names are not just random worthless information, Nvidia uses the same nomenclature every time for a reason. If a certain card now uses a chip that's a lower tier according to the silicon naming scheme that means they've just upsold their customers a product that is inferior according to their own internal classification compared to the previous generation. In other words they're giving you less for more money.

If you think that doesn't matter you're being oblivious because it's those things that actually dictate price and performance.
Right, but what matters is price to performance, not the naming, which can change according to each generation's positioning and the circumstances of the time.

Many get hung up on names, endlessly comparing tiers going 10 generations back instead of just going by price/perf at a given time.

Things change rapidly; adapt or be left behind. The never-ending rant on memory bus width, die size/name, and product name is as futile as dust.

No one is disputing the fact that today you pay more and get less for that extra, but clinging to what was in the (not so distant) past and not accepting the global economic changes and NV's hunger for more $$$ is somewhat naive.

The best thing you can do is delay your GPU purchase as much as you can (I advise 5 years, no less), and if you really need/want to buy, go for the best cost/perf on the day of purchase. All else is just confusing spec info that might steer you toward a less-than-optimal purchase decision.
Posted on Reply
#87
Vya Domus
Chrispy_Your argument was that xx60 Ti models should use 104 dies:
No, it wasn't. I simply pointed out that there is going to be a downgrade; the silicon is going to be a tier lower.
Posted on Reply
#88
JustBenching
Vayra86Yep. And they can't because...............
R
T

How are we liking that price of RT yet? Glorious innit


And faster VRAM, which is what held the 680 back. I owned a 770; it was a pretty neat card, basically GK104 in optimal form. The experience was better than the SLI GTX 660s I had prior to it, and that was a free 'upgrade', so to speak :D Good times... You actually had choices. Now it's 'okay, where does my budget land in the stack, and how much crap am I getting along with it that I really don't want to pay for but still do?'
Tbf, if someone doesn't care about RT, which I hear a lot, they shouldn't complain about the prices either. A 350-400 euro card (let's say a 6700 XT or a 3060 Ti) kills everything with RT off at 1440p. That kind of performance used to be ultra high end a few years ago. I have a 3060 Ti in my GF's PC with a 1440p ultrawide monitor; she doesn't really care about RT, the framerate is great, and there is also the DLSS option as a backup.

I really don't think we ever got more performance in the midrange than we do today. And since you mentioned the GTX 770, I got that card almost day one for 370 euros if I remember correctly, and on day one it was easily dropping below 60 and even 30 FPS in plenty of games. So sure, we might think the prices have gone up, but the fact of the matter is today's midrange gives you way more performance in modern games than the high end gave you 10 years ago.
Posted on Reply
#89
64K
It is disturbing watching the "just accept it and buy the GPU anyway, no matter how much Nvidia is ripping gamers off" mentality concerning Nvidia's GPUs. Don't you realize that once Nvidia gets away with this it becomes precedent, just like the GTX 680 almost 11 years ago? They have never retreated from their over-pricing schemes since then. It only continues to get worse, and now with Ada it is utterly ridiculous.

What's next with the RTX 50 series? Higher prices still, most likely.

Some people have asked me why I care. I care because I see PC gaming as a hobby and I always have. I see PC gamers as fellow hobbyists and I don't like seeing my fellow hobbyists being abused and ripped off by Nvidia. $550, if not more, for a lower-end Ada RTX 4060 Ti which is really an entry-level RTX 4050 Ti is ridiculous. Don't be silent and just take it, as is being advised by a vocal minority here. Fight back.
Posted on Reply
#90
JustBenching
Bet0nWhat is natural? That we have 300+ W cards that didn't exist before Turing (besides the dual-GPU cards)?
In normal times the 3070 Ti (not even that, because the TDP of the 3070 Ti is 290 W) would've been the top card (I mean that kind of performance, not the name) in the Ampere generation.
I think two things happened.
Firstly, NV saw the first big crypto mining craze in 2016-17 (they started working on Ampere around that time) and realized people want more compute power and more cards.
Secondly, since they were in contract with Samsung on a crappy node with high power consumption, a new top card with a 250 W TDP would've been at most 10% faster than the previous flagship (2080 Super/Ti), which is a very bad generational uplift, worse than Pascal -> Turing.
They were cocky by going with Samsung because AMD was nowhere to be found (in 2016 AMD had the RX 480 and Fury X and that's it).
So they went all-in with Ampere, not caring about TDP, and we got these 300 W+ cards. They're much faster than Turing but they also eat much more than Turing. Or should I say they're faster only because they eat much more.
Look at the TPU database: the 3060 is barely faster than the 2060/Super or the 2070, the 3060 Ti is barely faster than the 2070 Super/2080, and the same is true for the 1660 Ti vs the 3050. These are the same TDP tiers. At least they lowered prices by moving these performance levels down a tier.
They got lucky with the second big crypto mining boom and the Ampere cards sold out easily despite the high TDPs.
Since people want these cards they're going to make them, but I think without crypto mining history would've been completely different, with an Ampere generation barely faster than Turing showing NV's dominance and monopolistic behavior in the GPU space.
Isn't it crazy blaming everything on Nvidia, when in fact it's AMD that messed up? Here you go: Turing came after Vega, and look at that, AMD topping the power draw graphs.
But let's blame Nvidia regardless.

tpucdn.com/review/nvidia-geforce-rtx-2080-ti-founders-edition/images/power_average.png
64KIt is disturbing watching the "just accept it and buy the GPU anyway, no matter how much Nvidia is ripping gamers off" mentality concerning Nvidia's GPUs. Don't you realize that once Nvidia gets away with this it becomes precedent, just like the GTX 680 almost 11 years ago? They have never retreated from their over-pricing schemes since then. It only continues to get worse, and now with Ada it is utterly ridiculous.

What's next with the RTX 50 series? Higher prices still, most likely.

Some people have asked me why I care. I care because I see PC gaming as a hobby and I always have. I see PC gamers as fellow hobbyists and I don't like seeing my fellow hobbyists being abused and ripped off by Nvidia. $550, if not more, for a lower-end Ada RTX 4060 Ti which is really an entry-level RTX 4050 Ti is ridiculous. Don't be silent and just take it, as is being advised by a vocal minority here. Fight back.
But Nvidia's GPUs are CHEAPER than AMD's, wth are you talking about?? What Nvidia pricing schemes? Don't you see that AMD is the one fleecing us? Like, seriously, what the heck is wrong with people and their Nvidia hating?

The XT is priced 14% higher across the EU while being a worse product than the 4070 Ti. Lol
Posted on Reply
#91
64K
fevgatosBut Nvidia's GPUs are CHEAPER than AMD's, wth are you talking about?? What Nvidia pricing schemes? Don't you see that AMD is the one fleecing us? Like, seriously, what the heck is wrong with people and their Nvidia hating?
If I'm an Nvidia hater then I'm doing it wrong. I have been using exclusively Nvidia GPUs in all of my builds, starting with an 8800 GT over 15 years ago. Speaking out against shitty practices by a company isn't hating that company. It's my right as a paying customer.
Posted on Reply
#92
JustBenching
64KIf I'm an Nvidia hater then I'm doing it wrong. I have been using exclusively Nvidia GPUs in all of my builds, starting with an 8800 GT over 15 years ago. Speaking out against shitty practices by a company isn't hating that company. It's my right as a paying customer.
But you are speaking about the wrong company. Nvidia has the cheapest and best-value GPU out of all the new cards. That's just a fact: the 4070 Ti offers both better raster per dollar and way better RT per dollar while costing less money, consuming less power and having more features than the competition. Why the heck would you expect Nvidia to lower its price? It already is the best-value GPU; what you are saying doesn't make much sense to me.
Posted on Reply
#93
Bet0n
fevgatosIsn't it crazy blaming everything on Nvidia, when in fact it's AMD that messed up? Here you go: Turing came after Vega, and look at that, AMD topping the power draw graphs.
But let's blame Nvidia regardless.

tpucdn.com/review/nvidia-geforce-rtx-2080-ti-founders-edition/images/power_average.png
I didn't blame them for anything except making very power-hungry cards out of necessity (because of the Samsung node) and partially out of market demand (sprinkled with some deception).
Vega was below 300 W, while the Ampere generation has 5 cards above 300 W. AMD made power-hungry cards earlier (HD 7970 GHz, R9 290/X, R9 390/X, R9 Fury/X, all below 300 W), but they had to in order to stay somewhat competitive. In the RDNA2 lineup only the 6950 XT (an afterthought card) consumes more than 300 W.
NV is the market leader, so they should set an example.
Posted on Reply
#94
Vayra86
fevgatosTbf, if someone doesn't care about RT, which I hear a lot, they shouldn't complain about the prices either. A 350-400 euro card (let's say a 6700 XT or a 3060 Ti) kills everything with RT off at 1440p. That kind of performance used to be ultra high end a few years ago. I have a 3060 Ti in my GF's PC with a 1440p ultrawide monitor; she doesn't really care about RT, the framerate is great, and there is also the DLSS option as a backup.

I really don't think we ever got more performance in the midrange than we do today. And since you mentioned the GTX 770, I got that card almost day one for 370 euros if I remember correctly, and on day one it was easily dropping below 60 and even 30 FPS in plenty of games. So sure, we might think the prices have gone up, but the fact of the matter is today's midrange gives you way more performance in modern games than the high end gave you 10 years ago.
Part of this is true... but then you have the beautiful situation where that 3060 Ti is endowed with a measly 8 GB and you still find yourself cutting back on textures and whatnot so as not to stutter through your content. Like you say, there is ample core power in current and last-gen cards. Turing is much the same, really.

You could indeed also get an RDNA2 midrange card to have the VRAM. But the 3060 Ti is a horrible example of midrange progress; it's a card that effectively regressed in how long it'll last in the midrange compared to, say, a Pascal or Turing alternative with similar VRAM but a much less powerful core. So even in the midrange, on Nvidia, you have adverse effects besides price, which result in a reduced usable lifespan before you're cutting down settings that noticeably hit IQ.

The bottom line remains: on Nvidia, you are paying the RT price on the entire stack. The last cards avoiding that issue are the 16 series - not exactly midrange by today's standards.

IMHO the midrange has definitely not improved in any way, shape or form past the 1060 6 GB. It has regressed. Perf/$ gen to gen barely moves forward or is in fact worse; VRAM relative to core power got far worse and cache doesn't alleviate that in most cases; and absolute pricing also just went up.

I'm honestly trying to find a nice purchase now in the midrange of the last several generations. It's just not there. Every single Nvidia card is handicapped in some way or another at its given price point and stack position. Coming from a 1080, it seems the only cards I can look at are the 6800 (XT) or the 3080 10 GB, the latter being suboptimal just the same - and a whopping 320 W TDP. And then I'm still looking not at 450-500 EUR (a reasonable range imho) but 600+ EUR.

Seriously man, I need a full bucket of cognitive dissonance to see improvement here over the last two generations...

I also dare your GF to run Darktide @ 1440p on a 3060 Ti. Gonna be a fun sub-60 FPS experience... You're going to have to use FSR/DLSS as well to get decent frames. Please stop selling nonsense. And this isn't anything special either; it's just a UE4 game.



Posted on Reply
#95
JustBenching
Vayra86Part of this is true... but then you have the beautiful situation where that 3060 Ti is endowed with a measly 8 GB and you still find yourself cutting back on textures and whatnot so as not to stutter through your content. Like you say, there is ample core power in current and last-gen cards. Turing is much the same, really.

You could indeed also get an RDNA2 midrange card to have the VRAM. But the 3060 Ti is a horrible example of midrange progress; it's a card that effectively regressed in how long it'll last in the midrange compared to, say, a Pascal or Turing alternative with similar VRAM but a much less powerful core. So even in the midrange, on Nvidia, you have adverse effects besides price, which result in a reduced usable lifespan before you're cutting down settings that noticeably hit IQ.

The bottom line remains: on Nvidia, you are paying the RT price on the entire stack. The last cards avoiding that issue are the 16 series - not exactly midrange by today's standards.

IMHO the midrange has definitely not improved in any way, shape or form past the 1060 6 GB. It has regressed. Perf/$ gen to gen barely moves forward or is in fact worse; VRAM relative to core power got far worse and cache doesn't alleviate that in most cases; and absolute pricing also just went up.

I'm honestly trying to find a nice purchase now in the midrange of the last several generations. It's just not there. Every single Nvidia card is handicapped in some way or another at its given price point and stack position. Coming from a 1080, it seems the only cards I can look at are the 6800 (XT) or the 3080 10 GB, the latter being suboptimal just the same - and a whopping 320 W TDP. And then I'm still looking not at 450-500 EUR (a reasonable range imho) but 600+ EUR.

Seriously man, I need a full bucket of cognitive dissonance to see improvement here over the last two generations...

I also dare your GF to run Darktide @ 1440p on a 3060 Ti. Gonna be a fun sub-60 FPS experience... You're going to have to use FSR/DLSS as well to get decent frames. Please stop selling nonsense. And this isn't anything special either; it's just a UE4 game.



Is Darktide really worth it in terms of visuals? I don't know the game. Cause sure, I can easily find plenty of games that won't cut it, but usually those games look awful on top of the horrible performance. Case in point: ARK Survival was struggling on a 3090 (lol) and it looked AWFUL. Like one of the worst games I've ever laid my eyes upon.

Watch Dogs 2 was another one; although it didn't look bad, setting shadows to the PCSS option means your framerate will drop to the low 30s. It's a game that came out alongside your 1080; try it, you'll probably struggle to go above 20 FPS at 1080p maxed out.

Some games have stupid max settings (GTA V had 8x MSAA lol). You can't blame the cards if they can't handle that.

But anyways, DLSS exists for a reason, and it's absolutely amazing. I know the usual crowd is going to boo, but in fact DLSS is as good as, sometimes even better than, native (COD Black Ops for example looks way better with DLSS Q than native). So I don't see the problem with activating it. In fact I have it permanently on even on my 4090, cause there is no reason not to.
Posted on Reply
#96
Vayra86
fevgatosIs Darktide really worth it in terms of visuals? I don't know the game. Cause sure, I can easily find plenty of games that won't cut it, but usually those games look awful on top of the horrible performance. Case in point: ARK Survival was struggling on a 3090 (lol) and it looked AWFUL. Like one of the worst games I've ever laid my eyes upon.

Watch Dogs 2 was another one; although it didn't look bad, setting shadows to the PCSS option means your framerate will drop to the low 30s. It's a game that came out alongside your 1080; try it, you'll probably struggle to go above 20 FPS at 1080p maxed out.

Some games have stupid max settings (GTA V had 8x MSAA lol). You can't blame the cards if they can't handle that.

But anyways, DLSS exists for a reason, and it's absolutely amazing. I know the usual crowd is going to boo, but in fact DLSS is as good as, sometimes even better than, native (COD Black Ops for example looks way better with DLSS Q than native). So I don't see the problem with activating it. In fact I have it permanently on even on my 4090, cause there is no reason not to.
Darktide doesn't look bad. I'm running it on a 1080 @ 3440x1440; that's sub-60 with lows at 40, reduced settings left and right, plus FSR at native res. If you look at those charts, you see the cards have plateaued from the 2080 Ti onwards; look at the massive jump there between it and the 980 Ti. The 1080 Ti slots somewhere in the middle between them. After that? 10-15% at best per gen. At 1080p...

It's easy to spot this trend here.
Posted on Reply
#97
JustBenching
Vayra86Darktide doesn't look bad. I'm running it on a 1080 @ 3440x1440; that's sub-60 with lows at 40, reduced settings left and right, plus FSR at native res. If you look at those charts, you see the cards have plateaued from the 2080 Ti onwards; look at the massive jump there between it and the 980 Ti. The 1080 Ti slots somewhere in the middle between them. After that? 10-15% at best per gen. At 1080p...

It's easy to spot this trend here.
Then there is a problem with the game, cause in general a 4090 is nowhere near only 22% faster than the 3080.
Posted on Reply
#98
Vayra86
fevgatosThen there is a problem with the game, cause in general a 4090 is nowhere near only 22% faster than the 3080.
There are lots of examples like it... if the better half of what you're playing 'is a problem with the game', that is one strange situation we're in. In a practical sense, you're then still not getting your money's worth in the midrange, which was the point. Perhaps that new box of hardware tricks doesn't really prove its worth in all content ;)

You can keep moving the goalposts to deny an obvious trend, you do you. Let's see where a 160 W 4060 Ti goes with 288 GB/s... at a supposed 350-400 bucks like you say.
Posted on Reply
#99
JustBenching
Vayra86There are lots of examples like it... if the better half of what you're playing 'is a problem with the game', that is one strange situation we're in. In a practical sense, you're then still not getting your money's worth in the midrange, which was the point. Perhaps that new box of hardware tricks doesn't really prove its worth in all content ;)

You can keep moving the goalposts to deny an obvious trend, you do you. Let's see where a 160 W 4060 Ti goes with 288 GB/s...
I don't understand what goalposts I'm moving. I said that a 400 euro card gives you insane 1440p performance, with DLSS or FSR to boot. That's already a high-end resolution for me. If you want to go 4K maxed out with RT you have to pay, but that's luxury.
Posted on Reply
#100
Vayra86
fevgatosI don't understand what goalposts I'm moving. I said that a 400 euro card gives you insane 1440p performance, with DLSS or FSR to boot. That's already a high-end resolution for me. If you want to go 4K maxed out with RT you have to pay, but that's luxury.
Yeah, you said that, and then you're dragging all sorts of silly arguments across the table to reinforce it, when I show you that midrange cards can't even properly max out 1080p/60 in recent games. If you're gaming for longer than just today, you know as well as I do that 1440p performance will nosedive as new titles come out.
Posted on Reply