Monday, April 17th 2023

NVIDIA to Target $450 Price-point with GeForce RTX 4060 Ti

NVIDIA is preparing its fifth GeForce RTX 40-series "Ada" graphics card launch for May 2023, with the GeForce RTX 4060 Ti. Red Gaming Tech reports that the company could target the USD $450 price-point with this SKU, putting it $150 below the recently launched RTX 4070, and $350 below the RTX 4070 Ti. The RTX 4060 Ti is expected to nearly max out the 5 nm "AD106" silicon, the same chip that powers the RTX 4070 Laptop GPU. While the notebook chip maxes it out, featuring all 4,608 CUDA cores physically present across its 36 SM, the desktop RTX 4060 Ti will be slightly cut down, featuring 34 SM, which works out to 4,352 CUDA cores. The "AD106" silicon features a 128-bit wide memory interface, and NVIDIA is expected to use conventional 18 Gbps-rated GDDR6 memory chips. The design goal behind the RTX 4060 Ti could be to beat the previous-generation RTX 3070 and sneak up on the RTX 3070 Ti, while offering greater energy efficiency and new features such as DLSS 3.
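The core count and memory bandwidth implied by those specs are easy to sanity-check. A quick sketch: the 34 SM, 128-bit bus, and 18 Gbps figures come from the report above; the 128-CUDA-cores-per-SM figure is the standard Ada SM configuration.

```python
# Sanity check of the rumored RTX 4060 Ti numbers above.
# 34 SM / 128-bit / 18 Gbps are from the report; 128 cores per SM is standard Ada.

SM_COUNT = 34
CORES_PER_SM = 128
cuda_cores = SM_COUNT * CORES_PER_SM
print(cuda_cores)  # 4352, matching the reported spec

BUS_WIDTH_BITS = 128
DATA_RATE_GBPS = 18  # per-pin GDDR6 data rate
bandwidth_gb_s = BUS_WIDTH_BITS * DATA_RATE_GBPS / 8  # bits -> bytes
print(bandwidth_gb_s)  # 288.0 GB/s of peak memory bandwidth
```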
Source: Red Gaming Tech (YouTube)

237 Comments on NVIDIA to Target $450 Price-point with GeForce RTX 4060 Ti

#176
Selaya
Chrispy_There's definitely some delusions about costs of production and economic changes in the last 12 years, which is the last time a x60Ti card sold for $250.

[ ... ]
moore's law is dead
- jensen
Posted on Reply
#177
progste
Production costs and inflation still don't justify why GPUs are the only component that has tripled in price in the last 10 years.
The truth is that during the mining rush they got away with absurd prices and they don't want to come down now.
Posted on Reply
#178
wheresmycar
progsteProduction costs and inflation still don't justify why GPUs are the only component that has tripled in price in the last 10 years.
The truth is that during the mining rush they got away with absurd prices and they don't want to come down now.
I share the same sentiments. Pre-owners and scalpers were selling top-end 30-series cards for ridiculous sums of money during the Covid/crypto era. Even I was surprised to see what people were prepared to pay for these cards. Nvidia got a whiff of it and now we have "in-house scalping", "never-ending pandemic crisis prices" and "greedy, self-inflated inflation" on our hands. From a business perspective, Nvidia has hit GOLD!! and the green monster and its shareholders won't let it slide so easily until the cracks emerge. Yep, cracks, as opposed to AMD or Intel catching up... they're no different in the grand scheme of up-the-ante insatiability. It just took the Greens to set the loathsome bar so high.
Posted on Reply
#179
Chrispy_
progsteProduction costs and inflation still don't justify why GPUs are the only component that has tripled in price in the last 10 years.
The truth is that during the mining rush they got away with absurd prices and they don't want to come down now.
It certainly feels that way. Realistically I suspect it's a little bit of greed, economics, foundry demand, and increased TDP that contribute to higher prices in equal measure.
  • Greed is evident in the fact that Nvidia caved to the outcry over the 4080 12GB, and now in the near-immediate price cuts on the 4070.
  • Economics have changed dramatically in just the last half decade: Chinese wages, exchange rates, trade wars and import duties, legislation, global shipping costs, etc.
  • Foundry demand, with Nvidia and Intel being two insanely rich customers added to the already-expensive bidding war for TSMC capacity against AMD, Apple, and Qualcomm.
  • Today's midrange cards use 175-250 W, which demands an appropriately priced cooler and VRM. Going back to the 10-series, for example, the 1060 6GB was just 120 W.
Posted on Reply
#180
Gica
Vayra86Ah yes, the difference in averages again, yes, I agree on those. Third time now you've repeated that ;) Like I said, enjoy that 8GB while it lasts. I'll just eat more popcorn as the years go by and whole groups of people suddenly feel 'forced' to upgrade to yet another Nvidia midrange with lacking memory. This has been happening since Maxwell. It happened to 4GB Fury X owners and 1060 3GB owners. And it'll happen to 8GB Ampere owners too. At that point you're also left with a card that has abysmal resale value because, quite simply, it won't do well anymore. That sentiment has clearly already begun to land, as well.
I have two years of satisfaction with this video card and will have at least two more. You took into account the release years of those games, well done!, but it seems you don't know that some of them are among the most played online games in the world. I'll let you play with 128TB online; 4GB is enough for me.

Finally:
1. I play WoT and AW (exclusively online) on the UHD 770. The 8GB of vRAM on the RTX 3070 Ti is probably to blame for my lack of superior performance as a gamer. With 24GB the rating definitely goes up; I'd reach the top 10.

2. For Blender, going from the 3070 Ti to a 7900 XTX 24GB (oooooooohooooo!!!) would just mean a downgrade. The people at Puget Systems say so.
Posted on Reply
#181
Chrispy_
BSim500Actually it was 2019-2020:-

"Most relevant to potential buyers of the GTX 1660 Ti is the GeForce GTX 1660 Super, which delivers similar performance to the 1660 Ti, at a lower starting price of $229. At this writing, that's about $30 less than the lowest-price GTX 1660 Ti".
www.tomshardware.com/reviews/nvidia-geforce-gtx-1660-ti-turing,6002.html
Technically correct, but it's hard to class the 16-series as equivalent to the earlier 20-series. Realistically, all of the 16-series cards were below even the base-model 2060. Nvidia just messed around with the naming that generation and spread Turing over two series rather than one.
Posted on Reply
#182
tussinman
BSim500Actually it was 2019-2020:-

"Most relevant to potential buyers of the GTX 1660 Ti is the GeForce GTX 1660 Super, which delivers similar performance to the 1660 Ti, at a lower starting price of $229. At this writing, that's about $30 less than the lowest-price GTX 1660 Ti".
www.tomshardware.com/reviews/nvidia-geforce-gtx-1660-ti-turing,6002.html
I wouldn't really classify the 16xx series as xx60 Ti cards.

The closest they got that gen to a 2060 stopgap was arguably the 2060 Super, which came out only 8 months after the 2060.
Chrispy_Technically correct, but it's hard to class the 16-series as equivalent to the earlier 20-series. Realistically, all of the 16-series cards were below even the base-model 2060. Nvidia just messed around with the naming that generation and spread Turing over two series rather than one.
Exactly. The closest they got to a midrange Ti card was the 2060 Super, which was a $400-430 card.
Posted on Reply
#183
Vayra86
GicaI have two years of satisfaction with this video card and will have at least two more. You took into account the release years of those games, well done!, but it seems you don't know that some of them are among the most played online games in the world. I'll let you play with 128TB online; 4GB is enough for me.

Finally:
1. I play WoT and AW (exclusively online) on the UHD 770. The 8GB of vRAM on the RTX 3070 Ti is probably to blame for my lack of superior performance as a gamer. With 24GB the rating definitely goes up; I'd reach the top 10.

2. For Blender, going from the 3070 Ti to a 7900 XTX 24GB (oooooooohooooo!!!) would just mean a downgrade. The people at Puget Systems say so.
Its all good then!
Posted on Reply
#184
Gica
Vayra86Its all good then!
It's good that we understood each other. :slap:
I'm reposting this screenshot, don't forget about it. I added new games; the video cards are two years old, and 16GB didn't make a difference. It won't make one in 2024 or 2025 either, maybe in 2030. By then, this memory surplus will help the old GPU render with a 10% boost, a jump from 10 to 11 FPS. If you don't believe it, run the new games in extreme detail on a Radeon VII 16GB. Even in 4K, because it has enough memory. :D



Posted on Reply
#185
ixi
This is a low-quality post. lol. $450 for a 60-series. Nice.
Posted on Reply
#186
tfdsaf
GicaIt's good that we understood each other. :slap:
I'm reposting this screenshot, don't forget about it. I added new games; the video cards are two years old, and 16GB didn't make a difference. It won't make one in 2024 or 2025 either, maybe in 2030. By then, this memory surplus will help the old GPU render with a 10% boost, a jump from 10 to 11 FPS. If you don't believe it, run the new games in extreme detail on a Radeon VII 16GB. Even in 4K, because it has enough memory. :D



LOL! VRAM amount doesn't matter until you run out of it; then you literally can't play the game, or you have insane stutters! We already have 5 games that use more than 8GB of VRAM, realistically heading toward 12GB, and we're in early 2023; most upcoming games are going to use 10GB or more.

8GB is an ENTRY-level amount; 12GB is the bare minimum for mid-range, with 16GB realistically being the target for mid-range and upper mid-range.
Posted on Reply
#187
bug
ixiThis is a low-quality post. lol. $450 for a 60-series. Nice.
6650XT launched at $400 (and couldn't be had at that price) and it wasn't faster than a 3060Ti.
Posted on Reply
#188
64K
tfdsafLOL! VRAM amount doesn't matter until you run out of it; then you literally can't play the game, or you have insane stutters! We already have 5 games that use more than 8GB of VRAM, realistically heading toward 12GB, and we're in early 2023; most upcoming games are going to use 10GB or more.

8GB is an ENTRY-level amount; 12GB is the bare minimum for mid-range, with 16GB realistically being the target for mid-range and upper mid-range.
The reason for the stutters is that when you run out of VRAM, the GPU begins to use system RAM as well, which is much, much slower than VRAM.
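As a rough sketch of why that spillover hurts, compare theoretical peak bandwidths. The figures below are ballpark assumptions, not measurements: ~288 GB/s for 18 Gbps GDDR6 on a 128-bit bus (the rumored 4060 Ti configuration) versus ~32 GB/s for a PCIe 4.0 x16 link to system memory.

```python
# Ballpark comparison: local VRAM bandwidth vs. fetching spilled data
# back over PCIe. Both numbers are theoretical peaks, not benchmarks.

vram_gb_s = 128 * 18 / 8   # 128-bit bus at 18 Gbps per pin -> 288 GB/s
pcie4_x16_gb_s = 16 * 2.0  # 16 lanes at ~2 GB/s each -> ~32 GB/s

ratio = vram_gb_s / pcie4_x16_gb_s
print(f"VRAM {vram_gb_s:.0f} GB/s vs PCIe {pcie4_x16_gb_s:.0f} GB/s (~{ratio:.0f}x)")
# -> VRAM 288 GB/s vs PCIe 32 GB/s (~9x)
```

Even before counting system-RAM latency, any working set that spills is served roughly an order of magnitude slower, which is where the frame-time spikes come from.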
Posted on Reply
#189
bug
tfdsaf8GB is an ENTRY-level amount; 12GB is the bare minimum for mid-range, with 16GB realistically being the target for mid-range and upper mid-range.
Because when you play the latest and greatest at high res, ultra quality, you totally buy a mid-range video card, right? :wtf:
Posted on Reply
#190
progste
bugBecause when you play the latest and greatest at high res, ultra quality, you totally buy a mid-range video card, right? :wtf:
considering the prices of high end GPUs it seems completely reasonable to aim for the mid-range.
Posted on Reply
#191
bug
progsteconsidering the prices of high end GPUs it seems completely reasonable to aim for the mid-range.
Without lowering your expectations? Is that what passes as "reasonable" these days?
Posted on Reply
#192
Vayra86
bugWithout lowering your expectations? Is that what passes as "reasonable" these days?
At release, a midrange GPU was definitely sufficient to play everything, and I used my 1080 a year post-release to max out everything at 1080p at 60-100 FPS or better.

So yes, a midrange GPU today should definitely run everything at pretty much very high or ultra at 1080p, and with very minor concessions at 1440p.

That is definitely not unreasonable; it's the norm. We don't expect 4K maxed in 2023... but 1440p pretty much maxed? Yep. It's even Nvidia's punchline for the 4070.
Posted on Reply
#193
bug
Vayra86At release, a midrange GPU was definitely sufficient to play everything, and I used my 1080 a year post-release to max out everything at 1080p at 60-100 FPS or better.

So yes, a midrange GPU today should definitely run everything at pretty much very high or ultra at 1080p, and with very minor concessions at 1440p.

That is definitely not unreasonable; it's the norm. We don't expect 4K maxed in 2023... but 1440p pretty much maxed? Yep. It's even Nvidia's punchline for the 4070.
4070? We're discussing 4060Ti here.

As for expectations, titles from id traditionally could not be maxed out at launch on anything; the hardware wasn't built yet. That happened to at least Quake, Doom 3 and Rage.
I do get what you're saying. I always bought mid-rangers and I could play everything on them. At the same time, I had to lower quality from time to time, but they were powerful enough that I could always find a combination of settings to lower that didn't make a distinguishable visual difference.
Posted on Reply
#194
progste
bug4070? We're discussing 4060Ti here.

As for expectations, titles from id traditionally could not be maxed out at launch on anything; the hardware wasn't built yet. That happened to at least Quake, Doom 3 and Rage.
I do get what you're saying. I always bought mid-rangers and I could play everything on them. At the same time, I had to lower quality from time to time, but they were powerful enough that I could always find a combination of settings to lower that didn't make a distinguishable visual difference.
Those titles, just like Crysis, are exceptions. Most other games have usually been playable at max with midrange GPUs like the 760, 960, 1060, or their AMD equivalents.
Posted on Reply
#195
bug
progsteThose titles, just like Crysis, are exceptions. Most other games have usually been playable at max with midrange GPUs like the 760, 960, 1060, or their AMD equivalents.
I just told you that I always bought into the midrange and had to lower settings more often than not. But OK.
Posted on Reply
#196
mechtech
Still more than I paid for my RX6800
Posted on Reply
#197
Blitzkuchen
sLowEndLet's reframe the cost here.

If you buy a $1000 card every 2 years, that works out to be about $42 a month. Does that seem like an astronomical, must-live-in-your-parent's-basement kind of cost to you?
$42 for the GPU, $21 for the whole system ($1000 every 4 years), $10-20 for electricity = $73-83 a month :laugh:

Of course, it depends on the country where you live.
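The arithmetic behind that monthly framing, spelled out (all dollar figures are the hypothetical ones from the posts above, not real prices):

```python
# Monthly cost breakdown from the posts above (hypothetical figures).
gpu_per_month = 1000 / (2 * 12)     # $1000 GPU replaced every 2 years
system_per_month = 1000 / (4 * 12)  # $1000 rest-of-system every 4 years
electricity = (10, 20)              # assumed monthly electricity range

low = gpu_per_month + system_per_month + electricity[0]
high = gpu_per_month + system_per_month + electricity[1]
print(f"${low:.2f} to ${high:.2f} per month")  # $72.50 to $82.50
```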
Posted on Reply
#198
Vayra86
bug4070? We're discussing 4060Ti here.

As for expectations, titles from id traditionally could not be maxed out at launch on anything; the hardware wasn't built yet. That happened to at least Quake, Doom 3 and Rage.
I do get what you're saying. I always bought mid-rangers and I could play everything on them. At the same time, I had to lower quality from time to time, but they were powerful enough that I could always find a combination of settings to lower that didn't make a distinguishable visual difference.
Exceptions, right. I mean, right now what you get is a basic, off-the-shelf engine that will push features that are going to destroy these cards. Consoles will carry that push.

As for discussing the 4060 Ti: yes, so you won't expect to max out 1440p, you will tweak a little more. But even that won't carry you, because 8GB simply won't suffice. And even 1080p might turn out to be problematic pretty soon. I was already seeing lots of instances of 7+ GB in use on my GTX 1080. Now that I have more VRAM, I see it run way over more often than not; even 13GB isn't an exception. The gap's getting pretty large pretty quickly.

This 4060 Ti might turn out to be like the 3GB 1060 versus the 6GB, where the latter can simply run more games properly, even despite what settings you have to move down to.

Here's Cyberpunk at 3440x1440 on max settings, no RT and quality FSR 2: 100+ FPS virtually everywhere, and just over 8GB allocated; it runs up if you go outside, 8.8GB happens. I think that clearly shows that VRAM will be the limiting factor on x60s, because you've definitely got the core oomph to run at these settings at 40-50 FPS at least, and you can even add a sprinkle of RT on top if you're happy running FSR Balanced (or its DLSS equivalent, which produces equal or better FPS). And sure, even with 8.3GB you can run the game fine on 8. But this is a 2021 title.



Here's RT on / Psycho, FSR Quality: 9.4GB, still 55 FPS, and this is on AMD; we know Nvidia runs better RT frames, especially in Cyberpunk with multiple effects.

Also... lol. Why would I even bother using this for that FPS hit :D I've just been playing this game, and even Path Tracing (which runs at a whopping 14 FPS here :D) looks almost identical; I have to crawl into the screen to appreciate the differences, and in many cases I preferred the raster image for its overall presentation and lighting balance. Looking at the sun, which is low in the sky a LOT of the time, is ridiculous with RT on. And there is no sunglasses mode.

But yeah... sacrificing IQ because you lack VRAM would seem like a total waste of GPU to me. You can run the game at playable frames and virtually maxed if you have sufficient VRAM.

Posted on Reply
#199
tfdsaf
bugBecause when you play the latest and greatest at high res, ultra quality, you totally buy a mid-range video card, right? :wtf:
Yes! Midrange has historically been capable of running the latest games at the highest settings and a relatively good resolution. I mean, the GTX 970, I remember, used to run almost all games at 60+ FPS at 1080p, and most games at 1440p as well. The GTX 1070 was perfectly capable of running all of the latest games at 1440p and 60+ FPS, same with the RTX 2070; never mind that the 2070 was a step in the wrong direction and bad value, it was still capable of running 1440p games at 60+ FPS.

All of these cards have been able to run most AAA games at least 3 years after release, at either the highest settings or a tier below the highest, at 60+ FPS.

With AMD it's been even better, as they've always provided a lot more VRAM and room to grow; we saw it with the RX 500, RX 5000, etc., where these series kept becoming better and better.

The most direct example of a bad-value, DOA card is probably the 1060 3GB; just a year after release, that card could not run a third of games at the highest textures, and most 4GB cards got obliterated then as well. I remember I had just bought the GTX 1060 6GB when ROTTR was out, and that game used up to 6.5GB of VRAM. Assassin's Creed something used 6+ GB, so even the 1060 6GB was on the edge just 2 years later!

This 8GB is the basic level, the lowest entry point; anything less and games are going to stutter, crash, have high frame times, etc.

For GPUs that have still not come out, that are yet to release in a month or two, having only 8GB on so-called mid-tier cards is blatantly stupid and fraudulent.
Posted on Reply
#200
ixi
bug6650XT launched at $400 (and couldn't be had at that price) and it wasn't faster than a 3060Ti.
And I can purchase a 6650 XT for under €291, while lesser-known shops sell it for €270. So, good luck, Nvidia.
Posted on Reply