Monday, April 17th 2023

NVIDIA to Target $450 Price-point with GeForce RTX 4060 Ti

NVIDIA is preparing its fifth GeForce RTX 40-series "Ada" graphics card launch in May 2023 with the GeForce RTX 4060 Ti. Red Gaming Tech reports that the company could target the USD $450 price-point with this SKU, putting it $150 below the recently launched RTX 4070 and $350 below the RTX 4070 Ti. The RTX 4060 Ti is expected to nearly max out the 5 nm "AD106" silicon, the same chip that powers the RTX 4070 Laptop GPU. While the notebook chip maxes it out, featuring all 4,608 CUDA cores physically present across its 36 SM, the desktop RTX 4060 Ti will be slightly cut down to 34 SM, which works out to 4,352 CUDA cores. The "AD106" silicon features a 128-bit wide memory interface, and NVIDIA is expected to use conventional 18 Gbps-rated GDDR6 memory chips. The design goal behind the RTX 4060 Ti could be to beat the previous-generation RTX 3070 and sneak up on the RTX 3070 Ti, while offering greater energy efficiency and new features such as DLSS 3.
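For a quick sanity check, the rumored core count and raw memory bandwidth follow from simple arithmetic. The sketch below assumes the standard Ada layout of 128 CUDA cores per SM and plain 18 Gbps GDDR6 signalling, and ignores any effective-bandwidth gains from the chip's large L2 cache.

```python
# Back-of-the-envelope check of the rumored RTX 4060 Ti configuration.
# Assumes 128 CUDA cores per SM (standard Ada SM layout) and raw GDDR6
# bandwidth only, with no effective-bandwidth boost from the L2 cache.

CORES_PER_SM = 128        # Ada SM width
sm_count = 34             # rumored desktop RTX 4060 Ti (full AD106 has 36)
bus_width_bits = 128      # AD106 memory interface
gddr6_gbps = 18           # per-pin data rate

cuda_cores = sm_count * CORES_PER_SM
bandwidth_gbs = bus_width_bits / 8 * gddr6_gbps  # 16 bytes per transfer x 18 GT/s

print(cuda_cores)     # 4352, matching the figure above
print(bandwidth_gbs)  # 288.0 GB/s of raw GDDR6 bandwidth
```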
Source: Red Gaming Tech (YouTube)

237 Comments on NVIDIA to Target $450 Price-point with GeForce RTX 4060 Ti

#101
kapone32
That means the 4060 should be $399, making it a non-buy, and I bet they will all have 8GB VRAM buffers too.
#102
Vayra86
bug: All frames are fake. They're generated from a series of numbers fed into the GPU. And if you think non-DLSS3 frames are somehow more accurate, AA and AF would like to have a word with you.

I haven't paid for a card that does DLSS3, but from the videos I watched, no, you can't.
That's not the same degree of fake - AA and AF have universal support, while DLSS3 is not just hardware-limited but, on top of that, game-limited. That's not a real implementation of a feature; it's a per-game affair, forever keeping you dependent (this is key, it's the reason for its existence) on Nvidia and its TLC.

The perception of reaping DLSS3 'benefits' is fake. It's like the mob coming by for 'protection money', pretty much.
#103
Daven
Space Lynx: Why make a 7800 series card when a 6800 XT already beats/matches a 4070 Ti in some games and costs only $540...
I also wonder if AMD needs to release anything below the 7900xt. They already have the 6950xt, 6900xt, 6800xt and 6800 all with 16 GB, lower TDP and lower price. There really are no new features between RDNA2 and RDNA3 because AMD applies FSR versions across generations.

I’m not sure what a 78xx brings to the table other than 5 nm and slightly better RT performance.
#104
john_
dlgh7: That is when context is important. So yeah, if you look, Intel already has a 9% GPU market share to match the 9% from AMD, based on current numbers. Nvidia has an 81 percent market share, not 90.

However, this is why context is so important. If you look at computer sales as a whole and include integrated graphics, then because Intel has been shipping all their CPUs with integrated graphics, Intel actually controls 71% of the market. That kind of shows you the context of how many computers are actually sold and how many don't bother with a discrete option at all.
A house can have a number of PCs with only one of them being a gaming system. Or an older GPU could be used in a second system, so no new GPU is needed. Also, integrated GPUs are used in the vast majority of business and work systems, and those are probably a huge percentage of the total market. If we are also counting laptops, then all laptops come with integrated graphics and the majority probably don't include a discrete GPU. Not to mention that a system with a discrete GPU probably also has an integrated one, so we even count integrated GPUs that in reality are not used. Based on those and whatever else I forget, it is easy to understand how Intel enjoys 71% of the total GPU market. Intel is also a special case, thanks to their ties with OEMs. They didn't jump to 6-9% overnight by selling to retail; that percentage is mostly discrete GPUs sold directly to OEMs, probably as a package together with CPUs. Also, from a user perspective, not all people out there play demanding games or are obsessed with 4K resolution, overdrive RT, 60 fps, and all settings at ultra.

So what exactly is your point here? That Nvidia's market share is 81% and not 90%? Huge difference. That we can count Intel's integrated numbers and assume Nvidia is not a monopoly? Totally wrong. That only a percentage of gamers care about games and discrete GPUs? Yeah, no one is saying anything different. But Nvidia's market share and income from gaming GPUs does show that it is a very big market. What exactly is your point?
There is a reason Nvidia tried to grab up Arm. They wanted or needed to find a CPU solution to go with their GPU solution to match AMD and Intel in the long term.
Thanks. I've been saying it since the first day Nvidia expressed its interest in buying ARM. I have also posted a couple of things in my latest replies that point in that direction. In short, Nvidia was about to invest heavily in the ARM platform, but before doing so they wanted to secure no interference with their plans, meaning total control of ARM and of where the platform was going. In fact, from the first Tegra I was expecting Nvidia to start playing with the idea of full Nvidia gaming PCs/laptops/consoles. But they focused on AI and didn't really go in that direction.
#105
dlgh7
bug: All frames are fake. They're generated from a series of numbers fed into the GPU. And if you think non-DLSS3 frames are somehow more accurate, AA and AF would like to have a word with you.

I haven't paid for a card that does DLSS3, but from the videos I watched, no, you can't.
If you watch videos about DLSS, you need to consider the sources. All these sites and YouTubers have high-end gear, high-refresh-rate monitors, etc. If you watch technical breakdowns of DLSS, especially 3, you find that DLSS 3 inherits any issues that you have at the lower fps. So say 120 Hz fixes an issue, but you are running 60 natively and 120 with DLSS 3: you get the quality of the 60, not a true 120. Not to mention DLSS 3 adds enough latency that you would never use it for anything competitive at all.

Hardware Unboxed

Digital Foundry
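To put rough numbers on that point: frame generation roughly doubles the presented frame rate, but input is still only sampled at the base rate, and holding the newest rendered frame back for interpolation adds roughly one base frame time of delay. A minimal sketch of that model follows; the fixed "one base frame" latency penalty is a simplifying assumption, not a measured figure.

```python
# Minimal model of presented frame rate vs. responsiveness with frame
# generation. Simplified assumptions, not measured data: input is sampled
# once per *rendered* frame, and interpolation holds the newest rendered
# frame back by roughly one base frame time.

def frame_gen_model(base_fps: float):
    base_frame_ms = 1000.0 / base_fps
    presented_fps = base_fps * 2      # every other displayed frame is generated
    input_rate_hz = base_fps          # input still sampled at the base rate
    added_latency_ms = base_frame_ms  # waiting for the next rendered frame
    return presented_fps, input_rate_hz, added_latency_ms

for fps in (30, 60):
    shown, sampled, extra = frame_gen_model(fps)
    print(f"{fps} fps base -> {shown} fps shown, input at {sampled} Hz, "
          f"~{extra:.1f} ms extra latency")
# 30 fps base -> 60 fps shown, input at 30 Hz, ~33.3 ms extra latency
# 60 fps base -> 120 fps shown, input at 60 Hz, ~16.7 ms extra latency
```

In other words, the 120 fps DLSS 3 output is presented smoothly but still reacts like the 60 fps it was rendered from.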
#106
john_
Daven: I also wonder if AMD needs to release anything below the 7900xt. They already have the 6950xt, 6900xt, 6800xt and 6800 all with 16 GB, lower TDP and lower price. There really are no new features between RDNA2 and RDNA3 because AMD applies FSR versions across generations.

I’m not sure what a 78xx brings to the table other than 5 nm and slightly better RT performance.
The RX 6000 series is power hungry and also might not be able to support FSR 3.0. Then again, FSR 3.0 is mostly a response to Nvidia's Frame Generation, so I doubt the RX 7000 series has any kind of hardware that helps with it; probably they will offer it to the RX 6000 series also, if and when they manage to build it.
In any case, AMD needs mid-range RX 7000 series cards for better efficiency and maybe better RT performance. I don't know if an RX 7800 XT will be as fast as an RX 6900 XT in raster or RT, but if it is faster at lower power consumption, then it is needed.
#107
Mahboi
So let me recap with taxes (about 20% in most places):
4090: $2000
4080: $1500
4070 Ti: $1000
4070: $750
4060 Ti: $550
4060: $400

And every single one of these cards below the 4080 has insufficient VRAM for the next 2 years, and will have a pathetic lifespan.
The 4090 has enough bulk VRAM and compute to properly handle almost any pro task and game; everything below is just an overpriced "normal GPU".

Right.

AMD, you CANNOT LOSE. The only requirement at this point is to work for a normal price.
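For reference, those figures are roughly the US MSRPs plus ~20% VAT, rounded up to friendly numbers. A quick sketch of that arithmetic is below; the 4060 Ti price is the rumored $450 and the 4060 figure is only an assumed ballpark, so treat both as guesses rather than announced prices.

```python
# Rough sketch: US MSRP plus ~20% VAT, rounded up to the nearest $50.
# The 4060 Ti MSRP is the rumored $450; the 4060 figure is an assumed
# ballpark, not an announced price.
import math

msrp_usd = {
    "RTX 4090": 1599,
    "RTX 4080": 1199,
    "RTX 4070 Ti": 799,
    "RTX 4070": 599,
    "RTX 4060 Ti": 450,  # rumored
    "RTX 4060": 330,     # assumed ballpark
}

VAT = 0.20
for card, msrp in msrp_usd.items():
    with_tax = msrp * (1 + VAT)
    rounded = math.ceil(with_tax / 50) * 50
    print(f"{card}: ~${rounded} (pre-rounding ${with_tax:.0f})")
# RTX 4090 ~$1950, RTX 4080 ~$1450, RTX 4070 Ti ~$1000,
# RTX 4070 ~$750, RTX 4060 Ti ~$550, RTX 4060 ~$400
```

The $2000 and $1500 figures above round the top two up a bit further, but the overall pattern holds.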
#108
R0H1T
Mahboi: AMD, you CANNOT LOSE.
Did you forget fake frame gen? :slap:

The new "must have" feature.
#109
RedelZaVedno
450 bucks for a 3070 equivalent 3 years later. Oh, happy Ngreedia days! :respect:
#110
Double-Click
Mahboi: So let me recap with taxes (about 20% in most places):
4090: $2000
4080: $1500
4070 Ti: $1000
4070: $750
4060 Ti: $550
4060: $400

And every single one of these cards below the 4080 has insufficient VRAM for the next 2 years, and will have a pathetic lifespan.
The 4090 has enough bulk VRAM and compute to properly handle almost any pro task and game; everything below is just an overpriced "normal GPU".

Right.

AMD, you CANNOT LOSE. The only requirement at this point is to work for a normal price.
AMD already passed on that opportunity with the 7900XTX / XT.
Nvidia may be the one shifting goal posts with market pricing, but AMD (so far) is happily slotting their cards right into the uptick based on comparative performance.
#111
Chrispy_
So many reviewers and Youtubers are doubling down on the message, and it's very very unambiguous.
8GB isn't enough for a serious gaming GPU in 2023.

All rumours and leaks seem to confirm 8GB, but if they pull a double-density affair like the 3060 12GB, then a 4060Ti with 16GB for $450 might be okay. $440 does get you a brand new 16GB RX 6800 (well, after a $25 rebate - median prices seem to start at $465).
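On the memory-capacity point, a 128-bit bus means four 32-bit GDDR6 channels, so the plausible configurations fall out of simple arithmetic. The sketch below assumes standard 2 GB (16 Gbit) GDDR6 chips and that a 16GB card would use a clamshell layout with two chips per channel; that is an assumption about a hypothetical SKU, not a confirmed spec.

```python
# Sketch of GDDR6 capacity options, assuming 2 GB (16 Gbit) chips.
# "Clamshell" means two chips share one 32-bit channel, which doubles
# capacity without widening the bus (raw bandwidth stays the same).

CHIP_CAPACITY_GB = 2      # standard 16 Gbit GDDR6 device
CHANNEL_WIDTH_BITS = 32   # one channel per chip in a normal layout

def vram_gb(bus_width_bits: int, chips_per_channel: int = 1) -> int:
    channels = bus_width_bits // CHANNEL_WIDTH_BITS
    return channels * chips_per_channel * CHIP_CAPACITY_GB

print(vram_gb(128))                       # 8  -> the expected 4060 Ti config
print(vram_gb(128, chips_per_channel=2))  # 16 -> a hypothetical clamshell variant
print(vram_gb(192))                       # 12 -> how the 3060 gets 12GB on 192-bit
```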
#112
64K
Might be a $450 MSRP but I bet retailers will be charging around $500.
#113
ThrashZone
Hi,
A little too close to April Fools' Day.
Sadly some will pay it :kookoo:
#114
bug
Vayra86: That's not the same degree of fake - AA and AF have universal support, while DLSS3 is not just hardware-limited but, on top of that, game-limited. That's not a real implementation of a feature; it's a per-game affair, forever keeping you dependent (this is key, it's the reason for its existence) on Nvidia and its TLC.

The perception of reaping DLSS3 'benefits' is fake. It's like the mob coming by for 'protection money', pretty much.
So this isn't about the tech even? It's just about it being an Nvidia exclusive?
#115
Vayra86
bug: So this isn't about the tech even? It's just about it being an Nvidia exclusive?
It's about the extent to which you can actually use and extract the feature set's/tech's benefits. When they require a per-game implementation on top of a hardware requirement, it becomes pretty difficult to argue that it'll work consistently. There is no way Nvidia is keeping up with everything; historically they haven't, and even their very last generation of GPUs is further reinforcement of that fact. That last gen, and the generations before it, represent a much greater part of the market than whatever Ada is carving out now. SLI/Crossfire is the best example of how this will go: it died because of the difficulty of maintaining support and because the cost/benefit didn't work out favorably. Will DLSS3 get to a point where it's 'add this DLL and you're good'? Maybe. But even then, it's restricted to a single gen of hardware thus far. That's not a good outlook for future, continued support. You can wait on DLSS4...

AA and AF are nowhere near that reality, which puts a different lens on the idea of 'fake frames'. Similarly, other (AA) techniques Nvidia developed have or have not made it to widespread use and hardware-agnostic adoption. The industry settles on things, such as T(X?)AA, and that's that. There is not a single motivator to 'settle on DLSS3'. It serves only to push Nvidia's marketing agenda. Nvidia doesn't own the console hardware either, so what real market is left to really go with Nvidia's flow and keep investing in it when Nvidia stops doing so?

They did the same thing when they showcased Pascal vs Turing performance for RT. 'Look at that mighty difference'. And today you're still looking at a vast majority of games where RT is nowhere to be found. Despite DXR and despite support on AMD. We're 5 whopping years in. Nvidia made a massive gamble and now they're stuck with a bunch of cores on their GPUs looking for problems to chew on and make themselves worthwhile. Without their bags of money to implement features in games, I wonder what'll be left. In the meantime AMD is turtling forward, adopting whatever is useful to them and setting the bar with console hardware while Intel is doing much the same within their own tiny garden. We'll get there, sure... but not because Nvidia said so and most definitely not through proprietary-only solutions.
dlgh7: If you watch videos about DLSS, you need to consider the sources. All these sites and YouTubers have high-end gear, high-refresh-rate monitors, etc. If you watch technical breakdowns of DLSS, especially 3, you find that DLSS 3 inherits any issues that you have at the lower fps. So say 120 Hz fixes an issue, but you are running 60 natively and 120 with DLSS 3: you get the quality of the 60, not a true 120. Not to mention DLSS 3 adds enough latency that you would never use it for anything competitive at all.

Hardware Unboxed

Digital Foundry
DLSS3 is a bit of a mixed bag. It's great when you already have a base FPS of 60... but you need it most when you actually have a base FPS of 30, and that'll still play and feel like 30. To me that feels and sounds like a pointless exercise. The games that work best at high refresh also want that FPS with low latency.
#116
BSim500
bug: So this isn't about the tech even? It's just about it being an Nvidia exclusive?
Personally speaking, my last 3-4 GPUs have been nVidia, but I agree with those calling BS on the current trend where "Fake Frames (tm)" marketing slides plastered over everything very obviously attempt to normalize benchmarking games with vendor-locked proprietary features enabled as "standard" (an unhealthy thing for the industry regardless of who does it). It looks doubly stupid when, not that long ago, the very same people now pushing DLSS 2-3 as the Second Coming of Jesus for PC spent 2-3 years mocking "console peasants" for their 'fake 4K' upscaling/interpolation...

The real bottom line is that low-to-mid-range GPUs now have appallingly bad value. "Just enable DLSS" doesn't "solve" that problem, it just tries to sweep it under the carpet. And it's that "let's turn an enhancement into a crutch" attitude plus the underlying marketing BS that many people are really calling out. It's no different than if Intel started charging $499 for new quad-cores and then said "Yo, we noticed we're not looking too good in the perf/$ charts, so from now on we want you to base your perf/$ CPU review charts on video encoding, measuring new $499 quad-cores using Intel Quicksync and comparing them to previous-gen 6-8 core CPUs that were using software encoding..." to 'fake inflate' what you're actually gaining from a "like for like" upgrade over the previous gen and justify an "up-tiering" of pricing...
#117
Vayra86
BSim500: Personally speaking, my last 3-4 GPUs have been nVidia, but I agree with those calling BS on the current trend where "Fake Frames (tm)" marketing slides plastered over everything very obviously attempt to normalize benchmarking games with vendor-locked proprietary features enabled as "standard" (an unhealthy thing for the industry regardless of who does it). It looks doubly stupid when, not that long ago, the very same people now pushing DLSS 2-3 as the Second Coming of Jesus for PC spent 2-3 years mocking "console peasants" for their 'fake 4K' upscaling/interpolation...

The real bottom line is that low-to-mid-range GPUs now have appallingly bad value. "Just enable DLSS" doesn't "solve" that problem, it just tries to sweep it under the carpet. And it's that "let's turn an enhancement into a crutch" attitude plus the underlying marketing BS that many people are really calling out. It's no different than if Intel started charging $499 for new quad-cores and then said "Yo, we noticed we're not looking too good in the perf/$ charts, so from now on we want you to base your perf/$ CPU review charts on video encoding, measuring new $499 quad-cores using Intel Quicksync and comparing them to previous-gen 6-8 core CPUs that were using software encoding..." to 'fake inflate' what you're actually gaining from a "like for like" upgrade over the previous gen and justify an "up-tiering" of pricing...
The Nvidia strategy here is obvious:
1. 'We kill perf with a shiny new graphics effect'
2. 'We implement a feature that'll sacrifice some IQ for major FPS jump, so you can use shiny new graphics effect'
3. 'We keep improving said feature and tie it to our newest hardware so you can enjoy an even bigger FPS jump, and you now depend on not one, but TWO of our technologies combined'

This is the new 'customer is king' approach for Gen Z, I guess. It's not my cup of tea. I'm not that naive. It's the same thing as subscribing to a service to play games. Effectively, Nvidia is implementing a hardware subscription, and they can fuck right off. 'You will pay and you will own nothing'. Without Nvidia's special sauce, what's left of Ada is really a pretty poor stack of GPUs at a highly inflated price - the only thing I can really applaud is the energy efficiency... which in part comes from a heavily neutered bus and low VRAM, handling more of that traffic on the shrunk and efficient die itself.
Chrispy_: So many reviewers and Youtubers are doubling down on the message, and it's very very unambiguous.
8GB isn't enough for a serious gaming GPU in 2023.

All rumours and leaks seem to confirm 8GB, but if they pull a double-density affair like the 3060 12GB, then a 4060Ti with 16GB for $450 might be okay. $440 does get you a brand new 16GB RX 6800 (well, after a $25 rebate - median prices seem to start at $465).
Right, and then you have a 4070 Ti with 12GB... plus a 4070 with the very same thing... And there's a 4060 Ti with 16GB! So what's that then... the new 'poor man's 4080'? :rockout::roll:

I'd honestly laugh my ass off.
#118
Chrispy_
90% of game developers are working on console-first or cross-platform titles that need to run well on the PS5 and Series X. Until Nvidia bids for the next console generation, wins the bid(s), and puts frame-gen and RT focus into those consoles, those features are nothing but a bit of gloss added to games that don't need them.

Of the 35 games currently supporting DLSS3, only a fraction of them are anywhere close to frame doubling, yet latency is always doubled. It's no surprise that Nvidia is pushing those few games that scale well super hard and it's a very very distorted representation of reality.
#119
Vayra86
Double-Click: AMD already passed on that opportunity with the 7900XTX / XT.
Nvidia may be the one shifting goal posts with market pricing, but AMD (so far) is happily slotting their cards right into the uptick based on comparative performance.
Well consider this for local pricing, tax included:

Cheapest cards from 600-900 EUR:
I think AMD covers the segment admirably, offering higher raster perf with much better VRAM capacities. They definitely do need something to replace the 69xx and 68xx, but you and I both know they won't exceed 4070 pricing.

Note the absence of the 4070 Ti - it slots in above 900,- and then you've STILL just got 12GB to work with. Also consider that the 7900 XTX is 150 EUR higher - and available at that price as well - and 51% faster than the 4070. 4080s, however, start at 1279,-

So we'll likely see the 4060 Ti at 475-500 EUR for entry/cheap AIB versions... for 8GB, the capacity that was fine back in 2016-2021.

#120
tfdsaf
droopyRO: You mean the 4070, not the 4070 Ti? Here, there is about a $100 price difference between a 6800 XT and a 6950. I am currently split between an RTX 4070 that draws 200 W and has DLSS3, and an RX 6950 that pulls double the wattage but has 16GB of VRAM. A 7800 XT with 16GB and 200 W, priced close to the RTX 4070, would have been the ideal card for me.

No, they won't. Neither AMD nor Nvidia will lower prices more than they have. And you don't have to be "rich" to afford a 500-1000 USD/EUR card; it's not a bloody 70-100K sports car :)

There is the second-hand market if you are so tight on your budget. You can get good deals on hardware if you look every day. But calling everyone "stupid" will get you nowhere.
You have to be rich to buy $1000 GPUs, let's be honest, or alternatively be a bum who still lives in his parents' basement, doesn't contribute to the house at all, and whose mommy still makes him food even though he's 30 years old! So if your only expenses are PC gaming and the internet bill, and mommy and daddy let you stay in their basement rent-free and even buy and make all your food, then maybe you can afford $1000 for a GPU.

If you have your own apartment and bills to pay and food to buy and whatnot, then $500 to $1000 is a big stretch!

Not everyone is stupid; there are those bums, as I've said, who still live with their parents rent-free and whose only expenditure is gaming, and thus they can afford unreasonable prices!
bug: So you want more VRAM (presumably for future-proofing), but you don't want RT (again, presumably because it adds nothing to future-proofing). Is that right?

Hate it all you want, but once you flip on DLSS3, your Nvidia card will generate more frames per second. If more fps isn't something worth paying more for*, I don't know what is. Those features add value. Not to you, apparently, but that doesn't mean everyone else should ignore them because you do.

*I mean paying more in the general sense, not at these stupid prices in particular.
Those are fake frames which cause screen delay; if you've played any sort of multiplayer game you'll know! You don't want fuzzy generated frames that add input latency and cause a screen delay. That fake frame doesn't add anything of value; it's just a tacked-on frame that oftentimes will negatively impact your gaming experience, especially in multiplayer games where you want a competitive advantage!

Play Counter-Strike, play Dota, play Overwatch, Call of Duty, etc... you don't want fake frames negatively impacting your gameplay! In fact, you want the effects and animations to be over quickly so you have a clearer view, rather than having more fake interpolated frames cluttering up the image!

DLSS 3's fake interpolated frames don't actually improve your gaming experience, and they are very limited in when you can use them! If the game already runs at 60 fps or below, you can't use it; it will cause all sorts of issues with the image. And if you are already running at more than 120 frames per second, you don't need extra frames, as you will be limited by your monitor output or just PC latency.

The simple way to think about this is that you see a slightly smoother image, but the feel is off! You feel the added input latency; you feel like your movements and actions don't correspond to the image you are seeing. That is the issue with fake frames, and Nvidia can only provide you with fake frames because they can't offer you improved performance over previous generations. They are not giving you more VRAM, they are not giving you more value, so they have to rely on fake frames and gimmicks to fool you into buying their overpriced turds!
#121
Blitzkuchen
$450 for a 4060 Ti and you will need DLSS for 1440p; sorry, but I'm playing on PC to play at native resolution.
If I want upscaling, then I'll buy a console.

I see the future for me on PC being only sandbox games with the Arc 770 and 16GB;
for all other games I'll drop the PC and, instead of a $450 4060 Ti, buy a PS5 with a disc drive for $500.

Like Torvalds said in the past, FCK NVIDIA - but now AMD too, with the same prices. :laugh:
#122
Garrus
It doesn't matter what settings you use, you need 12GB of VRAM. 8GB is not enough. I've had many problems with recent games; they either crash or don't load textures. My RTX 3070 required me to play The Last of Us at 720p to avoid crashes. Interestingly, I had to use native 720p; 1440p DLSS upscaled from 720p also used too much VRAM.

It has to beat the 3070 Ti and have 12GB VRAM. Otherwise at $450 it is useless.

DLSS3 is only suitable for controller games; it causes a kind of severe motion blur with mouse movement. It's not just a latency issue. It is also very, very buggy in Cyberpunk: it ruins cinematic timing, causes desync, and also causes massive stutters as it enables and disables every time you open the menu, for example. It's not bad for controller games that are more slow-paced. I'm not paying for DLSS3; it is worth nothing to me. I want actual native performance for my dollars.
#123
noel_fs
Midrange at $450, what a fucking disgrace.
#124
Blitzkuchen
Daven: 1060 6GB $300
2060 Super $400
3060 Ti $400
4060 Ti $450

Yup, prices are definitely going up.
560: $199
660: $229
760: $249
960: $199
1060 6GB: $250

The 560 was the 4th-fastest GPU, after the 560 Ti, 570 and 580.
Now, in the 30 series, we have the 3090 Ti, 3090, 3080 Ti, 3080, 3070 Ti, 3070, 3060 Ti and 3060.

A 560 from 2012 would be a 3080 today: $199 vs $699.
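Even adjusting for inflation, that gap doesn't close. A rough sketch, assuming roughly 30-35% cumulative US inflation between ~2012 and 2023 (an approximation, not an exact CPI calculation):

```python
# Rough inflation adjustment of the GTX 560's $199 launch price.
# The 30-35% cumulative-inflation range for ~2012-2023 is an
# approximation, not an exact CPI figure.

gtx_560_price = 199
for cumulative_inflation in (0.30, 0.35):
    adjusted = gtx_560_price * (1 + cumulative_inflation)
    print(f"~${adjusted:.0f} in 2023 dollars at {cumulative_inflation:.0%} inflation")
# ~$259 to ~$269 in 2023 dollars -- still nowhere near the 3080's $699 MSRP
```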
Squared: Intel's original schedule called for Alchemist in 2022H1, Battlemage in 2023, and Celestial in 2024. But now the news, which I guess is a rumor, calls for Battlemage in 2024H2 and Celestial in 2026H2.
www.techpowerup.com/306474/intel-arc-battlemage-to-double-shader-count-pack-larger-caches-use-tsmc-4-nm
The Arc 380 has 6GB, and its performance after several patches is on par with the GTX 1650.


GTX 1650 4GB = 179€
RX 6400 4GB (only useful with PCIe 4.0) = 168€
GTX 1630 4GB = 159€
Arc 380 6GB = 151€ (features AV1 and Quicksync, which all the others lack) ;)
#125
sLowEnd
tfdsaf: You have to be rich to buy $1000 GPUs, let's be honest, or alternatively be a bum who still lives in his parents' basement, doesn't contribute to the house at all, and whose mommy still makes him food even though he's 30 years old! So if your only expenses are PC gaming and the internet bill, and mommy and daddy let you stay in their basement rent-free and even buy and make all your food, then maybe you can afford $1000 for a GPU.

If you have your own apartment and bills to pay and food to buy and whatnot, then $500 to $1000 is a big stretch!

Not everyone is stupid; there are those bums, as I've said, who still live with their parents rent-free and whose only expenditure is gaming, and thus they can afford unreasonable prices!
If you'd step outside of your bubble for a moment, you'd realize that $1000 is expensive in a PC-building context, but not in hobbies in general. There are many, many hobbies out there where $1000 won't get you much, and many, many people who partake in said hobbies. Think for a moment.