Monday, April 17th 2023
NVIDIA to Target $450 Price-point with GeForce RTX 4060 Ti
NVIDIA is preparing its fifth GeForce RTX 40-series "Ada" graphics card launch in May 2023, with the GeForce RTX 4060 Ti. Red Gaming Tech reports that the company could target the USD $450 price-point with this SKU, putting it $150 below the recently launched RTX 4070, and $350 below the RTX 4070 Ti. The RTX 4060 Ti is expected to nearly max out the 5 nm "AD106" silicon, the same chip that powers the RTX 4070 Laptop GPU. While the notebook chip maxes it out, featuring all 4,608 CUDA cores physically present across its 36 streaming multiprocessors (SM), the desktop RTX 4060 Ti will be slightly cut down to 34 SM, which works out to 4,352 CUDA cores. The "AD106" silicon features a 128-bit wide memory interface, and NVIDIA is expected to use conventional 18 Gbps-rated GDDR6 memory chips. The design goal behind the RTX 4060 Ti could be to beat the previous-generation RTX 3070 and sneak up on the RTX 3070 Ti, while offering greater energy efficiency and new features such as DLSS 3.
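The core-count and bandwidth figures above follow from simple arithmetic, assuming (per NVIDIA's Ada whitepaper) 128 CUDA cores per SM. A minimal sketch:

```python
# Back-of-the-envelope math for the figures quoted above.
# Assumes Ada's 128 CUDA cores per SM; bandwidth = (bus width / 8) * data rate.
CORES_PER_SM = 128

def cuda_cores(sm_count: int) -> int:
    """Total CUDA cores for a given number of streaming multiprocessors."""
    return sm_count * CORES_PER_SM

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(cuda_cores(36))           # full AD106: 4608
print(cuda_cores(34))           # rumoured desktop RTX 4060 Ti: 4352
print(bandwidth_gb_s(128, 18))  # 128-bit @ 18 Gbps: 288.0 GB/s
```

That 288 GB/s is notably below the RTX 3070's 448 GB/s, which is why the narrow bus draws so much comment below.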
Source:
Red Gaming Tech (YouTube)
237 Comments on NVIDIA to Target $450 Price-point with GeForce RTX 4060 Ti
The perceived 'benefits' of DLSS 3 are fake. It's pretty much like the mob coming by for 'protection money'.
I’m not sure what a 78xx brings to the table other than 5 nm and slightly better RT performance.
So what exactly is your point here? That Nvidia's market share is 81% and not 90%? Huge difference. That we can count Intel's integrated numbers and assume Nvidia is not a monopoly? Totally wrong. That only a percentage of gamers care about games and discrete GPUs? Yeah, no one is saying anything different. But Nvidia's market share and income from gaming GPUs does show that it is a very big market. What exactly is your point?

Thanks. I've been saying it from the first day Nvidia expressed its interest in buying ARM. I've also posted a couple of things in my latest replies that point in that direction. In short, Nvidia was about to invest heavily in the ARM platform, but before doing so they wanted to secure no interference with their plans, meaning total control of ARM and where the platform was going. In fact, from the first Tegra I was expecting Nvidia to start playing with the idea of full Nvidia gaming PCs/laptops/consoles. But they focused on AI and never really went in that direction.
Hardware Unboxed
Digital Foundry
In any case, AMD needs mid-range RX 7000-series cards for better efficiency and maybe better RT performance. I don't know if an RX 7800 XT will be as fast as an RX 6900 XT in raster or RT, but if it's faster at lower power consumption, then it's needed.
4090: $2000
4080: $1500
4070 Ti: $1000
4070: $750
4060 Ti: $550
4060: $400
And every single one of these cards below the 4080 has insufficient VRAM for the next 2 years, and will have a pathetic lifespan.
The 4090 has enough bulk VRAM and compute to properly handle almost any pro task and game, everything below is just an overpriced "normal GPU".
Right.
AMD, you CANNOT LOSE. The only requirement at this point is to work for a normal price.
Fake-frames gen :slap: The new "must have" feature.
Nvidia may be the one shifting goal posts with market pricing, but AMD (so far) is happily slotting their cards right into the uptick based on comparative performance.
8GB isn't enough for a serious gaming GPU in 2023.
All rumours and leaks seem to confirm 8GB, but if they pull a double-density affair like the 3060 12GB, then a 4060Ti with 16GB for $450 might be okay. $440 does get you a brand new 16GB RX 6800 (well, after a $25 rebate - median prices seem to start at $465).
A little too close to April Fools' Day.
Sadly some will pay it :kookoo:
AA and AF are nowhere near that reality, which puts a different lens on the idea of 'fake frames'. Similarly, other (AA) techniques Nvidia developed have or have not made it to widespread use and hardware-agnostic adoption. The industry settles on things, such as T(X?)AA, and that's that. There is not a single motivator to 'settle on DLSS3'. It serves only to push Nvidia's marketing agenda. Nvidia doesn't own the console hardware either, so what real market is left to go with Nvidia's flow and keep investing in it when Nvidia stops doing so?
They did the same thing when they showcased Pascal vs Turing performance for RT. 'Look at that mighty difference'. And today you're still looking at a vast majority of games where RT is nowhere to be found, despite DXR and despite support on AMD. We're 5 whopping years in. Nvidia made a massive gamble and now they're stuck with a bunch of cores on their GPUs looking for problems to chew on to make themselves worthwhile. Without their bags of money to implement features in games, I wonder what'll be left. In the meantime AMD is turtling forward, adopting whatever is useful to them and setting the bar with console hardware, while Intel is doing much the same within their own tiny garden. We'll get there, sure... but not because Nvidia said so, and most definitely not through proprietary-only solutions. DLSS3 is a bit of a mixed bag. It's great when you already have a base FPS of 60... but you need it most when you actually have a base FPS of 30. And that'll still play and feel like 30. To me that feels and sounds like a pointless exercise. The games that work best on high refresh also want that FPS with low latency.
The real bottom line is that low-to-mid-range GPUs now have appallingly bad value. "Just enable DLSS" doesn't "solve" that problem, it just tries to sweep it under the carpet. And it's that "let's turn an enhancement into a crutch" plus the underlying marketing BS that many people are really calling out. It's no different than if Intel started charging $499 for new quad-cores then said "Yo, we noticed we're not looking too good in the perf / $ charts, so from now on we want you to base your perf / $ CPU review charts on video encoding, measuring new $499 quad-cores using Intel Quick Sync, then compare them to previous-gen 6-8 core CPUs that were using software encoding..." to artificially inflate what you're actually gaining from upgrading "like for like" from the previous gen, to justify an "up-tiering" of pricing...
1. 'We kill perf with a shiny new graphics effect'
2. 'We implement a feature that'll sacrifice some IQ for major FPS jump, so you can use shiny new graphics effect'
3. 'We keep improving said feature and tie it to our newest hardware so you can enjoy an even bigger FPS jump, and you now depend on not one, but TWO of our technologies combined'
This is the new 'customer is king' approach for Gen Z, I guess. It's not my cup of tea. I'm not this naive. It's the same thing as subscribing to a service to play games. Effectively Nvidia is implementing a hardware subscription and they can fuck right off. 'You will pay and you will own nothing'. Without Nvidia's special sauce, what's left of Ada is really a pretty poor stack of GPUs at a highly inflated price - the only thing I can really applaud is the energy efficiency... which in part comes from a heavily neutered bus and low VRAM, handling more of that traffic on the shrunk and efficient die itself. Right, and then you have a 4070 Ti with 12GB... plus a 4070 with the very same thing... And there's a 4060 Ti with 16GB! So what's that then... the new 'poor man's 4080'? :rockout::roll:
I'd honestly laugh my ass off.
Of the 35 games currently supporting DLSS3, only a fraction come anywhere close to frame doubling, yet the latency cost is always there. It's no surprise that Nvidia is pushing the few games that scale well super hard, and it's a very, very distorted representation of reality.
Cheapest cards from 600-900 EUR:
I think AMD covers the segment admirably, offering higher raster perf with much better VRAM capacities. They definitely do need something to replace the 69xx and 68xx, but you and I both know they won't exceed 4070 pricing.
Note the absence of the 4070 Ti - it slots in above €900, and then you've STILL just got 12GB to work with. Also consider that the 7900 XTX is €150 higher - and available at that price as well - and 51% faster than the 4070. 4080s, however, start at €1279.
So we'll likely see the 4060 Ti at 475-500 EUR for entry-level/cheap AIB versions... for 8GB, a cap that was fine from 2016 to 2021.
If you have your own apartment and bills to pay and food to buy and whatnot, then $500 to $1000 is a big stretch!
Not everyone is stupid; there are those bums, as I've said, who still live with their parents rent-free and whose only expenditure is gaming, so they can afford unreasonable prices! Those are fake frames which cause screen delay - if you've played any sort of multiplayer game you'll know! You don't want fuzzy generated frames that add input latency and cause a screen delay. That fake frame doesn't add anything of value; it's just a tacked-on frame that will often negatively impact your gaming experience, especially in multiplayer games where you want a competitive advantage!
Play counter strike, play Dota, play Overwatch, Call of Duty, etc... you don't want fake frames negatively impacting your gameplay! In fact, you want the effects and animation to be over quickly, so you have a clearer view, rather than having more fake interpolated frames cluttering up the image!
DLSS 3's fake interpolated frames don't actually improve your gaming experience, and the situations where you can use them are very limited! If the game already runs at 60fps or below you can't use it - it will cause all sorts of issues with the image - and if you are already running at more than 120 frames per second you don't need extra frames, as you'll be limited by your monitor's refresh rate or by system latency anyway.
The simple way to think about this is that you see a slightly smoother image, but the feeling is crooked! You feel the added input latency; you feel like your movements and actions don't correspond to the image you are seeing. That is the issue with fake frames, and Nvidia can only offer you fake frames, because they can't offer you improved performance over previous generations. They're not giving you more VRAM, they're not giving you more value, so they have to rely on fake frames and gimmicks to fool you into buying their overpriced turds!
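The latency complaint in the comments above has a simple mechanical basis: to interpolate between frame N and frame N+1, the GPU has to hold frame N back until N+1 has been rendered, so rendered content reaches the screen roughly one base frame time late even though the presented frame rate doubles. A toy model (ignoring Reflex, render queues, and display scan-out):

```python
# Simplified model of frame generation: presented FPS doubles, but the
# interpolator must buffer one rendered frame, adding ~one base frame
# time of latency. Real pipelines differ; this is illustrative only.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

def with_frame_generation(base_fps: float) -> tuple[float, float]:
    """Returns (presented_fps, extra_latency_ms) under this toy model."""
    return base_fps * 2, frame_time_ms(base_fps)

print(with_frame_generation(30))  # 60 presented FPS, ~33.3 ms extra latency
print(with_frame_generation(60))  # 120 presented FPS, ~16.7 ms extra latency
```

This is why the "you need it most at 30 FPS, but it hurts most at 30 FPS" argument recurs in this thread: the added latency is largest exactly where the smoothing is most wanted.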
If I want upscaling, I'll buy a console.
The way I see it, the future for me on PC is only sandbox games with the Arc A770 16GB;
for all other games I'll drop the PC and, instead of a $450 4060 Ti, buy a PS5 with disc drive for $500.
Like Torvalds said in the past, FCK NVIDIA - but now AMD too, with the same prices. :laugh:
It has to beat the 3070 Ti and have 12GB VRAM. Otherwise at $450 it is useless.
DLSS3 is only suitable for controller games, it causes a kind of severe motion blur with mouse movement. It's not just a latency issue. Also it is very very buggy in Cyberpunk, ruins cinematic timing and causes desync, also causes massive stutters as it enables and disables every time you open the menu for example. Not bad for controller games that are more slow paced. I'm not paying for DLSS3, it is worth nothing to me. I want actual native performance for my dollars.
GTX 660: $229
GTX 760: $249
GTX 960: $199
GTX 1060 6GB: $250
The 560 was the 4th-fastest GPU, after the 560 Ti, 570, and 580.
Now in the 3000 series we have the 3090 Ti, 3090, 3080 Ti, 3080, 3070 Ti, 3070, 3060 Ti, and 3060.
The 560's slot in the stack from 2012 is today's 3080: $199 vs $699.

The Arc A380 has 6GB, and after several patches its performance is on par with the GTX 1650.
GTX 1650 4GB = 179€
RX 6400 4GB (only useful with PCIe 4.0) = 168€
GTX 1630 4GB = 159€
Arc A380 6GB = 151€ (AV1 and Quick Sync - features none of the others have) ;)