Monday, April 17th 2023
NVIDIA to Target $450 Price-point with GeForce RTX 4060 Ti
NVIDIA is preparing its fifth GeForce RTX 40-series "Ada" graphics card launch in May 2023, with the GeForce RTX 4060 Ti. Red Gaming Tech reports that the company could target the USD $450 price-point with this SKU, putting it $150 below the recently launched RTX 4070, and $350 below the RTX 4070 Ti. The RTX 4060 Ti is expected to nearly max out the 5 nm "AD106" silicon, the same chip that powers the RTX 4070 Laptop GPU. While the notebook chip maxes it out, featuring all 4,608 CUDA cores physically present across its 36 SM, the desktop RTX 4060 Ti will be slightly cut down, featuring 34 SM, which works out to 4,352 CUDA cores. The "AD106" silicon features a 128-bit wide memory interface, and NVIDIA is expected to use conventional 18 Gbps GDDR6 memory chips. The design goal behind the RTX 4060 Ti could be to beat the previous-generation RTX 3070 and sneak up on the RTX 3070 Ti, while offering greater energy efficiency and new features such as DLSS 3.
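For reference, the rumored shader count and memory bandwidth follow directly from the SM count and bus width; a quick back-of-the-envelope check in Python, assuming Ada's 128 FP32 CUDA cores per SM:

    # Back-of-the-envelope check of the rumored RTX 4060 Ti specs
    sm_count = 34
    cuda_cores = sm_count * 128            # Ada packs 128 FP32 CUDA cores per SM -> 4,352
    bus_width_bits = 128
    data_rate_gbps = 18                    # per-pin GDDR6 data rate
    bandwidth_gbs = bus_width_bits * data_rate_gbps / 8   # -> 288 GB/s
    print(cuda_cores, bandwidth_gbs)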
Source:
Red Gaming Tech (YouTube)
237 Comments on NVIDIA to Target $450 Price-point with GeForce RTX 4060 Ti
The biggest problem here is that I bought my computer 4 years ago (I'm not rich! I replace my entire system every 8-10 years if I can).
So I currently have an almost 5-year-old system, which was a top model at the time, but since then I haven't been able to install a "normal" video card (a 6800 XT, for example), because everything is more expensive for us in the EU than, say, in America.
But I'll go on: in my country (in stupid Hungary) there is such a huge VAT on these products that it would shock you. On top of that came the scalpers, the miners, the chip shortage and the COVID lockdowns. (A quick example: a 6800 XT here is ~$900, and that's the cheapest one at the moment.) And now inflation, war, and total daylight robbery in the video card market, which many people justify with such nonsense... If everyone keeps this up, next year we'll be starting at $5,500 for the 5090. I know that's an exaggerated, extreme example, but I wouldn't be surprised if even that were an acceptable price for many people. And don't get me wrong, things really are moving in this direction, even if you're laughing at me now.
Anyway, it doesn't matter, because we can't do anything as long as so many people keep paying this much money for these things, but there are definitely a lot of idiots. Much to Nvidia's delight!
You can hate what I just wrote, but that's how I see and feel it, and there is a lot of truth in it.
I won't call people idiots for doing what they want with their money, though. Sure, I would have preferred if people, like me, had stopped buying and made Nvidia and AMD think about lowering their prices sooner, but it is what it is.
Fwiw, I'm not rich, but I can afford to pay $1,000 for a video card. But since I rarely game, it just doesn't make sense for me to do that. There are a lot more useful things $1,000 will buy, including a week in Greece, for example.
Guys, hyperinflation is a thing. Costs are going up. Get used to it already. You will own nothing, and you will be happy.
When the PS6/Xbox series 2 comes out at $1000 people are going to twist themselves into pretzels.
My take on this is slightly different, though: the gaming industry may be dying, but not gaming itself. I mean, if there is no longer a huge install base of powerful GPUs, we may not see FIFA 50. But smaller, smarter, more focused titles will always be with us, in some form. Think about titles like Lemmings, The Lost Vikings or Supaplex. They don't need Ubisoft behind them, yet they're tons of fun and actually make you think.
This also puts the further development of RT in an interesting perspective. In that sense, I truly don't understand Nvidia's long-term agenda.
I don't think PC gaming is dying, just that it hit a wall, a plateau. And something has to happen to get us over it.
My previous psychological limit was set at 500-550 EUR, but that was with the idea of more frequent upgrading (every odd gen, give or take) and lower resale value on cards. Today I can still sell a 1080 for close to 200 EUR. It's crazy if you consider how old the card already is, and it shows the performance increases gen-to-gen have stalled massively, in part due to a slower release cadence on GPUs ever since Pascal (and I believe Maxwell to Pascal was already more than a year apart).
The second-hand market has been problematic because there was barely any movement in graphics cards, and what did become available had been mined on. It's going to take a year post-Ada, I reckon, to recover and get back a semblance of bang for buck... and then you're still risking buying a mining card if it's anything RDNA2 or Ampere or earlier.
Gaming won't die, PC gaming won't die, but people will move to newer GPUs more slowly, which will in turn slow down the progress of new technologies, because the mainstream market just isn't adopting them any time soon. Again: RT is going to see this in a big way. We're already three generations (almost 5 years!) in and it's still not moving anywhere big, starting to mimic VR. It exists, it's nice to have, but you can do without it just fine. Power efficiency was my main reason to just get a 7900XT; the price difference is pretty much a new PSU and the power gap. For just 70 added bucks I could move the whole budget to the GPU instead of grabbing a 6950XT, and that money is recouped over the course of the 7900XT's life. Newer cards are actually sold at MSRP over here in the Netherlands. Maybe it's about time to vote Orban away; power corrupts, and I reckon Hungary isn't a popular market lately.
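Whether that ~70 EUR gap really is recouped depends entirely on the wattage delta and the electricity price; a minimal sketch with assumed figures (the draw numbers and kWh price below are placeholders, not measurements):

    # Hypothetical break-even estimate for the ~70 EUR price gap mentioned above
    price_gap_eur = 70
    watts_saved = 335 - 300                # assumed gaming draw: 6950 XT vs 7900 XT TBP
    eur_per_kwh = 0.40                     # assumed Dutch electricity price
    hours_to_break_even = price_gap_eur / (watts_saved / 1000 * eur_per_kwh)
    print(round(hours_to_break_even), "hours of gaming")   # ~5,000 hours under these assumptions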
I was like that for maybe 10 years after getting a job. Then suddenly, I didn't even have time for that. On the flip side, you can imagine how relaxed I am looking at the new prices of video cards and people fretting over them :D
Should we pay for the TAA anti-aliasing method as a feature? Should we pay for GPUs being able to output more than 720p video resolution? Should we pay for driver updates?
I don't see how people justify insane prices and premium pricing for so-called "features"! These are NOT features, they are add-ons which we expect and which have been included since the DAWN OF MAN!
We don't go into the store and search for rock knives, and then pay a $200 premium on a metal knife because it "features" metal. No, we expect it to be metal, we expect it to be sharp when we buy it new, and we expect it not to rust the third time we wash it!
Even if AMD had ZERO ray tracing capability, had NO upscaling of their own, and had the same amount of, say, 12GB of VRAM as an Nvidia counterpart, I would still not pay a premium on an Nvidia card because it has ray tracing. Or at best, and I'm stretching here, I'd pay say $20 more to have access to those add-ons. I certainly wouldn't pay $100 more for a card that is 5% slower in the 99.9% of games which use rasterization ONLY, and that has LESS VRAM! That is being a fool and being scammed HARD!
But Nvidia expects people to accept less VRAM than the competition, be slower in the 90% of games which use rasterization and don't even feature RT (again, go through the Steam library: 99.9% of games are NOT RT-capable), and still shell out $100 more for a "feature" like RT, which can't even run in many games because they don't give you the VRAM to do so!
I've been saying this since Huang first yelled "10 Gigarays," as if that were somehow a fantastic number we should all be amazed at. Nobody to this day has the slightest clue what it meant. Lots of people apparently were mighty impressed, as they bought Turing at premium prices when it was barely a hair better than Pascal. And then they took that as their new baseline because 'muh RT perf'.
To each their own... All I see is idiocy. Even today RT performance is barely palatable except on a 4090... unless you path-trace a square box, in which case the FPS nosedives to sub-30. I totally get why there is a $450 4060 Ti now: fools and money exist, and they apparently LOVE to 'trade up'.
However, this is why context is so important. If you look at computer sales as a whole and include integrated graphics, then because Intel has been shipping all their CPUs with integrated graphics, Intel actually controls 71% of the market. That kind of shows you how many computers are actually sold, and how many don't bother with a discrete option at all.
There is a reason Nvidia tried to grab up Arm. They wanted, or needed, to find a CPU solution to go with their GPU solution to match AMD and Intel in the long term.
Hoping the 7600 XT goes head to head with this in terms of performance, at a lower price.
Hate it all you want, but once you flip on DLSS 3, your Nvidia card will generate more frames per second. If more fps isn't something worth paying more for*, I don't know what is. Those features add value. Not to you, apparently, but that doesn't mean everyone else should ignore them because you do.
*I mean paying more in the general sense, not at these stupid prices in particular.
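For what it's worth, the fps gain from frame generation is easy to model: DLSS 3 interpolates one generated frame between each pair of rendered frames, so presented fps can nearly double while responsiveness still tracks the underlying rendered rate. A rough sketch; the overhead factor is an assumption, not a measured figure:

    # Rough model of DLSS 3 frame generation output
    rendered_fps = 60
    overhead = 0.9                               # assumed cost of the interpolation pass
    presented_fps = rendered_fps * 2 * overhead  # ~108 fps shown on screen
    frame_time_ms = 1000 / rendered_fps          # responsiveness still tied to rendered frames (~16.7 ms)
    print(presented_fps, frame_time_ms)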
I too want a 7900XT, but only for 3, or let's make it 4, main reasons. The biggest is Total War: Warhammer III, which stutters with Ultra textures; if I drop them to High, VRAM usage is about 7.5GB and it's fine.
The other reasons are three games that should release this year: Starfield, Cyberpunk 2077: Phantom Liberty, and the one I want the most, STALKER 2! I don't care for anything else in 2023-2024. But the cheapest 7900XT is 4600 RON (about 900 USD/EUR) and I could sell my 3060 Ti for about 1600 RON (330 USD/EUR). So that means about 200% more money for 100% more performance :/
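Spelling out the poster's rough math (the 2x performance figure is an assumption based on typical relative-performance charts, not a benchmark):

    # The upgrade math from the post above, spelled out
    new_card_ron = 4600                        # cheapest 7900 XT
    resale_ron = 1600                          # estimated 3060 Ti resale value
    net_cost_ron = new_card_ron - resale_ron   # 3,000 RON out of pocket
    money_increase = (new_card_ron / resale_ron - 1) * 100   # ~188%, i.e. "about 200% more money"
    perf_increase = 100                        # assumed: 7900 XT is roughly 2x a 3060 Ti
    print(net_cost_ron, round(money_increase), perf_increase)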
You can see the same in the GPU market. New cards aren't shifting the price/performance curve, just extending it outward.
Anyone can make software. If we're going to rely on that, why not just have a company become experts at it, and you buy the features from them that they enable on GPUs through special drivers? Companies already do this on phones through apps, for things like the camera, or to take advantage of certain phone features. Anyone can add features via software. And while FSR hasn't been as good as Nvidia's DLSS, at least AMD open-sourced it.
People need to pay attention to the other things these guys are doing. Intel, for instance, is already becoming a legitimate contender in the Linux space with their Linux support. Linux is becoming legit. Fedora is amazing now. With Proton support I can play anything. OBS? Yep. DaVinci Resolve Studio? Yep. The only thing it really lacks is Adobe, and if someone like Affinity would port their software to Linux, I could switch over to Linux full time, because it now does everything I need, from video work to streaming to gaming.
Shrinking the process should always provide some cost savings, provided yields are high. Why? Because the smaller you get, the more chips you can produce from the same wafer.
I think the lack of a product stack from AMD is still partly a matter of them fixing a few issues found on the higher-end cards, and still partly COVID. I think it could be 2024 or 2025 until we see chip and electronics production ramped back up to pre-COVID levels. Demand still outweighs what most companies can get out. You can see this most in the extended market. Just look at a company like Analogue: they are still trying to push out orders for the Analogue Pocket from like 18 months ago. I have a friend still waiting. They announced the Analogue Duo in October 2020, and it still hasn't had any updates or been released yet.
Covid is still rearing its head in the production of electronics in Asia. It will normalize eventually.
And you are out of luck too if you own Ampere, which for reasons unknown cannot use DLSS 3. Mighty cool features indeed! Imagine paying a premium for DLSS 2 on Ampere: fool and money parted! This has been Nvidia's game all along, and it never worked; PhysX should have made that abundantly clear, as much as G-Sync.