Monday, April 17th 2023

NVIDIA to Target $450 Price-point with GeForce RTX 4060 Ti
NVIDIA is preparing its fifth GeForce RTX 40-series "Ada" graphics card launch in May 2023, with the GeForce RTX 4060 Ti. Red Gaming Tech reports that the company could target the USD $450 price-point with this SKU, putting it $150 below the recently launched RTX 4070, and $350 below the RTX 4070 Ti. The RTX 4060 Ti is expected to nearly max out the 5 nm "AD106" silicon, the same one that powers the RTX 4070 Laptop GPU. While the notebook chip maxes it out, featuring all 4,608 CUDA cores physically present across its 36 SM, the desktop RTX 4060 Ti will be slightly cut down, featuring 34 SM, which work out to 4,352 CUDA cores. The "AD106" silicon features a 128-bit wide memory interface, and NVIDIA is expected to use conventional 18 Gbps-rated GDDR6 memory chips. The design goal behind the RTX 4060 Ti could be to beat the previous-generation RTX 3070 and sneak up on the RTX 3070 Ti, while offering greater energy efficiency and new features such as DLSS 3.
Source:
Red Gaming Tech (YouTube)
237 Comments on NVIDIA to Target $450 Price-point with GeForce RTX 4060 Ti
oh! look what i found!
even better!
Another one!
An AIB could go past the previous x80 series!
Seems to me that IT WAS ALWAYS THE RULE, or at least most of the time, and not like you mention "sweet when it happened", as if it were just a rare occurrence.
What happened post Maxwell was in part prompted by TSMC's failed 20 nm node. GPUs stagnated because of that (20 nm designs had to be revised and built on 28 nm), and when 16 nm was available, GPUs took off once again. Good old days...
You can't compare apples to oranges.
Not only that, the 7800 XT may offer 6950 XT performance at $550, maybe? That's $100 less than what the 6950 XT is selling for. And that's only 16% faster than the 6800 XT.
I think you just suck at managing finances or your upgrade paths... No, I don't; the usual MO is that per gen, at the same-ish price point, you gain a full tier (25-30%) of performance. We did see some larger jumps (Pascal!), but they also carried a price increase.
Like I previously said, if you play AAA games at 1080p or 1440p, a $600 GPU is all you need for 2-3 years at the minimum. If you save $20 a month and have no GPU, you can buy a brand-new one in less than 3 years. If you go for a second-hand GTX 1080 that can run any AAA title at 1080p, you can afford that in about a year or so. There are also monthly installments or payment plans with credit cards.
If you don't want to pay $600 for an RTX 4070/4060 Ti and want to boycott it, yes, by all means, do that; it's your right. But complaining and telling people they are "rich" and idiots for wanting or buying one is not the way to go about it.
2060 had just 6 GB and was also the lowest Turing RTX card. (Still, I agree, the best example of a 60-to-80 jump, but not quite it anyway.)
980 was notoriously expensive price/perf compared to its very close-in-perf 970, but yeah, agreed, another close one. Except the price didn't quite move in tune with the model number; Pascal made the x60 more expensive. That's a 4070 Ti AIB, and it has 18%, not 20. It's also a snapshot made in 2023, on release. Nobody is saying 12 GB is insufficient today. We're starting to realize, though, that it's where you need to be at the minimum for this level of core perf. On release day!
DLSS 3 in particular, though, is a case of 'Thanks, I hate it', because it's just a pointless principle. There was a reason not to hard-Vsync your games, or to run interlaced over progressive... DLSS 3 offers you nothing in frame latency. It's interpolation. Well, yay, thanks for that in 2023, on select games and hardware. I'll order five and some extra ketchup.
Seriously, all of this perceived special sauce is a solution looking for problems that really aren't as hard as they're made out to be. I just played Q2 RTX on my AMD card... it was interesting to look at for just over 10 minutes. Dynamic lighting and all... 55 FPS. The game still looks and feels like Q2... It will stay with me as notoriously unimpressive, especially considering the performance for what you really get. Three quarters of those fancy effects have been done a hundred times with raster at 2-3x the efficiency, and it looks identical; only reflection accuracy is more refined. If you stop to stare at it.
Nvidia is selling so-called unique software solutions on top of meh hardware. It's clear, and it's not something I like to support, because it's been proven a lie every time. Both DLSS and RTX are not unique; they just have a performance edge, at a sacrifice or two, a dependency for you, and an inflated price.
Similarly, even when I owned Nvidia cards, I never felt any urge to pay for G-Sync. It's the same thing, and look where we are now.
That's what happens when you fuse "boredom" + "bewilderment" + "Nvidia's simply taking the p-i-s-s".
More examples:
680 was ~20% faster than a 660Ti
3080 is 50% faster than 3060Ti.
Obviously, while it's reasonable to expect the next gen to be 20% faster, it's far less reasonable to expect a 50% jump.
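The uplift percentages quoted above all come from the same simple ratio. A minimal sketch of that arithmetic (the FPS figures below are illustrative placeholders, not benchmark data):

```python
# How "X% faster" claims like the ones above are computed:
# uplift = (new / old - 1) * 100. The FPS numbers here are made up
# purely to illustrate the math, not taken from any review.

def uplift_percent(new_fps: float, old_fps: float) -> float:
    """Relative performance gain of new_fps over old_fps, in percent."""
    return (new_fps / old_fps - 1.0) * 100.0

# A card averaging 120 FPS vs. one averaging 100 FPS:
print(round(uplift_percent(120, 100)))  # 20 -> "~20% faster"
# A card averaging 150 FPS vs. one averaging 100 FPS:
print(round(uplift_percent(150, 100)))  # 50 -> "50% faster"
```

Note the asymmetry this creates: a 50% uplift over the old card means the old card is "only" 33% slower than the new one, which is why the baseline matters when comparing claims.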
I just hope no sane person is attempting to justify these new pricing levels. Subjectively speaking, the 60-segment for me is entry-level gaming territory for enthusiasts who want above-console performance. From an impartial, broader consumer perspective, something at the lower end of the stack shouldn't push almost half a thousand dollars. That's probably half or more of what most buyers on a budget would consider sufficient for an entire gaming build.
Anyway, I'm not surprised anymore. The 4080 and 4090 MSRPs were a clear indicator of what was yet to come. And NVIDIA has stayed true to its "need-mo-monayy" kingdom, an upward trajectory here to stay until it reaches a boiling point. An interesting year or two for me; I had never contemplated the competition before (prior to COVID)... now even I'm positively curious, anticipating AMD and Intel to somehow kick Nvidia in the balls and bring them back to reality.
I remember in 2007 building a PC tower for a friend based on the 8600 GTS and a low-end variant of the Core 2 Duo (neither of which aged that well), and it was still around $750, which with inflation would be close to $1100 now.
I'd argue that an $1100 PC now (a 6700 XT, maybe even a price-slashed 6800, with something like a 5700X or 12600) is going to age a lot better than that 2007 PC...
You don't know what inflation even is, so please stop using it as a way to figure out the value of something.
The TPU review for the 4070 also added new games, and the distance between the 3070 Ti 8 GB and the 6800 16 GB is the same as two years ago. It's just your choice whether to accept that reality or not.
Added NFS Unbound. I had forgotten about it.
I sold my two RX 570 4 GB cards to some friends who wanted to get back into gaming, $50 each. Both they and I are amazed at how good those cards are if you don't push the settings too much.
I did this test before I sold them; TW Warhammer III is one of the hardest games to run at high settings and 60 FPS. My 3060 Ti can't do it at 1440p.
But the perspective matters: if you've seen the history of a card live in gaming, you see how performance deteriorates over time with new games, and you really do notice the differences. While 'Medium' is acceptable in many ways, it's certainly not as crisp and lively as higher settings. New games do tend to elevate what happens on Medium, but even then, you're left with all the nice ancient hiccups in graphics, like pop-in of textures and LOD, limited view distance, up to and including things like smaller unit sizes in TW WH3. Some graphical settings directly touch the gameplay experience in that way.
And it's true: it was exactly TW WH3 and its overall performance (FPS nosedives compared to WH2, and that wasn't light either) that pushed me to spending even a bit too much on a hefty upgrade. Had I stepped into the game with a 1080 today, coming from something much weaker, I might have been able to accept what it offered in perf and IQ.
"Most relevant to potential buyers of the GTX 1660 Ti is the GeForce GTX 1660 Super, which delivers similar performance to the 1660 Ti, at a lower starting price of $229. At this writing, that's about $30 less than the lowest-price GTX 1660 Ti".
www.tomshardware.com/reviews/nvidia-geforce-gtx-1660-ti-turing,6002.html
The second problem is pretty drastic. For a long time, every gen you expected to make a leap in performance and, typically, in power consumption. Say you owned a GTX 970 back in the day: you expected that when the 1070 came out, it would be faster than the previous-gen GTX 980, but also do it with better efficiency and less power draw. This is starting not to be the case. But even if it were still the case gen to gen, the drastic issue is that they aren't just trying to give you that; what little they do give you, they make you pay for.
What use would a 1070 have been if it matched the 980 and was a little more power efficient, but cost what the 980 did? That is the issue. Whatever gains we get each gen recently, they make us pay for by raising the price each gen. So all you end up with is a slightly more efficient GPU, they force you into some software via AI etc., and then they don't backport the software portion, to try and get you to upgrade.
They're just giving us half-baked everything, really, and then saying: oh, by the way, you have to pay the increased cost as well.