Monday, April 17th 2023
NVIDIA to Target $450 Price-point with GeForce RTX 4060 Ti
NVIDIA is preparing its fifth GeForce RTX 40-series "Ada" graphics card launch in May 2023, with the GeForce RTX 4060 Ti. Red Gaming Tech reports that the company could target the USD $450 price-point with this SKU, putting it $150 below the recently launched RTX 4070, and $350 below the RTX 4070 Ti. The RTX 4060 Ti is expected to nearly max out the 5 nm "AD106" silicon, the same chip that powers the RTX 4070 Laptop GPU. While the notebook chip maxes it out, featuring all 4,608 CUDA cores physically present across its 36 SM, the desktop RTX 4060 Ti will be slightly cut down to 34 SM, which works out to 4,352 CUDA cores. The "AD106" silicon features a 128-bit wide memory interface, and NVIDIA is expected to use conventional 18 Gbps-rated GDDR6 memory chips. The design goal behind the RTX 4060 Ti could be to beat the previous-generation RTX 3070 and sneak up on the RTX 3070 Ti, while offering greater energy efficiency and new features such as DLSS 3.
Source:
Red Gaming Tech (YouTube)
237 Comments on NVIDIA to Target $450 Price-point with GeForce RTX 4060 Ti
If not, keep fooling yourself, because obviously you'll decide the core power was insufficient sooner rather than later, and then you must have a 12 GB 4070 ;) We all know where this ends. I'm very happy to see you're not coping at all, though. On to the next shit purchase it is! It's always about timing, whether or not that is a good idea.
And the timing we're on is console timing. We're halfway through the current console generation, which can push 12 GB or more, and you know as well as I do that most games don't get the TLC/optimization they could. If the consoles demand 10-12 GB (and frankly, that is where it is moving now, fast), and if newer engines can work a lot better with >8 GB, you can rest assured this is the new normal.
Being below the mainstream norm with any graphics card is a definite push to upgrade. You might work your way through several more years of gaming, but you'll also feel forced to skip content left and right because it's just not playable enough. I've done that lately because I couldn't bring myself to any kind of upgrade path/deal at these absurd prices, not to mention that the past few generations have been notoriously weak. We keep telling ourselves we're content playing the backlog, and that's true, but there's also that thing you can't do and kind of do want.
Take note that the 8 GB PS4 was released around the same time as Pascal, which promptly pushed Maxwell's 4 GB midrange to an 8 GB norm from the x70 onwards (a DOUBLING... and that came after 2-3 GB Kepler, so roughly +50% and then +100% within the space of three generations), plus a pretty generous 6 GB on the much weaker 1060. 8 GB is now really starting to show its limits, in ever worse ways, and not because of a lacking core - even Darktide was perfectly playable with FSR; not a mission was lost to wildly varying frametimes, even if framerates sat at 40-50, much like in Cyberpunk with FSR. But both games do love to eat 6-8 GB. The imbalance you get now on Ada, and on a supposed 8 GB RX x600 GPU, is tremendous, and it's absolutely a major departure from what we've seen in the past.
I've been beating this drum since Turing started cutting VRAM relative to core performance, and here we are today - it's happening: new midrange cards already have to cut back on settings at release.
And if you keep cards for longer than three years, the worst-case scenario is likely to happen, because you'll be looking at a PS6 with the full 16 GB addressable as VRAM, for sure. Perhaps even more on some e-peen version of it.
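To put some numbers on the "VRAM stagnation since Pascal" point above, here's a quick sketch. The capacities are the commonly cited launch specs for each x70-class GeForce (the GTX 970's 4 GB counting its slower 0.5 GB segment); they're listed purely for illustration:

```python
# Launch VRAM of x70-class GeForce cards per generation, in GB.
vram = {
    "GTX 670 (Kepler)":  2,
    "GTX 970 (Maxwell)": 4,
    "GTX 1070 (Pascal)": 8,
    "RTX 2070 (Turing)": 8,
    "RTX 3070 (Ampere)": 8,
    "RTX 4070 (Ada)":    12,
}

prev = None
for card, gb in vram.items():
    if prev is None:
        print(f"{card}: {gb} GB")
    else:
        # Percentage growth over the previous generation.
        growth = (gb / prev - 1) * 100
        print(f"{card}: {gb} GB ({growth:+.0f}% vs previous gen)")
    prev = gb
```

The printout shows +100% jumps from Kepler to Maxwell to Pascal, then three generations flat at 8 GB before the 4070's +50% bump, which is exactly the pattern being complained about.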
For example: F1 2020 versus F1 2022, Far Cry 5 versus Far Cry 6, etc.
The Callisto Protocol, released in 2022, could not have been tested in 2021 for the 3070 Ti review. And it's just one of the added games.
Link. Or maybe you have information that the reviewer pulled his data from astrology and didn't actually test the video cards? Huh? AMD's homeboy?
Summary of the video: they played with the settings until they drowned the card with less memory. The one with more memory doesn't do great either; at its 2021 price you'd expect at least 100 FPS.
Question for the AMD fans addicted to HU: why didn't they use DLSS/FSR? :kookoo:
I would be fine with 12 GB without the price hike on the RTX xx70 model, but with the hike, we should get more. The RTX 4060 / Ti, though, are GPUs that are meant to last a while, xx60 models being historically very capable midrange offerings, so limiting their capability right on arrival is just silly to me. If they truly have 8 GB, or either one of them does, I think refreshed models with more are in order, so the 8 GB models can be priced down and take the place of entry-level models, with the RTX 4050 being the absolute cheapest card you can comfortably game on.
Remember that everything is relative: Medium quality settings and textures today often look very good on a 1440p monitor, perhaps even at 4K depending on the game, so don't go nuts thinking that you need to have the absolute best settings for games to look great. See for yourself what settings look acceptable to you and compare the best textures against lesser quality settings - you will see that the difference is often not there or it is marginal.
My advice is to avoid buying 8 GB models unless they're very cheap, but if you already have an 8 GB card, perhaps even the sorrowful RTX 3070 / Ti, you can still game just fine for a long time if you tweak the settings right - everything will look good and run fine (at least on the RTX 3070 / Ti). I doubt many people bought an RTX 3070 / Ti to play on a 4K monitor, where the VRAM limitation starts to show quite badly.
Enjoy your games while being more conservative with VRAM usage! Even if the latest games offer immensely high-resolution textures and various other demanding settings, it doesn't mean you really need them, and when you read benchmark charts, remember that reviewers usually do test at these best, but unnecessary, settings.
GTX 970 in 2015: $350
GTX 1650 in 2023: $180
Eight years apart - that has never happened before.
But go ahead and buy your ~$600 RTX 4070; it's really an RTX 4060 that should cost $300.
The text you quoted already kind of answered that question: 8 GB in 2023 should be targeted more towards entry-level GPUs like the 7600.
A $450-500 4060 Ti having 8 GB (along with a narrow memory bus) is embarrassing, no way around it.
I'm not sure why these charts are relevant. Are you suggesting that increasing VRAM is supposed to drive increased FPS? It doesn't. You can slap 24 GB of VRAM on any of these mid-tier cards and it won't make a difference.
Increased VRAM has more to do with visual quality output - essentially the image and graphics data shown on the display. Most buyers who have bought into, or intend to buy, higher-resolution displays (1440p/4K) are in pursuit of sharper image quality alongside the best quality preset possible. In select or upcoming demanding titles, 8 GB simply ain't gonna cut it, and not everyone is willing to fork out for these extortionately wallet-slurping higher-VRAM cards. Games surpassing 8 GB was always inevitable. No one is suggesting 8 GB isn't gameable - it's still highly relevant for some, depending on the resolution, type of game, or individual quality threshold. Unfortunately it's not a one-shoe-fits-all sort of thing; we can already see cracks surfacing, and no doubt upcoming demanding games will broaden those cracks like the parted seas of Moses. What makes matters worse, you're literally paying Nvidia or AMD an RT/PT feature levy, which is insanely taxing on VRAM, and yet mid-tier cards (and very expensive ones at that) are being capped at 8 GB. Going forward, purely from a consumer standpoint, I can't make sense of anyone sensible defending the 8 GB position (especially for hi-res gaming).
What I find odd is the big elephant in the room: VRAM limitations have been a compromise for some time now - or, better put, capacity is not scaling fast enough to leverage modern graphics capabilities. The good stuff is being reserved for the smaller-volume, bigger spenders. Even smart game engines use various real-time asset substitutions to tackle VRAM limitations, or to give the illusion of higher quality presets running seamlessly. This sort of erratic dynamic asset fiddling is very noticeable when VRAM limits encroach or max out (in some cases I've noticed unappealing discrepancies even at 80-90% utilisation). We should make a clear distinction between games running smoothly (FPS) versus dynamically degraded image quality, or maxing out entirely, which brings more perverse penalties (stuttering, artifacts, odd shuffling anomalies, crashes, etc.).
What's more frustrating is that the bulk of GPU sales sits at the lower/mid-tier performance segment where less memory is provisioned (e.g. previous consoles, budget pre-builts, entry-level cards, etc.). As a result, game developers facing those hardware constraints are less inclined to offer juicier eye candy - textures being the primary factor, and we're not even touching the more complex ML output, realistic illumination compositions, weathering, etc. In short, the bigger the bottom-barrel market plus the calculatedly enforced lower VRAM provision, the slower the growth in graphics progress/realism, as always. I'd fancy enhanced textures and graphics physics over strenuous AI RT/PT any day of the week. Just because 8 GB is still workable for most, it doesn't mean sticking with lower VRAM is a respectable advantage. The fact of the matter remains: anyone buying into today's over-priced graphics cards should not be limited to 8 GB. Six years ago I bought an 11 GB 1080 Ti after shifting to 1440p. The reasonable expectation back then was that future offerings would pump up those gigabytes across all performance tiers, especially considering AAA games keep growing in weight and buyers keep moving to higher-resolution displays. Six years on and we're still bickering over 8 GB as a feasible standard for mid-range high-performance cards - strangely bizarre if you ask me!
I could answer for the 13500 because I own that processor; I can't reproduce this demonstration. The closest game I have is CoD Modern Warfare. I don't have Modern Warfare 2, but I don't think it makes that big a difference, because the two use the same graphics engine, and the series has always shown that it launches super-optimized games, not crap that causes problems even on top video cards.
The discrepancy between HU and me is HUGE.
All settings at maximum, 1080p, RT ON, DLSS Quality
Minimum: 131 FPS
Max: 200+ FPS
Average: ~ 160 FPS
I have a request for AMD fans. Talk about what you have, not what you DON'T have.
The 3070/Ti and/or 6800 were not 4K, maximum-settings cards even in 2021. The 4060/Ti won't be for 4K either. For 4K you need a very powerful graphics processor, and those cards come with enough VRAM.
The cheapest 4000-series Nvidia GPU at this time is the RTX 4070 for $600.
The cheapest 3000-series Nvidia GPU is the RTX 3050 for $270.
Who are the sheep you talk about?
The 4060 Ti should be no more than $350, but Nvidia has you domesticated.
The $250 A750's counterpart is the $300 RTX 3060 8 GB. :toast:
I think in the near future AMD will get wrecked over its pricing, with GPUs that are only good at rasterization.
In 2024, Intel's Battlemage will be Nvidia's counterpart for low- to mid-range buyers, and AMD will be the third wheel on the bicycle.
And that's only AMD's fault, because they thought they could play monopoly alongside NVIDIA.