Monday, April 17th 2023

NVIDIA to Target $450 Price-point with GeForce RTX 4060 Ti

NVIDIA is preparing its fifth GeForce RTX 40-series "Ada" graphics card launch in May 2023, with the GeForce RTX 4060 Ti. Red Gaming Tech reports that the company could target the USD $450 price-point with this SKU, putting it $150 below the recently launched RTX 4070, and $350 below the RTX 4070 Ti. The RTX 4060 Ti is expected to nearly max out the 5 nm "AD106" silicon, the same chip that powers the RTX 4070 Laptop GPU. While the notebook chip maxes it out, featuring all 4,608 CUDA cores physically present across its 36 SM, the desktop RTX 4060 Ti will be slightly cut down, featuring 34 SM, which works out to 4,352 CUDA cores. The "AD106" silicon features a 128-bit wide memory interface, and NVIDIA is expected to use conventional 18 Gbps-rated GDDR6 memory chips. The design goal behind the RTX 4060 Ti could be to beat the previous-generation RTX 3070, and to sneak up on the RTX 3070 Ti, while offering greater energy efficiency and new features such as DLSS 3.
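For readers who want to sanity-check the rumored figures: Ada Lovelace carries 128 CUDA cores per SM, so the reported core counts, and the bandwidth implied by a 128-bit bus of 18 Gbps GDDR6, can be reproduced with a quick back-of-the-envelope sketch (Python):

```python
# Back-of-the-envelope check of the rumored AD106 configurations.
# Ada Lovelace carries 128 CUDA cores per SM; peak memory bandwidth is
# (bus width in bits / 8) * per-pin data rate in Gbps.
CORES_PER_SM = 128  # Ada Lovelace

for name, sms in [("RTX 4070 Laptop GPU (full AD106)", 36),
                  ("RTX 4060 Ti (desktop, cut down)", 34)]:
    print(f"{name}: {sms} SM x {CORES_PER_SM} = {sms * CORES_PER_SM} CUDA cores")

bus_bits, rate_gbps = 128, 18  # figures from the report
print(f"Peak memory bandwidth: {bus_bits // 8 * rate_gbps} GB/s")  # 288 GB/s
```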
Source: Red Gaming Tech (YouTube)

237 Comments on NVIDIA to Target $450 Price-point with GeForce RTX 4060 Ti

#226
ixi
BoboOOZ: Well here's a review that says otherwise.

Edit: @tfdsaf you beat me to it.
Nvidia boys will be boys. Meanwhile, I do want good GPUs at decent prices to come back... sadly, the last three generations have been a kick in the nuts...
#227
Vayra86
Gica: I repeat once again the results of the last review. The gap between the 6800 and the 3070 Ti has remained the same as two years ago. New games were added and nothing changed. What did 16 GB of VRAM gain over 8 GB? Nothing. In 2-3 years both cards will be history, and not because of memory.
The 2080 Ti, the king of kings before Ampere, is fighting the 4060 now. The 4070 humiliates it without right of appeal. Insufficient VRAM? Let's be serious!

Hey, maybe if you repeat those same graphs 10 more times, you'll magically grow 4GB on that fantastic GPU of yours and make it last as long as its core.

If not, keep fooling yourself, because obviously you'll decide the core power was insufficient sooner rather than later, and then you must have a 12GB 4070 ;) We all know where this ends. I'm very happy to see you're not coping at all though. On to the next shit purchase it is!
bug: I don't demand less; I won't upgrade unless I'm looking at at least a +20% fps uplift. I just don't fret over particular specs.
It's always about timing, whether or not that is a good idea.

And the timing we're on is console timing. We're halfway through the current console generation, which can push 12GB or more, and you know as well as I do that most games don't get the TLC/optimization they could. If consoles demand 10-12 GB (and frankly, that is where things are moving now, and fast), and if newer engines can work a lot better with >8GB, you can rest assured this is the new normal.

Being below the mainstream norm with any graphics card is a definite push to upgrade. You might work your way through several more years of gaming, but you'll also feel forced to skip content left and right because it's just not playable enough. I've done that lately because I couldn't bring myself to any kind of upgrade path or deal at these absurd prices, not to mention that the past few generations have been notoriously weak. We keep telling ourselves we're content playing the backlog, and that is also true, but there's also that thing you can't do and kinda do want.

Take note that the 8GB PS4 was already on the market when Pascal arrived, and Pascal promptly pushed Maxwell's 4GB midrange to an 8GB norm from the x70 onwards (a DOUBLING... and that after 2-3GB Kepler, so +50% and then +100% within the space of three generations), with even a pretty generous 6GB on the much weaker 1060. It's now really starting to show its limits, and in ever worse ways, and not because of a lacking core - even Darktide was perfectly playable with FSR, not a single mission lost to wild frametime variation, even if framerates were 40-50, much like Cyberpunk with FSR. But both games do love to eat 6-8GB. The imbalance you get now on Ada, and on a supposed 8GB RX x600 GPU, is tremendous, and it's absolutely a major difference from what we've seen in the past.

I've been beating this drum since Turing started cutting VRAM relative to core performance, and here we are today - it's happening: new midrange cards already have to cut back on settings at release.

And if you keep cards for longer than 3 years, the worst case scenario is likely to happen because you'll be looking at a PS6 with the full 16GB addressable as VRAM, for sure. Perhaps even more on some epeen versions of it.
#228
Gica
ixi: Why do you think nothing has changed if the 4070 was simply added to the list? Did the reviewer say he went back through the AMD GPUs even though the topic is the 4070? :D Most likely the graph was edited and only the 4070 was thrown in.
Look at the list of new games introduced in the 4070 review. All the cards are tested in the new games.
For example: F1 2020 versus F1 2022, Far Cry 5 versus Far Cry 6, etc.
The Callisto Protocol, released in 2022, could not have been tested in 2021 for the 3070 Ti review. And it's just one of the added games.
Link, but maybe you have information that the reviewer pulled his data from astrology and did not actually test the video cards.
BoboOOZ: Well here's a review that says otherwise.

Edit: @tfdsaf you beat me to it.
ixi: Nvidia boys will be boys. Meanwhile, I do want good GPUs at decent prices to come back... sadly, the last three generations have been a kick in the nuts...
Huh? HUH?!?!?!?!?! AMD's homeboy?
Summary of the video: they played with the settings until they drowned the card with the smaller memory. The one with the larger memory doesn't look great either; at its 2021 price you'd expect at least 100 FPS.
Question for AMD fans addicted to HU: why didn't they use DLSS/FSR? :kookoo:
#229
GamerNerves
I too think this VRAM discussion has got a bit out of hand, because you can play just fine on 8 GB. The right way to frame the question is how much VRAM you should get for a certain price when buying a new GPU. I think it is unappealing to buy any 8 GB GPU over €200, because it is certain that you will have to dial down settings even at 1080p, now and in the future. So I'm not criticizing the demand for more VRAM in the lower price ranges, but I want to remind everyone that once you have enough VRAM, more doesn't help. 12 GB is fine and even 10 GB will do the job at 1440p, but it simply feels stingy to get only 12 GB when you are already overpaying for GPUs like the RTX 4070 / Ti, even if you will likely never face any meaningful issues related to insufficient VRAM. I don't know how many here watch Hardware Unboxed, but their host Steve has gone a bit far with his demands, suggesting that even 20+ GB is something that should be demanded of high-end GPUs. If the RX 7800 / XT has 16 GB, would that be a problem? I would say not at all, though more is always welcome.
I would be fine with 12 GB on the RTX xx70 model without the price hike, but with the hike, we should get more. The RTX 4060 / Ti, though, are GPUs that are meant to last a while, xx60 models being historically very capable midrange offerings, so limiting their capability straight on arrival just seems silly to me. If they truly have 8 GB, or either one of them does, I think refreshed models with more memory are in order, so the 8 GB models can be priced down and take the place of entry-level models, with the RTX 4050 as the absolute cheapest card you can comfortably game on.

Remember that everything is relative: medium quality settings and textures today often look very good on a 1440p monitor, perhaps even at 4K depending on the game, so don't go nuts thinking you need the absolute best settings for games to look great. See for yourself which settings look acceptable to you and compare the best textures against lower quality settings - you will often find the difference is marginal or not there at all.
My advice is to avoid buying 8 GB models unless they're very cheap, but if you already have an 8 GB card, perhaps even the sorrowful RTX 3070 / Ti, you can still game just fine for a long time if you tweak the settings right - everything will look good and run fine (at least on the RTX 3070 / Ti). I doubt many bought an RTX 3070 / Ti to play on a 4K monitor, which is where the VRAM limitation starts to show quite badly.

Enjoy your games while staying conservative with VRAM usage! Even if the latest games offer immensely high-resolution textures and various other demanding settings, it doesn't mean you really need them, and when you read benchmark charts, remember that reviewers usually do use these best, but unnecessary, settings.
#230
Blitzkuchen
Even if you didn't run out of VRAM with the optimized drivers for the 970's 3.5 GB + 512 MB split, it's still faster than a GTX 1650.

GTX 970 in 2015: $350
GTX 1650 in 2023: $180

Eight years apart, and this has never happened before in history.

But go ahead with your ~$600 RTX 4070; it's really an RTX 4060 that should cost $300.
#231
tussinman
bug: Then why is the 7600 also rumored to be equipped with 8GB?
Because it's replacing a card that currently costs as low as $199 USD (the RX 6600).

The text you quoted already kind of answered that question: 8GB in 2023 should be targeted more towards entry-level GPUs like the 7600.

A $450-500 4060 Ti having 8GB (along with a slow memory bus) is embarrassing, no way around it.
#232
wheresmycar
Gica: I repeat once again the results of the last review. The gap between the 6800 and the 3070 Ti has remained the same as two years ago. New games were added and nothing changed. What did 16 GB of VRAM gain over 8 GB? Nothing. In 2-3 years both cards will be history, and not because of memory.
The 2080 Ti, the king of kings before Ampere, is fighting the 4060 now. The 4070 humiliates it without right of appeal. Insufficient VRAM? Let's be serious!

Gica, I hope 8GB lasts you a lifetime! It's certainly possible... Tetris at 16K will be a blast!

I'm not sure why these charts are relevant. Are you suggesting that increasing VRAM is supposed to drive increased FPS? It doesn't. You can slap 24GB of VRAM on any of these mid-tier cards and it won't make a difference.

Increased VRAM is more about visual quality output - essentially the image and graphics data shown on the display. Most buyers who have bought into, or intend to buy, higher resolution displays (1440p/4K) are in pursuit of sharper image quality alongside the best quality preset possible. In select or upcoming demanding titles, 8GB simply ain't gonna cut it, and not everyone is willing to fork out for these extortionately wallet-slurping higher-VRAM cards. Games surpassing 8GB was always inevitable. No one is suggesting 8GB isn't gameable - yes, it's still highly relevant for some, depending on resolution, type of game, or individual quality threshold. Unfortunately it's not a one-size-fits-all sort of thing, and already we can see cracks surfacing; no doubt with upcoming demanding games these cracks will widen like the parted seas of Moses. What makes matters worse, you're literally paying Nvidia or AMD an RT/PT feature levy, which is insanely taxing on VRAM, and yet mid-tier cards (and very expensive ones at that) are being capped at 8GB. Going forward, purely from a consumer standpoint, I can't make sense of anyone sensible defending the 8GB position (especially for high-res gaming).

What I find odd is the big elephant in the room - VRAM limitations have been a compromise for some time now, or better put, VRAM hasn't been scaling fast enough to leverage modern graphics capabilities. The good stuff is being reserved for the smaller-volume, bigger-spending crowd. Even smart game engines use various real-time asset substitutions at different VRAM levels to work around limitations or give the illusion of higher quality presets running seamlessly. This sort of erratic dynamic asset juggling is very noticeable when VRAM limits encroach or max out (in some cases I've noticed unappealing discrepancies even at 80-90% utilisation). We should make a clear distinction between games running smoothly (fps) versus dynamically degraded image quality, or maxing out entirely, which brings harsher penalties (stuttering, artifacts, odd asset-shuffling anomalies, crashes, etc.).

What's more frustrating is that the bulk of GPU sales sits in the lower/mid-tier performance segment, where less memory is provisioned (e.g. previous consoles, budget pre-builts, entry-level cards, etc.). As a result, game developers, given those hardware constraints, are less inclined to offer juicier eye candy. Textures are the primary factor, and we're not even touching on the more complex ML-produced assets, realistic illumination compositions, weathering, etc. In short: the bigger the bottom-barrel market, plus the calculatedly enforced lower VRAM provision, equals, as always, slower growth in graphics progress and realism. I'd fancy enhanced textures and graphics physics over AI-strenuous RT/PT any day of the week. Just because 8GB is still workable for most doesn't mean sticking with lower VRAM is a respectable advantage. The fact of the matter remains: anyone buying into today's overpriced graphics cards should not be limited to 8GB. Six years ago I bought an 11GB 1080 Ti after shifting to 1440p. The reasonable expectation back then was that future offerings would pump up those GBs across all performance tiers, especially considering AAA games are always growing in weight and buyers keep moving to higher-resolution displays. Six years on and we're still bickering over 8GB as a feasible standard for mid-range high-performance cards - strangely bizarre if you ask me!!
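If you want to watch for the 80-90% utilisation zone mentioned above while playing, here is a minimal sketch (Python) assuming an NVIDIA card and the NVML bindings from the nvidia-ml-py package; it simply polls VRAM usage in a loop:

```python
# Minimal VRAM monitor: polls NVML once per second and flags when usage
# crosses the 80% mark where dynamic asset-quality juggling tends to start.
# Assumes an NVIDIA GPU and the nvidia-ml-py package (import name: pynvml).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # bytes: .used / .total
        pct = 100 * mem.used / mem.total
        flag = "  <-- nearing the limit" if pct >= 80 else ""
        print(f"VRAM: {mem.used / 1024**3:4.1f} / {mem.total / 1024**3:4.1f} GB "
              f"({pct:3.0f}%){flag}")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```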
#233
kapone32
Gica: Look at the list of new games introduced in the 4070 review. All the cards are tested in the new games.
For example: F1 2020 versus F1 2022, Far Cry 5 versus Far Cry 6, etc.
The Callisto Protocol, released in 2022, could not have been tested in 2021 for the 3070 Ti review. And it's just one of the added games.
Link, but maybe you have information that the reviewer pulled his data from astrology and did not actually test the video cards.

Huh? HUH?!?!?!?!?! AMD's homeboy?
Summary of the video: they played with the settings until they drowned the card with the smaller memory. The one with the larger memory doesn't look great either; at its 2021 price you'd expect at least 100 FPS.
Question for AMD fans addicted to HU: why didn't they use DLSS/FSR? :kookoo:
Calling HU AMD fanboys shows how little context you really have in this space. HU are the main channel that pumped ray tracing and DLSS and created the narrative that rasterization is passé. It doesn't matter though, because the only thing that would convince you is to sit in front of a high-end gaming PC and, without knowing the specs, tell whether it is AMD, Nvidia or Intel just by using it. You would quickly find that the absolute opinion you hold as fact is on a slippery slope. The fact that consoles have 16GB of unified memory is why 8 GB cards are going to suffer, especially when the consoles like to upscale 1080p to 4K. Of course, that only applies at 4K high refresh rates.
#234
Gica
GamerNerves: I don't know how many here watch Hardware Unboxed, but their host Steve has gone a bit far with his demands, suggesting that even 20+ GB is something that should be demanded of high-end GPUs.
As usual, he followed the master's orders. Immediately after AMD's announcement about VRAM amounts, HU remembers (LOL, what a coincidence) the old 3070 and brings "proof" to the sheep. Why? Well, the 4070/4070 Ti are on the market and AMD has nothing in that segment. The order of the day: sheep, beware of VRAM. We offer you more, but... wait.
On the 13500 I had an answer because I own the processor; this demonstration I cannot answer. The closest game I have is CoD Modern Warfare. I don't have Modern Warfare 2, but I don't think the difference is that big, because they use the same graphics engine, and the developers have always shown that they launch super-optimized games, not junk that causes problems even for top video cards.
The discrepancy between HU's numbers and mine is HUGE.
All settings at maximum, 1080p, RT ON, DLSS Quality
Minimum: 131 FPS
Max: 200+ FPS
Average: ~ 160 FPS

I have a request for AMD fans. Talk about what you have, not what you DON'T have.



GamerNerves: I doubt many bought an RTX 3070 / Ti to play on a 4K monitor, which is where the VRAM limitation starts to show quite badly.
The 3070/Ti and the 6800 were not 4K max-settings cards even in 2021. The 4060/Ti won't be either. For 4K you need a very powerful graphics processor, and those cards come with enough VRAM.
#235
Dahak390
Is that the 8 GB version or the 16 GB version? If it's the 8 GB one, you can bet the 16 GB version will be about $35-50 more. I hope the bus width is bumped up a bit so the VRAM can truly be used to its full ability. A 64-bit bus throttles the potential performance that 16 GB of VRAM gives the card. Unless they plan on using a 128-bit bus - then you'll have a midrange card worth buying. Sort of.
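To put numbers on the bus-width point, a quick sketch (Python): the 18 Gbps GDDR6 figure comes from the article, while the 64-bit case is the commenter's hypothetical.

```python
# Peak memory bandwidth scales linearly with bus width at a fixed data rate:
# halving the bus from 128-bit to 64-bit halves the bandwidth, regardless of
# how much VRAM sits behind it.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: (bus width in bits / 8) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

for bus in (64, 128):
    print(f"{bus:>3}-bit @ 18 Gbps -> {bandwidth_gb_s(bus, 18):.0f} GB/s")
# 64-bit  -> 144 GB/s
# 128-bit -> 288 GB/s
```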
#236
Redwoodz
Gica: As usual, he followed the master's orders. Immediately after AMD's announcement about VRAM amounts, HU remembers (LOL, what a coincidence) the old 3070 and brings "proof" to the sheep. Why? Well, the 4070/4070 Ti are on the market and AMD has nothing in that segment. The order of the day: sheep, beware of VRAM. We offer you more, but... wait.
On the 13500 I had an answer because I own the processor; this demonstration I cannot answer. The closest game I have is CoD Modern Warfare. I don't have Modern Warfare 2, but I don't think the difference is that big, because they use the same graphics engine, and the developers have always shown that they launch super-optimized games, not junk that causes problems even for top video cards.
The discrepancy between HU's numbers and mine is HUGE.
All settings at maximum, 1080p, RT ON, DLSS Quality
Minimum: 131 FPS
Max: 200+ FPS
Average: ~ 160 FPS

I have a request for AMD fans. Talk about what you have, not what you DON'T have.

The 3070/Ti and the 6800 were not 4K max-settings cards even in 2021. The 4060/Ti won't be either. For 4K you need a very powerful graphics processor, and those cards come with enough VRAM.
This is what I have, a $279 RX 6650XT
The cheapest 4000-series Nvidia GPU at this time is the RTX 4070 at $600.
The cheapest 3000-series Nvidia GPU is the RTX 3050 at $270.
Who are the sheep you talk about?
The 4060 Ti should be no more than $350, but Nvidia has you domesticated.
#237
Blitzkuchen
I have an A750 in my secondary PC and it keeps getting better with the new drivers. Its rival isn't the 6650 XT, because that card can't even run RT effects on low with playable framerates.
The A750's counterpart at $250 is the 3060 8GB at $300. :toast:


I think in the near future AMD will get wrecked over the prices of its rasterization-only GPUs.
In 2024, Intel's Battlemage will be Nvidia's counterpart for low- to mid-range buyers, and AMD will be the third wheel on the bicycle.
And that's only AMD's fault, because they thought they could play monopoly alongside NVIDIA.