Monday, April 17th 2023

NVIDIA to Target $450 Price-point with GeForce RTX 4060 Ti

NVIDIA is preparing its fifth GeForce RTX 40-series "Ada" graphics card launch in May 2023, with the GeForce RTX 4060 Ti. Red Gaming Tech reports that the company could target the USD $450 price-point with this SKU, putting it $150 below the recently launched RTX 4070, and $350 below the RTX 4070 Ti. The RTX 4060 Ti is expected to nearly max out the 5 nm "AD106" silicon, the same one that powers the RTX 4070 Laptop GPU. While the notebook chip maxes it out, featuring all 4,608 CUDA cores physically present across its 36 SMs, the desktop RTX 4060 Ti will be slightly cut down, featuring 34 SMs, which work out to 4,352 CUDA cores. The "AD106" silicon features a 128-bit wide memory interface, and NVIDIA is expected to use conventional 18 Gbps-rated GDDR6 memory chips. The design goal behind the RTX 4060 Ti could be to beat the previous-generation RTX 3070 and sneak up on the RTX 3070 Ti, while offering greater energy efficiency and new features such as DLSS 3.
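Those core counts follow from Ada's 128 CUDA cores per SM, and the peak memory bandwidth follows from the bus width and data rate quoted above; a quick sanity check:

```python
# Ada SMs carry 128 CUDA cores each, so the reported core counts
# follow directly from the SM counts in the article.
CORES_PER_SM = 128

full_ad106_cores = 36 * CORES_PER_SM   # RTX 4070 Laptop GPU (full AD106)
rtx_4060_ti_cores = 34 * CORES_PER_SM  # desktop RTX 4060 Ti (cut down)

# Peak memory bandwidth: bus width in bits / 8 (-> bytes) * per-pin data rate.
bus_width_bits = 128
data_rate_gbps = 18
bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps

print(full_ad106_cores)   # 4608
print(rtx_4060_ti_cores)  # 4352
print(bandwidth_gb_s)     # 288.0 (GB/s)
```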
Source: Red Gaming Tech (YouTube)

237 Comments on NVIDIA to Target $450 Price-point with GeForce RTX 4060 Ti

#201
bug
ixiAnd I can purchase a 6650 XT for under 291€, while lesser-known shops sell it for 270€, so, good luck, Nvidia.
I was talking about launch prices. Of course it can be had for less now.
tfdsafYes! Mid range has historically been capable of running the latest games at the highest settings and relatively good resolutions. I mean, the GTX 970, I remember, used to run almost all games at 60+ fps at 1080p, and most games at 1440p as well. The GTX 1070 was perfectly capable of running all of the latest games at 1440p and 60+ fps, same with the RTX 2070; never mind that the 2070 was a step in the wrong direction and had bad value, it was still capable of running 1440p games at 60+ fps.

All of these cards have been able to run most AAA games at least 3 years after release, at either the highest settings or a tier below, at 60+ fps.

With AMD it's been even better, as they've always provided a lot more VRAM and room to grow; we saw it with the RX 500, RX 5000, etc., where these series kept getting better and better.

The most direct example of a bad-value, DOA card is probably the 1060 3GB; just a year after releasing, that card could not run a third of the games at highest textures, and most 4GB cards got obliterated then as well. I remember I had just bought the GTX 1060 6GB and ROTTR was out, and that game used up to 6.5GB of VRAM. Assassin's Creed something used 6+ GB, so even the 1060 6GB was on the edge just 2 years later!

This 8GB is the basic level, it's the lowest entry point; anything less and games are going to stutter, crash, have high frame times, etc.

For GPUs that have still not come out, that are yet to release in a month or two, having only 8GB on so-called mid-tier cards is blatantly stupid and fraudulent.
970 and 1070 were not mid range cards, they were high end. People started calling x70 mid range starting with the 2070, but not because they were mid range, but because they weren't carved out of the x02 silicon anymore.
Posted on Reply
#202
Blitzkuchen
Vayra86Here's Cyberpunk 3440x1440 on max settings,
I dunno who plays this garbage game; yeah, many scream that because it's from CD Projekt it's a great game :laugh:
I don't remember any open-world game having cops and cars pop in like this since Grand Theft Auto 3 on the PS2.

Graphics is everything, yeah, sure; that's the reason I play GTA Liberty City Stories on my PSP.
On the other hand, I played Cyberpunk for about 2 hours and refunded it. (Garbage game, never seen anything like it before.)
Posted on Reply
#203
wheresmycar
GicaIt's good that we understood each other. :slap:
I'm reposting this screenshot, don't forget about it. New games were added, the video cards are two years old, and 16GB didn't make a difference. And it won't in 2024 or 2025 either, maybe in 2030. Then this memory surplus will help the old GPU render with a 10% boost, a jump from 10 to 11 FPS. If you don't believe it, run the new games at extreme detail on a Radeon Vega 16GB. Even in 4K, because it has enough memory. :D



Do any of these titles hit, or even come close to, 8GB of VRAM usage? (@1440p)

Personally, I wouldn't touch 8GB for higher-resolution gaming. FPS performance is one thing, but consistency with maxed-out VRAM quality settings is another.
Posted on Reply
#204
tfdsaf
bugI was talking about launch prices. Of course it can be had for less now.


970 and 1070 were not mid range cards, they were high end. People started calling x70 mid range starting with the 2070, but not because they were mid range, but because they weren't carved out of the x02 silicon anymore.
I mean, $330 and $370 is pretty much mid range!

$250 to $350 is mid range, give or take $20.

Thing is, the x70 series has become much more expensive, so they are now "high end", but in name only! The x70 should realistically be around $400, and some will argue possibly even lower than that!

I mean, heck, wasn't the GTX 980 high end at $500?

So, entry level is $150 and below, the mainstream segment is essentially $150-250, mid range is from $250 to $350, again plus or minus $20-30 depending on value, and high end is essentially $350/400 and above! Historically that topped out around $500-600 for over 20 years, although we've seen it rise to over $1,000 in the last 5 years!
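A minimal sketch of those brackets as a lookup (the cut-off prices are taken from this post and are explicitly approximate, give or take $20-30):

```python
def price_segment(price_usd: float) -> str:
    """Rough market segment for a GPU launch price, per the brackets above.

    Cut-offs are approximate; the post allows +/- $20-30 of slack.
    """
    if price_usd <= 150:
        return "entry level"
    elif price_usd <= 250:
        return "mainstream"
    elif price_usd <= 350:
        return "mid range"
    else:
        return "high end"

print(price_segment(330))  # "mid range" (GTX 970 launch price)
print(price_segment(450))  # "high end"  (rumored RTX 4060 Ti price)
```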
Posted on Reply
#205
Gica
tfdsafLOL! VRAM amount doesn't matter until you run out of it, then you literally can't play the game or you have insane stutters! We already have 5 games that use more than 8GB of VRAM, in fact realistically heading towards 12GB, and we are in early 2023; most upcoming games are going to use 10GB or more.

8GB is an ENTRY level amount, 12GB is the bare minimum for mid range, with 16GB realistically being the target for mid range and upper mid range.
It's called bad optimization. So bad that it causes problems for powerful video cards even at 1080p without DLSS or FSR.
Even so, I am guided by some benchmarks that contradict you.
These are:
The 6700 XT 12GB should not exceed the 3070 8GB. And it doesn't. The distance is maintained at all tested resolutions, even in new games.

The 3070 Ti is very close to the 2080 Ti at all tested resolutions (+/- 2-3%). The memory surplus does not help the 2080 Ti stand out, even at high resolutions.

Even if games are well optimized, a large amount of information stored in memory requires more computing power, a more powerful graphics processor. So, in 2-3 years, it's not the memory that will kill the 3070 Ti, but the power of its graphics processor. Other graphics chips from its generation will be present in the funeral procession, with all their surplus memory.
------
Except for no-lifers, all people who allocate a limited amount of time to games have 1-2 games they care about, and the rest are at most a curiosity. I don't think they will be emotionally affected if the video card has more modest results in games that don't interest them, as long as it behaves very well in the ones that do.
Posted on Reply
#206
ixi
bugI was talking about launch prices. Of course it can be had for less now.
Yeah, and the 3060 and 3060 Ti are still over Nvidia's defined prices. Hehe, just Nvidia boys buying overpriced products.
Posted on Reply
#207
Vario
In my opinion, it's only a bad deal due to the low VRAM. And that makes it quite a bad deal. If it had 12GB, it would be reasonable.

The 3060 and 3060 Ti were in the $400-450 range. I paid $420 for the 3060 12GB from EVGA's waitlist simply because it was the only card I could buy (AMD or NVIDIA) in 2021.
Posted on Reply
#209
Vayra86
Gica------
Except for nolifers, all people who allocate a limited amount of time to games think they have 1-2 games and the rest are at most curiosity. I don't think they will be emotionally affected if the video card has more modest results in games that don't interest them, if it behaves very well in the ones that interest them.
The coping mechanism is strong in you, strong it is!

The bottom line is, you don't need to make any concessions if you pick a better GPU. Emotionally affected, not by a long shot ;) 8GB is entry level today, no way around it there is!

Posted on Reply
#210
ixi
Vayra86The coping mechanism is strong in you, strong it is!

The bottom line is, you don't need to make any concessions if you pick a better GPU. Emotionally affected, not by a long shot ;) 8GB is entry level today, no way around it there is!

Sir you, a cookie deserve. Now go, awaits kitchen.

Agreed, 8GB is too small for the future. And if you use the GPU at 4K, then you will feel it as well.
Posted on Reply
#211
Gica
Vayra86The coping mechanism is strong in you, strong it is!
I apologize if I offended you. I have only just noticed your post count, on TPU alone.

Last review, 2023 edition
Posted on Reply
#212
Vayra86
bugI was talking about launch prices. Of course it can be had for less now.


970 and 1070 were not mid range cards, they were high end. People started calling x70 mid range starting with the 2070, but not because they were mid range, but because they weren't carved out of the x02 silicon anymore.
Upper midrange - bottom high end. The x70 was always presented as the 'can do everything' card for the most common resolution of its time. Poor man's high end, if you will. Nvidia says that for Ada that is 1440p in their own marketing. And I think it's even true - today.

x70 was never carved out of the 102?
Or am I misreading you? x70 used to be, and even in Ada still is, a cut-down 104. Since Kepler at least. That is also why it's bottom high end; x80 is the full 104, x70 the failed ones, so they were also highly competitive on price.
Posted on Reply
#213
bug
Vayra86Upper midrange - bottom high end. The x70 was always presented as the 'can do everything' card for the most common resolution of its time. Poor man's high end, if you will. Nvidia says that for Ada that is 1440p in their own marketing. And I think it's even true - today.

x70 was never carved out of the 102?
Or am I misreading you? x70 used to be, and even in Ada still is, a cut-down 104. Since Kepler at least. That is also why it's bottom high end; x80 is the full 104, x70 the failed ones, so they were also highly competitive on price.
Right you are. The 1070 was still 104 silicon; the 2070 was the first 106. The Super that replaced it was 104 again. I wasn't remembering correctly, so I looked it up again.
Posted on Reply
#214
tfdsaf
GicaIt's called bad optimization. So bad that it causes problems for powerful video cards even at 1080p without DLSS or FSR.
Even so, I am guided by some benchmarks that contradict you.
These are:
The 6700 XT 12GB should not exceed the 3070 8GB. And it doesn't. The distance is maintained at all tested resolutions, even in new games.

The 3070 Ti is very close to the 2080 Ti at all tested resolutions (+/- 2-3%). The memory surplus does not help the 2080 Ti stand out, even at high resolutions.

Even if games are well optimized, a large amount of information stored in memory requires more computing power, a more powerful graphics processor. So, in 2-3 years, it's not the memory that will kill the 3070 Ti, but the power of its graphics processor. Other graphics chips from its generation will be present in the funeral procession, with all their surplus memory.
------
Except for no-lifers, all people who allocate a limited amount of time to games have 1-2 games they care about, and the rest are at most a curiosity. I don't think they will be emotionally affected if the video card has more modest results in games that don't interest them, as long as it behaves very well in the ones that do.
Average fps doesn't tell you the picture when the GPU runs out of memory, but you can clearly notice the giant stutters. Frame times are impacted a lot more, but even then, they don't tell the full picture!

You might be seeing barely a 5% difference, maybe even less, but in reality one is unplayable and the other is buttery smooth!
bugI was talking about launch prices. Of course it can be had for less now.


970 and 1070 were not mid range cards, they were high end. People started calling x70 mid range starting with the 2070, but not because they were mid range, but because they weren't carved out of the x02 silicon anymore.
Mid range is the price point the GPUs are at. You could have the RTX 4080 at $350 and it would be mid range; the name doesn't matter. Again, it also depends on the value: you could have, say, a GTX 1050 Ti at $300 as a mid range card, but the value wouldn't be there.

You have entry level, mainstream, mid range and high end. These are all defined by price points; you can't have a GPU cost $250 and be entry level, it has to be $150, plus or minus $20, and below. Mainstream is $150 to $300; again, not set in stone, but plus or minus $30.
Posted on Reply
#215
Gica
tfdsafAverage fps doesn't tell you the picture when the GPU runs out of memory, but you can clearly notice the giant stutters. frametimes are impacted a lot more, but even then, it doesn't tell the full picture!
I have had the video card for two years and I know how it behaves. I don't care what will happen in two years; the cycle of this series will be finished by then. Do you see any miracle of the Vega Frontier 16GB against the 1080 Ti? No! It was and will remain weaker. Don't buy the 4060 (Ti) for 4K. It will do great with 8GB of VRAM; the graphics processor will be the first limitation at a higher resolution and/or higher details.
Posted on Reply
#216
tfdsaf
GicaI have had the video card for two years and I know how it behaves. I don't care what will happen in two years; the cycle of this series will be finished by then. Do you see any miracle of the Vega Frontier 16GB against the 1080 Ti? No! It was and will remain weaker. Don't buy the 4060 (Ti) for 4K. It will do great with 8GB of VRAM; the graphics processor will be the first limitation at a higher resolution and/or higher details.
Well, it's dishonest to compare a GPU from 5 years ago that had an overkill 16GB of VRAM to GPUs coming out TODAY, or not even out yet as with the RTX 4060 series, and claim that 8GB is just enough. 8GB is the new 4GB: when the 1000 series came out, a ton of games were already using over 4GB, and within a year something like half of all new games used 4GB or more.

Right now any GPU that comes out with 8GB is essentially entry level. I mean, look at all of the games announced in the past 2 months; they all have crazy high specs even for minimum requirements.

I can name you 5-6 games out right now that all use way more than 8GB of VRAM, and yes, technically you can play them, but you have to lower textures significantly even at 1080p, and textures are the most important graphical component; they are what makes games look good!
Posted on Reply
#217
bug
tfdsafWell, it's dishonest to compare a GPU from 5 years ago that had an overkill 16GB of VRAM to GPUs coming out TODAY, or not even out yet as with the RTX 4060 series, and claim that 8GB is just enough. 8GB is the new 4GB: when the 1000 series came out, a ton of games were already using over 4GB, and within a year something like half of all new games used 4GB or more.

Right now any GPU that comes out with 8GB is essentially entry level. I mean, look at all of the games announced in the past 2 months; they all have crazy high specs even for minimum requirements.

I can name you 5-6 games out right now that all use way more than 8GB of VRAM, and yes, technically you can play them, but you have to lower textures significantly even at 1080p, and textures are the most important graphical component; they are what makes games look good!
Then why is the 7600 also rumored to be equipped with 8GB?
Posted on Reply
#218
tfdsaf
bugThen why is the 7600 also rumored to be equipped with 8GB?
It shouldn't be, it should have more! Just because companies force it down our throats doesn't mean we have to accept it.
Posted on Reply
#219
bug
tfdsafIt shouldn't be, it should have more! Just because companies force it down our throats doesn't mean we have to accept it.
Or maybe, just maybe, 8GB is perfectly fine for that segment?

I will say, I don't understand why people have developed a fetish for VRAM. I've been buying video cards for over 20 years now, and not once did I have to replace one because of VRAM. Sure, there has been the occasional mismatch here and there (whether low VRAM or the dreaded 64-bit bus), but those were rare and easy to spot/avoid.
This "not enough VRAM" thing is just a fad, imho. Tech sites have figured out that these days, picking the right selection of games that run fine on card X and less fine on card Y rakes in page views, so that's what they publish at every occasion they have (how I miss HardOCP's "highest playable settings" and "apples-to-apples" graphs). Take a step back from that and you realize your latest-generation card can play (tens of) thousands of games and struggle with maybe 50*. I really don't have a problem with that.

*Many of those 50 titles can be played at reduced settings, which again is not really an issue, unless you bought into high end. In that case, yes, you are entitled to a card that can play everything you throw at it, but even then, there will always be a few developers that will take a "you can't max out our title on current-gen hardware, that's how cool we are" stance.
Posted on Reply
#220
tfdsaf
bugOr maybe, just maybe, 8GB is perfectly fine for that segment?

I will say, I don't understand why people have developed a fetish for VRAM. I've been buying video cards for over 20 years now, and not once did I have to replace one because of VRAM. Sure, there has been the occasional mismatch here and there (whether low VRAM or the dreaded 64-bit bus), but those were rare and easy to spot/avoid.
This "not enough VRAM" thing is just a fad, imho. Tech sites have figured out that these days, picking the right selection of games that run fine on card X and less fine on card Y rakes in page views, so that's what they publish at every occasion they have (how I miss HardOCP's "highest playable settings" and "apples-to-apples" graphs). Take a step back from that and you realize your latest-generation card can play (tens of) thousands of games and struggle with maybe 50*. I really don't have a problem with that.

*Many of those 50 titles can be played at reduced settings, which again is not really an issue, unless you bought into high end. In that case, yes, you are entitled to a card that can play everything you throw at it, but even then, there will always be a few developers that will take a "you can't max out our title on current-gen hardware, that's how cool we are" stance.
You as a consumer demand less for more money, thus we get bad value. I demand more for less money, thus we get better value!

We've been through this at least 4 different times, when 1GB was not enough and Nvidia was equipping cards with 768MB of VRAM, or 512MB, etc. Once we get a new generation of games, and that usually comes within a year of new consoles launching, we get higher requirements, and VRAM is a key requirement.

Again, when the 1000 series dropped, a 4GB card was dead in the water on arrival if you wanted to play games at the highest settings at 1080p, and this has always been the case! The new generation's x60 was as fast as the last generation's x80 flagship, or very close to it, so for a few years you could run games at max settings at 1080p.

I remember that within 1 year after the GTX 1060 released, most new games consumed over 4GB of VRAM; thankfully at least one version of the 1060 came with 6GB of VRAM, and you can still use it to play games at 1080p and medium settings, while with the 3GB version you can't.

Even most 4GB GPUs can't run games anymore; they are a stutter fest. And again, resolution doesn't really reduce texture space; the difference is usually around 500MB, up to 1GB in some rare cases. And you can always lower texture quality, but texture quality doesn't reduce or increase FPS, so if you have enough VRAM you can ALWAYS set textures to maximum settings and have the game always look its best! So why limit it with the VRAM amount?

We'll have to wait and see what the actual performance and price of this card are, but if it is anywhere in the region of $300 to 400 it needs to have more than 8GB of VRAM. They could have a version with 8GB that costs, say, $300 and one with 16GB that costs $350, but they need to have a 16GB version.

Resident Evil, Hogwarts Legacy, Callisto Protocol, The Last of Us, etc., and a dozen upcoming games all use over 8GB of VRAM.
Posted on Reply
#221
bug
tfdsafYou as a consumer demand less for more money, thus we get bad value. I demand more for less money, thus we get better value!
I don't demand less; I won't upgrade unless I'm looking at at least a +20% fps uplift. I just don't fret over particular specs.
Posted on Reply
#222
Gica
tfdsafWell, it's dishonest to compare a GPU from 5 years ago that had an overkill 16GB of VRAM to GPUs coming out TODAY, or not even out yet as with the RTX 4060 series, and claim that 8GB is just enough. 8GB is the new 4GB: when the 1000 series came out, a ton of games were already using over 4GB, and within a year something like half of all new games used 4GB or more.

Right now any GPU that comes out with 8GB is essentially entry level. I mean, look at all of the games announced in the past 2 months; they all have crazy high specs even for minimum requirements.

I can name you 5-6 games out right now that all use way more than 8GB of VRAM, and yes, technically you can play them, but you have to lower textures significantly even at 1080p, and textures are the most important graphical component; they are what makes games look good!
I repeat once again the results of the last review. The gap between the 6800 and the 3070 Ti has remained the same as two years ago. New games were added and nothing changed. How did 16GB of VRAM help against 8GB? Not at all. In 2-3 years the two will be history, and not because of memory.
The 2080 Ti, the king of kings before Ampere, is fighting the 4060 now. The 4070 humiliates it without right of appeal. Insufficient VRAM? Let's be serious!

Posted on Reply
#223
ixi
GicaI repeat once again the results of the last review. The gap between the 6800 and the 3070 Ti has remained the same as two years ago. New games were added and nothing changed. How did 16GB of VRAM help against 8GB? Not at all. In 2-3 years the two will be history, and not because of memory.
The 2080 Ti, the king of kings before Ampere, is fighting the 4060 now. The 4070 humiliates it without right of appeal. Insufficient VRAM? Let's be serious!

Why do you think nothing has changed, just because the 4070 was added to the list? Did the reviewer say he went through the AMD GPUs again, even though the topic is the 4070? :D Most likely the graph was edited and only the 4070 was thrown in.
Posted on Reply
#224
tfdsaf
GicaI repeat once again the results of the last review. The gap between the 6800 and the 3070 Ti has remained the same as two years ago. New games were added and nothing changed. How did 16GB of VRAM help against 8GB? Not at all. In 2-3 years the two will be history, and not because of memory.
The 2080 Ti, the king of kings before Ampere, is fighting the 4060 now. The 4070 humiliates it without right of appeal. Insufficient VRAM? Let's be serious!

Dude, are you crazy? They don't retest every GPU each time they bench a new one; some of those results are years old. The only time the results get updated is when the whole test system is changed, usually when a significantly faster CPU is introduced over the setup in use at the time; that usually means at least 3 years without changing the test system, for maximum comparability of results.

Here is an updated look at the RTX 3070 vs the RX 6800 in new games. The RTX 4060 is NOT even out yet and is already coming in with insufficient VRAM!

Go ahead and buy it; heck, send Nvidia a $5,000 check in the mail as a thank-you for PhysX while you're at it.

SANE consumers and I demand better value!
Posted on Reply
#225
BoboOOZ
GicaI repeat once again the results of the last review. The gap between 6800 and 3070Ti a remained the same as two years ago. New games were added and nothing changed. What helped 16GB vRAM in front of 8GB? With nothing. In 2-3 years the two will be history and not because of memory.
2080 Ti, the king of kings before Ampere, is fighting the 4060 now. 4070 humiliates him without the right of appeal. Insufficient vRAM? Let's be serious!

Well here's a review that says otherwise.

Edit: @tfdsaf you beat me to it.
Posted on Reply