Monday, April 17th 2023
NVIDIA to Target $450 Price-point with GeForce RTX 4060 Ti
NVIDIA is preparing its fifth GeForce RTX 40-series "Ada" graphics card launch in May 2023, with the GeForce RTX 4060 Ti. Red Gaming Tech reports that the company could target the USD $450 price-point with this SKU, putting it $150 below the recently launched RTX 4070, and $350 below the RTX 4070 Ti. The RTX 4060 Ti is expected to nearly max out the 5 nm "AD106" silicon, the same chip that powers the RTX 4070 Laptop GPU. While the notebook chip maxes it out, featuring all 4,608 CUDA cores physically present across its 36 SM, the desktop RTX 4060 Ti will be slightly cut down, featuring 34 SM, which works out to 4,352 CUDA cores. The "AD106" silicon features a 128-bit wide memory interface, and NVIDIA is expected to use conventional 18 Gbps-rated GDDR6 memory chips. The design goal behind the RTX 4060 Ti could be to beat the previous-generation RTX 3070 and sneak up on the RTX 3070 Ti, while offering greater energy efficiency and new features such as DLSS 3.
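The rumored numbers are easy to sanity-check: on "Ada," each SM carries 128 CUDA cores, and peak GDDR6 bandwidth is simply the per-pin data rate times the bus width in bytes. A minimal back-of-the-envelope sketch (the helper names are mine; the figures come from the report above):

```python
# Sanity-check of the rumored RTX 4060 Ti numbers.
CORES_PER_SM = 128  # CUDA cores per SM on "Ada" (same ratio as Ampere)

def cuda_cores(sm_count: int) -> int:
    """Total CUDA cores implied by an SM count."""
    return sm_count * CORES_PER_SM

def gddr6_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8) bytes per pin-transfer x rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(cuda_cores(36))                    # full AD106: 4608
print(cuda_cores(34))                    # rumored desktop RTX 4060 Ti: 4352
print(gddr6_bandwidth_gb_s(128, 18.0))   # 128-bit @ 18 Gbps -> 288.0 GB/s
```

At 288 GB/s, that is considerably less raw bandwidth than the RTX 3060 Ti's 448 GB/s (256-bit at 14 Gbps), which is presumably why Ada leans on its much larger L2 cache to compensate.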
Source:
Red Gaming Tech (YouTube)
237 Comments on NVIDIA to Target $450 Price-point with GeForce RTX 4060 Ti
I can't remember any open-world game that has had cops and cars popping in since Grand Theft Auto 3 on the PS2.
Graphics aren't everything; that's the reason I still play GTA Liberty City Stories on my PSP.
On the other hand, I played Cyberpunk for about 2 hours and refunded it. (The worst game I've ever seen.)
Personally, I wouldn't touch 8GB for higher-resolution gaming. FPS performance is one thing, but keeping quality settings consistent without maxing out VRAM is another.
$250 to $350 is mid-range, give or take $20.
Thing is, the x70 series has become much more expensive, so it's now "high end", but in name only! Realistically, the x70 should be around $400; some will argue possibly even lower than that!
I mean, heck, wasn't the GTX 980 high end at $500?
So, entry level is $150 and below, the mainstream segment is essentially $150-250, and mid-range runs from $250 to $350, again plus or minus $20-30 on those ranges depending on value; high end is essentially $350/400 and above! Historically that topped out around $500-600 for over 20 years, although we've seen it rise to over $1,000 in the last 5 years!
Even so, I'm guided by some reference points that contradict you.
These are:
The 6700 XT 12GB shouldn't beat the 3070 8GB. And it doesn't. The gap is maintained at all tested resolutions, even in new games.
The 3070 Ti stays very close to the 2080 Ti at all tested resolutions (+/- 2-3%). The memory surplus doesn't help the 2080 Ti stand out, even at high resolutions.
Even if games are well optimized, a large amount of data stored in memory demands more computing power, i.e. a more powerful graphics processor. So, in 2-3 years, it's not the memory that will kill the 3070 Ti, but the power of its graphics processor. Other graphics chips from its generation will be present in the funeral procession, with all their surplus memory.
------
Except for no-lifers, people who allocate a limited amount of time to games usually have 1-2 games they actually play, and the rest are at most a curiosity. I don't think they will be emotionally affected if the video card posts more modest results in games that don't interest them, as long as it behaves very well in the ones that do.
The 3060 and 3060 Ti were in the $400-450 range. I paid $420 for the 3060 12GB from EVGA's waitlist simply because it was the only card I could buy (AMD or NVIDIA) in 2021.
The bottom line is, you don't need to make any concessions if you pick a better GPU. Emotionally affected, not by a long shot ;) 8GB is entry level today; there's no way around it!
Agreed, 8GB is too small for the future. And if you use the GPU at 4K, you will feel it as well.
Last review, 2023 edition
x70 was never carved out of the 102?
Or am I misreading you? The x70 used to be, and even in Ada still is, a cut-down 104, since Kepler at least. That is also why it's the bottom of the high end: the x80 is the full 104, and the x70 uses the failed dies, so they were also highly competitive on price.
You might be seeing barely a 5% difference, maybe even less, but in reality, one is unplayable and the other is buttery smooth! Mid-range is the price point the GPUs are at. You could have the RTX 4080 at $350 and it would be mid-range; the name doesn't matter. Then again, it also depends on the value: you could have, say, the GTX 1050 Ti at $300 as mid-range, but the value wouldn't be there.
You have entry level, mainstream, mid-range and high end. These are all defined by price points; you can't have a GPU cost $250 and be entry level, it has to be $150, plus or minus $20, and below. Mainstream is $150 to $300, again not set-in-stone pricing, but plus or minus $30.
Right now, any GPU that comes out with 8GB is essentially entry level. I mean, look at all of the games announced in the past 2 months; they all have crazy high specs even for minimum requirements.
I can name 5-6 games out right now that all use way more than 8GB of VRAM, and yes, technically you can play them, but you have to lower textures significantly even at 1080p, and textures are the most important graphical component; they are what make games look good!
I will say, I don't understand why people have developed a fetish for VRAM. I've been buying video cards for over 20 years now and not once did I have to replace one because of VRAM. Sure, there has been an occasional mismatch here and there (whether low VRAM or the dreaded 64-bit bus), but those were rare and easy to spot/avoid.
This "not enough VRAM" is just a fad, imho. Tech sites have detected these days picking the right selection of games will run fine on card X and less fine on card Y rakes in page view, so that's what they publish every occasion they have (how I miss HardOCP's "highest playable settings" and "apples-to-apples" graphs). Take a step back from that and you realize your latest generation card can play (tens of) thousands of games and struggle with maybe 50*. I really don't have a problem with that.
*Many of those 50 titles can be played at reduced settings, which, again, is not really an issue unless you bought into the high end. In that case, yes, you are entitled to a card that can play everything you throw at it, but even then, there will always be a few developers that take the attitude of "you can't max out our title on current-gen hardware, that's how cool we are".
We've been through this at least 4 different times, like when 1GB was no longer enough and Nvidia was still equipping cards with 768MB or 512MB of VRAM, etc. Once we get a new generation of games, and that usually comes within a year of new consoles launching, we get higher requirements, and VRAM is a key requirement.
Again, when the 1000 series dropped, a 4GB card was dead in the water on arrival if you wanted to play games at the highest settings at 1080p, and this has always been the case! The new-generation x60 was as fast as the last-generation x80 flagship, or very close to it, so for a few years you could run games at max settings at 1080p.
I remember that within a year of the GTX 1060's release, most new games consumed over 4GB of VRAM. Thankfully, at least one version of the 1060 came with 6GB of VRAM, and you can still use it to play games at 1080p and medium settings, while with the 3GB version you can't.
Even most 4GB GPUs can't run new games anymore; they are a stutter fest. And again, resolution doesn't really reduce texture space; the difference is usually around 500MB, up to 1GB in some rare cases. You can always lower texture quality, but texture quality doesn't reduce or increase FPS, so if you have enough VRAM you can ALWAYS set textures to maximum settings and have the game always look its best! So why limit that with the VRAM amount?
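To put rough numbers on that point: a texture's memory footprint is set by the asset's dimensions, not the render resolution; only render targets scale with resolution. A quick sketch, assuming uncompressed RGBA8 for simplicity (real games use compressed formats, but the proportions are similar):

```python
# Why render resolution barely touches texture memory: texture size is fixed
# by the asset, while only render targets scale with resolution.
def texture_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of one texture in MiB, including a full mip chain (~4/3 overhead)."""
    return width * height * bytes_per_pixel * (4 / 3) / 2**20

def framebuffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of a single render target in MiB."""
    return width * height * bytes_per_pixel / 2**20

print(f"4K x 4K texture: {texture_mib(4096, 4096):.0f} MiB")      # ~85 MiB at any resolution
print(f"1080p target:    {framebuffer_mib(1920, 1080):.1f} MiB")  # ~7.9 MiB
print(f"4K target:       {framebuffer_mib(3840, 2160):.1f} MiB")  # ~31.6 MiB
```

Dropping from 4K to 1080p saves tens of MiB per render target (a modern renderer keeps several G-buffers and intermediate targets, so the total savings reach a few hundred MB), while the gigabytes of texture data stay untouched; that roughly lines up with the 500MB-1GB figure above.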
We'll have to wait and see what this card's actual performance and price are, but if it lands anywhere in the region of $300 to $400, it needs to have more than 8GB of VRAM. They could have an 8GB version that costs, say, $300 and a 16GB version at $350, but they need to have a 16GB version.
Resident Evil, Hogwarts Legacy, The Callisto Protocol, The Last of Us, etc., plus a dozen upcoming games, all use over 8GB of VRAM.
The 2080 Ti, the king of kings before Ampere, is fighting the 4060 now. The 4070 humiliates it without right of appeal. Insufficient VRAM? Let's be serious!
Here is an updated look at the RTX 3070 vs the RX 6800 in new games. The RTX 4060 is NOT even out yet and is already coming in with insufficient VRAM!
Go ahead and buy it; heck, send Nvidia a $5,000 check in the mail as a thank-you for PhysX while you're at it.
SANE consumers like me demand better value!
Edit: @tfdsaf you beat me to it.