Wednesday, June 14th 2023
NVIDIA GeForce RTX 4060 to Release on June 29
NVIDIA could be moving up the launch of its GeForce RTX 4060 (non-Ti) graphics card from the July 2023 date the company originally announced. Leaked documents shared by MEGAsizeGPU suggest that NVIDIA could make the RTX 4060 available on June 29, which means reviews could go live on June 28 for cards at MSRP, and June 29 for premium cards priced above MSRP. The card was earlier expected to launch alongside the 16 GB variant of the RTX 4060 Ti in July.
The RTX 4060 is a significantly different product from the RTX 4060 Ti the company launched in May: it is based on the smaller AD107 silicon. The card is expected to feature 3,072 CUDA cores, 24 RT cores, 96 Tensor cores, 96 TMUs, and 32 ROPs, compared to the 4,352 CUDA cores, 34 RT cores, 136 Tensor cores, 136 TMUs, and 48 ROPs of the RTX 4060 Ti. The memory configuration is similar, with 8 GB of GDDR6 memory across a 128-bit wide memory bus; however, the memory speed is slightly lower, at 17 Gbps vs. the 18 Gbps of the Ti. The RTX 4060 has a TGP of just 115 W. The company hasn't finalized its price yet.

Update Jun 14th: NVIDIA confirmed the launch date on Twitter:
Sources:
MEGAsizeGPU (Twitter), NVIDIA Twitter
NVIDIA: "The GeForce RTX 4060 will now be available to order starting June 29, at 6AM Pacific."
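As a quick sanity check on the memory specs quoted above, peak memory bandwidth follows from the bus width times the per-pin data rate. The Python sketch below shows the arithmetic for the two cards; the helper name is ours, not anything from NVIDIA's spec sheets.

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) x per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# RTX 4060: 128-bit bus at 17 Gbps -> 272 GB/s
print(peak_bandwidth_gbs(128, 17))  # 272.0
# RTX 4060 Ti: 128-bit bus at 18 Gbps -> 288 GB/s
print(peak_bandwidth_gbs(128, 18))  # 288.0
```

So the 1 Gbps speed reduction costs the non-Ti card roughly 6% of the Ti's memory bandwidth, on top of the cut-down core counts.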
76 Comments on NVIDIA GeForce RTX 4060 to Release on June 29
It'll make a good budget esports card for 1080p medium/competitive settings for years to come. As long as it's actually "budget", there's no problem with this. If it's selling from $299 upwards, though, it's not really going to offer a significantly better esports experience than a vanilla RX 6600, which you can get brand new for $179 now.
The 3050 / 3060 8GB / 6650XT / 7600 were all given negative review coverage not because they're bad cards in isolation, but because they're too expensive for the target audience they're serving (esports and casual gamers), whilst having insufficient VRAM for the more demanding AAA gamer looking for a new GPU. Even the 3050 4GB has a purpose and someone it will satisfy, but not at the ridiculous asking price of $249.
It's not like it's a unicorn either (the 6700 and 6700 XT currently sell in the low $300s and do well at 1440p).
Meanwhile the RTX 4060 loses 33% of the VRAM that its predecessor had, in addition to halving the number of PCIe lanes. There are already benches up on YT where the 4060 Ti gets its ass handed to it by the RTX 3060 in minimum frame rates at 4K purely due to the loss of VRAM (RE4 Remake at 4K maxed out, for example). The 3060 gets minimums in the 20s whilst the 4060 Ti drops to low single digits, freezes and stutters on even a very up-to-date rig. It's dead on arrival.
The only place where the RTX 4060 should be released is a recycling centre -- it's literal e-waste. After seeing this crap show of a GPU market, I thank god everyday that the leather jacket-wearing clown at Ngreedia never managed to get his grubby hands on ARM, otherwise he would have gutted the entire technology sector and turned it into the same dumpster fire.
That said, while the 4060Ti sits just about what I'd call an acceptable price/perf zone, I'm quite sure reviews will show the plain 4060 is a good way off.
And I think I know what happened: Nvidia (and AMD for that matter) launched their initial high-end SKUs and were horrified to notice people aren't willing to pay those prices anymore. So they went into damage control mode to get their lower end SKUs into a more palatable range. Only in order to achieve that, they were forced to cut into the silicon way more than originally planned, resulting in very unbalanced hardware. If I'm right, their next gen will be looking much saner than what we're looking at this iteration.
3070 was fine back when it came out. The few titles back then that suffered could usually be alleviated just by not using ultra textures. People complain about the perf and vram usage in 4k a lot on cards that are sold as 1080p cards anyway, like the 3060. The goalposts move constantly with new games and new hardware too. Nothing is futureproof. I do think that cards like the 4060 should be launching with 12GB minimum though, just to keep them relevant and useful for longer. AMD are at least more generous with their VRAM. Just a shame they often have driver issues with various cards.
Also, whether those games needed that much VRAM or not isn't really the point that I was getting at -- newer, poorly made console ports are only going to increase their requirements. My point was, this card just wasn't made with longevity in mind like the GTX 1070 and past similar 70 cards were, and that seems to explain the complete stagnation that the PC GPU market has now. The 3070 has the performance to run for at least the next 5 years but is way too kneecapped to do so by that 8 GB framebuffer. The reason it seems to have remained usable over the past 3 years is that most console ports were last-gen re-releases and "remasters" (Spiderman, God of War, TLOU Re-remastered, etc).
It's why I rarely play anything day1 any more.
I mean, forget DLSS and friends. Two decades ago, antialiasing was all about supersampling. Then, in order to boost performance, multisampling was added. Most reviews started showing how MSAA results are inferior to SSAA. Fast forward two decades and we have TAA, FXAA and other approximations that can blur everything or shimmer. And we're happy when we can use MSAA instead. Nobody even mentions SSAA anymore; people have accepted MSAA is good enough.
Anisotropic filtering and its optimizations have gone through the same process as well. Now it's resolution's turn, that's all. I don't really game anymore, but before that, I almost never played anything on day 1. Didn't lose anything in the process either ;)
DLSS/FSR on the other hand, were invented to make things look slightly worse to compensate for the GPU power that you don't have.
Anti-aliasing adds to the experience at the cost of some performance, but upscaling takes away from it to gain back some performance.
My take on the matter is that if I can run native, I will. Upscaling can look good, very good even, but it will never be more than an aid to boost framerates when needed.
Then we established that he mainly games as he would on a gaming console - from a comfortable distance to view the whole screen, playing fast-paced shooters and role playing games mostly using a controller, not keyboard and mouse. At his distance, even 1080p looked good, and by focusing on a larger area of the screen there's very little chance of noticing minor differences in image quality on small detail.
Then we compared it with my usage: a 27" 4K screen at close distance, playing flight and racing sims where you are often focusing on small detail on screen, like a target, runway, enemy, or a curve in the far distance - you notice every small difference in anti-aliasing, every drop in native resolution due to DLSS. It's similar with real-time strategies and other games with lots of small detail.
PC gaming is much more varied than console gaming. For my friend's usage, it mostly held true that even "performance DLSS" looked good.
It's not for everybody, of course. But that's why it's an option in the settings menu. It's not turned on automatically for anyone.