Wednesday, March 29th 2023
NVIDIA GeForce RTX 4060 Ti Name Gets Confirmed
Both the NVIDIA GeForce RTX 4060 Ti and the RTX 4060 have been rumored to launch in May, and the RTX 4060 Ti name has now been all but confirmed, as NVIDIA board partners have received the marketing materials, including the box design template. Once the board partners get the box and logo design templates and send the boxes out for printing, the name is pretty much carved in stone.
The leak comes from MEGAsizeGPU, and we had a very similar leak for the RTX 4070 Ti earlier. Unfortunately, the leaked box design and key features do not reveal or confirm previous rumors about any of the important specifications, so the rumored 8 GB of memory and PCIe Gen 4 x8 interface remain unconfirmed. One thing is clear: NVIDIA is building up steam and gearing up to launch as many as three new RTX 40 series graphics cards in the next two months, with the GeForce RTX 4070 rumored for an April 13th launch, while the GeForce RTX 4060 Ti and the GeForce RTX 4060 should launch in May.
Sources:
MEGAsizeGPU (Twitter), via Videocardz
58 Comments on NVIDIA GeForce RTX 4060 Ti Name Gets Confirmed
I feel like these developers are doing something wrong if their games can't be run with high-res textures because of VRAM amounts. Like the whole "you need 11 GB of VRAM to run Far Cry 6's high-res textures", so the 3080 10 GB wasn't able to do it.
It feels like these developers are artificially limiting GPUs, almost like it's some kind of conspiracy with Nvidia/AMD (maybe more so Nvidia, since AMD cards tend to have more VRAM than equivalent Nvidia GPUs) to help push sales of newer or higher-end GPUs.
Their games are made for console first, so they used all 16 GB of RAM they could and then couldn't figure out how to make them run on 8 GB video cards. The code is likely a total mess.
All this for a game that ran on the GameCube. LMAO.
That being said, I would not buy a 4060/Ti with 8 GB in 2023 at such prices; that just feels wrong.
Also agree that new games lately have some stupid VRAM usage for some reason.
Yet, for example, A Plague Tale: Requiem has some of the best non-modded textures I've seen in a game, and it barely uses 6+ GB at 4K resolution, using their own engine and not even being an AAA studio.
With my low-ish resolution, I think I'll be fine for a while, or I'll drop settings down a notch / use DLSS on Quality, which I don't mind in most games anyway.
12 GB should be the standard at this point and price range, yeah.
You cannot blame devs here when you are talking about historic levels of VRAM stagnation on Nvidia's part.
Unless they can properly utilize the magical L2, which a lot should - emphasis on should. Because you ARE at the mercy of engine implementation and developer quality, and not so much the size of a certain texture, which is what made sufficient VRAM the much easier option.
I'm staying far away from low-bandwidth cards in any case, as much as I stay away from anything <12 GB for the next upgrade that has around 2x the core power. Sure, 8 GB on my current card could have been 6 GB and it would have worked out (in most cases - I have a few edge-case games that can actually utilize upwards of ~7 GB, but they could prob also do with less), which is quite precisely where the 2060s are at in comparison... but at 4 GB a 1080 would have been utterly crippled. That's kinda what Nvidia's latest is looking like: on the edge of crippled, and probably over it in two years' time.
Dream on, I mean, you're not wrong in terms of what the marketing appears to aim at, but... what?
Mind you, kind of embarrassing that your $400 - $500 graphics card in 2023 can barely run 1080p when that's what cards like the 1060 and 480 were advertised for.
8 GB on a $400+ card that, barring the VRAM limitation, would be a good 1440p card? Not so fine.
Same as 12 GB on the $800 4070 Ti. It won't be long before that starts to be a limit for a card that should be a top-end 1440p card and a pretty good 4K card. And even then, $800 for a top-end 1440p card and a pretty good 4K card is just too much IMO.
Shared GPU memory via PCIe 4.0 x16 is relatively slow, i.e. 32 GB/s per direction (or 64 GB/s full duplex), and system memory needs to serve both the CPU and the GPU.
For comparison, the Xbox 360's memory bandwidth is 22.4 GB/s (before Xbox 360/DX10-era 3DC+ texture compression). All DX12U PC GPUs support delta color compression (DCC).
Sustaining the full-duplex 64 GB/s of PCIe 4.0 x16 needs system memory that can deliver 64 GB/s, hence 128-bit DDR4-4000 or at least DDR5-4800 would be needed.
AMD's mobile "Phoenix Point" APU supports up to 128-bit LPDDR5-7600 with 121 GB/s. NVIDIA doesn't support PCIe 5.0.
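As a rough sanity check on those figures, here is a small back-of-the-envelope sketch in Python; the ~2 GB/s per-lane PCIe 4.0 figure and the helper names are my own illustration, not from the post:

    # Back-of-the-envelope bandwidth arithmetic (illustrative sketch).

    def dram_bandwidth_gbs(transfer_rate_mts, bus_width_bits):
        # Peak bandwidth = transfers per second * bytes per transfer.
        return transfer_rate_mts * 1e6 * (bus_width_bits / 8) / 1e9

    def pcie4_bandwidth_gbs(lanes):
        # PCIe 4.0 delivers roughly 2 GB/s per lane per direction after encoding overhead.
        return lanes * 2

    print(pcie4_bandwidth_gbs(16))          # ~32 GB/s per direction, ~64 GB/s full duplex
    print(dram_bandwidth_gbs(4000, 128))    # 128-bit DDR4-4000   -> 64.0 GB/s
    print(dram_bandwidth_gbs(4800, 128))    # 128-bit DDR5-4800   -> 76.8 GB/s
    print(dram_bandwidth_gbs(7600, 128))    # 128-bit LPDDR5-7600 -> 121.6 GB/s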
Killzone Shadow Fall's CPU vs GPU memory usage for PS4.
4,736 MB usage for Killzone Shadow Fall. PS4 has 5.5 GB available for games.
For Killzone Shadow Fall, the GPU:CPU memory usage ratio is about 2:1.
Scaled to the 13.5 GB available for PS5 games, the VRAM allocation is about 10 GB, and the PS5 is the baseline config. A PC version of a PS5 game with a higher-resolution texture pack can blast past 8 GB and 10 GB of VRAM.
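A quick sketch of that scaling argument, using the numbers above; the strict 2:1 split is my assumption from the Killzone figures, and the post rounds the result up to about 10 GB:

    # Scaling the PS4-era GPU:CPU memory split to the PS5 game budget (illustrative sketch).

    gpu_share = 2 / 3                  # ~2:1 GPU:CPU split observed in Killzone Shadow Fall

    killzone_usage_mb = 4736           # total game memory usage on PS4 (5.5 GB available)
    ps5_game_budget_gb = 13.5          # memory available to PS5 games

    print(round(killzone_usage_mb * gpu_share))       # ~3157 MB GPU-side allocation on PS4
    print(round(ps5_game_budget_gb * gpu_share, 1))   # ~9 GB with a strict 2:1 split; roughly the "about 10 GB" cited above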