Wednesday, March 29th 2023

NVIDIA GeForce RTX 4060 Ti Name Gets Confirmed

Both the NVIDIA GeForce RTX 4060 Ti and the RTX 4060 have been rumored to launch in May, and now the RTX 4060 Ti name has been all but confirmed, as NVIDIA board partners have received the marketing materials, including the box design template. Once board partners get the box and logo design templates and send the boxes out for printing, the name is pretty much carved in stone.

The leak comes from MEGAsizeGPU, and we had a very similar leak for the RTX 4070 Ti earlier. Unfortunately, the leaked box design and key features do not reveal or confirm previous rumors about any of the important specifications, so the rumored 8 GB of memory and PCIe Gen 4 x8 interface remain unconfirmed. One thing is clear: NVIDIA is building up steam and gearing up to launch as many as three new RTX 40-series graphics cards in the next two months, with the GeForce RTX 4070 rumored for an April 13th launch, while the GeForce RTX 4060 Ti and the GeForce RTX 4060 should follow in May.
Sources: MEGAsizeGPU (Twitter), via Videocardz

58 Comments on NVIDIA GeForce RTX 4060 Ti Name Gets Confirmed

#1
Firedrops
Wow, a 4060 and 4060 ti??? Never saw this coming!
#2
Slane
Are these cards dead on arrival with the latest games that need more and more VRAM? Because I can't understand having only 8 GB in 2023, when my cheap GTX 1060 6 GB was only €220 at the end of 2016.
#3
pavle
Don't get all too excited; just yesterday I saw Daniel Owen's video about the RTX 3070 Ti 8 GB, and it can't even play Resident Evil 4 Remake (with high textures) because of "only" 8 GB of VRAM, even though the core is plenty powerful. Maybe NVIDIA will store their users' textures in a cloud. :wtf:
#4
N/A
What's wrong with using normal textures at 1440p/144?
#5
TheinsanegamerN
N/A: What's wrong with using normal textures at 1440p/144?
The core is more than capable of handling high-quality textures, and it cost $800+ just a year ago. Or is high-end cards getting gimped by cheap memory buses acceptable to you?
#6
piloponth
It is laughable how these downsized chips with a 128-bit bus are getting the Ti suffix.
#7
neatfeatguy
pavle: Don't get all too excited; just yesterday I saw Daniel Owen's video about the RTX 3070 Ti 8 GB, and it can't even play Resident Evil 4 Remake (with high textures) because of "only" 8 GB of VRAM, even though the core is plenty powerful. Maybe NVIDIA will store their users' textures in a cloud. :wtf:
I've played RE4. Why do we need remakes, and then have them created to a point where most recent GPUs can't even handle them with the high-end textures?

I feel like these developers are doing something wrong if their games can't be run with high-res textures because of VRAM amounts. Like the whole "you need 11 GB of VRAM to run Far Cry 6's high-res textures," so the 3080 10 GB wasn't able to do it.

It feels like these developers are artificially limiting GPUs, almost like it's some kind of conspiracy with NVIDIA/AMD (maybe more so NVIDIA, since AMD cards tend to have more VRAM compared to equivalent NVIDIA GPUs) to help push the sale of newer or higher-end GPUs.
#8
TheinsanegamerN
neatfeatguy: I've played RE4. Why do we need remakes, and then have them created to a point where most recent GPUs can't even handle them with the high-end textures?

I feel like these developers are doing something wrong if their games can't be run with high-res textures because of VRAM amounts. Like the whole "you need 11 GB of VRAM to run Far Cry 6's high-res textures," so the 3080 10 GB wasn't able to do it.

It feels like these developers are artificially limiting GPUs, almost like it's some kind of conspiracy with NVIDIA/AMD (maybe more so NVIDIA, since AMD cards tend to have more VRAM compared to equivalent NVIDIA GPUs) to help push the sale of newer or higher-end GPUs.
Far easier answer: they're Japanese. Japanese devs have NEVER been able to figure out the whole "PC" thing. It's not like this was even the first Japanese game this year that came out and ran like absolute junk on powerful PCs.

Their games are made for console first, so they used all 16 GB of RAM they could and then couldn't figure out how to make it run on 8 GB video cards. The code is likely a total mess.

All this for a game that ran on the GameCube. LMAO.
#9
Sithaer
Personally I'm doing alright with my 3060 Ti and 8 GB of VRAM, but that's because I'm only using an FHD ultrawide resolution (2560x1080), and I have no plans to upgrade from this resolution anytime soon.
That being said, I would not buy a 4060/Ti with 8 GB in 2023 for such prices; that just feels wrong.

Also, I agree that new games lately have some stupid VRAM usage for some reason.
Yet, for example, Plague Tale Requiem has some of the best non-modded textures I've seen in a game, and it barely uses 6+ GB at 4K resolution (using their own engine, and they're not even a AAA studio).
#10
Frick
Fishfaced Nincompoop
Sithaer: Personally I'm doing alright with my 3060 Ti and 8 GB of VRAM, but that's because I'm only using an FHD ultrawide resolution (2560x1080), and I have no plans to upgrade from this resolution anytime soon.
That being said, I would not buy a 4060/Ti with 8 GB in 2023 for such prices; that just feels wrong.

Also, I agree that new games lately have some stupid VRAM usage for some reason.
Yet, for example, Plague Tale Requiem has some of the best non-modded textures I've seen in a game, and it barely uses 6+ GB at 4K resolution (using their own engine, and they're not even a AAA studio).
I have the 3060 Ti as well, and it's not fast enough to push 4K at high settings in all games anyway. But yes, these cards definitely should have 12 GB.
#11
Sithaer
Frick: I have the 3060 Ti as well, and it's not fast enough to push 4K at high settings in all games anyway. But yes, these cards definitely should have 12 GB.
I don't have a 4K monitor; that was just an example that some games can have good-looking textures and generally good graphics and still not use much VRAM, unlike some recent games that don't look any better yet use up everything you have and have various issues.
With my low-ish resolution, I think I will be fine for a while, or I will drop settings down a notch / use DLSS on Quality, which I don't mind in most games anyway.

12 GB should be the standard at this point and price range, yeah.
#12
evernessince
neatfeatguy: I've played RE4. Why do we need remakes, and then have them created to a point where most recent GPUs can't even handle them with the high-end textures?

I feel like these developers are doing something wrong if their games can't be run with high-res textures because of VRAM amounts. Like the whole "you need 11 GB of VRAM to run Far Cry 6's high-res textures," so the 3080 10 GB wasn't able to do it.

It feels like these developers are artificially limiting GPUs, almost like it's some kind of conspiracy with NVIDIA/AMD (maybe more so NVIDIA, since AMD cards tend to have more VRAM compared to equivalent NVIDIA GPUs) to help push the sale of newer or higher-end GPUs.
8 GB has been mid-range since the RX 480, and that was 7 years ago. Mind you, that's generous to NVIDIA, given that the MSRP of the RX 480 8 GB was $240, whereas this 4060 Ti will likely cost $400+ given the $400 MSRP of the 3060 Ti.

You cannot blame devs here when you are talking about historic levels of VRAM stagnation on NVIDIA's part.
#13
Vayra86
Slane: Are these cards dead on arrival with the latest games that need more and more VRAM? Because I can't understand having only 8 GB in 2023, when my cheap GTX 1060 6 GB was only €220 at the end of 2016.
Yes.
Unless they can properly utilize the magical L2, which a lot of games should - emphasis on should. Because you ARE left to the devices of engine implementation and developer quality, rather than simply to the size of a certain texture, which is what made plain sufficient VRAM so much easier.

I'm staying far away from low-bandwidth cards in any case, just as I'm staying away from anything under 12 GB for the next upgrade with around 2x the core power. Sure, the 8 GB on my current card could have been 6 GB and it would have worked out (in most cases - I have a few edge-case games that can actually utilize upwards of ~7 GB, but they could probably also do with less), which is quite precisely where the 2060s sit in comparison... but at 4 GB, a 1080 would have been utterly crippled. That's kind of what NVIDIA's latest is looking like: on the edge of crippled, and probably over it in two years' time.
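To put rough numbers on the cache argument, here is a minimal sketch of the usual hit-rate amplification model; the bus width, data rate, and hit rates are illustrative assumptions, not confirmed RTX 4060-series specs. Note that it models bandwidth only - a big L2 does nothing for VRAM capacity.

```python
# Back-of-the-envelope model of the "magical L2" argument: if a fraction of
# memory requests hit a large on-die cache, the narrow DRAM bus only sees
# the misses. All numbers are illustrative assumptions, not confirmed specs.

def dram_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak DRAM bandwidth in GB/s: per-pin data rate times bus width."""
    return bus_width_bits * data_rate_gbps / 8

def effective_bandwidth_gb_s(dram_bw: float, l2_hit_rate: float) -> float:
    """DRAM only sees (1 - hit_rate) of traffic, so the same bus can
    sustain dram_bw / (1 - hit_rate) of total request bandwidth."""
    return dram_bw / (1.0 - l2_hit_rate)

narrow_bus = dram_bandwidth_gb_s(128, 18.0)        # 128-bit GDDR6 @ 18 Gbps -> 288 GB/s
print(effective_bandwidth_gb_s(narrow_bus, 0.50))  # ~576 GB/s if half the requests hit L2
print(effective_bandwidth_gb_s(narrow_bus, 0.25))  # ~384 GB/s with a poor hit rate
```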
#14
Why_Me
For all the people on here talking 1440p and 4K... these cards are geared for 1080p.
#15
Vayra86
Why_Me: For all the people on here talking 1440p and 4K... these cards are geared for 1080p.
Yeah, of course - I have a 2016 card that plays 1440p... at $500 MSRP. Even today.
Dream on. I mean, you're not wrong in terms of what the marketing appears to aim at, but... what?
#16
evernessince
Why_Me: For all the people on here talking 1440p and 4K... these cards are geared for 1080p.
The difference in VRAM usage between 1080p and 4K is around 1 GB in most games. That's not going to make up for the lack of VRAM.

Mind you, it's kind of embarrassing that your $400-$500 graphics card in 2023 can barely run 1080p, when that's what cards like the 1060 and 480 were advertised for.
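A quick sketch of why resolution alone moves VRAM usage relatively little: render targets scale with pixel count, but the texture pool, which dominates the budget, largely does not. The target count and pixel format below are illustrative assumptions, not any real engine's numbers.

```python
# Per-resolution cost of render targets, the part of VRAM that actually
# scales with resolution. The 15-target pipeline and 4 bytes/pixel format
# are assumptions for illustration.

def render_target_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of one full-screen render target in MB."""
    return width * height * bytes_per_pixel / 1024**2

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    per_target = render_target_mb(w, h)
    total_gb = 15 * per_target / 1024
    print(f"{name}: {per_target:.1f} MB per target, ~{total_gb:.2f} GB for 15 targets")

# 1080p: ~7.9 MB each (~0.12 GB total); 4K: ~31.6 MB each (~0.46 GB total).
# The delta is a few hundred MB, which is why the measured 1080p-to-4K gap
# stays in the low single-digit GB range while textures dominate the budget.
```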
#17
Dimitriman
Not excited for these at all. Apart from the 4090, this has to be one of NVIDIA's worst generations of cards at launch.
#18
btk2k2
Why_Me: For all the people on here talking 1440p and 4K... these cards are geared for 1080p.
8 GB for a 1080p card in a sub-$300 price category would be fine.

8 GB on a $400+ card that, barring the VRAM limitation, would be a good 1440p card? Not so fine.

Same as 12 GB on the $800 4070 Ti. It won't be long before that starts to be a limit for a card that should be a top-end 1440p card and a pretty good 4K card. And even then, $800 for a top-end 1440p card and a pretty good 4K card is just too much, IMO.
#19
Vader
1080p is slowly being outpaced by GPUs; 1440p at medium settings with reasonable FPS on a "1080p" card is not unreasonable at all these days. For the premium price they'll be asking, at least that capability must be included.
#20
mathieu
evernessince: The difference in VRAM usage between 1080p and 4K is around 1 GB in most games. That's not going to make up for the lack of VRAM.

Mind you, it's kind of embarrassing that your $400-$500 graphics card in 2023 can barely run 1080p, when that's what cards like the 1060 and 480 were advertised for.
Nope, the difference is more like 2-3 GB. The result remains the same, however: that card should have at least 10 GB of VRAM.
#21
Denver
neatfeatguy: I've played RE4. Why do we need remakes, and then have them created to a point where most recent GPUs can't even handle them with the high-end textures?

I feel like these developers are doing something wrong if their games can't be run with high-res textures because of VRAM amounts. Like the whole "you need 11 GB of VRAM to run Far Cry 6's high-res textures," so the 3080 10 GB wasn't able to do it.

It feels like these developers are artificially limiting GPUs, almost like it's some kind of conspiracy with NVIDIA/AMD (maybe more so NVIDIA, since AMD cards tend to have more VRAM compared to equivalent NVIDIA GPUs) to help push the sale of newer or higher-end GPUs.
"Faster and poorly optimized" is cheaper than investing in optimizing your game properly, the incredibly powerful hardware we have now and tricks like FSR/DLSS etc are just more reasons not to invest in proper optimization.
#22
ValenOne
N/A: What's wrong with using normal textures at 1440p/144?
8 GB is the old-generation video memory configuration.
Denver"Faster and poorly optimized" is cheaper than investing in optimizing your game properly, the incredibly powerful hardware we have now and tricks like FSR/DLSS etc are just more reasons not to invest in proper optimization.
Larger video memory enables more diverse, high-resolution textures. 8 GB of VRAM is a last-gen video memory config.
TheinsanegamerN: Far easier answer: they're Japanese. Japanese devs have NEVER been able to figure out the whole "PC" thing. It's not like this was even the first Japanese game this year that came out and ran like absolute junk on powerful PCs.

Their games are made for console first, so they used all 16 GB of RAM they could and then couldn't figure out how to make it run on 8 GB video cards. The code is likely a total mess.

All this for a game that ran on the GameCube. LMAO.
Buy an RX 6800 16 GB instead of the VRAM-gimped 8 GB RTX 3070 / 3070 Ti. You're not a true PC master race when your video card is below the PS5 and XSX game consoles.
#23
bearClaw5
rvalencia: 8 GB is the old-generation video memory configuration. Larger video memory enables more diverse, high-resolution textures. 8 GB of VRAM is a last-gen video memory config. Buy an RX 6800 16 GB instead of the VRAM-gimped 8 GB RTX 3070 / 3070 Ti. You're not a true PC master race when your video card is below the PS5 and XSX game consoles.
I think you're forgetting that the consoles only have 16 GB total. Hypothetically, that 3070 system has at least 24 GB (8 GB of VRAM plus 16 GB of system RAM).
#24
ValenOne
bearClaw5: I think you're forgetting that the consoles only have 16 GB total. Hypothetically, that 3070 system has at least 24 GB (8 GB of VRAM plus 16 GB of system RAM).
You're forgetting that consoles don't have the PC's double memory copy issue. Refer to www.techpowerup.com/306713/directx-12-api-new-feature-set-introduces-gpu-upload-heaps-enables-simultaneous-access-to-vram-for-cpu-and-gpu

Shared GPU memory over 16 lanes of PCIe 4.0 is relatively slow, i.e. 32 GB/s per direction (or 64 GB/s full duplex), and system memory needs to serve both the CPU and GPU nodes.

For comparison, the Xbox 360's memory bandwidth is 22.4 GB/s (before the Xbox 360/DX10-era 3Dc+ texture compression). All DX12U PC GPUs support delta color compression (DCC).

Sustaining a full-duplex 64 GB/s over 16 lanes of PCIe 4.0 needs system memory bandwidth that can also deliver 64 GB/s, hence 128-bit DDR4-4000 or, at minimum, DDR5-4800 would be needed.

AMD's mobile "Phoenix Point" APU supports up to 128-bit LPDDR5-7600 with 121 GB/s. NVIDIA doesn't support PCIe 5.0.
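These figures are straightforward to sanity-check. A minimal sketch in Python (the 128b/130b line-encoding constant is standard for PCIe 3.0 and later; everything else is plain arithmetic on the numbers quoted above):

```python
# Sanity-checking the bandwidth figures quoted above.

def pcie_bw_gb_s(lanes: int, gt_per_s: float, encoding: float = 128 / 130) -> float:
    """Per-direction PCIe bandwidth in GB/s (128b/130b line encoding, gen 3+)."""
    return lanes * gt_per_s * encoding / 8

def ddr_bw_gb_s(mt_per_s: int, bus_width_bits: int) -> float:
    """Peak DDR bandwidth in GB/s: transfers per second times bytes per transfer."""
    return mt_per_s * (bus_width_bits // 8) / 1000

print(pcie_bw_gb_s(16, 16.0))   # PCIe 4.0 x16: ~31.5 GB/s per direction (~32 GB/s raw)
print(ddr_bw_gb_s(4000, 128))   # 128-bit DDR4-4000: 64.0 GB/s
print(ddr_bw_gb_s(4800, 128))   # 128-bit (dual-channel) DDR5-4800: 76.8 GB/s
print(ddr_bw_gb_s(7600, 128))   # 128-bit LPDDR5-7600: ~121.6 GB/s, matching the figure above
```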

Killzone Shadow Fall's CPU vs. GPU memory usage on PS4 illustrates the point: the game uses 4,736 MB in total, out of the 5.5 GB the PS4 makes available to games, and its GPU-to-CPU memory usage ratio is about 2:1.

Scaled to the 13.5 GB available for PS5 games, the VRAM allocation comes to roughly 9-10 GB, and the PS5 is the baseline config. A PC version of a PS5 game with a higher-resolution texture pack can blast past 8 GB and even 10 GB of VRAM.
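Putting that scaling argument in code - a back-of-the-envelope sketch, treating the 2:1 split as exact (an assumption; the real split varies per game):

```python
# Scaling Killzone Shadow Fall's PS4 memory split to the PS5's budget.
# Figures come from the post above; a strictly proportional split is assumed.

total_used_mb = 4736        # Killzone Shadow Fall's total usage on PS4
ps5_available_gb = 13.5     # memory available to PS5 games
gpu_share = 2 / 3           # ~2:1 GPU-to-CPU usage ratio

print(f"PS4 GPU share: ~{total_used_mb * gpu_share:.0f} MB")     # ~3157 MB
print(f"PS5 GPU share: ~{ps5_available_gb * gpu_share:.1f} GB")  # ~9.0 GB

# A strict 2:1 split already lands around 9 GB; add a higher-resolution PC
# texture pack on top and the working set clears 8 GB and approaches 10 GB+.
```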
#25
Denver
bearClaw5: I think you're forgetting that the consoles only have 16 GB total. Hypothetically, that 3070 system has at least 24 GB (8 GB of VRAM plus 16 GB of system RAM).
You're forgetting that consoles are fixed-spec systems without the huge range of component combinations that PCs have; consoles achieve a much higher level of optimization, especially in terms of memory consumption.