
NVIDIA GeForce RTX 4060 Ti 16 GB

Joined
Apr 13, 2023
Messages
38 (0.07/day)
I think the reason behind the madness is that Jensen knows there are a lot of people who will buy a card in a green box no matter how good or bad the value is. The reason behind all of nVidia's madness is profit, pure and simple.

It makes no sense to have the 4060 Ti have 16GB of VRAM but the 4070 and 4070 Ti, two cards that would be a lot more useful for AI, only have 12.


You're right, he does, but at the same time, he had no qualms about showing everyone the impossible position that the RTX 4060 Ti 16GB was forced into by nVidia. He showed that he has much bigger cojones than HUB because they somehow managed to "neglect" to do this comparison. There's no excuse for that because the comparison was immediately obvious to anyone with more than two brain cells to rub together and HUB isn't that stupid.

I'm sure that there's another reason why they didn't do that comparison. Based on Daniel Owen's results, we can be sure that the only ones who wouldn't be pleased with such a comparison would be nVidia. Now, I don't know for certain why HUB didn't do that ridiculously obvious comparison, but when one does the math...


What are you talking about? You weren't pushing me anywhere and I didn't justify a damn thing for AMD. I predicted a bloodbath and since nVidia is green, I ironically called it a "Romulan Bloodbath".

You're really starting to make me wonder about you. :kookoo:


That's not exactly what I meant. What I meant was that there's only a 20% performance difference between it and the RX 6800 XT. Where does that leave the RX 7800 XT? People are going to expect more than a 10% performance uplift between the 7800 and 7900 cards.

Even if the RX 7800 XT is 10% faster than the RX 6800 XT, they're going to have to price it pretty low (like $450-$500) if it's going to sell, because the market is currently saturated around the $500-$600 price point. There just isn't enough of a performance delta for the RX 7800 XT to have a proper place in the market because the RX 6950 XT still exists. AMD knew this all too well, which is why they released the RX 7600 first. They were hoping that the RX 6800 XT and RX 6950 XT would sell out before they were forced to release the RX 7800 XT, but that hasn't happened.

If it is 10% faster than the RX 6800 XT, then it would be compelling at $500 but not so much at $550, because it would be 10% more expensive for 10% more performance. That's not exactly something to get excited about, especially if the RX 6800 XT's price gets cut even further. At $450, it would be the de facto market leader and would be the most praised card of this generation regardless of what colour box it comes in.
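The value argument above is just perf-per-dollar arithmetic. A quick sketch, where the $500/$550/$450 price points and the +10% performance figure are the post's assumptions rather than confirmed specs:

```python
# Hypothetical price/performance comparison for the RX 7800 XT scenario above.
# Prices and the +10% uplift are assumptions from the post, not real specs.

def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Relative performance units per dollar spent."""
    return relative_perf / price

rx6800xt = perf_per_dollar(1.00, 500)       # RX 6800 XT at a $500 street price
rx7800xt_550 = perf_per_dollar(1.10, 550)   # hypothetical: 10% faster, $550
rx7800xt_450 = perf_per_dollar(1.10, 450)   # hypothetical: 10% faster, $450

# At $550 the value ratio is exactly 1.00x: 10% more money for 10% more perf.
print(f"At $550: {rx7800xt_550 / rx6800xt:.2f}x the value of an RX 6800 XT")
print(f"At $450: {rx7800xt_450 / rx6800xt:.2f}x the value of an RX 6800 XT")
```

Which is the whole point: at $550 the card would be a sidegrade in value terms, while at $450 it would be a clear ~22% value improvement.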

Of course, that would require foresight and AMD has demonstrated the exact opposite of that for this entire generation so far. :kookoo:

AMD will shoot themselves in the foot yet again. I'd like to say that it's stupidity but they're not stupid people. This whole situation was caused by an anti-consumer play by AMD that they failed to get away with and I'm really glad that they didn't. AMD is going to suffer pretty badly this generation and it's 100% their fault. I'm just going to sit back and laugh at them as they receive exactly what they deserve.

If the RX 7800 XT releases at $450, then I'll be the first to happily take back everything that I said. I would love to be wrong here but I wouldn't even bet a dime on it.
Joined
Nov 16, 2020
Messages
43 (0.03/day)
Processor i7-11700F, undervolt 3.6GHz 0.96V
Motherboard ASUS TUF GAMING B560-PLUS WIFI
Cooling Cooler Master Hyper 212 Black Edition, 1x12cm case FAN
Memory 2x16GB DDR4 3200MHz CL16, Kingston FURY (KF432C16BBK2/32)
Video Card(s) GeForce RTX 2060 SUPER, ASUS DUAL O8G EVO V2, 70%, +120MHz core
Storage Crucial MX500 250GB, Crucial MX500 500GB, Seagate Barracuda 2.5" 2TB
Display(s) DELL P2417H
Case Fractal Design Focus G Black
Power Supply 550W, 80+ Gold, SilentiumPC Supremo M2 SPC140 rev 1.2
Mouse E-BLUE Silenz
Keyboard Genius KB-110X
The 1060 6GB has about 200GB/s while the 4060ti is 162% faster on a mere 288GB/s.

It's definitely not going to be able to use all that core advantage, even with 2x the memory capacity of a 1060 filled up. The cache isn't large, so it will simply choke on complex scenes when more data is needed at once. You're right, it's a 1080p-only card. But even at 1080p there is more data: the VRAM capacity games require at 1080p versus 4K often differs by no more than 2GB. The 16GB will help in games that exceed 8GB, but it won't resolve the lack of bandwidth everywhere else.

The reason the 1060 6GB isn't penalized for its 6GB is that every game pegs the core at 100% anyway, and that card has relatively solid bandwidth to feed it. Comparatively, it has more real bandwidth than whatever the 4060ti gets from bandwidth plus its cache advantage. The same thing applies throughout Pascal's stack. The 1080 already sits at 320GB/s while being just 60%-ish faster than a 1060 6GB, and here's the kicker: if you OC a 1080's memory to 11Gbps (a +500 offset on the rev. 1 boards; every single one can do it), you gain real frames, consistently. So even that card still has core to spare for its given bandwidth. It will choke only when games exceed its framebuffer and require large amounts of new data on top, a situation that applies, for example, to TW Warhammer 3 at 1440p ultra/high and up, which will readily use north of 10GB of VRAM.
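One rough way to see the bandwidth argument is DRAM bandwidth per unit of shader throughput. The TFLOPS figures below are approximate boost-clock numbers from public spec sheets, so treat the exact ratios as illustrative:

```python
# Rough DRAM-bandwidth-per-compute comparison. TFLOPS are approximate
# boost-clock FP32 figures; bandwidth is the raw memory-bus number.
cards = {
    "GTX 1060 6GB": {"tflops": 4.4,  "bw_gbps": 192},
    "GTX 1080":     {"tflops": 8.9,  "bw_gbps": 320},
    "RTX 4060 Ti":  {"tflops": 22.1, "bw_gbps": 288},
}

for name, c in cards.items():
    ratio = c["bw_gbps"] / c["tflops"]  # GB/s of DRAM bandwidth per TFLOP
    print(f"{name}: {ratio:.1f} GB/s per TFLOP")
```

By this crude metric the 4060 Ti gets less than a third of the per-TFLOP DRAM bandwidth of the Pascal cards, which is the gap its 32MB of L2 is supposed to paper over.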

As for bright futures, I think they're both pretty weak offers; the 4060ti and the 4070 are both priced too high. With the 4060ti you spend $500 for what is virtually entry-level hardware.
Cache isn't large?
Here are the sizes of DXT5, a common compressed texture format, at various resolutions (full mip chain included):
8192×8192 - 85.3MB
4096×4096 - 21.3MB
2048×2048 - 5.3MB
1024×1024 - 1.3MB
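Those figures fall out of DXT5/BC3 storing every 4×4 texel block in 16 bytes (1 byte per texel), summed over all mip levels. A minimal sketch reproducing the list above:

```python
# DXT5/BC3 stores each 4x4 texel block in 16 bytes (1 byte per texel).
# Summing every mip level down to 1x1 reproduces the sizes listed above.

def dxt5_size_mb(dim: int) -> float:
    total = 0
    w = h = dim
    while True:
        blocks_w = max(1, (w + 3) // 4)    # each mip is at least 1 block wide
        blocks_h = max(1, (h + 3) // 4)
        total += blocks_w * blocks_h * 16  # bytes for this mip level
        if w == 1 and h == 1:
            break
        w, h = max(1, w // 2), max(1, h // 2)
    return total / (1024 * 1024)

for dim in (8192, 4096, 2048, 1024):
    print(f"{dim}x{dim}: {dxt5_size_mb(dim):.1f}MB")
```

Note how each halving of resolution cuts the size by ~4x: a full mip chain only adds about a third on top of the base level, which is why the high mips of distant objects are so cache-friendly.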

The cache sizes on gen-40 cards, and also on AMD's cards, greatly decrease the pressure on memory bandwidth (which is exactly why the bus width was lowered). That wasn't possible on older-gen cards, where the maximum L2 size was 6MB.
The most critical area of the view frustum is the set of objects near the camera (<~10m). At 1080p it almost never makes sense to use 4K and 8K textures (it also depends on how textures are mapped onto objects), because lower texture sizes (higher MIP levels) get sampled on those objects anyway. The rest of the rendered scene can easily be served from cache, because the higher MIP levels of a texture are stupidly small.
Every modern, well-optimized game needs 2 textures per material for full PBR rendering, so e.g. a pair of 2K textures (used on one object) consumes 2 × 5.3MB = 10.6MB of VRAM/cache.

In future games there will definitely be more objects in the view frustum, but the principles of what's needed to render those objects stay the same at 1080p with the same FOV. 1080p simply doesn't let you benefit from bigger textures (lower MIP levels), so the pressure on memory bandwidth will be only slightly higher (proportional to the number of objects). It will be more important to have the asset data stored in VRAM rather than in system RAM or on the SSD. Memory bandwidth will still be enough for 1080p; otherwise we would already be seeing big fps drops today, and that hasn't happened.
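The bandwidth-relief effect of a large L2 can be sketched with a simple hit-rate model: if a fraction of requests hit in cache, only the misses reach DRAM. The hit rates below are made-up illustrative values, not measured ones:

```python
# Toy model: if a fraction `hit_rate` of memory requests is served by L2,
# only (1 - hit_rate) of the traffic reaches DRAM, so the achievable
# request bandwidth is dram_gbps / (1 - hit_rate).
# Hit rates here are illustrative guesses, not measurements.

def effective_bandwidth(dram_gbps: float, hit_rate: float) -> float:
    assert 0 <= hit_rate < 1
    return dram_gbps / (1 - hit_rate)

# RTX 4060 Ti: 288 GB/s of raw DRAM bandwidth behind a 32MB L2.
for hit in (0.0, 0.3, 0.5):
    eff = effective_bandwidth(288, hit)
    print(f"hit rate {hit:.0%}: {eff:.0f} GB/s effective")
```

Under this model a 50% hit rate would make the 128-bit bus behave like a 576 GB/s one, which is the bet nVidia made with Ada's big L2; the open question is how well real 1080p workloads actually hit in 32MB.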

Do you still think that frame gen + 16GB VRAM + 32MB L2 cache + a 128-bit bus on the 4060ti is somehow unbalanced for 1080p gaming? I think exactly the opposite, especially 5+ years from now.
 
Joined
Dec 10, 2022
Messages
484 (0.76/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
Well, here we have it. Of course, it's for the clicks (they're about to reach 1M subs), whether HWUB and all of us agree or disagree on anything. My observation is that HWUB will not call out nVidia directly, but doing comparisons like this gives them a free pass. At the end of the day, follow the money.

But they'll make jabs like this towards AMD over nothing, while we know how shitty nVidia is with their closed-source approach.

If the 7900XT is a joke, then what are the 12GB 4070 and 4070ti? They should be dog poop then, right?

Looking at TechSpot, how they rate AMD vs nVidia vs Intel, and the comments: they have deleted my comments for calling out how shitty both 4060ti models are.
It sure took them long enough, eh? ;)
 