
Early Leak Claims AMD Radeon RX 9060 XT Might Reach NVIDIA GeForce RTX 4070 Territory

Having more VRAM doesn't give you more performance; it only lets you display higher detail levels, provided you have the bandwidth and computational performance to go along with it. A theoretical "9050 XT" would be far too slow to make any real use of 12 GB in gaming, and would probably drop to ~10 FPS in workloads that actually require it. Meanwhile, faster and slightly more expensive VRAM would probably yield an extra ~3-5% performance across the board, which is far more valuable to the customer than extra VRAM that only comes into play in unrealistic use cases no buyer would ever run.

It does not need to use 12 GB to be of benefit. Once a game is using 8.5 GB you will see and feel the benefit of more VRAM, and titles at 1080p are frequently using over 8 GB already, with more on the horizon.
 
If the RX 9060 XT is Navi 44 based (according to the leaks, half of Navi 48: 64 ROPs / 128 TMUs / 2048 SPs / 128-bit memory bus), it will land around RTX 4060 Ti QHD performance at 3 GHz, maybe 1-2% faster; how much higher can they push the silicon for reference clocks?
If it's Navi 48 based, there is zero chance of a 128-bit bus with 2048 SPs; a potentially meaningful config would be 96 ROPs / 192 TMUs / 3072 SPs / 192-bit bus (12 GB), but that would deliver higher performance than a 7800 XT, for example.
He's saying between the 4060 Ti and 7700 XT, and that if they push the clocks it could reach near-4070 performance, lol. He doesn't know anything, even with all the contacts he has.
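For a rough sanity check, the rumored configs can be turned into peak FP32 throughput figures. The shader counts and clocks below are just the leaked numbers, not confirmed specs, and raw TFLOPS never map linearly onto gaming performance (RDNA's dual-issue capability muddies the comparison further):

```python
# Peak FP32 throughput from rumored specs (leaked, unconfirmed).
# FP32 TFLOPS = SPs * 2 ops/clock (one FMA counts as two FLOPs) * clock in GHz / 1000
def fp32_tflops(sps: int, clock_ghz: float) -> float:
    return sps * 2 * clock_ghz / 1000  # GFLOPS -> TFLOPS

rumored_configs = {
    "Navi 44 rumor (2048 SPs @ 3.0 GHz)":   (2048, 3.0),
    "cut Navi 48 (3072 SPs @ 3.0 GHz)":     (3072, 3.0),
}
for name, (sps, clk) in rumored_configs.items():
    print(f"{name}: {fp32_tflops(sps, clk):.1f} TFLOPS")
```

That puts the 2048 SP config at roughly 12.3 TFLOPS and the 3072 SP config at roughly 18.4 TFLOPS, which is why the two rumors land in such different performance tiers.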
 
It does not need to use 12 GB to be of benefit. Once a game is using 8.5 GB you will see and feel the benefit of more VRAM, and titles at 1080p are frequently using over 8 GB already, with more on the horizon.
Only as long as you are not confusing allocated memory with used memory.
GPUs support heavy compression, and many temporary buffers are filled and then emptied during the rendering of a single frame, so the memory actually in use fluctuates constantly but is generally far less than the allocated amount. The true measure of VRAM capacity is to find the point where performance plummets (which is obviously game dependent).

But the point remains: the detail level a "9050 XT" could render at around 60 FPS will not be constrained by 8 GB of VRAM.
 
Only as long as you are not confusing allocated memory with used memory.
GPUs support heavy compression, and many temporary buffers are filled and then emptied during the rendering of a single frame, so the memory actually in use fluctuates constantly but is generally far less than the allocated amount. The true measure of VRAM capacity is to find the point where performance plummets (which is obviously game dependent).

But the point remains: the detail level a "9050 XT" could render at around 60 FPS will not be constrained by 8 GB of VRAM.

In a game like Indiana Jones, the difference between a cut-down 8 GB part and a cut-down 12 GB part is going to be the difference between a 60+ FPS result and a result in the 30 FPS range.

Indiana Jones 8GB fails

In Diablo 4, 8 GB cards run lower texture quality to maintain the framerate.

D4 8GB Lower texture quality

There are other titles where 8 GB cards show lower performance or lower image quality, depending on how the engine handles the shortfall. On top of that, these cards use a PCIe x8 interface, which makes any spillover into system memory even more painful.
 