12GB Confirmed to be GeForce RTX 4070 Standard Memory Size in MSI and GIGABYTE Regulatory Filings

Nvidia will set the price at whatever they want, because unfortunately they know people will pay it.
Well, people aren't rushing to buy, it seems. Plenty of stock of the 4080 and 4070 Ti at the retailers I keep an eye on. Even the 4090, the king, is easily available for not far off MSRP. I guess we won't know the true state of things until Nvidia reports its next quarterly results. If sales are down, then maybe they'll rethink their strategy of meagre VRAM and high pricing.
 
Says who?

[Chart: average FPS at 2560x1440]
And this from Wizard on a GPU analysis for The Last of Us Part 1: "Not only rendering performance requirements are high, but VRAM is also challenging. Even at 1600x900 we measured allocation of 10.5 GB, which of course doesn't mean that every single frame touches all the memory. Still, cards with 8 GB do encounter a small FPS penalty even at 1080p. At 4K, the VRAM usage is 14 GB, which makes things challenging for RTX 3080 10 GB and RTX 4070 Ti 12 GB—both these cards drop well below their AMD counterparts at 4K."
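
On the allocation-vs-usage distinction that quote draws: monitoring tools report how much VRAM has been allocated, not how much of it each frame actually touches. A minimal Python sketch of reading that allocated figure with the pynvml bindings (assumes the package is installed; treating the GPU of interest as device index 0 is also an assumption):

Code:
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # assumed: the GPU of interest is device 0
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)    # values reported in bytes
print(f"total: {mem.total / 2**30:.1f} GiB")
print(f"used:  {mem.used / 2**30:.1f} GiB")     # allocation, not the per-frame working set
print(f"free:  {mem.free / 2**30:.1f} GiB")
pynvml.nvmlShutdown()

The "used" figure here is what reviews mean by allocation; a card can allocate 10.5 GB while any single frame only touches a fraction of it.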
 
And this from Wizard on a GPU analysis for The Last of Us Part 1: "Not only rendering performance requirements are high, but VRAM is also challenging. Even at 1600x900 we measured allocation of 10.5 GB, which of course doesn't mean that every single frame touches all the memory. Still, cards with 8 GB do encounter a small FPS penalty even at 1080p. At 4K, the VRAM usage is 14 GB, which makes things challenging for RTX 3080 10 GB and RTX 4070 Ti 12 GB—both these cards drop well below their AMD counterparts at 4K."
Tbh, neither is marketed as a 4K card. They can push 4K, but, of course, you're going to have to lower something to get there. Both those cards and their AMD counterparts are below the 60 fps threshold. Nvidia cards can usually use the DLSS3 escape hatch (lower VRAM requirements as a side effect), but with this title they don't get that option.

And then there's the remake of RE4, where these cards push 4K pretty comfortably.
 
And this from Wizard on a GPU analysis for The Last of Us Part 1: "Not only rendering performance requirements are high, but VRAM is also challenging. Even at 1600x900 we measured allocation of 10.5 GB, which of course doesn't mean that every single frame touches all the memory. Still, cards with 8 GB do encounter a small FPS penalty even at 1080p. At 4K, the VRAM usage is 14 GB, which makes things challenging for RTX 3080 10 GB and RTX 4070 Ti 12 GB—both these cards drop well below their AMD counterparts at 4K."
Because 'The Last of Us' is such a terrifically optimized game /sarcasm. And btw, who's going to purchase this card for gaming at 4K?
 
Tbh, neither is marketed as a 4k card.
The RTX 3080 was of course marketed as a 4K card; it was also marketed as a flagship. And the RTX 3090 was marketed as delivering "TITAN class performance", not as a normal gaming card product.
 
Interesting that the Nvidia fans only complain a game is "badly optimized" when it runs worse on an Nvidia GPU, thanks to Nvidia putting just enough VRAM on their cards that you're forced to replace them sooner.
And 12 GB on a 4070 isn't really enough, especially for a $600 GPU, when you can buy an RX 6950 XT for around $650.
 
Interesting that the Nvidia fans only complain a game is "badly optimized" when it runs worse on an Nvidia GPU, thanks to Nvidia putting just enough VRAM on their cards that you're forced to replace them sooner.
And 12 GB on a 4070 isn't really enough, especially for a $600 GPU, when you can buy an RX 6950 XT for around $650.
The game has issues, and anyone familiar with PC gaming would know that.
 
I don't think Nvidia listens to customer feedback in random forums.

And as they have said, AI will need tens of thousands of GPUs, so all you gamers can go play with your rain sticks.
Good. Then that will either 1. mean the death of new AAA PC games, or 2. mean game devs will have to really scratch their heads and optimize the shit out of every game they make or port to PC, so that users can game on old hardware like 8 GB VRAM cards.
 
Nah, Jensen said that this era is over, that Moore's Law is dead. And months later, Moore actually died. Coincidence?
Actually, if you use HBM like AMD did with the Fury X (circa 2015), you can hit TB/s-class bandwidth. It should be standard now instead of GDDR6, but the NVIDIA mafia squashed it. Remember the AMD "4K killer" claims, though: bandwidth isn't everything for high FPS.
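
For scale, peak memory bandwidth is just the effective data rate per pin times the bus width. A quick back-of-the-envelope check in Python, using the publicly listed specs for these cards (figures are illustrative, not a definitive benchmark):

Code:
def peak_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: Gbps per pin * bus width / 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

# R9 Fury X (2015): first-gen HBM, 1 Gbps per pin on a 4096-bit bus
print(peak_bandwidth_gbs(1.0, 4096))   # 512.0 GB/s
# RTX 4070 Ti (2023): 21 Gbps GDDR6X on a 192-bit bus
print(peak_bandwidth_gbs(21.0, 192))   # 504.0 GB/s

So a 2015 HBM card already matched the bandwidth of today's 192-bit GDDR6X parts, and current HBM2e/HBM3 stacks are well past 1 TB/s.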
 