Monday, April 3rd 2023
NVIDIA GeForce RTX 4070 has an Average Gaming Power Draw of 186 W
The latest leaked slide for the GeForce RTX 4070 confirms most of the specifications and reveals some previously unknown details, including a 186 W average gaming power draw. While the specification list does not mention the number of CUDA cores, it does confirm the card is based on the AD104 GPU with 36 MB of L2 cache, and comes with 12 GB of GDDR6X memory offering 504 GB/s of maximum memory bandwidth, which points to a 192-bit memory interface and 21 Gbps memory.
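As a quick sanity check on the slide's figures, the quoted bandwidth follows directly from the bus width and the per-pin data rate. The short sketch below is purely illustrative (not from the source); it just shows the arithmetic.

# Illustrative arithmetic: peak bandwidth = bus width in bytes * per-pin data rate
bus_width_bits = 192           # reported memory interface
data_rate_gbps = 21            # reported GDDR6X speed per pin (Gbps)
bandwidth_gb_s = (bus_width_bits / 8) * data_rate_gbps
print(bandwidth_gb_s)          # 504.0 GB/s, matching the slide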
The slide posted by Videocardz also compares the upcoming GeForce RTX 4070 with the previous-generation RTX 3070 Ti and RTX 3070, showing a significant increase in shader count, RT cores, and Tensor cores, not to mention that the RTX 4070 uses 3rd-gen RT cores and 4th-gen Tensor cores. It will also support DLSS 3 and include AV1 and H.264 NV encoders. The most interesting part of the slide is the power draw comparison, showing a TGP of 200 W, which is lower than on the RTX 3070 Ti and the RTX 3070. It also draws less power in average gaming, during video playback, and at idle. According to NVIDIA's own slide, the GeForce RTX 4070 has an average gaming draw of 186 W, a video playback draw of 16 W, and an idle power draw of 10 W. All of these are lower than on the RTX 3070 Ti and the RTX 3070.
The slide also pretty much confirms the previously reported $599 price tag, at least for some of the RTX 4070 graphics cards, as some custom models will definitely be priced significantly higher. So far, it appears that NVIDIA might not change the launch price, and, as reported earlier, $599 leaves plenty of room for custom RTX 4070 graphics cards without coming close to the least expensive RTX 4070 Ti graphics cards, which sell close to $800.
Source:
Videocardz
56 Comments on NVIDIA GeForce RTX 4070 has an Average Gaming Power Draw of 186 W
Also, the Series S has something like 8.5 GB available for games and targets 1080p, so all the sub-$400 GPUs should be fine for 1080p... although $400 for an 8 GB GPU is kinda sad.
www.newegg.com/asrock-radeon-rx-6800-rx6800-pgd-16go/p/N82E16814930048
I mean, my 4090 isn't CPU bottlenecked in Port Royal, and it'll still draw less than 400 W in some scenes. I don't think it ever hits 450 W in that test. Why? Because it's hitting the top of the voltage-frequency curve before it hits the power limit. That's a voltage limit. That's also how most games work. I can show you some screenshots at DSR 4K resolutions if you want?
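A rough way to see why a voltage-capped GPU can sit under its power limit: dynamic power scales roughly with switching activity, voltage squared, and clock. The sketch below is purely illustrative (the constant and activity factors are made-up numbers, not measured from any card); it only shows that at a fixed voltage/clock cap, a lighter workload draws less than the limit.

# Illustrative only: dynamic power roughly scales as activity * V^2 * f.
# The constant k and the activity factors are invented for this sketch.
def dynamic_power_w(activity, voltage_v, clock_mhz, k=0.135):
    return k * activity * (voltage_v ** 2) * clock_mhz

v_cap, f_cap = 1.05, 2790           # example voltage/clock cap at the top of the V/F curve
for activity in (0.7, 1.0):         # lighter vs heavier shader utilization
    print(round(dynamic_power_w(activity, v_cap, f_cap)))
# Lower activity -> draw stays well under a 450 W power limit even at the voltage cap.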
Igor's Lab's results are not because of a CPU bottleneck. A 4070 Ti isn't going to be CPU bottlenecked at 4K.
You don't seem to understand that a power limit is just a number chosen by Nvidia or AMD. The 4090 could've easily been a 200 W TDP card if Nvidia wanted to, but then it would've indeed always been power limited. With 450 W that's not the case, and the same might be true for 200 W on the 4070.
If the GPU is limited by voltage, then it's always going to be that way; it makes no sense to design a card that hits a voltage limit before the power limit is reached. You have this completely backwards: the reason the voltage-frequency curve exists in the first place is to regulate power and temperature. GPUs are specifically designed to hit their designated power targets, and it's the power and temperature limits which dictate how high the frequencies go, not the other way around.
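For what it's worth, both views can be written down the same way: a boost algorithm effectively runs at the lowest clock allowed by the voltage cap, the power limit, and the temperature limit, so which one "wins" depends on the workload. A minimal sketch of that limit selection (simplified, with invented numbers, not NVIDIA's actual algorithm):

# Simplified sketch of boost limit selection; all numbers are invented examples.
def boost_clock_mhz(vf_cap_mhz, power_draw_w, power_limit_w, temp_c, temp_limit_c):
    clock = vf_cap_mhz                        # start at the top of the V/F curve (voltage cap)
    if power_draw_w > power_limit_w:          # power limit binds -> scale the clock down
        clock *= power_limit_w / power_draw_w
    if temp_c > temp_limit_c:                 # temperature limit binds -> back off further
        clock *= 0.97
    return round(clock)

print(boost_clock_mhz(2790, 400, 450, 70, 83))   # voltage-limited case: stays at 2790
print(boost_clock_mhz(2790, 480, 450, 70, 83))   # power-limited case: clocks down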
I can assure you there's no CPU bottleneck happening here:
I can show you Spider-Man with RT at 8K at sub-40 fps not hitting 450 W if you want a personal example? I don't get why you keep denying reality when there are so many examples you can check. It's fine to think it makes no sense, but I think a CPU that hits TJmax before hitting the power limit or voltage limit makes even less sense, and yet that also exists. It's really weird how you act as if you know better than people who actually own the card and can test it.
Also, I'm pretty sure a voltage-frequency curve is mostly dependent on the process node in combination with the architecture.
This is my stock voltage curve, and once I hit 2790 MHz it won't go up no matter how little power I'm using, as the max voltage Nvidia allows at stock is 1050 mV.
I can guarantee you the 4070 will run at its designated power target all the time, in the same way the 4070 Ti does:
No, because it wouldn't prove anything; RT cores would be the bottleneck in that case, leaving the shaders underutilized. It's well known that RT workloads result in lower power consumption vs. pure rasterized workloads on Nvidia cards. Also, Spider-Man is known to be pretty heavy on the CPU under any circumstances, so it would be a bad choice anyway.