
ASUS GeForce RTX 4070 SUPER Dual OC Snapped—Goodbye 8-pin

btarunr

Editor & Senior Moderator
Here are some of the first pictures of the ASUS GeForce RTX 4070 SUPER Dual OC, the company's close-to-MSRP custom-design implementation of the upcoming RTX 4070 SUPER, which is expected to be announced on January 8, with reviews and retail availability a week later. The card very closely resembles the RTX 4070 Dual OC, but with one major difference: the single 8-pin PCIe power connector makes way for a 16-pin 12VHPWR. Considering that the ASUS Dual OC series tends to come with a nominal factory OC at power limits matching the NVIDIA reference design, this is the first sign that the RTX 4070 SUPER in general might have a typical graphics power (TGP) above what a single 8-pin could deliver, and so it's been given a 12VHPWR, just like every RTX 4070 Ti. The cards will include an NVIDIA-designed adapter that converts two 8-pin PCIe connectors to a 12VHPWR, with its signal pins set to tell the graphics card that it can deliver 300 W of continuous power.
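For background on how those signal pins work: 12VHPWR carries two sideband pins, SENSE0 and SENSE1, and the combination of grounded/open states tells the card how much power the supply side can provide. A minimal sketch of that mapping, going by my reading of the PCIe CEM 5.0 / ATX 3.0 tables (treat the exact wattage pairs as an assumption to verify against the spec):

```python
# Sketch of the 12VHPWR sideband encoding; wattage pairs are my reading of
# the PCIe CEM 5.0 / ATX 3.0 tables and should be verified against the spec.
# "GND" = pin grounded by the cable/adapter, "OPEN" = left unconnected.
# Values are (initial power at boot, maximum sustained power) in watts.
SENSE_TABLE = {
    ("GND",  "GND"):  (375, 600),
    ("GND",  "OPEN"): (300, 450),
    ("OPEN", "GND"):  (225, 300),  # what a 300 W-rated adapter advertises
    ("OPEN", "OPEN"): (100, 150),
}

def rated_power(sense1: str, sense0: str) -> tuple[int, int]:
    """Return (initial_w, max_sustained_w) for a SENSE1/SENSE0 combination."""
    return SENSE_TABLE[(sense1, sense0)]

# NVIDIA's dual 8-pin adapter would leave SENSE1 open and ground SENSE0,
# so the card sees a 300 W continuous budget.
print(rated_power("OPEN", "GND"))  # (225, 300)
```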

The GeForce RTX 4070 SUPER is based on the same AD104 silicon as the RTX 4070 and RTX 4070 Ti, with its ASIC code rumored to be "AD104-350." The SKU allegedly enables 56 out of 60 streaming multiprocessors (SM) present on the silicon, giving it 7,168 out of 7,680 CUDA cores. This is a big increase from the 5,888 CUDA cores (46 SM) that the vanilla RTX 4070 is configured with. The memory subsystem is expected to be unchanged from the RTX 4070 and RTX 4070 Ti—12 GB of 21 Gbps GDDR6X across a 192-bit memory interface—leaving NVIDIA with one other possible lever: the ROP count. While the RTX 4070 Ti has 80 ROPs, the RTX 4070 has 64; it remains to be seen how many the RTX 4070 SUPER gets. Its rumored TGP of 225 W is what's behind the switch to the 12VHPWR connector.
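The rumored shader count is easy to sanity-check, since every Ada Lovelace SM carries 128 FP32 CUDA cores, and the same napkin math gives the carried-over memory bandwidth (the SM count itself is the leaked figure, not a confirmed one):

```python
# Quick sanity check of the leaked RTX 4070 SUPER figures against fixed
# Ada Lovelace constants (128 FP32 CUDA cores per SM).
sm_count     = 56        # rumored, out of 60 on the full AD104
cores_per_sm = 128       # architectural constant for Ada
print(sm_count * cores_per_sm)             # 7168 -- matches the leaked core count

# Bandwidth if the memory subsystem really carries over unchanged:
data_rate_gbps = 21      # GDDR6X per-pin data rate
bus_width_bits = 192
print(data_rate_gbps * bus_width_bits / 8) # 504.0 GB/s, same as 4070 / 4070 Ti
```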



View at TechPowerUp Main Site | Source
 
And I assume Asus will price it at exactly 2x the original 4070... no wait, this is an Nvidia card :nutkick:
 
They just gave me one more reason to buy a 3090 Ti. xD
 
Napkin math says it should land around the 3080 Ti mark performance-wise. Not great, not terrible. A lot will depend on the pricing. If it just replaces the normal 4070 at $600 (or, let's be bold and say $550), then it will be a decent buy. If it goes any higher, like $700, then this will be another pointless release.
It is wild how the 4060, a card much maligned by the enthusiast community, is still arguably the best value in the entire 40-series lineup.
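For anyone who wants to reproduce the napkin math: a naive estimate scales the 4070 by core count and then discounts for the unchanged memory bandwidth. The 0.85 efficiency factor below is purely an assumption, not a benchmark:

```python
# Toy version of the "lands around the 3080 Ti" estimate. The 0.85 scaling
# efficiency is an assumption (extra cores on the same 504 GB/s of bandwidth
# rarely scale linearly), not a measured value.
cores_4070  = 5888
cores_4070s = 7168
raw_uplift = cores_4070s / cores_4070            # ~1.22x on paper
scaling_efficiency = 0.85                        # assumed, bandwidth-limited
est_uplift = 1 + (raw_uplift - 1) * scaling_efficiency
print(f"estimated gain over RTX 4070: ~{est_uplift - 1:.0%}")  # ~18%
```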
 
It is wild how the 4060, a card much maligned by the enthusiast community, is still arguably the best value in the entire 40-series lineup.
The 4070 makes more sense to me.

+ 12 GB (not a great amount overall, yet generous next to the 4060 series...)
+ $-per-FPS at 4K is near identical to the 4060's, and only a bit worse at 1440p/1080p (the 4070 even pulls ahead on $-per-FPS in the most demanding games; see the quick arithmetic below)
+ Full 16-lane PCIe interface
+ Actually capable of some ray tracing even with DLSS and FG disabled in some games

The only reasons to prefer the 4060 over the 4070 are budget and size/wattage restrictions. The former, however, feels more like an excuse.
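To make the $-per-FPS bullet concrete, the arithmetic is trivial; the prices and frame rates below are placeholders for illustration, not review data:

```python
# Illustrative $-per-frame math; prices and FPS below are placeholders,
# not benchmark results -- substitute numbers from actual reviews.
cards = {
    #  name:      (street price in $, avg FPS at 1440p -- hypothetical)
    "RTX 4060": (300, 60),
    "RTX 4070": (550, 95),
}
for name, (price, fps) in cards.items():
    print(f"{name}: ${price / fps:.2f} per frame")
```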

If it goes any higher, like $700, then this will be another pointless release.
Unlikely. $550 is just as unlikely. I'd expect $600-650. Reasonably overpriced, that is.
 
What's the point of switching to 12VHPWR?
Does 12VHPWR have any significant advantage? Like catching fire more quickly? I don't think I want fireworks in my PC.
Make a change when a change is needed or necessary; otherwise it's useless.
Well, 8-pin + 6-pin would do. I don't blame them if one 12VHPWR costs a lot less.
 
What's the point of switching to 12VHPWR?
Does 12VHPWR have any significant advantage? Like catching fire more quickly?
Make a change when a change is needed or necessary; otherwise it's useless. I don't think I want fireworks in my PC.
Well, 8-pin + 6-pin would do. I don't blame them if one 12VHPWR costs a lot less.
A single connector is easier to manage in builds and won't blow up your computer if demand is too high, as there is communication between the devices.
 
A single connector is easier to manage in builds and won't blow up your computer if demand is too high, as there is communication between the devices.
Says who? People have run dual 8/6-pin for ages with no problem, and PSUs readily support it at the wattages you need for said GPU. Blowing up your computer with high demand...?! Wha?

As for the 4070 S... it's just as DOA as the 4070 Ti, limited by VRAM, both bandwidth and capacity... pointless release.
 
Says who? People have run dual 8/6-pin for ages with no problem, and PSUs readily support it at the wattages you need for said GPU. Blowing up your computer with high demand...?! Wha?

As for the 4070 S... it's just as DOA as the 4070 Ti, limited by VRAM, both bandwidth and capacity... pointless release.
The only interesting 4070 is the Ti SUPER, as it has the 256-bit bus a 4070 should have had.

And yes, the nice thing about the new connector is components being able to communicate load. If you have a billion-watt Intel CPU, having a GPU that can request a pittance of the power budget is nice.
 
Pricing will be key on the SUPER cards. If they move the bar forward at all, mild success; if priced in line with the current price-to-perf of the 40 series, it's a bust. No way they price higher than the launch MSRPs of the cards they replace.
 
And yes, the nice thing about the new connector is components being able to communicate load.
It's not nice, it's called "unnecessarily complex."

"Old school" PSUs ran multi-GPU + multi-CPU or other devilish power-hog configs for ages without a hitch, and now Nvidia is introducing an "intelligent" cable that tells the GPU how much it can draw. How many things can go horribly wrong? All of them. The controller might bug out and misinterpret the readings, ultimately leading your GPU to (a lesser evil) underperform due to insufficient power or (a bigger evil) blow up due to excessive power. It might also die, ultimately bricking the PSU and, if everything goes south, the rest of the PC. Of course these are ridiculously rare occurrences, but this happens to "smart" TVs, fridges, automobiles etc., so why should PSUs be any different?

Reasonable PC builders have accounted for power spikes and over-limit excursions on all their components and bought suitably "overkill" PSUs for ages. An additional layer of "safety" only removes said safety in the long term. Classic 8-pin connectors are time-tested, approved, and not misdesigned; they are widely available and can withstand stupidly high wattage. With 16 AWG wiring, you could likely feed any existing mass-market GPU off a single 8-pin. And their main advantage is that they don't decide what's best for your components; they just do their job.

K.I.S.S.
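The "stupidly high wattage" point can be shown on paper: an 8-pin PCIe connector is specced at 150 W but has three 12 V pairs, and Mini-Fit Jr HCS terminals with 16 AWG wire are commonly rated around 8-9 A per contact. A rough sketch of the headroom on each connector (per-pin ratings are ballpark datasheet figures, not guarantees):

```python
# Rough electrical headroom of each connector. Per-contact current ratings
# are ballpark datasheet figures (Mini-Fit Jr HCS with 16 AWG is often
# quoted around 8-9 A; the 12VHPWR micro-fit contacts around 9.5 A).
V = 12.0

def connector_capacity(pairs: int, amps_per_pin: float) -> float:
    """Theoretical continuous wattage through all 12 V supply pins."""
    return pairs * amps_per_pin * V

pcie_8pin = connector_capacity(pairs=3, amps_per_pin=8.0)   # ~288 W vs 150 W spec
hpwr      = connector_capacity(pairs=6, amps_per_pin=9.5)   # ~684 W vs 600 W spec
print(f"8-pin:   ~{pcie_8pin:.0f} W capacity vs 150 W spec ({pcie_8pin / 150:.1f}x margin)")
print(f"12VHPWR: ~{hpwr:.0f} W capacity vs 600 W spec ({hpwr / 600:.1f}x margin)")
```

The result (roughly 1.9x margin on the 8-pin versus 1.1x on 12VHPWR at full spec) is the usual back-of-envelope argument for why the old connector tolerated abuse so well.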
 
New GPU launches mean Nvidia still cares about gaming :cool:
 
What's the point of switching to 12VHPWR?
Consolidation towards a single connector; that was the plan all along, with power levels from next to nothing up to 600 W. New PSUs have native 12VHPWR cables, bundled adapters have hopefully been figured out by now, and at this power level 12VHPWR should really not be a problem.
 
at this power level 12VHPWR should really not be a problem.
But it is a major problem. At this price and wattage range, most users already own an "obsolete" PSU or are aiming at past-gen PSUs, some of them being aftermarket buyers. That's why purchasing this GPU will inflict "collateral damage," as the buyer will unnecessarily buy adapters/cables or even a more expensive PSU. Of course the alternative, i.e. buying a different GPU, still exists, but that's far from ideal.

It's okay to sell only enthusiast-class wares with not-so-widespread interfaces. There is absolutely no reason other than monopolist ambitions to force this connector on other segments.
 
But it is a major problem. At this price and wattage range, most users already own an "obsolete" PSU or are aiming at past-gen PSUs, some of them being aftermarket buyers. That's why purchasing this GPU will inflict "collateral damage," as the buyer will unnecessarily buy adapters/cables or even a more expensive PSU. Of course the alternative, i.e. buying a different GPU, still exists, but that's far from ideal.

It's okay to sell only enthusiast-class wares with not-so-widespread interfaces. There is absolutely no reason other than monopolist ambitions to force this connector on other segments.

That's a lot of false information.

There is an adapter included with every GPU sold with a 12VHPWR connector.
 
There is an adapter included with every GPU sold with a 12VHPWR connector.
That's a daisy-chain element. It can't be considered a valid option: both fugly and potentially dangerous.
 
As potentially dangerous as your false information
Care to prove it's false? "My PC still hasn't blown up yet" is not an argument; bring something real.
 
You have to prove your false information first, pal.
Ah, okay, you're one of those who accuse first and presume guilt.

Better luck trolling next time.
 
Ah, okay, you're one of those who accuse first and presume guilt.

Better luck trolling next time.

1. You said people need to upgrade their PSU or buy an adapter, which is false.
2. You said the included adapter is unsafe, which is also false.
3. You said I was trolling, when you're the one handing out false information like it's normal. LOL
 
It's not nice, it's called "unnecessarily complex."

"Old school" PSUs ran multi-GPU + multi-CPU or other devilish power-hog configs for ages without a hitch, and now Nvidia is introducing an "intelligent" cable that tells the GPU how much it can draw. How many things can go horribly wrong? All of them. The controller might bug out and misinterpret the readings, ultimately leading your GPU to (a lesser evil) underperform due to insufficient power or (a bigger evil) blow up due to excessive power. It might also die, ultimately bricking the PSU and, if everything goes south, the rest of the PC. Of course these are ridiculously rare occurrences, but this happens to "smart" TVs, fridges, automobiles etc., so why should PSUs be any different?

Reasonable PC builders have accounted for power spikes and over-limit excursions on all their components and bought suitably "overkill" PSUs for ages. An additional layer of "safety" only removes said safety in the long term. Classic 8-pin connectors are time-tested, approved, and not misdesigned; they are widely available and can withstand stupidly high wattage. With 16 AWG wiring, you could likely feed any existing mass-market GPU off a single 8-pin. And their main advantage is that they don't decide what's best for your components; they just do their job.

K.I.S.S.
How many PC builders are reasonable?
Nvidia was the first mainstream component producer to bring it to market, but it's a standard the rest of the industry will soon follow.
 
How many PC builders are reasonable?
Not enough, unfortunately.
but it's a standard the rest of the industry will soon follow
We'll see. We're completely doomed if that's true, because this is one of those elements that HAVE NO RIGHT TO HAVE A BRAIN.
 