Tuesday, March 21st 2023
INNO3D RTX 4070 Box Picture Confirms 8-pin PCIe Power Connector
A picture of the INNO3D GeForce RTX 4070 box has leaked online, confirming earlier rumors that at least some RTX 4070 graphics cards will not come with the new 16-pin 12VHPWR connector, but rather the standard 8-pin PCIe one. This is in line with earlier reports that NVIDIA will likely offer two variants: a premium one and one sold closer to the MSRP.
Unfortunately, the box shot of the INNO3D RTX 4070 does not reveal much more, other than that the card will have a dual-slot, dual-fan cooler with detachable fan blades, and a heatsink with a copper base for the GPU, an aluminium base for the memory, and nickel-plated heatpipes. The RTX 4070, based on the AD104 GPU and packing 12 GB of GDDR6X memory on a 192-bit memory interface, is scheduled to launch on April 13th, with the first reviews showing up on April 12th.
Sources:
@9550pro Twitter, via Videocardz.com
24 Comments on INNO3D RTX 4070 Box Picture Confirms 8-pin PCIe Power Connector
2. What about the performance loss in a benchmark, like 3DMark Time Spy or Superposition? I am a bit skeptical (sorry, no offense) that a -45% power limit causes only a 5% 3D performance loss. Maybe your 3D applications only partially load the GPU? I am just curious about the clarification.
I did a heavy undervolt on all my recent GPUs (RX 470, RX 580, GTX 1060, GTX 1070), and at 50-65% of the power limit the performance loss is usually more than 10%, more like ~12-13%.
But all in all, there is no doubt that 3D performance-per-Watt efficiency increases greatly when you avoid the aggressive factory boosting that practically fries the GPU chip with heavy voltage and frequency boosts.
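Just to illustrate the arithmetic behind that trade-off, here is a minimal Python sketch of the perf/Watt math; the scores and wattages below are made-up placeholders, not measurements from any actual card:

# Hypothetical illustration: a power limit can cut raw performance
# yet still raise perf/Watt. All numbers are placeholders.

def perf_per_watt(score: float, watts: float) -> float:
    """Benchmark score divided by average board power draw."""
    return score / watts

stock = perf_per_watt(score=10_000, watts=200)   # stock power limit
limited = perf_per_watt(score=8_750, watts=120)  # ~60% power, ~12.5% perf loss

print(f"stock:   {stock:.1f} points/W")
print(f"limited: {limited:.1f} points/W")
print(f"efficiency gain: {limited / stock - 1:.0%}")

With those placeholder numbers, a ~12.5% performance loss at 60% power still works out to roughly a 46% perf/Watt improvement, which is the point being made above.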
"Limited" to 200W the 4070 will retain the vast majority of its performance vs. a 240 or 280W power limit.
en.m.wikipedia.org/wiki/Planned_obsolescence Nice token of euhhh... shitty product out of the box.
The 4090 sees a 5% hit at an 80% power limit. At 70% and below the hit starts increasing exponentially, and at 50% you are looking at a 23.5% loss.
The 4080's performance profile is identical at the same percentage-based power limits, and I'd assume that applies to the 4070 Ti as well.
55% is where you will see maximum efficiency before performance drops off a cliff, but it comes with a decent hit to performance. Essentially, your card will perform somewhere between a 3070 Ti and a 3080.
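Putting the quoted 4090 figures together as a back-of-the-envelope check (using only the percentages stated above; relative perf/Watt is normalized so the stock limit equals 1.0):

# Back-of-the-envelope using only the figures quoted above for the 4090:
# ~5% performance hit at an 80% power limit, ~23.5% at 50%.
quoted = {
    0.80: 0.95,    # 80% power -> ~95% of stock performance
    0.50: 0.765,   # 50% power -> ~76.5% of stock performance
}

for power, perf in quoted.items():
    efficiency = perf / power  # relative perf/Watt vs. stock
    print(f"{power:.0%} power limit: {perf:.1%} performance, {efficiency:.2f}x perf/Watt")

Even at 50% power the card comes out about 1.53x more efficient than stock, consistent with the claim that peak efficiency lands somewhere around the 55% mark before performance falls away.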
If someone looks at a 4070 Ti and thinks, geez, $800 for that is awesome, then good for them. I honestly just feel bad for anyone who needs a GPU, because at under $1,000 this is the best they can get, at least from Nvidia. A lot of people waited out the last couple of years stuck on Pascal/Turing just to basically get screwed by Nvidia.
Even though I prefer Nvidia GPUs, I would definitely buy a 7900 XTX over it, even with the slightly lesser feature set and AMD drivers, because at least it offers something new from a rasterized-performance perspective, if my budget were under $1k anyway. It is pretty salty. They must have packaged it with the tears of all the gamers who now have to spend $800+ on a 3060 Ti/3070 successor in disguise. :laugh:
I'm guessing you did your own research on the product and decided it was what was best for you, the same as I did with the 4090. Neither card is overly appealing given their prices, and this is likely the new norm, which is just kind of sad, honestly. Hopefully I'm wrong.
Seeing all the 4070 Ti/4080 cards just sitting on store shelves collecting dust does give me some hope. Same with the equally bad 7900 XT.
Considering that you have a 1080p 63 Hz monitor, it makes sense that you arrived at your 5% drop figure based on that game and monitor setup. You are essentially using only a small portion of that GPU's power.
It draws 220 W at stock settings.