
NVIDIA GeForce RTX 4070 and RTX 4070 Ti Detailed Specs Sheet Leaks

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,233 (7.55/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
It turns out that NVIDIA has not one, but two new GeForce RTX 40-series "Ada" SKUs on the anvil this January. One of these is the RTX 4070 Ti, which we know to be a rebranding of the RTX 4080 12 GB in the face of backlash that forced NVIDIA to "unlaunch" it. The other, as it turns out, is the RTX 4070, with an interesting set of specifications. Based on the same 4 nm AD104 silicon as the RTX 4070 Ti, the new RTX 4070 is significantly cut down. NVIDIA enabled 46 out of the 60 streaming multiprocessors (SMs) physically present on the silicon, which yields 5,888 CUDA cores—the same count as the previous-gen RTX 3070, and well short of the 7,680 that the maxed-out RTX 4070 Ti enjoys.
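The SM-to-core arithmetic checks out, since each Ada SM carries 128 FP32 CUDA cores. A minimal sanity check, assuming the leaked SM counts are accurate:

```python
# Each NVIDIA "Ada" SM packs 128 FP32 CUDA cores.
CORES_PER_SM = 128

for name, sms in [("RTX 4070", 46), ("RTX 4070 Ti", 60)]:
    print(f"{name}: {sms} SMs -> {sms * CORES_PER_SM:,} CUDA cores")
# RTX 4070: 46 SMs -> 5,888 CUDA cores
# RTX 4070 Ti: 60 SMs -> 7,680 CUDA cores
```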

The GeForce RTX 4070, besides 5,888 CUDA cores, gets 46 RT cores, 184 Tensor cores, 184 TMUs, and a reduced ROP count of 64, compared to the 80 of the RTX 4070 Ti. The memory configuration remains the same as the RTX 4070 Ti's: 12 GB of 21 Gbps GDDR6X memory across the chip's 192-bit memory interface, working out to 504 GB/s of memory bandwidth. An interesting aspect of this SKU is its board power, rated at 250 W, compared to the 285 W of the RTX 4070 Ti and the 220 W of its 8 nm predecessor, the RTX 3070.
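That 504 GB/s figure follows directly from the memory specs; a quick back-of-the-envelope check (the RTX 3070 line assumes its stock 14 Gbps GDDR6 on a 256-bit bus):

```python
# Memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8 bits per byte.
def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gbs(21, 192))  # 504.0 GB/s -> RTX 4070 / RTX 4070 Ti
print(bandwidth_gbs(14, 256))  # 448.0 GB/s -> previous-gen RTX 3070, for comparison
```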



View at TechPowerUp Main Site | Source
 
Joined
Apr 2, 2008
Messages
434 (0.07/day)
System Name -
Processor Ryzen 9 5900X
Motherboard MSI MEG X570
Cooling Arctic Liquid Freezer II 280 (4x140 push-pull)
Memory 32GB Patriot Steel DDR4 3733 (8GBx4)
Video Card(s) MSI RTX 4080 X-trio.
Storage Sabrent Rocket-Plus-G 2TB, Crucial P1 1TB, WD 1TB sata.
Display(s) LG Ultragear 34G750 nano-IPS 34" utrawide
Case Define R6
Audio Device(s) Xfi PCIe
Power Supply Fractal Design ION Gold 750W
Mouse Razer DeathAdder V2 Mini.
Keyboard Logitech K120
VR HMD Er no, pointless.
Software Windows 10 22H2
Benchmark Scores Timespy - 24522 | Crystalmark - 7100/6900 Seq. & 84/266 QD1 |
I'm not overly impressed. Apart from the extra L2 cache and some increased numbers on the fake 4080, the new GPUs seem to have lower specs. I guess the 3rd-party reviews will confirm or refute my viewpoint.
 
Joined
Jun 5, 2018
Messages
237 (0.10/day)
Wow, so we are relying on clock speed and arch improvements alone for the vanilla 4070, huh? How insulting this generation is to the Nvidia buyer! Unfortunately, Nvidia can still count on enablement from its die-hard crowd, who will totally buy into the "Layered on Top" strategy from Jensen.

It's very impressive the kind of price premium Ray Tracing alone can command nowadays; AMD probably recognizes this, which is why it didn't even try to compete with the 4090 in the $1,500+ market. What I don't understand is why they still won't catch up on RT, even though Intel managed to be on par with Ampere on its discrete GPU debut!
 
Joined
Jul 27, 2019
Messages
11 (0.01/day)
I'm not overly impressed. Apart from the extra L2 cache and some increased numbers on the fake 4080, the new GPUs seem to have lower specs. I guess the 3rd-party reviews will confirm or refute my viewpoint.

What increased numbers? The specs of the 4070 Ti are strictly identical to the 4080 12GB: same core count, same frequency, same bus, same TDP, etc.
 

bug

Joined
May 22, 2015
Messages
13,759 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
This is potentially good news. A quarter smaller than the previous gen, about the same specs, and a narrower memory bus: all the premises for something that's a little faster than Ampere at lower prices. I am now curious about how this plays out.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
The only hope is that AMD might want to save us :(

RTX 3070 - 5888 shaders, 256-bit, 392 sq. mm, $499
RTX 4070 - 5888 shaders, 192-bit, 295 sq. mm, $699-799
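Taking those rumored figures at face value, here's a rough sketch of the value regression (prices and die sizes as quoted above; the 4070 price is the midpoint of the rumored range):

```python
# Crude value comparison from the figures above (the RTX 4070 line is rumored).
cards = {
    "RTX 3070": {"shaders": 5888, "die_mm2": 392, "usd": 499},
    "RTX 4070": {"shaders": 5888, "die_mm2": 295, "usd": 749},  # midpoint of $699-799
}

for name, c in cards.items():
    print(f"{name}: {c['shaders'] / c['usd']:.1f} shaders/$, "
          f"{c['die_mm2'] / c['usd']:.2f} mm2 of die/$")
# RTX 3070: 11.8 shaders/$, 0.79 mm2 of die/$
# RTX 4070: 7.9 shaders/$, 0.39 mm2 of die/$
```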


NVIDIA GeForce RTX 4070 rumored to feature 5888 CUDA cores, 12GB memory and 250W TDP - VideoCardz.com

nvidia is digging itself ever deeper holes.

The shitshow by nvidia continues... now it's up to us to vote with our wallets...
 
Joined
May 11, 2018
Messages
1,254 (0.53/day)
Remember, you can't compare Ada MSRP with Ampere MSRP, because that one was fake and prices were much higher during the majority of that generation.

So don't forget to compare to scalped prices, and only imagination limits how expensive you want them to appear so Ada cards will look inviting!

Imagination!
 
Joined
Oct 6, 2021
Messages
1,605 (1.40/day)
What increased numbers? The specs of the 4070 Ti are strictly identical to the 4080 12GB: same core count, same frequency, same bus, same TDP, etc.
It's not based on the same chip; apparently it's the little brother in the family.
 
Joined
May 11, 2018
Messages
1,254 (0.53/day)
The RTX 4080 12GB was always meant to be AD104 silicon, the third one from the top. That's why it was so strange for Nvidia to even consider the "X080" label.

The Ampere generation had everything from the 3080 10 GB to the 3090 Ti on the same top silicon!
 
Joined
Oct 10, 2018
Messages
147 (0.07/day)
Don't forget the leaked benchmark results: the 4070 Ti will be on a par with the 3090. In the best case, the 4070 would give the same performance as the 3080, but the 4070's CUDA count is only about 77% of the 4070 Ti's, so don't expect the 4070 to match the 3080. The 4060 Ti and 4060 would be DOA. Probably, the 4060 Ti will have 10 GB of memory along with a 160-bit bus and a 20-30% performance uplift over the 3060 Ti. In the best case, the 4060 Ti will have 5,120 cores. The 4060 will have 8 GB or 10 GB depending on Nvidia; I expect a 40-50% performance uplift over the 3060 12GB.
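The arithmetic behind that caution, as a rough sketch: core count sets an optimistic ceiling, since performance rarely scales linearly with cores, and clock and cache differences are ignored here:

```python
# Upper-bound estimate of RTX 4070 performance relative to the RTX 4070 Ti,
# assuming equal clocks and perfect scaling with CUDA core count.
cuda_4070, cuda_4070_ti = 5888, 7680
ceiling = cuda_4070 / cuda_4070_ti
print(f"{ceiling:.1%}")  # 76.7%

# If the 4070 Ti really lands at RTX 3090 level, the 4070 tops out around
# three quarters of that, consistent with the "don't expect 3080 parity" call above.
```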




THIS GENERATION RELIES ONLY ON DLSS 3.0. But if FSR 3.0 supports all GPUs, Nvidia could easily be beaten by AMD.
 

bug

Joined
May 22, 2015
Messages
13,759 (3.96/day)
The only hope is that AMD might want to save us :(

RTX 3070 - 5888 shaders, 256-bit, 392 sq. mm, $499
RTX 4070 - 5888 shaders, 192-bit, 295 sq. mm, $699-799

NVIDIA GeForce RTX 4070 rumored to feature 5888 CUDA cores, 12GB memory and 250W TDP - VideoCardz.com

nvidia is digging itself ever deeper holes.

The shitshow by nvidia continues... now it's up to us to vote with our wallets...
That's clearly marked as a rumor and says nothing about price. Price is only discussed in the comments, by people who "guess" it.
Seems to be fact for you, though.
 
Joined
Oct 6, 2021
Messages
1,605 (1.40/day)
The RTX 4080 12GB was always meant to be AD104 silicon, the third one from the top. That's why it was so strange for Nvidia to even consider the "X080" label.

The Ampere generation had everything from the 3080 10 GB to the 3090 Ti on the same top silicon!
Wow... I just saw that Nvidia planned to launch the 4080 12GB with a mid-range chip, what a joke. lol
 
Joined
Jun 5, 2018
Messages
237 (0.10/day)
Remember, you can't compare Ada MSRP with Ampere MSRP, because that one was fake and prices were much higher during the majority of that generation.

So don't forget to compare to scalped prices, and only imagination limits how expensive you want them to appear so Ada cards will look inviting!

Imagination!
That is exactly what Nvidia wants you to believe, my friend! Do not fall for it. The last gen was a fluke because a sum of all fears came together. The market has always traded at or around MSRP, with only launch volumes being problematic.
 
Joined
May 11, 2018
Messages
1,254 (0.53/day)
Forgot to add the /sarcasm at the end.

Poe's Law states:

"Without a clear indication of the author's intent, it is difficult or impossible to tell the difference between an expression of sincere extremism and a parody of extremism."
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
That's clearly marked as a rumor and says nothing about price. Price is only discussed in the comments, by people who "guess" it.
Seems to be fact for you, though.

It doesn't say anything about the performance either. But if those 64 ROPs are indeed correct, don't expect it to be too fast. This screams DOA as things stand.

nvidia rebadged the whole lineup.

RTX 4090 is RTX 4080.
RTX 4080 is RTX 4070.
RTX 4070 is RTX 4060.

RTX 4090 Ti is RTX 4080 Ti.

RTX 3090 Ti -> nothing
RTX 3090 -> nothing
RTX 3080 -> RTX 4090
RTX 3070 -> RTX 4080
RTX 3060 -> RTX 4070
 
Joined
May 11, 2018
Messages
1,254 (0.53/day)
Wow... I just saw that Nvidia planned to launch the 4080 12GB with a mid-range chip, what a joke. lol

You could say they were planning to launch an RTX 4080 card with a LOW END chip, compared to the Ampere architecture, which used two different chips for almost the whole desktop range:

Top end:
  • GeForce RTX 3080 (GA102)
  • GeForce RTX 3080 12GB (GA102)
  • GeForce RTX 3080 Ti (GA102)
  • GeForce RTX 3090 (GA102)
  • GeForce RTX 3090 Ti (GA102)
Midrange:
  • GeForce RTX 3060 (GA106 or GA104)
  • GeForce RTX 3060 Ti (GA104 or GA103)
  • GeForce RTX 3070 (GA104)
  • GeForce RTX 3070 Ti (GA104)
Low end:
  • GeForce RTX 3050 (GA106 or GA107)
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
Midrange:
  • GeForce RTX 3060 (GA106 or GA104)
  • GeForce RTX 3060 Ti (GA104 or GA103)
  • GeForce RTX 3070 (GA104)
  • GeForce RTX 3070 Ti (GA104)
Low end:
  • GeForce RTX 3050 (GA106 or GA107)

The midrange is such a mess. It's almost as if you're playing the lottery and only luck decides how much performance you get.

RTX 3090 -> RTX 4090: 56% higher performance
RTX 3080 -> RTX 4080: 42% higher performance
RTX 3070 -> RTX 4070: 15% higher performance

It looks like nvidia now works only for the rich, while the average Joe will have no choice but to skip this generation.
 
Joined
Apr 6, 2021
Messages
1,131 (0.85/day)
Location
Bavaria ⌬ Germany
System Name ✨ Lenovo M700 [Tiny]
Cooling ⚠️ 78,08% N² ⌬ 20,95% O² ⌬ 0,93% Ar ⌬ 0,04% CO²
Audio Device(s) ◐◑ AKG K702 ⌬ FiiO E10K Olympus 2
Mouse ✌️ Corsair M65 RGB Elite [Black] ⌬ Endgame Gear MPC-890 Cordura
Keyboard ⌨ Turtle Beach Impact 500
Don't forget the leaked benchmark results: the 4070 Ti will be on a par with the 3090. In the best case, the 4070 would give the same performance as the 3080, but the 4070's CUDA count is only about 77% of the 4070 Ti's, so don't expect the 4070 to match the 3080. The 4060 Ti and 4060 would be DOA. Probably, the 4060 Ti will have 10 GB of memory along with a 160-bit bus and a 20-30% performance uplift over the 3060 Ti. In the best case, the 4060 Ti will have 5,120 cores. The 4060 will have 8 GB or 10 GB depending on Nvidia; I expect a 40-50% performance uplift over the 3060 12GB.

THIS GENERATION RELIES ONLY ON DLSS 3.0. But if FSR 3.0 supports all GPUs, Nvidia could easily be beaten by AMD.

Well, it just shows that everything from Nvidia below a 4090 will suck at 4K gaming. :cool: For "peasant" resolutions like 1080p & 1440p the 4080 (16GB) etc. is still fine, but at 4K it will choke.
(fake DLSS3 FPS not considered)
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.65/day)
Location
Ex-usa | slava the trolls
Well, it just shows that everything from Nvidia below a 4090 will suck at 4K gaming.

"Max settings" are the worst settings - you lose 50-80% performance compared to normal/adequate settings.
 

bug

Joined
May 22, 2015
Messages
13,759 (3.96/day)
Looks like selling Fake Frames (TM) is the new norm. :shadedshu:
It's not any more fake than AA. Nor is it any more mandatory ;)

It's basically the same story since the advent of 3D acceleration: silicon can never keep up with monitor resolutions (and, more recently, refresh rates) and game engines, so they will have to fake this and that. Does anyone still remember the "brilinear" filtering debacle? That one produced way worse results than DLSS (luckily for us it was superseded by AF).
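For anyone who missed that era: "brilinear" narrowed the blend window between adjacent mipmap levels, so most pixels got a cheaper single-level bilinear sample instead of a true two-level blend. A toy illustration of the idea, not any vendor's actual driver logic:

```python
import math

def trilinear_weight(lod: float) -> float:
    # Full trilinear: blend linearly between the two nearest mip levels.
    return lod - math.floor(lod)

def brilinear_weight(lod: float, band: float = 0.5) -> float:
    # "Brilinear": blend only inside a narrow band around the mip
    # transition; elsewhere snap to a single (bilinear) mip level.
    f = lod - math.floor(lod)
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if f <= lo:
        return 0.0           # pure bilinear from the lower mip
    if f >= hi:
        return 1.0           # pure bilinear from the upper mip
    return (f - lo) / band   # true blending only near the transition

for lod in (2.1, 2.4, 2.5, 2.6, 2.9):
    print(lod, round(trilinear_weight(lod), 2), round(brilinear_weight(lod), 2))
```

The narrower the band, the cheaper the filtering and the more visible the banding between mip levels, which is exactly the artifact people complained about.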
 
Joined
May 20, 2020
Messages
1,372 (0.83/day)
It's not any more fake than AA. Nor is it any more mandatory ;)

It's basically the same story since the advent of 3D acceleration: silicon can never keep up with monitor resolutions (and, more recently, refresh rates) and game engines, so they will have to fake this and that. Does anyone still remember the "brilinear" filtering debacle? That one produced way worse results than DLSS (luckily for us it was superseded by AF).
You're quite right, and yeah, I vividly remember brilinear, because with nvidia we still use it, on each and every nvidia card. They use it even when you turn on 16x (maximum) anisotropic filtering (between LOD areas) with the drivers set to "Quality", which is the factory default. Here's an example of today's nvidia brilinear (at least the tester shows exactly what goes on in-game, unlike AMD, where perfect trilinear is shown in the AF tester and a moiré mess in games).
 

Attachments

  • nvidia_brilinear_w_16af.PNG

bug

Joined
May 22, 2015
Messages
13,759 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
You're quite right, and yeah, I vividly remember brilinear, because with nvidia we still use it, on each and every nvidia card. They use it even when you turn on 16x (maximum) anisotropic filtering (between LOD areas) with the drivers set to "Quality", which is the factory default. Here's an example of today's nvidia brilinear (at least the tester shows exactly what goes on in-game, unlike AMD, where perfect trilinear is shown in the AF tester and a moiré mess in games).
At least in that screenshot the difference is minimal. Early iterations were horrible: instead of (near) circular patterns, we were seeing all sorts of starry ones.
 