Friday, December 9th 2022

NVIDIA GeForce RTX 4070 and RTX 4070 Ti Detailed Specs Sheet Leaks

It turns out that NVIDIA has not one, but two new GeForce RTX 40-series "Ada" SKUs on the anvil this January. One of these is the RTX 4070 Ti, which we know to be a rebranding of the RTX 4080 12 GB in the face of backlash that forced NVIDIA to "unlaunch" it. The other, as it turns out, is the RTX 4070, with an interesting set of specifications. Based on the same 4 nm AD104 silicon as the RTX 4070 Ti, the new RTX 4070 is significantly cut down. NVIDIA enabled 46 of the 60 streaming multiprocessors (SM) physically present on the silicon, yielding 5,888 CUDA cores, the same count as the previous-gen RTX 3070 and well short of the 7,680 that the maxed-out RTX 4070 Ti enjoys.
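For context, those CUDA core counts follow directly from the SM counts, since each "Ada" SM carries 128 FP32 CUDA cores. A minimal sketch of that arithmetic in Python (our illustration, not part of the leak):

# Each "Ada" streaming multiprocessor (SM) exposes 128 FP32 CUDA cores.
CORES_PER_SM = 128

def cuda_cores(enabled_sms: int) -> int:
    """CUDA core count implied by the number of enabled SMs."""
    return enabled_sms * CORES_PER_SM

print(cuda_cores(46))  # RTX 4070: 5888
print(cuda_cores(60))  # fully enabled AD104 (RTX 4070 Ti): 7680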

The GeForce RTX 4070, besides its 5,888 CUDA cores, gets 46 RT cores, 184 Tensor cores, 184 TMUs, and a reduced ROP count of 64, down from the 80 of the RTX 4070 Ti. The memory configuration remains the same as the RTX 4070 Ti: 12 GB of 21 Gbps GDDR6X memory across the chip's 192-bit memory interface, working out to 504 GB/s of memory bandwidth. An interesting aspect of this SKU is its board power, rated at 250 W, compared to the 285 W of the RTX 4070 Ti and the 220 W of its 8 nm predecessor, the RTX 3070.
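The 504 GB/s figure checks out: peak memory bandwidth is the per-pin data rate multiplied by the bus width, divided by eight to convert bits to bytes. A quick sketch of that calculation (again ours, for illustration):

def memory_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin rate (Gbps) * bus width (bits) / 8."""
    return data_rate_gbps * bus_width_bits / 8

print(memory_bandwidth_gb_s(21, 192))  # RTX 4070 / RTX 4070 Ti: 504.0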
Source: harukaze5719 (Twitter)

93 Comments on NVIDIA GeForce RTX 4070 and RTX 4070 Ti Detailed Specs Sheet Leaks

#1
b1k3rdude
I'm not overly impressed. Apart from the extra L2 cache and some increased numbers on the fake 4080, the new GPUs seem to have lower specs. I guess the third-party reviews will confirm or refute my viewpoint.
#2
Dimitriman
Wow, so we are relying on clock speed and arch improvements only for the vanilla 4070, huh? How insulting this generation is to the Nvidia buyer! Unfortunately, Nvidia can still count on enablement from its die-hard crowd, who will totally buy into the "Layered on Top" strategy from Jensen.

It's very impressive the kind of price premium Ray Tracing alone can command nowadays; AMD probably recognizes this, which is why it didn't even try to compete with the 4090 in the $1,500+ market. What I don't understand is why they still won't catch up on RT, even though Intel managed to be on par with Ampere on its discrete GPU debut!
#3
Eykxas
b1k3rdudeI'm not overly impressed. Apart from the extra L2 cache and some increased numbers on the fake 4080, the new GPUs seem to have lower specs. I guess the third-party reviews will confirm or refute my viewpoint.
What increased numbers? The specs of the 4070 Ti are strictly identical to the 4080 12 GB: same core count, same frequency, same bus, same TDP, etc.
#4
bug
This is potentially good news. A quarter smaller than the previous gen, about the same specs, and a narrower memory bus: all the premises for something that's a little faster than Ampere at lower prices. I am now curious how this plays out.
#7
Bwaze
Remember, you can't compare Ada MSRP with Ampere MSRP, because that one was fake and prices were much higher during the majority of that generation.

So do remember to compare to scalped prices, and only imagination can limit you on how expensive you want them to appear, so the Ada cards will look inviting!

Imagination!
#8
Denver
EykxasWhat increased numbers? The specs of the 4070 Ti are strictly identical to the 4080 12 GB: same core count, same frequency, same bus, same TDP, etc.
It's not based on the same chip, apparently it's the little brother in the family.
#9
Bwaze
The RTX 4080 12GB was always meant to be AD104 silicon, the third one from the top. That's why it was so strange for Nvidia to even consider the "X080" label.

The Ampere generation had everything from the 3080 10 GB to the 3090 Ti on the same top silicon!
#10
Ibotibo01
Don't forget these results: the 4070 Ti will be on a par with the 3090. In the best case, the 4070 would give the same performance as the 3080. The 4070's CUDA count is equal to about 77% of the 4070 Ti's. Thus, don't expect the 4070 to be the same as the 3080. The 4060 Ti and 4060 would be DOA. Probably, the 4060 Ti will have 10 GB of memory along with a 160-bit bus and a 20-30% performance uplift over the 3060 Ti. In the best case, the 4060 Ti will have 5,120 cores. The 4060 will have 8 GB or 10 GB, depending on Nvidia; I expect a 40-50% performance uplift over the 3060 12 GB.

THIS GENERATION RELIES ONLY ON DLSS 3.0. But if FSR 3.0 supports all GPUs, Nvidia could easily be beaten by AMD.
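For what it's worth, the core-count ratio behind that expectation is easy to check against the leaked figures (a quick sketch in Python):

# Leaked CUDA core counts from the article above.
rtx_4070_cores = 5_888
rtx_4070_ti_cores = 7_680
print(f"{rtx_4070_cores / rtx_4070_ti_cores:.1%}")  # 76.7%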
#11
bug
ARFThe only hope is that AMD could wish to save us :(

RTX 3070 - 5888 shaders, 256-bit, 392 sq. mm, $499
RTX 4070 - 5888 shaders, 192-bit, 295 sq. mm, $699-799

NVIDIA GeForce RTX 4070 rumored to feature 5888 CUDA cores, 12GB memory and 250W TDP - VideoCardz.com

nvidia is digging new, deeper holes.

The shitshow by nvidia continues... now it's up to us to vote with our wallets...
That's clearly marked as a rumor and says nothing about price. Price is only discussed in the comments by people who "guess" it.
Seems to be fact for you, though.
#12
Denver
BwazeThe RTX 4080 12GB was always meant to be AD104 silicon, the third one from the top. That's why it was so strange for Nvidia to even consider the "X080" label.

The Ampere generation had everything from the 3080 10 GB to the 3090 Ti on the same top silicon!
Wow... I just saw that Nvidia planned to launch the 4080 12 GB with a mid-range chip, what a joke. lol
#13
Dimitriman
BwazeRemember, you can't compare Ada MSRP with Ampere MSRP, because that one was fake and prices were much higher during the majority of that generation.

So do remember to compare to scalped prices, and only imagination can limit you on how expensive you want them to appear, so the Ada cards will look inviting!

Imagination!
That is exactly what Nvidia wants you to believe, my friend! Do not fall for it. The last gen was a fluke, the sum of all fears coming together. The market has always traded at or around MSRP, with only launch volumes being problematic.
#14
Bwaze
Forgot to add the /sarcasm at the end.

Poe's Law states:

"Without a clear indication of the author's intent, it is difficult or impossible to tell the difference between an expression of sincere extremism and a parody of extremism."
#15
ARF
bugThat's clearly marked as a rumor and says nothing about price. Price is only discussed in the comments by people who "guess" it.
Seems to be fact for you, though.
It doesn't say anything about the performance either. But if those 64 ROPs are indeed correct, don't expect it to be too fast. This screams DOA as it is now.

nvidia rebadged the whole lineup.

RTX 4090 is RTX 4080.
RTX 4080 is RTX 4070.
RTX 4070 is RTX 4060.

RTX 4090 Ti is RTX 4080 Ti.

RTX 3090 Ti -> nothing
RTX 3090 -> nothing
RTX 3080 -> RTX 4090
RTX 3070 -> RTX 4080
RTX 3060 -> RTX 4070
#16
Bwaze
DenverWow... I just saw that Nvidia planned to launch the 4080 12 GB with a mid-range chip, what a joke. lol
You could say they were planning to launch an RTX 4080 card with a LOW END chip, compared to the Ampere architecture, which used two different chips for almost the whole desktop range:

Top end:
  • GeForce RTX 3080 (GA102)
  • GeForce RTX 3080 12GB (GA102)
  • GeForce RTX 3080 Ti (GA102)
  • GeForce RTX 3090 (GA102)
  • GeForce RTX 3090 Ti (GA102)
Midrange:
  • GeForce RTX 3060 (GA106 or GA104)
  • GeForce RTX 3060 Ti (GA104 or GA103)
  • GeForce RTX 3070 (GA104)
  • GeForce RTX 3070 Ti (GA104)
Low end:
  • GeForce RTX 3050 (GA106 or GA107)
#17
ARF
BwazeMidrange:
  • GeForce RTX 3060 (GA106 or GA104)
  • GeForce RTX 3060 Ti (GA104 or GA103)
  • GeForce RTX 3070 (GA104)
  • GeForce RTX 3070 Ti (GA104)
Low end:
  • GeForce RTX 3050 (GA106 or GA107)
The midrange is such a mess. It's almost as if you are playing a lottery and only luck decides how much performance you get.

RTX 3090 -> RTX 4090: 56% higher performance
RTX 3080 -> RTX 4080: 42% higher performance
RTX 3070 -> RTX 4070: 15% higher performance

It looks like nvidia now works only for the rich, while the average Joe will have no choice but to avoid this generation.
#18
MarsM4N
Ibotibo01Don't forget these results: the 4070 Ti will be on a par with the 3090. In the best case, the 4070 would give the same performance as the 3080. The 4070's CUDA count is equal to about 77% of the 4070 Ti's. Thus, don't expect the 4070 to be the same as the 3080. The 4060 Ti and 4060 would be DOA. Probably, the 4060 Ti will have 10 GB of memory along with a 160-bit bus and a 20-30% performance uplift over the 3060 Ti. In the best case, the 4060 Ti will have 5,120 cores. The 4060 will have 8 GB or 10 GB, depending on Nvidia; I expect a 40-50% performance uplift over the 3060 12 GB.

THIS GENERATION RELIES ONLY ON DLSS 3.0. But if FSR 3.0 supports all GPUs, Nvidia could easily be beaten by AMD.
Well, it just shows that everything from Nvidia lower than the 4090 will suck at 4K gaming. :cool: For "peasant" resolutions like 1080p and 1440p, the 4080 (16 GB) etc. is still fine, but at 4K it will choke.
(fake DLSS 3 FPS not considered)
#19
ARF
MarsM4NWell, it just shows that everything from Nvidia lower than the 4090 will suck at 4K gaming.
"Max settings" are the worst settings - you lose 50-80% of your performance compared to normal/adequate settings.
#20
pavle
Looks like selling Fake Frames (TM) is the new norm. :shadedshu:
#21
bug
pavleLooks like selling Fake Frames (TM) is the new norm. :shadedshu:
It's not any more fake than AA. Nor is it any more mandatory ;)

It's basically the same story since the advent of 3D acceleration: silicon can never keep up with monitor resolutions (and, more recently, refresh rates) and game engines, so they will have to fake this and that. Does anyone still remember the "brilinear" filtering debacle? That one produced far worse results than DLSS (luckily for us, it was superseded by AF).
#22
pavle
bugIt's not any more fake than AA. Nor is it any more mandatory ;)

It's basically the same story since the advent of 3D acceleration: silicon can never keep up with monitor resolutions (and, more recently, refresh rates) and game engines, so they will have to fake this and that. Does anyone still remember the "brilinear" filtering debacle? That one produced far worse results than DLSS (luckily for us, it was superseded by AF).
You're quite right, and yeah, I vividly remember brilinear, because with nvidia we still use it: each and every nvidia card applies it (between LOD areas) even when you turn on 16x (maximum) anisotropic filtering with the drivers set to "Quality", which is the factory default. Here's an example of today's nvidia brilinear (at least the tester shows exactly what goes on in-game, unlike AMD, where perfect trilinear is shown in the AF tester and a moiré mess in games).
#24
bug
pavleYou're quite right, and yeah, I vividly remember brilinear, because with nvidia we still use it: each and every nvidia card applies it (between LOD areas) even when you turn on 16x (maximum) anisotropic filtering with the drivers set to "Quality", which is the factory default. Here's an example of today's nvidia brilinear (at least the tester shows exactly what goes on in-game, unlike AMD, where perfect trilinear is shown in the AF tester and a moiré mess in games).
At least in that screenshot the difference is minimal. Early iterations were horrible: instead of (near) circular patterns, we saw all sorts of starry ones.
#25
pavle
Yeah, that's correct, bug. Flower-power generations of graphics cards. :)
You can get full trilinear either by turning trilinear optimizations off in the nvidia Control Panel or by choosing the "High Quality" rendering option. With nv Quadro cards you get true trilinear under OpenGL even with rendering set to "Quality".