Monday, April 10th 2023

Official NVIDIA RTX 4070 Performance Claims Leak Online

Just a few days before the official launch scheduled for April 13th, the first official performance figures for the RTX 4070 have found their way online. As expected, NVIDIA is showing performance with DLSS 3 and Frame Generation enabled, comparing the card to three RTX 30 series graphics cards: the RTX 3080, the RTX 3070 Ti, and the RTX 3070.

According to NVIDIA, the GeForce RTX 4070 targets 1440p performance above 100 FPS, with ray tracing and DLSS 3 enabled, of course. NVIDIA did not officially reveal any performance numbers without DLSS 3, but earlier rumors put the RTX 4070 at around the same performance as the RTX 3080. With DLSS 3, and according to NVIDIA, the RTX 4070 is 1.4x, 1.7x, and 1.8x faster than the aforementioned RTX 30 series graphics cards, respectively.
NVIDIA has a pretty decent list of games, with and without Frame Generation, showing the RTX 4070 pulling way ahead of the RTX 3070 Ti and RTX 2070 Super, as expected. NVIDIA was keen to note that the RTX 3080 launched at $699 and the RTX 3070 Ti at $599, the latter matching the expected MSRP of the RTX 4070.

As detailed earlier, the GeForce RTX 4070 should be around 15 percent slower than the RTX 4070 Ti in gaming, with the gap narrowing at higher resolutions. Of course, these are all numbers with DLSS 2/DLSS 3, so you should wait for reviews to get a better idea of the actual performance, performance per Watt, and performance per dollar. The first slide also confirms that the GeForce RTX 4070 will be available on April 13th at $599.
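Taken at face value, NVIDIA's claimed multipliers and the known MSRPs are enough to sketch a rough performance-per-dollar comparison. The sketch below is illustrative only: the 100 FPS baseline for the RTX 3080 is a made-up number, and only the 1.4x/1.7x multipliers and the prices come from the claims above.

```python
# Illustrative only: hypothetical FPS baseline combined with NVIDIA's
# claimed DLSS 3 multipliers and the MSRPs quoted in the article.
msrp = {"RTX 4070": 599, "RTX 3080": 699, "RTX 3070 Ti": 599}

# Assume the RTX 3080 renders 100 FPS in some 1440p title (hypothetical).
# NVIDIA claims the RTX 4070 is 1.4x the 3080 and 1.7x the 3070 Ti.
fps = {"RTX 3080": 100.0}
fps["RTX 4070"] = fps["RTX 3080"] * 1.4
fps["RTX 3070 Ti"] = fps["RTX 4070"] / 1.7

# Performance per dollar at launch MSRP, sorted best-first.
perf_per_dollar = {card: fps[card] / msrp[card] for card in msrp}
for card, value in sorted(perf_per_dollar.items(), key=lambda kv: -kv[1]):
    print(f"{card}: {value:.3f} FPS per dollar")
```

On these (first-party, Frame Generation-inflated) numbers the RTX 4070 comes out well ahead on value, which is exactly why independent reviews without DLSS 3 are worth waiting for.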
Source: Videocardz

113 Comments on Official NVIDIA RTX 4070 Performance Claims Leak Online

#51
Max(IT)
Marketing BS including DLSS 3 makes little sense, but considering the specs, the 4070 will be very close to the 3080, maybe faster at 1080p and slightly slower at 4K.
rv8000Not directed at you, but that line is a blanket excuse for being an ill-informed buyer. On hand I have a 2070 Super, a 3080 12 GB, and a 7900 XTX between the three rigs at home; the 3080 has had more driver-related crashes and issues in the span of a week than my 7900 XTX has had since release.

While for some the initial change between Nvidia and AMD drivers may be jarring, modern UI aside, AMD offers more and better control through their driver than Nvidia currently does from a feature standpoint.

I can't say for sure, but I like to blame tech tubers for a lot of this. They burn through so much free hardware pumping out reviews and clickbait videos that, more often than not, the discrepancies across reviews that other reviewers point out are likely software/setup configuration issues, sometimes not even related to the hardware they're testing. With so little time before moving on to the next review, it gets passed off as "yeah, it's definitely their (AMD's/Nvidia's/whoever's) problem" and not something they did. So everyone draws conclusions from half-baked data and clickbait snide bullet points, then chooses to argue with whichever "influencer's" data suits their argument.

Write me off as a fanboy, but both Nvidia and AMD have had major issues in the past, which is entirely different from now. With the exception of some buggy game releases, both my Nvidia and AMD rigs have been exceptionally stable. I laugh every time someone uses the driver excuse, though. It may be my opinion, but AMD drivers are objectively better.

TLDR

Good hardware, bad BAD prices, especially Nvidia.
AMD drivers better… Is this some kind of a joke?
Posted on Reply
#52
oxrufiioxo
Max(IT)Marketing BS including DLSS 3 makes little sense, but considering the specs, the 4070 will be very close to the 3080, maybe faster at 1080p and slightly slower at 4K.


AMD drivers better… Is this some kind of a joke?
Yeah, DLSS 3 comparisons are only semi-useful when comparing 40 series vs. 40 series, but even then it's not a feature everyone will use; it's even less of a selling point than DLSS.

Idk, the last time I had issues with AMD drivers was with the 7970, and only in CrossFire. I even purchased a 5700 XT to try and replicate the issues people were having and couldn't, but that was a real issue and something AMD should have been faster to fix. Ignoring the 6000 series to try and fix the 7000 series, and (going by Hardware Unboxed) not really improving anything overall, was stupid though, and just gave Nvidia fanboys more ammunition against them.
Posted on Reply
#53
Max(IT)
P4-630Nice, I do 1440p gaming, but I'm not in a hurry. I'll wait for Blackwell and its 2x~2.6x the performance...

Sure. And the next one in 2027 will be even better…
Posted on Reply
#54
oxrufiioxo
Max(IT)Sure. And the next one in 2027 will be even better…
You see this every generation. Pascal sucks, I'm going to buy the next generation cards... Turing sucks, I'm going to buy the next generation cards... Ampere sucks, but I can't buy it anyway, so I'm going to buy the next generation cards... Ada sucks, prices are too high... and it never ends.
Posted on Reply
#55
Why_Me
rv8000You're conveniently ignoring that most 3000 series cards have been offered at massively discounted prices while available alongside the 4080/4070 Ti for months; at this point stock looks to have dried up. People had plenty of time to purchase and make a value comparison at those points.

Almost every AIB model except a handful is also around the $900 mark for the 4070 Ti, and it will be the same for the 4070. So you get 3080 and 3080 Ti performance for almost exactly the same price and/or performance per dollar.


I also specifically said price to performance, and said nothing about MSRP, as that is pretty irrelevant in most cases. Nvidia did nothing to move that needle; they shifted cards down yet another tier. Inflation, cost of materials, yada yada: a midrange card (60/70 series) could be had anywhere from $250-400 just a few years ago. Now we get a 4060/4070 at 1.5-2x the cost, no value increase whatsoever, and people come here and defend Nvidia on the matter?
pcpartpicker.com/search/?q=RTX+4070+Ti
RTX 4070 Ti 12GB: $799.99, $814.99, $829.99, $839.99
Posted on Reply
#56
Max(IT)
oxrufiioxoYou see this every generation. Pascal sucks, I'm going to buy the next generation cards... Turing sucks, I'm going to buy the next generation cards... Ampere sucks, but I can't buy it anyway, so I'm going to buy the next generation cards... Ada sucks, prices are too high... and it never ends.
The only constant is: no one wants an AMD card. It doesn't matter how badly priced and low-specced Nvidia cards are; Radeons are still irrelevant.
Posted on Reply
#57
P4-630
Max(IT)The only constant is: no one wants an AMD card. It doesn't matter how badly priced and low-specced Nvidia cards are; Radeons are still irrelevant.
I used to buy ATi only in the past, but times have changed.
Also, I have a G-Sync-only monitor.

BTW, there are many people here who own an AMD GPU/CPU...
Posted on Reply
#58
oxrufiioxo
Max(IT)The only constant is: no one wants an AMD card. It doesn't matter how badly priced and low-specced Nvidia cards are; Radeons are still irrelevant.
Eh, a decent number of people on this forum own them, and if I only had 900 bucks in my pocket at Microcenter and needed a new GPU, it would not be a 4070 Ti.
Posted on Reply
#59
rv8000
Why_Mepcpartpicker.com/search/?q=RTX+4070+Ti
RTX 4070 Ti 12GB: $799.99, $814.99, $829.99, $839.99
Yes, that is exactly what I said; not sure what you're trying to prove. There are a "handful" around MSRP; the majority are not. Thanks for proving my point.
Posted on Reply
#60
Why_Me
rv8000Yes, that is exactly what I said; not sure what you're trying to prove. There are a "handful" around MSRP; the majority are not. Thanks for proving my point.
There's plenty at and around MSRP. Two cards by two different manufacturers are at MSRP, and three more are slightly above it. If someone is ignorant enough to blow $900 on a 4070 Ti, then that's on them.
Posted on Reply
#61
rv8000
Why_MeThere's plenty at and around MSRP. Two cards by two different manufacturers are at MSRP, and three more are slightly above it. If someone is ignorant enough to blow $900 on a 4070 Ti, then that's on them.
Unless you have some burning desire to play with ray tracing enabled (butchering FPS and degrading visual quality with DLSS and/or FSR for a good experience in maybe 10 games where it's worthwhile), I'd say the ignorant person is the one buying a 4070 Ti at all, so keep defending it all you want. The 4070 Ti and 7900 XT are terrible choices for value and performance.
Posted on Reply
#62
Why_Me
rv8000Unless you have some burning desire to play with ray tracing enabled (butchering FPS and degrading visual quality with DLSS and/or FSR for a good experience in maybe 10 games where it's worthwhile), I'd say the ignorant person is the one buying a 4070 Ti at all, so keep defending it all you want. The 4070 Ti and 7900 XT are terrible choices for value and performance.
Did W1zzard use ray tracing in this benchmark?

Posted on Reply
#63
tvshacker
So let me get this straight, WITHOUT DLSS 3 @ 1440p:

3070 > 3070 Ti > 3080/4070 > 3080 Ti > 3090 > 3090 Ti/4070 Ti > 4080
That is a massive gap... And for NVIDIA to admit that the 4070 = 3080, it must be in best-case/cherry-picked scenarios.
oxrufiioxoif I only had 900 bucks in my pocket at Microcenter and needed a new GPU, it would not be a 4070 Ti
This one made me think back... I believed the 4070 Ti was a great deal when it came out: 3090/3090 Ti performance for 900€. Now, after seeing what happened with the 8 GB cards in RE 4 Remake and Hogwarts Legacy, my mind has changed...
Posted on Reply
#64
Bwaze
Official Benchmarks For NVIDIA RTX 4070 Leak Online – Matches RTX 3080 Performance Without Frame Generation!

"Considering these are first party benchmarks, a grain of salt never hurt anyone, but they are incredibly exciting as NVIDIA is stating that the upcoming RTX 4070 will be able to match the NVIDIA RTX 3080 GPU in DLSS performance without Frame Generation. This is a huge deal because Frame Generation is quite the controversial technology and so-called 'fake frames' have divided gamers. With the RTX 4070 however, even without frame generation, you are looking at RTX 3080 performance levels with standard DLSS."

"RTX 4070 appears to have a far greater value proposition once you tie in DLSS 3.0 and/or Frame Generation. Without Frame Generation and just good old DLSS, it performs more or less identical to an RTX 3080. If you include Frame Generation however, it suddenly performs up to 40% faster than an RTX 3080 or 80% faster than an RTX 3070 - which is what generational upgrades should always be like."

But then:

"If you are someone that believes in raster performance only (although in today's era, I would add that it is a highly obsolete metric), then the RTX 4070 performs 15% faster than an RTX 3070."

:-P

So, how much does Nvidia pay the reviewers and news sites to write such exciting articles about subpar performance?
Posted on Reply
#65
Max(IT)
P4-630I used to buy ATi only in the past, but times have changed.
Also, I have a G-Sync-only monitor.

BTW, there are many people here who own an AMD GPU/CPU...
"Many people HERE" doesn't mean much.
AMD's graphics card market share dropped below 6% worldwide.
Having a very vocal fan base doesn't change things.
oxrufiioxoEh, a decent number of people on this forum own them, and if I only had 900 bucks in my pocket at Microcenter and needed a new GPU, it would not be a 4070 Ti
Again, as I said above, having a very vocal fan base doesn't change the facts about market share.
Radeons are cheaper for a single reason: no one wants one.
Lisa Su already proved to the market that the moment they gain market share, they immediately drop the "nice price" policy.

BTW, I don't care which card you (or I) prefer. I don't have any brand loyalty.
I have installed dozens of Radeon cards in the last 4 years, and on a good chunk of them (I would say around 40%) I had to fix some issue sooner or later. Nvidia cards/drivers weren't perfect either, but the percentage with issues was closer to 10%.
This just means one thing for me: angry customers and more workload.

Yes, I know: when you enter a forum like this, AMD supporters are very vocal in saying "never had a problem with my Radeon". Happy for them. My experience is vastly different.
Posted on Reply
#66
Tek-Check
The 4070 and 4070 Ti have VRAM crippled to just 12 GB instead of 16 GB, so in two years this will become the same problem the 3070 and 3070 Ti with 8 GB have now. Even ray tracing performance will be crippled in an increasing number of games.

In the most recent video by Hardware Unboxed, Steve clearly shows how Nvidia's planned-obsolescence VRAM strategy works with the 3070 and 3070 Ti. More and more games are a stuttering mess with 8 GB of VRAM at both 1080p and 1440p, and RT performance is dismal, to the point that RT on a 6800 is better because it has 16 GB of VRAM. The same will happen with the 4070 and 4070 Ti in two years. Nvidia is literally forcing people to upgrade gen-to-gen through its small VRAM offerings. Well done to them for being able to convince people that those cards are fine today.
Posted on Reply
#67
Max(IT)
Tek-CheckThe 4070 and 4070 Ti have VRAM crippled to just 12 GB instead of 16 GB, so in two years this will become the same problem the 3070 and 3070 Ti with 8 GB have now. Even ray tracing performance will be crippled in an increasing number of games.

In the most recent video by Hardware Unboxed, Steve clearly shows how Nvidia's planned-obsolescence VRAM strategy works with the 3070 and 3070 Ti. More and more games are a stuttering mess with 8 GB of VRAM at both 1080p and 1440p, and RT performance is dismal, to the point that RT on a 6800 is better because it has 16 GB of VRAM. The same will happen with the 4070 and 4070 Ti in two years. Nvidia is literally forcing people to upgrade gen-to-gen through its small VRAM offerings. Well done to them for being able to convince people that those cards are fine today.
That video is flawed in many ways… By the way, there is a huge difference between 8 GB and 12 GB. 8 GB was a poor choice from the beginning, but 12 GB for cards intended for 1440p is OK in my opinion. The problem would be an 8 GB 4060 Ti, if Nvidia dares to…
Posted on Reply
#68
TheoneandonlyMrK
£599 notes my arse, 1440p my arse.

I can see the e-waste potential from here.

16 GB should be the minimum on this class of card, and for 1440p.

HUB showed what you can expect: two years of viable use, then fit only for scrap. And it wasn't flawed; only someone who hasn't watched it, and so can't back that claim up with an actual reason, would say so, or a fanboi, as demonstrated: no watching, just opinions.

And 599 my arse, a fake MSRP that will be valid for days only; then AIBs will have to up their prices to stop losing money.



And after this I will be surprised if Nvidia doesn't lose more AIB partners; Nvidia basically shat on them again.
Posted on Reply
#69
john_
Tek-CheckSteve clearly shows how Nvidia's planned-obsolescence VRAM strategy works with the 3070 and 3070 Ti
I think what we can see in that video, and what's really worrying, is that we won't be able to trust benchmarks in the future. I mean, if the driver or the game can keep the framerate smooth by butchering the visual quality, by a little or by a lot, can we really trust reviews and framerates? When it's by a lot, we can see the results in Hogwarts easily. But what happens if we set, for example, ultra settings and the game silently changes them to high without informing us? What if the driver can do that as an "optimization"? Visually it would be difficult to see, and benchmarks would show a false image.
Posted on Reply
#70
Bwaze
Tek-CheckThe 4070 and 4070 Ti have VRAM crippled to just 12 GB instead of 16 GB, so in two years this will become the same problem the 3070 and 3070 Ti with 8 GB have now. Even ray tracing performance will be crippled in an increasing number of games.

In the most recent video by Hardware Unboxed, Steve clearly shows how Nvidia's planned-obsolescence VRAM strategy works with the 3070 and 3070 Ti. More and more games are a stuttering mess with 8 GB of VRAM at both 1080p and 1440p, and RT performance is dismal, to the point that RT on a 6800 is better because it has 16 GB of VRAM. The same will happen with the 4070 and 4070 Ti in two years. Nvidia is literally forcing people to upgrade gen-to-gen through its small VRAM offerings. Well done to them for being able to convince people that those cards are fine today.
Well, serves them right; the real PC Master Race card is of course at least $1200!
Posted on Reply
#71
TheoneandonlyMrK
john_I think what we can see in that video, and what's really worrying, is that we won't be able to trust benchmarks in the future. I mean, if the driver or the game can keep the framerate smooth by butchering the visual quality, by a little or by a lot, can we really trust reviews and framerates? When it's by a lot, we can see the results in Hogwarts easily. But what happens if we set, for example, ultra settings and the game silently changes them to high without informing us? What if the driver can do that as an "optimization"? Visually it would be difficult to see, and benchmarks would show a false image.
This is exactly what Nvidia's driver optimisation has been for years.

It isn't by accident that Nvidia has things like an LOD bias setting in the driver where Intel and AMD don't.

It's also not for no reason that Nvidia leads the fake-frames tech drive.

They have always cheated at benchmarks and always will; they just have the market on their side now.
Posted on Reply
#72
Tek-Check
Max(IT)That video is flawed in many ways… By the way, there is a huge difference between 8 GB and 12 GB. 8 GB was a poor choice from the beginning, but 12 GB for cards intended for 1440p is OK in my opinion. The problem would be an 8 GB 4060 Ti, if Nvidia dares to…
It all depends on which games people play. My 7900 XTX can use up to 21 GB of VRAM in some dense urban 3D renderings of buildings in Flight Simulator. If someone plays VRAM-intensive games (the ones Steve measured, plus MSFS), 12 GB on the 4070 and 4070 Ti will quickly become troublesome in terms of stuttering, lower RT performance, and unloaded textures, just like on the 3070 and 3070 Ti. 12 GB vs. 8 GB is NOT a "huge difference", I'm afraid. Just try the games tested and you will find that you might play well now, on the edge of 10-11 GB, but next year and later on there will be mounting problems. The history of the 3070 and 3070 Ti will repeat itself. This is also what Unreal Engine 5 game developers say in interviews; have a watch online.
BwazeWell, serves them right; the real PC Master Race card is of course at least $1200!
The 7900 XTX is already available for $960 in some markets, from ASRock. It's faster than the 4080 and has 24 GB of VRAM, not to be underestimated at all.
john_I think what we can see in that video, and what's really worrying, is that we won't be able to trust benchmarks in the future. I mean, if the driver or the game can keep the framerate smooth by butchering the visual quality, by a little or by a lot, can we really trust reviews and framerates? When it's by a lot, we can see the results in Hogwarts easily. But what happens if we set, for example, ultra settings and the game silently changes them to high without informing us? What if the driver can do that as an "optimization"? Visually it would be difficult to see, and benchmarks would show a false image.
We can trust them, as several reviewers test image quality, both in pure raster and with upscalers. I would not go into conspiracy theories about games silently changing texture settings. Even if they did, it could still be checked and uncovered, and it would be embarrassing for GPU vendors when it was reported to the public.
Posted on Reply
#73
kanecvr
john_So, Nvidia is marketing Frame Generation as a performance improvement.
I remember when Nvidia's FX series (5xxx) could not compete with ATi's 9000 series and they were caught cheating in benchmarks, particularly 3DMark 2001 if memory serves. Everyone was up in arms about it. Now they come up with DLSS and try to sell it as a "performance increase", and fans rejoice... What a clown world we live in.
Posted on Reply
#74
Bwaze
Tek-CheckThe 7900 XTX is already available for $960 in some markets, from ASRock. It's faster than the 4080 and has 24 GB of VRAM, not to be underestimated at all.
But it's made by a company that's losing what small market share it still has to broken Intel cards. :-D
kanecvrI remember when Nvidia's FX series (5xxx) could not compete with ATi's 9000 series and they were caught cheating in benchmarks, particularly 3DMark 2001 if memory serves. Everyone was up in arms about it. Now they come up with DLSS and try to sell it as a "performance increase", and fans rejoice...
They don't even hide it; there's no conspiracy. People just aren't reading the finer details in reviews. I'm quite sure the default benchmarks, with no DLSS and no frame generation, will sooner or later be relegated to the closing pages of reviews as a footnote, or else the reviewers won't be getting their shiny new cards for free, and then they can just close down.

As WCCFTech wrote:

"If you are someone that believes in raster performance only (although in today's era, I would add that it is a highly obsolete metric), then the RTX 4070 performs 15% faster than an RTX 3070."
Posted on Reply
#75
Tek-Check
BwazeBut it's made by a company that's losing what small market share it still has to broken Intel cards. :-D
If you see the majority of people crossing a bridge and jumping into the water, are you going to follow them mindlessly and jump too? That's what your comment amounts to: conformity to market share rather than looking at what's actually on offer.
Posted on Reply