
Official NVIDIA RTX 4070 Performance Claims Leak Online

I might get downvoted for this, but they should probably include some lube in the boxes, for sure...

The problem with the 7900 XT/7900 XTX is that AMD is not at a point where they are very appealing to most gamers, even though both are slightly better overall than the 4070 Ti given the pricing. I've never personally had any major issues with AMD drivers, but I know many who have, to the point that AMD GPUs don't even exist anymore to them.

I own plenty of AMD and Nvidia products, and I will still bash the crap out of stuff they release that isn't very good due to pricing. At the same time, I won't do it blindly either; even if a card isn't a good one, you can still make a case for all the current-gen GPUs if you try hard enough...

Let's be real though: all these products are pretty good, from the 4070 Ti to the 4090 and both of the AMD offerings; the issue is their price. This isn't an RX 6500 scenario, where the product is just bad regardless of price. But even then, people who own them defend them and enjoy them, so my opinion really doesn't matter all that much to begin with, at least for the people actually buying these products and not just blindly drinking the Kool-Aid from either of these companies.

Not directed at you, but that line is a blanket excuse for being an ill-informed buyer. On hand I have a 2070 Super, 3080 12 GB, and 7900 XTX between the three rigs at home; the 3080 has had more driver-related crashes and issues in the span of a week than my 7900 XTX has had since release.

While for some the initial change between Nvidia and AMD drivers may be jarring, modern UI aside, AMD offers more and better control through their driver than Nvidia currently does from a feature standpoint.

I can't say for sure, but I like to blame tech tubers for a lot of this. They burn through so much free hardware pumping out reviews and clickbait videos that, more often than not, the discrepancies across reviews that other reviewers point out are likely software/setup configuration issues that sometimes aren't even related to the hardware they're testing. With so little time before moving to the next review, it gets passed off as "yeah, it's definitely their (AMD's/Nvidia's/whoever's) problem" and not something they did. So everyone draws conclusions from half-baked data and clickbait snide bullet points, then chooses to argue with whatever "influencer's" data suits their argument.

Write me off as a fanboy, but both Nvidia and AMD have had major issues in the past, which is entirely different from now. With the exception of some buggy game releases, both my Nvidia and AMD rigs have been exceptionally stable. I laugh every time someone uses the driver excuse though. It may be my opinion, but AMD drivers are objectively better.

TLDR

Good hardware, bad BAD prices, especially Nvidia.
 
Marketing BS like DLSS3 makes little sense, but considering the specs, the 4070 will be very close to the 3080: maybe faster at 1080p, and slightly slower at 4K.

It may be my opinion, but AMD drivers are objectively better.
AMD drivers better … Is this some kind of a joke?
 
Marketing BS like DLSS3 makes little sense, but considering the specs, the 4070 will be very close to the 3080: maybe faster at 1080p, and slightly slower at 4K.


Yeah, DLSS3 comparisons are only semi-useful when comparing 40 series vs 40 series, but even then it's not a feature everyone will use, even less so than DLSS as a selling point.

AMD drivers better … Is this some kind of a joke?

Idk, the last time I had issues with AMD drivers was the 7970, and only in CrossFire. I even purchased a 5700 XT to try and replicate the issues people were having and couldn't, but that was a real issue and something AMD should have been faster to fix. Them ignoring the 6000 series to try and fix the 7000 series, and (going by Hardware Unboxed) not really improving anything overall, was stupid though, and just gave Nvidia fanboys more ammunition against them.
 
Sure. And the next one in 2027 will be even better…

You see this every generation. Pascal sucks, I'm going to buy the next generation cards... Turing sucks, I'm going to buy the next generation cards... Ampere sucks, but I can't buy it anyway, so I'm going to buy the next generation cards... Ada sucks, prices are too high... and it never ends.
 
You're conveniently ignoring that most 3000 series cards have been offered at massively discounted prices while available alongside the 4080/4070 Ti for months; at this point stock looks to have dried up. People had plenty of time to purchase and make a value comparison at those prices.

Almost every AIB model except a handful is also around the $900 mark for the 4070 Ti, and it will be the same case for the 4070. So you get 3080 and 3080 Ti performance for almost exactly the same prices and/or performance per dollar.


I also specifically said price to performance, and said nothing about MSRP, as that is pretty irrelevant in most cases. Nvidia did nothing to move that needle; they shifted cards down yet another tier. Inflation, cost of materials, yada yada: a midrange card (60/70 series) could be had anywhere from $250-400 just a few years ago. Now we get a 4060/4070 at 1.5-2x the cost, no value increase whatsoever, and people come here and defend Nvidia on the matter?
RTX 4070 Ti 12 GB: $799.99, $814.99, $829.99, $839.99
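For anyone who wants to run those numbers themselves, here's a rough perf-per-dollar sketch in Python; the prices are from the 4070 Ti list above, but the FPS figures are placeholders you'd swap for averages from a review you trust (e.g. TPU's 1440p charts):

```python
# Rough price-to-performance comparison. The 4070 Ti prices are from the
# listings quoted above; the average-FPS values are PLACEHOLDERS to be
# replaced with numbers from a review you trust.
cards = {
    "RTX 3080 (discounted)":  {"price": 700.00, "fps": 100.0},  # assumed street price
    "RTX 4070 Ti (MSRP)":     {"price": 799.99, "fps": 110.0},  # placeholder FPS
    "RTX 4070 Ti (AIB ~900)": {"price": 899.99, "fps": 110.0},  # typical AIB price claimed above
}

for name, c in cards.items():
    fps_per_100_dollars = c["fps"] / c["price"] * 100
    print(f"{name:24s} ${c['price']:7.2f}  {c['fps']:5.1f} fps  "
          f"{fps_per_100_dollars:.2f} fps per $100")
```

Plug in real review numbers and the "almost the same performance per dollar" point is easy to check for yourself.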
 
You see this every generation. Pascal sucks, I'm going to buy the next generation cards... Turing sucks, I'm going to buy the next generation cards... Ampere sucks, but I can't buy it anyway, so I'm going to buy the next generation cards... Ada sucks, prices are too high... and it never ends.
The only constant is: no one wants an AMD card. It doesn't matter how badly priced and low-specced Nvidia cards are; Radeons are still irrelevant.
 
The only constant is: no one wants an AMD card. It doesn't matter how badly priced and low-specced Nvidia cards are; Radeons are still irrelevant.
I used to buy only ATi in the past, but times have changed.
Also, I have a G-Sync-only monitor.

BTW, there are many people here who own an AMD GPU/CPU...
 
The only constant is: no one wants an AMD card. It doesn't matter how badly priced and low-specced Nvidia cards are; Radeons are still irrelevant.

Eh, a decent number of people on this forum own them, and if I only had 900 bucks in my pocket at Microcenter and needed a new GPU, it would not be a 4070 Ti.
 
Yes, that is exactly what I said; not sure what you're trying to prove. There are a "handful" around MSRP; the majority are not. Thanks for proving my point.
There's plenty at and around MSRP. Two cards by two different manufacturers are at MSRP, and three more are slightly above MSRP. If someone is ignorant enough to blow $900 on a 4070 Ti, then that's on them.
 
There's plenty at and around MSRP. Two cards by two different manufacturers are at MSRP, and three more are slightly above MSRP. If someone is ignorant enough to blow $900 on a 4070 Ti, then that's on them.

Unless you have some burning desire to play with ray tracing enabled (butcher fps, degrade visual quality with DLSS and/or FSR, and get a good experience in maybe 10 games where it's worthwhile), I'd say the ignorant person is the one buying a 4070 Ti at all, so keep defending it all you want. The 4070 Ti and 7900 XT are terrible choices for value and performance.
 
Unless you have some burning desire to play with ray tracing enabled (butcher fps, degrade visual quality with DLSS and/or FSR, and get a good experience in maybe 10 games where it's worthwhile), I'd say the ignorant person is the one buying a 4070 Ti at all, so keep defending it all you want. The 4070 Ti and 7900 XT are terrible choices for value and performance.
Did W1zzard use ray tracing in this benchmark?

[Chart: average FPS at 2560×1440]
 
So let me get this straight, WITHOUT DLSS3 @ 1440p:

3070 → 3070 Ti → 3080/4070 → 3080 Ti → 3090 → 3090 Ti/4070 Ti → 4080
That is a massive GAP... And for NVIDIA to admit that the 4070 = 3080, it'll have to be in best-case/cherry-picked scenarios.

if I only had 900 bucks in my pocket at Microcenter and needed a new GPU, it would not be a 4070 Ti
This one made me think back... I believed the 4070 Ti was a great deal when it came out: 3090/3090 Ti performance for €900. Now, after seeing what happened with the 8 GB cards in RE4 Remake and Hogwarts Legacy, my mind has changed...
 
Official Benchmarks For NVIDIA RTX 4070 Leak Online – Matches RTX 3080 Performance Without Frame Generation!

"Considering these are first party benchmarks, a grain of salt never hurt anyone, but they are incredibly exciting as NVIDIA is stating that the upcoming RTX 4070 will be able to match the NVIDIA RTX 3080 GPU in DLSS performance without Frame Generation. This is a huge deal because Frame Generation is quite the controversial technology and so-called 'fake frames' have divided gamers. With the RTX 4070 however, even without framer generation, you are looking at RTX 3080 performance levels with standard DLSS. "

" RTX 4070 appears to have a far greater value proposition once you tie in DLSS 3.0 and/or Frame Generation. Without Frame Generation and just good old DLSS3, it performs more or less identical to an RTX 3080. If you include Frame Generation however, it suddenly performs up to 40% faster than an RTX 3080 or 80% faster than an RTX 3070 - which is what generational upgrades should always be like. "

But then:

" If you are someone that believes at raster performance only (although in today's era, I would add that it is a highly obsolete metric), than the RTX 4070 performs 15% faster than an RTX 3070."

:-P

So, how much does Nvidia pay the reviewers and news sites to write such exciting articles about subpar performance?
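Just to lay the article's percentages side by side, here's a quick sanity check; the 100 FPS RTX 3070 baseline is arbitrary, and every other number is derived purely from the claims quoted above, not from any measurement:

```python
# Turn the article's relative claims into concrete numbers, starting from
# an arbitrary 100-FPS RTX 3070 baseline.
fps_3070 = 100.0

fps_4070_raster = fps_3070 * 1.15    # "15% faster than an RTX 3070" (raster)
fps_4070_fg     = fps_3070 * 1.80    # "80% faster than an RTX 3070" (Frame Generation on)
fps_3080        = fps_4070_fg / 1.40 # same FG result is "40% faster than an RTX 3080"

print(f"implied 3080:        {fps_3080:.0f} fps")        # ~129
print(f"4070 raster:         {fps_4070_raster:.0f} fps") # 115
print(f"4070 vs 3080 raster: {fps_4070_raster / fps_3080 - 1:+.0%}")  # ~ -11%
```

By the article's own numbers, the 4070 only "matches" a 3080 once DLSS or Frame Generation enters the picture; in pure raster it comes out roughly 10% behind, which lines up with the chart posted earlier.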
 
I used to buy only ATi in the past, but times have changed.
Also, I have a G-Sync-only monitor.

BTW, there are many people here who own an AMD GPU/CPU...
"Many people HERE" doesn't mean much.
AMD graphics card market share dropped below 6% worldwide.
Having a very vocal fan base doesn't change things.

Eh, a decent number of people on this forum own them, and if I only had 900 bucks in my pocket at Microcenter and needed a new GPU, it would not be a 4070 Ti.
Again, as I said above, having a very vocal fan base doesn't change facts about market share.
Radeons are cheaper for a single reason: no one wants one.
Lisa Su already proved to the market that the moment they gain market share, they immediately drop the "nice price" policy.

BTW, I don't care which card you (or I) prefer; I don't have any brand loyalty.
I installed dozens of Radeon cards in the last 4 years, and on a good chunk of them I had to fix some issue sooner or later (I would say on 40% of them). Nvidia cards/drivers weren't perfect, but the percentage with issues drops to around 10%.
This just means one thing for me: angry customers and more work.

Yes, I know: when you enter a forum like this, AMD supporters are very vocal in saying "never had a problem with my Radeon". Happy for them; my experience is vastly different.
 
The 4070 and 4070 Ti have VRAM crippled to only 12 GB instead of 16 GB, so in two years this will become the same problem that the 3070 and 3070 Ti have now with 8 GB. Even ray tracing performance will be crippled in an increasing number of games.

In the most recent video by Hardware Unboxed, Steve clearly shows how Nvidia's planned-obsolescence VRAM strategy works with the 3070 and 3070 Ti. More and more games are a stuttering mess with 8 GB of VRAM at both 1080p and 1440p, and RT performance is dismal, to the point that RT on a 6800 is better because it has 16 GB of VRAM. The same will happen with the 4070 and 4070 Ti in two years. Nvidia is literally forcing people to upgrade gen to gen through small VRAM offerings. Well done to them for convincing people that those cards are fine today.
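If anyone wants to see how close their own card runs to its VRAM ceiling in these games, here's a minimal sketch using NVIDIA's NVML Python bindings (assumes pip install nvidia-ml-py; note it reports what's allocated, not what the game would have liked to allocate):

```python
# Log used/total VRAM once per second while a game is running.
# Requires the nvidia-ml-py package (pip install nvidia-ml-py).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM: {mem.used / 1024**3:5.2f} / {mem.total / 1024**3:5.2f} GB")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

Sitting near the ceiling is exactly when the stuttering and texture swap-outs Steve demonstrated start to show up.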
 
Steve clearly shows how Nvidia's planned-obsolescence VRAM strategy works with the 3070 and 3070 Ti. The same will happen with the 4070 and 4070 Ti in two years.
That video is flawed in many ways… By the way, from 8 GB to 12 GB there is a huge difference. 8 GB was a poor choice from the beginning, but 12 GB for cards intended for 1440p is OK in my opinion. The problem would be an 8 GB 4060 Ti, if Nvidia dares to…
 
£599 notes, my arse; 1440p, my arse.

I can see the e-waste potential from here.

16 GB should be the minimum on this class of card, and for 1440p.

HUB showed what you can expect: two years of viable use, then ready for scrap. And the video wasn't flawed; only someone who hasn't watched it, and so can't back that claim up with an actual reason, would say so. Or a fanboy, as demonstrated: no watching, just opinions.

And 599, my arse: a fake MSRP that will be valid for days only, then AIBs will have to raise their prices to stop losing money.

And after this I will be surprised if Nvidia doesn't lose more AIB partners; Nvidia basically shat on them again.
 
Steve clearly shows how Nvidia's planned-obsolescence VRAM strategy works with the 3070 and 3070 Ti
I think what we can see in that video, and what's really worrying, is that we may not be able to trust benchmarks in the future. I mean, if the driver or the game can keep the framerate smooth by butchering the visual quality, by a little or by a lot, can we really trust reviews and framerates? When it's by a lot, we can see the results easily, as in Hogwarts. But what happens when we set, for example, ultra settings and the game silently changes them to high without telling us? What if the driver can do that as an "optimization"? Visually it will be difficult to see, and benchmarks will be showing a false picture.
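For what it's worth, this kind of silent downgrade is at least detectable: capture the same frame on both configurations and diff the captures. A rough sketch with Pillow and NumPy (the file names are made up; it assumes both screenshots are the same resolution):

```python
# Diff two screenshots of the same frame taken under different
# drivers/settings. A non-trivial difference means the rendered output
# is not actually identical, whatever the settings menu claims.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("frame_driver_a.png").convert("RGB"), dtype=np.float64)
b = np.asarray(Image.open("frame_driver_b.png").convert("RGB"), dtype=np.float64)

mse = np.mean((a - b) ** 2)
psnr = float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

print(f"mean abs pixel diff: {np.mean(np.abs(a - b)):.2f}  (0 = identical)")
print(f"PSNR: {psnr:.1f} dB  (higher = closer; above ~45 dB is hard to see)")
```

It won't tell you why the images differ, but it's enough to flag that an "identical settings" comparison isn't comparing identical output.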
 
The 4070 and 4070 Ti have VRAM crippled to only 12 GB instead of 16 GB… Nvidia is literally forcing people to upgrade gen to gen through small VRAM offerings.

Well, serves them right; the real PC Master Race card is, of course, at least $1,200!
 
I think what we can see in that video, and what's really worrying, is that we may not be able to trust benchmarks in the future. I mean, if the driver or the game can keep the framerate smooth by butchering the visual quality, by a little or by a lot, can we really trust reviews and framerates? When it's by a lot, we can see the results easily, as in Hogwarts. But what happens when we set, for example, ultra settings and the game silently changes them to high without telling us? What if the driver can do that as an "optimization"? Visually it will be difficult to see, and benchmarks will be showing a false picture.
This is exactly what Nvidia's driver "optimisation" has been for years.

It isn't by accident that Nvidia has things like LOD bias in the driver where Intel and AMD don't.

It's also not for no reason that Nvidia led the fake-frames tech drive.

They always have cheated at benchmarks and always will; they just have the market on their side now.
 
That video is flawed in many ways… By the way, from 8 GB to 12 GB there is a huge difference. 8 GB was a poor choice from the beginning, but 12 GB for cards intended for 1440p is OK in my opinion. The problem would be an 8 GB 4060 Ti, if Nvidia dares to…
It all depends on which games people play. My 7900 XTX can use up to 21 GB of VRAM in some dense urban 3D renderings of buildings in Flight Simulator. If someone plays VRAM-intense games (the ones Steve measured, plus MSFS), 12 GB on the 4070 and 4070 Ti will quickly become troublesome in terms of stuttering, lower RT performance and unloaded textures, just like the 3070 and 3070 Ti have. 12 GB vs 8 GB is NOT a "huge difference", I'm afraid. Just try the games tested and you will find that you might play well now, on the edge of 10-11 GB, but next year and later there will be mounting problems. The history of the 3070 and 3070 Ti will repeat itself. This is also what Unreal Engine 5 game developers say in interviews; have a watch online.

Well, serves them right; the real PC Master Race card is, of course, at least $1,200!
The 7900 XTX is already available for $960 in some markets, from ASRock. It's faster than the 4080 and has 24 GB of VRAM; not to be underestimated at all.

I think what we can see in that video, and what's really worrying, is that we may not be able to trust benchmarks in the future. I mean, if the driver or the game can keep the framerate smooth by butchering the visual quality, by a little or by a lot, can we really trust reviews and framerates? When it's by a lot, we can see the results easily, as in Hogwarts. But what happens when we set, for example, ultra settings and the game silently changes them to high without telling us? What if the driver can do that as an "optimization"? Visually it will be difficult to see, and benchmarks will be showing a false picture.
We can trust them, as several reviewers test video quality on captured images, both in pure raster and with upscalers. I would not go into conspiracies about games silently changing texture settings. Even if they do, it can still be checked and uncovered, and that would be embarrassing for GPU vendors when reported to the public.
 
So, Nvidia is marketing Frame Generation as a performance improvement.
I remember when Nvidia's FX series (5xxx) could not compete with ATi's 9000 series and was caught cheating in benchmarks, particularly 3DMark03 if memory serves. Everyone was up in arms about it. Now they come up with DLSS and try to sell it as a "performance increase", and fans rejoice... What a clown world we live in.
 
The 7900 XTX is already available for $960 in some markets, from ASRock. It's faster than the 4080 and has 24 GB of VRAM; not to be underestimated at all.

But it's made by a company that's losing what small market share it still has to broken Intel cards. :-D

I remember when Nvidia's FX series (5xxx) could not compete with ATi's 9000 series and was caught cheating in benchmarks, particularly 3DMark03 if memory serves. Everyone was up in arms about it. Now they come up with DLSS and try to sell it as a "performance increase", and fans rejoice...
They don't even hide it; there's no conspiracy. People just aren't reading reviews in finer detail. I'm quite sure the default benchmarks, with no DLSS and no frame generation, will sooner or later be relegated to the closing pages of reviews as a footnote, or the reviewers won't be getting their shiny new cards for free, and then they can just close down.

As WCCFTech wrote:

"If you are someone that believes at raster performance only (although in today's era, I would add that it is a highly obsolete metric), than the RTX 4070 performs 15% faster than an RTX 3070."
 