
NVIDIA GeForce RTX 4070 Founders Edition

That's why the best time to buy a GPU is two years after a console launch, then keep said GPU for 6+ years.

This has been true for a while. Two years after the Xbox came out, we got the 9800 Pro/XT, which worked well for years afterwards. Two years after the PS3 we got the GeForce 9800 series/Radeon 3000 series, again working for most of those consoles' lifecycles. Two years after the PS4 we got the GTX 900 series. And two years after the PS5 we are getting Ada/RDNA3.

Agreed with your 1st statement.

But with regard to the NVIDIA 9800/ATI 3000 series: those DX10 cards had really horrible longevity. DX10 (launched with Vista in 2006) was a transitional phase that most game devs ignored; they just kept making DX9 games, with some providing DX10 executables, and no one made DX10-exclusive titles that didn't also have a DX9 fallback renderer. Then the industry moved rapidly to DX11 only three years later, in 2009, when Windows 7 released, and once that happened, those DX10-only cards became obsolete quickly.

DX11 would become the longest-running API and is STILL used in games to this day, over 14 years after its introduction, so those first-gen DX11 cards, the Radeon HD 5800 and Nvidia GTX 400 series, arguably had the longest lifespan of them all in terms of feature-set compatibility.
 
Still $50-100 too expensive; it should have been a $499 card. But prices might come down, especially after AMD shows their cards....
Other than that, it's exactly what I expected: performance close to the 3080 with a smaller price tag (locally, at least) and 40% less power usage. Although I thought RT might have been a bit stronger.

Many people still complain; they wouldn't if it were $499 OR had 16GB of VRAM...
 
No plans for that. The goal is actually to make short ~5 minute summary videos. I can't stand those 20+ minute videos that dance around the topic just to get favored by the YT algo for being longer than 20 minutes.
:D:D
 
W1zzard, who refused to put the RX 6950 XT on the charts? Or, for that matter, the RTX 2070 Super? He doesn't even bother retesting older cards with newer drivers to see the improvements from both camps.
FYI the entire GPU test bench was updated in January, including updated drivers.

 
W1zzard, who refused to put the RX 6950 XT on the charts?
Send me a 6950 XT ref, I'll add it to the reviews. Just not seeing much of a reason to buy one for the charts

Or, for that matter, the RTX 2070 Super?
Not sure how 2070S is relevant to this review?

retesting older cards with newer drivers
Huh? I've retested every single card in this review in January on the newest drivers
 
You want RT performance: nVidia
You want DLSS2/3: nVidia
You want nvenc: nVidia
You want CUDA for ANYTHING OUT THERE apart from gaming: nVidia
etc. etc.

And you expect AMD to price their GPUs competitively against nVidia? On what grounds?
Anyway, the 6000 cards do not exist for me. The whole package is problematic.
The 7000 cards are good but very badly priced. I would purchase one if they were cheaper.

If I could afford a 16GB nVIDIA GPU, I would have gotten nVIDIA, but my options were the 3080 10GB or the 6800 XT, and it was a no-brainer for me; no way in hell am I getting a 10GB GPU in 2022/23 after having 8GB for 6 years.
 
If I could afford a 16GB nVIDIA GPU, I would have gotten nVIDIA, but my options were the 3080 10GB or the 6800 XT, and it was a no-brainer for me; no way in hell am I getting a 10GB GPU in 2022/23 after having 8GB for 6 years.
Would do the same.
Right now the best deal in Germany might be the 6800 XT at 569€ or the 6950 XT at 649€ (same model). The 3080 10GB starts at 740€... the 3080 12GB at 867€...
The next tier with the 7900 XT (829€) or 4070 Ti (889€) is too high. I'd say 1% higher price per 1% more performance is alright; coming from the 6800 XT, that works out to ~700€ for the 7900 XT...
The 4070 should also be 500-600€, but it will start at 659€ with tax.
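As a rough illustration of that 1%-price-per-1%-performance rule of thumb, here's a minimal sketch in Python; the performance ratio used is simply the one implied by the ~700€ figure above, not a measured number:

    # "1% higher price per 1% more performance": a fair price scales linearly
    # with the performance ratio versus a card you already consider fairly priced.
    baseline_price_eur = 569   # 6800 XT street price quoted above
    perf_ratio = 1.23          # assumed 7900 XT vs 6800 XT ratio implied by the post, not measured
    print(round(baseline_price_eur * perf_ratio))  # ~700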
 
I am just not comfortable with the statement that the RTX 4070 Ti maxes out all the available hardware. Obviously it does not, as both boards have two vacant places for memory ICs, which would allow a wider memory bus and therefore more capacity.
 
"classic raster"? Who are you trying to fool, 99.9% of the market is still driven by raster.

The 6800 XT is AMD's last-gen card; I'd expect Nvidia's newer cards to take the lead. That it's even debatable just goes to show you how disappointing it is. Fake frames? Is that something people should care about? According to most reviews, no, no you should not.
Bleeding-edge tech and AI seem to be Nvidia's core business strategy since Turing, and even with all the complaints about it, Nvidia is still winning business-wise, and the whole competition is following their lead, with AMD's own fake frames coming soon. It's not like PhysX, where Nvidia was alone doing it and very few games made use of it; everybody is on board this time, so we might just be looking at the future of PC gaming with what Ada is doing. We'll have to wait for the sales figures of the 4070 to see where people's wallets really are.
 
Everyone complaining about the price is still stuck in 2011, when we were in the throes of the recession and top-tier silicon was $500. Forgetting, of course, that in 2007 the top dog was an $830 card, and that was a SHATLOAD back in 2007.

The fact that we have seen rampant inflation in the last 3 years just doesn't seem to connect mentally. You point out to people that Nvidia's margins have gone from 54% to 59% since 2011 and they get real quiet, because the harsh reality is that it isn't just Nvidia raising prices; it's everything behind them. Is Nvidia scalping? A bit, yeah, especially on the xx8x series. Is it realistic for them to offer GPUs like this for 1xxx-series pricing? Oh hell no.
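As a rough sanity check on the margin point, assuming "margins" here means gross margin, i.e. (price − cost) / price, a quick back-of-the-envelope calculation shows how much of a price increase the margin expansion alone would explain:

    # Going from a 54% to a 59% gross margin with the same underlying cost:
    # price = cost / (1 - margin), so the margin bump alone raises the price ~12%.
    cost = 100.0                       # arbitrary unit cost
    price_at_54 = cost / (1 - 0.54)    # ~217
    price_at_59 = cost / (1 - 0.59)    # ~244
    print(f"{price_at_59 / price_at_54 - 1:.1%}")   # ~12.2%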

Yes, it is an alright price.

You're not in 2018 anymore, let alone 2010. Costs have gone up. $600 is the new $300.
Yes you're right and no you're wrong.
Let's get the wrong out of the way first: it's not inflation. A $299 GPU from 2010 would only cost about $414 today if inflation were the reason. Inflation doesn't even begin to cover the price doubling; it's a minor factor at best.
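For reference, a minimal sketch of that inflation adjustment, assuming cumulative US CPI inflation of roughly 38-39% between 2010 and 2023 (the exact figure depends on which CPI series and end month you use):

    # Inflation-adjusting a $299 (2010) GPU into today's dollars.
    price_2010 = 299.0
    cumulative_inflation = 0.385   # assumed ~38.5% cumulative US CPI change, 2010 -> 2023
    print(round(price_2010 * (1 + cumulative_inflation)))   # ~414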

What you're right about is that people do need to accept GPUs are more expensive now. There's more to the cost of a card than die size: the GTX 660 Ti from over 10 years ago cost $300, and I picked it because it has roughly the same ~300 mm² die size as this RTX 4070. The other differences, which are what drive the price up, are as follows:
  • More VRM phases using more capable (and more expensive) MOSFETS that require more cooling
  • Premium GDDR6(X) instead of regular GDDR5
  • VRAM cooling, if you even got any in 2012, was usually just a stamped plate. GDDR6 and GDDR6X need proper cooling.
  • Multi-heatpipe, CNC-machined all-metal multi-layer heatsink, baseplate, backplate with 2+ fans rather than 2012's plastic shroud over a simple skived copper block or maybe on higher-end cards a small heatsink by today's standards with around one-quarter the surface area.
  • PCIe 4.0 in the same physical slot as PCIe 2.0 requires higher-quality PCBs, more layers, more copper, and more SMC components. We saw that even with something as simple as a PCIe 4.0 ribbon cable, which was quadruple the cost, and half of them were borderline unstable at PCIe 4.0 even then!
  • Physical rigidity and size has to increase with the weight/size of today's cooling requirements. Backplates, support arms, additional bolt-throughs for baseplates all add to the cost of manufacture, raw materials, and also the shipping/handling costs for every single part of the chain from Taiwan to your door. A tier-equivalent GPU box now weighs a good 50% more than it did a decade ago and you can probably only fit half as many of them on a shipping container, so shipping costs have doubled right there.
If we didn't want to pay any more than inflation-adjusted costs, we'd be getting power delivery, cooling, PCB quality, and build-quality of cards from the bad old days - and it's likely that a modern GPU would simply refuse to work and be impossible to build with the limitations of yesteryear's cheaper, simpler components and cooling.
 
Send me a 6950 XT ref, I'll add it to the reviews. Just not seeing much of a reason to buy one for the charts


Not sure how 2070S is relevant to this review?


Huh? I've retested every single card in this review in January on the newest drivers
The fact that both the 6000 series and 7000 series are on different drivers. The 7000 series is still on beta drivers :) You added the RTX 3090 Ti but not the 3080 12GB?
 
If I could afford a 16GB nVIDIA GPU, I would have gotten nVIDIA, but my options were the 3080 10GB or the 6800 XT, and it was a no-brainer for me; no way in hell am I getting a 10GB GPU in 2022/23 after having 8GB for 6 years.

I prefer 10GB of VRAM and the 3080's level of RT performance rather than 16GB and barely Turing-level RT.
Apart from some AMD-sponsored console ports, it's highly unlikely you'll find a game that requires more than 10GB of VRAM.
Even if you find one, like Doom Eternal, the reason for the requirement is a joke (pointlessly large textures).

For me, the lighting, the physics, and the animation make a beautiful game. Not the 8K textures.
The console ports require VRAM that is not justified by the way they look. It's totally unreasonable.
 
I prefer 10GB of VRAM and the 3080's level of RT performance rather than 16GB and barely Turing-level RT.
Apart from some AMD-sponsored console ports, it's highly unlikely you'll find a game that requires more than 10GB of VRAM.
Even if you find one, like Doom Eternal, the reason for the requirement is a joke (pointlessly large textures).

For me, the lighting, the physics, and the animation make a beautiful game. Not the 8K textures.
The console ports require VRAM that is not justified by the way they look. It's totally unreasonable.

While I don't disagree, textures are one of the easiest ways to improve a game's visuals without any performance loss, assuming the card has adequate VRAM. I feel like games as a whole would benefit from higher-quality textures; a lot of games still have really terrible low-resolution textures in places, even if the overall quality level is fine, likely because a lot of game engines still have to account for last-generation consoles and their roughly 6.5-7GB VRAM pools.
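To put rough numbers on how quickly texture resolution eats VRAM, here's a back-of-the-envelope sketch; it assumes uncompressed RGBA8 at 4 bytes per texel, BC7 block compression at 1 byte per texel, and about a third extra for the full mip chain, which are typical but not universal figures:

    # Approximate VRAM footprint of a single texture at various resolutions.
    for res in (2048, 4096, 8192):
        texels = res * res
        raw_mib = texels * 4 * (4 / 3) / 2**20   # RGBA8 + mip chain
        bc7_mib = texels * 1 * (4 / 3) / 2**20   # BC7 (8 bits/texel) + mip chain
        print(f"{res}x{res}: ~{raw_mib:.0f} MiB uncompressed, ~{bc7_mib:.0f} MiB BC7")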
 
I prefer 10GB of VRAM and the 3080's level of RT performance rather than 16GB and barely Turing-level RT.
Apart from some AMD-sponsored console ports, it's highly unlikely you'll find a game that requires more than 10GB of VRAM.
Even if you find one, like Doom Eternal, the reason for the requirement is a joke (pointlessly large textures).

For me, the lighting, the physics, and the animation make a beautiful game. Not the 8K textures.
The console ports require VRAM that is not justified by the way they look. It's totally unreasonable.

So you want everything to look like path traced Quake? Yikes.

Your bias aside, and regardless of performance, name the games whose RT implementations go beyond just shadows, reflections, and AO, are actually worth a damn, and visually improve the game.

Textures are equally important to a game looking “beautiful”. If I’m not mistaken, the higher the resolution and quality of RT features (especially shadows), the larger the VRAM requirements will become. Not only this, but as consoles dictate which features often get implemented in games, RT will remain a gimmick, or true path tracing will remain unreachable by existing hardware and the hardware to come for the next few years.

People have some weird hang-up on ray tracing, given that it has had almost no impact on popular games and the majority of the market, going on 3 generations after Nvidia introduced the RTX feature set.
 
The fact that both the 6000 series and 7000 series are on different drivers. The 7000 series is still on beta drivers :)
These were the latest drivers for these cards in January. AMD didn't provide RX 6000 driver updates for months, while they kept pushing out betas for the 7900 series. I'll retest everything again soon anyway
 
Still $50-100 too expensive; it should have been a $499 card. But prices might come down, especially after AMD shows their cards....
Other than that, it's exactly what I expected: performance close to the 3080 with a smaller price tag (locally, at least) and 40% less power usage. Although I thought RT might have been a bit stronger.

Many people still complain; they wouldn't if it were $499 OR had 16GB of VRAM...
I agree with you, but I think the Ti version should have 16GB of VRAM and cost $100 less, and this one should be ~$499 with 12GB of VRAM. I was considering getting an RTX 3070 before all of this talk about 8GB not being enough anymore (I tend to keep cards for a good period of time; I rarely game, but I like to set graphics to the highest possible when I do), and I was looking at the 4070. At $500 I would have just grabbed it, but at $600 I'm thinking twice; maybe something nice from AMD will show up. (edit: the problem is you don't find them at MSRP, mostly $100 more I think)
 
I agree with you, but I think the Ti version should have 16GB of VRAM and cost $100 less, and this one should be ~$499 with 12GB of VRAM. I was considering getting an RTX 3070 before all of this talk about 8GB not being enough anymore (I tend to keep cards for a good period of time; I rarely game, but I like to set graphics to the highest possible when I do), and I was looking at the 4070. At $500 I would have just grabbed it, but at $600 I'm thinking twice; maybe something nice from AMD will show up.


Yeah, if the RDNA3 option costs the same, has 16GB, and has similar or better raster performance, it's probably the safer option if you keep your GPU longer than 2 years.

If not, you could definitely do worse; just don't pay a hair over MSRP for this card, as it's already a stretch whether it's worth $600 to begin with.
 
Just did a search and learned about the ZLUDA project as an alternative to CUDA. Apparently nobody has forked the project yet, which is amazing to me as trillions of dollars are being pushed into AI projects. Everyone complaining about the prices: the path to better competition is an alternative to CUDA, yes?
 
Just did a search and learned about the ZLUDA project as an alternative to CUDA. Apparently nobody has forked the project yet, which is amazing to me as trillions of dollars are being pushed into AI projects. Everyone complaining about the prices: the path to better competition is an alternative to CUDA, yes?
There's also HIP by AMD, yet nobody is buying.
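For a sense of what an alternative path looks like in practice, here's a minimal sketch assuming a PyTorch install built against either CUDA or ROCm; on the ROCm builds the torch.cuda namespace is backed by HIP, so the same script runs on AMD hardware without code changes:

    # Device-agnostic PyTorch: the "cuda" device maps to NVIDIA's CUDA on CUDA
    # builds and to AMD's HIP/ROCm on ROCm builds.
    import torch

    device = "cuda" if torch.cuda.is_available() else "cpu"
    a = torch.randn(4096, 4096, device=device)
    b = torch.randn(4096, 4096, device=device)
    c = a @ b   # matrix multiply on whichever GPU backend is present
    print(device, c.shape)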
 
Go buy a 6800 XT then, and enjoy AMD drivers' heaven :rolleyes:

Really, I cannot see the point of people here constantly talking about the 6800 XT, a card that in the last Steam survey was in THE VERY LAST position, just above the “others” category…
 
Just did a search and learned about the ZLUDA project as an alternative to CUDA. Apparently nobody has forked the project yet, which is amazing to me as trillions of dollars are being pushed into AI projects. Everyone complaining about the prices: the path to better competition is an alternative to CUDA, yes?

Everyone will generally take the path of least resistance. Kudos to Nvidia for introducing hardware and software advances over the years, but proprietary solutions will always create stagnation in multiple forms. Thankfully, the majority of consumer-facing instances of this (physics, upscalers, VRR) have since gone the way of the dodo, and prices based on these features now have parity.

It’s unfortunate, to some degree, that AMD acquired ATi when they did; splitting their focus didn’t help either department. Nvidia is deeply entrenched from a commercial standpoint, and aside from the strong-arming they more than likely do, commercial applications of hardware rarely get changed unless there’s a massive shift in performance.
 
Congratulations nVidia, you have made an efficient RTX 3080 for $100 less. Rejoice! This is pathetic. Remember the gains the 3070 had over the 2080? Pepperidge Farm remembers. And they remember it being at least noticeable, unlike this one.

Also, @W1zzard, this is a 1440p card, and the 3080 in your own graph shows it being IDENTICAL; any small differences at other resolutions don't matter, because an extra 2% at 1080p (a resolution I seriously doubt people would be buying this card for) is impossible to notice without a framerate counter on.
 
Agreed with your 1st statement.

But with regard to the NVIDIA 9800/ATI 3000 series: those DX10 cards had really horrible longevity. DX10 (launched with Vista in 2006) was a transitional phase that most game devs ignored; they just kept making DX9 games, with some providing DX10 executables, and no one made DX10-exclusive titles that didn't also have a DX9 fallback renderer. Then the industry moved rapidly to DX11 only three years later, in 2009, when Windows 7 released, and once that happened, those DX10-only cards became obsolete quickly.

DX11 would become the longest-running API and is STILL used in games to this day, over 14 years after its introduction, so those first-gen DX11 cards, the Radeon HD 5800 and Nvidia GTX 400 series, arguably had the longest lifespan of them all in terms of feature-set compatibility.
That's pretty revisionist. DX11 didn't get major mainstream traction until 2010/2011, at which point said cards were 4-5 years old, and those games STILL had DX9 fallbacks.

The GeForce 9800 was useful well into 2012, by which point the PS4 was on the way. The GTX 200 series were Goldilocks cards.
"You're being ripped off, you should be happy about it!"
"I know the cost of everything went up by 50% and our wages went up from 2018 but everything should cost the same it did 10 years ago!"
 