
NVIDIA GeForce RTX 4070 Founders Edition

The 3070 was a similar card at launch: performance was good enough, but questions were raised about whether 8GB was too little VRAM for a card of that calibre.

Here we are on launch day, and 10GB cards have already been struggling for a few months. The question isn't whether 12GB is enough; it's how long 12GB will be enough for. Six months? Two years? What's a reasonable expectation for a $600 card?

The 6800 XT being $100 cheaper and having 16GB makes this a hard sell for anyone willing to give up DLSS 3... I was looking at a 3080 last year instead of the 6800 XT, and now that Nvidia have snubbed even their own 30-series owners with their latest DLSS 3 features, I'm glad I didn't bother!
 
Because these are the games you expect to play when you pay $600+.
If I pay over $600 for a GPU, it had better play all of my games well. The issue I have with this card is that Nvidia are the ones touting ray tracing, and now they tell you frame generation is the new must-have. Meanwhile, I enjoyed my 6800 XT and put most of the money from selling it towards a 7900 XT. Even given market conditions, this card should be around $400 US, as it should be competing against the 6700 XT.
 
Imagine being dumb enough to buy a $700 GPU that uses 30% more power than a $480 RX 6800. I have a 12GB 3080 that beats this card in every single game at 4K and cost me only $600. Give me a break. It's no wonder they are losing so much money.
 
Waiting to see if it costs 850-900 EUR. Because it will.
P.S.
I love that you guys list the 3080 as a $550 card. Would be nice...
 
I don't see how any of that is relevant, but yes, him.
The relevance is that Wizzard only does a one-off run with almost every card, so people don't really get the whole picture and claim that the RX 6000 series are power hogs for no reason.
 
Hi, so I'm not knowledgeable about GPUs, but what I'd like to know is how this 4070 compares to an Arc A770 in Linux gaming, especially with the new improved drivers.

The A770 is €379.00 in Europe at the moment, and I assume the 4070 will be something like €700. So roughly half the price. The question is: is the A770 2x worse? I can't find any decent comparisons, but I would guess that an A770 for Linux gaming is not that bad a deal, considering how much people complain about the pricing of these. Disappointed the A770 was not included in these benchmarks.
 
Hi, so I'm not knowledgeable about GPUs, but what I'd like to know is how this 4070 compares to an Arc A770 in Linux gaming, especially with the new improved drivers.

The A770 is €379.00 in Europe at the moment, and I assume the 4070 will be something like €700. So roughly half the price. The question is: is the A770 2x worse? I can't find any decent comparisons, but I would guess that an A770 for Linux gaming is not that bad a deal, considering how much people complain about the pricing of these. Disappointed the A770 was not included in these benchmarks.
Look at the A770 review and see how it compares to the 3080. 4070 will be close.
 
This would have been much sweeter at $500. But because of the 4070 Ti at $800, we already knew that wasn't happening :(
On the bright side, at least the prices have stopped skyrocketing.
Funny thing: the 3070 was on par with the 2080 Ti, the best Nvidia had to offer in the Turing gen, for less than half the price. Now the 4070 only matches the fourth-best (3080) for slightly less money.

1070 ---- faster than the Titan X, for a fraction of the price
2070 ----- very close (90%) to the 1080 Ti, for half the price

I guess the 5070 will be as fast as the 4070 Ti. :D

Do you see the trend? (Not only are the prices getting ridiculous, so are the performance increases.)

EDIT: Correction.
 
Funny thing: the 3070 was on par with the 2080 Ti, the best Nvidia had to offer in the Turing gen, for less than half the price. Now the 4070 only matches the fourth-best (3080) for slightly less money.

1070 Ti ---- faster than the Titan X, for a fraction of the price
2070 Super ----- faster than the 1080 Ti, for almost half the price

I guess the 5070 will be as fast as the 4070 Ti. :D

Do you see the trend? (Not only are the prices getting ridiculous, so are the performance increases.)
Charging more for generationally smaller gains has been Nvidia's MO since RTX was introduced.
 
Look at the A770 review and see how it compares to the 3080. 4070 will be close.
Thx. I checked the relative performance chart, and the 3080 scores 162%. So for Linux gaming, the A770 is a 38% better deal than the 4070 :) And it has 16GB of VRAM, which might come in handy. (But of course those benchmarks are with older drivers, I assume; it might be an even better deal now.)
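For anyone who wants to redo that value math themselves, here's a minimal sketch using the figures quoted in this thread (A770 at 100% relative performance for €379; the 4070 assumed to land at roughly 3080-level 162% for an assumed €700). Note the exact "percent better deal" depends on how you define it, and with these inputs the perf-per-euro gap comes out smaller than 38%:

```python
# Performance-per-euro from a relative performance chart.
# Relative-performance and price figures are the ones quoted in this
# thread; the 4070's Linux performance and EU price are assumptions.
cards = {
    "Arc A770": {"rel_perf": 100.0, "price_eur": 379.0},
    "RTX 4070": {"rel_perf": 162.0, "price_eur": 700.0},  # assumed ~3080-class
}

for name, c in cards.items():
    ppe = c["rel_perf"] / c["price_eur"]
    print(f"{name}: {ppe:.3f} relative-perf points per euro")

# How much more perf-per-euro the cheaper card delivers
a770 = cards["Arc A770"]
rtx = cards["RTX 4070"]
ratio = (a770["rel_perf"] / a770["price_eur"]) / (rtx["rel_perf"] / rtx["price_eur"])
print(f"A770 perf-per-euro advantage: {ratio:.2f}x")  # ~1.14x with these inputs
```

Swap in real street prices and the newer-driver benchmark numbers and the ratio shifts accordingly.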
 
I don't feel so bad paying $690 for my 6950 XT now.
Same here. The 4070 turned out OK performance-wise, but it should be priced at €500 tax included. As it stands, it will end up costing €720 (tax included) when it reaches store shelves in my country.
 
All this whinging about price tends to suggest to me that a growing number of PC gamers can't really afford their chosen hobby. Unfortunately, I don't see this situation getting any better.

trog
Everyone complaining about the price is still stuck in 2011, when we were in the throes of the recession and top-tier silicon was $500. Forgetting, of course, that in 2007 the top dog was an $830 card, and that was a SHATLOAD back in 2007.

The fact that we have seen rampant inflation in the last 3 years just doesn't seem to connect mentally. You point out to people that Nvidia's margins have gone from 54% to 59% since 2011 and they get real quiet, because the harsh reality is that it isn't just Nvidia raising prices. It's everything behind them. Is Nvidia scalping? A bit, yeah, especially on the xx8x series. Is it realistic for them to offer GPUs like this for 1xxx-series pricing? Oh hell no.
"NVIDIA has set a base price of $600 for the GeForce RTX 4070 Founders Edition, which is an alright price given the current GPU pricing landscape, but $100 more expensive than the launch-price of RTX 3070 and RTX 2070."

Alright price?

Compared to the RTX 3070, it's 27.6% faster at 1080p, 27.4% faster at 1440p, and 25.4% faster at 4K. That used to be the performance increase of a mid-life upgrade, like the RTX 3070 Ti launching a year later for the same price.

Performance per dollar chart says it all:

[chart: performance per dollar, 3840×2160]
Yes, it is an alright price.

You're not in 2018 anymore, let alone 2010. Costs have gone up. $600 is the new $300.
 
The 2070 and 3070 matched, not exceeded, the previous flagships, back when the x80s were significantly slower ------------ AT A FRACTION OF THE PRICE
(1080 vs 1080 Ti, and 2080 vs 2080 Ti)

The 4070 matches the 3080 while all the 3080s and 90s, except for the 3090 Ti, perform within 5% of each other ---------- FOR ALMOST THE SAME PRICE AS THE 3080

But I get what you say, and it makes sense.
Added more data.
 
The 3070 was a similar card at launch: performance was good enough, but questions were raised about whether 8GB was too little VRAM for a card of that calibre.

Here we are on launch day, and 10GB cards have already been struggling for a few months. The question isn't whether 12GB is enough; it's how long 12GB will be enough for. Six months? Two years? What's a reasonable expectation for a $600 card?

The 6800 XT being $100 cheaper and having 16GB makes this a hard sell for anyone willing to give up DLSS 3... I was looking at a 3080 last year instead of the 6800 XT, and now that Nvidia have snubbed even their own 30-series owners with their latest DLSS 3 features, I'm glad I didn't bother!
12GB works well, and will continue to work well, for several years. 12GB is roughly how much RAM today's consoles typically use, so if you stick to roughly console-level settings, 12GB will be sufficient. That will continue until we get a PS6/Xbox Series Ultra Titanium Z with 32GB of RAM, and then a couple of years after they launch, we will see games take advantage of that RAM.

That's why the best time to buy a GPU is two years after a console launch, then keep said GPU for 6+ years.

This has been true for a while. Two years after the Xbox came out, we got the 9800 Pro/XT, which worked well for years afterwards. Two years after the PS3, we got the 9800 series/Radeon 3000 series, again working for most of those consoles' lifecycles. Two years after the PS4, we got the GTX 900 series. And two years after the PS5, we are getting Ada/RDNA 3.
 
The 3080's release price, without mining, was $699; the 4070 is $599 :laugh:
 
12GB works well, and will continue to work well, for several years. 12GB is roughly how much RAM today's consoles typically use, so if you stick to roughly console-level settings, 12GB will be sufficient. That will continue until we get a PS6/Xbox Series Ultra Titanium Z with 32GB of RAM, and then a couple of years after they launch, we will see games take advantage of that RAM.

That's why the best time to buy a GPU is two years after a console launch, then keep said GPU for 6+ years.
Exactly. That's why I bought the RX 6700 XT for 4K, and a 6800U handheld for 720p. Couldn't be happier with both.
 
Solid card for peeps who want to game at 1440p yet can't afford a 4070 Ti. ^^
 
The 2070 and 3070 matched, not exceeded, the previous flagships, back when the x80s were significantly slower ------------ AT A FRACTION OF THE PRICE
(1080 vs 1080 Ti, and 2080 vs 2080 Ti)

The 4070 matches the 3080 while all the 3080s and 90s, except for the 3090 Ti, perform within 5% of each other ---------- FOR ALMOST THE SAME PRICE AS THE 3080

But I get what you say, and it makes sense.
Added more data.

-Nvidia didn't want to release another gen with a price increase. They did it with Turing, so it would have been disastrous to raise prices again.
Now that we had the 3080 as a value monster, they fXXXed us with the 4080. The 5080 will be a value king again....

-The 3080 is painfully close to the 3080 Ti and 3090, so we can say that the 4070 is at a fraction of the price of the 3090.

-The 1080 Ti and 2080 Ti were gaming cards; with 11GB of VRAM, you couldn't use them for heavy professional work.
The 3090 was expensive because of the VRAM. Many bought them for professional use instead of a far more expensive Quadro, just for the VRAM. This is also the reason Nvidia doesn't put much VRAM in their GPUs in general, while AMD puts in a truckload of it, even though 95% of it goes unused in the professional world.

-The 4070's price should have been $549. I also expected to see a price drop on the 4080 by now.... Probably not happening because there's absolute silence from AMD HQ.
 
@W1zzard Maybe I'm just getting old, but the power consumption charts confuse me..... Maybe someone already mentioned this as well.

How is gaming power consumption higher than maximum power consumption?

[charts: power-gaming, power-maximum]
 
100.628437%... 0.628% faster in classic raster, while being 22% slower in ray tracing, the standard being adopted by all modern game engines moving forward, and while consuming 100 W more when gaming. Seems like a wise choice to go for the "winning" RX 6800 XT!

That's also not taking DLSS and frame generation into account.

"Classic raster"? Who are you trying to fool? 99.9% of the market is still driven by raster.

The 6800 XT is AMD's last-gen card; I'd expect Nvidia's newer cards to take the lead. That it's even debatable just goes to show how disappointing it is. Fake frames? Is that something people should care about? According to most reviews, no, no you should not.
 
I remember 70-series or 700-series cards costing around $400 about 10 years ago. After adjusting for inflation, that's like $520 today, and it's been said that chips are getting more expensive, so, fine, it costs $600 to buy into the 70-series today. I think of the 60-series as midrange and the 70-series as higher-end. Game developers are saying now that 12 GB is the lower limit they're targeting for VRAM, and that's what this card has. If it only has the bare minimum, will it have a long useful life? And what does this mean for the 4060, which is what I would usually consider buying? That memory amount doesn't seem fitting for a high-end card.
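The inflation adjustment in the post above is easy to sanity-check. A quick sketch, assuming a cumulative US CPI factor of roughly 1.3 from around 2013 to the 4070's launch (the exact factor should be verified against official CPI data):

```python
# Inflation-adjust a historical GPU launch price.
# cpi_factor is an assumption (~2013 -> 2023); check official CPI data.
launch_price_usd = 400.0   # typical 70-series price ~10 years ago, per the post
cpi_factor = 1.30

adjusted = launch_price_usd * cpi_factor
print(f"${launch_price_usd:.0f} then is about ${adjusted:.0f} today")  # ~$520
```

That lands right around the $520 figure quoted, leaving an $80 gap that has to be explained by rising chip costs, bigger margins, or both.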
 
-Nvidia didn't want to release another gen with a price increase. They did it with Turing, so it would have been disastrous to raise prices again.
Now that we had the 3080 as a value monster, they fXXXed us with the 4080. The 5080 will be a value king again....

-The 3080 is painfully close to the 3080 Ti and 3090, so we can say that the 4070 is at a fraction of the price of the 3090.

-The 1080 Ti and 2080 Ti were gaming cards; with 11GB of VRAM, you couldn't use them for heavy professional work.
The 3090 was expensive because of the VRAM. Many bought them for professional use instead of a far more expensive Quadro, just for the VRAM. This is also the reason Nvidia doesn't put much VRAM in their GPUs in general, while AMD puts in a truckload of it, even though 95% of it goes unused in the professional world.

-The 4070's price should have been $549. I also expected to see a price drop on the 4080 by now.... Probably not happening because there's absolute silence from AMD HQ.

What does that look like at this point, though? 20% more performance than the 4090 and maybe 20GB of VRAM for $999, lol.......

I doubt we will ever see another 80-class product under $1k, to be honest.

Also, the 4070 is like 20% behind the 3090 at 4K, so even if we throw out the 3090 Ti as the flagship, the 4070 is further behind than the 2070 was vs the 1080 Ti, which in my book was the worst 70-tier product until the 4070 came out. Sure, it's a great 1080p product and a decent 1440p product, but at $600, gamers should expect more.
 