
NVIDIA GeForce RTX 4070 Founders Edition

Do you remember the planet? That place we live on? Power consumption is not only a bill problem…

People are way overcomplicating this for seemingly no other reason than to fuel their make-believe fanboy war.

If card A uses 200 W and card B uses 300 W while giving me a similar amount of performance, I'm going to choose card A almost every time, unless B offers me something drastically better than A, like better upscaling/RT/other features.

This is for 2023 only, in relation to overall performance.

I'm OK with the 400-ish watts my 4090 uses due to the performance it gives me at 4K. I would not be OK with the performance a 7900 XTX gives me at its power usage at 4K in RT-heavy games, regardless of how much cheaper it is. This pertains to me; if you think the alternative is better, good for you.

There are a ton of other factors why someone might choose the lower-power-consumption card, or vice versa. For me, the only thing that really matters is what performance it gives me for the amount of power it uses.
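To put a number on "performance for the amount of power it uses", here's a minimal sketch; the FPS and wattage figures are made up for illustration, not measurements of any real card:

```python
# Rough performance-per-watt comparison with made-up numbers.
cards = {
    "Card A": {"fps": 100, "watts": 200},
    "Card B": {"fps": 105, "watts": 300},
}

for name, c in cards.items():
    # Efficiency here is simply average FPS divided by board power.
    print(f"{name}: {c['fps'] / c['watts']:.2f} FPS per watt")
```

By that measure, a card with similar FPS at 100 W less draw wins the comparison, which is all the point above amounts to.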
 

Yes, you've stated this like 8 times in this thread and every other thread you replied to on an Nvidia topic: how pleased you are with your 4090, in a vacuum.

Let's be honest here: if Nvidia offered 10% more performance than AMD at the top end, while the Nvidia card used 30% more power, you'd still choose Nvidia.

If AMD offered the same performance, and used 15% LESS power, you'd choose Nvidia.

If AMD paid you to use a card at the same performance and power usage, you'd choose Nvidia.
 

I've owned plenty of AMD cards and their processors. I've even commented, after 2 years, on how underwhelming my 3080 Ti has been (although it was the only card I could get at MSRP) due to missing the boat with the 3090. The 6900 XT was $200 more than what I paid at the time, and with the RT performance it just wasn't worth it.

Yes, if they offered 15% more performance than a 4090 while having similar RT performance, even if they were more expensive within reason, I'd own an AMD GPU currently. I was actually extremely disappointed they decided they didn't want to compete in the high end. They did so much better last generation with the 6800 XT/6900 XT vs. the 3080/3090 that I got my hopes up for RDNA 3, and even after the press event I had some hope with their BS 50-70% improvements over the 6950 XT, when in reality it was closer to 35-40%. I hate FSR, but I could have lived with it if they were giving me more performance than what I'm currently getting.

Trust me, I want AMD to provide a viable alternative that competes in raster/RT/features as much as anyone. I'd love to have an all-AMD PC; it took 3 generations for me to get on board with Zen. Maybe RDNA 4 will do the trick on the GPU side. Here's hoping.

Yes, I am pleased with the 4090's performance, but I'm not overly happy about pricing and the way it's going, and that goes for both Nvidia and AMD. I want AMD to do well/better the same as I want Nvidia to do better. I wish the 4070/4070 Ti were better and the 4080 was cheaper, but what I want is irrelevant to these two companies that don't give a shit about us.
 
90% of GPU buyers buy Nvidia even if the cards are more expensive, even if they perform a little worse, even if they consume a little more power, so how is this news? If you throw a rock in the air, the chances of hitting an Nvidia GPU buyer over an AMD one are very high. There are reasons for it, and most of it is AMD's own fault. How is this even related to the VRAM issue?

Damn, these arguments over and over again.
 
That is presuming:

1. Your power supply can handle a 300-400 W GPU (at sub-200 W, the 4070 will easily run on more modest PSUs, which is very important for SFF builds);
2. Your power supply, even if it has the capacity on paper, can handle the transients and spikes that a high-wattage GPU will produce (another problem in itself; see the sketch after this list);
3. Your PC's cooling can manage the 6800 XT's high wattage (the 4070 will exhaust less hot air inside the case due to its lower power consumption);
4. Energy costs remain constant (they are ballooning in Europe, and in a general trend upwards worldwide);
5. You absolutely do not care about features such as DLSS and the NVIDIA ecosystem;
6. You absolutely do not care about resale value in the future; and
7. You have buried your head in the sand, put your foot down, and refuse, with ideological fervor, to accept that DX12 Ultimate and ray-traced games are here to stay.

Then yeah, buying leftover stock of the 6800 XT makes sense.
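To make point 2 concrete, here is a back-of-the-envelope headroom check; the 2x transient factor and 150 W rest-of-system draw are assumptions for illustration, and real transient behavior varies per card (reviews measure it directly):

```python
# Back-of-the-envelope PSU sizing check; inputs are illustrative assumptions.
def psu_ok(psu_watts, gpu_tdp, rest_of_system=150, transient_factor=2.0):
    """Return True if the PSU can plausibly ride out GPU power spikes."""
    worst_case_draw = gpu_tdp * transient_factor + rest_of_system
    return psu_watts >= worst_case_draw

print(psu_ok(650, 200))  # ~200 W card on a 650 W unit: True
print(psu_ok(650, 300))  # ~300 W card on the same unit: False, needs headroom
```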
I think there are also a lot of people like me, willing to buy used video cards. Here in Argentina, the 135,000 pesos (around $360) that I paid for my barely used 6800 XT Nitro+ a week ago would get me a new GTX 1650 at best. My budget was 150,000 ($400). For my other used options, I wasn't willing to buy a 3080 with only 10 GB in 2023 for $400, and the 3080 12 GB was a lot more expensive ($450+) and very hard to find people selling, since they're still great (its power consumption is also even higher than a 3090's, and I wasn't comfortable using it with my RM750x).

If I had the option to buy a used 12-16 GB NVIDIA GPU for a similar or very slightly higher price than what I paid, I would have stayed with NVIDIA. But that just wasn't an option. If you have $400 for a GPU in Argentina and aren't scared to buy used, then between a 3080 10 GB and a 6800 XT 16 GB, the 6800 XT is a no-brainer. Remember that $400 is around 2 months of minimum wage here; that's quite an investment for a GPU, and I can't just upgrade next year when 10 GB gets destroyed by more new PS5 games.
 
Yeah, like creating less e-waste? A card that is only as fast as a 2.5-year-old 3080 10 GB, with only 2 GB more VRAM, which is just enough in some new games and in some cases not enough if you want to set textures to high with RT on. Power consumption isn't that much of a problem, and cost isn't an issue unless you have the PC running 24/7.
Exactly. The people that insist that Nvidia's features are needed to run games aren't even going to consider anything else, and yet when spending $1,000+ on a GPU, power use is somehow an issue.
 

There it is. So are you admitting you're going to buy the fastest, bar none? Why spend time making multiple posts on power usage between node/generation differences to justify a card that's gone up in price, down in SKU tier, and provides historically worse value for a 70-series card?

Also, we don't need to hear how you'd choose your 4090 again; we get it.

Since the 5700 XT, AMD has offered multiple products that are objectively better value and more performant than Nvidia's in the segment where most consumers make a purchase. And I know someone mentioned this in a previous post, and it shall be beaten to death: the majority of the time AMD offers a competing product where it matters, yet consumers can't be assed to understand that 'hurr durr xx90 good, all Nvidia cards best' isn't true. It's sad to see any enthusiast with a decent understanding of hardware justify the increased reaming Nvidia has been giving consumers.
 

We also don't need to hear people crying about others wanting better efficiency, but here we are.

And no, I don't always buy the best/fastest card; "best" is pretty subjective as it is. This generation I really just didn't have much choice if I wanted a new GPU: the 4080 is too close in price to the 4090 (if it was $800-ish I'd have got it instead), the RDNA 3 options have underwhelming RT performance for my use case, and FSR kinda sucks (my opinion only).

Again, for the 50th time, this is what I chose for me, not you. What you decide is your choice. If you love the 7900 XT/7900 XTX, good for you, go for it; they have excellent raster performance and a decent amount of VRAM.

If you're someone who wants the 4080 over both of them, go for it.

I don't like the 4070 this thread is about, but for those that do buy it, I sincerely hope it serves them well for a long time. The same with the 4070 Ti.

Now, if my budget was capped in this $600-800 range, I would probably get a 7900 XT and deal with the meh FSR; only, that card wasn't an option for me to begin with because it was more of a sidegrade as it is.
 
Solid price at $599; it's cheaper to get a PS5 or Series X with a 12-month membership :laugh:
A $599 RTX 4070 in 2023 with the die size of a GTX 570 from 2012:

Just remember the GTX 570 2.5 GB at about $350 (GTX 570 1.25 GB, $299), vs. now.

Come on, LMAO :toast:
 

While I think most console owners would never consider a $600 graphics card, you also have to have a decent enough PC to slot this into to begin with. This offers vastly better performance than the two consoles; it will run most console games at double the framerate, assuming similar settings, so to me they are just not the same market.

I have a PS5 and I love it, but it's nowhere near even the 2-year-old 3080 Ti in my secondary PC, and that's already a midrange-ish GPU by today's standards.
 

A) You brought up efficiency first. I repeat: YOU. Efficiency is good and advancements are expected generation over generation, yet you repeatedly wanted to point out how a previous-generation card on a worse process node, which has offered the same performance for over a year now (technically much longer, but we know what happened with prices), somehow makes this next-generation product a revelation to praise. You should absolutely call out any new product that goes the opposite way, but efficiency improvements gen over gen are not news; they are the norm.

B) All I did was compare two relevant competing cards in the current generation and show how, for the average user (across the US), you're talking about single-dollar differences per year (see the sketch after this post). Then you felt the further need to present personal scenarios as some universal justification on the matter.

C) You leave off with some altruistic scenario of a capped budget, mentioning the competing card would have been your choice, then immediately make a case against it.

You've made it abundantly clear why you chose, and would choose, your 4090. The argument has always been that the efficiency of the 4070 doesn't make it some blessing from Nvidia at a mutually accepted bad price, providing little to no value benefit over cards available 2+ years ago.
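For anyone who wants to check the running-cost point in B for themselves, the math is a one-liner; the wattage delta, daily hours, and electricity rate below are placeholder assumptions, not measurements of any specific pair of cards:

```python
# Annual cost of one card drawing more power than another; all inputs
# are assumptions you can swap for your own usage and local rate.
def yearly_cost_usd(extra_watts, hours_per_day, usd_per_kwh=0.15):
    """Extra electricity cost, in USD per year."""
    return extra_watts / 1000 * hours_per_day * 365 * usd_per_kwh

# e.g. a card drawing 50 W more, gamed on for 2 hours a day:
print(f"${yearly_cost_usd(50, 2):.2f} per year")  # ~$5.48
```

The result scales linearly with all three inputs, so heavier use or pricier electricity changes the picture accordingly.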
 
My expectation was a repeat of Navi 21 vs. GA102, so at the very least on par with AD102, with the usual pitfalls that render it the runner-up. It doesn't come even close to that. As it stands, the 7900 XTX consumes more power than the RTX 4090 (provided you don't buy an artificially limited 2x 8-pin model, by the way; I can make the argument that my 3090 "only" uses 375 W too, but you are leaving performance on the table), and performs substantially worse than the 4090 does in its already hilariously cut-down configuration.

Even their presentations, which are hilariously optimistic, turned out botched beyond repair; they missed these extremely conservative, marketable estimates:

[attachment 291376: AMD's presentation performance estimates]

Reality is closer to a Pyrrhic victory, with a 30% lead over the regular 6900 XT and getting sent to pound sand by the 4090:

[chart: relative performance, 3840x2160]

That's a 47% performance increase over the 6900 XT, not 30%.

And in that same review, it reached 63% more performance in Cyberpunk, 65% more in Watch Dogs: Legion, 54% more in Resident Evil Village, and around 50% more in Metro Exodus, which are some of the estimates there. It's not that bad, even considering the 5% performance difference with the 6950 XT.
 

I've also repeatedly said how much I dislike the 4070; I just wouldn't buy any of the alternatives in its price range IF I was in the market for a $600 card. But I guess that's hard to understand.

The argument people are trying to make (mostly AMD fanboys anyway) is: why buy this when the 6800 XT is $100 cheaper, and even more so if buying used? I stated I would buy this over it due to the better efficiency/better RT/DLSS, yet the only thing AMD fans want to point out is efficiency, for some reason.

Obviously this is not targeting 6800 XT or even 6700 XT owners, so anyone who has those cards should be even more underwhelmed by it. At the same time, a lot of people spent over $1,000 on a 6800 XT and over $800 on a 6700 XT, so it is what it is.

Again, in 2023, all things being equal, I would go with the card that offers a similar amount of performance with the least amount of power draw. The thing is, this isn't apples to apples even vs. the eventual RDNA 3 competitor: this is likely to have better RT, FSR is still kinda junk, and RDNA 3 isn't nearly as efficient as Ada. But AMD at least gives us more VRAM, and that is a really nice thing to have that someone might choose over this, and good for them. Now, if you want to debate the merits of any and all of those, that's fine; people have to look at the different technologies and decide for themselves which vendor is better.

It's not as easy as when I owned a 7970 or 290X; there is a lot more than just raster performance that's important to me, although that is still high up on the list.

I could be wrong, but I never meant to imply what others should or shouldn't buy, even though there are plenty in this thread, both Nvidia owners and AMD owners, that seem to have that sentiment. If anyone looking at this card thinks "this is what's best for me," good for them. If someone asked me whether they should buy this product, I would say "that depends," which is kinda sad, because the 70-tier product used to be a no-brainer. Well, except the 2070, although there are plenty of happy owners of those as well.

At the end of the day, everyone has to vote with their wallet, and for this generation I voted Nvidia. At all the lower tiers that choice is a bit harder, and that is a good thing, although pricing in general is really bad.
I have no issues giving AMD my money; I've purchased a ton of their products over the last 10 years and recommended their lower-end GPUs hundreds of times.

Like I said in one of my previous posts, the ball is in AMD's court. Hopefully they can do better than this; it's not like Nvidia set the bar very high. Hopefully they don't choose just to meet the same bar with their 4070 competitor.
 
The sad thing is, this is their best-value Ada card. In isolation it's decent, but it's basically only 3080 speed at $100 less than that card's RRP when new. And it's 33% dearer than the 3070 for around 25% more performance. That's not progress at all. Only the 4090 has delivered progress in the real sense, and it's still the only card I would buy if I had more money than sense.

Yeah, but in the case of the 4090 there is an additional price increase: that of physical size. I can't take more than 3 slots wide.
 
I'm thanking myself that I stayed with a mid-tower case with plenty of room, so I can fit these behemoth high-end cards if I choose to go with one of those AIB models.
 

Not hard to find a comparison done with a more modern CPU that has a 6950 XT in it to compare against. 35% ish, going by this. The 6950 XT is around 12% faster than the 6900 XT, by the way; again, depending on the games used, that will obviously vary. As far as RT performance differences go, I don't think anyone who really likes RT is buying an AMD GPU. I could be wrong, but at the very least they don't like CP2077 and Witcher 3 NG levels of RT anyway.

[chart: relative performance, 2160p]

source

 
I really appreciate the easy-to-read architecture page: very informative about RT, RT optimisations, and DLSS 3. I was considering an RTX 3xxx series card prior to reading that page, but now, absolutely not. It's RTX 4xxx or nothing. Since RTX 4xxx is out of my budget, that means: nothing. Drink tea, wait.
 

Haven't checked other reviews, but I find that a lot of people misread TPU charts.

If card 2 is 50% slower than card 1, card 1 is 100% faster than card 2.
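The two percentages aren't symmetric because they use different baselines. A quick sketch of the arithmetic (the 68% chart position for the 6900 XT is back-calculated from the 47% uplift quoted above, as an assumption):

```python
# "X% slower" does not invert to "X% faster": the baseline flips.
def pct_faster(a, b):
    """How much faster score a is than score b, in percent."""
    return (a / b - 1) * 100

# Card 1 at 100, card 2 at 50: card 2 is 50% slower,
# but card 1 is 100% faster.
print(pct_faster(100, 50))  # 100.0
# A 6900 XT at ~68% relative to the 7900 XTX (100%) reads as
# a ~47% uplift for the 7900 XTX, not 32%.
print(pct_faster(100, 68))  # ~47.1
```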
 
Igor's Lab is a bit of a drama queen/clickbait master, so I would wait for someone else's confirmation.

I'm going to be stoned again like with the Hardware Unboxed clickbait video, but so be it, I speak the truth :D
 

They did skimp on it pretty hard, just looking at the PCB. Like I said, even PNY's $599 card seems better equipped regardless.


You still crying about 8 GB of VRAM, lol? Heaven forbid people wanting progress.
 
It's pretty simple: some AIB models don't have enough screws to apply proper pressure to the VRM cooling plates, hence the temps. Not sure what there is to clickbait.
 