
NVIDIA GeForce RTX 4070 Founders Edition

Even with all that, this card is extremely meh compared to pretty much any previous 70-class product. I think even the meh-AF 2070 was more exciting.
I agree. $600 is too much to swallow.
$549 would be more realistic.

I disagree on the 2070... that was a joke of a card, as was the 2080 (I had one).
The 3070 was a dream because it matched the 2080 Ti (I had one too) at $499.
 

Idk, even the meh 2070 offered performance similar to the 1080 Ti. This doesn't even come close to the previous gen's flagships and really looks like a 3060 Ti replacement, even more so now that the 4070 Ti looks like the real 4070.

I'm ok with the 4080 being called an 80-class product, but Nvidia really did shift everything below it down a tier while charging more money... I guess we'll see how bad the 4060 Ti is next.
 
The 2070 and 3070 matched, but didn't exceed, the previous flagships, back when the x80s were significantly slower than them:
1080 vs. 1080 Ti, and 2080 vs. 2080 Ti.

The 4070 matches the 3080, while all the 3080s and 3090s, except for the 3090 Ti, perform within 5% of each other.

But I get what you say and it makes sense.
 
Well, it's not "the same". It outputs about the same performance level, but it only draws two-thirds the power to do so.
Are Nvidia's cheerleaders back to pretending to care about power consumption now then, after spending the last year or so defending 500W+ cards as normal and necessary? It's hard to keep track since the narrative on that changes so often, depending on which card needs to be defended that month.

Speaking personally, I don't consider shaving a third off the power for the same performance (often less above 1080p) to be at all impressive after two and a half years, for $100 less. The 1070 shaved a third off the 980 Ti's power consumption, whilst having 2GB more VRAM, being faster across the board and costing $270 less. That was impressive. This is complete stagnation, and the fact that people are actually defending it is depressing really. Jensen won. People are mindbroken and will accept anything at this point.
 
Yeah. The 2070 was +27% over the 1070 and you got to play around with RT (and no matter what people say around these parts, with DLSS you could easily play any 2019 RT game at 1440p unless you were intent on setting everything else to ultra); here it's just +20% and you get nothing. Absolute BS, but it's still the best one can get new at $599. Though I suppose, for people who already have fat PSUs and only play raster games, a 6950 XT for an extra 30-50 bucks might be another way.
 
what happened to the Borderlands 3 bench?
Too old and CPU-limited; replaced by other games in January. Lots of UE4 titles in our bench anyway.

any plans to add CS2?
Not sure yet

Are Shorts bad, economically speaking?
Yes, the YT algorithm prefers videos that are longer than 20 minutes, which is why everyone is making those long videos

It seems the AIBs agree, because every AIB 4070 card reviewed today is the same price as the FE, at $600.
As mentioned in the text, it's a requirement by NVIDIA. Reviews today are for MSRP only. Reviews for more expensive cards can only go live tomorrow
 
I don't feel so bad paying $690 for my 6950 XT now.
Felt amazing snagging the ASRock OC Formula variant for $700 on Black Friday.
 

40-series pricing as a whole is the only reason this looks appealing at all, with the 4080 literally costing twice as much. It's not a bad card, but it really doesn't move the needle forward much from products that came out over 2 years ago.

In a vacuum it's still the only NVIDIA card near its price worth buying, and honestly that makes me kinda sad for what I consider the new midrange, really.
That MSRP-only review requirement is hands down the best thing NVIDIA has done for this launch.
 
"NVIDIA has set a base price of $600 for the GeForce RTX 4070 Founders Edition, which is an alright price given the current GPU pricing landscape, but $100 more expensive than the launch-price of RTX 3070 and RTX 2070."

Alright price?

Compared to the RTX 3070 it's 27.6% faster at 1080p, 27.4% faster at 1440p and 25.4% faster at 4K. Those used to be the performance increases of mid-life upgrades, like the RTX 3070 Ti launching a year later for the same price.

Performance per dollar chart says it all:

(attached: performance-per-dollar-3840-2160.png — performance per dollar, 4K)
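A quick sanity check of the math above, using the thread's own numbers (the 27.4% uplift at 1440p and the $499 → $599 launch prices); the helper function is just for illustration:

```python
# Rough perf-per-dollar check using figures quoted in the thread:
# RTX 4070 is ~27.4% faster than the RTX 3070 at 1440p,
# launch prices $599 vs $499 (the $100 gap mentioned in the review text).
def perf_per_dollar_gain(speedup: float, new_price: float, old_price: float) -> float:
    """Relative perf-per-dollar change, e.g. 0.06 means +6%."""
    return (1.0 + speedup) * (old_price / new_price) - 1.0

gain = perf_per_dollar_gain(0.274, new_price=599, old_price=499)
print(f"perf/$ change: {gain:+.1%}")  # roughly +6%, not +27%
```

In other words, almost the entire raw-performance uplift is eaten by the price hike, which is what the chart shows.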
 
100.628437%... 0.628% faster in classic raster, while being 22% slower in ray tracing, the standard being adopted by all modern game engines moving forward, and consuming 100 W more when gaming. Seems like a wise choice to go for the "winning" RX 6800 XT!

That's also not taking into account DLSS and frame generation.

It will vary by source, but out of the top 20 most popular games being played, the only title with a relevant difference in RT performance is Hogwarts Legacy. Even then, performance is horrible without upscaling thrown into the mix, as usual.

RT isn't the standard to measure anything by, and won't be for another few years. True ray/path tracing is a while out, and we definitely don't have any mainstream hardware capable of it.

OT

Another poor-value card to bend consumers over. If this price-increase/tier-drop trend continues into next gen, most people aren't going to be able to afford PC hardware, if a 5060 alone ends up costing more than a console.
 
Why are you quoting an RT graph? 99.9% of games out there are non-RT. Please stop this nonsense multi-billion-dollar-company defending, it's shameless.
Because the 200 W 4070 is effectively the same performance as the 300 W 6800 XT (no, the 6800 XT being 0.628% faster in classic raster isn't meaningful) while having more than 20% higher RT performance and access to DLSS/FG. So it's being quoted because that's exactly what's different, obviously.
 
Why are you quoting an RT graph? 99.9% of games out there are non-RT. Please stop this nonsense multi-billion-dollar-company defending, it's shameless.
Because these are the games you expect to play when you pay $600+.
 
all this whinging about price tends to suggest to me that a growing number of PC gamers can't really afford their chosen hobby.. unfortunately I don't see this situation getting any better..

trog
 
Because these are the games you expect to play when you pay $600+.
??? The RTX 4090 can't even path trace in CP2077. Again, stop what you are doing. It makes 0% sense.
 
I can't speak for others, but I've always cared about power draw. I don't care about the high end; those cards are more like "look ma', I can do this!", and they may come with their own dedicated reactor for all I care. But I've always bought mid-rangers, largely because their power draw is seldom an issue.
Of course, I'm not going to pick AMD over Nvidia or vice versa over a 10 W difference. But in this case, we're looking at almost 100 W while gaming.
 
200 W or 300 W doesn't matter. It will make less than 50 bucks of difference at the end of the year. My RX 6900 XTX does not go over 300 W with clocks at 2800 MHz. The RX 6800 XT does not consume 300 W; spikes, perhaps. Why are you people quoting fake frames and upscaling? So you are willing to buy a brand new GPU so it can run your games at a lower res just to upscale it? Make it make sense, please. Stop being fanboys. It doesn't help anybody.
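The "less than 50 bucks a year" claim about the 100 W gap checks out under typical usage; the hours-per-day and electricity price below are assumptions for illustration, not thread numbers:

```python
# Annual running-cost delta for a 100 W power-draw difference.
# Assumed usage and electricity price -- adjust for your locale.
def annual_cost(delta_watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    kwh_per_year = delta_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

cost = annual_cost(delta_watts=100, hours_per_day=3, price_per_kwh=0.15)
print(f"~${cost:.0f}/year")  # about $16 at these assumptions
```

Even at double the gaming hours or double the electricity price, the delta stays well under $50/year.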

The RX 6800 XT will never run a constant 300 W. My RX 6900 XTX under water, while OC'ed, doesn't run 300 W.
 
??? The RTX 4090 can't even path trace in CP2077. Again, stop what you are doing. It makes 0% sense.
That's normal. Not even the next 3 generations will be able to do so in complex-geometry environments.
That's why RT was invented.

But you have to admit it's admirable that a GPU which once needed 5-6 hours to process a single frame now processes 10-15 frames in a second.
That's something.
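For scale, the speedup implied by the claim above (midpoint figures below are assumptions for illustration):

```python
# From ~5-6 hours per ray-traced frame (offline rendering)
# to ~10-15 frames per second (real-time RT).
seconds_per_frame_then = 5.5 * 3600   # ~5.5 hours per frame, in seconds
frames_per_second_now = 12.5          # ~12.5 fps midpoint
speedup = seconds_per_frame_then * frames_per_second_now
print(f"~{speedup:,.0f}x faster")  # on the order of a quarter million times
```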
 
The RX 6800 XT will never run a constant 300 W. My RX 6900 XTX under water, while OC'ed, doesn't run 300 W.
Take it up with @W1zzard then. His benchmarks say it does.
 
all this whinging about price tends to suggest to me that a growing number of PC gamers can't really afford their chosen hobby.. unfortunately I don't see this situation getting any better..

trog

Idk, I can easily afford my 4090 and still think this is meh AF.
 
Because the 200 W 4070 is effectively the same performance as the 300 W 6800XT (no, the 6800XT being 0.628% faster in classic raster isn't meaningful) while having more than 20% higher RT performance and access to DLSS/FG. So it's being quoted as it is what is different, obviously.
Really? My 6800 XT does not go over 255 W, so try again.
 
Take it up with @W1zzard then. His benchmarks say it does.
W1zzard, who refused to put the RX 6950 XT on the charts? Or, for that matter, the RTX 2070 Super? He doesn't even bother retesting older cards with newer drivers to see the improvements from both camps.
 
I don't see how any of that is relevant, but yes, him.
 