
NVIDIA GeForce RTX 4070 Founders Edition

In other news....

(attached image)
 
The 4070 being recommended with “awards” from reviewers is like an out-of-season April Fools' joke


3060 Ti ($400) providing performance of a 2080 Super ($700)
3070 ($500) providing performance of a 2080 Ti ($1200)
4070 ($600) providing performance of a 3080 ($700)
4060 ($500) providing performance of a 3070 ($500)
4050 ($400) providing performance of a 3060 ($330)

Nice “progression”
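Putting rough numbers on it (just the MSRPs quoted above, taking "same performance" at face value; napkin math, nothing more), the price cut you get for last gen's performance shrinks every generation:

```python
# Napkin math: how much cheaper is the new card vs. the old card it supposedly matches?
# Prices are the MSRPs quoted above; the 4060/4050 rows are this post's guesses, not released cards.
pairs = [
    ("3060 Ti", 400, "2080 Super", 700),
    ("3070",    500, "2080 Ti",   1200),
    ("4070",    600, "3080",       700),
    ("4060",    500, "3070",       500),
    ("4050",    400, "3060",       330),
]

for new, new_price, old, old_price in pairs:
    drop = 100 * (old_price - new_price) / old_price   # % price drop for the same performance
    print(f"{new} (${new_price}) for {old} (${old_price}) performance: {drop:+.0f}% price drop")
```

That goes from roughly 58% cheaper for 2080 Ti performance down to paying extra for 3060 performance.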

Remember the outrage about the 4080 12GB? They're doing it again, except it's called a "4070", and they'll do it with the 4060 and 4050 too
 
4050 8GB 128-bit vs 3060 12GB 192-bit :shadedshu:

The 4050 and 4060/Ti will be dead even at 1080p High within the next 2 years; last-gen consoles are not relevant for new game IPs.
The 4060 Ti will cost the same as a Series X or PS5.
 
I have to question how important realism is in a game with baby textures. Minecraft is indeed the best example. If you gave me the choice between a highly textured Minecraft with complex shapes (not everything a block) and the standard Minecraft with RTX, I would laugh at the idea of playing the current Minecraft.
It's the difference between a perfectly lit Lego castle and a proper castle model. Sure, each has its charm. But claiming that "realism" is best achieved with lighting... nah. Just nah. I'm not denying your point entirely btw, lighting is the most important component for atmosphere, but atmosphere with 0 detail isn't going to suffice. Good lighting AND good textures. If I have to choose between the two, I'll simply pick the one that seems the most "sturdy", the one that'll work in most cases. That's textures, not lighting.

I agree with this from, I guess, a short-lived RT experience. I think we've got a long way to go in overall graphics development to give RT a real shot at ultra realism... although I admit in some scenes/viewed objects or reflective surfaces RT looks great. Personally I don't hang about in games to observe the lighting effects in still or slow-paced visuals, I'm too busy kicking A-S-S in FPS MP titles. But should game developers sharpen up their act with realistic textures and other aspects of the game with RT enabled... it would still suck, considering the best of the GPUs capable of running this sort of stuff seamlessly will probably cost around $2000-$3000... courtesy of the green monster.
 
And historically speaking, in every technical field, whenever there was a need to choose between two technologies, the older one was always the safe bet. I'm talking telephone vs Discord/Skype, email vs websites, land lines vs wifi... this is not due to poor design of the new tech, but to the fact that the older tech had more experience, more backbone to it, and that the older tech was always built on lesser means and needed to stand on a simpler toolchain.

I trust 100% that if we have to choose between better textures and raster or better raytracing in the coming...10 years honestly, raster and higher quality textures will always be a safer bet. It will appear in more games, it will look better in more games, it will put less of a damper on performance in more games, etc.

I believe I've said this elsewhere (probably Reddit actually), but with Cyberpunk's Path Tracing, we've had an indisputable technological statement from Nvidia:
"If you want Path Tracing, you need a team of Nvidia engineers dedicated to helping your team achieve it, months and years of slow progress with partial raytracing first, and for your users to own a $1600 or above GPU, so they can get 17 FPS, and then use supersampling and frame generation to get to above 60 FPS."

For PT to exist properly in the gaming world, you'll need to:
A) Knock down "a team of dedicated Nvidia engineers" to "an API and documentation that is easy enough to use for most render specialists"
B) Months and years need to become weeks to months (so get PT running as easily as any other feature, not spend a year optimising it enough to be functional)
C) Get that GPU cost to below $700
D) Get those frames to a bare minimum of 30 on that GPU, and use supersampling (and in time, get it to 60, then to 144, and all that)

To me, getting A alone should take a solid year, and getting those render specialists competent with full PT will easily take years. It takes that long for common knowledge and docs to grow to a truly usable degree. Let's say 5 years or so for A to fully materialise.
Getting B is easy once A is completely done; let's say it'll slowly get done as A does.
Getting C will also easily take at least 2, if not 3 new generations of GPUs, so 4 to 6 years, not before.
Getting D is the same as getting C, but you can probably add another gen or two of GPUs for it.

I'm betting that Path Tracing will actually be present in quite a few AAA games in 5 years' time, and still only be an option, because it'll be an elitist thing. Then you'll see it slide down from elitist to commonplace over the next 5 years. Until those whole 10 years are done, I highly doubt that you'll be better off seeking raytracing than seeking good raster/textures/classic game tech, which, by the way, will still keep growing, albeit at an ever slower pace in the fields where raytracing can replace it.

Also nice icon/name lol
 
Many think raytracing in those garbage games is so beautiful, but I disagree, it isn't realistic; nothing in real life has such hard mirror-like reflections as RT ON shows in many games.
At the moment there's only one game whose RT I like, and it's Minecraft. :laugh:

On the other hand, I like sandbox games, and those love GPU RAM like hell.
Arc 770 16GB: constant 60 FPS (GPU RAM usage about 14GB); 3060 12GB: 60 FPS with heavy drops.

RTX 3080 = not playable at the same settings :D
Vega 7 = constant 20 FPS, which is even more playable than on the RTX 3080, but not really.
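That pattern is basically just VRAM headroom: once the game's working set stops fitting, the card starts paging over PCIe and the frame times fall apart. A minimal sketch of that check (the ~14GB figure is from my run above; the VRAM sizes are the usual retail configs, and I'm assuming the 10GB 3080 and that "Vega 7" means the 16GB Radeon VII):

```python
# Does the game's working set fit in VRAM? If not, expect paging and heavy drops.
working_set_gb = 14.0  # observed GPU RAM usage in the sandbox game above

cards = {
    "Arc A770 16GB": 16,    # fits: constant 60 FPS
    "RTX 3060 12GB": 12,    # too small: 60 FPS with heavy drops
    "RTX 3080 10GB": 10,    # assuming the 10GB model: too small, not playable here
    "Radeon VII 16GB": 16,  # fits, so frame times stay consistent (the GPU itself is just slow)
}

for name, vram_gb in cards.items():
    verdict = "fits, frame times stay consistent" if vram_gb >= working_set_gb else "spills over, expect stutter and drops"
    print(f"{name}: {verdict}")
```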
 
We are yet to see the implementation of the new RT features that perform more work per clock. Since the 3080 and 4070 deliver the exact same performance in this Overdrive mode, this isn't the appearance of what I thought that would be just yet. It could take forever, and by then the card would become a paperweight, just like my GTX 760.
 

46 fps with upscaling. Well, it's a nice tech demo at least. Maybe in a few years, after maybe 2 more hardware upgrade cycles, it'll be a reasonable option for high-end buyers, but we're clearly not there yet.

But if you look at some of the CP2077 RT Overdrive videos, you'll see a problem that needs to be solved for full RT to work, and it will take time and money. The lighting job that RT is taking over from baked-in raster lighting doesn't work well yet. That lighting has been carefully optimized for raster, because the vast majority of gamers play with rasterized lighting: their cards can do raster at good framerates and don't have the power, or often the option, for good RT framerates.

So do game studios need to make 2 differently optimized lighting versions of each game so that RT actually looks good like raster does? Do they charge more for the RT-designed lighting, as for the time being that's not only the minority but also likely the subset more able/willing to pay?

I ask because I just spent a couple of hours playing Control, first with RT off, next with it on, and then off again. When you make the change, it's nice to have the RT; it's a visual improvement as you're tooling around. But once you're back into the story/gameplay, the obvious difference with RT on is its reduction in framerate. I have a 6800XT, so my RT framerate hit is greater than on a raster-equivalent 3080 or 4070, and it was getting 55-65 fps at 1440p. I locked Control to 80 fps max to make the raster jump in fps less noticeable, but 55 is very noticeably lower than 80, and that framerate dip has a far larger effect on gameplay and immersion than the RT effects.
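Part of why that dip is so obvious is that frame rate hides how frame times scale: dropping from the 80 fps cap to 55 fps adds nearly 6 ms to every frame. Quick conversion (just the fps figures from my Control session above):

```python
# fps -> frame time in milliseconds; the gap grows quickly as fps falls.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (80, 65, 55):
    print(f"{fps} fps = {frame_time_ms(fps):.1f} ms per frame")
# 80 fps = 12.5 ms, 65 fps = 15.4 ms, 55 fps = 18.2 ms:
# the RT hit adds roughly 3 to 6 ms to every single frame, which you feel constantly.
```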
 
In other news....

Some DigitalFoundry (aka Nvidia's Personal Reviewers) bs
An XTX that would have less performance than a 3070...

These DF guys sure are working for Jensen's paycheck...

Actually scratch that, an XTX with about the perf of a 2080 Ti is even more hilarious.
Digital Foundry sure is a balanced reviewer that isn't trying to make Nvidia look good as much as possible, nor would they be pushing their efforts to make AMD look bad to the point of ridiculous sometimes.
 

This isn't news, AMD's raytracing engine is horrible. 3DMark Speed Way runs slower on a 6950 XT than it runs on a vanilla 3070, so I'm very much willing to give them the benefit of the doubt.

RDNA 3 did not introduce major changes to the ray accelerator design, it just has more units, cache and memory bandwidth to work with.
 
The CP Overdrive mode just shows the difference in performance between AMD and Nvidia GPUs in path tracing.
No game out there is or will be like that.

The 7900 XTX is close to the 3080 in games with RT. Games like CP with many RT effects, not jokes like Far Cry and Resident Evil.

The 7000 series is a safe choice for gamers. Just priced badly.
 
Source please?

Source: Benchmarks and performance of every RT-enabled game and/or benchmark ever, most recently exacerbated by Cyberpunk's new path traced mode which just murders the whole lot

In RT, Radeon GPUs consistently perform at the same level as GeForce cards 2-3 tiers below them, and the 7900 XTX itself barely matches the RTX 3080 Ti/3090's RT performance once overclocked; it's closer to the 3080 12GB if you use a 375W stock reference model (so at this point it's a full gen behind).
 
I haven't seen a single benchmark that ever put the XTX below a 3080. Not even close. Putting it well below a 3070 is just ridiculous. Going by other benchmarks, the XTX should be around the 3080's 44 FPS.
Then again it is an Nvidia tech made by Nvidia for Nvidia with CDPR's blessing, so...perhaps I shouldn't be so surprised.
 

Re-read: I said the 6950 XT (Navi 21) performs like the 3070 in Speed Way; a quick look at their global result browser makes that really obvious... and the XTX performs on par with the 3080/3080 Ti in RT in general, unless it's path traced (Cyberpunk Overdrive), where it quickly runs out of gas and performs substantially worse.
 
I think path-traced Cyberpunk is RTX-optimized, like Portal RTX. Try running Portal RTX on a 7000 series card and it's a slideshow, just like path-traced Cyberpunk.
 
RT is the new PhysX/tessellation. It's so inefficient to do real-time RT in games. Just wait, in a few years it will be baked into the games. I bought into the hype in 2019 and bought an RTX 2080 Ti. The card was great! Tried RT in Control and never turned it back on again.
 

Not a matter of optimization IMO (not that Cyberpunk is great at it), but fully path-traced graphics are still beyond current-generation GPUs. The 3090 being the hardware to target 1080p @ 30 fps proves that; even if Cyberpunk were flawlessly optimized, let's say the 3090 wouldn't do much more than 40 fps at 1080p.

Ada is better, but we're going to begin seeing hardware powerful enough to muscle through that with Blackwell (50 series), if not the architecture that comes after it. Portal RTX lags for a similar reason, although I do strongly believe that NVIDIA could have done better by Ampere users in that specific instance.
 

Big ticket 50-series/60: most buyers will be more concerned about the muscle required to push their hard-earned cash into Nvidia's pockets... unfortunately I've hit my excitement threshold with forward-gen possibilities. I spend more time nowadays path tracing possibilities for current-gen cards to drop in price (at a considerable scale). An efficient 4080-level card for $800 is all the light (wallet-weight) realism I need. I get it, people in closed-off basements or attics might differ in opinion, but I've got 2 windows in my room, and with the curtains pushed aside I get plenty of natural RT/PT rolling in. Actually, come to think of it, thank God there's no green tax on peripheral-vision global illumination lol (Nvidia's taking notes: we must eclipse the sun)

I'm still unconsciously waiting for an RTX-4080-NO-BS variant for less. Makes sense considering RT/PT is hardly going to be mainstream anytime soon.
 
You can check hardware RT on Unreal Engine: a 4090 is 3 times as fast as the 7900 XTX, and no one can claim Unreal is Nvidia-optimized.

This is from Puget's review. But somehow people in this forum will insist the difference in RT is 16%. What can you do, AMD defenders just hate reality.

(attached chart: Unreal_40Series_raytrace.png)
 

UE4 tends to favor Nvidia in actual released games. I'd be more interested in a similar benchmark with UE5, which seems more hardware-agnostic than 4. I guess only time will tell, once we actually get some released UE5 games that support a decent amount of RT.

Right now all we have is Fortnite, and while the hardware RT is impressive vs the software implementation, it's not exactly super heavy and imo not on the level of CP2077.


Most still don't care about RT but that likely comes down to the majority of people not having hardware fast enough to support it while getting decent performance.

The 4070, while capable of RT, isn't a card I'd classify as good enough to want to use it with.
 

Mate, you can't be serious?

UE4 and Nvidia's partnership, or their multiple partnerships, is an old story... in an official capacity (one example: UE uses Nvidia PhysX). AMD has been seen to eventually polish up code/driver support and somewhat close the gap to an already Nvidia-refined UE4 package. Check some of Puget's previous articles on RT; I'm certain he's mentioned these partnerships/leanings, or AMD's initial driver discrepancies. I would rather stick with independent reviewers like TPU/Jarrod/HU/etc., where the benchmarks look a little more realistic, spread across more than just one game engine. You can't blame Puget though... it is what it is: a UE4 take in real time, at a given point in time.
 