
NVIDIA GeForce RTX 4070 Founders Edition

Joined
Jan 20, 2019
Messages
1,377 (0.68/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
I was in your boat: real-world performance matters, efficiency matters.

This is probably the most valuable thing you've said since Dorothy hit Fred on the head with a loaf of bread.

I'm all for efficiency and lower power consumption in top-tier graphics products and processors. nV has absolutely won me over with their current line of products, or to be precise the 4080 and upwards (not interested in mid-tier cards). Impressive stuff for "window shopping" but not so impressive for "shopping". For me the 40-series upper tier is like a $150k+ sports car exhibition show. Take it all in baby, but no touchy smuchy you pitiful deprived poor bastuud. Yep, nV is rude lol. So as far as I'm concerned, nV can take their impressive products and that fat treacherous price tag they've got flaunting its riches and stick it where the sunshine don't shine.

For your info, this generation is the first time I've been strongly considering shifting over to AMD's 7000 series. Yep, I'd rather not pay the green tax in exchange for somewhat higher power consumption offerings (and yep, undervolting considered). Only I want 7900 XTX/4080-level performance and I'm not willing to pay a dime over $800 (well, a little north of 800 won't bite). Can't see the 4080 dropping anywhere near 800, but the XTX... dunno? Maybe?

As for all the other features - nah not interested!
 

MerrHeLL

New Member
Joined
Apr 6, 2022
Messages
6 (0.01/day)
"Loses"

:D

New cards will always cost more than used cards bro.

Ray tracing is like 3 games... if those are yours and you care, this card screams get a 4070 Ti or 4080. AMD's picture quality and visuals on anything but ray tracing are superior to NVIDIA's, but get set aside by the FPS fanatics. If your GPU pushes your game faster than your monitor refresh, it simply doesn't matter.

I've had most RTX cards, 3060 Ti, 3070, 3070 Ti, 3080, and of these the 3060 Ti and 3070 are the best. I'm now running an XFX SPEEDSTER MERC 310 AMD Radeon RX 7900 XTX. It's quiet and just crunches through everything with ease. One thing AMD has over Nvidia is fast alt-tab switching. You're in a game, die and want to check on stuff... Nvidia has that pause/flicker weirdness that annoys; AMD has instant alt-tab access to other apps, 2nd and 3rd screens, etc. I swapped my 3080 out for an RX 6700 XT for that reason and was happier.

When RTX 4080s couldn't be found for less than $1500, which was the case up until a month or so ago, I picked up the 7900 XTX for a steal ($1050). It has been the perfect card for me. As someone who games and also does video editing and Photoshop photography color mapping, AMD simply has more accurate color. I do wish reviewers would look at more than just FPS. Full disclosure: TechPowerUp is my favorite hardware review site. I think they give the most complete reviews and the most comparable information with their large datasets. This review is also a good one.

My take: $600 for an RTX 3070 upgrade simply isn't worth it. Stick with the 3070 and wait, either for next gen or for a 4070 Ti to fall to $600 or less.

This is probably the most valuable thing you've said since Dorothy hit Fred on the head with a loaf of bread.

I'm all for efficiency and lower power consumption in top-tier graphics products and processors. nV has absolutely won me over with their current line of products, or to be precise the 4080 and upwards (not interested in mid-tier cards). Impressive stuff for "window shopping" but not so impressive for "shopping". For me the 40-series upper tier is like a $150k+ sports car exhibition show. Take it all in baby, but no touchy smuchy you pitiful deprived poor bastuud. Yep, nV is rude lol. So as far as I'm concerned, nV can take their impressive products and that fat treacherous price tag they've got flaunting its riches and stick it where the sunshine don't shine.

For your info, this generation is the first time I've been strongly considering shifting over to AMD's 7000 series. Yep, I'd rather not pay the green tax in exchange for somewhat higher power consumption offerings (and yep, undervolting considered). Only I want 7900 XTX/4080-level performance and I'm not willing to pay a dime over $800 (well, a little north of 800 won't bite). Can't see the 4080 dropping anywhere near 800, but the XTX... dunno? Maybe?

As for all the other features - nah not interested!
That's exactly how I felt. Got lucky on an XFX SPEEDSTER MERC 310 AMD Radeon RX 7900 XTX for $1050 and I am loving it. RTX 4080s couldn't be found for less than $1500 when I bought. I've seen a decent Gigabyte 4080 for $1100 at Newegg lately. You can find a 7900 XT around $800 at times, but the ones I like haven't gone for less than $899.
 
Joined
Mar 10, 2023
Messages
30 (0.06/day)
Here's a game using UE5 RT, to provide more info about actual in-game performance:
[attached: in-game benchmark chart]
I guess you forgot the fact that Lumen is not real ray tracing, so this graph doesn't really matter in terms of ray tracing performance. Not to mention you can use frame generation here to get roughly 50% more fps with NVIDIA... so this AMD performance is quite weak.
 

3x0

Joined
Oct 6, 2022
Messages
935 (1.42/day)
Processor AMD Ryzen 7 5800X3D
Motherboard MSI MPG B550I Gaming Edge Wi-Fi ITX
Cooling Scythe Fuma 2 rev. B Noctua NF-A12x25 Edition
Memory 2x16GiB G.Skill TridentZ DDR4 3200Mb/s CL14 F4-3200C14D-32GTZKW
Video Card(s) PowerColor Radeon RX7800 XT Hellhound 16GiB
Storage Western Digital Black SN850 WDS100T1X0E-00AFY0 1TiB, Western Digital Blue 3D WDS200T2B0A 2TiB
Display(s) Dell G2724D 27" IPS 1440P 165Hz, ASUS VG259QM 25” IPS 1080P 240Hz
Case Cooler Master NR200P ITX
Audio Device(s) Altec Lansing 220, HyperX Cloud II
Power Supply Corsair SF750 Platinum 750W SFX
Mouse Lamzu Atlantis Mini Wireless
Keyboard HyperX Alloy Origins Aqua
I guess you forgot the fact that Lumen is not real ray tracing, so this graph doesn't really matter in terms of ray tracing performance. Not to mention you can use frame generation here to get roughly 50% more fps with NVIDIA... so this AMD performance is quite weak.
If Lumen isn't "real" ray tracing, then I guess Frame Generation isn't real frames.
 
Joined
Aug 25, 2015
Messages
188 (0.06/day)
Location
Denmark
System Name Red Bandit
Processor AMD Ryzen 7 7800X3D
Motherboard ASUS PRIME X670E-PRO WIFI
Cooling Mo-Ra3 420 W/4x Noctua NF-A20S - 2xD5's/1xDDC 4.2
Memory G.SKILL Trident Z5 NEO EXPO 6000CL30/3000/2000
Video Card(s) GIGABYTE RX 6900 XT Ultimate Xtreme WaterForce WB 16GB
Storage Adata SX8200 PRO 2TB x 2
Display(s) Samsung Odyssey G7 32" 240HZ
Case Jonsbo D41 Mesh/Screen
Audio Device(s) Logitech Pro X Wireless
Power Supply Corsair RM1000e v2 ATX 3.0
Mouse Logitech G502 Hero
Keyboard Corsair K70 MX RED Low profile
Software W11 Pro
I guess you forgot the fact that Lumen is not real ray tracing, so this graph doesn't really matter in terms of ray tracing performance. Not to mention you can use frame generation here to get roughly 50% more fps with NVIDIA... so this AMD performance is quite weak.
Lol, and frame generation is "real" frames? God damn, the copium is high with Nvidia bois this generation!
 
Joined
Apr 14, 2022
Messages
688 (0.82/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Asus XG35VQ
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Software Windows 11 64bit
Software RT, like Lumen, the Northlight engine's implementation (Quantum Break), or CryEngine's (Crysis), is real RT.
The fact that a game doesn't make the most of NVIDIA's RT cores doesn't mean the implementation gives different results.
It's still RT.
In these cases, the 7900 XT/XTX are competitive with NVIDIA's high-end GPUs.

The problem with software RT is that it is more expensive and needs a BFGPU to run properly, while mid-range cards are doomed to fail.
 
Joined
Mar 10, 2023
Messages
30 (0.06/day)
Lumen is a hybrid solution that uses ray tracing only for certain parts of the picture; that's why it can run well on GPUs that don't even support ray tracing properly.

But this probably only works well if the ray tracing used is low level, so the performance penalty isn't too big. Because if the GPU spends everything on calculating a heavy ray tracing implementation, there are no resources left for the other stuff... and probably really high power consumption and memory usage too.
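
To make the hybrid idea concrete, here is a small, self-contained Python sketch of the general fallback pattern such systems use (all function names, inputs and return values below are invented for illustration; this is not Epic's actual Lumen code): try a cheap screen-space trace first, fall back to a software trace against a simplified proxy scene, and only go to the hardware ray tracing path when RT hardware is actually being used.

```python
# Toy, self-contained sketch of the hybrid fallback idea behind software/hybrid RT.
# The three "tracers" are trivial stand-ins (hypothetical), not real rendering code;
# the point is only the order in which a hybrid system like Lumen tries them.

def screen_space_trace(ray):
    # Cheapest path: only finds hits for geometry already visible on screen.
    return "screen-space hit" if ray.get("visible_on_screen") else None

def sdf_trace(ray):
    # Software path: trace a simplified proxy scene (e.g. signed distance fields)
    # on ordinary shader cores, so it works without RT hardware.
    return "proxy-scene hit" if ray.get("in_proxy_scene") else None

def hardware_bvh_trace(ray):
    # Hardware path: trace the full triangle BVH on dedicated RT cores.
    return "full-geometry hit"

def trace_gi_ray(ray, hw_rt_enabled=False):
    hit = screen_space_trace(ray)            # 1) cheap screen trace first
    if hit is None:
        hit = sdf_trace(ray)                 # 2) software fallback
    if hit is None and hw_rt_enabled:
        hit = hardware_bvh_trace(ray)        # 3) hardware RT only if available/enabled
    return hit or "miss -> sky/ambient fallback"

print(trace_gi_ray({"visible_on_screen": False, "in_proxy_scene": True}))
print(trace_gi_ray({"visible_on_screen": False, "in_proxy_scene": False}, hw_rt_enabled=True))
```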
 

3x0

Joined
Oct 6, 2022
Messages
935 (1.42/day)
Processor AMD Ryzen 7 5800X3D
Motherboard MSI MPG B550I Gaming Edge Wi-Fi ITX
Cooling Scythe Fuma 2 rev. B Noctua NF-A12x25 Edition
Memory 2x16GiB G.Skill TridentZ DDR4 3200Mb/s CL14 F4-3200C14D-32GTZKW
Video Card(s) PowerColor Radeon RX7800 XT Hellhound 16GiB
Storage Western Digital Black SN850 WDS100T1X0E-00AFY0 1TiB, Western Digital Blue 3D WDS200T2B0A 2TiB
Display(s) Dell G2724D 27" IPS 1440P 165Hz, ASUS VG259QM 25” IPS 1080P 240Hz
Case Cooler Master NR200P ITX
Audio Device(s) Altec Lansing 220, HyperX Cloud II
Power Supply Corsair SF750 Platinum 750W SFX
Mouse Lamzu Atlantis Mini Wireless
Keyboard HyperX Alloy Origins Aqua
Lumen is a hybrid solution that uses ray tracing only for certain parts of the picture; that's why it can run well on GPUs that don't even support ray tracing properly.

But this probably only works well if the ray tracing used is low level, so the performance penalty isn't too big. Because if the GPU spends everything on calculating a heavy ray tracing implementation, there are no resources left for the other stuff... and probably really high power consumption and memory usage too.
What is real ray tracing then? Most games are hybrids of raster and ray tracing, even Cyberpunk, which got updated recently to support more ray tracing options. Quake II RTX and similar fully ray-traced/path-traced games barely worked on RTX GPUs because of how demanding they were to run, and they had very low model detail and ray sampling to offset the performance issues.

I don't care what "real" ray tracing is; I only care about what game developers are going to implement as ray tracing in the future. Unreal Engine 5 is most likely going to be the dominant/most used engine going forward, and Lumen is probably going to be the go-to option for developers' ray tracing implementations (considering it has to run on consoles as well).
 

bug

Joined
May 22, 2015
Messages
13,487 (4.02/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
What is real ray tracing then? Most games are hybrids of raster and ray tracing, even Cyberpunk, which got updated recently to support more ray tracing options. Quake II RTX and similar fully ray-traced/path-traced games barely worked on RTX GPUs because of how demanding they were to run, and they had very low model detail and ray sampling to offset the performance issues.

I don't care what "real" ray tracing is; I only care about what game developers are going to implement as ray tracing in the future. Unreal Engine 5 is most likely going to be the dominant/most used engine going forward, and Lumen is probably going to be the go-to option for developers' ray tracing implementations (considering it has to run on consoles as well).
There's a trend currently where games limit their RT usage because where Nvidia has a hard time handling it, AMD just tanks. When doing it all in software, you can imagine the number of things that get the RT treatment is even more limited. And the number of rays used must be really, really low.
In a way, it's like gaming at 720p or lower. You would still call it 3D gaming/rendering, but really...
 
Joined
Mar 10, 2023
Messages
30 (0.06/day)
In my opinion this is a step back: with RT on, the clearly stronger card gives almost the same fps as the weaker one. It is basically a simplified version of ray tracing that runs well on the outdated architecture, with probably a very limited number of rays, and it doesn't utilize the more modern tech. Of course it looks fine in tech demos, where they fully optimize and utilize the engine, but to be honest even the UE4 tech demo is still a pipe dream; no one has ever reached that level. I don't know too much about the popularity of Unreal Engine, but most AAA games aren't using it, so UE5 will probably only be popular with some titles. Unreal is usually a cost-saving choice for making games, so no one really bothers with utilizing its capabilities fully.
 
Last edited:

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
4,716 (1.96/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans removed
Cooling Optimus Block, HWLABS Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 2x A4x10, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front panel pump/res combo
Audio Device(s) Audeze Maxwell Ultraviolet, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU Redux Burgundy w/brass weight, Prismcaps White, Jellykey, lubed/modded
Software Windows 10 IoT Enterprise LTSC 19044.4046
Benchmark Scores Legendary
Well, programming caters to the weakest link, and unfortunately AMD-powered consoles are the biggest link (and the weakest), so despite NVIDIA owning the PC market share, their far superior RT is rarely used to its strengths, since all modern game publishers care only about profit margins, not about releasing the best games they can.

I wouldn't expect this to change much until AMD releases a generation of cards with actually competent RT hardware in consoles, rather than shader cores doing the work for pseudo-RT in software.

Until then, be happy that games like CP:2077 allow you to stretch the legs of your Ada-generation card while an RDNA3 card runs it at single-digit frame rates (if at all). There will be others; it just won't be the norm.

There are certainly plenty of games where RT has a massive influence on immersion and QoL, such as Minecraft (compare ray tracing on vs. off, it's practically a different game) or Minecraft 2.

Deep Rock Galactic would be a prime candidate for Path Tracing too, but unfortunately the developers are more than happy to cash in on their success and just do minor content updates without much real technical innovation or effort.
 
Joined
Mar 10, 2023
Messages
30 (0.06/day)
Well, programming caters to the weakest link, and unfortunately AMD-powered consoles are the biggest link (and the weakest), so despite NVIDIA owning the PC market share, their far superior RT is rarely used to its strengths, since all modern game publishers care only about profit margins, not about releasing the best games they can.

I wouldn't expect this to change much until AMD releases a generation of cards with actually competent RT hardware in consoles, rather than shader cores doing the work for pseudo-RT in software.

Until then, be happy that games like CP:2077 allow you to stretch the legs of your Ada-generation card. There will be others; it just won't be the norm.
I just hope that at least one of the two next-gen consoles won't be using AMD chips, because their GPUs are like a stone axe, with literally almost no innovation in the last 4-5 years. They really limit the game industry; this Lumen stuff was probably also born out of AMD's incompetence.
 
Last edited:

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
4,716 (1.96/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans removed
Cooling Optimus Block, HWLABS Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 2x A4x10, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front panel pump/res combo
Audio Device(s) Audeze Maxwell Ultraviolet, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU Redux Burgundy w/brass weight, Prismcaps White, Jellykey, lubed/modded
Software Windows 10 IoT Enterprise LTSC 19044.4046
Benchmark Scores Legendary
I just hope that at least one of the two next-gen consoles won't be using AMD chips, because their GPUs are like a stone axe, with literally almost no innovation in the last 4-5 years.
They're cheap and good enough. Remember, most people have never seen a high-end PC rendering real path tracing with their own eyes, so they don't really know what they're missing.

Same with the AMD fanboys: they'll give you some airy answer like "not that impressed", but their experience was on something like a 20-series RTX with first-gen games, or some laptop with a 3060, probably at 1080p, and probably in a game with tick-box "ray tracing" rather than anything done with actual effort.

Unfortunately PC gaming doesn't drive innovation anywhere near as much as it could, despite the hardware existing.

Mobile gaming and consoles are where the $$$ are in R&D.
 

bug

Joined
May 22, 2015
Messages
13,487 (4.02/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
They're cheap and good enough. Remember, most people have never seen a high-end PC rendering real path tracing with their own eyes, so they don't really know what they're missing.

Same with the AMD fanboys: they'll give you some airy answer like "not that impressed", but their experience was on something like a 20-series RTX with first-gen games, or some laptop with a 3060.
Let's be honest here, not even Nvidia's hardware is able to push RT properly yet. It's getting there, but clearly having just two players, with one of them lagging, isn't going to do us any favors.
 
Joined
Mar 10, 2023
Messages
30 (0.06/day)
You can use frame generation, which gives roughly 50% more fps, so even with RT most games are playable even at 4K.
And at the WQHD sweet spot basically every game can be played at high fps.
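
For a rough sense of where that ~50% figure comes from, here's a back-of-the-envelope calculation; the base frame rate and the overhead fraction are assumptions purely for illustration, not measured numbers.

```python
# Back-of-the-envelope frame-generation math with made-up example numbers.
base_fps = 60.0          # fps the GPU renders without frame generation (hypothetical)
gen_overhead = 0.25      # assumed fraction of render throughput lost to the generation pass

rendered_fps = base_fps * (1 - gen_overhead)   # real frames still rendered per second
presented_fps = rendered_fps * 2               # one generated frame per rendered frame

print(f"rendered {rendered_fps:.0f} fps, presented {presented_fps:.0f} fps "
      f"({presented_fps / base_fps - 1:+.0%} vs. no frame generation)")
# -> rendered 45 fps, presented 90 fps (+50% vs. no frame generation)
```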
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
4,716 (1.96/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans removed
Cooling Optimus Block, HWLABS Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 2x A4x10, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front panel pump/res combo
Audio Device(s) Audeze Maxwell Ultraviolet, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU Redux Burgundy w/brass weight, Prismcaps White, Jellykey, lubed/modded
Software Windows 10 IoT Enterprise LTSC 19044.4046
Benchmark Scores Legendary
Let's be honest here, not even Nvidia's hardware is able to push RT properly yet. It's getting there, but clearly having just two players, with one of them lagging, isn't going to do us any favors.
I disagree. The 4080/4090 can push 4K with path tracing in CP:2077; sure, it's using DLSS to do that at reasonable frame rates, but the IQ is still vastly better than 1080p native path tracing, or literally any other game in existence.

Obviously next-gen cards will be better, but there's more than enough oomph in even last-gen NVIDIA cards to offer a good RT experience. E.g. I can happily turn every RT setting on and up to max in most games with my 3080 Ti (not talking about full RT/path tracing) without using DLSS. In games without RT I turn DLAA on if it exists.

The issue is that most people are on xx60-tier hardware, not that current-gen cards "can't do RT".
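
For context on why DLSS makes 4K path tracing workable at all, here is a quick pixel-count comparison; the per-axis scale factors used (Quality ≈ 0.667x, Performance = 0.5x) are the commonly cited DLSS render scales, so treat the exact percentages as approximate.

```python
# Pixels rendered internally per frame before DLSS upscales to 4K output.
native_4k = 3840 * 2160                        # 8,294,400 pixels

modes = {
    "DLSS Quality (~0.667x per axis)": 2560 * 1440,
    "DLSS Performance (0.5x per axis)": 1920 * 1080,
}

for name, pixels in modes.items():
    print(f"{name}: {pixels:,} px rendered, about {pixels / native_4k:.0%} of native 4K")
# Quality renders roughly 44% and Performance roughly 25% of the native pixel count,
# which is where most of the headroom for path tracing comes from.
```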
 
Joined
Mar 10, 2023
Messages
30 (0.06/day)
And even now, I often see people recommend something like a 6800 XT as a good alternative... and then later they cry about ray tracing... can't do it / don't need it. :D
 

bug

Joined
May 22, 2015
Messages
13,487 (4.02/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
You can use frame generation, which gives roughly 50% more fps, so even with RT most games are playable even at 4K.
And at the WQHD sweet spot basically every game can be played at high fps.

I disagree. The 4080/4090 can push 4K with path tracing in CP:2077; sure, it's using DLSS to do that, but the IQ is still vastly better than 1080p native path tracing, or literally any other game in existence.

Obviously next-gen cards will be better, but there's more than enough oomph in even last-gen NVIDIA cards to offer a good RT experience. E.g. I can happily turn every RT setting on and up to max in most games with my 3080 Ti (not talking about full RT/path tracing) without using DLSS. In games without RT I turn DLAA on if it exists.

The issue is that most people are on xx60-tier hardware, not that current-gen cards "can't do RT".
What I meant is we still have this segmentation of global illumination, per-object effects, and include/don't-include reflections. Current cards still can't push all of them at the same time (using a meaningful number of rays), so this segmentation helps developers toggle RT bits on/off.
The 4080/4090 may have it easier, but since very few people can/will pay that much for a video card, those cards won't push RT forward in a meaningful way either. Good RT performance will really need to trickle down to x60 cards before that happens. But that's OK. We've waited decades for RT; what's 3-5 more years?
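
That "toggle RT bits on/off" segmentation can be pictured as a per-effect frame-time budget. Here is a minimal, illustrative sketch; the effect list, cost numbers and budget are entirely made up, not taken from any engine or benchmark.

```python
# Hypothetical per-effect RT costs in milliseconds at some fixed resolution.
rt_effects = {
    "reflections": 3.0,
    "shadows": 2.0,
    "global_illumination": 6.0,
}

def pick_rt_effects(frame_budget_ms, raster_cost_ms, priority):
    """Enable RT effects in priority order while the frame stays under budget."""
    enabled, spent = [], raster_cost_ms
    for effect in priority:
        cost = rt_effects[effect]
        if spent + cost <= frame_budget_ms:
            enabled.append(effect)
            spent += cost
    return enabled, spent

# 60 fps target -> ~16.7 ms per frame; assume 9 ms already goes to rasterisation.
print(pick_rt_effects(16.7, 9.0, ["shadows", "reflections", "global_illumination"]))
# -> (['shadows', 'reflections'], 14.0): GI gets toggled off, which is exactly the
#    kind of per-effect "RT bits on/off" segmentation described above.
```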
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
4,716 (1.96/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans removed
Cooling Optimus Block, HWLABS Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 2x A4x10, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front panel pump/res combo
Audio Device(s) Audeze Maxwell Ultraviolet, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU Redux Burgundy w/brass weight, Prismcaps White, Jellykey, lubed/modded
Software Windows 10 IoT Enterprise LTSC 19044.4046
Benchmark Scores Legendary
What I meant is we still have this segmentation of global illumination, per-object effects, and include/don't-include reflections. Current cards still can't push all of them at the same time (using a meaningful number of rays), so this segmentation helps developers toggle RT bits on/off.
The 4080/4090 may have it easier, but since very few people can/will pay that much for a video card, those cards won't push RT forward in a meaningful way either. Good RT performance will really need to trickle down to x60 cards before that happens. But that's OK. We've waited decades for RT; what's 3-5 more years?
Blame lazy devs and AMD refusing to commit to real RT and actually compete.

Those two are what's holding back RT adoption (and therefore optimization). If more people wanted to play games with the visuals turned up (High vs. Ultra in traditional raster is barely any difference; path tracing on vs. RT off is night and day), they would invest in capable cards. But people see AMD cards with "equivalent" gaming performance (the reality is they are only competitive in raster) and go with those since they're cheaper, or with xx60-class cards, which have never been about IQ, but about playable framerates.

UE5/5.1 has existed for a while now, and plenty of earlier games/engines have very good RT implementations; the tech is there. In many ways RT is advancing much faster than any other type of game engine feature/technical progression, with the possible exception of AI-accelerated stuff like upscaling or LLM integration.
 

bug

Joined
May 22, 2015
Messages
13,487 (4.02/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Blame lazy devs and AMD refusing to commit to real RT and actually compete.

Those two are what's holding back RT adoption (and therefore optimization). If more people wanted to play games with the visuals turned up (High vs. Ultra in traditional raster is barely any difference; path tracing on vs. RT off is night and day), they would invest in capable cards. But people see AMD cards with "equivalent" gaming performance (the reality is they are only competitive in raster) and go with those since they're cheaper, or with xx60-class cards, which have never been about IQ, but about playable framerates.

UE5/5.1 has existed for a while now, and plenty of earlier games/engines have very good RT implementations; the tech is there. In many ways RT is advancing much faster than any other type of game engine feature/technical progression, with the possible exception of AI-accelerated stuff like upscaling or LLM integration.
I'm not blaming anyone. I understand why AMD does and says what they do. I'm just saying, with only two players, one of which is dragging their feet, we're not where we could/want to be.
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
4,716 (1.96/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans removed
Cooling Optimus Block, HWLABS Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 2x A4x10, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front panel pump/res combo
Audio Device(s) Audeze Maxwell Ultraviolet, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU Redux Burgundy w/brass weight, Prismcaps White, Jellykey, lubed/modded
Software Windows 10 IoT Enterprise LTSC 19044.4046
Benchmark Scores Legendary
I'm not blaming anyone. I understand why AMD does and says what they do. I'm just saying, with only two players, one of which is dragging their feet, we're not where we could/want to be.
I agree with you.

It would also be helpful if people (not referring to yourself) would learn how to read graphs when making purchasing decisions and formulating arguments.

Lots of talk at the moment about how RDNA2 is the king of value. Yet even when compared to current-gen NVIDIA, the value seems to scale identically once you consider RT a necessary task for GPUs (something I do; as previously mentioned, all new game engines are using it for default lighting).

[attached chart: relative GPU value with RT included]
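
As a worked example of what "value scaling identically" looks like once RT is weighted in (all fps and price numbers below are hypothetical placeholders, not figures from the chart or from TechPowerUp's data):

```python
# Hypothetical fps and prices, purely to show how weighting RT changes a value comparison.
cards = {
    "Card A (raster-strong)": {"price": 800, "raster_fps": 120, "rt_fps": 55},
    "Card B (RT-strong)":     {"price": 900, "raster_fps": 115, "rt_fps": 85},
}

def fps_per_dollar(card, rt_weight=0.5):
    # Blend raster and RT fps, then normalise by price.
    blended = (1 - rt_weight) * card["raster_fps"] + rt_weight * card["rt_fps"]
    return blended / card["price"]

for name, card in cards.items():
    print(f"{name}: raster-only {card['raster_fps'] / card['price']:.3f} fps/$, "
          f"RT-weighted {fps_per_dollar(card):.3f} fps/$")
# With RT ignored, Card A looks clearly cheaper per frame; once RT counts for half
# the workload, the two end up almost identical, which is the point being made here.
```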
 

bug

Joined
May 22, 2015
Messages
13,487 (4.02/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
I agree with you.

It would also be helpful if people (not referring to yourself) would learn how to read graphs when making purchasing decisions and formulating arguments.

Lots of talk at the moment about how RDNA2 is the king of value. Yet even when compared to current-gen NVIDIA, the value seems to scale identically once you consider RT a necessary task for GPUs (something I do; as previously mentioned, all new game engines are using it for default lighting).

Yes, but confirmation bias dictates RT (and DLSS) are worthless. You know, sour grapes ;)
 
Joined
Jan 20, 2019
Messages
1,377 (0.68/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
I agree with you.

It would also be helpful if people (not referring to yourself) would learn how to read graphs when making purchasing decisions and formulating arguments.

Lots of talk at the moment about how RDNA2 is the king of value. Yet even when compared to current-gen NVIDIA, the value seems to scale identically once you consider RT a necessary task for GPUs (something I do; as previously mentioned, all new game engines are using it for default lighting).


But would you buy a 4070 12 GB card for 4K gaming with RT enabled? IMO the performance disparity here is meaningless when some of the more demanding games barely hit over 30 fps. This can only be a "user-specific" scenario where personal tolerances and the types of games targeted come into play.

Soon I would love to invest in RT, or use RT as a reference point, since photorealism in games is something I aspire to, provided it's widely available or at least present in the games I play. If I'm having to spend a boatload of cash on an RT road warrior, in return I would expect equally abundant performance rewards. But what we have is a HUGE asking price versus performance that doesn't coincide equally, hence nothing to get excited about... which pretty much sums up the lack of interest. AMD not pushing RT as aggressively sounds about right for now, although the 7000 series has come a long way from the 6000 series (again, still doesn't interest me). Maybe another gen or two before RT/PT makes sense.

12 GB is not enough for 4K + RT. FPS isn't the only performance metric we should be concerned with. Where hard limitations are set at the memory level, by default you can expect poorer image quality even at the best quality settings. At this sort of level, smart game engines with their dynamic asset swapping aren't exactly "SMART" but a "compromise" (or can we call it a cheat?). RT technically suggests we're looking for the finest image quality with a touch of realism; buying a skimped card and settling for unsettling FPS, with further quality compromises at 4K, somewhat defeats the objective. It's here that I don't believe RT should be a reference point of entry (for most), hence raster performance easily takes precedence. Again, it's largely down to user-specific games, but I'm sure for the vast majority 4K is a no-go unless a tiny-weeny handful of us (not me) are willing to splash out for something like an XTX, *80 or *90.
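
To picture what that dynamic asset swapping "compromise" looks like in principle, here is a toy sketch (the texture names, sizes and the quarter-per-mip rule are invented, not any engine's actual streaming logic): when the texture set no longer fits the VRAM budget, the streamer drops mip levels, so the frame rate survives but textures render at reduced resolution.

```python
# Toy texture-streaming sketch: drop mip levels until the set fits the VRAM budget.
# All sizes are invented; dropping one mip level leaves roughly 1/4 of a texture's memory.

textures_mb = {"environment": 512, "characters": 256, "weapons": 128, "fx": 64}

def fit_to_budget(textures, budget_mb):
    loaded = dict(textures)
    downgraded = []
    while sum(loaded.values()) > budget_mb:
        biggest = max(loaded, key=loaded.get)
        loaded[biggest] /= 4          # drop one mip level: ~1/4 the memory, visibly blurrier
        downgraded.append(biggest)
    return loaded, downgraded

loaded, downgraded = fit_to_budget(textures_mb, budget_mb=400)
print(loaded)
print("downgraded:", downgraded)
# The frame rate survives, but "environment" and "characters" now render from
# reduced-resolution mips: the image-quality compromise described above.
```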

For the vast majority of gamers, "good value" essentially implies affordability. "Value scaling", on the other hand, works when pockets/budgets can scale accordingly (not for most, unfortunately). At the moment getting your feet wet in gaming with reasonably good hardware doesn't require a ton of cash, thanks to the abundant options available outside of Nvidia's 40-series daylight-robbery pricing excursions and skimping at the lower performance tiers. I've been with Nvidia since Adam and Eve and even I've been put off on a massive scale... it's not just AMD doing it better on the value front, but Nvidia regrettably pushing current-gen pricing through the roof (in the name of a couple of uninspiring features and MANAYYYYYYY).
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
4,716 (1.96/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans removed
Cooling Optimus Block, HWLABS Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 2x A4x10, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front panel pump/res combo
Audio Device(s) Audeze Maxwell Ultraviolet, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU Redux Burgundy w/brass weight, Prismcaps White, Jellykey, lubed/modded
Software Windows 10 IoT Enterprise LTSC 19044.4046
Benchmark Scores Legendary
But would you buy a 4070 12 GB card for 4K gaming with RT enabled? IMO the performance disparity here is meaningless when some of the more demanding games barely hit over 30 fps. This can only be a "user-specific" scenario where personal tolerances and the types of games targeted come into play.

Soon I would love to invest in RT, or use RT as a reference point, since photorealism in games is something I aspire to, provided it's widely available or at least present in the games I play. If I'm having to spend a boatload of cash on an RT road warrior, in return I would expect equally abundant performance rewards. But what we have is a HUGE asking price versus performance that doesn't coincide equally, hence nothing to get excited about... which pretty much sums up the lack of interest. AMD not pushing RT as aggressively sounds about right for now, although the 7000 series has come a long way from the 6000 series (again, still doesn't interest me). Maybe another gen or two before RT/PT makes sense.

12 GB is not enough for 4K + RT. FPS isn't the only performance metric we should be concerned with. Where hard limitations are set at the memory level, by default you can expect poorer image quality even at the best quality settings. At this sort of level, smart game engines with their dynamic asset swapping aren't exactly "SMART" but a "compromise" (or can we call it a cheat?). RT technically suggests we're looking for the finest image quality with a touch of realism; buying a skimped card and settling for unsettling FPS, with further quality compromises at 4K, somewhat defeats the objective. It's here that I don't believe RT should be a reference point of entry (for most), hence raster performance easily takes precedence. Again, it's largely down to user-specific games, but I'm sure for the vast majority 4K is a no-go unless a tiny-weeny handful of us (not me) are willing to splash out for something like an XTX, *80 or *90.

For the vast majority of gamers, "good value" essentially implies affordability. "Value scaling", on the other hand, works when pockets/budgets can scale accordingly (not for most, unfortunately). At the moment getting your feet wet in gaming with reasonably good hardware doesn't require a ton of cash, thanks to the abundant options available outside of Nvidia's 40-series daylight-robbery pricing excursions and skimping at the lower performance tiers. I've been with Nvidia since Adam and Eve and even I've been put off on a massive scale... it's not just AMD doing it better on the value front, but Nvidia regrettably pushing current-gen pricing through the roof (in the name of a couple of uninspiring features and MANAYYYYYYY).
In our whole test suite not a single game saw a meaningful performance hit with 12 GB, not even at 4K—and RTX 4070 is fundamentally a 1440p card.
 