
Portal with RTX

Joined
Sep 2, 2022
Messages
89 (0.12/day)
Location
Italy
Processor Intel Core i7-5960X
Motherboard MSI X99S SLI PLUS
Memory Corsair Vengeance LPX DDR4 4x8GB
Video Card(s) Gigabyte GTX 1070 TI 8GB
Storage Sandisk Ultra II 240GB + HDD
Display(s) Benq GL2480 24" 1080p 75 Hz
Power Supply Seasonic M12II 520W
Mouse Logitech G400
Software Windows 10 LTSC
This idea that NVIDIA should make its industrial patents free and open source is funny; some people must be writing from fantasy land.
When a company invests millions or billions to develop new technologies, it needs to earn that money back to stay alive and grow.
FIAT and Magneti Marelli invented common rail technology for car engines many years ago, investing billions in research. Then they offered the other car manufacturers the chance to use that tech under license: "Want to use it? Then you pay a fee!" That's how things work with companies of this kind. They don't work pro bono, and they have to stay ahead of their competitors to survive and make money.
NVIDIA introduced CUDA when, in 2006? And in more recent years the RT and Tensor cores, and finally OptiX. All this hardware is used by both games and professional software.
Since then, AMD has done little to catch up. If you decide to go with OpenCL, you have to invest in it and support (with money!) the software houses so that their programs actually make use of OpenCL and of your hardware.
AMD introduced HIP only after the Blender devs announced they would drop OpenCL support with version 3.0, in late 2021. That alone shows how slow they can be. HIP is still slower than CUDA and doesn't support all AMD GPUs. If you also use your PC for content creation, AMD GPUs aren't even worth considering nowadays. You can't blame NVIDIA for that, only AMD. Don't look at this market only through a gamer's eyes: the green cards offer more, and you can use them for much more than just gaming.
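To show what that backend fragmentation looks like in practice, here is a rough sketch using Blender's Python API (property names recalled from the 3.x bpy API, so treat the exact paths as an assumption): Cycles exposes CUDA, OptiX and HIP as interchangeable device types, and which one you can actually pick depends entirely on your GPU vendor.
Code:
# Rough sketch, Blender 3.x Python API (property names from memory -- verify against your build).
# Run inside Blender, or via `blender --background --python this_script.py`.
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "OPTIX"    # "CUDA" on pre-RTX NVIDIA cards, "HIP" on supported AMD cards
prefs.get_devices()                    # refresh the detected device list

for device in prefs.devices:
    device.use = (device.type != "CPU")  # enable every detected GPU, leave the CPU alone

bpy.context.scene.cycles.device = "GPU"
print("Cycles backend:", prefs.compute_device_type)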
I've owned both brands, and I recently switched to a GTX card because I needed CUDA. Now I want OptiX, and I'll buy an RTX card as soon as I can. I can't be mad at NVIDIA for this; I'm glad they built this technology and hardware. Does it cost a lot? Yes. But it offers me more.

This Portal mod is a showcase made by NVIDIA engineers to demonstrate what this modding tool is capable of. It has just been released. It wasn't made by a hundred modders, but by a group of engineers who, halfway through, decided it was good enough for its purpose. When the modding community starts making things with it, they will give feedback and ask for improvements and optimizations. In a world where triple-A games are released full of bugs at full price, do you really expect a free tech demo to be free of problems?

And by the way, why should NVIDIA optimize this mod for AMD and Intel GPUs too? Why should they care? They are competitors; it's on them to optimize their drivers and catch up. Lisa Su and the AMD shareholders could reduce their cut and spend those billions on research. People here talk as if NVIDIA were the big bad wolf and AMD were Little Red Riding Hood, when in fact AMD is just as greedy as the others.
 
Joined
Jul 15, 2020
Messages
997 (0.65/day)
System Name Dirt Sheep | Silent Sheep
Processor i5-2400 | 13900K (-0.02mV offset)
Motherboard Asus P8H67-M LE | Gigabyte AERO Z690-G, bios F29e Intel baseline
Cooling Scythe Katana Type 1 | Noctua NH-U12A chromax.black
Memory G-skill 2*8GB DDR3 | Corsair Vengeance 4*32GB DDR5 5200Mhz C40 @4000MHz
Video Card(s) Gigabyte 970GTX Mini | NV 1080TI FE (cap at 50%, 800mV)
Storage 2*SN850 1TB, 230S 4TB, 840EVO 128GB, WD green 2TB HDD, IronWolf 6TB, 2*HC550 18TB in RAID1
Display(s) LG 21` FHD W2261VP | Lenovo 27` 4K Qreator 27
Case Thermaltake V3 Black|Define 7 Solid, stock 3*14 fans+ 2*12 front&bottom+ out 1*8 (on expansion slot)
Audio Device(s) Beyerdynamic DT 990 (or the screen speakers when I'm too lazy)
Power Supply Enermax Pro82+ 525W | Corsair RM650x (2021)
Mouse Logitech Master 3
Keyboard Roccat Isku FX
VR HMD Nop.
Software WIN 10 | WIN 11
Benchmark Scores CB23 SC: i5-2400=641 | i9-13900k=2325-2281 MC: i5-2400=i9 13900k SC | i9-13900k=37240-35500
I thought I explained why not; if you don't get it, then that's on you. You could give an argument; instead you imply that because "NV is doing it", it's OK?
I also explained. Here's a copy-paste, with the sentence order edited a little:
"Portal RTX is an edge case, not the common one.
Many games do well with some form of RT, also on AMD cards.
We have many RT games that work fine on both NV and AMD.
"
I see nothing wrong with going DLSS/FSR/XeSS, with or without RT.

Not run the way I like? There is a standard for running games that is considered sufficient. Do you know what that standard says? Are you OK with running a game at 30 FPS at 4K, for instance, because that's what you get with the 4090? Even below that. On the other side, the same person who cheers for RT at low framerates will tell you how used he is to 144 Hz and refuses to play games if he can't get 100 FPS minimum, or 144 FPS minimum.
I'm sure there are some of those; not me, anyway. I do just fine with 4K 60 Hz and a GTX 970.
Sure, DLSS, but it is not perfect, especially DLSS 3 with all the shimmering, ghosting and artifacts.
NV brings RT, but it doesn't really work, so DLSS is needed to make it work on expensive top-end cards, bumping up the price because it's a new invention, and people are OK with it.
It is not perfect, correct. It actually has some issues to figure out. But that's not a reason, IMO, to mark these features as broken and not use them.
RT is not a game changer at this point, and considering all aspects of a product, it brings much higher prices and DLSS versions that only work with specific cards. DLSS 2 is probably going to be abandoned, since NV has to focus on DLSS 3 at this point. Those who purchased the 3000 series cards are left behind unless they use FSR, yet they have paid for those features.
You may be OK with it or not. As for me, I'm not OK, and I think that is clear, so praising RT for what it is and what it currently offers comes across as shilling.
We agree, but I see more to RT than just playing games (and there the benefits are debatable; I won't use it for the most part), and I don't turn it into an NV vs. AMD vs. the world thing.

So, some more copy-paste to settle your mind:
"RT is a nice bonus to play with in some games, nothing more but also nothing less.
I see no reason to boycott it, as quite a few people find pleasure in it.
Some actually need and use the RT and Tensor capabilities for non-gaming purposes."
 
Joined
Jan 14, 2019
Messages
11,560 (5.55/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
I don't get the stubborn refusal to embrace them in any capacity, outside of, I suppose, 1080p 60 Hz or below, where they're not all that useful; but even at 1080p 144 Hz, DLSS and FSR have their place for a lot of people.

They're absolutely one of the best things to happen to gamers in recent years, and I'm personally fascinated by them; I find FSR 2.0 or DLSS quality modes at 4K preferable to native in virtually every single instance. I found them very compelling at 1440p too.
You lose little to very little visually in most games but gain a lot of FPS, which contributes much more to playability.
If you're already above 100 FPS without DLSS/FSR/XeSS, then OK, but their small compromise easily pays off if you want 144+ Hz, 4K ultra settings in top AAA games, or both, and don't have $1000+ for a top GPU.

For the most part, DLSS/FSR/XeSS is one of the best things to happen to gaming recently, when you factor in how the global market behaves and not only what used to be the case in the past.
Personally, I find the whole idea counter-intuitive. People buy higher and higher resolution monitors, then they dumb down the graphics with upscaling because they can't run those high resolutions properly. Why not just stay with a resolution that your GPU can natively power through? That's why I'm still on 1080p 60 Hz with a 6750 XT.

With that said, I had a go with DLSS with a 2070 in Cyberpunk. The "ultra quality" setting was... as they said in the HBO series Chernobyl, "not great, not terrible", and I got maybe 30-40% more FPS, which is nice, but every other setting looked blurry as heck. Even with "balanced", my eyes felt like they were melting after a minute of gameplay. Verdict: "ultra quality" is usable if you run out of options, but all the rest are rubbish.
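To put rough numbers behind what "dumbing down the graphics" means: the sketch below uses the commonly published per-axis render-scale ratios for the DLSS 2 / FSR 2 presets (individual games can override these, so treat the exact factors as an assumption) to show how many of the output pixels actually get shaded.
Code:
# Rough sketch: internal render resolution behind the usual DLSS 2 / FSR 2 presets.
# Scale factors are the commonly published per-axis ratios; actual games may differ.
PRESETS = {
    "Quality":           1 / 1.5,   # ~67% per axis
    "Balanced":          1 / 1.7,   # ~59% per axis
    "Performance":       1 / 2.0,   # 50% per axis
    "Ultra Performance": 1 / 3.0,   # ~33% per axis
}

def internal_resolution(out_w, out_h, preset):
    """Return the internal render size and the fraction of output pixels actually shaded."""
    scale = PRESETS[preset]
    w, h = round(out_w * scale), round(out_h * scale)
    return w, h, (w * h) / (out_w * out_h)

for name in PRESETS:
    w, h, frac = internal_resolution(3840, 2160, name)
    print(f"4K {name:>17}: renders {w}x{h} (~{frac:.0%} of the pixels)")
Even the "Quality" preset at 4K shades well under half the native pixels, which is where the FPS headroom comes from.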
 
Joined
Jul 15, 2020
Messages
997 (0.65/day)
System Name Dirt Sheep | Silent Sheep
Processor i5-2400 | 13900K (-0.02mV offset)
Motherboard Asus P8H67-M LE | Gigabyte AERO Z690-G, bios F29e Intel baseline
Cooling Scythe Katana Type 1 | Noctua NH-U12A chromax.black
Memory G-skill 2*8GB DDR3 | Corsair Vengeance 4*32GB DDR5 5200Mhz C40 @4000MHz
Video Card(s) Gigabyte 970GTX Mini | NV 1080TI FE (cap at 50%, 800mV)
Storage 2*SN850 1TB, 230S 4TB, 840EVO 128GB, WD green 2TB HDD, IronWolf 6TB, 2*HC550 18TB in RAID1
Display(s) LG 21` FHD W2261VP | Lenovo 27` 4K Qreator 27
Case Thermaltake V3 Black|Define 7 Solid, stock 3*14 fans+ 2*12 front&bottom+ out 1*8 (on expansion slot)
Audio Device(s) Beyerdynamic DT 990 (or the screen speakers when I'm too lazy)
Power Supply Enermax Pro82+ 525W | Corsair RM650x (2021)
Mouse Logitech Master 3
Keyboard Roccat Isku FX
VR HMD Nop.
Software WIN 10 | WIN 11
Benchmark Scores CB23 SC: i5-2400=641 | i9-13900k=2325-2281 MC: i5-2400=i9 13900k SC | i9-13900k=37240-35500
Personally, I find the whole idea counter-intuitive. People buy higher and higher resolution monitors, then they dumb down the graphics with upscaling because they can't run those high resolutions properly. Why not just stay with a resolution that your GPU can natively power through? That's why I'm still on 1080p 60 Hz with a 6750 XT.

With that said, I had a go with DLSS with a 2070 in Cyberpunk. The "ultra quality" setting was... as they said in the HBO series Chernobyl, "not great, not terrible", and I got maybe 30-40% more FPS, which is nice, but every other setting looked blurry as heck. Even with "balanced", my eyes felt like they were melting after a minute of gameplay. Verdict: "ultra quality" is usable if you run out of options, but all the rest are rubbish.
The real counter-intuitive thing, IMO, is DLSS 3, which makes more FPS but simultaneously increases latency. It's a double-edged sword that only does good if you already have very high FPS (100+) in the first place, before applying the DLSS 3 thing.
It's only good for systems with the very top CPUs and GPUs, and that trend is bad.
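A back-of-envelope way to see it (the 2x frame multiplier and the "one held frame" latency model below are simplifying assumptions, not measurements): interpolation has to wait for the next real frame before it can show the generated one, so the latency penalty only becomes small when the base framerate is already high.
Code:
# Back-of-envelope sketch: why frame generation suits high base framerates.
# Model assumptions: displayed fps roughly doubles, and the pipeline holds one real
# frame for interpolation, so the added latency is about one base frame time.
def frame_generation_estimate(base_fps):
    base_frame_ms = 1000.0 / base_fps
    displayed_fps = base_fps * 2          # one interpolated frame between every real pair
    extra_latency_ms = base_frame_ms      # ~one held frame, ignoring fixed overheads
    return displayed_fps, extra_latency_ms

for fps in (30, 60, 120):
    shown, extra = frame_generation_estimate(fps)
    print(f"base {fps:>3} fps -> ~{shown:.0f} fps shown, ~{extra:.1f} ms extra latency")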
 
Joined
Jan 14, 2019
Messages
11,560 (5.55/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
The real counter-intuitive thing, IMO, is DLSS 3, which makes more FPS but simultaneously increases latency. It's a double-edged sword that only does good if you already have very high FPS (100+) in the first place, before applying the DLSS 3 thing.
It's only good for systems with the very top CPUs and GPUs, and that trend is bad.
That too. But DLSS in general. Or any upscaling method. It's like "I threw away my shitty 1440p monitor for this brilliant 4K panel, but my games started to stutter, so I'm gonna upscale a shitty 1440p image with Nvidia's/AMD's new XYZABCDLSS to make it better". Doesn't it sound like a huge waste of money? :laugh:
 
Joined
Jul 15, 2020
Messages
997 (0.65/day)
System Name Dirt Sheep | Silent Sheep
Processor i5-2400 | 13900K (-0.02mV offset)
Motherboard Asus P8H67-M LE | Gigabyte AERO Z690-G, bios F29e Intel baseline
Cooling Scythe Katana Type 1 | Noctua NH-U12A chromax.black
Memory G-skill 2*8GB DDR3 | Corsair Vengeance 4*32GB DDR5 5200Mhz C40 @4000MHz
Video Card(s) Gigabyte 970GTX Mini | NV 1080TI FE (cap at 50%, 800mV)
Storage 2*SN850 1TB, 230S 4TB, 840EVO 128GB, WD green 2TB HDD, IronWolf 6TB, 2*HC550 18TB in RAID1
Display(s) LG 21` FHD W2261VP | Lenovo 27` 4K Qreator 27
Case Thermaltake V3 Black|Define 7 Solid, stock 3*14 fans+ 2*12 front&buttom+ out 1*8 (on expansion slot)
Audio Device(s) Beyerdynamic DT 990 (or the screen speakers when I'm too lazy)
Power Supply Enermax Pro82+ 525W | Corsair RM650x (2021)
Mouse Logitech Master 3
Keyboard Roccat Isku FX
VR HMD Nop.
Software WIN 10 | WIN 11
Benchmark Scores CB23 SC: i5-2400=641 | i9-13900k=2325-2281 MC: i5-2400=i9 13900k SC | i9-13900k=37240-35500
That too. But DLSS in general. Or any upscaling method. It's like "I threw away my shitty 1440p monitor for this brilliant 4K panel, but my games started to stutter, so I'm gonna upscale a shitty 1440p image with Nvidia's/AMD's new XYZABCDLSS to make it better". Doesn't it sound like a huge waste of money? :laugh:
If it passes a blind test for most people, then it's a very good investment of money ;)
 
Last edited:
Joined
May 31, 2016
Messages
4,421 (1.45/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
We agree, but I see more to RT than just playing games (and there the benefits are debatable; I won't use it for the most part), and I don't turn it into an NV vs. AMD vs. the world thing.
RT is used broadly in a variety of industries, but here we're talking about gaming, so I guess we'd better focus on that.

"RT is a nice bonus in some games to play with, nothing more but also nothing less.
I see no reason to boycott it as not few find pleasure with it.
Some actually need and utilize the RT and Tensor capabilities for non-gaming cases."
People do not boycott the existence of RT, since it has been around for a long while and is used in many things besides gaming. That's a fact, but not the point of the conversation, so I'm not sure how that's one of your arguments; or maybe it's not an argument but rather a statement.
Well, I don't boycott RT, that's for sure. Just because NV is trying with RT doesn't mean we have to like the implementation and what it comes with: a huge price tag, a huge performance impact, and upscaling, which is OK but not perfect and has drawbacks in image quality. Besides, you're forced to use it if you want RT, and you need the most top-notch card there is. On lower-tier cards you can forget about it, yet you still pay for RT nonetheless. Because it's advertised as such, isn't it?

To be fair, what people are using RT and Tensor cores for is not the discussion here. We are talking here about RT in games and what it offers you as a consumer, considering price, gaming experience etc.

The real counter-intuitive thing, IMO, is DLSS 3, which makes more FPS but simultaneously increases latency.
Shimmering and ghosting as well. DLSS 3 has a problem with this in basically every game that implements it. Hopefully this will be eliminated somehow.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,129 (1.28/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Personally, I find the whole idea counter-intuitive. People buy higher and higher resolution monitors, then they dumb down the graphics with upscaling because they can't run those high resolutions properly.
Well, I'd disagree right there that I'm dumbing them down. At 4K for sure, now that I'm on a 4K 120 Hz OLED, and often still at 1440p, I find the DLSS image downright preferable; the AA and detail it's capable of is astounding to me, and the performance increase is just the cherry on top. FSR 2.1+ has similarly impressed me vs. native + TAA at 4K. Also bear in mind it's been demonstrated that higher framerates at a given resolution can also enhance results. I can see why at 1080p60 you're not gaining a lot of perf OR retaining enough image quality to make it worthwhile, so I see where your use case makes it a letdown. I disagree that upscaling is counter-intuitive as a whole idea, but there undeniably are certain situations where it would be counter-intuitive to use it.
But DLSS in general. Or any upscaling method. It's like "I threw away my shitty 1440p monitor for this brilliant 4K panel, but my games started to stutter, so I'm gonna upscale a shitty 1440p image with Nvidia's/AMD's new XYZABCDLSS to make it better". Doesn't it sound like a huge waste of money? :laugh:
Well when you purposely phrase it in a belittling and disparaging way, yeah it sounds stupid, you achieved what you set out to do.

A phrase I might use to describe that situation instead: "I upgraded from a decent VA 1440p monitor to a top-tier 4K high-refresh-rate OLED, and upscaling gives me another great tool to tweak and enhance my gaming experience on my existing hardware and extract the panel's potential for a phenomenal, immersive gaming experience." Doesn't that sound pretty appealing? :) Now of course, that question right there isn't for you, because we know you don't find it appealing, but I can tell you I'm far from alone.
 
Last edited:
Joined
Nov 20, 2015
Messages
22 (0.01/day)
Location
Italy
It would be interesting to see a 2022 version of Portal with rasterization used to its full might, and then compare it to the ray-traced version. I think you would probably be unable to see any difference in a static image, and very little difference in motion, but with a 100% FPS increase on the raster side... ray tracing is, for now, just a style exercise.
 
Joined
Sep 8, 2009
Messages
1,067 (0.19/day)
Location
Porto
Processor Ryzen 9 5900X
Motherboard Gigabyte X570 Aorus Pro
Cooling AiO 240mm
Memory 2x 32GB Kingston Fury Beast 3600MHz CL18
Video Card(s) Radeon RX 6900XT Reference (amd.com)
Storage O.S.: 256GB SATA | 2x 1TB SanDisk SSD SATA Data | Games: 1TB Samsung 970 Evo
Display(s) LG 34" UWQHD
Audio Device(s) X-Fi XtremeMusic + Gigaworks SB750 7.1 THX
Power Supply XFX 850W
Mouse Logitech G502 Wireless
VR HMD Lenovo Explorer
Software Windows 10 64bit
Compared to first-person games launched in 2020 and later, IMHO this is really just a regular-looking game with the worst performance-to-IQ ratio I've ever seen. Even after watching a bunch of videos of it on YouTube, I honestly don't think it's worth any significant praise. In general, its IQ is miles behind games that perform >4x better, like any CoD or Battlefield released in the last 5 years.

Sure, the IQ upgrade is enormous, but this is a 15-year-old game made for DX9 graphics cards (with a DX8 fallback mode for GeForce FX cards, IIRC) with 512MB or less VRAM, so even high-resolution texture and ReShade mods would make it look completely different. It's a game with very little geometry to speak of, at least until the later levels.


Also, looking at the 6900 XT getting less than 30% of the performance of an RTX 3060 at 1080p makes me think this was actually made with RDNA2 cards in mind, just in the opposite direction of performance optimization. It seems so obvious that it sounds a bit desperate, to be honest. Is RDNA3 that much of a threat?
If anything, trying to present their ability to push technology forward exclusively at the cost of transparently screwing their competition is just making Nvidia lose credibility.



You are right, but NV didn't make the rules; they just play along. They actually must, if they want to keep up in the long term. So does AMD.
This is not true. Nvidia is constantly using their market dominance and superior developer relations program to make the rules and change the goalposts according to their specific architectural advantages.

With their Fermi architecture, Nvidia made everything about compute throughput, and Nvidia-sponsored games would push for lots of shader effects. At the time, sites like Anandtech would put synthetic compute benchmarks in GPU reviews to try to get a glimpse of performance in "future games".




By the time AMD released GCN, which was superior to Fermi in compute, Nvidia launched Kepler with worse compute performance than Fermi but a huge boost in geometry performance. Nvidia proceeded to use their dev relations advantage to push for tons of tessellation and sub-pixel triangles in games (e.g. HairWorks, god rays, super-detailed concrete slabs, etc.) and to prop up tessellation-heavy benchmarks like Unigine Heaven so they could keep the upper hand. At that point we'd see Anandtech using synthetic tessellation benchmarks to show, again, why PC games would show a performance advantage on Nvidia GPUs.






Come ~2018, and AMD with RDNA caught up in rasterization in pretty much all aspects: shader processor utilization is excellent, they finally got rid of GlobalFoundries' inferior FinFET processes and started using TSMC as well (better perf-per-watt, higher clocks), and the geometry bottleneck shifted to compute culling, making it a non-issue for most GPUs.
As soon as AMD caught up with geometry performance, Nvidia once again changed the goalposts to raytracing performance.
AMD with RDNA2 is clearly winning on rasterization performance-per-watt and performance-per-cost, but Nvidia is pushing all their engineering and marketing efforts to prop up raytracing performance as the only thing that matters. And for that we get emperors with no clothes like this Portal RTX, and gems like this:


[Chart: DXR GeForce RTX 4090 performance]



AMD/ATI actually tried to do something like this with tessellation twice, with TruForm in 2001 and DX10.1 tessellation in 2007, but they failed miserably. In the second attempt it's not even a case of nobody adopting it, because it was regularly used in X360 games; they just didn't have any weight with PC developers.





My guess is this is exactly why RDNA3 isn't bringing any substantial improvements to raytracing performance, nor should it.
If AMD keeps chasing after Nvidia's latest architectural shift they're never going to gain ground. They'd always be 2 generations behind.
It's not like these shifts are bringing substantial boosts to IQ, even. HairWorks on Geralt's hair didn't make the game look substantially better but axed performance by what, 30%? Did anyone think the god rays in Fallout 4 were worth the 20% performance hit? Is raytracing in Cyberpunk worth the 50% performance hit?

And with Unreal Engine 5 we're also seeing how GPU compute triangles for "infinite" geometry detail paired with mostly screen space effects provides a much larger boost to IQ and immersion, on all GPU architectures and playable on the GPUs of today. With design wins on almost all gaming consoles from the past 15 years, AMD understood they must develop and optimize their architectures with the input of independent game developers, instead of Nvidia's latest fad. And if this means they get pushed out of the PC discrete GPU market, then so be it.
 
Joined
Jul 15, 2020
Messages
997 (0.65/day)
System Name Dirt Sheep | Silent Sheep
Processor i5-2400 | 13900K (-0.02mV offset)
Motherboard Asus P8H67-M LE | Gigabyte AERO Z690-G, bios F29e Intel baseline
Cooling Scythe Katana Type 1 | Noctua NH-U12A chromax.black
Memory G-skill 2*8GB DDR3 | Corsair Vengeance 4*32GB DDR5 5200Mhz C40 @4000MHz
Video Card(s) Gigabyte 970GTX Mini | NV 1080TI FE (cap at 50%, 800mV)
Storage 2*SN850 1TB, 230S 4TB, 840EVO 128GB, WD green 2TB HDD, IronWolf 6TB, 2*HC550 18TB in RAID1
Display(s) LG 21` FHD W2261VP | Lenovo 27` 4K Qreator 27
Case Thermaltake V3 Black|Define 7 Solid, stock 3*14 fans+ 2*12 front&bottom+ out 1*8 (on expansion slot)
Audio Device(s) Beyerdynamic DT 990 (or the screen speakers when I'm too lazy)
Power Supply Enermax Pro82+ 525W | Corsair RM650x (2021)
Mouse Logitech Master 3
Keyboard Roccat Isku FX
VR HMD Nop.
Software WIN 10 | WIN 11
Benchmark Scores CB23 SC: i5-2400=641 | i9-13900k=2325-2281 MC: i5-2400=i9 13900k SC | i9-13900k=37240-35500
People do not boycott the existence of RT, since it has been around for a long while and is used in many things besides gaming. That's a fact, but not the point of the conversation, so I'm not sure how that's one of your arguments; or maybe it's not an argument but rather a statement.
Well, I don't boycott RT, that's for sure. Just because NV is trying with RT doesn't mean we have to like the implementation and what it comes with: a huge price tag, a huge performance impact, and upscaling, which is OK but not perfect and has drawbacks in image quality. Besides, you're forced to use it if you want RT, and you need the most top-notch card there is. On lower-tier cards you can forget about it, yet you still pay for RT nonetheless. Because it's advertised as such, isn't it?
So we very much agree: RT in gaming is a mere eye-candy gimmick that you can very easily do without.
I also think we would have been better off with a parallel plain GTX line alongside RTX (like the 16xx series alongside the 2xxx series), but I can understand NV's decision to focus on RTX alone. And if so, they must push it as hard as they can; a company in their position has no other choice.
Whether it's good enough to pay the extra over AMD's offering is everyone's own choice, in every tier. If NV over-advertises it and PRs the hell out of it to the point of absurdity, sugar-coating it beyond reason, it's your responsibility as a consumer to understand, compare and decide. You can't expect them to do it for you, and you can only blame yourself for buying a pig in a poke (if that's what RT is for you). There is no false advertising anywhere near this or any other RT game release or RTX-related stuff that I know of.

Also, I don't think it would be OK (or fair) to have RT only on top-end GPUs by artificially locking it out of lower tiers just because the performance isn't good. I'm not saying that's what you suggested; it's just one way to exclude it from the lower tiers. Another way is to make a new architecture for low-end GPUs, which would also cost money. Yet another is to include RTX on every card and sell a key to unlock it, like codecs/color profiles on professional cameras (you pay big to get ProRes on your Inspire 2 drone, or D-Log on some cameras). I'd rather keep things as they are than do any of the above.
Anyway, every sane person knows that paying less for a low-cost GPU gets you low performance and the need to compromise on settings, regardless of RT on/off. You can turn RT on on a low-end card, apply DLSS performance mode and lower the settings; it will look like crap and won't be worth it (for most eyes anyway), but it's good that nobody stops you from trying.

Shimmering and ghosting as well. DLSS 3 has a problem with this in basically every game that implements it. Hopefully this will be eliminated somehow.
Yep, let's see what DLSS 3.1 will bring. Maybe it will also "make up" every third frame, not just every second one as it does now.
You can be sure the 5xxx series will bring more such "goodies" of controversial taste.

This is not true. Nvidia is constantly using their market dominance and superior developer relations program to make the rules and change the goalposts according to their specific architectural advantages.
That's all true, but I don't see any fundamental problem with this practice.
You invest, you steer the market in a direction that favors your product in order to claim more market share and make more profit, and you exploit your current standing, financial strength and market dominance. That's how it goes in big-boy land.
Nothing new; it's been done by many, many companies worldwide for years, since long before NV came to life.
Moreover, you can legitimately lobby governments to change the rules in favor of your own company. All by the book. All "kosher".

NV just plays aggressively (but elegantly) with the cards they have, and they are the chip leader (poker metaphor intended) at a table where only two players sit (let's keep Arc aside for now, please).
Do you think AMD (or you) would have played differently if it were the other way around?

And with the power and resources they have, they fortify their standing by pioneering a lot of tech that benefits many (non-gaming) sectors. A very strong positive feedback loop. From a tech standpoint, in a capitalistic Wall Street market, it is the best way to go.
 
Last edited:
Joined
Jun 6, 2021
Messages
678 (0.56/day)
System Name Red Devil
Processor AMD 5950x - Vermeer - B0
Motherboard Gigabyte X570 AORUS MASTER
Cooling NZXT Kraken Z73 360mm; 14 x Corsair QL 120mm RGB Case Fans
Memory G.SKill Trident Z Neo 32GB Kit DDR4-3600 CL14 (F4-3600C14Q-32GTZNB)
Video Card(s) PowerColor's Red Devil Radeon RX 6900 XT (Navi 21 XTX)
Storage 2 x Western Digital SN850 1GB; 1 x Samsung SSD 870EVO 2TB
Display(s) 3 x Asus VG27AQL1A; 1 x Sony A1E OLED 4K
Case Corsair Obsidian 1000D
Audio Device(s) Corsair SP2500; Steel Series Arctis Nova Pro Wireless (XBox Version)
Power Supply AX1500i Digital ATX - 1500w - 80 Plus Titanium
Mouse Razer Basilisk V3
Keyboard Razer Huntsman V2 - Optical Gaming Keyboard
Software Windows 11
NV is not to blame; it just reveals the true advantage of NV RTX vs. AMD RX when fully unleashed.
Dedicated, specialized hardware (RT cores in this case) will always do better. See also ASICs for mining.
If and when AMD decides to spend some silicon area on that feature, they will be back in the (RT) game.

Smells like a good new RT benchmark title, and a real case, not just a tech demo.
I was in the camp hoping that AMD could do more with less... it's pretty obvious that isn't the case.

And with Unreal Engine 5 we're also seeing how GPU compute triangles for "infinite" geometry detail paired with mostly screen space effects provides a much larger boost to IQ and immersion, on all GPU architectures and playable on the GPUs of today.
I'm glad you brought that up... Fortnite's UE 5.1 crippled my 6900 XT at 1440p with max settings: 30-40 fps. The 4090 at 2160p with max settings: 85-90 fps. If I'm not mistaken, AMD worked with Epic on this engine; I hope the 7000 series can handle it much better than their current hardware.
 
Joined
Oct 26, 2018
Messages
222 (0.10/day)
Processor Intel i5-13600KF
Motherboard ASRock Z790 PG Lightning
Cooling NZXT Kraken 240
Memory Corsair Vengeance DDR5 6400
Video Card(s) XFX RX 7800 XT
Storage Samsung 990 Pro 2 TB + Samsung 860 EVO 1TB
Display(s) Dell S2721DGF 165Hz
Case Fractal Meshify C
Power Supply Seasonic Focus 750
Mouse Logitech G502 HERO
Keyboard Logitech G512
Wow, the 3070 has a whole 1 FPS at 4K and is beaten big time by the 3060.
Wow, the 6900 XT and RTX 3080 get 1 FPS too. Wouldn't it be a bigger "wow" that the 3060 beat the 3080 "big time"?
I do think it's interesting that the 3060 is able to take the lead in any situation, but this is not a real-world scenario.
Nobody with a 3060 or 3070 plays at 4K anyway, so who really cares?
At any playable resolution, the 3070 beats the 3060 by a substantial margin, as expected.
Same old story... by the time the greater amount of slower VRAM on the cheaper card pays off, it's completely useless anyway (3060 @ 4K).
10GB VRAM usage for 1440p and 16GB for 4K and that's without considering the age of this game.
Those numbers are for the 4090.
If it were that simple, I'd expect the 3070 to take a hit at 2K, which it did not.
In fact, comparing 2K to 1K, the 3070 increased its relative lead over both the 2080 Ti and the 3060, which suggests more than 8GB of VRAM isn't helping at all at 2K.
The 3070 was always going to be quick to obsolescence due to its limited VRAM size, but this really does demonstrate that in spades.
At 4K, no card in existence can produce acceptable FPS in this title.
The 3080 with 10GB and the 6900XT with 16GB both perform the same as the 3070 at 4K, so by itself this really demonstrates nothing.
It's published by Nvidia, so this is almost certainly a case of Nvidia pulling another GameWorks and doing everything it can to nuke AMD performance. The 6900 XT is much faster than the 3060 in RT, yet here it's much slower. This is more or less what I expected of these Nvidia RTX mods. Nvidia seems to only be accelerating its anti-competitive, anti-consumer practices of late.
Yes, it's a big conspiracy to make your games less fun!
Portal RTX is definitely not an RTX tech demo, and is obviously just a Portal sequel not affiliated with any company.
It's not a company trying to promote the products they sell.
It's definitely not that AMD's implementation is partially running on stream processors because AMD does not have complete dedicated RT hardware.
"Traversal of the BVH and shading of ray results is handled by shader code running on the Compute Units"
"AMD’s RDNA 2 offloads BVH traversal to the stream processors which likely stalls the standard graphics/compute pipeline"

At this point I'm pretty sure your wife's boyfriend has a 3070:D
 
Joined
Apr 18, 2013
Messages
1,260 (0.30/day)
Location
Artem S. Tashkinov
In games with actual raytracing, Nvidia cards are also slow. Being forced to use upscaling on a $1000+ card, even at 1080p, means the game is broken, simple as that.

It sounds funny only until you realize that 3D rendering with proper ray/path tracing is computationally insanely expensive, and the fact that NVIDIA has managed to render it in real time at 1080p and 60 fps is an extreme achievement in itself.
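Some napkin math to show the scale of the problem (the sample and bounce counts below are illustrative assumptions, not Portal RTX's actual settings):
Code:
# Back-of-envelope: ray-scene queries per second implied by real-time path tracing at 1080p60.
# samples_per_pixel and bounces_per_path are illustrative assumptions, not the game's settings.
width, height, fps = 1920, 1080, 60
samples_per_pixel = 2                    # paths started per pixel, per frame (assumed)
bounces_per_path = 3                     # indirect bounces per path (assumed)
rays_per_path = 1 + bounces_per_path     # primary ray plus one ray per bounce; shadow rays ignored

rays_per_second = width * height * fps * samples_per_pixel * rays_per_path
print(f"~{rays_per_second / 1e9:.1f} billion ray-scene queries per second")  # ~1.0 billion
And every one of those queries has to walk an acceleration structure and shade a hit, every 16.7 ms.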


Of course, if it's AMDSlow then it shouldn't exist in the first place, right?

Why do we keep having the same conversation again and again? Rewind to 2000 and we had all the same conversations about how "programmable shaders are slow and shouldn't exist". Show me a single modern 3D game without hundreds if not thousands of shader programs. None, nothing? Should we just keep relying on prebaked lighting, which falls apart or doesn't work in far too many cases?

And before you say "They shouldn't have released something as slow as it is and instead they should have waited a few years": how are you going to implement and advance something if it's not actively used? By this logic we'd never have programmable shaders, tessellation or any new DirectX/OpenGL version, because "TOO SLOW!!!!!!!"

Most hateful comments about DXR/Vulkan are an insult to intelligence, progress and ingenuity.
 
Joined
Dec 5, 2013
Messages
632 (0.16/day)
Location
UK
Most hateful comments about DXR/Vulkan are an insult to intelligence, progress and ingenuity.
Genuine progress would be actually making Portal 3 whilst Ellen McLain is still active, not messing around arguing over how needing a $2,000 GPU to hit 26-60 fps in a 15-year-old game is "progress" of the pure cringe e-peen kind... :rolleyes:
 
Last edited:
Joined
Apr 18, 2013
Messages
1,260 (0.30/day)
Location
Artem S. Tashkinov
By the time AMD released GCN, which was superior to Fermi in compute, Nvidia launched Kepler with worse compute performance than Fermi but a huge boost in geometry performance. Nvidia proceeded to use their dev relations advantage to push for tons of tessellation and sub-pixel triangles in games (e.g. HairWorks, god rays, super-detailed concrete slabs, etc.) and to prop up tessellation-heavy benchmarks like Unigine Heaven so they could keep the upper hand. At that point we'd see Anandtech using synthetic tessellation benchmarks to show, again, why PC games would show a performance advantage on Nvidia GPUs.
So many things wrong with just this single paragraph, which sounds almost unbiased and true; it's quite sad really:
  • The vast majority of people don't pay attention to benchmarks from specialized benchmarking applications such as 3DMark, Unigine Heaven/Valley/Superposition, etc.
  • Most reviewers don't pay much attention to them either.
  • AMD did manage to reach the same level of tessellation performance later on, as many games started to rely on it heavily, which you rightfully noted in the follow-up paragraph but tried to downplay as if it were the switch from GloFo to TSMC that allowed it, which is patently untrue.
  • Reviewers did avoid using Crysis 2 for the same reason: it rendered hidden objects with heavy tessellation, something that was later described as done inadvertently/by mistake, not to make NVIDIA cards look better.
  • The vast majority of reviewers disabled NVIDIA GameWorks (based on tessellation) when benchmarking games, which was actually unfair to NVIDIA. Fast forward 10 years, and most games with heavy RTRT effects are likewise avoided or put in separate review sections, like here on TPU. Only games with light RTRT are benchmarked, which looks like an unfair advantage for AMD (which spent most of its transistor budget on last-gen rasterization tech), but who cares, right?
Overall, NVIDIA pushed these features, but tessellation was first released by... Microsoft, as... an open standard for any 3D company to implement. AMD, with its financial struggles, didn't have enough bright people to get it right in the beginning, which they rectified later.

Come ~2018, and AMD with RDNA caught up in rasterization in pretty much all aspects: shader processor utilization is excellent, they finally got rid of GlobalFoundries' inferior FinFET processes and started using TSMC as well (better perf-per-watt, higher clocks), and the geometry bottleneck shifted to compute culling, making it a non-issue for most GPUs.
As soon as AMD caught up with geometry performance, Nvidia once again changed the goalposts to raytracing performance.
Microsoft came up with DXR (and Vulkan RT through an open committee as well), which NVIDIA implemented beautifully and offered game developers to take advantage of, since it allowed them to fix certain effects that the traditional rasterized pipeline couldn't handle, and NVIDIA is again the "bad guy"? Should we not have any progress towards better-looking games and more realistic rendering? Is this what you're blaming NVIDIA for? Or should game developers use new rendering features only when AMD catches up? But what if AMD is not in a position to catch up, as was the case at first with tessellation, which took them AFAIK almost a decade to fix? What if it takes them a decade to reach NVIDIA's level of RTRT performance? Should we not have any games using it? That would be a horrible day, no, a horrible future. Let's continue using old shader-based fake lighting, fake reflections and fake shadows until we all die, right?

My guess is this is exactly why RDNA3 isn't bringing any substantial improvements to raytracing performance, nor should it.
Your guess is wrong. RDNA3 is ~80% faster in RTRT than RDNA2, which makes it almost as fast as Ampere (GeForce RTX 30 series).

If AMD keeps chasing after Nvidia's latest architectural shift they're never going to gain ground. They'd always be 2 generations behind.
It's not NVIDIA's shift; it's a game industry shift. It's not NVIDIA's fault that AMD spreads itself thin over CPUs/GPUs/APUs for consoles and then has the insolence to buy Xilinx, money ($35 billion, for God's sake) they could have invested in making RDNA3 a true competitor to Ada Lovelace instead of second fiddle again and again, touted as an excellent arch for older games. Sigh.

It's not like these shifts are bringing substantial boosts to IQ, even. HairWorks on Geralt's hair didn't make the game look substantially better but axed performance by what, 30%? Did anyone think the god rays in Fallout 4 were worth the 20% performance hit? Is raytracing in Cyberpunk worth the 50% performance hit?
NVIDIA's HairWorks did make game characters' hair look a ton better, and nowadays that's how it's universally done. If it had been a gimmick tanking performance for no reason or benefit, it would never have taken off.

What a way to turn the entire history of game graphics inside out. You've achieved something most PR/propaganda departments could only dream of: you made it look totally plausible and real except it's almost universally untrue.

You want to actually hate NVIDIA? How about their pricing?


Or the entire Ada stack?

 
Last edited:
Joined
Sep 8, 2009
Messages
1,067 (0.19/day)
Location
Porto
Processor Ryzen 9 5900X
Motherboard Gigabyte X570 Aorus Pro
Cooling AiO 240mm
Memory 2x 32GB Kingston Fury Beast 3600MHz CL18
Video Card(s) Radeon RX 6900XT Reference (amd.com)
Storage O.S.: 256GB SATA | 2x 1TB SanDisk SSD SATA Data | Games: 1TB Samsung 970 Evo
Display(s) LG 34" UWQHD
Audio Device(s) X-Fi XtremeMusic + Gigaworks SB750 7.1 THX
Power Supply XFX 850W
Mouse Logitech G502 Wireless
VR HMD Lenovo Explorer
Software Windows 10 64bit
What a way to turn the entire history of game graphics inside out. You've achieved something most PR/propaganda departments could only dream of: you made it look totally plausible and real except it's almost universally untrue.
Quite the accusation coming from the guy who uses the term "implemented beautifully by Nvidia", thinks that AMD "doesn't have enough bright people" and them buying Xilinx was an "act of insolence".
All this while presenting zero sources for his claims.

I mean, there is one source that corroborates my claim that RDNA3 brings little to no improvement to raytracing performance: Navi 31 is an all-around ~80% faster card than Navi 21 and its raytracing performance is ~80% higher, meaning the improvements specific to raytracing are marginal at best.
Thanks for helping my case then.
 
Last edited:
Joined
Sep 4, 2022
Messages
267 (0.35/day)
In order for RT to be where rasterization is today on a 4090, consoles would have to have 4090-like RT performance. If the PS5 Pro is based on RDNA 3 technology, then IMO we would have to wait for a PS6 console upgrade, at a minimum, to see RT performance where rasterization is today.
 
Joined
Jan 14, 2019
Messages
11,560 (5.55/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Well, I'd disagree right there that I'm dumbing them down. At 4K for sure, now that I'm on a 4K 120 Hz OLED, and often still at 1440p, I find the DLSS image downright preferable; the AA and detail it's capable of is astounding to me, and the performance increase is just the cherry on top. FSR 2.1+ has similarly impressed me vs. native + TAA at 4K. Also bear in mind it's been demonstrated that higher framerates at a given resolution can also enhance results. I can see why at 1080p60 you're not gaining a lot of perf OR retaining enough image quality to make it worthwhile, so I see where your use case makes it a letdown. I disagree that upscaling is counter-intuitive as a whole idea, but there undeniably are certain situations where it would be counter-intuitive to use it.

Well when you purposely phrase it in a belittling and disparaging way, yeah it sounds stupid, you achieved what you set out to do.

A phrase I might use to describe that situation instead: "I upgraded from a decent VA 1440p monitor to a top-tier 4K high-refresh-rate OLED, and upscaling gives me another great tool to tweak and enhance my gaming experience on my existing hardware and extract the panel's potential for a phenomenal, immersive gaming experience." Doesn't that sound pretty appealing? :) Now of course, that question right there isn't for you, because we know you don't find it appealing, but I can tell you I'm far from alone.
To be fair, it sounds all the same to me. Higher resolution monitor with upscaling vs lower resolution native. My vote is on the latter. :)
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,129 (1.28/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
To be fair, it sounds all the same to me. Higher resolution monitor with upscaling vs lower resolution native. My vote is on the latter. :)
I'd take the former without hesitation: higher resolution with upscaling, and I want it in all my games. They don't really sound the same at all to me; one stands head and shoulders above the other for visual fidelity, IMO, not a doubt in my mind.
 
Last edited:
Joined
Jun 6, 2021
Messages
678 (0.56/day)
System Name Red Devil
Processor AMD 5950x - Vermeer - B0
Motherboard Gigabyte X570 AORUS MASTER
Cooling NZXT Kraken Z73 360mm; 14 x Corsair QL 120mm RGB Case Fans
Memory G.SKill Trident Z Neo 32GB Kit DDR4-3600 CL14 (F4-3600C14Q-32GTZNB)
Video Card(s) PowerColor's Red Devil Radeon RX 6900 XT (Navi 21 XTX)
Storage 2 x Western Digital SN850 1GB; 1 x Samsung SSD 870EVO 2TB
Display(s) 3 x Asus VG27AQL1A; 1 x Sony A1E OLED 4K
Case Corsair Obsidian 1000D
Audio Device(s) Corsair SP2500; Steel Series Arctis Nova Pro Wireless (XBox Version)
Power Supply AX1500i Digital ATX - 1500w - 80 Plus Titanium
Mouse Razer Basilisk V3
Keyboard Razer Huntsman V2 - Optical Gaming Keyboard
Software Windows 11
So many so-called "tech heads" here, yet they hate graphical progress. Perplexing, to say the least. I used to frown on this lighting, but now I see the truth. Baked-in lighting is a thing of the past! :rockout: :rockout: :rockout:
 
Joined
Dec 22, 2011
Messages
3,890 (0.83/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Looking at my Steam achievements, my first playthrough was right back in September 2011; I didn't realise it was that long ago.

Looking forward to another playthrough tomorrow. :cool:
 
Joined
Jun 6, 2021
Messages
678 (0.56/day)
System Name Red Devil
Processor AMD 5950x - Vermeer - B0
Motherboard Gigabyte X570 AORUS MASTER
Cooling NZXT Kraken Z73 360mm; 14 x Corsair QL 120mm RGB Case Fans
Memory G.SKill Trident Z Neo 32GB Kit DDR4-3600 CL14 (F4-3600C14Q-32GTZNB)
Video Card(s) PowerColor's Red Devil Radeon RX 6900 XT (Navi 21 XTX)
Storage 2 x Western Digital SN850 1GB; 1 x Samsung SSD 870EVO 2TB
Display(s) 3 x Asus VG27AQL1A; 1 x Sony A1E OLED 4K
Case Corsair Obsidian 1000D
Audio Device(s) Corsair SP2500; Steel Series Arctis Nova Pro Wireless (XBox Version)
Power Supply AX1500i Digital ATX - 1500w - 80 Plus Titanium
Mouse Razer Basilisk V3
Keyboard Razer Huntsman V2 - Optical Gaming Keyboard
Software Windows 11
Looking at my Steam achievements, my first playthrough was right back in September 2011; I didn't realise it was that long ago.

Looking forward to another playthrough tomorrow. :cool:
I wish I could, but my GPU won't allow it.
 
Joined
Dec 22, 2011
Messages
3,890 (0.83/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
I wish I could, but my GPU won't allow it.

Hopefully AMD will release a driver that gives you a suitable performance boost to make it playable; the most recent Nvidia driver (527.37), for example, added support, if I recall correctly.
 