
Ratchet & Clank Rift Apart Benchmark Test & Performance Analysis

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,719 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Joined
Feb 1, 2019
Messages
3,523 (1.67/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,719 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
AMD themselves are working on a driver update to fix the crashing issue, so it's obvious nvidia is behind it and nvidia fanboys are having double standards. The amd crusade at it again
Not sure if the crashing issue is even related to RT not being enabled.

AMD: "Application crash or driver timeout may be observed while playing Ratchet & Clank: Rift Apart with Ray-Tracing and Dynamic Resolution Scaling enabled on some AMD Graphics Products, such as the Radeon RX 7900 XTX."

Actually, you cannot enable RT in the game with AMD, resolution scaling or not. If Nixxes were fixing the issue above, they would turn off resolution scaling when RT is enabled on AMD, yet they disabled RT completely.

However, DS logically will surely consume VRAM.
Logically? Please explain.

IMO if you have a fast path from storage to VRAM, then you don't need to preload stuff into VRAM, because you can just load it shortly before it's needed -> lower VRAM usage
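As a toy illustration of that reasoning (all numbers below are made up for the example, nothing is measured from the game):

```cpp
// Toy model: peak texture residency with "preload everything" vs. on-demand streaming.
// All numbers are illustrative, not measurements from Ratchet & Clank.
#include <cstdio>

int main() {
    const double assetCount      = 4000;  // textures in a level (hypothetical)
    const double avgAssetMB      = 4.0;   // average size once in VRAM (hypothetical)
    const double workingSetShare = 0.25;  // fraction actually needed at any one moment (hypothetical)

    const double preloadMB = assetCount * avgAssetMB;                    // everything resident up front
    const double streamMB  = assetCount * avgAssetMB * workingSetShare;  // only the working set resident

    std::printf("Preload everything : %.0f MB resident\n", preloadMB);
    std::printf("Stream on demand   : %.0f MB resident\n", streamMB);
}
```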
 
Joined
Feb 1, 2019
Messages
3,523 (1.67/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Not sure if the crashing issue is even related to RT not being enabled.

AMD: "Application crash or driver timeout may be observed while playing Ratchet & Clank: Rift Apart with Ray-Tracing and Dynamic Resolution Scaling enabled on some AMD Graphics Products, such as the Radeon RX 7900 XTX."

Actually, you cannot enable RT in the game with AMD, resolution scaling or not. If Nixxes were fixing the issue above, they would turn off resolution scaling when RT is enabled on AMD, yet they disabled RT completely.


Logically? Please explain.

IMO if you have a fast path from storage to VRAM, then you don't need to preload stuff into VRAM, because you can just load it shortly before it's needed -> lower VRAM usage
Well, I assumed a separate DS cache, but you are right, it can just be loaded straight in as a texture. Maybe just a small I/O buffer, so yeah, I think I got that wrong.
 
Joined
May 19, 2021
Messages
13 (0.01/day)
It's 1.1.. 1.2 was just released in April, no way for anyone to integrate that so soon
So you're saying that pretty much every article written on this port is wrong? Almost all of them mention it using DS 1.2, and even Nixxes themselves claim that they are using 1.2 on Steam (post from July 18 if you want to check it), so in this case I think you are wrong.
 
Joined
Jun 6, 2021
Messages
683 (0.55/day)
System Name Red Devil
Processor AMD 5950x - Vermeer - B0
Motherboard Gigabyte X570 AORUS MASTER
Cooling NZXT Kraken Z73 360mm; 14 x Corsair QL 120mm RGB Case Fans
Memory G.SKill Trident Z Neo 32GB Kit DDR4-3600 CL14 (F4-3600C14Q-32GTZNB)
Video Card(s) PowerColor's Red Devil Radeon RX 6900 XT (Navi 21 XTX)
Storage 2 x Western Digital SN850 1GB; 1 x Samsung SSD 870EVO 2TB
Display(s) 3 x Asus VG27AQL1A; 1 x Sony A1E OLED 4K
Case Corsair Obsidian 1000D
Audio Device(s) Corsair SP2500; Steel Series Arctis Nova Pro Wireless (XBox Version)
Power Supply AX1500i Digital ATX - 1500w - 80 Plus Titanium
Mouse Razer Basilisk V3
Keyboard Razer Huntsman V2 - Optical Gaming Keyboard
Software Windows 11
Game does have RT, only AMD hardware can't enable RT due to driver bugs
I'm certain you knew what I was talking about... Driver bug - convenient isn't it?

Didn't realize AMD GPU owners cared about RT....... Hopefully they get their drivers in order.

Also, just looking at the last couple of years, Nvidia has done a much better job in sponsored titles of supporting competing technologies, unlike AMD, which is why there was a collective groan when it was announced that Starfield was an AMD-sponsored game.
I am one of those AMD owners who cares a lot about RT, and I'm also one of those who stated they will be going Nvidia for my next card UNLESS AMD comes exceptionally close in RT or matches Nvidia's offerings. RT is the future and I'm all in for it. The second part of your comment is rather bogus, as one of the main remarks made by the media and fanboys alike is that Nvidia doesn't do this. Folks and their selective memory loss.

Could be that AMD released a driver update that broke raytracing in R&C. They borked RL as well recently (very annoying).
As AMD hasn't released a fix yet, there's probably a reason for it (reverting breaks something more important, for example), so they had two options:
- Delay launch of R&C for AMD to release an updated driver
- Just launch R&C but disable raytracing on AMD

As most don't really care about raytracing on AMD, and considering it's an Nvidia-sponsored title, I can't really fault Nixxes for just releasing it.

In short, I don't think Nvidia has anything to do with it, aside from the fact that if AMD had sponsored it, they maybe would've delayed the launch.
Too much speculation, I don't have any of the answers for those. I already own this on PS5 so I won't be testing it on PC. Lastly, I'm the AMD unicorn owner - I love RT and I can't wait for it to be implemented in a far more efficient manner.
 
Joined
Sep 10, 2018
Messages
6,850 (3.04/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I'm certain you knew what I was talking about... Driver bug - convenient isn't it?


I am one of those AMD owners who cares a lot about RT, and I'm also one of those who stated they will be going Nvidia for my next card UNLESS AMD comes exceptionally close in RT or matches Nvidia's offerings. RT is the future and I'm all in for it. The second part of your comment is rather bogus, as one of the main remarks made by the media and fanboys alike is that Nvidia doesn't do this. Folks and their selective memory loss.

Ummm.... Thankfully modders have been more competent implementing DLSS than the developers implementing FSR....





 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,719 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
So you're saying that pretty much every article written on this port is wrong? Almost all of them mentions it using DS 1.2 and even Nixxes themselves claim that they are using 1.2 on Steam (post from July 18 if you want to check it), so in this case I think you are wrong.
Guess I am wrong and it's 1.2
 
Joined
Oct 12, 2005
Messages
703 (0.10/day)
Nvidia's limitation of 8 lanes for the 4060 and the 4060 Ti is just as lamentable given that the 1050 Ti had 16 PCIe lanes with a smaller die size than any of these GPUs. This is why I don't buy the argument about limited space at the edge of the die. The die sizes are:
It's rather telling that the oldest and smallest die is the only one with 16 PCIe lanes.

The smallest die also only has PCIe 3.0, not 4.0. PCIe 4.0 uses way more space, and those things no longer scale much with process. Also, the 1050 Ti uses GDDR5, which also takes less space than GDDR6.
 
Joined
Nov 26, 2021
Messages
1,623 (1.51/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
The smallest die also only has PCIe 3.0, not 4.0. PCIe 4.0 uses way more space, and those things no longer scale much with process. Also, the 1050 Ti uses GDDR5, which also takes less space than GDDR6.
PCIe is a serial link; 4.0 vs. 3.0 won't add much die space. 16 lanes of PCIe 3.0 are estimated to be only about 3 mm^2 in TSMC's 28 nm process.
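As a rough back-of-the-envelope on that estimate (the ~132 mm² figure below is the commonly cited die size of the 1050 Ti's GP107; treat it as an assumption):

```cpp
// Back-of-the-envelope: x16 PCIe PHY area as a share of a small GPU die.
#include <cstdio>

int main() {
    const double phyAreaMM2 = 3.0;    // x16 PCIe 3.0 estimate quoted above (28 nm)
    const double dieAreaMM2 = 132.0;  // commonly cited GP107 (GTX 1050 Ti) die size
    std::printf("x16 PHY ~ %.1f%% of the die\n", 100.0 * phyAreaMM2 / dieAreaMM2);  // ~2.3%
}
```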
 
Joined
Jan 9, 2023
Messages
293 (0.44/day)
Too much speculation, I don't have any of the answers for those. I already own this on PS5 so I won't be testing it on PC. Lastly, I'm the AMD unicorn owner - I love RT and I can't wait for it to be implemented in a far more efficient manner.
So we're going to speculate that Nvidia is intentionally doing this to AMD instead? Because that's what my comment is a reply to.
I'd like to err on the side of innocent until proven guilty.

RTX 4060 Ti 16 GB has been added
Another game to add to the list of games that are "problematic" for 8GB cards. Though I expected some more pain.
I wonder if we can see the difference between 8GB and 16GB visually in realistic conditions.
 

bug

Joined
May 22, 2015
Messages
13,718 (3.97/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Shouldn't you be blaming the developer of the game?
Apparently not: https://www.amd.com/en/support/kb/release-notes/rn-rad-win-23-7-2
Application crash or driver timeout may be observed while playing Ratchet & Clank™: Rift Apart with Ray-Tracing and Dynamic Resolution Scaling enabled on some AMD Graphics Products, such as the Radeon™ RX 7900 XTX.

Also, I'm not sure how this qualifies as a Nvidia-sponsored title, since it's a console port. It literally started on AMD hardware.
 
Joined
Apr 18, 2019
Messages
100 (0.05/day)
System Name Annihilator
Processor 5800X
Motherboard MSI x570 Gaming Plus
Cooling ID-COOLING FrostFlow X 240
Memory 32GB TForce Xtreem White
Video Card(s) ASRock RX 7800XT Steel Legend
Storage NVME Gen-3 1 TB, 2 TB SSD
Display(s) 32" 144Hz 2k + 34" UW 2k
Case Kediers C-580
Audio Device(s) Sony 5.1 Home Theater
Power Supply TT 850W True Power
Mouse EVGA x20
Keyboard EVGA Z15 RGB Mechanical Gaming Keyboard
Software Winbdows 11
Benchmark Scores 13,797 with AMD Radeon RX 7800 XT(1x) and AMD Ryzen 7 5800X on FireStrike Ultra
Well, as usual, I won't even bother to try it until it gets patched. I am sick and tired of people complaining about beta releases ... early-adopter releases that are obviously incomplete and poorly optimized. If you guys vote with your wallets, they will listen.

Repeat after me: "no buy until the game runs smooth ..." Feel free to add any expletives at the end. :roll:
 
Joined
Sep 21, 2020
Messages
1,610 (1.07/day)
Processor 5800X3D -30 CO
Motherboard MSI B550 Tomahawk
Cooling DeepCool Assassin III
Memory 32GB G.SKILL Ripjaws V @ 3800 CL14
Video Card(s) ASRock MBA 7900XTX
Storage 1TB WD SN850X + 1TB ADATA SX8200 Pro
Display(s) Dell S2721QS 4K60
Case Cooler Master CM690 II Advanced USB 3.0
Audio Device(s) Audiotrak Prodigy Cube Black (JRC MUSES 8820D) + CAL (recabled)
Power Supply Seasonic Prime TX-750
Mouse Logitech Cordless Desktop Wave
Keyboard Logitech Cordless Desktop Wave
Software Windows 10 Pro
AMD GPUs are doing surprisingly well in the 1% lows here, much better than their Nvidia counterparts. These results correlate directly with their half-precision/FP16 processing power:

GPU       | 1% low FPS @ 4K | FP16 TFLOPS
7900 XTX  | 75              | 122.9
7900 XT   | 63              | 103.2
4090      | 61              | 82.6
4080      | 54              | 48.8
6900 XT   | 48              | 46.1
3090 Ti   | 46              | 40.0
4070 Ti   | 45              | 40.1
6800 XT   | 45              | 41.5
3090      | 40              | 35.7
6800      | 38              | 32.3

Here's my theory:

Ratchet & Clank is the first PC title to take advantage of GPU Decompression. This feature of DirectStorage 1.1+ allows the decompression of game assets to take place on the GPU. This approach is way faster than traditional decompression on the CPU because it avoids a number of bottlenecks, frees up the CPU for other game-related tasks, and takes advantage of the massive parallel processing capabilities of modern GPUs. It also leverages the much higher bandwidth of the card's VRAM for decompressing and copying game data. Since GPGPU workloads execute faster at reduced (half) floating-point precision, I would presume that the GDeflate compression stream format used by GPU Decompression performs better on cards with a higher FP16 rating.

Since the whole idea of GPU Decompression (besides reducing load times) is to improve asset streaming -- notably in open world games -- GPUs that can do it faster should also show better 1% and 0.1% low figures, allowing for smoother gameplay.
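For what it's worth, a quick Pearson correlation over the ten cards in the table above comes out very high; the sketch below only uses the numbers from that table and is a sanity check of the observation, not proof that GPU decompression is the cause.

```cpp
// Pearson correlation between the 1% low FPS and FP16 TFLOPS columns above.
// Correlation is of course not proof that GDeflate decompression is the cause.
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
    std::vector<double> fp16 = {122.9, 103.2, 82.6, 48.8, 46.1, 40.0, 40.1, 41.5, 35.7, 32.3};
    std::vector<double> lows = {75, 63, 61, 54, 48, 46, 45, 45, 40, 38};  // 1% low FPS @ 4K

    double mx = 0, my = 0;
    for (size_t i = 0; i < fp16.size(); ++i) { mx += fp16[i]; my += lows[i]; }
    mx /= fp16.size(); my /= lows.size();

    double sxy = 0, sxx = 0, syy = 0;
    for (size_t i = 0; i < fp16.size(); ++i) {
        sxy += (fp16[i] - mx) * (lows[i] - my);
        sxx += (fp16[i] - mx) * (fp16[i] - mx);
        syy += (lows[i] - my) * (lows[i] - my);
    }
    std::printf("Pearson r = %.2f\n", sxy / std::sqrt(sxx * syy));  // ~0.96 for this data
}
```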

Well, I assumed a separate DS cache, but you are right, it can just be loaded straight in as a texture. Maybe just a small I/O buffer, so yeah, I think I got that wrong.
You're right in saying that DirectStorage increases VRAM usage, but it does so by a negligible amount. It places two additional staging buffers in VRAM whose size can be defined. It is assumed that 128-256 MB per buffer is optimal:


Images taken from here. And here's a good article on the subject.
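For anyone curious what those two knobs look like in code, below is a minimal, untested sketch against the public DirectStorage API; the 192 MB staging size, file path and helper function are illustrative only, and Nixxes' actual integration (or Nvidia's RTX IO layer on top of it) may well differ.

```cpp
// Minimal, untested sketch of the two knobs discussed above, against the public
// DirectStorage API (dstorage.h, link with dstorage.lib): the VRAM staging buffer
// size, and a read request that asks the runtime to GDeflate-decompress on the GPU.
// D3D12 device/resource creation, error handling and fence-based completion are omitted.
#include <d3d12.h>
#include <dstorage.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Hypothetical helper: stream one GDeflate-compressed blob into an existing GPU buffer.
void LoadCompressedAsset(ID3D12Device* device, ID3D12Resource* destBuffer,
                         UINT32 compressedSize, UINT32 uncompressedSize)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    // Staging buffers live in VRAM; 128-256 MB is the range the material above suggests.
    factory->SetStagingBufferSize(192u * 1024u * 1024u);  // 192 MB, an arbitrary pick in that range

    DSTORAGE_QUEUE_DESC queueDesc = {};
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.Device     = device;

    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"assets/level.gdeflate", IID_PPV_ARGS(&file));  // hypothetical path

    DSTORAGE_REQUEST request = {};
    request.Options.SourceType        = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType   = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE;  // decompressed on the GPU
    request.Source.File.Source        = file.Get();
    request.Source.File.Offset        = 0;
    request.Source.File.Size          = compressedSize;
    request.Destination.Buffer.Resource = destBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = uncompressedSize;
    request.UncompressedSize            = uncompressedSize;

    queue->EnqueueRequest(&request);
    queue->Submit();  // completion would normally be tracked via EnqueueSignal() + an ID3D12Fence
}
```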
 
Last edited:
Joined
Sep 10, 2018
Messages
6,850 (3.04/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
AMD GPUs are doing surprisingly well in the 1% lows here, much better than their Nvidia counterparts. These results correlate directly with their half-precision/FP16 processing power:

GPU       | 1% low FPS @ 4K | FP16 TFLOPS
7900 XTX  | 75              | 122.9
7900 XT   | 63              | 103.2
4090      | 61              | 82.6
4080      | 54              | 48.8
6900 XT   | 48              | 46.1
3090 Ti   | 46              | 40.0
4070 Ti   | 45              | 40.1
6800 XT   | 45              | 41.5
3090      | 40              | 35.7
6800      | 38              | 32.3

Here's my theory:

Ratchet & Clank is the first PC title to take advantage of GPU Decompression. This feature of DirectStorage 1.1+ allows the decompression of game assets to take place on the GPU. This approach is way faster than traditional decompression on the CPU because it avoids a number of bottlenecks, frees up the CPU for other game-related tasks, and takes advantage of the massive parallel processing capabilities of modern GPUs. It also leverages the much higher bandwidth of the card's VRAM for decompressing and copying game data. Since GPGPU workloads execute faster at reduced (half) floating-point precision, I would presume that the GDeflate compression stream format used by GPU Decompression performs better on cards with a higher FP16 rating.

Since the whole idea of GPU Decompression (besides reducing load times) is to improve asset streaming -- notably in open world games -- GPUs that can do it faster should also show better 1% and 0.1% low figures, allowing for smoother gameplay.


You're right in saying that DirectStorage increases VRAM usage, but it does so by a negligible amount. It places two additional staging buffers in VRAM whose size can be defined. It is assumed that 128-256 MB per buffer is optimal:

Images taken from here. And here's a good article on the subject.

I always thought it was odd that Nvidia has their own version of this... Guessing, if true, this is why.

edit: Apparently RTX I/O is being used in this...
 
Joined
Nov 11, 2016
Messages
3,393 (1.16/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
I'm certain you knew what I was talking about... Driver bug - convenient isn't it?


I am one of those AMD owners who cares a lot about RT, and I'm also one of those who stated they will be going Nvidia for my next card UNLESS AMD comes exceptionally close in RT or matches Nvidia's offerings. RT is the future and I'm all in for it. The second part of your comment is rather bogus, as one of the main remarks made by the media and fanboys alike is that Nvidia doesn't do this. Folks and their selective memory loss.


Too much speculation, I don't have any of the answers for those. I already own this on PS5 so I won't be testing it on PC. Lastly, I'm the AMD unicorn owner - I love RT and I can't wait for it to be implemented in a far more efficient manner.

Yeah, it's Nvidia's fault that AMD has a poor track record of producing proper Radeon drivers.

Either way, Radeon owners can enable RT in R&C later with a driver/game update, but DLSS will never come to AMD-sponsored games like Jedi Survivor, RE4 Remake, Callisto Protocol, or even Starfield.
 
Joined
Apr 6, 2021
Messages
1,131 (0.86/day)
Location
Bavaria ⌬ Germany
System Name ✨ Lenovo M700 [Tiny]
Cooling ⚠️ 78,08% N² ⌬ 20,95% O² ⌬ 0,93% Ar ⌬ 0,04% CO²
Audio Device(s) ◐◑ AKG K702 ⌬ FiiO E10K Olympus 2
Mouse ✌️ Corsair M65 RGB Elite [Black] ⌬ Endgame Gear MPC-890 Cordura
Keyboard ⌨ Turtle Beach Impact 500
Man, AMD is totally pissing all over Nvidia in the 1% lows. :eek: In a "Nvidia sponsored" game, lol. Pretty embarrassing.

Apparently not: https://www.amd.com/en/support/kb/release-notes/rn-rad-win-23-7-2


Also, I'm not sure how this qualifies as a Nvidia-sponsored title, since it's a console port. It literally started on AMD hardware.

"AMD is working with the game developers of Ratchet & Clank™: Rift Apart to resolve some stability issues when Ray-Tracing is enabled."

By the wording they pretty much blame the game developers for the problems. :cool: Guess they are not very experienced yet with the implementation of RT for PC games.

AMD themselves are working on a driver update to fix the crashing issue, so it's obvious nvidia is behind it and nvidia fanboys are having double standards. The amd crusade at it again

Well, if you check the (negative) Steam reviews, you'll notice that Nvidia users also have tons of problems with the game. :oops: From constant crashing to bad framerates, memory leaks, visual glitches, stuttering, freezing, black screens, etc. The game has quite a few early-adopter issues, even if you're in the green boat. Some people even say the game looks better on the PlayStation.


I'd like to see the impact of DirectStorage tested across a few drives from different performance segments.

ComputerBase recently did a (member-supported) benchmark test with the DirectStorage 1.2 BulkLoadDemo benchmark. :) Quite a long list of tested drives.


What I find interesting is that the Crucial T700 4TB is wiping the floor with all the other drives, coming out as the fastest single drive, while the 2TB version is just on par with its competitors. Not sure how this affects game performance, but I guess reviewers don't know yet either. Just new waters.
 
Joined
Feb 24, 2020
Messages
97 (0.06/day)
Location
3rd world sh1thole, AKA italy
System Name AMDream v2.6.2
Processor AMD Ryzen 7 5800X3D+TechN AM4, CO-30 all cores
Motherboard Asus Prime B550-Plus w. PCIe USB 3.0/3.1 case connectors card
Cooling 2x240 slim rads, 1x420mm slim rad, 4xEK Vardar Evo RGB, stock case fans
Memory 2x8GB RGB+2x8GB non-RGB Crucial Ballistix DDR4-3600 CL16
Video Card(s) AMD Radeon RX 7900XT Reference+Alphacool waterblock+Kryosheet
Storage 2TB SK Hynix Platinum P41+Eluteng NVMe heatsink
Display(s) Xiaomi Mi 2k Gaming Monitor 27” (1440p165 IPS)+VESA arm
Case Lian Li Lancool III White ARGB
Audio Device(s) Bose Acoustimass 5 Series II speakers, Ayima A04 amp
Power Supply Seasonic Focus GX-850
Mouse Glorious Model D- Wired
Keyboard Keychron K5 (white backlight, blue Gateron Low Profile switches)
Software Windows 11 Pro
Can’t see any meaningful “holy sh*t” RT on vs. off differences in the comparisons (max+RT vs. max), just some slight enhancements to shadows. Is that because I’m watching the comparisons on an iPad?

But, gawd dayumn, it looks GREAT and runs even better on RDNA3 (those minimum FPS are *chef’s kiss*). I’m waiting for some patches, but my 7900XT is ready for it.
 

Pace

New Member
Joined
Jul 27, 2023
Messages
3 (0.01/day)
NVIDIA sponsored title that has the latest DLSS and XeSS versions, but not the latest version of FSR... and RT does not work on AMD.
 
Joined
Nov 23, 2013
Messages
359 (0.09/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI B350 Tomahawk Arctic
Memory 4x8GB Corsair Vengeance LPX DDR4 3200Mhz
Video Card(s) Gigabyte 6700XT Gaming OC (2.80Ghz core / 2.15Ghz mem)
Storage Corsair MP510 NVMe 960GB; Samsung 850 Evo 250GB; Samsung 860 Evo 500GB;
Display(s) Dell S2721DGFA; Iiyama ProLite B2783QSU;
Case Cooler Master Elite 361
Power Supply Cooler Master G750M
Can’t see any meaningful “holy sh*t” RT on vs. off differences in the comparisons (max+RT vs. max), just some slight enhancements to shadows. Is that because I’m watching the comparisons on an iPad?

But, gawd dayumn, it looks GREAT and runs even better on RDNA3 (those minimum FPS are *chef’s kiss*). I’m waiting for some patches, but my 7900XT is ready for it.
You are missing some obvious reflections, I guess, for example in the 2nd shot up, right...
But I gotta say, while playing, none of this matters much. That vibe the game gives off at the highest settings - that you're playing some Pixar CGI film - remains untouched ^^
 
Joined
Mar 1, 2021
Messages
115 (0.09/day)
Processor R7 7800X3D
Motherboard MSI B650 Tomahawk
Memory 2x32GB 6000CL30
Video Card(s) RTX 3070 FE
Case Lian Li O11 Air Mini
Power Supply Corsair RM1000x
Game does have RT, only AMD hardware can't enable RT due to driver bugs
>Game has horrible stutters on Nvidia cards resulting in 4090 having worse 1% lows than 6800XT
>Developer's fault

>Game doesn't support RT for Radeon cards, even tho the original game was built for PS5 and its RDNA2 architecture
>Fault of the drivers, totally not developers
 
Joined
Nov 11, 2016
Messages
3,393 (1.16/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
>Game has horrible stutters on Nvidia cards resulting in 4090 having worse 1% lows than 6800XT
>Developer's fault

>Game doesn't support RT for Radeon cards, even tho the original game was built for PS5 and its RDNA2 architecture
>Fault of the drivers, totally not developers

Lol, who here ever said anything about low 1% lows being the devs' fault? Obviously Nvidia has to improve their driver for this game, just like AMD needs to fix RT.

But hey, Radeon owners can enable FSR just fine
 
Joined
Jan 24, 2011
Messages
287 (0.06/day)
Processor AMD Ryzen 5900X
Motherboard MSI MAG X570 Tomahawk
Cooling Dual custom loops
Memory 4x8GB G.SKILL Trident Z Neo 3200C14 B-Die
Video Card(s) AMD Radeon RX 6800XT Reference
Storage ADATA SX8200 480GB, Inland Premium 2TB, various HDDs
Display(s) MSI MAG341CQ
Case Meshify 2 XL
Audio Device(s) Schiit Fulla 3
Power Supply Super Flower Leadex Titanium SE 1000W
Mouse Glorious Model D
Keyboard Drop CTRL, lubed and filmed Halo Trues
I always thought it was odd Nvidia has their own version of this..... Guessing if true this is why.

edit Apparently RTX I/O is being used in this...
They don't have their "own version"; it's just branding of their implementation of the DirectStorage standard.
 
Last edited: