
Immortals of Aveum Benchmark Test and Performance Analysis

Joined
Dec 19, 2008
Messages
290 (0.05/day)
Location
WA, USA
System Name Desktop
Processor AMD Ryzen 5950X
Motherboard ASUS Strix B450-I
Cooling be quiet! Dark Rock TF 2
Memory 32GB DDR4 3600
Video Card(s) AMD RX 6800
Storage 480GB MyDigitalSSD NVME
Display(s) AOC CU34G2X
Power Supply 850w
Mouse Razer Basilisk V3
Keyboard Steelseries Apex 5
OK, but what about 1080p 60 FPS on an RTX 3080? Does hardware performance really drop that much in three years? If it does, that's a disgrace, and it's on the developer.
lol 59.1 FPS not close enough for you? Dropping one setting, pretty much any setting, would do it for you. As would overclocking.
 
Joined
Sep 10, 2018
Messages
6,049 (2.82/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I've been playing with the integrated Performance Budget Tool, which is found in the graphics menu. The game assesses your graphics card's and processor's performance on first launch, presents the benchmark results as your total GPU and CPU "budgets", and sets the detail options accordingly. Every graphics setting is assigned two values reflecting its impact on GPU and CPU performance. Here's a detailed breakdown of the costs from their minimum to maximum value:

Setting                              Min    GPU cost  CPU cost    Max    GPU cost  CPU cost
Texture quality                      low    30        10          ultra  50        10
Visual effects quality               low    10        10          ultra  30        10
Shadow quality                       low    30        10          ultra  50        10
Post processing quality              low    20        10          ultra  40        30
Volumetric fog resolution            low    20        10          ultra  60        30
Global illumination quality          low    30        20          ultra  90        20
Reflection quality                   low    40        0           high   60        0
Anisotropic filtering                off    0         0           16x    100       100
Ambient occlusion quality            low    10        10          ultra  50        30
Atmosphere quality                   low    10        10          ultra  50        10
Cinematics depth of field quality    low    20        10          ultra  60        30
Foliage quality                      low    10        10          ultra  10        10
Mesh quality                         low    30        10          ultra  90        10
Cinematics motion blur quality       low    30        10          ultra  50        10
Particle quality                     low    40        10          ultra  60        10
Shadow mesh quality                  low    10        20          ultra  70        20
Shadow resolution quality            low    20        20          ultra  20        40
Subsurface scattering quality        low    20        10          ultra  80        10
Total cost                                  380       190                1020      390

There are also two toggles -- Light shafts and Local exposure -- without assigned numerical costs.
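
It's not documented how the tool turns these costs into actual settings, but the obvious reading is a simple budget check: start everything at low and keep raising settings while both the GPU and CPU totals still fit the budgets measured at first launch. A minimal Python sketch of that idea -- the intermediate level steps, the greedy order, and all names here are my own assumptions, not the game's code:

Code:
# Toy sketch of a budget-driven auto-configurator (assumed mechanism, not the game's code).
# Start at "low" everywhere, then greedily upgrade settings while both the GPU and CPU
# budgets from the first-launch benchmark can still afford the step up.

# setting -> [(level, gpu_cost, cpu_cost), ...]; min/max costs taken from the table above,
# the intermediate steps are invented for illustration.
SETTINGS = {
    "Texture quality":             [("low", 30, 10), ("medium", 40, 10), ("ultra", 50, 10)],
    "Global illumination quality": [("low", 30, 20), ("medium", 60, 20), ("ultra", 90, 20)],
    "Anisotropic filtering":       [("off", 0, 0), ("8x", 50, 50), ("16x", 100, 100)],
    "Shadow resolution quality":   [("low", 20, 20), ("medium", 20, 30), ("ultra", 20, 40)],
}

def auto_configure(gpu_budget: int, cpu_budget: int) -> dict:
    chosen = {name: 0 for name in SETTINGS}                       # current level index per setting
    gpu_used = sum(levels[0][1] for levels in SETTINGS.values())  # cost of the all-low baseline
    cpu_used = sum(levels[0][2] for levels in SETTINGS.values())
    upgraded = True
    while upgraded:
        upgraded = False
        for name, levels in SETTINGS.items():
            i = chosen[name]
            if i + 1 >= len(levels):
                continue                                          # already at the maximum level
            d_gpu = levels[i + 1][1] - levels[i][1]               # incremental cost of one step up
            d_cpu = levels[i + 1][2] - levels[i][2]
            if gpu_used + d_gpu <= gpu_budget and cpu_used + d_cpu <= cpu_budget:
                chosen[name], gpu_used, cpu_used = i + 1, gpu_used + d_gpu, cpu_used + d_cpu
                upgraded = True
    return {name: SETTINGS[name][i][0] for name, i in chosen.items()}

if __name__ == "__main__":
    # Example budgets from this post: RTX 3080 -> 700 GPU, 5800X3D -> 230 CPU
    print(auto_configure(gpu_budget=700, cpu_budget=230))

This is only a guess at the mechanism; the real tool may weight or order the settings differently.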

While the whole idea sounds very practical in theory, the values the developers chose for each cost make one wonder. Global illumination and mesh quality certainly burden the GPU, but why would anisotropic filtering be the most demanding setting of all? The tool also seems to miscalculate the total CPU budget. In the developers' own words:

View attachment 310398

My 5800X3D was evaluated at 230, whereas the 13900K gets 290 -- yet maxing every setting adds up to a CPU cost of 390, more than even the fastest CPU's budget. The Performance Budget Tool could become really helpful eventually, but it needs further polishing. The game would also benefit greatly from a textual description of each setting, with the corresponding visual benefit and performance tradeoff. That could help less experienced gamers make more informed quality-vs-performance choices.

And just for reference, those are the total budgets assigned by the tool to various GPUs:

7900 XTX     1750
4090         1450
4080         1100
3080 Ti      800
4070         700
3080         700
3060 Ti      500
2060         300
1650         100
Steam Deck   60

I was watching a video about that earlier; it seems whatever system they came up with to generate these numbers is full of shite.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,276 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
They should have stuck with Frostbite engine. At least some devs there will know how to use it
Frostbite is a huge POS that's extremely complicated to work with. UE on the other hand makes it REALLY easy to produce a game world and make it work .. with UE5 you can now play with a lot of things in the editor in real-time, while previously you had to bake lighting just to get a preview, and on other engines most of these things aren't even possible. there is a reason why everybody is using UE
 
Joined
Jun 11, 2019
Messages
536 (0.29/day)
Location
Moscow, Russia
Processor Intel 12600K
Motherboard Gigabyte Z690 Gaming X
Cooling CPU: Noctua NH-D15S; Case: 2xNoctua NF-A14, 1xNF-S12A.
Memory Ballistix Sport LT DDR4 @3600CL16 2*16GB
Video Card(s) Palit RTX 4080
Storage Samsung 970 Pro 512GB + Crucial MX500 500gb + WD Red 6TB
Display(s) Dell S2721qs
Case Phanteks P300A Mesh
Audio Device(s) Behringer UMC204HD
Power Supply Fractal Design Ion+ 560W
Mouse Glorious Model D-
lol 59.1 FPS not close enough for you? Dropping one setting, pretty much any setting, would do it for you. As would overclocking.
IMHO this is kinda missing the point - the game just doesn't look good, lol. There are games that both look better on a technical level and run better too, so surely this is a fail. Who's to blame is another topic altogether, but the fact is the performance here is abysmal.
 
Joined
Sep 10, 2018
Messages
6,049 (2.82/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Frostbite is a huge POS that's extremely complicated to work with. UE on the other hand makes it REALLY easy to produce a game world and make it work .. with UE5 you can now play with a lot of things in the editor in real-time, while previously you had to bake lighting just to get a preview, and on other engines most of these things aren't even possible. there is a reason why everybody is using UE

I wouldn't be surprised if, in the next couple of years, at least 30-40% of major releases use this engine. A lot of major studios that used engines I liked have made the switch, which I think is unfortunate. Hopefully I'm just being pessimistic and it all works out, but the engine hasn't gotten off to a good start for consumers, regardless of how much easier it makes things for developers.

Remnant 2 did seem to sell well regardless of its performance, though.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,276 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
I wouldn't be surprised if, in the next couple of years, at least 30-40% of major releases use this engine. A lot of major studios that used engines I liked have made the switch, which I think is unfortunate. Hopefully I'm just being pessimistic and it all works out, but the engine hasn't gotten off to a good start for consumers, regardless of how much easier it makes things for developers.
Agree 100%
 
Joined
Feb 11, 2009
Messages
5,453 (0.97/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb NVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Frostbite is a huge POS that's extremely complicated to work with. UE on the other hand makes it REALLY easy to produce a game world and make it work .. with UE5 you can now play with a lot of things in the editor in real-time, while previously you had to bake lighting just to get a preview, and on other engines most of these things aren't even possible. there is a reason why everybody is using UE

Yeah, well, there's a reason everyone here drives a Volkswagen as well, and that reason is that... everyone is driving a Volkswagen, so there are lots of cheap parts available and expertise is easy to find everywhere; good luck fixing a Lexus at your local dealership.
That doesn't mean a Volkswagen is a better car than a Lexus, though...
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,276 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Yeah, well, there's a reason everyone here drives a Volkswagen as well, and that reason is that... everyone is driving a Volkswagen, so there are lots of cheap parts available and expertise is easy to find everywhere; good luck fixing a Lexus at your local dealership.
You're making a good point, too: there's A LOT of talent out there that knows how to work with UE.

The underlying reason is that it's actually nice to work with UE. Like 2 years ago I wanted to write my own benchmark for in-house testing with customizable workloads .. with very little experience .. UE was by far the best learning experience, and it's free, and they give you all the source code
 

FreedomEclipse

~Technological Technocrat~
Joined
Apr 20, 2007
Messages
23,610 (3.74/day)
Location
London,UK
System Name Codename: Icarus Mk.VI
Processor Intel 8600k@Stock -- pending tuning
Motherboard Asus ROG Strixx Z370-F
Cooling CPU: BeQuiet! Dark Rock Pro 4 {1xCorsair ML120 Pro|5xML140 Pro}
Memory 32GB XPG Gammix D10 {2x16GB}
Video Card(s) ASUS Dual Radeon™ RX 6700 XT OC Edition
Storage Samsung 970 Evo 512GB SSD (Boot)|WD SN770 (Gaming)|2x 3TB Toshiba DT01ACA300|2x 2TB Crucial BX500
Display(s) LG GP850-B
Case Corsair 760T (White)
Audio Device(s) Yamaha RX-V573|Speakers: JBL Control One|Auna 300-CN|Wharfedale Diamond SW150
Power Supply Corsair AX760
Mouse Logitech G900
Keyboard Duckyshine Dead LED(s) III
Software Windows 10 Pro
Benchmark Scores (ノಠ益ಠ)ノ彡┻━┻
Frostbite is a huge POS that's extremely complicated to work with. UE on the other hand makes it REALLY easy to produce a game world and make it work .. with UE5 you can now play with a lot of things in the editor in real-time, while previously you had to bake lighting just to get a preview, and on other engines most of these things aren't even possible. there is a reason why everybody is using UE

In that case, why is the game so badly optimised? Unless it's just teething issues with UE5 being more or less new? But yeah, I know Frostbite is complicated and hard to work with because nobody really knows how to use it properly, but EA was trying to get all its studios on board with it at one point.

Hindsight being 20/20, they should have tried harder to keep the original team that built the engine around for longer, so they could maybe train more people, but I guess the staff turnover has been pretty catastrophic the last few years.
 
Joined
Feb 1, 2013
Messages
1,253 (0.30/day)
System Name Gentoo64 /w Cold Coffee
Processor 9900K 5.2GHz @1.312v
Motherboard MXI APEX
Cooling Raystorm Pro + 1260mm Super Nova
Memory 2x16GB TridentZ 4000-14-14-28-2T @1.6v
Video Card(s) RTX 4090 LiquidX Barrow 3015MHz @1.1v
Storage 660P 1TB, 860 QVO 2TB
Display(s) LG C1 + Predator XB1 QHD
Case Open Benchtable V2
Audio Device(s) SB X-Fi
Power Supply MSI A1000G
Mouse G502
Keyboard G815
Software Gentoo/Windows 10
Benchmark Scores Always only ever very fast
In that case, why is the game so badly optimised? Unless it's just teething issues with UE5 being more or less new? But yeah, I know Frostbite is complicated and hard to work with because nobody really knows how to use it properly, but EA was trying to get all its studios on board with it at one point.

Hindsight being 20/20, they should have tried harder to keep the original team that built the engine around for longer, so they could maybe train more people, but I guess the staff turnover has been pretty catastrophic the last few years.
In general, heavily modularized software is harder to optimize -- there are many moving parts, and keeping maximum compatibility/flexibility eats into the potential. UE being design-heavy (think no-code, plugins, extensions) rather than code-heavy also eats into the performance budget.

Also, Nanite replaces the old static LOD meshes, which could be hand-tuned for the best loading behavior and GPU memory residency. Nanite puts additional load on the CPU to dynamically "generate" those LODs. Lumen is basically software global illumination, which is really heavy -- the kind of effect that used to require a lot of tricks to run fast.
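
For anyone unfamiliar with the distinction: with traditional static LODs, the per-object runtime work is basically one distance check against a hand-authored mesh chain, and that cheap step is what Nanite's dynamic, per-frame cluster selection replaces. A rough, purely illustrative Python sketch of the old-school approach (the mesh names and numbers are made up):

Code:
# Classic static-LOD selection (illustrative only, not engine code). An artist authors a few
# pre-decimated meshes per asset; the engine just picks one by camera distance each frame,
# which costs the CPU almost nothing -- unlike Nanite's dynamic cluster streaming/selection.

from dataclasses import dataclass

@dataclass
class LodLevel:
    triangle_count: int   # detail of the pre-built mesh
    max_distance: float   # use this LOD while the object is closer than this (world units)

# Hypothetical hand-tuned LOD chain for one asset.
LOD_CHAIN = [
    LodLevel(triangle_count=120_000, max_distance=10.0),
    LodLevel(triangle_count=30_000,  max_distance=40.0),
    LodLevel(triangle_count=5_000,   max_distance=150.0),
    LodLevel(triangle_count=800,     max_distance=float("inf")),
]

def pick_lod(distance: float) -> LodLevel:
    """Static LOD selection: a handful of comparisons per object per frame."""
    for lod in LOD_CHAIN:
        if distance < lod.max_distance:
            return lod
    return LOD_CHAIN[-1]

if __name__ == "__main__":
    for d in (5.0, 25.0, 100.0, 500.0):
        print(f"distance {d:>5}: {pick_lod(d).triangle_count} triangles")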
 
Joined
Dec 25, 2020
Messages
5,425 (4.15/day)
Location
São Paulo, Brazil
System Name Cocogoat
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.425V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Galax Stealth STL-03
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
I've been playing with the integrated Performance Budget Tool, which is found in the graphics menu. The game assesses your graphics card's and processor's performance on first launch, presents the benchmark results as your total GPU and CPU "budgets", and (supposedly) sets the optimal detail level. Every graphics setting is assigned two values reflecting its impact on GPU and CPU performance. Here's a detailed breakdown of the costs from their minimum to maximum value:

Setting                              Min    GPU cost  CPU cost    Max    GPU cost  CPU cost
Texture quality                      low    30        10          ultra  50        10
Visual effects quality               low    10        10          ultra  30        10
Shadow quality                       low    30        10          ultra  50        10
Post processing quality              low    20        10          ultra  40        30
Volumetric fog resolution            low    20        10          ultra  60        30
Global illumination quality          low    30        20          ultra  90        20
Reflection quality                   low    40        0           high   60        0
Anisotropic filtering                off    0         0           16x    100       100
Ambient occlusion quality            low    10        10          ultra  50        30
Atmosphere quality                   low    10        10          ultra  50        10
Cinematics depth of field quality    low    20        10          ultra  60        30
Foliage quality                      low    10        10          ultra  10        10
Mesh quality                         low    30        10          ultra  90        10
Cinematics motion blur quality       low    30        10          ultra  50        10
Particle quality                     low    40        10          ultra  60        10
Shadow mesh quality                  low    10        20          ultra  70        20
Shadow resolution quality            low    20        20          ultra  20        40
Subsurface scattering quality        low    20        10          ultra  80        10
Total cost                                  380       190                1020      390

There are also two toggles -- Light shafts and Local exposure -- without assigned numerical costs.

While the whole idea sounds very practical, the values the developers chose for each cost make one wonder. Global illumination and mesh quality certainly burden the GPU, but why would anisotropic filtering be the most demanding setting of all? The tool also seems to miscalculate the total CPU budget. According to the developers (the resolutions below assume upscaling in quality mode):

View attachment 310398

My 5800X3D was evaluated at 230, whereas the 13900K gets 290 -- both of which seem way off. The Performance Budget Tool could become really helpful eventually, but it needs further polishing. Also, the game would benefit greatly from textual descriptions of each setting, with the corresponding visual benefit/tradeoff. That could help less experienced gamers make more informed choices in quality vs. performance.

And just for reference, those are the total budgets assigned by the tool to various GPUs:

7900 XTX     1750
4090         1450
4080         1100
3080 Ti      800
4070         700
3080         700
3060 Ti      500
2060         300
1650         100
Steam Deck   60

Haha wow, I can't even. This budgeting tool is hilarious. The 7900 XTX outweighing the 4090 by that much just proves it.

They can keep their game and the engine, and maybe partner with AMD to shift some extra copies to Red Team fans who'll benchmark it and claim their GPU wins at something in the wccftech comment section. That's about all it's good for, unless this game gets major patching to the point where every Nvidia GPU, top to bottom, at least doubles its budget score.

In general, heavily modularized software is harder to optimize -- there are many moving parts, and keeping maximum compatibility/flexibility eats into the potential. UE being design-heavy (think no-code, plugins, extensions) rather than code-heavy also eats into the performance budget.

Also, Nanite replaces the old static LOD meshes, which could be hand-tuned for the best loading behavior and GPU memory residency. Nanite puts additional load on the CPU to dynamically "generate" those LODs. Lumen is basically software global illumination, which is really heavy -- the kind of effect that used to require a lot of tricks to run fast.

I can understand the tech behind it, but the implementation is subpar and hilariously biased towards one type of hardware architecture. That's no way to ship a game.

There's just no way a game that struggles like this on a rig of W1zz's caliber gets a pass, no matter the reason.

I'm really interested in seeing whether the devs will chime in and explain the situation, or if this is just Ashes of the Benchmark 2.0.
 
Joined
Sep 10, 2018
Messages
6,049 (2.82/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Haha wow, I can't even. This budgeting tool is hilarious. The 7900 XTX outweighing the 4090 by that much just proves it.

They can keep their game and the engine, and maybe partner with AMD to shift some extra copies to Red Team fans who'll benchmark it and claim their GPU wins at something in the wccftech comment section. That's about all it's good for, unless this game gets major patching to the point where every Nvidia GPU, top to bottom, at least doubles its budget score.

This has more to do with the engine than the developer, I think... In an interview they said some crazy stuff about using AI to make it more accurate, but I could be wrong lol...
 
Joined
Dec 25, 2020
Messages
5,425 (4.15/day)
Location
São Paulo, Brazil
System Name Cocogoat
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.425V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Galax Stealth STL-03
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
This has more to do with the engine than the developer, I think... In an interview they said some crazy stuff about using AI to make it more accurate, but I could be wrong lol...

Sure, AMD has WMMA (AI) instructions on RDNA 3, but Nvidia has had an entire dedicated AI engine available to developers since Turing, going back five years now.
 
Joined
May 19, 2009
Messages
1,838 (0.33/day)
Location
Latvia
System Name Personal \\ Work - HP EliteBook 840 G6
Processor 7700X \\ i7-8565U
Motherboard Asrock X670E PG Lightning
Cooling Noctua DH-15
Memory G.SKILL Trident Z5 RGB Black 32GB 6000MHz CL36 \\ 16GB DDR4-2400
Video Card(s) ASUS RoG Strix 1070 Ti \\ Intel UHD Graphics 620
Storage 2x KC3000 2TB, Samsung 970 EVO 512GB \\ OEM 256GB NVMe SSD
Display(s) BenQ XL2411Z \\ FullHD + 2x HP Z24i external screens via docking station
Case Fractal Design Define Arc Midi R2 with window
Audio Device(s) Realtek ALC1150 with Logitech Z533
Power Supply Corsair AX860i
Mouse Logitech G502
Keyboard Corsair K55 RGB PRO
Software Windows 11 \\ Windows 10
Yeah, this isn't exactly UE5, it's "modern" game development yet again. I'm not going to buy this even after I upgrade my GPU.
I know screenshots are never a true representation of a game, but come on, where is the WOW factor in one of the first "groundbreakers"? It really does look dated, as W1zzard said, yet eats resources like there's no tomorrow.
 
Joined
Dec 25, 2020
Messages
5,425 (4.15/day)
Location
São Paulo, Brazil
System Name Cocogoat
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.425V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Galax Stealth STL-03
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Yeah, this isn't exactly UE5, it's "modern" game development yet again. I'm not going to buy this even after I upgrade my GPU.
I know screenshots are never a true representation of a game, but come on, where is the WOW factor in one of the first "groundbreakers"? It really does look dated, as W1zzard said, yet eats resources like there's no tomorrow.

Not to mention, it looks like the dev team only had time to develop and test against one architecture, and they picked the one that's outsold 8 to 1 by its competition.

The simple fact that this is attributing a much higher performance grade to the 7900 XTX tells you all you need to know.

I get UE5 being AMD-friendly -- Remnant II seemed a little AMD-biased, but nothing too outrageous -- but here? This makes no sense whatsoever.
 
Joined
Mar 28, 2020
Messages
1,689 (1.07/day)
I've hated it in every game I've tried it in. It almost ruins FF16, though it's so bad there that it's got to be FSR 1.

If you like it, good for you though.
I think you have to look back at the original intent of the technology. FSR 1 was released to be simple to apply, so that any GPU, including older ones, has some sort of upscaling tech and you don't have to sacrifice resolution. Sure, it's known to be inferior to DLSS, but I don't see Nvidia doing a GPU-agnostic solution like Intel and AMD.
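
That is also why FSR 1 could be vendor-agnostic in the first place: a purely spatial upscaler only needs the finished low-resolution frame, so it's just one more post-process pass with no special hardware requirements. A toy Python sketch of that general shape -- plain bilinear resize plus a crude sharpen, NOT AMD's actual EASU/RCAS algorithm:

Code:
# Illustrative only: the general shape of a spatial upscale pass. FSR 1's real EASU/RCAS
# filters are far more sophisticated, but the point stands -- it's image-in, image-out,
# so any GPU (or even a CPU, as here) can run it.

import numpy as np

def bilinear_upscale(img: np.ndarray, scale: float) -> np.ndarray:
    """Resize an HxWxC float image with a simple bilinear filter."""
    h, w, _ = img.shape
    nh, nw = int(h * scale), int(w * scale)
    ys = np.linspace(0, h - 1, nh)
    xs = np.linspace(0, w - 1, nw)
    y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
    y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None, None]
    wx = (xs - x0)[None, :, None]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

def sharpen(img: np.ndarray, amount: float = 0.3) -> np.ndarray:
    """Crude unsharp mask to restore some edge contrast after upscaling."""
    blur = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
            np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 4
    return np.clip(img + amount * (img - blur), 0.0, 1.0)

if __name__ == "__main__":
    frame = np.random.rand(720, 1280, 3).astype(np.float32)   # stand-in for a 720p render
    output = sharpen(bilinear_upscale(frame, 1.5))             # "1080p" output plus sharpening
    print(output.shape)                                        # (1080, 1920, 3)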
 
Joined
Dec 25, 2020
Messages
5,425 (4.15/day)
Location
São Paulo, Brazil
System Name Cocogoat
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.425V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Galax Stealth STL-03
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
I think you have to look back at the original intent of the technology. FSR 1 was released to be simple to apply, so that any GPU, including older ones, has some sort of upscaling tech and you don't have to sacrifice resolution. Sure, it's known to be inferior to DLSS, but I don't see Nvidia doing a GPU-agnostic solution like Intel and AMD.

The question that needs to be asked is whether AMD would make FSR completely hardware-agnostic if the following conditions were met:

1. They had a market leader position
2. Their hardware had matrix multiplication capabilities
3. By leveraging their own hardware's feature set, they could achieve a better result

Intel opted to make XeSS usable on more than just Arc by enabling an alternative, less efficient code path. Nvidia could probably do the same if it really wanted to, except that the three conditions above are currently met by their product line, while Intel has probably been selling Arc at cost, or even subsidized, to get through this debut generation.
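
For illustration, the XeSS-style approach boils down to shipping two implementations -- XMX matrix hardware on Arc, a slower DP4a path elsewhere -- and picking one at runtime from the GPU's reported capabilities. A hypothetical Python sketch of that dispatch pattern (the capability flags and function names are invented, not Intel's actual API):

Code:
# Hypothetical sketch of the dual-code-path pattern described above: prefer the fast
# vendor-specific path, fall back to a slower but widely supported one. The flags and
# functions are placeholders, not a real upscaler SDK.

from dataclasses import dataclass
from typing import Callable

@dataclass
class GpuCaps:
    name: str
    has_matrix_units: bool   # e.g. XMX on Arc (placeholder capability flag)
    has_dp4a: bool           # common integer dot-product support on recent GPUs

def upscale_matrix_path(frame: str) -> str:
    return f"{frame} upscaled via matrix-unit path"    # stand-in for the native fast path

def upscale_dp4a_path(frame: str) -> str:
    return f"{frame} upscaled via dp4a fallback path"  # stand-in for the generic slow path

def select_upscaler(caps: GpuCaps) -> Callable[[str], str]:
    """Pick the best available code path for this GPU."""
    if caps.has_matrix_units:
        return upscale_matrix_path
    if caps.has_dp4a:
        return upscale_dp4a_path
    raise RuntimeError(f"{caps.name}: no supported upscaling code path")

if __name__ == "__main__":
    arc   = GpuCaps("Arc A770",  has_matrix_units=True,  has_dp4a=True)
    other = GpuCaps("Other GPU", has_matrix_units=False, has_dp4a=True)
    print(select_upscaler(arc)("frame_0"))
    print(select_upscaler(other)("frame_0"))

The trade-off the post describes is exactly this: the fallback keeps the feature hardware-agnostic, but it gives up the efficiency of the dedicated units.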
 
Joined
Jan 31, 2011
Messages
2,205 (0.45/day)
System Name Ultima
Processor AMD Ryzen 7 5800X
Motherboard MSI Mag B550M Mortar
Cooling Arctic Liquid Freezer II 240 rev4 w/ Ryzen offset mount
Memory G.SKill Ripjaws V 2x16GB DDR4 3600
Video Card(s) Palit GeForce RTX 4070 12GB Dual
Storage WD Black SN850X 2TB Gen4, Samsung 970 Evo Plus 500GB , 1TB Crucial MX500 SSD sata,
Display(s) ASUS TUF VG249Q3A 24" 1080p 165-180Hz VRR
Case DarkFlash DLM21 Mesh
Audio Device(s) Onboard Realtek ALC1200 Audio/Nvidia HD Audio
Power Supply Corsair RM650
Mouse Rog Strix Impact 3 Wireless | Wacom Intuos CTH-480
Keyboard A4Tech B314 Keyboard
Software Windows 10 Pro
From what I see so far, all UE5 games seem to have this grainy feel to the textures, and not just because of the film grain. I thought it was Lumen at first, with the denoising, but even Remnant 2 seems to have it, and it doesn't use Lumen.
 
Joined
Sep 10, 2018
Messages
6,049 (2.82/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
From what I see so far, all UE5 games seem to have this grainy feel to the textures, and not just because of the film grain. I thought it was Lumen at first, with the denoising, but even Remnant 2 seems to have it, and it doesn't use Lumen.

Yeah, something just looks off to me as well; maybe it's the way textures are compressed in this engine. The one thing Epic doesn't seem to have solved is texture quality in general. It's pretty bad in some spots in both this and Remnant, though I'm not sure how much of that is the engine or the developer.
 
Joined
Nov 29, 2022
Messages
735 (1.22/day)
Processor Intel i7 7700K
Motherboard Gigabyte Aorus something
Cooling Noctua NH-U12S dual fan
Memory Ballistix 32 Go
Video Card(s) MSI 3060 Gaming X
Storage Mixed bag of M2 SSD and SATA SSD
Display(s) MSI 34" 3440x1440 Artimys 343CQR
Case Old Corsair Obsidian something
Audio Device(s) Integrated
Power Supply Old Antec HCG 620 still running good
Mouse Steelseries something
Keyboard Steelseries someting too
Benchmark Scores bench ? no time to lose with bench ! :)
Given the 3 comparison images, the game is far from ugly on the "low" preset... (compared to max settings),
but still, the numbers are pretty low :/
 
Joined
Oct 1, 2006
Messages
4,913 (0.76/day)
Location
Hong Kong
Processor Core i7-12700k
Motherboard Z690 Aero G D4
Cooling Custom loop water, 3x 420 Rad
Video Card(s) RX 7900 XTX Phantom Gaming
Storage Plextor M10P 2TB
Display(s) InnoCN 27M2V
Case Thermaltake Level 20 XT
Audio Device(s) Soundblaster AE-5 Plus
Power Supply FSP Aurum PT 1200W
Software Windows 11 Pro 64-bit
Yeah, this isn't exactly UE5, it's "modern" game development yet again. I'm not going to buy this even after I upgrade my GPU.
I know screenshots are never a true representation of a game, but come on, where is the WOW factor in one of the first "groundbreakers"? It really does look dated, as W1zzard said, yet eats resources like there's no tomorrow.
Yup the "fault" of UE5 is that it is so easy to use.
This even less competent devs like this can make games that look good on trailers and screenshots.
Once they got the money most people don't bother refunding their games even if it runs like garbage.
At most they complain on social media and forgets about it after a while.
So in the end the incentive is to skimp out on optimization which is time consuming and expensive in man-hours.
 

xploder270

New Member
Joined
Aug 24, 2023
Messages
2 (0.01/day)
Great to see RDNA doing well in this game. I would've thought the VRAM requirements would be much worse after seeing the recommended specs.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
16,671 (4.67/day)
Location
Kepler-186f
The Elden Ring maker releases Armored Core VI today. @W1zzard I hope we get a performance review for this one; it scored really well in the PC Gamer review yesterday. :rockout:
 
Joined
Aug 10, 2021
Messages
166 (0.15/day)
System Name Main
Processor 5900X
Motherboard Asrock 570X Taichi
Memory 32GB
Video Card(s) 6800XT
Display(s) Odyssey C49G95T - 5120 x 1440
Not to mention, it looks like the dev team only had time to develop and test against one architecture, and they picked the one that's outsold 8 to 1 by its competition.
That's not really true. I know this is a PC forum, but if you're talking about "architecture" and outselling (in gaming), you'd need to count the consoles too.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,276 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
The Elden Ring maker releases Armored Core VI today. @W1zzard I hope we get a performance review for this one; it scored really well in the PC Gamer review yesterday. :rockout:
Yeah, definitely. Any idea if it's 60 FPS locked again?
 