
Immortals of Aveum Benchmark Test and Performance Analysis

Joined
Jan 31, 2011
Messages
2,210 (0.44/day)
System Name Ultima
Processor AMD Ryzen 7 5800X
Motherboard MSI Mag B550M Mortar
Cooling Arctic Liquid Freezer II 240 rev4 w/ Ryzen offset mount
Memory G.Skill Ripjaws V 2x16GB DDR4-3600
Video Card(s) Palit GeForce RTX 4070 12GB Dual
Storage WD Black SN850X 2TB Gen4, Samsung 970 Evo Plus 500GB, 1TB Crucial MX500 SATA SSD
Display(s) ASUS TUF VG249Q3A 24" 1080p 165-180Hz VRR
Case DarkFlash DLM21 Mesh
Audio Device(s) Onboard Realtek ALC1200 Audio/Nvidia HD Audio
Power Supply Corsair RM650
Mouse ROG Strix Impact 3 Wireless | Wacom Intuos CTH-480
Keyboard A4Tech B314 Keyboard
Software Windows 10 Pro
Joined
Oct 1, 2006
Messages
4,930 (0.75/day)
Location
Hong Kong
Processor Core i7-12700k
Motherboard Z690 Aero G D4
Cooling Custom loop water, 3x 420 Rad
Video Card(s) RX 7900 XTX Phantom Gaming
Storage Plextor M10P 2TB
Display(s) InnoCN 27M2V
Case Thermaltake Level 20 XT
Audio Device(s) Soundblaster AE-5 Plus
Power Supply FSP Aurum PT 1200W
Software Windows 11 Pro 64-bit
Joined
Apr 17, 2021
Messages
556 (0.43/day)
System Name Jedi Survivor Gaming PC
Processor AMD Ryzen 7800X3D
Motherboard Asus TUF B650M Plus Wifi
Cooling ThermalRight CPU Cooler
Memory G.Skill 32GB DDR5-5600 CL28
Video Card(s) MSI RTX 3080 10GB
Storage 2TB Samsung 990 Pro SSD
Display(s) MSI 32" 4K OLED 240hz Monitor
Case Asus Prime AP201
Power Supply FSP 1000W Platinum PSU
Mouse Logitech G403
Keyboard Asus Mechanical Keyboard
Whoa, you don't need the launcher to play an EA game on Steam. Is this a first? Great news! I might buy it! :) I actually love EA games but hate the EA launcher, ha.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,058 (4.65/day)
Location
Kepler-186f
Yeah, definitely. Any idea if it's 60 FPS locked again?

It's not. They already announced, or at least I read somewhere, that it's 120 FPS locked this time :)

Which is good enough for me; it beats the heck out of 60
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,705 (3.70/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Whoa, you don't need the launcher to play an EA game on Steam. Is this a first? Great news! I might buy it! :) I actually love EA games but hate the EA launcher, ha.
Correct, it is a first. It also doesn't have EA's Denuvo, only the regular Steam Denuvo, which allows unlimited GPU changes
 

Emmko

New Member
Joined
Aug 24, 2023
Messages
2 (0.00/day)
Not unexpected for an Unreal Engine 5 game, there's a little bit of pop-in as you travel across the world...
I thought Nanite was supposed to take care of this issue with its native LOD scaling, or what do you mean by the pop-in?
This, I believe, is the one area where any game is still lacking to this day (it's more noticeable in first-person view).
Instantly swapping LODs on objects breaks any immersion the game might want to create. :rolleyes:
 
Meanwhile, another UE5 game with Lumen and Nanite, Fort Solis:
[attached screenshot]
 
Joined
Jan 8, 2017
Messages
9,391 (3.29/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 MHz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
I thought Nanite was supposed to take care of this issue with its native LOD scaling, or what do you mean by the pop-in?
You don't have to use Nanite just because you're using UE5.
 
Joined
Feb 20, 2019
Messages
8,192 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
I think you have to look back at the original intent of the technology. FSR 1 was released to make it simple to apply, such that any GPU, including older ones, has some sort of upscaling tech so that you don't have to sacrifice resolution. Sure, it is known to be inferior to DLSS, but I don't see Nvidia doing a GPU-agnostic solution like Intel and AMD.
IIRC FSR was born on consoles, which are underpowered for the 4K TVs they're almost always plugged into. FSR lets you render at whatever res is required to reach 30 or 60 fps, and then the UI/text can be rendered crisp and clean at 4K to give that next-gen polish that keeps people upgrading. Like FreeSync, it was already something AMD had in the dev toolkits for console manufacturers, but Nvidia's DLSS forced AMD to spin it into a consumer feature to try and reach feature parity.
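
To make that pipeline concrete, here's a toy sketch of the idea (not AMD's actual FSR code; every name in it is made up): pick a render resolution from the frame-time budget, upscale the scene to the output res, then composite the UI at native 4K.

Code:
// Toy sketch of the console-style dynamic-res + upscale + native-UI pipeline
// described above. Illustrative only; all names are invented.
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Resolution { int w, h; };

// Pick a render scale so the GPU hits the target frame time. GPU cost scales
// roughly with pixel count, i.e. with scale squared, hence the sqrt.
double chooseRenderScale(double lastGpuMs, double targetMs, double prevScale)
{
    double s = prevScale * std::sqrt(targetMs / lastGpuMs);
    return std::clamp(s, 0.25, 1.0); // never render above native res
}

int main()
{
    const Resolution native{3840, 2160}; // the 4K TV output
    const double targetMs = 16.7;        // 60 fps budget
    double scale = 1.0;

    scale = chooseRenderScale(25.0, targetMs, scale); // last frame took 25 ms

    Resolution render{int(native.w * scale), int(native.h * scale)};
    std::printf("1) render the 3D scene at %dx%d\n", render.w, render.h);
    std::printf("2) upscale to %dx%d (the FSR pass)\n", native.w, native.h);
    std::printf("3) draw UI/text crisp at native %dx%d, composited on top\n",
                native.w, native.h);
}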
Given the 3 comparison images, the game is far from ugly in the "low" preset ... (compared to max settings)
but still, the numbers are pretty low :/
Apart from some differences in shadowing, there's practically no visible difference between Low and Ultra, right down to geometry, LOD, and textures. Some shadows look marginally higher resolution at Ultra, but they're high enough resolution at Low that, looking at still screenshots, I can't see why you'd want to go higher.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,147 (1.27/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Hate to break it to you but your 3080 is about to hit 3 years old.
Technically, Nvidia officially released the GeForce RTX 3080 12GB on January 11, 2022. But of course the 10 GB version, and Ampere itself, is almost 3 years old; still, it's a high-end part that's only one generation old.

I can't believe how steep the requirements are for such little visual return. Man, 2023 game launches have been a roller-coaster, that's for sure, seeing both ends of the spectrum.

A recent video from DF showed FSR2 doing wonders for the re-release of Red Dead
Wonders compared to using no temporal solution at all, and it's also FSR2's anti-aliasing applied on top of a native-resolution image. No wonder it looked good in that comparison.
 
Joined
Sep 21, 2020
Messages
1,606 (1.07/day)
Processor 5800X3D -30 CO
Motherboard MSI B550 Tomahawk
Cooling DeepCool Assassin III
Memory 32GB G.SKILL Ripjaws V @ 3800 CL14
Video Card(s) ASRock MBA 7900XTX
Storage 1TB WD SN850X + 1TB ADATA SX8200 Pro
Display(s) Dell S2721QS 4K60
Case Cooler Master CM690 II Advanced USB 3.0
Audio Device(s) Audiotrak Prodigy Cube Black (JRC MUSES 8820D) + CAL (recabled)
Power Supply Seasonic Prime TX-750
Mouse Logitech Cordless Desktop Wave
Keyboard Logitech Cordless Desktop Wave
Software Windows 10 Pro
This has more to do with the engine than the developer, I think...
Yep, those total budget values are the literal results of a synthetic benchmark built into UE4/5:

Code:
FSynthBenchmark (V0.92):
===============
Main Processor:
        ... 0.025383 s/Run 'RayIntersect'
        ... 0.027685 s/Run 'Fractal'

CompiledTarget_x_Bits: 64
UE_BUILD_SHIPPING: 0
UE_BUILD_TEST: 0
UE_BUILD_DEBUG: 0
TotalPhysicalGBRam: 32
NumberOfCores (physical): 16
NumberOfCores (logical): 32
CPU Perf Index 0: 100.9
CPU Perf Index 1: 103.3

Graphics:
Adapter Name: 'NVIDIA GeForce GTX 670'
(On Optimus the name might be wrong, memory should be ok)
Vendor Id: 0x10de
GPU Memory: 1991/0/2049 MB
      ... 4.450 GigaPix/s, Confidence=100% 'ALUHeavyNoise'
      ... 7.549 GigaPix/s, Confidence=100% 'TexHeavy'
      ... 3.702 GigaPix/s, Confidence=100% 'DepTexHeavy'
      ... 23.595 GigaPix/s, Confidence=89% 'FillOnly'
      ... 1.070 GigaPix/s, Confidence=100% 'Bandwidth'

GPU Perf Index 0: 96.7
GPU Perf Index 1: 101.4
GPU Perf Index 2: 96.2
GPU Perf Index 3: 92.7
GPU Perf Index 4: 99.8
CPUIndex: 100.9
GPUIndex: 96.7

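For a sense of where those index values can come from, here's a minimal sketch (assumed, not Epic's actual code): divide each sub-test's measured throughput by a stored baseline for the reference GPU that defines 100, then scale. The baseline numbers below are back-solved from the log purely for illustration; the engine's real ones may differ.

Code:
// Minimal sketch of turning raw synthetic-benchmark throughput into perf
// indexes like the ones in the log above. Not Epic's actual code; the
// baseline values are invented (back-solved from the log) for illustration.
#include <cstdio>

// One sub-test: measured throughput plus the reference GPU's baseline
// throughput that defines "index = 100".
struct TestResult { const char* name; double gpixPerSec, baseline; };

int main()
{
    const TestResult tests[] = {
        {"ALUHeavyNoise",  4.450,  4.600},
        {"TexHeavy",       7.549,  7.445},
        {"DepTexHeavy",    3.702,  3.850},
        {"FillOnly",      23.595, 25.450},
        {"Bandwidth",      1.070,  1.072},
    };

    for (int i = 0; i < 5; ++i) {
        double index = 100.0 * tests[i].gpixPerSec / tests[i].baseline;
        std::printf("GPU Perf Index %d: %.1f  (%s)\n", i, index, tests[i].name);
    }
    // The engine then reduces these to the single GPUIndex it uses to pick
    // default quality settings; the exact weighting isn't shown here.
}
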
The devs and publishers may choose to devote extra time to improve the technical aspects of their game and optimize the code, but that is highly unlikely given the present state of the industry. The majority of gamers want flashy graphics with "amazing new technologies", but they quickly get bored playing the same game and move on to another. Most of them will never finish that new game they just purchased, and many will never return to it. And plenty of gamers don't even care what they're playing as long as it's new/trendy/hyped.

Publishers simply capitalize on this attitude to maximize their profits, by pushing out new titles faster. Little time is put into testing and optimizing the final product, and performance deficits are masked by "amazing new technologies" like image scaling and frame generation. Unless the paradigm changes, we can expect all major titles utilizing UE5 to perform similarly IMO.
 
Joined
Aug 12, 2010
Messages
118 (0.02/day)
Location
Brazil
Processor Ryzen 7 7800X3D
Motherboard ASRock B650M PG Riptide
Cooling Wraith Max + 2x Noctua Redux NF-P12
Memory 2x16GB ADATA XPG Lancer Blade DDR5-6000 CL30
Video Card(s) Powercolor RX 7800 XT Fighter OC
Storage ADATA Legend 970 2TB PCIe 5.0
Display(s) Dell 32" S3222DGM - 1440P 165Hz + P2422H
Case HYTE Y40
Audio Device(s) Microsoft Xbox TLL-00008
Power Supply Cooler Master MWE 750 V2
Mouse Alienware AW320M
Keyboard Alienware AW510K
Software Windows 11 Pro
Pay for the game upfront and only play it with hardware that's coming out in two generations.

Whack job.
 
Joined
Oct 6, 2021
Messages
1,605 (1.43/day)
Frostbite is a huge POS that's extremely complicated to work with. UE, on the other hand, makes it REALLY easy to produce a game world and make it work... with UE5 you can now play with a lot of things in the editor in real time, while previously you had to bake lighting just to get a preview, and on other engines most of these things aren't even possible. There is a reason why everybody is using UE
Easy doesn't necessarily mean better. It mainly means better for companies that want to increase profits at any cost.
 

W1zzard

Administrator
Staff member
Easy doesn't necessarily mean better. It mainly means better for companies that want to increase profits at any cost.
If you can't find people to hire because nobody likes your engine, and they all prefer to go somewhere else, there will be no business left for you to run
 
Joined
Oct 6, 2021
Messages
1,605 (1.43/day)
If you can't find people to hire because nobody likes your engine, and they all prefer to go somewhere else, there will be no business left for you to run
Honestly, there are more people out of a job than the other way around. If you argued that people with the talent and experience to optimize an engine at a low level, extracting every last drop of performance from the available hardware, turn out to be a very select group, then I would have to agree.

But I still wonder if the only solution is to create engines full of shortcuts and easy ways (which work, but not optimally) to do everything. Those were good times, when developers created their own engines from scratch... AAA games were heavy, but delivered graphics to match the hardware they required. :(
 

W1zzard

Administrator
Staff member
But I still wonder if the only solution is to create engines full of shortcuts and easy ways (which work, but not optimally) to do everything
Unreal Engine is open source, and everybody is free to submit patches (which happens all the time). There are 8,800 pull requests on their GitHub that have already been merged, and 1,977 open ones
 
Joined
Dec 25, 2020
Messages
6,570 (4.66/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
That's not really true. I know this is a PC forum, but if you're talking about "architecture" and outselling (in gaming), you'd need to count the consoles too

Consoles use the older RDNA 2 tech, and RDNA 2 is not receiving over-inflated budgeting or performing irrationally high in UE5 in general - it kinda just does its thing the way it should, with Ampere-like performance. Even if it were optimized for consoles only, it wouldn't apply to PC
 

Emmko

New Member
You don't have to use Nanite just because you're using UE5.
Yes, but in this case the devs are using it - @W1zzard mentions this in the intro:
Immortals of Aveum harnesses the power of Unreal Engine 5, integrating cutting-edge technologies like Lumen and Nanite to enhance its visual and gameplay experience. These advancements ensure that players are immersed in a world of unparalleled realism and detail. Additionally, the game employs DirectX 12 as its graphics API exclusively, but there is no support for ray tracing. To improve FPS rates you may enable NVIDIA DLSS, DLSS 3 Frame Generation or AMD Radeon FSR 2.

Also:
Nanite is one example: This micropolygon geometry system intelligently adjusts the level of detail of any in-game object depending on how close you are to it. So, if you have an object that’s far enough away that you couldn’t make out fine details on it even if they were there, Nanite will actually make the object physically less complex—on the fly!—so that the game doesn’t waste resources rendering it fully.

“In any other game,” says our Chief Technology Officer Mark Maratea, “you might see what looks like a big craggy wall, but it’s actually flat with a craggy texture and maybe some shader trickery. We don’t have to do that; we actually build an object with all of that detail, and Nanite determines whether that detail shows up based on your distance.”
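
To illustrate the difference Maratea is describing, here's a toy sketch of the two strategies (illustrative only, none of this is engine code): discrete LODs swap whole meshes at hard distance cutoffs, which is exactly what causes pop-in, while a Nanite-style triangle budget falls off smoothly with distance.

Code:
// Toy sketch of discrete LOD swapping vs. a continuous detail budget.
// Illustrative only; nothing here is UE5 API.
#include <cstdio>

// Classic pipeline: a handful of pre-baked meshes with hard distance
// cutoffs. Crossing a cutoff swaps the whole mesh in one frame -> pop-in.
int discreteLod(double distanceM)
{
    if (distanceM < 20.0) return 0; // full-detail mesh
    if (distanceM < 60.0) return 1; // medium mesh
    return 2;                       // low-poly mesh
}

// Nanite-style idea: the triangle budget falls off smoothly with distance
// (screen coverage shrinks with distance squared), so detail degrades
// gradually instead of jumping between baked levels.
double continuousTriangleBudget(double distanceM, double trisAtOneMeter = 1e6)
{
    double d = distanceM < 1.0 ? 1.0 : distanceM;
    return trisAtOneMeter / (d * d);
}

int main()
{
    for (double d = 10.0; d <= 80.0; d += 10.0)
        std::printf("%4.0f m : discrete LOD %d vs ~%7.0f tris (continuous)\n",
                    d, discreteLod(d), continuousTriangleBudget(d));
}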
 
Joined
Sep 10, 2018
Messages
6,833 (3.04/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Consoles use the older RDNA 2 tech, and RDNA 2 is not receiving over-inflated budgeting or performing irrationally high in UE5 in general - it kinda just does its thing the way it should, with Ampere-like performance. Even if it were optimized for consoles only, it wouldn't apply to PC

I know you are mostly talking about the silly arbitrary numbers this engine spits out when telling you what your GPU is capable of, but RDNA2 does generally outperform Ampere in this game. I would even say Ampere in general is not performing very well.
 
I know you are mostly talking about the silly arbitrary numbers this engine spits out when telling you what your GPU is capable of, but RDNA2 does generally outperform Ampere in this game. I would even say Ampere in general is not performing very well.

But unlike Ada vs. RDNA 3, there generally wasn't a performance gulf between Ampere and RDNA 2 unless RT was directly involved. At lower resolutions, RDNA 2 was actually even a little faster due to its huge amount of raster muscle plus the 128 MB cache doing its magic. The 4080 and the XTX just about trade blows as the 3090 and 6900 XT did; I would consider them equal, each with their own strengths (as was the case then). The 4090 is just straight-up better than both.
 
But unlike Ada vs. RDNA 3, there generally wasn't a performance gulf between Ampere and RDNA 2 unless RT was directly involved. At lower resolutions, RDNA 2 was actually even a little faster due to its huge amount of raster muscle plus the 128 MB cache doing its magic. The 4080 and the XTX just about trade blows as the 3090 and 6900 XT did; I would consider them equal, each with their own strengths (as was the case then). The 4090 is just straight-up better than both.

I was talking more about this game specifically, where a 6800 nearly matches a 3080 at 4K and beats it at 1440p, and a 6900 XT beats a 3090 Ti at 1440p and nearly matches it at 4K, which you don't normally see in the majority of games, especially 3080 vs 6800.
 
Joined
Oct 1, 2021
Messages
113 (0.10/day)
System Name Phenomenal1
Processor Ryzen 7 5800x3d
Motherboard MSI X570 Gaming Plus
Cooling Noctua NH-D15s with added NF-A12x25 fan on front
Memory 32 GB - 2 x 16 GB Ripjaws V CL16 @ 3600
Video Card(s) Dell RTX 3080 10GB
Storage Boot SSD: SATA 500GB - 1TB PCIe 3 NVMe / Spinning Drives: 1TB + 1TB + 1TB + 2TB + 6TB
Display(s) Gigabyte M27Q 27" 1440P 170 Hz IPS BGR monitor
Case Montech X3 Mesh - Black
Audio Device(s) Realtek ALC1220P
Power Supply 750 Watt Antec Earthwatts - 4 rail
Mouse Razer Viper
Keyboard Corsair K70 LUX RGB
Software Windows 11
Hate to break it to you but your 3080 is about to hit 3 years old.
Right, but a 3080 is about the same as a 4070, and the main advantage the 4070 has, besides being more energy efficient, is frame generation. The 3080 is a powerful card that happens to not support the newer software enhancements, which I suspect is intentional; it had all the performance I personally needed, but it's locked out by design, again my suspicion.

I just upgraded from a 3080 to a 4090 (arguably the best 40-series value besides maybe the 4070) because I saw the writing on the wall: if you don't have upscaling along with frame generation (fake frames), a combo I call "fakescaling", you're not going to have a good time going forward. Games are being made in UE5 without the optimization you find in Fortnite, which would have made the 3080 run great in Immortals of Upscaling, I mean Aveum.

So the only way the 3080 being 3 years old is a problem is that game designers, or publishers, or whoever is to blame, are leaning heavily on the latest Nvidia software enhancements so they can do less optimization and get the game out the door as fast as possible for maximum profit. So you're not wrong, but saying the problem is the 3080 being 3 years old isn't entirely right either. The problem is the 3080 being locked out by Nvidia, combined with the greed of whoever's responsible for Immortals of Upscaling, I mean Aveum, being unoptimized.
 