
Star Wars Jedi: Survivor Benchmark Test & Performance Analysis

Joined
Jun 14, 2020
Messages
3,540 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Cyberpunk is way better with E-cores off, certainly on Win10. Spider-Man Remastered and Hitman 3 have engines specially built to use them, so they are a little better with them on.

The extreme examples, though, are a lot worse with E-cores on than off, whereas the games that benefit see only minor gains from E-cores. Star Citizen and Wo Long: Fallen Dynasty come to mind as games that are horrible with E-cores on.

The hybrid architecture is just too new a technology, and so different from the traditional SMP way of doing things the x86 ecosystem has used this century since Windows 2000, that there are bound to be games and apps that are disastrous with E-cores on.

The last 10 years of tech, with the modern Win7-to-present architecture, produced practically no compatibility issues until E-cores were introduced - once again, it is that unusual and drastic a shift in the way of doing things.

It's like the change from DOS to 32-bit Windows NT - except that transition happened much faster.

But for the last 10 years we just needed more of the same type of cores, plus faster clocks, IPC, RAM, storage and GPUs.

We did not need such a drastic change in the way things are scheduled, reintroducing the pain of transitioning from DOS to a 32-bit Windows NT based OS - especially when the 32-bit and 64-bit SMP era of the x86 ecosystem lasted far longer than the consumer DOS era did.
From my rigorous Cyberpunk testing, E-cores on is the way to go, but I've only tested on Win11. No idea what happens on Windows 10.

If a game works better with E-cores off, it's a game issue that needs to be fixed; it means the game bypasses the built-in hardware scheduler that both ADL and RPL have. E-cores should only be used in gaming when you run out of P-core threads.
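For anyone who wants to experiment per game instead of toggling E-cores in the BIOS, here is a minimal sketch of the affinity approach using Python's psutil. The core numbering is an assumption (on a 12900K, Windows typically enumerates the 16 hyperthreaded P-core threads as logical CPUs 0-15 and the 8 E-cores as 16-23 - verify your own layout in Task Manager first), and the process name is just an example.

Code:
import psutil

# Assumption: logical CPUs 0-15 are the P-core threads on this 12900K layout
P_CORE_THREADS = list(range(16))

def pin_to_p_cores(process_name):
    # find every running process with that image name and shrink its affinity mask
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == process_name:
            proc.cpu_affinity(P_CORE_THREADS)
            print("pinned PID", proc.pid, "to P-cores only")

pin_to_p_cores("Cyberpunk2077.exe")  # example game executable name

Same idea as ticking CPU boxes in Task Manager's affinity dialog, just repeatable.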
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,246 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Arctic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Something I find particularly interesting is that when Nvidia sponsors games, and there is feature disparity, it's an atrocity, something to boycott and add to the bank of reasons why they're scum. When AMD sponsors games and fleeces half of all possible buyers of said game, it's fine! AMD to the rescue hey? Everyone gets to use their crap options.

Frankly, I'm disgusted by both, or either depending which way you slice it. I won't forgive one, or the other, and I can't wait to hear the mental gymnastics around why this is different, and AMD is the savior and what they did is in no way comparable.

I doubt I'll hear anything new, but let's see, right? I love hearing why one company you hate is disgusting and why another you like is totally A-OK :) Gymnastics is my favorite spectator sport.

I've seen a few people around who are properly on to it; they hate it when either side does it, and don't give away free passes.
 
Joined
Dec 12, 2012
Messages
777 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
^
I am always advocating for options. All big games should support everything.

But I do believe that keeping a proprietary technology out of a game is worse than creating a proprietary technology and locking it down in the first place.

DLSS is integrated into the Unreal Engine SDK. It's not that the developers didn't implement it; they literally had to remove it from their project. And in my view, that is scummy behavior.
 

64K

Joined
Mar 13, 2014
Messages
6,773 (1.72/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
Something I find particularly interesting is that when Nvidia sponsors games, and there is feature disparity, it's an atrocity, something to boycott and add to the bank of reasons why they're scum. When AMD sponsors games and fleeces half of all possible buyers of said game, it's fine! AMD to the rescue hey? Everyone gets to use their crap options.

Frankly, I'm disgusted by both, or either depending which way you slice it. I won't forgive one, or the other, and I can't wait to hear the mental gymnastics around why this is different, and AMD is the savior and what they did is in no way comparable.

I doubt I'll hear anything new, but let's see, right? I love hearing why one company you hate is disgusting and why another you like is totally A-OK :) Gymnastics is my favorite spectator sport.

I've seen a few people around who are properly on to it; they hate it when either side does it, and don't give away free passes.

imo it's because AMD is the underdog and most people like to root for the underdog and are more apt to play it down when the underdog screws up. I don't care either way. I have spoken out about practices by Intel, Nvidia and AMD when they are doing something deceitful or causing unnecessary problems in my gaming hobby and for my fellow hobbyists.
 
Joined
Jul 11, 2022
Messages
375 (0.42/day)
imo it's because AMD is the underdog and most people like to root for the underdog and are more apt to play it down when the underdog screws up. I don't care either way. I have spoken out about practices by Intel, Nvidia and AMD when they are doing something deceitful or causing unnecessary problems in my gaming hobby and for my fellow hobbyists.


I think your point is spot on. The reality is that despite all the competition between Intel and AMD, and AMD and NVIDIA right now, all of these companies are involved in some shady practices and have too many issues with their products.

With AMD CPUs, it is excessive heat from such a dense die, CPUs burning out, still-terrible latency compared to Intel, a core that is only 4-wide, and insanely high platform prices despite Zen 4's IPC being worse than Raptor Cove and even slightly behind Golden Cove. Add the buggy dual-CCD Zen 4 problems in games with their severe cross-CCD latency penalty, and the SoC voltage and IMC burnout issues, especially on the X3D chips.

With NVIDIA, the RTX 4090 is great, but too many cards have bad coil whine, and power consumption is insanely high. The lower SKUs, even just down to the 4080, are a rip-off in price-to-performance, severely gimped compared to the flagship; no lower model is really worth the price.

With Intel CPUs, it is high power consumption, the hybrid architecture causing issues and bugginess, and no more than 8 P-cores on a single die. There are also potential degradation issues with more than 300 watts applied for only a 5-12 hour severe stress test run; look over at overclock.net, where Ichirou states they degraded CPUs run well north of 300 watts in just 5-10 hours or so. And never mind Intel's DDR5 IMC signal-balancing issues: it is very hard to get DDR5 fully rock stable at XMP, even at 6000, especially on ASUS motherboards, and on MSI and Gigabyte as well. Really the only board good for it without insane RAM tweaking is the EVGA Dark, for Raptor Lake DDR5 XMP configs up to 7200.

With AMD GPUs, it's their prices, even more severe coil whine, and underwhelming performance after they hyped them up as NVIDIA RTX 4090 competitors.

Don't get me wrong, the current products actually are very good, but all of these companies have issues of their own in the CPU and GPU space, more so than usual.

I remember when Intel had no competition from AMD, and the Core 2 and early-generation Core i7 CPUs just worked so reliably well, could be overclocked so easily on air, and had nowhere near the amount of thermal issues and other problems today's CPUs have.
 
Joined
Jun 2, 2017
Messages
9,380 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
I remember when Intel had no competition from AMD, and the Core 2 and early-generation Core i7 CPUs just worked so reliably well, could be overclocked so easily on air, and had nowhere near the amount of thermal issues and other problems today's CPUs have.
That is because it was basically the same arch.
 
Joined
Jul 11, 2022
Messages
375 (0.42/day)
That is because it was basically the same arch.


No, the Nehalem Core i series was a new architecture, and it worked very well at the time.

And Sandy Bridge was awesome and worked so well and stood the test of time.

Contrast that with Raptor Lake, which is the same architecture as Alder Lake, yet has more thermal issues, process node issues and such.

Zen 4 is the same story compared to Zen 3. I mean, AMD says 95C is normal, lol. The CPU could burn out. It's basically the same architecture as Zen 3 with insanely higher clock speeds.
 
Joined
Sep 1, 2022
Messages
488 (0.58/day)
System Name Firestarter
Processor 7950X
Motherboard X670E Steel Legend
Cooling LF 2 420
Memory 4x16 G.Skill X5 6000@CL36
Video Card(s) RTX Gigabutt 4090 Gaming OC
Storage SSDS: OS: 2TB P41 Plat, 4TB SN850X, 1TB SN770. Raid 5 HDDS: 4x4TB WD Red Nas 2.0 HDDs, 1TB ext HDD.
Display(s) 42C3PUA, some dinky TN 10.1 inch display.
Case Fractal Torrent
Audio Device(s) PC38X
Power Supply GF3 TT Premium 850W
Mouse Razer Basilisk V3 Pro
Keyboard Steel Series Apex Pro
VR HMD Pimax Crystal with Index controllers
And Sandy Bridge was awesome and worked so well and stood the test of time.
Yes, it did, but it fell once AMD started pumping their CPU game up, which is great for everyone.
 
Joined
Jun 14, 2020
Messages
3,540 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Look over at overclock.net, where Ichirou states they degraded CPUs run well north of 300 watts in just 5-10 hours or so. And never mind Intel's DDR5 IMC signal-balancing issues: it is very hard to get DDR5 fully rock stable at XMP, even at 6000, especially on ASUS motherboards, and on MSI and Gigabyte as well. Really the only board good for it without insane RAM tweaking is the EVGA Dark, for Raptor Lake DDR5 XMP configs up to 7200.
Don't believe everything you read. That's complete nonsense. Ichirou runs unstable overclocks and he thinks his CPU has degraded because he is crashing.
 
Joined
Jul 11, 2022
Messages
375 (0.42/day)
Don't believe everything you read. That's complete nonsense. Ichirou runs unstable overclocks and he thinks his CPU has degraded because he is crashing.


I think that is true regarding degradation; yeah, how could something that degrades so fast have been released?

Though I can confirm the DDR5 XMP stability issues, especially with ASUS boards on 13th Gen (yes, including the Z790 Apex), are very real and bad, and it probably takes months of insane tweaking to get fully stable.

Do you have any experience with DDR5 XMP?

I hear the EVGA Dark is really the only board that works well up to 7200 out of the box with minimal to no tweaking.

Well, maybe the MSI Unify-X as well, though I hear it varies based on the motherboard lottery for the Unify-X.

I use Samsung B-die DDR4 in Gear 1 at 4200 CL16, with tRFC at 260 and tRC at 50, per Good Old Gamer's recommended tweaks showing B-die can go that low. And I get great performance and low latency, sub-50 ns.

Since I had such an issue getting DDR5 stable, it was comparatively easy to tweak B-die and get it stable and tuned. Certainly much easier than DDR5, in my experience.
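As a rough sanity check on those timings, here is a back-of-the-envelope CAS latency calculation. Note the sub-50 ns figure above is a full loaded-latency measurement (AIDA64-style), which adds memory controller and fabric overhead on top of this CAS component:

Code:
def cas_ns(transfer_rate_mts, cl):
    clock_mhz = transfer_rate_mts / 2    # DDR: two transfers per memory clock
    return cl / clock_mhz * 1000         # CAS cycles -> nanoseconds

print(round(cas_ns(4200, 16), 2))  # DDR4-4200 CL16 -> ~7.62 ns
print(round(cas_ns(6000, 30), 2))  # DDR5-6000 CL30 -> 10.0 ns

By first-word latency alone, tuned DDR4-4200 CL16 is still ahead of a typical DDR5-6000 CL30 kit, which is why tight B-die can post sub-50 ns totals.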

 
Joined
Jul 11, 2015
Messages
806 (0.23/day)
System Name Harm's Rig's
Processor 5950X /2700x / AMD 8370e 4500
Motherboard ASUS DARK HERO / ASRock B550 Phantom Gaming 4
Cooling Arctic Liquid Freezer III 420 Push/Pull -6 Noctua NF-A14 i and 6 Noctua NF-A14 i Meshify 2 XL
Memory CORSAIR Vengeance RGB RT 32GB (4x16GB) DDR4 4266cl16 - Patriot Viper Steel DDR4 16GB (4x 8GB)
Video Card(s) ZOTAC AMP EXTREME AIRO 4090 / 1080 Ti /290X CFX
Storage SAMSUNG 980 PRO SSD 1TB / WD Black SN770 2TB, Sabrent NVMe 512GB / 1 SSD 250GB / 1 HDD 3TB
Display(s) Thermal Grizzly WireView / TCL 646 55 TV / 50 Xfinity Hisense A6 XUMO TV
Case Meshify 2 XL - TT 37 VIEW 200MMs - Arctic P14 MAX
Audio Device(s) Sharp Aquos
Power Supply FSP Hydro PTM PRO 1200W ATX 3.0 PCI-E GEN-5 80 Plus Platinum - EVGA 1300G2/Corsair w750
Mouse G502
Keyboard G413
Using a 5950X and 4090 at 4K Epic, raw, RT on, on my system with the latest updates and drivers, there is no 70 FPS limit; I also prefer the widest view.
Side note: playing through Steam, not directly from EA, I notice higher GPU usage.
 

Attachments: IMG20230807151800.jpg (2.5 MB), IMG20230807152357.jpg (2.3 MB)
Joined
Jul 11, 2015
Messages
806 (0.23/day)
System Name Harm's Rig's
Processor 5950X /2700x / AMD 8370e 4500
Motherboard ASUS DARK HERO / ASRock B550 Phantom Gaming 4
Cooling Arctic Liquid Freezer III 420 Push/Pull -6 Noctua NF-A14 i and 6 Noctua NF-A14 i Meshify 2 XL
Memory CORSAIR Vengeance RGB RT 32GB (4x16GB) DDR4 4266cl16 - Patriot Viper Steel DDR4 16GB (4x 8GB)
Video Card(s) ZOTAC AMP EXTREME AIRO 4090 / 1080 Ti /290X CFX
Storage SAMSUNG 980 PRO SSD 1TB / WD Black SN770 2TB, Sabrent NVMe 512GB / 1 SSD 250GB / 1 HDD 3TB
Display(s) Thermal Grizzly WireView / TCL 646 55 TV / 50 Xfinity Hisense A6 XUMO TV
Case Meshify 2 XL - TT 37 VIEW 200MMs - Arctic P14 MAX
Audio Device(s) Sharp Aquos
Power Supply FSP Hydro PTM PRO 1200W ATX 3.0 PCI-E GEN-5 80 Plus Platinum - EVGA 1300G2/Corsair w750
Mouse G502
Keyboard G413
Using a 5950X and 4090 at 4K Epic, raw, RT on, on my system with the latest updates and drivers, there is no 70 FPS limit; I also prefer the widest view.
Side note: playing through Steam, not directly from EA, I notice higher GPU usage.

I've never played a game where it used this much power!
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.91/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Philips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Philips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
That's where changing a few graphics settings would help a lot, and why I dislike CPUs and GPUs with a high maximum wattage - because sooner or later something will use it all, and then over time everything will.
I'd rather run DLSS or drop RT than pull 480W from my GPU - my entire PC uses half of that while gaming.


Edit: I learned this lesson back with an R9 290X, a 300W blower.
I tried to CrossFire them. It went poorly.
High usage always meant high heat, and that hurt their performance and the rest of the system without screamer fans, even when CrossFire wasn't working (and those cards failed a lot, since the cooling couldn't keep up).
 
Joined
Jan 14, 2019
Messages
12,593 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
That's where changing a few graphics settings would help a lot, and why i dislike CPU's and GPU's with a high maximum wattage - because sooner or later something will use it all, and then over time everything will.

I'd rather run DLSS or drop RTX than use 480W from my GPU - my entire PC uses half of that gaming
I would never even buy a 480 W GPU in the first place. My PSU could feed it thanks to the 7800X3D needing so little power, but it just seems excessive anyway. But each to their own.
 
Joined
Dec 12, 2012
Messages
777 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
I would never even buy a 480 W GPU in the first place. My PSU could feed it thanks to the 7800X3D needing so little power, but it just seems excessive anyway. But each to their own.

The thing is, if you have the money to spare, you can get a 4090 and undervolt it and limit it to 250 W, where it will still be faster than a 4080.

All GPUs and CPUs these days are actually designed to run at max power constantly, so they can achieve the highest benchmark score. This ignores efficiency and is completely pointless for regular gaming usage.

In the past people could overclock components themselves, now everything is pushed to the limits by manufacturers.

Efficiency = high number of cores, low clock and low voltage. That's how Apple does it. And that's exactly what can be done with PC hardware, but a lot of people don't want to lose 10-20% performance even if it cuts power in half. This is a stupid mindset, and that's exactly why those companies release factory overclocked products with bad efficiency, because people go for that.
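To put rough numbers on that trade-off, here is a quick perf-per-watt comparison. The FPS and wattage figures are illustrative assumptions, not measurements:

Code:
def fps_per_watt(fps, watts):
    return fps / watts

stock  = fps_per_watt(100, 450)  # assumed stock 4090: 100 FPS at 450 W
capped = fps_per_watt(90, 250)   # same card capped to 250 W, ~10% slower
print(round(capped / stock, 2))  # ~1.62x better efficiency

Giving up ~10% of the frames for roughly 1.6x the efficiency is exactly the kind of deal factory tuning leaves on the table.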
 
Joined
Jan 14, 2019
Messages
12,593 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
The thing is, if you have the money to spare, you can get a 4090 and undervolt it and limit it to 250 W, where it will still be faster than a 4080.

All GPUs and CPUs these days are actually designed to run at max power constantly, so they can achieve the highest benchmark score. This ignores efficiency and is completely pointless for regular gaming usage.

In the past people could overclock components themselves, now everything is pushed to the limits by manufacturers.

Efficiency = high number of cores, low clock and low voltage. That's how Apple does it. And that's exactly what can be done with PC hardware, but a lot of people don't want to lose 10-20% performance even if it cuts power in half. This is a stupid mindset, and that's exactly why those companies release factory overclocked products with bad efficiency, because people go for that.
I don't mind losing 10% performance for much more realistic power consumption, heat and noise, but I'm way too lazy to tinker with voltages. :D

What I do instead is apply a 60 FPS limit in the driver. That way, I get as much performance as needed when I'm below 60, and the desired high efficiency when I'm above it.
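Conceptually, a driver-level limiter just sleeps away the leftover frame time so the GPU idles instead of racing ahead. A toy sketch (real limiters use high-resolution timers and drive the swap chain, not sleep()):

Code:
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS              # ~16.67 ms per frame

def limited_frames(render_one_frame, n_frames):
    for _ in range(n_frames):
        start = time.perf_counter()
        render_one_frame()                   # draw the frame
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:           # finished early: idle instead of
            time.sleep(FRAME_BUDGET - elapsed)   # rendering frames nobody sees

# simulate 10 cheap frames (5 ms of "work" each); takes ~1/6 of a second
limited_frames(lambda: time.sleep(0.005), 10)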
 
Joined
Jul 11, 2015
Messages
806 (0.23/day)
System Name Harm's Rig's
Processor 5950X /2700x / AMD 8370e 4500
Motherboard ASUS DARK HERO / ASRock B550 Phantom Gaming 4
Cooling Arctic Liquid Freezer III 420 Push/Pull -6 Noctua NF-A14 i and 6 Noctua NF-A14 i Meshify 2 XL
Memory CORSAIR Vengeance RGB RT 32GB (4x16GB) DDR4 4266cl16 - Patriot Viper Steel DDR4 16GB (4x 8GB)
Video Card(s) ZOTAC AMP EXTREME AIRO 4090 / 1080 Ti /290X CFX
Storage SAMSUNG 980 PRO SSD 1TB/ WD DARK 770 2TB , Sabrent NVMe 512GB / 1 SSD 250GB / 1 HHD 3 TB
Display(s) Thermal Grizzly WireView / TCL 646 55 TV / 50 Xfinity Hisense A6 XUMO TV
Case Meshify 2 XL- TT 37 VIEW 200MM'S-ARTIC P14MAX
Audio Device(s) Sharp Aquos
Power Supply FSP Hydro PTM PRO 1200W ATX 3.0 PCI-E GEN-5 80 Plus Platinum - EVGA 1300G2/Corsair w750
Mouse G502
Keyboard G413
No DLSS3 for this game , would for sure use it, like I did in Hogwarts .
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.91/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Philips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Philips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
The thing is, if you have the money to spare, you can get a 4090 and undervolt it and limit it to 250 W, where it will still be faster than a 4080.

All GPUs and CPUs these days are actually designed to run at max power constantly, so they can achieve the highest benchmark score. This ignores efficiency and is completely pointless for regular gaming usage.

In the past people could overclock components themselves, now everything is pushed to the limits by manufacturers.

Efficiency = high number of cores, low clock and low voltage. That's how Apple does it. And that's exactly what can be done with PC hardware, but a lot of people don't want to lose 10-20% performance even if it cuts power in half. This is a stupid mindset, and that's exactly why those companies release factory overclocked products with bad efficiency, because people go for that.
Not always: on the 3090, the extra VRAM modules used power even when the VRAM wasn't being hammered, so a 3080 could feed more wattage to the GPU core within the same power limit.
The 3090's extra cores balanced that out somewhat, but it's not perfectly clear cut.

I don't mind losing 10% performance for much more realistic power consumption, heat and noise, but I'm way too lazy to tinker with voltages. :D

What I do instead is apply a 60 FPS limit in the driver. That way, I get as much performance as needed when I'm below 60, and the desired high efficiency when I'm above it.
A flat curve in Afterburner is easy to set up - and if you're on a 60Hz display you'll likely need a 59 or 58 FPS cap to avoid the frame buffer queuing. It gets weird if it's not a Vsync'd display, because the lower you go the more tearing or frame spikes you get, but the higher you go the more latency you have.

A 59 FPS cap at least removes one full frame of latency, with only one frame in 60 being potentially delayed, so it tends to work well - but 58, shaving 2/3 of the latency off, is absolutely noticeable in titles without Reflex or some other queue reduction like NULL, or a frame discard method like fast Vsync.
 
Joined
Jan 14, 2019
Messages
12,593 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
A flat curve in Afterburner is easy to set up - and if you're on a 60Hz display you'll likely need a 59 or 58 FPS cap to avoid the frame buffer queuing. It gets weird if it's not a Vsync'd display, because the lower you go the more tearing or frame spikes you get, but the higher you go the more latency you have.

A 59 FPS cap at least removes one full frame of latency, with only one frame in 60 being potentially delayed, so it tends to work well - but 58, shaving 2/3 of the latency off, is absolutely noticeable in titles without Reflex or some other queue reduction like NULL, or a frame discard method like fast Vsync.
I've got Enhanced (Fast) Sync enabled, and also Radeon Something-Something that only lets one frame be queued. I'm not noticing any latency, but I'll try the lower frame cap, you got me interested.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.91/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Philips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Philips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
I've got Enhanced (Fast) Sync enabled, and also Radeon Something-Something that only lets one frame be queued. I'm not noticing any latency, but I'll try the lower frame cap, you got me interested.
Nvidia's overlay is fantastic for showing it; unsure how AMD's does with it.
The render queue: the CPU renders 2 frames in advance by default in DX12, so if the game engine says 'we've hit Vsync, halt', it still wants to use those frames it has rendered.
You could be 0.1 ms past a display refresh cycle and you'd end up with that first frame stuck waiting to display on the second cycle (not the first), then frames three and four display an intact image - but by now it's four cycles old (at 60Hz, 16.67ms x 3 ≈ 50ms).
Then the 5th frame is back to normal latency and it feels like you got abducted by aliens and lost time, repeating any time it hits the cap.


Enhanced sync does work, and works well: it's got the ability to discard frames, so in theory it renders those two extra frames, throws away the oldest, and you're halving the worst-case scenario.
You're meant to use Fast/Enhanced Vsync with something like 120FPS at 60Hz to get half the render delay, but the discard feature pays off too.
(Vulkan supports discarding natively, so it's got the best of both worlds.)

Rough numbers:
60FPS at 60Hz: one 16ms refresh cycle + one 16ms frame = 32ms total (not 16, because it's always rendering at least one frame while the previous frame is displaying).
120FPS+ at 60Hz: frames take 8ms; enhanced sync discards the older one and uses the newer, so you get 8+16 for 24ms, a noticeable reduction.

This is where VRR kicks in: you can run a 120Hz display at 60FPS and it uses 120Hz timings, so you'd get 8+8 at 60FPS 120Hz, for 16ms total - lower again.


In-between values are a problem, because you could have frame one ready 2 refresh cycles in, then frame 2 takes 4 cycles, then frame 3 takes 5 cycles - microstutter!
Vsync on without VRR always ends up doing this, with the worst case being 50% above or below refresh (45FPS at 60Hz, or 90FPS even with enhanced/fast Vsync - every second frame would be twice the delay of the first one).
That's where the old advice of running Vsync off for competitive games comes from; it's just been made redundant by the ability to discard frames or use VRR - and even a 120Hz display halves the worst-case scenario, so even a poorly optimised setup feels a ton better.
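Those rough numbers, worked in Python with the same rounding (16ms per 60Hz cycle, 8ms per 120FPS frame):

Code:
refresh_60hz = 16   # ms per refresh cycle at 60 Hz
frame_60fps  = 16   # ms to render one frame at 60 FPS
frame_120fps = 8    # ms to render one frame at 120 FPS

# plain Vsync, 60 FPS at 60 Hz: one frame renders while the previous displays
print(frame_60fps + refresh_60hz)    # 32 ms

# enhanced/fast sync, 120 FPS on 60 Hz: the older queued frame is discarded
print(frame_120fps + refresh_60hz)   # 24 ms

# VRR: a 120 Hz panel scans out on the frame's own 8 ms timing
print(frame_120fps + frame_120fps)   # 16 ms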
 
Joined
Jan 14, 2019
Messages
12,593 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Nvidia's overlay is fantastic for showing it; unsure how AMD's does with it.
The render queue: the CPU renders 2 frames in advance by default in DX12, so if the game engine says 'we've hit Vsync, halt', it still wants to use those frames it has rendered.
You could be 0.1 ms past a display refresh cycle and you'd end up with that first frame stuck waiting to display on the second cycle (not the first), then frames three and four display an intact image - but by now it's four cycles old (at 60Hz, 16.67ms x 3 ≈ 50ms).
Then the 5th frame is back to normal latency and it feels like you got abducted by aliens and lost time, repeating any time it hits the cap.


Enhanced sync does work, and works well: it's got the ability to discard frames, so in theory it renders those two extra frames, throws away the oldest, and you're halving the worst-case scenario.
You're meant to use Fast/Enhanced Vsync with something like 120FPS at 60Hz to get half the render delay, but the discard feature pays off too.
(Vulkan supports discarding natively, so it's got the best of both worlds.)

Rough numbers:
60FPS at 60Hz: one 16ms refresh cycle + one 16ms frame = 32ms total (not 16, because it's always rendering at least one frame while the previous frame is displaying).
120FPS+ at 60Hz: frames take 8ms; enhanced sync discards the older one and uses the newer, so you get 8+16 for 24ms, a noticeable reduction.

This is where VRR kicks in: you can run a 120Hz display at 60FPS and it uses 120Hz timings, so you'd get 8+8 at 60FPS 120Hz, for 16ms total - lower again.


In-between values are a problem, because you could have frame one ready 2 refresh cycles in, then frame 2 takes 4 cycles, then frame 3 takes 5 cycles - microstutter!
Vsync on without VRR always ends up doing this, with the worst case being 50% above or below refresh (45FPS at 60Hz, or 90FPS even with enhanced/fast Vsync - every second frame would be twice the delay of the first one).
That's where the old advice of running Vsync off for competitive games comes from; it's just been made redundant by the ability to discard frames or use VRR - and even a 120Hz display halves the worst-case scenario, so even a poorly optimised setup feels a ton better.
That's a pretty comprehensive description, thanks! :)

Now, one thing that never crossed my mind until now: if my monitor has got FreeSync, is it worth enabling Vsync in a game or not? It's only 48-60 Hz, which is not a massive range, but I'm thinking about upgrading as prices are way down compared to only a couple of years ago, and I'm kind of curious about ultrawide curved displays. My understanding used to be that Vsync is not needed for FreeSync to work, but now I'm not sure.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.91/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Philips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Philips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Gsync requires Vsync to work, always has.

A 48-60 Hz range means you can cap to 58 and ensure you don't ever get that Vsync queue issue, but without VRR the display keeps its 60Hz timings, spreading each image out slightly:

60FPS at 60Hz is 16.67ms per frame
58FPS is 17.24ms

Without Gsync/Freesync the display is still working in 16.67ms mode - so there's a ~0.57ms delay each frame that builds up to an eventual skip (roughly every 29th frame).
Instead, what happens is the variable blanking lets the display shift to a 58Hz mode (by making the blanking in front of each frame slightly larger), so every single frame is on time, and none are ever forced to wait for the next cycle to be seen.

With only a 2Hz difference it's not that big a change, but at 48FPS it'd be a massive improvement in visible stutter - every frame is on time and evenly spaced, vs varying latency every frame (normal, double, triple, normal, over and over).




Instead of a frame staying on screen longer and pushing things into the next display cycle, the display cuts back how long the frame is displayed to fit it into the timing of the higher refresh rate. It's done with the blanking, but honestly I'm not 100% sure which part they extend or cut.
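The drift arithmetic behind that, as a quick Python check (illustrative numbers for a 58 FPS cap on a fixed 60 Hz scanout):

Code:
cycle_ms = 1000 / 60   # 16.67 ms per 60 Hz refresh cycle
frame_ms = 1000 / 58   # 17.24 ms between frames at a 58 FPS cap

drift = frame_ms - cycle_ms            # ~0.57 ms late per frame
frames_until_skip = cycle_ms / drift   # ~29 frames until one misses a cycle

print(round(drift, 2), round(frames_until_skip))

VRR makes the drift irrelevant by stretching each refresh to match the frame, which is exactly the blanking adjustment described above.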
 
Joined
Jan 14, 2019
Messages
12,593 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Gsync requires Vsync to work, always has.

A 48-60 Hz range means you can cap to 58 and ensure you don't ever get that Vsync queue issue, but without VRR the display keeps its 60Hz timings, spreading each image out slightly:

60FPS at 60Hz is 16.67ms per frame
58FPS is 17.24ms

Without Gsync/Freesync the display is still working in 16.67ms mode - so there's a ~0.57ms delay each frame that builds up to an eventual skip (roughly every 29th frame).
Instead, what happens is the variable blanking lets the display shift to a 58Hz mode (by making the blanking in front of each frame slightly larger), so every single frame is on time, and none are ever forced to wait for the next cycle to be seen.

With only a 2Hz difference it's not that big a change, but at 48FPS it'd be a massive improvement in visible stutter - every frame is on time and evenly spaced, vs varying latency every frame (normal, double, triple, normal, over and over).




Instead of a frame staying on screen longer and pushing things into the next display cycle, the display cuts back how long the frame is displayed to fit it into the timing of the higher refresh rate. It's done with the blanking, but honestly I'm not 100% sure which part they extend or cut.
So it's best to turn Vsync on, right?
 
Joined
Dec 12, 2012
Messages
777 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
Vsync with VRR is needed if you don't have an FPS limit. If your max refresh is 60 Hz but the game goes over 60 FPS, you'll get tearing - the same behavior as Vsync off.

But if you cap your framerate ~3 FPS below your max refresh rate, you'll never reach the Vsync ceiling and you'll never have tearing, so it makes no difference whether Vsync is enabled or not. You don't want Vsync engaging on a VRR display, because it will increase your input lag. Setting the FPS limit is the best option.
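That rule of thumb as a one-liner, for common refresh rates (the 3 FPS margin is the convention quoted above; some guides use 2-4):

Code:
def vrr_fps_cap(max_refresh_hz, margin=3):
    # stay under the VRR ceiling so Vsync never engages and no queue builds
    return max_refresh_hz - margin

for hz in (60, 120, 144, 165):
    print(hz, "Hz ->", vrr_fps_cap(hz), "FPS cap")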
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.91/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Philips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Philips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
So it's best to turn Vsync on, right?
Vsync+VRR is best

If you can't stay close to your refresh rate, you may need Vsync off to trade stuttering for tearing.
 