
NVIDIA 2025 International CES Keynote: Liveblog

Joined
May 10, 2023
Messages
481 (0.79/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
At least with FG you know they're running the same game
By your own previous argument, it's running in a different way and should not be comparable.
I don't think it's even about fairness; it's just a shitty way of presenting those performance metrics for the tiny percentage of people who would even care. Wouldn't you want to know that this now supports a smaller data type and that you can now run smaller models?
The tiny percentage of people that care about it should be aware that those performance increases come from the extra bandwidth.
Be honest: when you saw that, did you assume it was the same data type, or did you magically understand there must be more to it before squinting at the footnotes (if you ever did before someone else pointed it out)? I for one admit I missed it before I saw someone else talk about it.
Being 100% honest, I hadn't even seen that Flux was in that graph at first lol
I just saw one game, then another one with DLSS stuff on top, and another one like that, and just stopped looking, because I don't even care about games, and the comparison between DLSS stuff is kinda moot, as everyone here has already agreed.
I only noticed it when someone else brought up the FP4 vs FP8 stuff earlier; I was confused about what they were talking about until I noticed Flux was in there (and only for the 4090 vs 5090 comparison, not the others), so I was already aware of it by the time I saw it.

Does the AI industry even buy 5070/5080 level cards? I mean, home users getting their feet wet in AI, sure, but the wealthiest AI corps need a lot more oomph, don't they? That's who uber expensive professional cards are for. To them, everything you say about the 5070/5080 is meaningless.
5070s I don't think so, but 5080s would still be useful. If your model fits within 16GB, you can use these as cheap-yet-powerful inference stations.
Not a big company doing colocation in a big DC, but plenty of startups do something like that. Just take a look at vast.ai and you'll notice systems like this:
[Screenshot of a vast.ai listing]

8x 4070S Ti on an Epyc node with 258GB of RAM.
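As a rough illustration of the "fits within 16GB" point, here's a back-of-envelope sketch; the model sizes and the 2 GB runtime overhead are illustrative assumptions (real inference also needs room for activations and KV cache):

```python
# Rough check: do a model's weights fit in a 16 GB card for inference?
# The 2 GB overhead figure is an illustrative assumption, not a measurement.
def fits_in_vram(params_billions: float, bytes_per_param: float,
                 vram_gb: float = 16.0, overhead_gb: float = 2.0) -> bool:
    weights_gb = params_billions * bytes_per_param  # 1e9 params * bytes ~ GB
    return weights_gb + overhead_gb <= vram_gb

print(fits_in_vram(13, 2.0))  # 13B @ FP16 -> 26 GB of weights: no
print(fits_in_vram(13, 1.0))  # 13B @ FP8  -> 13 GB of weights: yes, barely
print(fits_in_vram(13, 0.5))  # 13B @ FP4  -> 6.5 GB: fits comfortably
```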

No, they're buying B200s, which also used FP4 claims (vs FP8 for Hopper) in their marketing slides.


Look, the thing is, there was another company at CES that compared their 120 W CPU vs the competition's 17 W chip, with no fine print, btw. No one is talking about that being misleading, but we have 50 different threads 20 pages long complaining about Nvidia. Makes you wonder.
B200s would be for the big techs; most others either buy smaller models (or even consumer GPUs), or just subscribe to some cloud provider.

Fair enough. Still wrong, imo, but as long as buyers are fine with it, who am I to argue.
They're more than fine; it means lots of VRAM savings and more perf.
Really? That's poor as well. I guess no one was really interested in that CPU. I don't even know which one you're talking about; it completely flew past me (although, I admit, I only looked at GPUs this time around).
That was AMD comparing Strix Halo vs Lunar Lake (and it was the shitty 30 W Lunar Lake model, instead of the regular 17 W one).

The RTX 4000 series broke the old pattern of the next generation's near-top SKU beating the previous generation's top SKU. That no longer works, dudes.

Nvidia widened the gap between SKUs. The RTX 4080 has only 60% of the 4090's compute units.
With the RTX 5000 series, the gap widens even further: the RTX 5080 will have only 50% of the RTX 5090's compute units, and just 65% of the RTX 4090's.
The RTX 5080 is basically an RTX 4080 Super with 5% more compute units and slightly higher clocks.
There's no room for any significant performance boost this time (no significant node change).

Seriously, do your math, people. I'll say it once more: the RTX 5080 won't beat the RTX 4090 in native rendering (no DLSS or FG) because it lacks the hardware resources.
We should come back to this discussion once 5080 reviews are up. I have no problem admitting I was wrong when I'm wrong.
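For what it's worth, the compute-unit ratios above do check out; a quick check, assuming the commonly cited spec-sheet SM counts:

```python
# SM counts per public spec sheets; treat these figures as assumptions.
sm = {"4090": 128, "4080": 76, "5090": 170, "5080": 84}

print(f"4080 / 4090: {sm['4080'] / sm['4090']:.0%}")  # ~59%
print(f"5080 / 5090: {sm['5080'] / sm['5090']:.0%}")  # ~49%
print(f"5080 / 4090: {sm['5080'] / sm['4090']:.0%}")  # ~66%
```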

As for the RX 9070 XT, no one really expects it to go toe to toe with the RX 7900 XTX; with the RX 7900 XT, maybe. (I'm not taking RT into account here.)
AMD clearly stated that they want to focus on making a mainstream card for the masses, with vastly improved RT performance over the previous generation.
I personally expect the RX 9070 XT to land 5% below the RX 7900 XT in raster while beating it in RT. Unfortunately, I don't give a f* about RT right now.
RT may become a reasonable thing once it's less of a burden on hardware, meaning that turning it on costs no more than 15-20% of performance.
Anything above that is just too much. My personal expectation is that AMD will move with RDNA4 from ~60% performance degradation down to 30-35%.
To be honest, even though the 4090 had almost 70% more cores, that didn't mean it had 70% more performance in games, and in the same way the 5090 won't have 100% higher perf than the 5080 in this scenario.
The 4090 was really bottlenecked by memory bandwidth in games, and the 5080 has bandwidth pretty similar to it, so the gap between those two may not be as big as the difference in SMs.
Will it be faster or equal in games? I don't know; reviews should reveal that once they're available, but I wouldn't be surprised if it is (in the same sense I wouldn't be surprised if it isn't). Game perf doesn't scale linearly with either memory bandwidth or compute units, so it's hard to estimate anything.
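To put rough numbers on that, here's a crude bounding sketch using spec-sheet figures (treat them as assumptions, not measurements); real game scaling lands somewhere below these ceilings:

```python
# Crude ceiling: game perf scales at best with min(SM ratio, bandwidth ratio).
sm = {"4090": 128, "5080": 84, "5090": 170}
bw_gbps = {"4090": 1008, "5080": 960, "5090": 1792}

# 5090 vs 5080: ~2.02x the SMs, but only ~1.87x the bandwidth.
print(min(sm["5090"] / sm["5080"], bw_gbps["5090"] / bw_gbps["5080"]))  # ~1.87
# 4090 vs 5080: ~1.52x the SMs, but nearly identical bandwidth.
print(min(sm["4090"] / sm["5080"], bw_gbps["4090"] / bw_gbps["5080"]))  # ~1.05
```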
 
Joined
Jul 4, 2023
Messages
45 (0.08/day)
Location
You wish
... The 4090 was really bottlenecked by memory bandwidth in games, and the 5080 has bandwidth pretty similar to it, so the gap between those two may not be as big as the difference in SMs...
Who f**n cares? It has a higher number!
 
Joined
Jun 19, 2024
Messages
218 (1.06/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks
Whatever someone thinks about Nvidia's presentation, and as bad as you think the cards they showed us are, they were good enough to make their competitor sound the retreat. But sure, let's once again be angry because Nvidia is lying to us about the 5070 being a 4090, while completely ignoring the fact that the 5070 is good enough to make AMD change whatever they had planned. Instead, they wasted their time comparing a 17 W chip to a 120 W chip, because hey, that's in no way misleading.

On top of all this, the CEO couldn’t be bothered to show up to the single largest event of the year.
 
Joined
Oct 27, 2020
Messages
807 (0.53/day)
Actually, I find the statement in Nvidia's slide that the 5070 has 4090 performance plausible (if misleading).
Before the slide, Jensen was talking about ray tracing and AI-generated frames, saying that for every 33 million pixels produced with MFG, only 2 million are calculated through traditional rendering.
So the comparison Nvidia seemingly wants to make for the 5070 vs the 4090 is FHD native res with ray tracing applied, then upscaled to 4K with DLSS (Performance), with MFG in the 5070's case and FG in the 4090's.
Apply DLSS to the results below (the 4090 is 145 and the 4070S is 92, for example), then multiply by MFG for the 5070 and just FG for the 4090. It seems perfectly doable, and the experience won't be far off for many games that aren't fast-paced, since the 5070 will have, for example, a 30 fps base with up to 120 fps with MFG, while the 4090 will have a 60 fps base with up to 120 fps with FG.
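The 33M vs 2M figure works out if you assume DLSS Performance (1080p internal for a 4K output) plus MFG 4x; a quick sanity check, where those mode assumptions are mine:

```python
# "33 million pixels, only 2 million rendered": reconstructing the claim.
# Assumes DLSS Performance mode (1080p internal -> 4K output) and MFG 4x.
rendered  = 1920 * 1080          # ~2.07M pixels actually shaded per frame
displayed = 4 * (3840 * 2160)    # 1 upscaled + 3 generated 4K frames
print(f"{rendered / 1e6:.1f}M rendered -> {displayed / 1e6:.1f}M displayed")
# 2.1M rendered -> 33.2M displayed
```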
 
Joined
Jan 19, 2023
Messages
269 (0.37/day)
Actually, I find the statement in Nvidia's slide that the 5070 has 4090 performance plausible (if misleading).
Before the slide, Jensen was talking about ray tracing and AI-generated frames, saying that for every 33 million pixels produced with MFG, only 2 million are calculated through traditional rendering.
So the comparison Nvidia seemingly wants to make for the 5070 vs the 4090 is FHD native res with ray tracing applied, then upscaled to 4K with DLSS (Performance), with MFG in the 5070's case and FG in the 4090's.
Apply DLSS to the results below (the 4090 is 145 and the 4070S is 92, for example), then multiply by MFG for the 5070 and just FG for the 4090. It seems perfectly doable, and the experience won't be far off for many games that aren't fast-paced, since the 5070 will have, for example, a 30 fps base with up to 120 fps with MFG, while the 4090 will have a 60 fps base with up to 120 fps with FG.
If I apply Lossless Scaling 4x frame gen on my 7900 XTX, or even better, FSR FG with AFMF2, which gives me 2 interpolated frames, is my GPU suddenly faster than a 4090? And can that be put into benchmarks? If not, then the 5070 is not equal to the 4090.
Don't get me wrong, I might jump on a 5080 myself, just for the RT perf and, tbh, just because, but bullshit marketing is bullshit marketing.
 
Joined
Apr 14, 2022
Messages
771 (0.77/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Acer Nitro XV271UM3B IPS 180Hz
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Keyboard Asus ROG Falchion
Software Windows 11 64bit
Even if a GPU can create a million fake frames per second, we should never accept that as a performance metric.
Yes, it helps and is even necessary for some PT games, but it's not a decisive feature.
DLSS is the most important asset Nvidia has, not FG.
 
Joined
Jan 14, 2019
Messages
13,229 (6.05/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
Actually, I find the statement in Nvidia's slide that the 5070 has 4090 performance plausible (if misleading).
Before the slide, Jensen was talking about ray tracing and AI-generated frames, saying that for every 33 million pixels produced with MFG, only 2 million are calculated through traditional rendering.
So the comparison Nvidia seemingly wants to make for the 5070 vs the 4090 is FHD native res with ray tracing applied, then upscaled to 4K with DLSS (Performance), with MFG in the 5070's case and FG in the 4090's.
Apply DLSS to the results below (the 4090 is 145 and the 4070S is 92, for example), then multiply by MFG for the 5070 and just FG for the 4090. It seems perfectly doable, and the experience won't be far off for many games that aren't fast-paced, since the 5070 will have, for example, a 30 fps base with up to 120 fps with MFG, while the 4090 will have a 60 fps base with up to 120 fps with FG.
That's exactly what's happening. The thing is, without knowing how MFG affects your gameplay experience, this isn't valid information. It's almost like saying that the 5070 runs games faster at 720p low than the 4090 does at 4K ultra. Of course it does, duh. ;)

First we had fake resolutions, then fake frames, now we have multiple fake frames, all introducing different sorts of graphical and latency issues, and people are pissing their pants in joy because it gives them MOAR POWAH!!! What happened to just enjoying games? :(
 
Joined
Jun 19, 2024
Messages
218 (1.06/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks
Outputs from FP4 and FP8 models are not equivalent; quit thinking like an AI tourist. Anyone actually using these for work would know this is a false comparison.

Your beloved is adding FP4.
  • The first product in the AMD Instinct MI350 Series, the AMD Instinct MI350X accelerator, is based on the AMD CDNA 4 architecture and is expected to be available in 2025. It will use the same industry standard Universal Baseboard server design as other MI300 Series accelerators and will be built using advanced 3nm process technology, support the FP4 and FP6 AI datatypes and have up to 288 GB of HBM3E memory.
 
Joined
Jan 14, 2019
Messages
13,229 (6.05/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
Your beloved is adding FP4.
  • The first product in the AMD Instinct MI350 Series, the AMD Instinct MI350X accelerator, is based on the AMD CDNA 4 architecture and is expected to be available in 2025. It will use the same industry standard Universal Baseboard server design as other MI300 Series accelerators and will be built using advanced 3nm process technology, support the FP4 and FP6 AI datatypes and have up to 288 GB of HBM3E memory.
You're missing the point. Vya wasn't discussing FP4 support, but the fact that it was compared to FP8 performance in the presentation as if they were equal.
 
Joined
May 10, 2023
Messages
481 (0.79/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
You're missing the point. Vya wasn't discussing FP4 support, but the fact that it was compared to FP8 performance in the presentation as if they were equal.
They did that to showcase a new feature. Anyone who's actually in the field would find this a nice thing.
Anyone in the field would also know that they could have been using 1-bit weights and the graph would still look pretty similar; the data type for that model is almost negligible, since it's mostly memory-bound, and that's where the perf uplift came from.
 
Joined
Jan 8, 2017
Messages
9,579 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
they could have been using 1-bit weights and the graph would still look pretty similar; the data type for that model is almost negligible, since
100% false. I don't know why you insist the data type is irrelevant and totally interchangeable, but it's total nonsense.
 
Joined
Apr 24, 2020
Messages
2,742 (1.59/day)
FP4 has a place, mostly because 80B-parameter models perform better (even highly compressed / quantized to FP4) than 11B-parameter models at FP16.

So running an 80B-parameter model at 4-bit beats running an 11B-parameter model at 16-bit. But everyone would prefer to run the 80B-parameter model at 16-bit if at all possible. Alas, the 80B-parameter model needs too much RAM.

Comparing an FP8 benchmark on the old card against an FP4 benchmark on the new cards is 100% shenanigans. It's just false marketing, dare I say. In fact, because the FP4 model uses half the RAM, I bet that running FP4 on old hardware would still be a dramatic speedup (it cuts memory traffic by 50%!). Even without any FP4 tensor units, you can do the multiply-and-accumulate in FP8 or FP16 and downsample to FP4 for storage.
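A minimal sketch of that memory-bound argument, with purely illustrative numbers (a 12B-parameter model on a ~1 TB/s card; neither is a benchmark result):

```python
# Memory-bound lower bound: time to stream all weights once per token.
params = 12e9            # illustrative model size, an assumption
bw = 1.0e12              # ~1 TB/s of memory bandwidth, also an assumption

for name, bytes_per_param in [("FP8", 1.0), ("FP4", 0.5)]:
    t_ms = params * bytes_per_param / bw * 1e3
    print(f"{name}: ~{t_ms:.0f} ms/token just moving weights")
# FP8: ~12 ms/token, FP4: ~6 ms/token. Halving the weight size roughly
# doubles throughput whenever bandwidth is the bottleneck -- on old hardware too.
```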
 
Joined
Jun 14, 2020
Messages
3,741 (2.24/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Actually, I find the statement in Nvidia's slide that the 5070 has 4090 performance plausible (if misleading).
Before the slide, Jensen was talking about ray tracing and AI-generated frames, saying that for every 33 million pixels produced with MFG, only 2 million are calculated through traditional rendering.
So the comparison Nvidia seemingly wants to make for the 5070 vs the 4090 is FHD native res with ray tracing applied, then upscaled to 4K with DLSS (Performance), with MFG in the 5070's case and FG in the 4090's.
Apply DLSS to the results below (the 4090 is 145 and the 4070S is 92, for example), then multiply by MFG for the 5070 and just FG for the 4090. It seems perfectly doable, and the experience won't be far off for many games that aren't fast-paced, since the 5070 will have, for example, a 30 fps base with up to 120 fps with MFG, while the 4090 will have a 60 fps base with up to 120 fps with FG.
Of course it's plausible when you use MFG, but that's the point: if the 5070 with MFG matches the 4090 with FG, that literally makes the 4090 twice as fast in raw horsepower.
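The arithmetic behind that, assuming both cards land at the same ~120 fps output:

```python
# Equal output fps with different frame multipliers implies unequal base fps.
output_fps = 120
base_5070 = output_fps / 4   # MFG 4x: 1 rendered + 3 generated -> 30 fps
base_4090 = output_fps / 2   # FG 2x:  1 rendered + 1 generated -> 60 fps
print(base_4090 / base_5070) # 2.0 -> "equal" output hides a 2x raw gap
```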

Something I'm puzzled by, both reading the comments here and on other platforms: using the data from TPU's latest GPU testing (https://www.techpowerup.com/review/gpu-test-system-update-for-2025/2.html), it looks like Nvidia will have the 5 (maybe 6, we'll have to see where the 5070 Ti lands) fastest cards in pure raster. Again, just raster, not even touching RT. Looking at RT, it has the top 9-11 fastest cards depending on resolution (assuming the 5070 Ti will be faster than the XTX, which is likely the case). And yet we're complaining that they're ignoring raw performance for AI...? They have cards from 2020 (LOL) that are faster in pure RT performance than AMD's latest and greatest.

So, do they need to have the 50 fastest cards in both RT and raster for the complaining to stop, or what am I missing?
 
Joined
Jul 24, 2024
Messages
314 (1.85/day)
System Name AM4_TimeKiller
Processor AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard ASUS ROG Strix B550-E Gaming
Cooling Arctic Freezer II 420 rev.7 (push-pull)
Memory G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s) ASRock Radeon RX 7800 XT Phantom Gaming
Storage Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case Corsair 7000D Airflow
Audio Device(s) Creative Sound Blaster X-Fi Titanium
Power Supply Seasonic Prime TX-850
Mouse Logitech wireless mouse
Keyboard Logitech wireless keyboard
To be honest, even though the 4090 had almost 70% more cores, that didn't mean it had 70% more performance in games, and in the same way the 5090 won't have 100% higher perf than the 5080 in this scenario.
The 4090 was really bottlenecked by memory bandwidth in games, and the 5080 has bandwidth pretty similar to it, so the gap between those two may not be as big as the difference in SMs.
Will it be faster or equal in games? I don't know; reviews should reveal that once they're available, but I wouldn't be surprised if it is (in the same sense I wouldn't be surprised if it isn't). Game perf doesn't scale linearly with either memory bandwidth or compute units, so it's hard to estimate anything.
The 5080 has 75% of the 4090's memory bandwidth; I wouldn't call that "pretty similar".
Even though you made a partially valid point, it's (IMHO) still not enough for the 5080 to beat the 4090 in native (raster) performance.

That's exactly what's happening. The thing is that without knowing how MFG affects your gameplay experience, this isn't valid information. It's almost like saying that the 5070 runs games faster at 720p low than the 4090 does at 4K ultra. Of course it does, duh. ;)

First we had fake resolutions, then fake frames, now we have multiple fake frames, all introducing different sorts of graphical and latency issues, and people are pissing their pants in joy because it gives them MOAR POWAH!!! What happened to just enjoying games? :(
Next up: fake games! I've mentioned that before, if you recall (AI-generated graphics, sounds, even scripts).
The gaming industry will face tremendous difficulties once gamers get their hands on tools to create AI-generated games for free.
I have no problem playing older games as long as there's multiplayer/co-op support (servers still running).
I doubt the quality of games such as L4D2, Diablo, Borderlands, Red Alert, StarCraft 2, Battlefield BC2, Jagged Alliance 2, etc. will ever be beaten. I've lost thousands of hours of my life to these games.
I'm not trying to be pessimistic here, but realistic. Just look at the quality of games today...
Of course it's plausible when you use MFG, but that's the point: if the 5070 with MFG matches the 4090 with FG, that literally makes the 4090 twice as fast in raw horsepower.
Exactly.
 
Joined
Jan 8, 2017
Messages
9,579 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
once gamers get their hands on tools to create AI-generated games for free.
That stuff is still ages away from being close to usable, if it ever will be. The concern is misguided; it's the corporations who will try to fill games with AI-generated trash the moment it becomes feasible.
 
Joined
Jan 14, 2019
Messages
13,229 (6.05/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
Next up: fake games! I've mentioned that before, if you recall (AI-generated graphics, sounds, even scripts).
The gaming industry will face tremendous difficulties once gamers get their hands on tools to create AI-generated games for free.
I have no problem playing older games as long as there's multiplayer/co-op support (servers still running).
I doubt the quality of games such as L4D2, Diablo, Borderlands, Red Alert, StarCraft 2, Battlefield BC2, Jagged Alliance 2, etc. will ever be beaten. I've lost thousands of hours of my life to these games.
I'm not trying to be pessimistic here, but realistic. Just look at the quality of games today...
Oh, there are lots of amazing games out there, believe me! Just look further than the overhyped, mass-produced EA / Ubisoft AAA crap. The future is in indie and lesser-known titles; I've been saying this for years. Stray and Hellblade: Senua's Sacrifice both made me cry. Abzu is also a great one to recommend. They run great on a Steam Deck; you don't even need high-end hardware for them. I have a few more on my list that I've yet to play (Lost Ember, Star Trucker, Bramble: The Mountain King, just to name a few).
 
Joined
Jun 14, 2020
Messages
3,741 (2.24/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Oh, there are lots of amazing games out there, believe me! Just look further than the overhyped, mass-produced EA / Ubisoft AAA crap. The future is in indie and lesser-known titles; I've been saying this for years. Stray and Hellblade: Senua's Sacrifice both made me cry. Abzu is also a great one to recommend. They run great on a Steam Deck; you don't even need high-end hardware for them. I have a few more on my list that I've yet to play (Lost Ember, Star Trucker, Bramble: The Mountain King, just to name a few).
Kena: Bridge of Spirits is pretty damn good too.
 
Joined
Jan 14, 2019
Messages
13,229 (6.05/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
Kena: Bridge of Spirits is pretty damn good too.
How did I forget? That's on my list of games to play, too! I just installed it on my Deck a few days ago. :)
 