
NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

Joined
May 10, 2023
Messages
433 (0.71/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
What that diagram tells me is that the 3090 should be a 250-260 Watt card. There is no need for it to eat more than that out-of-the-box. Overclockers would be happy with that, too.
Haven't you noticed how overclocking has diminished in popularity lately? Manufacturers are pushing components with higher clocks (and power, as a consequence) out of the box to try and get an edge and bigger numbers for marketing reasons.
Consumers have already shown they don't care about sane power consumption; they want that extra performance out of the box. Just look at what happened with AMD's 9000 series, where they had to push out a BIOS with a higher default TDP to appease their customers. Or Intel, where most people didn't give a damn about the great efficiency gains over the previous gen.
 
Joined
Jan 14, 2019
Messages
13,051 (5.97/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
Haven't you noticed how overclocking has diminished in popularity lately? Manufacturers are pushing components with higher clocks (and power, as a consequence) out of the box to try and get an edge and bigger numbers for marketing reasons.
Except that it doesn't bring bigger numbers. 71 vs 73 FPS is not a difference in my book; it's just margin of error. This is why overclocking is dead. Everything is overclocked out-of-the-box for no reason.

Consumers have already shown they don't care about sane power consumption; they want that extra performance out of the box. Just look at what happened with AMD's 9000 series, where they had to push out a BIOS with a higher default TDP to appease their customers. Or Intel, where most people didn't give a damn about the great efficiency gains over the previous gen.
I see that, but I still find it weird, especially with all the propaganda about going green by using less power and stuff. Where is all that greenness in the home PC industry? You're constantly being reminded to use LED lights in your home and turn them off to save 5 watts, only so that your high-end GPU can waste another 100 W on nothing? :confused:
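As a rough illustration of that LED-vs-GPU comparison, here is a back-of-the-envelope sketch; the 100 W figure comes from the post above, while the gaming hours, electricity price, and bulb numbers are assumptions picked purely for illustration:

```python
# Back-of-the-envelope sketch: yearly energy cost of an extra 100 W of GPU draw
# versus the ~5 W an LED bulb saves. All usage figures below are assumptions.
GPU_EXTRA_W = 100          # extra draw at stock limits vs a saner power cap
BULB_SAVING_W = 5          # LED saving mentioned in the post above
GAMING_HOURS_PER_DAY = 3   # assumed
PRICE_PER_KWH = 0.30       # assumed electricity price

def yearly_kwh(watts: float, hours_per_day: float) -> float:
    """Convert an average power draw into kWh per year."""
    return watts * hours_per_day * 365 / 1000

gpu_kwh = yearly_kwh(GPU_EXTRA_W, GAMING_HOURS_PER_DAY)
bulb_kwh = yearly_kwh(BULB_SAVING_W, 24)   # bulb saving counted around the clock
print(f"GPU overhead: {gpu_kwh:.0f} kWh/year, ~{gpu_kwh * PRICE_PER_KWH:.0f} per year in electricity")
print(f"LED saving:   {bulb_kwh:.0f} kWh/year")
# -> roughly 110 kWh/year from the GPU vs 44 kWh/year from the bulb under these assumptions
```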
 
Joined
Apr 13, 2022
Messages
1,209 (1.21/day)
That's awesome! :) Sometimes it's nice to be proven wrong. :ohwell:

It raises the question, though: why does the 4090 have to be a 450 W card by default if the extra power doesn't bring any extra performance to the table? What is Nvidia aiming at with such a high power consumption?
The actual workloads it's intended for. The x090 series cards are spec'd and priced for those. They aren't gaming products.
 
Joined
May 10, 2023
Messages
433 (0.71/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
Except that it doesn't bring bigger numbers. 71 vs 73 FPS is not a difference in my book; it's just margin of error. This is why overclocking is dead. Everything is overclocked out-of-the-box for no reason.
People are really willing to go to great lengths for that extra 2-5% performance; it is what it is.
I see that, but I still find it weird, especially with all the propaganda about going green by using less power and stuff. Where is all that greenness in the home PC industry? You're constantly being reminded to use LED lights in your home and turn them off to save 5 watts, only so that your high-end GPU can waste another 100 W on nothing? :confused:
Eh, I do have opinions on that, but I guess it'd be pretty off-topic.
The actual workloads it's intended for. The x090 series cards are spec'd and priced for those. They aren't gaming products.
Most prosumers will actually power-limit those GPUs. Just look at the TDPs of Nvidia's enterprise lineup; they're way lower than their GeForce counterparts, since perf/watt is an important metric in that space.
 
Joined
Mar 10, 2010
Messages
11,880 (2.19/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Gskill Trident Z 3900cas18 32Gb in four sticks./16Gb/16GB
Video Card(s) Asus tuf RX7900XT /Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores laptop Timespy 6506
So that in some very specific cases it can stretch its legs all the way, burning an extra 100 W for a 100 MHz bump in those synthetic benchmark scores.
Same goes for the 600 W limit some models have: really pushing the power envelope for minor clock gains. Reminder that past a certain point, power rises much faster than performance and you hit hard diminishing returns.

Both my 3090s have a default power limit of 370 W, whereas at 275 W I lose less than 10% perf.

Here's a simple example of power scaling for some AI workloads on a 3090; you can see that after a certain point you barely get any extra performance from increasing power:
[Attachment 378225: power-scaling chart for an AI workload on a 3090]

That has been the case since... always. Here's another example with a 2080ti:
[Attachment 378224: power-scaling chart for a 2080 Ti]

Games often don't really push a GPU that hard, so the consumption while playing is usually well below the actual power limit.
Biscuits. Path tracing etc. etc., i.e. the whole point of buying too much GPU at the moment, pushes a GPU just fine. Or do people buy these over-the-top-priced cards just for ray-traced Fortnite and Roblox?

Games now often DO push high loads.
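For anyone who wants to reproduce the kind of power-limit comparison quoted above (370 W stock vs 275 W on a 3090), a minimal sketch using nvidia-smi on Linux; the wattage list and the benchmark command are placeholders, not values taken from the thread:

```python
# Minimal sketch: sweep software power caps with nvidia-smi and time a workload.
# Requires root (or suitable permissions); "bench_cmd" is a hypothetical workload.
import subprocess
import time

GPU_ID = "0"
LIMITS_W = [275, 300, 330, 370]   # placeholder power caps to test, in watts
bench_cmd = ["./my_benchmark"]    # hypothetical benchmark command

def set_power_limit(watts: int) -> None:
    # "nvidia-smi -pl" sets the board power cap, within the driver's enforced range
    subprocess.run(["nvidia-smi", "-i", GPU_ID, "-pl", str(watts)], check=True)

def time_run() -> float:
    start = time.perf_counter()
    subprocess.run(bench_cmd, check=True)
    return time.perf_counter() - start

for watts in LIMITS_W:
    set_power_limit(watts)
    print(f"{watts} W cap -> {time_run():.1f} s")
```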
 
Joined
Jul 9, 2021
Messages
80 (0.06/day)
But I'd also run the 5080 at 320 W, and so the performance difference will still be whatever it ends up being.
The problem is the 5080's VRAM is already struggling above 2K in Star Wars Outlaws with the secret "Outlaw" settings.
I don't know what games will use over 16 GB of VRAM maxed out, or what insane textures could fill 32 GB. The Witcher 4 and GTA 6 might be among them; no word yet on AC Shadows and Hexe.
And no, an underpowered 5080 is not the same as a 5090, because under the hood it's a totally different card. That's why overclocking won't make a huge difference except in benchmarks.
And seriously, anyone buying a 5090 should water-cool it, with summers getting hotter and hotter, way past the 2-degree climate target.
 
Joined
Jan 14, 2019
Messages
13,051 (5.97/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
The problem is the 5080's VRAM is already struggling above 2K in Star Wars Outlaws with the secret "Outlaw" settings.
I don't know what games will use over 16 GB of VRAM maxed out, or what insane textures could fill 32 GB. The Witcher 4 and GTA 6 might be among them; no word yet on AC Shadows and Hexe.
As long as the latest gen consoles have 16 GB RAM (not VRAM - total system RAM also used as VRAM), I'd be cautious about making predictions on this front.
 
Joined
Dec 14, 2011
Messages
1,120 (0.23/day)
Location
South-Africa
Processor AMD Ryzen 9 5900X
Motherboard ASUS ROG STRIX B550-F GAMING (WI-FI)
Cooling Noctua NH-D15 G2
Memory 32GB G.Skill DDR4 3600Mhz CL18
Video Card(s) ASUS GTX 1650 TUF
Storage SAMSUNG 990 PRO 2TB
Display(s) Dell S3220DGF
Case Corsair iCUE 4000X
Audio Device(s) ASUS Xonar D2X
Power Supply Corsair AX760 Platinum
Mouse Razer DeathAdder V2 - Wireless
Keyboard Corsair K70 PRO - OPX Linear Switches
Software Microsoft Windows 11 - Enterprise (64-bit)
Except that it doesn't bring bigger numbers. 71 vs 73 FPS is not a difference in my books. It's just margin of error. This is why overclocking is dead. Everything is overclocked out-of-the-box for no reason.


I see that, but I still find it weird, especially with all the propaganda about going green by using less power and stuff. Where is all that greenness in the home PC industry? You're constantly being reminded to use LED lights in your home and turn them off to save 5 watts, only so that your high-end GPU can waste another 100 W on nothing? :confused:
Yes, which is why I also no longer buy OC models unless they come with dual BIOS or better VRMs/phases, etc. Makes no sense to pay more for what, a name and maybe 2% more performance? lol
 
Joined
Jan 14, 2019
Messages
13,051 (5.97/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
Yes, which is why I also no longer buy OC models unless they come with dual BIOS or better VRMs/phases, etc. Makes no sense to pay more for what, a name and maybe 2% more performance? lol
Not to mention made-by-AMD and Nvidia FE cards look so much better than all the plastic bling-bling gamery shit made by board partners! :rolleyes:
 
Joined
Jun 14, 2020
Messages
3,661 (2.19/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Yes, which is why I also no longer buy OC models unless they come with dual BIOS or better VRMs/phases, etc. Makes no sense to pay more for what, a name and maybe 2% more performance? lol
I wish better models had 2% more performance :D

They don't. Just look at their clock speeds; they're all within 0.01% of each other. Higher-end models just have a better PCB, power delivery, etc.

Not to mention made-by-AMD and Nvidia FE cards look so much better than all the plastic bling-bling gamery shit made by board partners! :rolleyes:
The FE design was nice on the smaller models (I got a 3060 Ti FE, it's really nice), but look at the 4090 FE, it's ugly as hell. There are still a few partner designs that look decent; it's usually the high end that goes bling-bling gamerz OC RGB while the base models are nice. Check the 4090 Windforce, for example.
 
Joined
May 10, 2023
Messages
433 (0.71/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
Biscuits. Path tracing etc. etc., i.e. the whole point of buying too much GPU at the moment, pushes a GPU just fine. Or do people buy these over-the-top-priced cards just for ray-traced Fortnite and Roblox?

Games now often DO push high loads.
I don't think that's the case; just take a look at Steam's hardware survey and see how the 3090 and 4090 fare in the list, way below the other products.
Most gamers won't be buying those products. Remember that this forum is a niche with some enthusiasts who can afford it, but your average buyer won't even consider that product an option.
That's debatable, the workloads you're talking about would still be VRAM limited.
And that's why you buy multiple of those :p
 
Joined
Jan 14, 2019
Messages
13,051 (5.97/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
I don't think that's the case; just take a look at Steam's hardware survey and see how the 3090 and 4090 fare in the list, way below the other products.
Most gamers won't be buying those products. Remember that this forum is a niche with some enthusiasts who can afford it, but your average buyer won't even consider that product an option.
x90 is a niche even on this forum, I'd dare say. It's for professionals and 4K high-refresh gamers.
 
Joined
Nov 22, 2023
Messages
256 (0.62/day)
Undervolting is the new overclocking. You get a card juiced way past the efficient point of the V/F curve out of the box, and the game is now to keep 95% of the performance for 70% of the power.

It's honestly for the best this way. Overclocking was fun (especially when you had to draw your own traces), but most buyers are fundamentally getting ripped off by not getting all of the performance out of the card they paid for.

Now you get all the performance, and it's on you to find the power/heat level that you're OK with.
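On the Nvidia/Linux side, the closest built-in approximation of that workflow is capping board power and locking clocks with nvidia-smi; the switches below are real nvidia-smi options, but the wattage and clock values are illustrative only:

```python
# Sketch: approximate a "95% perf for 70% power" setup by capping board power and
# keeping boost clocks from chasing the last few MHz. Values below are examples only.
import subprocess

def cap_power(gpu: str, watts: int) -> None:
    # software power limit (nvidia-smi -pl)
    subprocess.run(["nvidia-smi", "-i", gpu, "-pl", str(watts)], check=True)

def lock_gpu_clocks(gpu: str, min_mhz: int, max_mhz: int) -> None:
    # pin the core clock range (nvidia-smi --lock-gpu-clocks / -lgc)
    subprocess.run(["nvidia-smi", "-i", gpu,
                    "--lock-gpu-clocks", f"{min_mhz},{max_mhz}"], check=True)

def reset_gpu_clocks(gpu: str) -> None:
    subprocess.run(["nvidia-smi", "-i", gpu, "--reset-gpu-clocks"], check=True)

if __name__ == "__main__":
    cap_power("0", 300)            # e.g. ~70% of a 430 W default limit
    lock_gpu_clocks("0", 0, 2400)  # stop the boost algorithm chasing the last few MHz
```

A true undervolt (the same clock at a lower voltage) still needs a V/F curve editor such as MSI Afterburner on Windows; the sketch above only trims the top of the stock curve.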
 
Joined
Jan 14, 2019
Messages
13,051 (5.97/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
It's honestly for the best this way. Overclocking was fun (especially when you had to draw your own traces), but most buyers are fundamentally getting ripped off by not getting all of the performance out of the card they paid for.
I kind of agree and disagree. I like using my card to its full potential, but 1. I'm not keen on the top 5% at all costs, and 2. we're constantly being fed that the planet needs to be saved, we're using too much energy and whatnot, so then why do GPUs have to consume hundreds of Watts more out-of-the-box for the extra 5% performance? Is it really worth it? I'm not sure it is.
 
Joined
Oct 28, 2012
Messages
1,199 (0.27/day)
Processor AMD Ryzen 3700x
Motherboard asus ROG Strix B-350I Gaming
Cooling Deepcool LS520 SE
Memory crucial ballistix 32Gb DDR4
Video Card(s) RTX 3070 FE
Storage WD sn550 1To/WD ssd sata 1To /WD black sn750 1To/Seagate 2To/WD book 4 To back-up
Display(s) LG GL850
Case Dan A4 H2O
Audio Device(s) sennheiser HD58X
Power Supply Corsair SF600
Mouse MX master 3
Keyboard Master Key Mx
Software win 11 pro
That's debatable, the workloads you're talking about would still be VRAM limited. Unless you meant selling them in China, bypassing some of the usual sanctions?

They really aren't; the xx90 cards are neither here nor there, IMO.
In art and design, there are lots of freelancers who prefer the GeForce RTX xx90 over the workstation RTX cards. CAD/scientific-simulation users are pretty much the only crowd I've really seen have a hard-on for those professional cards; otherwise, I've seen professionals say that in their field the ROI is simply higher with a GeForce. And if a project really needs more than that, they'd use a cloud render farm anyway (and charge the client accordingly).
 
Joined
Aug 12, 2010
Messages
143 (0.03/day)
Location
Brazil
Processor Ryzen 7 7800X3D
Motherboard ASRock B650M PG Riptide
Cooling Wraith Max + 2x Noctua Redux NF-P14r + 2x NF-P12
Memory 2x16GB ADATA XPG Lancer Blade DDR5-6000
Video Card(s) Powercolor RX 7800 XT Fighter OC
Storage ADATA Legend 970 2TB PCIe 5.0
Display(s) Dell 32" S3222DGM - 1440P 165Hz + P2422H
Case HYTE Y40
Audio Device(s) Microsoft Xbox TLL-00008
Power Supply Cooler Master MWE 750 V2
Mouse Alienware AW320M
Keyboard Alienware AW510K
Software Windows 11 Pro
Will this be the next Fermi?
 
Joined
Nov 22, 2023
Messages
256 (0.62/day)
I kind of agree and disagree. I like using my card to its full potential, but 1. I'm not keen on the top 5% at all costs, and 2. we're constantly being fed that the planet needs to be saved, we're using too much energy and whatnot, so then why do GPUs have to consume hundreds of Watts more out-of-the-box for the extra 5% performance? Is it really worth it? I'm not sure it is.

- Could just go with the accelerationism approach and try to burn down the planet faster... better a quick death than a long and drawn out battle against the inevitable :rockout:

But that's the beauty of undervolting... you don't have to get that last 5% at all costs; you can reduce power consumption and your own carbon footprint. The choice has simply shifted from having to gain performance to having to conserve power and heat; it's the other side of the same coin.

I can't speak to newer Nvidia cards, but AMD definitely has some one-click-and-done settings in their software that will bias the card one way or another, so it doesn't even require a whole lot of tweaking and noodling to get decent results.
 
Joined
Jan 14, 2019
Messages
13,051 (5.97/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
- Could just go with the accelerationism approach and try to burn down the planet faster... better a quick death than a long and drawn out battle against the inevitable :rockout:
Interesting thought. I wouldn't say I entirely disagree, but let's not go there for the sake of other forum members' sanity. :D

But that's the beauty of undervolting... you don't have to get that last 5% at all costs, you can reduce power consumption and your own carbon footprint. The choice has simply shifted from having to gain performance to having to conserve power and heat, other side of the same coin.
Sure, but how many people do that? I'd bet at least 9 out of 10 people just plug their cards in and let it run full blast. What we see here in the forum is a tiny minority.

I can't speak to newer Nvidia cards, but AMD definitely has some one-click-and-done settings in their software that will bias the card one way or another, so it doesn't even require a whole lot of tweaking and noodling to get decent results.
It's not that great. The last time I tried the "auto undervolt" button (if that's what you mean) when I briefly had a 7800 XT on my hands, it shaved maybe 10 W off of it. The power slider works nicely, though.

Does Nvidia have an equivalent function in their new software? I haven't tried it, yet (my HTPCs are running old drivers at the moment).
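For what it's worth, on Linux the AMD power slider maps to the amdgpu driver's hwmon power cap; a minimal sketch, with the card path and the 90% figure assumed for illustration:

```python
# Minimal sketch: set an AMD card's power cap via the amdgpu hwmon interface.
# power1_cap / power1_cap_max are in microwatts; writing them needs root.
# The card0 path and the 0.90 fraction are assumptions, not values from the thread.
from pathlib import Path

def power_cap_file(card: str = "card0") -> Path:
    hwmon_root = Path(f"/sys/class/drm/{card}/device/hwmon")
    hwmon_dir = next(hwmon_root.iterdir())  # the single hwmonN directory for the card
    return hwmon_dir / "power1_cap"

def set_power_fraction(fraction: float, card: str = "card0") -> None:
    cap = power_cap_file(card)
    max_uw = int((cap.parent / "power1_cap_max").read_text())
    cap.write_text(str(int(max_uw * fraction)))

if __name__ == "__main__":
    set_power_fraction(0.90)  # roughly what dragging the driver's power slider to -10% does
```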
 
Joined
Jun 19, 2024
Messages
169 (0.84/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks
You seem to be under the assumption that performance scales linearly with power.
A 5090 at a lower power budget than a 5080 is still going to have almost double the memory bandwidth, and way more cores, even if those are clocked lower.

Your assumptions are also wrong. A 5090 at 320 W is likely to be only 10-20% slower than the stock setting.
The 5080 math is also not that simple because things (sadly) often do not scale linearly like that.

Example - my 4090 at a 90% power limit loses 2% performance.
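A rough toy model shows why the loss is that small: dynamic power scales roughly with frequency times voltage squared, and voltage rises with frequency in the boost range, so power grows roughly with the cube of the clock while throughput grows only linearly with it. A sketch under those assumptions:

```python
# Toy model: power ~ f^3 (from P ~ C*f*V^2 with V rising roughly with f),
# performance ~ f. Invert the power cap to estimate the performance loss.
def perf_loss_for_power_cap(power_fraction: float) -> float:
    freq_fraction = power_fraction ** (1 / 3)
    return 1.0 - freq_fraction

for cap in (0.9, 0.8, 0.7):
    print(f"{cap:.0%} power -> ~{perf_loss_for_power_cap(cap):.1%} perf loss")
# 90% power -> ~3.5%, 80% -> ~7.2%, 70% -> ~11.2% in this toy model.
# Real cards often lose even less, since the stock operating point already sits
# past the efficient knee of the V/F curve (consistent with the ~2% figure above).
```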

and people with more money than brain.

That's where I have a problem with your thinking. Who are you to judge how I spend my money, what my values are, and what I derive pleasure from?

Seriously, just fuck off with this bullshit attitude. I thought this was an enthusiast website.

Just as an FYI I upgraded from a GTX 960. The reason I can afford a 4090 is because I saved the money by not upgrading constantly.
 
Joined
Jan 27, 2024
Messages
398 (1.15/day)
Processor Ryzen AI
Motherboard MSI
Cooling Cool
Memory Fast
Video Card(s) Matrox Ultra high quality | Radeon
Storage Chinese
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Chinese
Mouse Chinese
Keyboard Chinese
VR HMD No
Software Android | Yandex
Benchmark Scores Yes
The actual workloads it's intended for. The x090 series cards are spec'd and priced for those. They aren't gaming products.

Nvidia has other products for non-gaming purposes. They are called Quadro, Tesla and the like.
They are very bad for gaming, while the x090 cards are the highest-performing for gaming. They are gaming cards.
 
Joined
Jan 14, 2019
Messages
13,051 (5.97/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
That‘s where I have a problem with your thinking. Who are you to judge how I spend my money, what my values are, and what I derive pleasure from?

Seriously, just fuck off with this bullshit attitude. I thought this was an enthusiast website.

Just as an FYI I upgraded from a GTX 960. The reason I can afford a 4090 is because I saved the money by not upgrading constantly.
I listed several (even legitimate) use cases for the 4090. Did I say which kind of buyer you personally were? ;)

Also, how do you define an "enthusiast"?

Nvidia has other products for non-gaming purposes. They are called Quadro, Tesla and the like.
There's no more Quadro, just RTX Axxx. And who said they're not good at gaming?

They are very bad for gaming, while the x090 are the highest performant for gaming. They are gaming cards.
That's pure marketing, not the full picture.
 
Joined
Aug 12, 2010
Messages
143 (0.03/day)
Location
Brazil
Processor Ryzen 7 7800X3D
Motherboard ASRock B650M PG Riptide
Cooling Wraith Max + 2x Noctua Redux NF-P14r + 2x NF-P12
Memory 2x16GB ADATA XPG Lancer Blade DDR5-6000
Video Card(s) Powercolor RX 7800 XT Fighter OC
Storage ADATA Legend 970 2TB PCIe 5.0
Display(s) Dell 32" S3222DGM - 1440P 165Hz + P2422H
Case HYTE Y40
Audio Device(s) Microsoft Xbox TLL-00008
Power Supply Cooler Master MWE 750 V2
Mouse Alienware AW320M
Keyboard Alienware AW510K
Software Windows 11 Pro
Seriously, just fuck off with this bullshit attitude. I thought this was an enthusiast website.
Judging by your own comments here, I can safely say that it is not.
 
Joined
Jan 27, 2024
Messages
398 (1.15/day)
Processor Ryzen AI
Motherboard MSI
Cooling Cool
Memory Fast
Video Card(s) Matrox Ultra high quality | Radeon
Storage Chinese
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Chinese
Mouse Chinese
Keyboard Chinese
VR HMD No
Software Android | Yandex
Benchmark Scores Yes
There's no more Quadro, just RTX Axxx. And who said they're not good at gaming?

Me. You know it, but... ?

[attached screenshot]


https://www.reddit.com/r/buildapc/comments/167yyqu
 
Joined
Jan 14, 2019
Messages
13,051 (5.97/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma