Plenty of Genshin Impact gamers with 4090s... Do people really pay $2,000 to play crappy 1080p-upscaled games?
System Name | DLSS / YOLO-PC / FULLRETARD |
---|---|
Processor | i5-12400F / 10600KF / C2D E6750 |
Motherboard | Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333 |
Cooling | Laminar RM1 / Gammaxx 400 / 775 Box cooler |
Memory | 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700 |
Video Card(s) | RX 6700 XT / R9 380 2 GB / 9600 GT |
Storage | A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD |
Display(s) | Compit HA2704 / MSi G2712 / non-existent |
Case | Matrexx 55 / Junkyard special / non-existent |
Audio Device(s) | Want loud, use headphones. Want quiet, use satellites. |
Power Supply | Thermaltake 1000 W / Corsair CX650M / non-existent |
Mouse | Don't disturb, cheese eating in progress... |
Keyboard | Makes some noise. Probably onto something. |
VR HMD | I live in real reality and don't need a virtual one. |
Software | Windows 11 / 10 / 8 |
> GTX 770 refresh scenario.....

The GTX 770 came one year after the 670 and offered a 13% edge; the 970 offered at least a 33% advantage.

Triple-A gaming in the 2020s is like waterboarding, in the sense that it only sounds cool until you find out what it actually is.
System Name | Skunkworks 3.0 |
---|---|
Processor | 5800x3d |
Motherboard | x570 unify |
Cooling | Noctua NH-U12A |
Memory | 32GB 3600 MHz |
Video Card(s) | asrock 6800xt challenger D |
Storage | Sabrent Rocket 4.0 2TB, MX500 2TB |
Display(s) | Asus 1440p144 27" |
Case | Old arse cooler master 932 |
Power Supply | Corsair 1200w platinum |
Mouse | *squeak* |
Keyboard | Some old office thing |
Software | Manjaro |
> RTX 4090 was not the most efficient card, based on W1zzard's reviews:
> View attachment 381454
> (TPU's RTX 4080 Super review here.)
> Likewise, it's not more efficient than the previous efficiency leader, aka the RTX 4080 (Super).
> Still, remember, guys: those results are based on just one game, Cyberpunk 2077. It varies between games, just so you know.
> Unfortunately, the GN video shows an efficiency comparison in only 3 games, which is still more than the single game tested on TPU:
> It would be nice to have a bigger statistical sample: at least 10 games, same settings, same rest of the hardware, RTX 4090 vs RTX 5090. The more games, the better the accuracy of the results. German colleagues have already tested one scenario: they limited the RTX 5090's power to 450 W (RTX 4090 level) and saw an 11-15% performance improvement. That means an RTX 5090 limited to 450 W is indeed more efficient than the RTX 4090. At the full 575 W TGP, I don't think so; I'd say they are pretty much on par, though the RTX 4090 might be very slightly more efficient. Of course, an undervolted RTX 5090 might be a totally different story, similar to the undervolted RTX 4090's story.

Still incorrect to label it as "inefficient". It's more efficient than the 4090 and blows every single AMD card out of the water.
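If you want to sanity-check the perf-per-watt argument in the quoted post, here's a rough sketch. Only the power limits and the ~11-15% uplift at the 450 W cap come from the post; the stock-power uplift figure is an illustrative placeholder, not measured data:

```python
# Rough perf/W comparison. Only the power limits and the ~13% uplift at a
# 450 W cap come from the quoted post; the 130% stock figure is a placeholder.

def perf_per_watt(relative_perf: float, watts: float) -> float:
    """Performance per watt in arbitrary relative units."""
    return relative_perf / watts

rtx_4090       = perf_per_watt(100.0, 450.0)  # baseline: 100% perf at 450 W
rtx_5090_450w  = perf_per_watt(113.0, 450.0)  # ~11-15% faster at the same 450 W
rtx_5090_stock = perf_per_watt(130.0, 575.0)  # hypothetical uplift at 575 W TGP

for name, eff in [("4090 @ 450 W", rtx_4090),
                  ("5090 @ 450 W", rtx_5090_450w),
                  ("5090 @ 575 W", rtx_5090_stock)]:
    print(f"{name}: {eff:.4f} perf/W ({(eff / rtx_4090 - 1) * 100:+.0f}% vs 4090)")
```

With those placeholder numbers, the capped 5090 comes out ~13% more efficient while the stock card lands within a couple of percent of the 4090, which is exactly the quoted conclusion.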
Processor | AMD Ryzen™ 7 5700X |
---|---|
Motherboard | ASRock B450M Pro4-F R2.0 |
Cooling | Arctic Freezer A35 |
Memory | Lexar Thor 32GB 3733 MHz CL16 |
Video Card(s) | PURE AMD Radeon™ RX 7800 XT 16GB |
Storage | Lexar NM790 2TB + Lexar NS100 2TB |
Display(s) | HP X34 UltraWide IPS 165Hz |
Case | Zalman i3 Neo + Arctic P12 |
Audio Device(s) | Airpulse A100 + Edifier T5 |
Power Supply | Sharkoon Rebel P20 750W |
Mouse | Cooler Master MM730 |
Keyboard | Krux Atax PRO Gateron Yellow |
Software | Windows 11 Pro |
> It's more efficient than the 4090 and blows every single AMD card out of the water.

Those are some strong words when the new AMD cards haven't even been tested yet. IPC will be easier to test on the RTX 5080 vs the RTX 4080 due to the same 256-bit bus. Apples vs apples, not apples vs oranges!
System Name | GraniteXT |
---|---|
Processor | Ryzen 9950X |
Motherboard | ASRock B650M-HDV |
Cooling | 2x360mm custom loop |
Memory | 2x24GB Team Xtreem DDR5-8000 [M die] |
Video Card(s) | RTX 3090 FE underwater |
Storage | Intel P5800X 800GB + Samsung 980 Pro 2TB |
Display(s) | MSI 342C 34" OLED |
Case | O11D Evo RGB |
Audio Device(s) | DCA Aeon 2 w/ SMSL M200/SP200 |
Power Supply | Superflower Leadex VII XG 1300W |
Mouse | Razer Basilisk V3 |
Keyboard | Steelseries Apex Pro V2 TKL |
> The CUDA cores can now all execute either INT or FP; on Ada, only half had that capability. When I asked NVIDIA for more details on the granularity of that switch, they acted dumb, gave me an answer to a completely different question, and said "that's all that we can share".

Still better than sharing flat-out incorrect info, I suppose. When Der8auer contacted NVIDIA to ask why the hotspot temperature was removed, their reply was something along the lines of "oh, that sensor was bogus, but we added memory temperatures now!" We've had memory temperatures for ages.
System Name | Main PC |
---|---|
Processor | 13700k |
Motherboard | Asrock Z690 Steel Legend D4 - Bios 13.02 |
Cooling | Noctua NH-D15S |
Memory | 32 Gig 3200CL14 |
Video Card(s) | 4080 RTX SUPER FE 16G |
Storage | 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red |
Display(s) | LG 27GL850 |
Case | Fractal Define R4 |
Audio Device(s) | Soundblaster AE-9 |
Power Supply | Antec HCG 750 Gold |
Software | Windows 10 21H2 LTSC |
> A bit more apples to apples:
> 4090 idle: 22 W
> 5090 idle: 30 W, +36%
> 4090 multi-monitor: 27 W
> 5090 multi-monitor: 39 W, +44%
> 4090 video playback: 26 W
> 5090 video playback: 54 W, +108%
> It's quite horrible. AMD "we'll fix it in drivers (but doesn't)" horrible.
> But making excuses for Nvidia, that this card isn't meant for gamers and home users, is silly. Nvidia spent quite a big chunk of their RTX 5090 presentation on how good it is at gaming, since it's apparently the only card that will see any significant performance uplift over its Lovelace equivalent without "frame quadrupling". Relegate this card to the "Quadro" lineup, or to a "home and small business AI accelerator" lineup, and what are you left with? Cards within 10-15% of their predecessors? That's within overclocking margin, as measly as overclocking is now.

The 5090 is basically an "I give no ***** about being economical" card. Worse under load, massively worse at idle.
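The percentage deltas in that quote check out, by the way. A quick verification using the wattages exactly as posted:

```python
# Verify the idle/multi-monitor/playback deltas quoted above.
figures = {"idle": (22, 30), "multi-monitor": (27, 39), "video playback": (26, 54)}
for scenario, (w_4090, w_5090) in figures.items():
    print(f"{scenario:>15}: {w_4090} W -> {w_5090} W (+{(w_5090 / w_4090 - 1) * 100:.0f}%)")
# -> +36%, +44% and +108%, matching the post.
```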
> Anyone that can afford a 5090 probably isn't overly concerned about the cost to run it for gaming.
> If you game 4 hours a day, that's 28 hours a week.
> If the GPU runs at a continuous 600 W while gaming, you end up with 16.8 kWh a week.
> If you pay $0.10 / kWh = $1.68 a week
> If you pay $0.20 / kWh = $3.36 a week
> If you pay $0.30 / kWh = $5.04 a week
> If you pay $0.70 / kWh = $11.76 a week
> Remember, this is if the GPU runs a sustained, continuous 600 W for those 4 straight hours of gaming. It all depends on the game, resolution, settings and so on. Also remember that the V-Sync power chart shows the GPU pulling about 90 W; the numbers above are for top-end power draw scenarios.
> Personally, I wouldn't want a GPU that can suck down 600 W for gaming. Not to mention that this GPU is priced nearly 3x over what I'm comfortable spending on a GPU, so I'm not the target for this product. If I had oodles of money and no brains I'd get one, but I've got a meager amount of money and brains, so I won't be getting one.

That's scary. The UK regulated tariff, what we call the SVR, at current exchange rates sits between those bottom two examples you listed.
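For anyone who wants to plug in their own tariff and play time, the quoted arithmetic as a small Python snippet (600 W and 4 h/day are the quoted worst-case assumptions):

```python
# Weekly energy cost of GPU gaming, per the arithmetic in the quoted post.
def weekly_cost(gpu_watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    kwh_per_week = gpu_watts / 1000 * hours_per_day * 7
    return kwh_per_week * price_per_kwh

for price in (0.10, 0.20, 0.30, 0.70):
    print(f"${price:.2f}/kWh -> ${weekly_cost(600, 4, price):.2f} per week")
# -> $1.68, $3.36, $5.04 and $11.76, as listed above.
```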
> Those are some strong words when the new AMD cards haven't even been tested yet.

Well, given that the last 3 generations of AMD cards have struggled in the efficiency game and the actual architectural improvements have been near non-existent, I'mma make an educated guess and say RDNA 4 isn't going to be setting that efficiency graph on fire.
> IPC will be easier to test on the RTX 5080 vs the RTX 4080 due to the same 256-bit bus. Apples vs apples, not apples vs oranges!

No idea why IPC was brought up. Apples vs oranges indeed.
> That's scary. The UK regulated tariff, what we call the SVR, at current exchange rates sits between those bottom two examples you listed.

Yeah, but those costs assume 600 W continuously, 4 hours a day, every day.
> No idea why IPC was brought up. Apples vs oranges indeed.

384-bit (4090) vs 512-bit (5090) is technically not the same. For power and IPC testing, 256-bit vs 256-bit will be more accurate.
Processor | Intel Core i5-4690 |
---|---|
Motherboard | MSI H97 PC Mate |
Video Card(s) | PowerColor Red Devil RX 480 8GB |
Case | be quiet! Silent Base 800 Orange Window |
> If you pay $0.10 / kWh = $1.68 a week
> If you pay $0.20 / kWh = $3.36 a week
> If you pay $0.30 / kWh = $5.04 a week
> If you pay $0.70 / kWh = $11.76 a week

Recall that for us European consumers, once VAT is added, the card costs approximately 2,300-2,400 euros.
Processor | AMD Ryzen 7 9800X3D |
---|---|
Motherboard | MSI MPG X870E Carbon Wifi |
Cooling | ARCTIC Liquid Freezer II 280 A-RGB |
Memory | 2x32GB (64GB) G.Skill Trident Z Royal @ 6400MHz 1:1 (30-38-38-30) |
Video Card(s) | MSI GeForce RTX 4090 SUPRIM Liquid X |
Storage | Crucial T705 4TB (PCIe 5.0) w/ Heatsink + Samsung 990 PRO 2TB (PCIe 4.0) w/ Heatsink |
Display(s) | AORUS FO32U2P 4K QD-OLED 240Hz (DisplayPort 2.1) |
Case | CoolerMaster H500M (Mesh) |
Audio Device(s) | AKG N90Q with AudioQuest DragonFly Red (USB DAC) |
Power Supply | Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1 |
Mouse | Logitech G PRO X SUPERLIGHT |
Keyboard | Razer BlackWidow V3 Pro |
Software | Windows 10 64-bit |
> The CUDA cores can now all execute either INT or FP; on Ada, only half had that capability. When I asked NVIDIA for more details on the granularity of that switch, they acted dumb, gave me an answer to a completely different question, and said "that's all that we can share".

On Blackwell, all the CUDA cores can do FP32 or INT32, but games spend about ~35% of that issue capacity on INT32 (some NVIDIA employees confirmed it), so the 5090 is effectively only a ~110-SM FP32 gaming GPU! That's a lot of FP32 performance left on the table... I wonder if they could give each core a "dual instruction" mode, doing FP32 + INT32 at the same time, in next-gen architectures. That could give them a huge boost just by changing the way the architecture works.
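For what it's worth, that mental math can be written down. The core count and boost clock below are NVIDIA's public RTX 5090 specs; the ~35% INT32 share is the unverified claim from the post above:

```python
# Back-of-the-envelope FP32 loss to INT32 issue slots on a unified-core design.
cuda_cores = 21760   # RTX 5090
boost_ghz = 2.41     # official boost clock
int32_share = 0.35   # claimed fraction of issue slots games spend on INT32

peak_fp32 = cuda_cores * 2 * boost_ghz / 1000          # TFLOPS, 2 FLOPs/clock (FMA)
gaming_fp32 = peak_fp32 * (1 - int32_share)
sm_equivalent = cuda_cores * (1 - int32_share) / 128   # 128 FP32 lanes per SM

print(f"peak FP32:   {peak_fp32:.0f} TFLOPS")
print(f"gaming FP32: {gaming_fp32:.0f} TFLOPS (~{sm_equivalent:.0f} SMs' worth)")
```

That is plausibly where the "~110" figure comes from: 170 SMs with 35% of issue slots spent on INT32 leaves roughly 110 SMs' worth of FP32 throughput.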
> A bit more apples to apples:
> 4090 idle: 22 W
> 5090 idle: 30 W, +36% [...]

The 5090 has a 512-bit bus and 33% more CUDA cores... and they had to lower core clocks compared to the 4090 so as not to draw "too much" power... They're also on TSMC's 4 nm node, so they can't get much more out of it than that.
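A toy model of that clock trade-off, assuming dynamic power scales roughly with cores * f^3 (voltage tracking frequency); the core counts, TGPs and the 4090 boost clock are public specs, while the cubic scaling is only a common first-order approximation:

```python
# Toy power model: P ~ k * cores * f^3 (V assumed to scale with f).
cores_4090, tgp_4090, boost_4090 = 16384, 450.0, 2.52   # boost in GHz
cores_5090, tgp_5090 = 21760, 575.0

k = tgp_4090 / (cores_4090 * boost_4090 ** 3)
boost_5090_pred = (tgp_5090 / (k * cores_5090)) ** (1 / 3)
print(f"predicted 5090 boost: {boost_5090_pred:.2f} GHz")  # ~2.49 vs 2.41 official
```

Crude as it is, it lands in the right neighborhood: 33% more cores inside a power budget that only grew ~28% forces the clocks down.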
> Well, I watched the OC3D review of the 5090 Suprim from MSI. When I saw the power draw was 836 watts, I was blown away. Just about all reviewers are looking at price as the mitigating factor, but no matter how you try to spin it, 836 watts from one component in a PC is insane.

Rich boy's video card.
System Name | Best AMD Computer |
---|---|
Processor | AMD 7900X3D |
Motherboard | Asus X670E E Strix |
Cooling | In Win SR36 |
Memory | GSKILL DDR5 32GB 5200 30 |
Video Card(s) | Sapphire Pulse 7900XT (Watercooled) |
Storage | Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500 |
Display(s) | GIGABYTE FV43U |
Case | Corsair 7000D Airflow |
Audio Device(s) | Corsair Void Pro, Logitech Z523 5.1 |
Power Supply | Deepcool 1000M |
Mouse | Logitech g7 gaming mouse |
Keyboard | Logitech G510 |
Software | Windows 11 Pro 64 Steam. GOG, Uplay, Origin |
Benchmark Scores | Firestrike: 46183 Time Spy: 25121 |
> Are you sure that isn't total system draw? I haven't seen a review go that high yet unless it was showing total system power instead of just the GPU.

I am pretty sure it was Furmark.
System Name | G-Station 2.0 "YGUAZU" |
---|---|
Processor | AMD Ryzen 7 5700X3D |
Motherboard | Gigabyte X470 Aorus Gaming 7 WiFi |
Cooling | Freezemod: Pump, Reservoir, 360mm Radiator, Fittings / Bykski: Blocks / Barrow: Meters |
Memory | Asgard Bragi DDR4-3600CL14 2x16GB |
Video Card(s) | Sapphire PULSE RX 7900 XTX |
Storage | 240GB Samsung 840 Evo, 1TB Asgard AN2, 2TB Hiksemi FUTURE-LITE, 320GB+1TB 7200RPM HDD |
Display(s) | Samsung 34" Odyssey OLED G8 |
Case | Lian Li Lancool 216 |
Audio Device(s) | Astro A40 TR + MixAmp |
Power Supply | Cougar GEX X2 1000W |
Mouse | Razer Viper Ultimate |
Keyboard | Razer Huntsman Elite (Red) |
Software | Windows 11 Pro, Garuda Linux |
> Are you sure that isn't total system draw? I haven't seen a review go that high yet unless it was showing total system power instead of just the GPU.

I mean, there's an article here on TPU about Igor's Lab measuring 901 W spikes:

GeForce RTX 5090 Power Excursions Tested: Can Spike to 901W Under 1ms (www.techpowerup.com)

But that's what it is: spikes. And they're still covered by the ATX 3.1 spec.
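To see why a sub-millisecond excursion is a transient problem for the PSU rather than a thermal one, here's a quick averaging sketch. The 575 W TGP and 901 W peak come from the specs and the article; the 100 ms window and single spike are illustrative assumptions:

```python
# Average power over a window containing one 1 ms spike to 901 W.
base_w, spike_w = 575.0, 901.0   # sustained TGP vs measured excursion
spike_s, window_s = 0.001, 0.1   # 1 ms spike within a 100 ms window

avg_w = (base_w * (window_s - spike_s) + spike_w * spike_s) / window_s
print(f"average: {avg_w:.1f} W")  # ~578 W -- the spike barely moves the average
```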
> Thanks, I had missed that article. Seems like a nothingburger if it's a 1 ms spike that the spec accounts for, not the card constantly pulling almost 850 watts.

Well, the Asus card here draws over 600 watts. I'm not trying to bash the card, but the power draw is insane no matter how good the cooling solution is. And 1200 W PSUs are not cheap.
System Name | PC |
---|---|
Processor | i7-13700k |
Motherboard | Z790 Aorus Elite AX |
Cooling | NH-D15 |
Memory | Corsair 2x32GB 6000mt/s cl30 |
Video Card(s) | RTX4070 Gaming OC |
Storage | 2TB 990 pro, 2TB Crucial P1, 2TB 870 EVO, 2TB 870 QVO, 3x8TB Seagate Exos 7E10, 6TB Toshiba N300 |
Display(s) | Dell S255HG |
Case | Fractal Define R6 TG |
Audio Device(s) | Creative AE-5 plus, 2xPresonus Eris E8 XT |
Power Supply | Corsair RM750i |
Mouse | Logitech G403 |
Keyboard | Corsair K70 TKL |
Software | win10 pro + win11pro + fedora |
Benchmark Scores | https://www.3dmark.com/spy/44211099 |
> Well, the Asus card here draws over 600 watts. I'm not trying to bash the card, but the power draw is insane no matter how good the cooling solution is. And 1200 W PSUs are not cheap.

A good PC case is also needed to cool down that oven. And the original cooler is bad: ~40 dBA is not acceptable at this price...