System Name | Dark Palimpsest |
---|---|
Processor | Intel i9 13900k with Optimus Foundation Block |
Motherboard | EVGA z690 Classified |
Cooling | MO-RA3 420mm Custom Loop |
Memory | G.Skill 6000CL30, 64GB |
Video Card(s) | Nvidia 4090 FE with Heatkiller Block |
Storage | 3 NVMe SSDs, 2TB-each, plus a SATA SSD |
Display(s) | Gigabyte FO32U2P (32" QD-OLED), Asus ProArt PA248QV (24") |
Case | Be quiet! Dark Base Pro 900 |
Audio Device(s) | Logitech G Pro X |
Power Supply | Be quiet! Straight Power 12 1200W |
Mouse | Logitech G502 X |
Keyboard | GMMK Pro + Numpad |
That literally is the reason. By Nvidia going entirely in that direction, they're trying to force the power supply manufacturers to go that way too. I'm not saying I agree with it, I can just hear the management decision: "well, if we don't force the issue they'll never change their cables."

I see that side of it, but there are not a whole lot of 650W units out there with a 12V-2×6 connector, and none that I know of that are 550W or lower. You don't need a 750W unit to run a 100W GPU (or even 170-200W). The PSU makers will need to step up to the plate in their offerings as well. Sure, you can use an adapter, but why force people to use an adapter when there really is no need to if you just kept the same connector?
5090 Titan? A $3K electric heater for your home.
System Name | PCGOD |
---|---|
Processor | AMD FX 8350@ 5.0GHz |
Motherboard | Asus TUF 990FX Sabertooth R2 2901 Bios |
Cooling | Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED |
Memory | 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V) |
Video Card(s) | AMD Radeon 290 Sapphire Vapor-X |
Storage | Samsung 840 Pro 256GB, WD Velociraptor 1TB |
Display(s) | NEC Multisync LCD 1700V (Display Port Adapter) |
Case | AeroCool Xpredator Evil Blue Edition |
Audio Device(s) | Creative Labs Sound Blaster ZxR |
Power Supply | Seasonic 1250 XM2 Series (XP3) |
Mouse | Roccat Kone XTD |
Keyboard | Roccat Ryos MK Pro |
Software | Windows 7 Pro 64 |
What I want to see is the cards used without an internet connection enabled at all, the results recorded, and then with the internet connection enabled, results recorded, with stock default software and no tuning.

I think the more pressing reason is more powah = moar clocks = lower shader count required = lower die size = bigger margins. I don't think AMD was ever in their minds while designing and positioning their stacks. AMD was happy designing their stuff for consoles already for quite a while - and Nvidia had SUPER warming up anyway.
Lazy because it's just a power bump, and there have been few if any architectural improvements coming our way. The Nvidia I knew and happily bought from truly innovated. What we've got now is a clusterfuck of Vaseline-smeared, brute-forced lighting that makes zero sense and is just there to fake the idea of progress. In the meantime, games play like absolute shit and Nvidia GPUs get slaughtered by Lumen just the same. But who knows, maybe Nvidia pulls a rabbit out of the hat on that front. Oh yeah, AI... of course.
System Name | EXTREME-FLIGHT SIM |
---|---|
Processor | AMD RYZEN 7 9800X3D 4.7GHZ 8-core 120W |
Motherboard | ASUS ROG X670E Crosshair EXTREME BIOS V.2506 |
Cooling | be quiet! Silent Loop 2 360MM, Light Wings 120 & 140MM |
Memory | G.SKILL Trident Z5 RGB 32GBx2 DDR5-6000 CL32/EXPOⅡ |
Video Card(s) | ASUS ROG Strix RTX4090 O24 |
Storage | 2TB CRUCIAL T705 M.2, 4TB Seagate FireCuda 3.5"x7200rpm |
Display(s) | Samsung Odyssey Neo G9 57" 5120x1440 120Hz DP2.1; #2: Ulrhzar 8" Touchscreen (HUD) |
Case | be quiet! Dark Base Pro 900 Rev.2 Silver |
Audio Device(s) | ROG SupremeFX ALC4082, Creative SoundBlaster Katana V2 |
Power Supply | be quiet! Dark Power Pro 12 1500W via APC Back-UPS 1500 |
Mouse | LOGITECH Pro Superlight2 and POWERPLAY Mouse Pad |
Keyboard | CORSAIR K100 AIR |
Software | WINDOWS 11 x64 PRO 23H2, MSFS2020-2024 Aviator Edition, DCS |
Benchmark Scores | fast and stable AIDA64 |
Processor | AMD Ryzen 7 9800X3D (+PBO) |
---|---|
Motherboard | MSI MPG X870E Carbon Wifi |
Cooling | ARCTIC Liquid Freezer II 280 A-RGB |
Memory | 2x32GB (64GB) G.Skill Trident Z Royal @ 6400MHz 1:1 (30-38-38-30) |
Video Card(s) | MSI GeForce RTX 4090 SUPRIM Liquid X |
Storage | Samsung 990 PRO 2TB w/ Heatsink SSD + Seagate FireCuda 530 SSD 2TB w/ Heatsink |
Display(s) | AORUS FO32U2P 4K QD-OLED 240Hz monitor (+ LG OLED C9 55" TV 4K@120Hz) |
Case | CoolerMaster H500M (Mesh) |
Audio Device(s) | AKG N90Q with AudioQuest DragonFly Red (USB DAC) |
Power Supply | Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1 |
Mouse | Logitech G PRO X SUPERLIGHT |
Keyboard | Razer BlackWidow V3 Pro |
Software | Windows 10 64-bit |
Oh yeah, definitely. They went almost all-in with Lovelace (best TSMC node they could get, over 50% more CUDA cores vs Ampere, higher clocks, higher TDP, a lot more L2 cache, Frame Generation, Path Tracing, Ray Reconstruction, etc.), but like I said, AMD were supposed to beat them with their 8192-shader RDNA 3 cores (in raster of course, not RT/PT), whereas Blackwell will not have competition on the high end, so they don't need to push as hard; they're just competing with themselves. That's why the SUPER or Refresh variants will probably have 3GB GDDR7 chips too, vs 2GB chips for the vanilla ones. AMD are busy with SoC chips, trying to catch up with Nvidia on A.I. and gain back some GPU market share (hence the mainstream RDNA 4 GPUs).

I think the more pressing reason is more powah = moar clocks = lower shader count required = lower die size = bigger margins. I don't think AMD was ever in their minds while designing and positioning their stacks. AMD was happy designing their stuff for consoles already for quite a while - and Nvidia had SUPER warming up anyway.
System Name | IZALITH (or just "Lith") |
---|---|
Processor | AMD Ryzen 7 7800X3D (4.2GHz base, 5.0GHz boost, -30 PBO offset) |
Motherboard | Gigabyte X670E Aorus Master Rev 1.0 |
Cooling | Deepcool Gammaxx AG400 Single Tower |
Memory | Corsair Vengeance 64GB (2x32GB) 6000MHz CL40 DDR5 XMP (XMP enabled) |
Video Card(s) | PowerColor Radeon RX 7900 XTX Red Devil OC 24GB (2.39GHz base, 2.99GHz boost, -30 core offset) |
Storage | 2x1TB SSD, 2x2TB SSD, 2x 8TB HDD |
Display(s) | Samsung Odyssey G51C 27" QHD (1440p 165Hz) + Samsung Odyssey G3 24" FHD (1080p 165Hz) |
Case | Corsair 7000D Airflow Full Tower |
Audio Device(s) | Corsair HS55 Surround Wired Headset/LG Z407 Speaker Set |
Power Supply | Corsair HX1000 Platinum Modular (1000W) |
Mouse | Logitech G502 X LIGHTSPEED Wireless Gaming Mouse |
Keyboard | Keychron K4 Wireless Mechanical Keyboard |
Software | Arch Linux |
Processor | AMD Ryzen 7 9800X3D (+PBO) |
---|---|
Motherboard | MSI MPG X870E Carbon Wifi |
Cooling | ARCTIC Liquid Freezer II 280 A-RGB |
Memory | 2x32GB (64GB) G.Skill Trident Z Royal @ 6400MHz 1:1 (30-38-38-30) |
Video Card(s) | MSI GeForce RTX 4090 SUPRIM Liquid X |
Storage | Samsung 990 PRO 2TB w/ Heatsink SSD + Seagate FireCuda 530 SSD 2TB w/ Heatsink |
Display(s) | AORUS FO32U2P 4K QD-OLED 240Hz monitor (+ LG OLED C9 55" TV 4K@120Hz) |
Case | CoolerMaster H500M (Mesh) |
Audio Device(s) | AKG N90Q with AudioQuest DragonFly Red (USB DAC) |
Power Supply | Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1 |
Mouse | Logitech G PRO X SUPERLIGHT |
Keyboard | Razer BlackWidow V3 Pro |
Software | Windows 10 64-bit |
Nvidia released a 2nd version of the 16-pin connector, and since then the 12VHPWR specs have been upgraded; new PSUs come with ATX 3.1/PCIe 5.1.

I'm a little out of the loop since I prefer AMD: has the 12V-2×6 connector solved the failure issues of the 12VHPWR? Because a repeat performance of $2000+ GPUs burning would be extremely disappointing.
System Name | Office case + Dremel = gaming |
---|---|
Processor | Ryzen 5 5600x |
Motherboard | Asus Prime b450m-A II |
Cooling | Thermalright Assassin X 120 SE |
Memory | Corsair Vengeance 2x8GB + 2x16GB = 48GB, 3600 MT/s |
Video Card(s) | Msi Aero OC Gtx 1080ti |
Storage | crucial nvme ssd 1 tb pcie 3.0 |
Display(s) | Minifire 180hz Full HD IPS |
Case | Acer Aspire M3201 |
Audio Device(s) | xbox wireless headset (over usb) |
Power Supply | MSI MAG A650BN 650w non-modular |
Mouse | TMKB M1SE (Pink) |
Keyboard | TMKB T98SE (Brown tactile switches) |
VR HMD | Oculus quest 2 |
Software | windows11 |
Benchmark Scores | P106-90 6GB in Steel Nomad: 550 (Vega 8 for display out) |
That is probably true, but I don't think they need to. You hear a lot about the top-end cards online, but most people stick to mid to high end. If AMD release a 5070/5080 equivalent at a better price and/or lower TDP, I'd see that as a win.

AMD can't build a 5090.
System Name | BigRed |
---|---|
Processor | I7 12700k |
Motherboard | Asus Rog Strix z690-A WiFi D4 |
Cooling | Noctua D15S chromax black/MX6 |
Memory | TEAM GROUP 32GB DDR4 4000C16 B die |
Video Card(s) | MSI RTX 3080 Gaming Trio X 10GB |
Storage | M.2 drives WD SN850X 1TB 4x4 BOOT/WD SN850X 4TB 4x4 STEAM/USB3 4TB OTHER |
Display(s) | Dell s3422dwg 34" 3440x1440p 144hz ultrawide |
Case | Corsair 7000D |
Audio Device(s) | Logitech Z5450/KEF uniQ speakers/Bowers and Wilkins P7 Headphones |
Power Supply | Corsair RM850x 80 Plus Gold |
Mouse | Logitech G604 lightspeed wireless |
Keyboard | Logitech G915 TKL lightspeed wireless |
Software | Windows 10 Pro X64 |
Benchmark Scores | Who cares |
System Name | Tiny the White Yeti |
---|---|
Processor | 7800X3D |
Motherboard | MSI MAG Mortar b650m wifi |
Cooling | CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3 |
Memory | 32GB Corsair Vengeance 30CL6000 |
Video Card(s) | ASRock RX7900XT Phantom Gaming |
Storage | Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB |
Display(s) | Gigabyte G34QWC (3440x1440) |
Case | Lian Li A3 mATX White |
Audio Device(s) | Harman Kardon AVR137 + 2.1 |
Power Supply | EVGA Supernova G2 750W |
Mouse | Steelseries Aerox 5 |
Keyboard | Lenovo Thinkpad Trackpoint II |
VR HMD | HD 420 - Green Edition ;) |
Software | W11 IoT Enterprise LTSC |
Benchmark Scores | Over 9000 |
That was RDNA2 and RDNA3, and it didn't materialize, did it? Nvidia had 20% more expensive raster perf, and AMD dropped to <10% market share.

That is probably true, but I don't think they need to. You hear a lot about the top-end cards online, but most people stick to mid to high end. If AMD release a 5070/5080 equivalent at a better price and/or lower TDP, I'd see that as a win.
Why? Because you think Nvidia is secretly downloading FPS? I don't think we've arrived there just yet, lol, but you might be right 3 generations from now. I think that's the gen where we'll be back at 1-slot add-in cards for a GPU; all they need is a network chip. We'll still pay 1.5K for them though, or you can sub to GeForce NOW for the small fee of $5 per hour of gaming.

What I want to see is the cards used without an internet connection enabled at all, the results recorded, and then with the internet connection enabled, results recorded, with stock default software and no tuning.
Processor | Ryzen 5 5700x |
---|---|
Motherboard | B550 Elite |
Cooling | Thermalright Peerless Assassin 120 SE |
Memory | 32GB Fury Beast DDR4 3200Mhz |
Video Card(s) | Gigabyte 3060 ti gaming oc pro |
Storage | Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs |
Display(s) | LG 27gp850 1440p 165Hz 27'' |
Case | Lian Li Lancool II performance |
Power Supply | MSI 750w |
Mouse | G502 |
System Name | Skunkworks 3.0 |
---|---|
Processor | 5800x3d |
Motherboard | x570 unify |
Cooling | Noctua NH-U12A |
Memory | 32GB 3600 mhz |
Video Card(s) | asrock 6800xt challenger D |
Storage | Sabrent Rocket 4.0 2TB, MX500 2TB |
Display(s) | Asus 1440p144 27" |
Case | Old arse cooler master 932 |
Power Supply | Corsair 1200w platinum |
Mouse | *squeak* |
Keyboard | Some old office thing |
Software | Manjaro |
AMD already said that during the coof heard round the world, they prioritized what production they had on EPYC chips, and Ryzen chips by extension, over GPUs. So rDNA2 was in vanishingly short supply until the end of the generation, when sales had slowed.

That was RDNA2 and RDNA3, and it didn't materialize, did it? Nvidia had 20% more expensive raster perf, and AMD dropped to <10% market share.
So long as physics remains unbroken, latency and input lag will ensure that GeForce NOW remains the poor man's option, with even a 4060 providing a better experience.

Why? Because you think Nvidia is secretly downloading FPS? I don't think we've arrived there just yet, lol, but you might be right 3 generations from now. I think that's the gen where we'll be back at 1-slot add-in cards for a GPU; all they need is a network chip. We'll still pay 1.5K for them though, or you can sub to GeForce NOW for the small fee of $5 per hour of gaming.
Processor | Ryzen 5 5700x |
---|---|
Motherboard | B550 Elite |
Cooling | Thermalright Peerless Assassin 120 SE |
Memory | 32GB Fury Beast DDR4 3200Mhz |
Video Card(s) | Gigabyte 3060 ti gaming oc pro |
Storage | Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs |
Display(s) | LG 27gp850 1440p 165Hz 27'' |
Case | Lian Li Lancool II performance |
Power Supply | MSI 750w |
Mouse | G502 |
rDNA3 was super late to the game. The 7600 was a waste of sand, more expensive than the 6650 for the same performance (oh hey, look, it isn't just Nvidia that does this), the 7800XT was great but launched over half a year too late to matter, with Nvidia's 4060 and 4070 saturating the market, and the 7700XT was another mispriced disappointment (and also way too late to market).
System Name | Office |
---|---|
Processor | Ryzen 5600G |
Motherboard | ASUS B450M-A II |
Cooling | be quiet! Shadow Rock LP |
Memory | 16GB Patriot Viper Steel DDR4-3200 |
Video Card(s) | Gigabyte RX 5600 XT |
Storage | PNY CS1030 250GB, Crucial MX500 2TB |
Display(s) | Dell S2719DGF |
Case | Fractal Define 7 Compact |
Power Supply | EVGA 550 G3 |
Mouse | Logitech M705 Marathon |
Keyboard | Logitech G410 |
Software | Windows 10 Pro 22H2 |
Why is it nuts? What is the arbitrary number GPUs should not go past, and why?
System Name | Hellbox 5.1(same case new guts) |
---|---|
Processor | Ryzen 7 5800X3D |
Motherboard | MSI X570S MAG Torpedo Max |
Cooling | TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res |
Memory | 2x16GB Gskill Trident Neo Z 3600 CL16 |
Video Card(s) | Powercolor Hellhound 7900XTX |
Storage | 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB |
Display(s) | Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400 |
Case | TT Kandalf L.C.S. |
Audio Device(s) | Soundblaster ZX/Logitech Z906 5.1 |
Power Supply | Seasonic TX-850 Platinum |
Mouse | G502 Hero |
Keyboard | G19s |
VR HMD | Oculus Quest 3 |
Software | Win 11 Pro x64 |
System Name | Tiny the White Yeti |
---|---|
Processor | 7800X3D |
Motherboard | MSI MAG Mortar b650m wifi |
Cooling | CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3 |
Memory | 32GB Corsair Vengeance 30CL6000 |
Video Card(s) | ASRock RX7900XT Phantom Gaming |
Storage | Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB |
Display(s) | Gigabyte G34QWC (3440x1440) |
Case | Lian Li A3 mATX White |
Audio Device(s) | Harman Kardon AVR137 + 2.1 |
Power Supply | EVGA Supernova G2 750W |
Mouse | Steelseries Aerox 5 |
Keyboard | Lenovo Thinkpad Trackpoint II |
VR HMD | HD 420 - Green Edition ;) |
Software | W11 IoT Enterprise LTSC |
Benchmark Scores | Over 9000 |
Irrelevant, because it wasn't until AMD dropped prices far below MSRP that the 6700/6800 actually sold out, and Ampere was already a good 30% more expensive per frame.

AMD already said that during the coof heard round the world, they prioritized what production they had on EPYC chips, and Ryzen chips by extension, over GPUs. So rDNA2 was in vanishingly short supply until the end of the generation, when sales had slowed.
It was a good decision on AMD's part, but it's still AMD's fault they lost market share there.
rDNA3 was super late to the game. The 7600 was a waste of sand, more expensive than the 6650 for the same performance (oh hey, look, it isn't just Nvidia that does this), the 7800XT was great but launched over half a year too late to matter, with Nvidia's 4060 and 4070 saturating the market, and the 7700XT was another mispriced disappointment (and also way too late to market).
So long as physics remains unbroken, latency and input lag will ensure that GeForce NOW remains the poor man's option, with even a 4060 providing a better experience.
System Name | Zen 3 Daily Rig |
---|---|
Processor | AMD Ryzen 9 5900X |
Motherboard | ASUS Crosshair VIII Dark Hero |
Cooling | Optimus Foundation AM4, Alphacool Eisblock 3080 FE, HWLabs 360GTX and 360GTS, D5, ModMyMods ModWater |
Memory | G.Skill Trident Z Neo 32GB DDR4-3600 (@ 3733 CL14) |
Video Card(s) | Nvidia RTX 3080 Ti Founders Edition |
Storage | x2 Samsung 970 Evo Plus 2TB, Crucial MX500 1TB |
Display(s) | LG 42" C4 OLED |
Case | Lian Li O11 Dynamic |
Audio Device(s) | Aquacomputer HighFlow NEXT, Aquacomputer Octo |
Power Supply | be Quiet! Straight Power 12 1500W |
Mouse | Corsair Scimitar RGB Elite Wireless |
Keyboard | Keychron Q1 Pro |
Software | Windows 11 Pro |
They are sticking with 5nm TSMC again, aren't they? Probably getting whatever gen-on-gen increase they can from some architectural improvements, but I'm guessing much larger dies and, of course, moar power.

Such a lazy release. Nvidia pushing the power button every other gen does not bode well.
System Name | Skunkworks 3.0 |
---|---|
Processor | 5800x3d |
Motherboard | x570 unify |
Cooling | Noctua NH-U12A |
Memory | 32GB 3600 mhz |
Video Card(s) | asrock 6800xt challenger D |
Storage | Sabrent Rocket 4.0 2TB, MX500 2TB |
Display(s) | Asus 1440p144 27" |
Case | Old arse cooler master 932 |
Power Supply | Corsair 1200w platinum |
Mouse | *squeak* |
Keyboard | Some old office thing |
Software | Manjaro |
I don't remember that. I DO remember 6800s being totally unavailable for over a year, and when I DID get my 6800XT, it was near $1000. Same went for 6700XTs, which I watched for my friend throughout the lockdowns and afterwards, waiting for them to show up.

Irrelevant, because it wasn't until AMD dropped prices far below MSRP that the 6700/6800 actually sold out, and Ampere was already a good 30% more expensive per frame.
I don't either. To me it seems arbitrary; people are worried about power use on cards that cost far more than the electricity would ever run, even in expensive places like Europe or California.

I don't understand how we're swinging so far again with the power. I mean, I had a 1200W PSU over a decade ago, as at the time I planned on CrossFiring some 2900 XTs (yeah yeah, fail card). Then I dropped down to 1050W, and now I'm currently running an 850W Platinum.
Why, when the nodes are going down, which should hypothetically mean more efficiency, are the last few gens becoming so power hungry again? I mean, I know my 850W is "enough", but the fact is that we are back to 1000, 1200 and even 1600W PSUs, and GPUs like this that have apparently already saturated the stupid connector, so they've had to add another.
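For what it's worth, the "the card costs far more than the electricity ever would" argument a few posts up is easy to sanity-check. A minimal sketch, where the wattages, hours per day, and the 0.30/kWh rate are illustrative assumptions, not figures from this thread:

```python
# Rough annual electricity cost of a GPU's gaming-load draw.
# All inputs are illustrative assumptions.

def annual_cost(watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    """Yearly cost for a steady draw of `watts` during gaming hours."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# Example: 4 h/day of gaming at an assumed European-ish 0.30/kWh.
for watts in (250, 450, 600):
    print(f"{watts} W -> {annual_cost(watts, 4, 0.30):.0f} per year")
```

Even at the 600W end and a high power price, that lands in the low hundreds per year, which is the point being made about cards that cost four figures up front.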
Why are you limited to 550 watts? 1kW+ PSUs have been around for a long time; hell, 750W PSUs are not that much more than 550s.

250W. That keeps a single-GPU system able to be powered by a 550W PSU and helps keep whole-system cost and waste heat down.
Processor | Ryzen 7800X3D |
---|---|
Motherboard | ROG STRIX B650E-F GAMING WIFI |
Memory | 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5) |
Video Card(s) | INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2 |
Storage | 2TB Samsung 980 PRO, 4TB WD Black SN850X |
Display(s) | 42" LG C2 OLED, 27" ASUS PG279Q |
Case | Thermaltake Core P5 |
Power Supply | Fractal Design Ion+ Platinum 760W |
Mouse | Corsair Dark Core RGB Pro SE |
Keyboard | Corsair K100 RGB |
VR HMD | HTC Vive Cosmos |
For some reason AMD and Nvidia are both sticking with a 5nm-class process for GPUs, specifically N4P this time around. The 3nm dies we know about are what, Apple's M4 at 165 mm², Arrow Lake's compute die at 115 mm², anything larger in the non-enterprise space? Should this raise concerns about the yields of TSMC 3nm-class processes for large dies, considering that TSMC has had N3 in volume production since very late 2022? Or would the problem be just the wafer price?

They are sticking with 5nm TSMC again, aren't they?
System Name | Office |
---|---|
Processor | Ryzen 5600G |
Motherboard | ASUS B450M-A II |
Cooling | be quiet! Shadow Rock LP |
Memory | 16GB Patriot Viper Steel DDR4-3200 |
Video Card(s) | Gigabyte RX 5600 XT |
Storage | PNY CS1030 250GB, Crucial MX500 2TB |
Display(s) | Dell S2719DGF |
Case | Fractal Define 7 Compact |
Power Supply | EVGA 550 G3 |
Mouse | Logitech M705 Marathon |
Keyboard | Logitech G410 |
Software | Windows 10 Pro 22H2 |
I don't remember that. I DO remember 6800s being totally unavailable for over a year, and when I DID get my 6800XT, it was near $1000. Same went for 6700XTs, which I watched for my friend throughout the lockdowns and afterwards, waiting for them to show up.
I don't either. To me it seems arbitrary; people are worried about power use on cards that cost far more than the electricity would ever run, even in expensive places like Europe or California.
And if you're concerned, you can always undervolt for dramatic gains.
Why are you limited to 550 watts? 1kW+ PSUs have been around for a long time; hell, 750W PSUs are not that much more than 550s.
We could apply the same argument to a 150W GPU, which keeps whole-system cost and waste heat even LOWER than 250W does. So why is 250W/550W the cutoff?
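The 250W-GPU-on-a-550W-PSU pairing being debated here can be reproduced with back-of-the-envelope sizing. A minimal sketch; the CPU wattage, the rest-of-system allowance, and the 1.2x transient-headroom factor are assumptions for illustration, not figures from the thread:

```python
# Back-of-the-envelope PSU sizing: sustained component draw
# times a headroom factor for load transients and PSU efficiency.
# Component wattages and the factor are illustrative assumptions.

def recommended_psu(gpu_w: float, cpu_w: float, rest_w: float = 75,
                    headroom: float = 1.2) -> float:
    """Minimum PSU rating for the given component budgets."""
    return (gpu_w + cpu_w + rest_w) * headroom

print(recommended_psu(250, 125))  # the 250 W GPU case: lands near 550 W
print(recommended_psu(150, 125))  # the 150 W counter-example: smaller still
```

Which also illustrates the counter-argument: the cutoff moves wherever you set the GPU budget, so 250W/550W is one defensible point on a sliding scale rather than a hard limit.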
Processor | Ryzen 7 5700X |
---|---|
Motherboard | ASUS TUF Gaming X570-PRO (WiFi 6) |
Cooling | Noctua NH-C14S (two fans) |
Memory | 2x16GB DDR4 3200 |
Video Card(s) | Reference Vega 64 |
Storage | Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA |
Display(s) | Nixeus NX-EDG27, and Samsung S23A700 |
Case | Fractal Design R5 |
Power Supply | Seasonic PRIME TITANIUM 850W |
Mouse | Logitech |
VR HMD | Oculus Rift |
Software | Windows 11 Pro, and Ubuntu 20.04 |
I think you're mistaken about the jump in IPC from Ampere to Ada. The SMX is essentially the same; it was Ampere that saw nearly a 25% increase in clock-normalized throughput per SMX compared to Turing. Turing also improved performance per SMX compared to Pascal; just compare a 2070 Super to a 1080.

Why is it nuts? What is the arbitrary number GPUs should not go past, and why?
This ignores reality; Ada showed a significant bump from Ampere in IPC, not just clock and core bumps. IIRC it's a 16% increase when adjusted for core count and clock. It's ironic: the behavior you describe fits AMD's rDNA3 far better than Ada.
When hardware T&L came out, it crushed the first few gens of compatible cards. Can you imagine if the forums were as cynical as they are today? PC gaming would have been snuffed out because your GeForce 256 couldn't play Half-Life at 300 FPS on max settings.
The way games play has nothing to do with Nvidia, and game developers' inability to optimize their games is also not Nvidia's fault.
System Name | Ryzen Reflection |
---|---|
Processor | AMD Ryzen 9 5900x |
Motherboard | Gigabyte X570S Aorus Master |
Cooling | 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi |
Memory | Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v |
Video Card(s) | Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz |
Storage | WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2) |
Display(s) | Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | LG 24" IPS 1440p |
Case | Lian Li PC-011D XL | Custom cables by Cablemodz |
Audio Device(s) | FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic |
Power Supply | Seasonic Prime Ultra Platinum 850 |
Mouse | Razer Viper v2 Pro |
Keyboard | Corsair K65 Plus 75% Wireless - USB Mode |
Software | Windows 11 Pro 64-Bit |
Processor | Ryzen 7800X3D |
---|---|
Motherboard | ROG STRIX B650E-F GAMING WIFI |
Memory | 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5) |
Video Card(s) | INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2 |
Storage | 2TB Samsung 980 PRO, 4TB WD Black SN850X |
Display(s) | 42" LG C2 OLED, 27" ASUS PG279Q |
Case | Thermaltake Core P5 |
Power Supply | Fractal Design Ion+ Platinum 760W |
Mouse | Corsair Dark Core RGB Pro SE |
Keyboard | Corsair K100 RGB |
VR HMD | HTC Vive Cosmos |
The "normal" TDP has increased a LOT in 2 decades. And gains from overclocking have gone down, also a LOT. As for outside factors, the cost of power has increased in most countries since then.

That enthusiasts actually complain about TDP these days is funny to me, when maybe 2 decades ago PC enthusiasts would purposefully try to blow through those numbers by overclocking everything they had to the actual brink of failure. Somehow TDP actually matters now for desktop users.
I'd go with AMD right now if they did well not only in gaming but productivity too. I don't care for RT. Content creation/editing is on the rise. A 6900XT is just a tad better than a 2080 Ti that's 2.5x cheaper on the used market. I want to see them make the changes, and I'd be happy to go AMD with GPUs as well. I think a lot of people are fed up with leather boy and would happily come over if they upped their game. People buy Nvidia because there's no choice. Even the DaVinci Resolve CEO says "we currently don't recommend AMD cards".

That is probably true, but I don't think they need to. You hear a lot about the top-end cards online, but most people stick to mid to high end. If AMD release a 5070/5080 equivalent at a better price and/or lower TDP, I'd see that as a win.
System Name | S.L.I + RTX research rig |
---|---|
Processor | Ryzen 7 5800X 3D. |
Motherboard | MSI MEG ACE X570 |
Cooling | Corsair H150i Cappellx |
Memory | Corsair Vengeance pro RGB 3200mhz 32Gbs |
Video Card(s) | 2x Dell RTX 2080 Ti in S.L.I |
Storage | Western digital Sata 6.0 SDD 500gb + fanxiang S660 4TB PCIe 4.0 NVMe M.2 |
Display(s) | HP X24i |
Case | Corsair 7000D Airflow |
Power Supply | EVGA G+1600watts |
Mouse | Corsair Scimitar |
Keyboard | Cosair K55 Pro RGB |
Correction: 350W for high-end/prosumer CPUs (Threadripper) & 450 to 600W GPUs (the RTX 4090 ASUS Strix BIOS was 600W at one point).

The "normal" TDP has increased a LOT in 2 decades. And gains from overclocking have gone down, also a LOT. As for outside factors, the cost of power has increased in most countries since then.
2 decades ago was 2004.
Cream-of-the-crop GPUs were the 6800 Ultra with its horrible 105W and the X800 XT at 85-90W.
Best CPUs were Athlon 64s with a 90W TDP. On the Intel side, the much-maligned Prescott P4s and the older Gallatin P4EEs at 115W.
Potential overclocking gains, especially for 24/7 usage, were significant.
In most cases, TDP was not the limiting factor for performance and often enough parts did not consume up to TDP.
Compare this with today where high-end GPU consumes 300W and more. High-end CPU consumes over 200W.
Gains from overclocking are still kind of there, but for 24/7 usage the power cost of any performance increase is very bad.
Oh, and TDP is generally the limiting factor now as well.
We have gone from 250W or so for a high-end PC in 2004 to 600W or so in 2024. Or potentially much more, if you have a 4090 and something like a "TDP is just a suggestion" Intel CPU.
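The totals in this post line up roughly like this; the GPU/CPU figures are the TDPs quoted above, while the "rest" allowances (board, drives, fans, losses) are assumptions chosen to illustrate the ~250W and ~600W totals:

```python
# 2004 vs 2024 high-end system power, using the TDPs quoted in the
# post above; "rest" is an assumed rest-of-system allowance.
highend_2004 = {"GPU (6800 Ultra)": 105, "CPU (P4 EE)": 115, "rest": 30}
highend_2024 = {"GPU": 300, "CPU": 200, "rest": 100}

total_2004 = sum(highend_2004.values())
total_2024 = sum(highend_2024.values())
print(total_2004, total_2024, round(total_2024 / total_2004, 1))
```

With those assumed allowances, the quoted figures put the 2024 high-end build at roughly 2.4x the 2004 one in total draw.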