Emwun Garand
You're giving it a Thumbs up/Pro for being 8nm? As opposed to what?
Pascal was the last great generation from Nvidia: the 1060 offered unbelievable value for 1080p gaming, the 1070 gave you 980 Ti performance at 50% less power, the 1080 was great for 1440p, and how can you forget the 1080 Ti, a card so good it blew the industry away, armed with 11 GB of GDDR5X.
Ever since the RTX 2000 series, Nvidia has made no real improvements in performance or power efficiency, just focusing on ray tracing and DLSS. RTX 3000 is even worse: abysmal power efficiency (up to 500 W on the RTX 3090!), lackluster VRAM (aside from the 3060 and 3090), overheating GDDR6X memory, no stock, and pointless SKUs like this 3080 Ti... WTF is going on at Nvidia?!
Just when high-resolution, high-refresh-rate gaming started to become a reality for everyone, Nvidia went full L with the RTX 2000 series. No one wants ray tracing; we want 4K 144 fps gaming. Look how the RTX 3060 promises RTX 2060 Super performance at 170 W, when the 2060 Super gave you GTX 1080 performance at 190 W, and the GTX 1080 was a 180 W GPU! NO REAL POWER EFFICIENCY IMPROVEMENTS SINCE 2016, AND THEY CHARGE YOU MORE!
System Name | ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017) |
---|---|
Processor | ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K |
Motherboard | ❶ X570-F ❷ Z390-E ❸ Z270-E |
Cooling | ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62 |
Memory | ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16 |
Video Card(s) | ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI |
Storage | ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD |
Display(s) | ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS |
Case | ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C |
Audio Device(s) | ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432 |
Power Supply | ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2 |
Mouse | ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502 |
Keyboard | ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610 |
Software | ❶ Win 11 ❷ 10 ❸ 10 |
Benchmark Scores | I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail |
Processor | Ryzen 5700x |
---|---|
Motherboard | Gigabyte X570S Aero G R1.1 BiosF5g |
Cooling | Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm |
Memory | Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A) |
Video Card(s) | AMD RX 6800 - Asus Tuf |
Storage | Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX |
Display(s) | LG 27UL550-W (27" 4k) |
Case | Be Quiet Pure Base 600 (no window) |
Audio Device(s) | Realtek ALC1220-VB |
Power Supply | SuperFlower Leadex V Gold Pro 850W ATX Ver2.52 |
Mouse | Mionix Naos Pro |
Keyboard | Corsair Strafe with browns |
Software | W10 22H2 Pro x64 |
System Name | H7 Flow 2024 |
---|---|
Processor | AMD 5800X3D |
Motherboard | Asus X570 Tough Gaming |
Cooling | Custom liquid |
Memory | 32 GB DDR4 |
Video Card(s) | Intel ARC A750 |
Storage | Crucial P5 Plus 2TB. |
Display(s) | AOC 24" FreeSync 1 ms 75 Hz |
Mouse | Lenovo |
Keyboard | Eweadn Mechanical |
Software | W11 Pro 64 bit |
Wow, so many stupid assumptions. What's the point? Nvidia could package up toenail shavings for a couple thousand and countless idiots would buy them, jacking up prices for everyone else (graphics card prices, to be clear, not toenail prices).
I liked PC gaming before the middle-class kids and casuals got interested in it about 5 or 6 years ago. Now they all want the best graphics cards, so they save up a whole month's worth of their McDonald's counter salary to buy one. These people don't have bills or kids.
Processor | i7-13900K |
---|---|
Motherboard | ROG Maximus Z690 |
Video Card(s) | RTX 3090 |
> Why is everybody so concerned with NVIDIA's MSRP? It's just a meaningless number. They probably didn't want to lower the x80 Ti MSRP compared to the 2080 Ti, so they picked $1,200, to not look bad when they announce the 4080 Ti.
You will not be able to buy the 3080 Ti at that price, probably ever. As much as that sucks for all of us, that's what will happen. Look at what the card offers compared to what's available, and at what price, and make a decision based on that. I tried to go through some options and examples in my conclusion.
Processor | i7 7700k |
---|---|
Motherboard | MSI Z270 SLI Plus |
Cooling | CM Hyper 212 EVO |
Memory | 2 x 8 GB Corsair Vengeance |
Video Card(s) | Temporary MSI RTX 4070 Super |
Storage | Samsung 850 EVO 250 GB and WD Black 4TB |
Display(s) | Temporary Viewsonic 4K 60 Hz |
Case | Corsair Obsidian 750D Airflow Edition |
Audio Device(s) | Onboard |
Power Supply | EVGA SuperNova 850 W Gold |
Mouse | Logitech G502 |
Keyboard | Logitech G105 |
Software | Windows 10 |
System Name | "Icy Resurrection" |
---|---|
Processor | 13th Gen Intel Core i9-13900KS Special Edition |
Motherboard | ASUS ROG Maximus Z790 Apex Encore |
Cooling | Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM |
Memory | 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V |
Video Card(s) | ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition |
Storage | 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD |
Display(s) | 55-inch LG G3 OLED |
Case | Pichau Mancer CV500 White Edition |
Audio Device(s) | Apple USB-C + Sony MDR-V7 headphones |
Power Supply | EVGA 1300 G2 1.3kW 80+ Gold |
Mouse | Microsoft Classic Intellimouse |
Keyboard | IBM Model M type 1391405 (Spanish layout) |
Software | Windows 11 IoT Enterprise LTSC 24H2 |
Benchmark Scores | I pulled a Qiqi~ |
> You're giving it a Thumbs up/Pro for being 8nm? As opposed to what?
System Name | Skunkworks 3.0 |
---|---|
Processor | 5800x3d |
Motherboard | x570 unify |
Cooling | Noctua NH-U12A |
Memory | 32GB 3600 mhz |
Video Card(s) | asrock 6800xt challenger D |
Storage | Sabrent Rocket 4.0 2TB, MX500 2TB |
Display(s) | Asus 1440p144 27" |
Case | Old arse cooler master 932 |
Power Supply | Corsair 1200w platinum |
Mouse | *squeak* |
Keyboard | Some old office thing |
Software | Manjaro |
System Name | Rainbow Sparkles (Power efficient, <350W gaming load) |
---|---|
Processor | Ryzen R7 5800x3D (Undervolted, 4.45GHz all core) |
Motherboard | Asus x570-F (BIOS Modded) |
Cooling | Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate |
Memory | 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V) |
Video Card(s) | Galax RTX 3090 SG 24GB: underclocked to 1700 MHz @ 0.750 V (375 W down to 250 W) |
Storage | 2TB WD SN850 NVMe + 1TB Samsung 970 Pro NVMe + 1TB Intel 6000P NVMe USB 3.2 |
Display(s) | Philips 32" 32M1N5800A (4K144), LG 32" (4K60), Gigabyte G32QC (2K165), Philips 328m6fjrmb (2K144) |
Case | Fractal Design R6 |
Audio Device(s) | Logitech G560 | Corsair Void pro RGB |Blue Yeti mic |
Power Supply | Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY) |
Mouse | Logitech G Pro wireless + Steelseries Prisma XL |
Keyboard | Razer Huntsman TE ( Sexy white keycaps) |
VR HMD | Oculus Rift S + Quest 2 |
Software | Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware! |
Benchmark Scores | Nyooom. |
*Begins crying* According to the HWiNFO developer, GDDR6X modules are rated to throttle at around 110 °C. They're toasty and they consume a lot of power; any 3090 owner will attest to that.
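The GDDR6X junction sensor isn't exposed through the standard NVML API on consumer cards (which is why tools like HWiNFO have to read it their own way), but you can at least watch for thermal throttling kicking in. A minimal watcher sketch, assuming the pynvml package and an NVIDIA driver; the 1-second poll interval is arbitrary:

```python
# Minimal throttle watcher. Assumes pynvml; the GDDR6X junction
# temperature itself is NOT exposed via standard NVML on consumer cards,
# so this watches core temperature plus the thermal throttle-reason bits.
import time
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetTemperature, nvmlDeviceGetCurrentClocksThrottleReasons,
    NVML_TEMPERATURE_GPU,
    nvmlClocksThrottleReasonSwThermalSlowdown,
    nvmlClocksThrottleReasonHwThermalSlowdown,
)

THERMAL_BITS = (nvmlClocksThrottleReasonSwThermalSlowdown
                | nvmlClocksThrottleReasonHwThermalSlowdown)

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)
    while True:
        core = nvmlDeviceGetTemperature(gpu, NVML_TEMPERATURE_GPU)
        reasons = nvmlDeviceGetCurrentClocksThrottleReasons(gpu)
        state = "THERMAL THROTTLE" if reasons & THERMAL_BITS else "ok"
        print(f"core {core} C  {state}")
        time.sleep(1.0)
finally:
    nvmlShutdown()
```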
Processor | AMD Ryzen 9 5950x, Ryzen 9 5980HX |
---|---|
Motherboard | MSI X570 Tomahawk |
Cooling | Be Quiet Dark Rock Pro 4(With Noctua Fans) |
Memory | 32Gb Crucial 3600 Ballistix |
Video Card(s) | Gigabyte RTX 3080, Asus 6800M |
Storage | Adata SX8200 1TB NVME/WD Black 1TB NVME |
Display(s) | Dell 27 Inch 165Hz |
Case | Phanteks P500A |
Audio Device(s) | IFI Zen Dac/JDS Labs Atom+/SMSL Amp+Rivers Audio |
Power Supply | Corsair RM850x |
Mouse | Logitech G502 SE Hero |
Keyboard | Corsair K70 RGB Mk.2 |
VR HMD | Samsung Odyssey Plus |
Software | Windows 10 |
Personal opinion, but I believe the Samsung 8 nm process isn't the reason these GPUs are so "power inefficient" (as in, hungry). This is most noticeable on the RTX 3070; I'd go as far as saying it's quite lean on power for the awesome amount of performance it provides. I'm quite eager to see w1zz's review and the impact of GDDR6X on the 3070 Ti's power consumption and frame-time stability, given that, for all we know from the rumor mills, it is coming with a similar power limit to the vanilla 3070. Being on this node is also positive for yield, as it doesn't have to compete with the numerous other products and orders that require TSMC 7 nm capacity, like AMD's entire product stack. "nm" is just marketing anyway; the actual transistor pitch isn't that small.
The biggest issue to me, so far, is the GDDR6X: it consumes an absolutely insane amount of power. This was measured in the middle of a 3DMark Time Spy Extreme run. Look at this: even at 61% memory controller load, the MVDDC (memory subsystem) is pulling 115 W(!) of the 375 W budget my card has... and there are games and workloads that demand even more.
[attachment 202619: HWiNFO screenshot of the MVDDC power reading]
I must say, AMD's Infinity Cache solution to the bandwidth problem is simply ingenious, and downright elegant compared with hot and hungry PAM4 memory.
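To put those numbers in perspective: NVML only reports total board power, not the MVDDC rail, so in this rough sketch the memory figure is simply the HWiNFO reading quoted above entered as a constant (pynvml assumed):

```python
# Rough share-of-budget arithmetic for the figures quoted above.
# MVDDC_W comes from HWiNFO; NVML can only give us total board power.
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetPowerUsage)

MVDDC_W = 115    # memory-subsystem draw per the HWiNFO screenshot
BUDGET_W = 375   # this particular card's power budget

print(f"memory share of power budget: {MVDDC_W / BUDGET_W:.0%}")  # ~31%

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)
    total_w = nvmlDeviceGetPowerUsage(gpu) / 1000  # API reports milliwatts
    print(f"total board power right now: {total_w:.0f} W")
finally:
    nvmlShutdown()
```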
Processor | Ryzen 7800X3D |
---|---|
Motherboard | ASRock X670E Taichi |
Cooling | Noctua NH-D15 Chromax |
Memory | 32GB DDR5 6000 CL30 |
Video Card(s) | MSI RTX 4090 Trio |
Storage | Too much |
Display(s) | Acer Predator XB3 27" 240 Hz |
Case | Thermaltake Core X9 |
Audio Device(s) | Topping DX5, DCA Aeon II |
Power Supply | Seasonic Prime Titanium 850w |
Mouse | G305 |
Keyboard | Wooting HE60 |
VR HMD | Valve Index |
Software | Win 10 |
I love the peanut gallery throwing out anecdotes about how PC gaming is totally dying and everyone is going to go buy consoles. Yeah, I can't get a GPU right now, so I'm gonna drop high-end GPU money on a console that can't do 4K60/1440p144 at ALL and can barely do 4K30/1440p60 (1080p60 for the PS5, since it can't even do 1440p, LMFAO), with a totally closed environment, no competition, and joystick-only controls.
Whatever you're smoking to come up with that argument, you can keep it, 'cause it's garbage.
Also, daily reminder that the 8800 Ultra launched for the equivalent of $1,100 back in 2006. Prices go up, prices go down. LOLCALMDOWN
Processor | Ryzen 5700X |
---|---|
Motherboard | Gigabyte AX-370 Gaming 5, BIOS F51h |
Cooling | MSI Core Frozr L |
Memory | 32GB 3200MHz CL16 |
Video Card(s) | MSI GTX 1080 Ti Trio |
Storage | Crucial MX300 525GB + Samsung 970 Evo 1TB + 3TB 7.2k + 4TB 5.4k |
Display(s) | LG 34UC99 3440x1440 75Hz + LG 24MP88HM |
Case | Phanteks Enthoo Evolv ATX TG Galaxy Silver |
Audio Device(s) | Edifier XM6PF 2.1 |
Power Supply | EVGA Supernova 750 G3 |
Mouse | Steelseries Rival 3 |
Keyboard | Razer Blackwidow Lite Stormtrooper Edition |
> Pascal was the last great generation from Nvidia: the 1060 offered unbelievable value for 1080p gaming, the 1070 gave you 980 Ti performance at 50% less power, the 1080 was great for 1440p, and the 1080 Ti blew the industry away, armed with 11 GB of GDDR5X. [...]

I can still remember clearly when consumers and the media were amazed that the GTX 1080 needed only a single 8-pin to deliver flagship performance. Even the GTX 1080 Ti with 8+6-pin was considered power hungry at the time. I thought we were heading in a good direction with the 20, 30, and 40 series and beyond in terms of power efficiency; apparently not.
It's not meaningless in the long run... just look at the direction MSRPs are headed: GTX 680 = $499 / 780 Ti = $699 / 980 Ti = $649 / 1080 Ti = $699 / 2080 Ti = $999 / 3080 Ti = $1,199. That's 2.4x the price in 9 years, against roughly 18% inflation over the same period. Elevated MSRPs are here to stay even after the mining craze ends.
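For anyone checking the arithmetic, here is that progression worked through in a few lines; the ~18% cumulative inflation figure is taken from the post as given, not recomputed:

```python
# MSRP progression from the post above (x80 Ti-class flagships).
msrps = {"GTX 680": 499, "780 Ti": 699, "980 Ti": 649,
         "1080 Ti": 699, "2080 Ti": 999, "3080 Ti": 1199}

nominal = msrps["3080 Ti"] / msrps["GTX 680"]  # 2.40x over 9 years
inflation = 1.18                               # ~18% cumulative, per the post
real = nominal / inflation                     # ~2.04x in constant dollars

print(f"nominal: {nominal:.2f}x  (+{nominal - 1:.0%})")
print(f"inflation-adjusted: {real:.2f}x")
```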
System Name | Primary|Secondary|Poweredge r410|Dell XPS|SteamDeck |
---|---|
Processor | i7 11700k|i7 9700k|2 x E5620 |i5 5500U|Zen 2 4c/8t |
Memory | 32GB DDR4|16GB DDR4|16GB DDR4|32GB ECC DDR3|8GB DDR4|16GB LPDDR5 |
Video Card(s) | RX 7800xt|RX 6700xt |On-Board|On-Board|8 RDNA 2 CUs |
Storage | 2TB m.2|512GB SSD+1TB SSD|2x256GB SSD + 2x2TB|256GB sata|512GB nvme |
Display(s) | 50" 4k TV | Dell 27" |22" |3.3"|7" |
VR HMD | Samsung Odyssey+ | Oculus Quest 2 |
Software | Windows 11 Pro|Windows 10 Pro|Windows 10 Home| Server 2012 r2|Windows 10 Pro |
So he is right on time. Cards go on sale tomorrow.
System Name | The de-ploughminator Mk-III |
---|---|
Processor | 9800X3D |
Motherboard | Gigabyte X870E Aorus Master |
Cooling | DeepCool AK620 |
Memory | 2x32GB G.SKill 6400MT Cas32 |
Video Card(s) | Asus RTX4090 TUF |
Storage | 4TB Samsung 990 Pro |
Display(s) | 48" LG OLED C4 |
Case | Corsair 5000D Air |
Audio Device(s) | KEF LSX II LT speakers + KEF KC62 Subwoofer |
Power Supply | Corsair HX850 |
Mouse | Razer DeathAdder V3 |
Keyboard | Razer Huntsman V3 Pro TKL |
Software | win11 |
> I can still remember clearly when consumers and the media were amazed that the GTX 1080 needed only a single 8-pin to deliver flagship performance. Even the GTX 1080 Ti with 8+6-pin was considered power hungry at the time. I thought we were heading in a good direction with the 20, 30, and 40 series and beyond in terms of power efficiency; apparently not.
The 1060 was a phenomenal card; Nvidia will not be able to beat it with MSRPs climbing the way they are. The xx60 tier will reach today's xx80 pricing in the near future, as @RedelZaVedno said above.
Processor | AMD Ryzen 5 5600@80W |
---|---|
Motherboard | MSI B550 Tomahawk |
Cooling | ZALMAN CNPS9X OPTIMA |
Memory | 2*8GB PATRIOT PVS416G400C9K@3733MT_C16 |
Video Card(s) | Sapphire Radeon RX 6750 XT Pulse 12GB |
Storage | Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB |
Display(s) | AOC 27G2U/BK IPS 144Hz |
Case | SHARKOON M25-W 7.1 BLACK |
Audio Device(s) | Realtek 7.1 onboard |
Power Supply | Seasonic Core GC 500W |
Mouse | Sharkoon SHARK Force Black |
Keyboard | Trust GXT280 |
Software | Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux |
Processor | Ryzen 7800X3D |
---|---|
Motherboard | ASRock X670E Taichi |
Cooling | Noctua NH-D15 Chromax |
Memory | 32GB DDR5 6000 CL30 |
Video Card(s) | MSI RTX 4090 Trio |
Storage | Too much |
Display(s) | Acer Predator XB3 27" 240 Hz |
Case | Thermaltake Core X9 |
Audio Device(s) | Topping DX5, DCA Aeon II |
Power Supply | Seasonic Prime Titanium 850w |
Mouse | G305 |
Keyboard | Wooting HE60 |
VR HMD | Valve Index |
Software | Win 10 |
Who cares about maximum power consumption when you can tweak the power limits to your liking? Reducing power consumption increases efficiency, as mobile GPUs demonstrate.
In fact, you should be thankful that Nvidia/AMD keep raising the maximum power limits on their desktop GPUs, because it forces them to design better VRMs to accommodate those limits, and a better VRM means higher VRM efficiency. Say you previously had a 6-phase VRM with 20 W of power loss at 150 W TGP; now you get a 10+ phase VRM with only 10 W of power loss at the same 150 W TGP (see the sketch below).
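A minimal sketch of that kind of power-limit tweak through NVML's power-management calls; pynvml and admin/root rights are assumed, and the 80% target is just an illustrative number, not a recommendation:

```python
# Lower the board power limit to ~80% of the factory default.
# Assumes pynvml; nvmlDeviceSetPowerManagementLimit needs admin/root.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetPowerManagementDefaultLimit,
    nvmlDeviceGetPowerManagementLimitConstraints,
    nvmlDeviceSetPowerManagementLimit,
)

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)
    default_mw = nvmlDeviceGetPowerManagementDefaultLimit(gpu)   # milliwatts
    min_mw, max_mw = nvmlDeviceGetPowerManagementLimitConstraints(gpu)
    target_mw = max(min_mw, int(default_mw * 0.80))  # stay above the floor
    nvmlDeviceSetPowerManagementLimit(gpu, target_mw)
    print(f"power limit: {default_mw / 1000:.0f} W -> {target_mw / 1000:.0f} W")
finally:
    nvmlShutdown()
```

Taking the quoted VRM figures at face value, 20 W of loss on 150 W delivered works out to roughly 88% VRM efficiency, while 10 W of loss at the same TGP is roughly 94%.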
The 1660 Super was a super fine GPU at $230.
Processor | AMD Ryzen 9 5900X |
---|---|
Motherboard | ASUS ROG STRIX B550-F GAMING (WI-FI) |
Cooling | Noctua NH-D15 G2 |
Memory | 32GB G.Skill DDR4 3600Mhz CL18 |
Video Card(s) | ASUS GTX 1650 TUF |
Storage | SAMSUNG 990 PRO 2TB |
Display(s) | Dell S3220DGF |
Case | Corsair iCUE 4000X |
Audio Device(s) | ASUS Xonar D2X |
Power Supply | Corsair AX760 Platinum |
Mouse | Razer DeathAdder V2 - Wireless |
Keyboard | Corsair K70 PRO - OPX Linear Switches |
Software | Microsoft Windows 11 - Enterprise (64-bit) |
> Personal opinion, but I believe the Samsung 8 nm process isn't the reason these GPUs are so "power inefficient" (as in, hungry). [...] The biggest issue to me, so far, is the GDDR6X: it consumes an absolutely insane amount of power. [...]

I feel GDDR6X is a stopgap until a faster GDDR standard arrives, much like GDDR5X, which never had a future beyond Nvidia's Pascal. Pushing clock speeds that much higher than GDDR6 takes a lot of power. I wasn't sure GDDR6X drew that much until I compared the TGP of the RTX 3070 against the 3070 Ti, and that card only carries 8x 1 GB GDDR6X modules; with 10, 12, or 24 of these hot and power-hungry chips onboard, the power requirement climbs drastically. And I do agree that AMD's Infinity Cache is a great way to work around the power requirement while still achieving better or comparable memory bandwidth.
> Companies ship product that works. Therefore, they ship with settings they deem "safe" to make sure the product works according to spec. They can't possibly test every chip that comes in and provide a custom setting each time.

The vast majority of consumers aren't going to tweak power limits. IMO it's frankly annoying to have another program running in the background and another source of potential issues. That's not a problem customers should have to solve either. This is just like AMD users claiming Vega is power efficient once you undervolt. That's great and all, but it means squat to the vast majority of users. Companies should ship products that hit their target markets out of the box; customers should not have to fiddle with products after the fact. That's for enthusiasts, if they want to spend the extra effort.
Processor | Ryzen 7 5800X3D |
---|---|
Motherboard | Gigabyte X570 Aorus Elite |
Cooling | Thermalright Phantom Spirit 120 SE |
Memory | 2x16 GB Crucial Ballistix 3600 CL16 Rev E @ 3600 CL14 |
Video Card(s) | RTX3080 Ti FE |
Storage | SX8200 Pro 1 TB, Plextor M6Pro 256 GB, WD Blue 2TB |
Display(s) | LG 34GN850P-B |
Case | SilverStone Primera PM01 RGB |
Audio Device(s) | SoundBlaster G6 | Fidelio X2 | Sennheiser 6XX |
Power Supply | SeaSonic Focus Plus Gold 750W |
Mouse | Endgame Gear XM1R |
Keyboard | Wooting Two HE |
> And here is why the use of a 5800X is a bottleneck for top-of-the-line GPU reviews nowadays, since some games properly utilise more threads:
> [attachment 202635: benchmark screenshot]
> Surely our @W1zzard tested somewhere else in the game, but the difference between the 3080 and 6900 XT in his review is 0, compared to the 13% in the HU review.

Dude, watch any YT video with RivaTuner running on a 5950X and a 3090. An engine designed around Jaguar cores isn't going to utilize 32 threads.
Processor | Ryzen 7800X3D |
---|---|
Motherboard | ASRock X670E Taichi |
Cooling | Noctua NH-D15 Chromax |
Memory | 32GB DDR5 6000 CL30 |
Video Card(s) | MSI RTX 4090 Trio |
Storage | Too much |
Display(s) | Acer Predator XB3 27" 240 Hz |
Case | Thermaltake Core X9 |
Audio Device(s) | Topping DX5, DCA Aeon II |
Power Supply | Seasonic Prime Titanium 850w |
Mouse | G305 |
Keyboard | Wooting HE60 |
VR HMD | Valve Index |
Software | Win 10 |
> Companies ship product that works. Therefore, they ship with settings they deem "safe" to make sure the product works according to spec. They can't possibly test every chip that comes in and provide a custom setting each time.

In my opinion, it's the savvy people who will figure out something is not right and try to fix it, i.e. fiddle with the power limits and so on. People who are not savvy will probably live with it: it runs hot, but it works.
System Name | Meh |
---|---|
Processor | 7800X3D |
Motherboard | MSI X670E Tomahawk |
Cooling | Thermalright Phantom Spirit |
Memory | 32GB G.Skill @ 6000/CL30 |
Video Card(s) | Gainward RTX 4090 Phantom / Undervolt + OC |
Storage | Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server |
Display(s) | 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR |
Case | Fractal Design North XL |
Audio Device(s) | FiiO DAC |
Power Supply | Corsair RM1000x / Native 12VHPWR |
Mouse | Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro |
Keyboard | Corsair K60 Pro / MX Low Profile Speed |
Software | Windows 10 Pro x64 |