
Most Expensive RTX 4080 Custom Just $50 Shy of the RTX 4090 MSRP: MicroCenter Pricing Leak

Joined
Dec 25, 2020
Messages
5,744 (4.32/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Galax Stealth STL-03
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Spend an insane amount on this overpromised and underdelivered hardware, and when Jensen releases another architecture in 2 years' time, you get the latest features gated away from you based on some "claim" that they require updated hardware... just like DLSS 3 being blocked for us RTX 3090 owners 'cause supposedly our "BFGPU" is now some sort of ancient relic, an unworthy toaster. Really, what are we, poor?

Solution: buy a Radeon. I know I will.
 
Joined
Apr 6, 2021
Messages
1,131 (0.92/day)
Location
Bavaria ⌬ Germany
System Name ✨ Lenovo M700 [Tiny]
Cooling ⚠️ 78,08% N² ⌬ 20,95% O² ⌬ 0,93% Ar ⌬ 0,04% CO²
Audio Device(s) ◐◑ AKG K702 ⌬ FiiO E10K Olympus 2
Mouse ✌️ Corsair M65 RGB Elite [Black] ⌬ Endgame Gear MPC-890 Cordura
Keyboard ⌨ Turtle Beach Impact 500
Well, I can tell you I have just returned it today. Managed to get a Gigabyte 4090, so I hope this one is coil whine free, or at least down to an acceptable level.

Awaiting your update here with the Gigabyte. :) Let's see how it goes.
 
Joined
Jan 3, 2015
Messages
2,976 (0.85/day)
System Name The beast and the little runt.
Processor Ryzen 5 5600X - Ryzen 9 5950X
Motherboard ASUS ROG STRIX B550-I GAMING - ASUS ROG Crosshair VIII Dark Hero X570
Cooling Noctua NH-L9x65 SE-AM4a - NH-D15 chromax.black with IPPC Industrial 3000 RPM 120/140 MM fans.
Memory G.SKILL TRIDENT Z ROYAL GOLD/SILVER 32 GB (2 x 16 GB and 4 x 8 GB) 3600 MHz CL14-15-15-35 1.45 volts
Video Card(s) GIGABYTE RTX 4060 OC LOW PROFILE - GIGABYTE RTX 4090 GAMING OC
Storage Samsung 980 PRO 1 TB + 2 TB - Samsung 870 EVO 4 TB - 2 x WD RED PRO 16 TB + WD ULTRASTAR 22 TB
Display(s) Asus 27" TUF VG27AQL1A and a Dell 24" for dual setup
Case Phanteks Enthoo 719/LUXE 2 BLACK
Audio Device(s) Onboard on both boards
Power Supply Phanteks Revolt X 1200W
Mouse Logitech G903 Lightspeed Wireless Gaming Mouse
Keyboard Logitech G910 Orion Spectrum
Software WINDOWS 10 PRO 64 BITS on both systems
Benchmark Scores See more about my 2 in 1 system here: kortlink.dk/2ca4x
You bought a 4090 to play PUBG?? Really? o_O
How about me, then? I got a 4090 just to play Pac-Man at 480p...

Awaiting your update here with the Gigabyte. :) Let's see how it goes.
The Gigabyte card does have coil whine and you can hear it, but it is not as loud as my ASUS card, so it's better so far when it comes to coil whine. However, the Gigabyte card has some downsides compared to the ASUS card: the fans are noisier, and even with the silent BIOS they spin faster. The cooler cover is plastic, whereas the ASUS card's is metal and generally felt more high-end in quality. If not for the coil whine, I would rather have kept the ASUS TUF card.
 
Joined
Dec 25, 2020
Messages
5,744 (4.32/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Galax Stealth STL-03
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Coil whine can even correct itself after a while. I think you made a bad deal returning the TUF card, man.
 
Joined
Jul 9, 2015
Messages
3,413 (1.03/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
but then I wouldn't benefit from the monitor's sync features. And because my budget for upgrades is around ~400€, the way I see it, I'll either spend 400€ on an Nvidia card or 400€ on an AMD card + monitor, and in this scenario Nvidia "wins"...
And how many times are you going to pay for that bad decision from the past?

People will still buy them because the average consumer is uninformed and will at most see some sponsored video gushing over them.
Judging by 3050 Ti vs. 6600 sales, that's indeed how things will work.
 

ARF

Joined
Jan 28, 2020
Messages
4,548 (2.74/day)
Location
Ex-usa | slava the trolls
Yeah, people are fanatics; they buy what they believe in. Stupid, evil world.

These aren't even in the same performance segment.

 
Joined
Sep 26, 2022
Messages
224 (0.32/day)
Location
Portugal
System Name Main
Processor 5700X
Motherboard MSI B450M Mortar
Cooling Corsair H80i v2
Memory G.SKILL Ripjaws V 32GB (2x16GB) DDR4-3600MHz CL16
Video Card(s) MSI RTX 3060 Ti VENTUS 2X OC 8GB GDDR6X
Display(s) LG 32GK850G
Case NOX HUMMER ZN
Power Supply Seasonic GX-750
And how many times are you going to pay for that bad decision from the past?
Good point! In short, 1 or 2 times.
The monitor is ~2 years old, and an "upgrade" (jumping from 1440p to 4k) would cost >400€.
I don't update hardware very often; the previous monitor and CPU died on me, so that's what triggered those purchases. My previous graphics card was a GTX 570, so I'm looking for something that can play current games at 1440p. I still have a big backlog to go through, so I'll probably only play current-gen games in about 3~4 years. If my math is correct, I shouldn't need a new GFX/monitor for at least 6 years.
 
Joined
May 2, 2017
Messages
7,762 (2.91/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I see that you have an LG 32GK850G that currently goes for around 550-600€.
You can get a decent 4K monitor starting from 200€.


Buy a 4K monitor cheap | Black Friday 2022 | Price comparison at idealo
Calling those "decent", at least for gaming, would be a stretch. Do they work? Sure. Will they be a significant downgrade from the LG? Yes, without a doubt.

Then again, gaming at 2160p is outright stupid unless you have tons of money to burn anyway. They could likely sell their LG for a decent chunk of change to finance a side-grade to a newer, better 1440p144 monitor with FreeSync/Adaptive Sync.
 

ARF

Joined
Jan 28, 2020
Messages
4,548 (2.74/day)
Location
Ex-usa | slava the trolls
Calling those "decent", at least for gaming, would be a stretch. Do they work? Sure. Will they be a significant downgrade from the LG? Yes, without a doubt.

Then again, gaming at 2160p is outright stupid unless you have tons of money to burn anyway. They could likely sell their LG for a decent chunk of change to finance a side-grade to a newer, better 1440p144 monitor with FreeSync/Adaptive Sync.

Are you trolling us?

2160p gaming is advertised on the GFX boxes :D



And you can buy a 4K 144 Hz gaming monitor for as low as, or lower than, that old 2-year-old LG :D
 
Last edited:
Joined
May 2, 2017
Messages
7,762 (2.91/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Are you trolling us?

2160p gaming is advertised on the GFX boxes :D
... yes, and does that somehow negate the fact that it is extremely resource intensive for very little perceptible gain in visual quality?

Last I checked, marketing people absolutely love selling stupid but flashy features to people. I'd go so far as to say they'll much rather focus on something stupid than something reasonable.
And you can buy a 4K 144 Hz gaming monitor for as low as, or lower than, that old 2-year-old LG :D
... yes, but - and I don't know if you actually don't know this or are just playing dumb on purpose - you generally don't get the retail price for a product back when selling it used. And they didn't buy that LG yesterday.

I mean, you can get decent 1440p144 monitors (HP has a great 27" option that's very cheap) not too far above $200. That this old LG has somehow stuck at a high price point likely just tells us that whatever stock is left over in retail channels is old, unsold stock that isn't moving.


Literally none of this changes the fact that they could sell their LG, get a decent 1440p144 FS option (though wanting 32" will drive up the price by a bit) and get an affordable AMD GPU, while for their stated budget 2160p gaming makes no sense whatsoever.

They've stated they have a €400 budget for upgrades, which makes you talking about €500-600 monitors and/or 2160p gaming rather absurd. If you're trying to give advice, maybe at least try to make sure the advice fits the relevant realities at all? If your advice is either to get a crap-tier 2160p60 monitor and a cheap GPU - a setup that makes zero sense for gaming - or to spend 150% of the available budget on a good gaming 2160p144 monitor, then, well, neither of those is a good suggestion.

On the other hand, selling the LG, getting a good 1440p144/160/170Hz FS monitor - which can be found pretty affordably - and a good AMD GPU will likely be the best choice (unless that LG monitor is really good, which IMO is rather unlikely given its age). With the second best, but easier option being keeping the current monitor and paying the extra brand tax for an Nvidia GPU.
 
Joined
Jul 9, 2015
Messages
3,413 (1.03/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
2160p gaming is advertised on the GFX boxes :D
Heck, we were in the 8K world even last gen. DF said so. You don't think they'd be shilling for some company, do you?



"Here's the Digital Foundry analysis of the RTX 3090, RTX 3080 and RTX ... The card is capable of running games at 8K resolution "
 
Joined
May 2, 2017
Messages
7,762 (2.91/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Heck, we were in the 8K world even last gen. DF said so. You don't think they'd be shilling for some company, do you?



"Here's the Digital Foundry analysis of the RTX 3090, RTX 3080 and RTX ... The card is capable of running games at 8K resolution "
Something being capable of a thing, and that thing being a good idea or efficient use of resources, are not really related concepts at all. Yes, current top end GPUs can indeed run games at 4320p at passable framerates. Does that make such gaming anything but pure idiocy? No, not at all, as you won't be able to tell the difference between that and 2160p at all at any possible combination of screen size and viewing distance (unless you sit so close that you can't see the whole screen, rendering both size and resolution useless). And while the same isn't quite true for 2160p v 1440p, the perceptible difference at normal screen sizes and viewing distances is tiny - and certainly not worth the extra processing power.


I think DF just got carried away by the "OMG cool new tech, look how crazy powerful it is" hype and entirely failed to engage critically with it. Which isn't quite the same as being a shill - it just makes you a poor-quality journalist.
 
Joined
Jul 9, 2015
Messages
3,413 (1.03/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Something being capable of a thing, and that thing being a good idea or efficient use of resources, are not really related concepts
GPUs in general being an "efficient use of resources", or that parameter making a significant difference in the GPU world, is a novel take on things... :D
Yes, current top end GPUs can indeed run games at 4320p at passable framerates. Does that make such gaming anything but pure idiocy?
The 3000 series in question absolutely can NOT run games at 8K.
What they can do is upscale using the worst-quality upscaler (use a fancier word for upscaling, if it helps), and that is pretty silly to do indeed.

As for 8K gaming: if GPUs really were capable of it, 77" and even 88" 8K OLED TVs with FreeSync support are already on the market, and I could even imagine people with sharp eyes noticing the difference.

I think DF just got carried away
I just think that when someone has looked like an NV shill on a number of occasions (the criminal energy put into shitting on FSR was impressive), she/he/xim probably is an NV shill. I don't even need to read anything by them to know they are very positive about DLSS 3, as of course they would be.
 
Joined
May 2, 2017
Messages
7,762 (2.91/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
GPUs in general being an "efficient use of resources", or that parameter making a significant difference in the GPU world, is a novel take on things... :D
Only if you for some reason think gaming isn't a worthwhile activity. I tend to think that the most valuable things human societies tend to produce are their cultural artefacts, so I definitely don't see producing GPUs as more inherently inefficient resource use than a lot of other things.
The 3000 series in question absolutely can NOT run games at 8K.
What they can do is upscale using the worst-quality upscaler (use a fancier word for upscaling, if it helps), and that is pretty silly to do indeed.
They can if you don't insist on running Ultra - but then, if you're that gung-ho for visual fidelity that you insist on rendering 33.2MP per frame, I don't find it likely that you'd be happy with stepping down to Medium settings, so that point is kind of moot. But the more general point stands: it's technically possible, but it's dumb AF, and will continue to be so for the foreseeable future.
As for 8K gaming: if GPUs really were capable of it, 77" and even 88" 8K OLED TVs with FreeSync support are already on the market, and I could even imagine people with sharp eyes noticing the difference.
They really can't. Human vision is at peak sharpness in a ~10° circle, and is reasonably sharp within about 40°, with detail perception dropping off quite a bit past that. Meaning that if your TV or monitor fills more than that part of your field of view, you're not utilizing its resolution fully. Of course having a larger display/closer to you can increase immersion through sheer sensory effect - but it also places more of the display outside of your sharp field of view, forcing more eye movement (=eye strain) and wasting a lot of rendering resources (hence why foveated rendering is such a good idea). But even at those display sizes, you'd need to sit very uncomfortably close to the displays to be able to discern an actual difference between 2160p and 4320p in even moderate motion. Like, so close that that TV would more than fill your field of view. At which point it just makes no sense.
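For anyone who wants to sanity-check that, here's a rough back-of-envelope sketch in Python (my own illustrative numbers, not from any review: a 77" 16:9 TV is roughly 170 cm wide, the couch distance is assumed at 250 cm, and ~60 pixels per degree is the usual approximation of what 20/20 vision can resolve):

    import math

    def pixels_per_degree(h_pixels, screen_width_cm, distance_cm):
        # Average horizontal pixels per degree of visual angle across the panel
        angle = 2 * math.degrees(math.atan((screen_width_cm / 2) / distance_cm))
        return h_pixels / angle

    # Assumed values: 77" 16:9 TV (~170 cm wide) viewed from ~250 cm
    for label, h_pixels in [("2160p", 3840), ("4320p", 7680)]:
        print(f"{label}: ~{pixels_per_degree(h_pixels, 170, 250):.0f} px/deg")
    # Both land well above the ~60 px/deg that 20/20 vision resolves, so the
    # extra pixels of 4320p aren't perceptible here; to drag 2160p below that
    # threshold you'd have to sit so close that the screen spans ~64 degrees,
    # well past the ~40 degrees of reasonably sharp vision mentioned above.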
I just think that when someone has looked like an NV shill on a number of occasions (the criminal energy put into shitting on FSR was impressive), she/he/xim probably is an NV shill. I don't even need to read anything by them to know they are very positive about DLSS 3, as of course they would be.
They got early access to the RTX 3000 generation, and got a lot of deserved flak for their lack of nuance and sensationalist coverage on that. While it's not a site I frequent, I haven't seen anything similar from them since - and from a quick skim of their DLSS 3 coverage, they seem reservedly positive like most tech reviewers seem to be - with the only real difference in conclusions being the tolerance for rendering errors and how much weight is put on them. Me, I tend to agree with the conclusions from the recent LTT video - I'd much rather have 60fps with some artefacts and not-quite-there input lag than 30fps without, but DLSS3 seems entirely wasted on the RTX 4090 (and likely 4080, 4070, and 4060 too!).
 

ARF

Joined
Jan 28, 2020
Messages
4,548 (2.74/day)
Location
Ex-usa | slava the trolls
... yes, and does that somehow negate the fact that it is extremely resource intensive for very little perceptible gain in visual quality?

Last I checked, marketing people absolutely love selling stupid but flashy features to people. I'd go so far as to say they'll much rather focus on something stupid than something reasonable.

... yes, but - and I don't know if you actually don't know this or are just playing dumb on purpose - you generally don't get the retail price for a product back when selling it used. And they didn't buy that LG yesterday.

I mean, you can get decent 1440p144 monitors (HP has a great 27" option that's very cheap) not too far above $200. That this old LG has somehow stuck at a high price point likely just tells us that whatever stock is left over in retail channels is old, unsold stock that isn't moving.


Literally none of this changes the fact that they could sell their LG, get a decent 1440p144 FS option (though wanting 32" will drive up the price by a bit) and get an affordable AMD GPU, while for their stated budget 2160p gaming makes no sense whatsoever.

They've stated they have a €400 budget for upgrades, which makes you talking about €500-600 monitors and/or 2160p gaming rather absurd. If you're trying to give advice, maybe at least try to make sure the advice fits the relevant realities at all? If your advice is either to get a crap-tier 2160p60 monitor and a cheap GPU - a setup that makes zero sense for gaming - or to spend 150% of the available budget on a good gaming 2160p144 monitor, then, well, neither of those is a good suggestion.

On the other hand, selling the LG, getting a good 1440p144/160/170Hz FS monitor - which can be found pretty affordably - and a good AMD GPU will likely be the best choice (unless that LG monitor is really good, which IMO is rather unlikely given its age). With the second best, but easier option being keeping the current monitor and paying the extra brand tax for an Nvidia GPU.

I think you have to learn how to appreciate 2160p144. Once you do, you will be fine and won't pull people backward to lower quality.

The RX 6800 XT 16 GB can absolutely run 4K@200 FPS if the settings are in check, and there is no reason to waste its potential on lower-grade, stupid garbage 1440p res.
 
Joined
May 2, 2017
Messages
7,762 (2.91/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I think you have to learn how to appreciate 2160p144. Once you do, you will be fine and won't pull people backward to lower quality.

The RX 6800 XT 16 GB can absolutely run 4K@200 FPS if the settings are in check, and there is no reason to waste its potential on lower-grade, stupid garbage 1440p res.
I do know and appreciate those settings - but I also know just how small the overall perceptible difference is when actually playing games, and just how much higher the computational workload is for that resolution compared to 1440p. And for anyone who isn't actively shitting money, the trade-off just isn't worth it. You can get a great 1440p144 (or 160, 170, even 240) monitor and GPU capable of really making use of that monitor for around half the price of the same experience for 2160p144. And yes, lowering settings is absolutely a good idea - playing on Ultra is equally stupid to playing at 2160p - but that also applies downwards. At medium-high, the RX 6600 is an excellently capable 1440p GPU (it also does Ultra at decent enough FPS, just not great). And, again, the difference in how the game actually looks between the two resolutions, at the same screen size and viewing distance, is pretty small overall. 2160p has much more going for it for desktop use, where the additional resolution drastically improves text sharpness and usable desktop area (if you have the eyesight for lower scaling factors).
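To put a rough number on that workload gap, here's a quick pixel-count comparison (a first-order sketch only; real frame times don't scale perfectly linearly with pixel count):

    # Raw pixel counts for the two resolutions discussed above
    resolutions = {"1440p": (2560, 1440), "2160p": (3840, 2160)}
    base = resolutions["1440p"][0] * resolutions["1440p"][1]
    for name, (w, h) in resolutions.items():
        px = w * h
        print(f"{name}: {px / 1e6:.2f} MP, {px / base:.2f}x the pixels of 1440p")
    # 2160p pushes 2.25x the pixels of 1440p, which is roughly the gap between
    # a mid-range and a high-end GPU before Ultra settings even enter the picture.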
 

ARF

Joined
Jan 28, 2020
Messages
4,548 (2.74/day)
Location
Ex-usa | slava the trolls
I do know and appreciate those settings - but I also know just how small the overall perceptible difference is when actually playing games, and just how much higher the computational workload is for that resolution compared to 1440p. And for anyone who isn't actively shitting money, the trade-off just isn't worth it. You can get a great 1440p144 (or 160, 170, even 240) monitor and GPU capable of really making use of that monitor for around half the price of the same experience for 2160p144. And yes, lowering settings is absolutely a good idea - playing on Ultra is equally stupid to playing at 2160p - but that also applies downwards. At medium-high, the RX 6600 is an excellently capable 1440p GPU (it also does Ultra at decent enough FPS, just not great). And, again, the difference in how the game actually looks between the two resolutions, at the same screen size and viewing distance, is pretty small overall. 2160p has much more going for it for desktop use, where the additional resolution drastically improves text sharpness and usable desktop area (if you have the eyesight for lower scaling factors).

A great 1440p monitor doesn't exist. There is an extremely unpleasant screen door effect because the pixel density is much lower and the pixels are quite visible.
 
Joined
May 2, 2017
Messages
7,762 (2.91/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
A great 1440p monitor doesn't exist. There is an extremely unpleasant screen door effect because the pixel density is much lower and the pixels are quite visible.
Then either you're using monitors too large for the resolution while sitting too close to them, or you have superhuman eyesight. Either way, you do you I guess? Honestly, if you've seen a screen door effect on a 1440p monitor, I would go out on a limb and say that you've never seen a reasonably new 1440p monitor. Heck, even my 2011-era Dell U2711 had no visible pixels even really close (though that was in part due to its very aggressive anti-glare coating). But no reasonable quality 27" 1440p monitor has visible pixels at any reasonable viewing distance, and even 32" is unlikely to have visible pixels unless you're really looking for it, getting too close for realistic use (i.e. where you have no chance of seeing the whole monitor at once).
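As a rough illustration of why (assumed values: a 27" 16:9 1440p panel viewed from a typical ~60 cm desktop distance, with ~1 arcminute as the usual figure for 20/20 acuity):

    import math

    def pixel_pitch_arcmin(diagonal_in, h_pixels, distance_cm):
        # Angular size of one pixel, in arcminutes, for a 16:9 panel viewed head-on
        width_in = diagonal_in * 16 / math.hypot(16, 9)  # panel width from diagonal
        pitch_cm = 2.54 * width_in / h_pixels            # pixel pitch in cm
        return math.degrees(math.atan(pitch_cm / distance_cm)) * 60

    print(f"~{pixel_pitch_arcmin(27, 2560, 60):.2f} arcmin per pixel")
    # A whole pixel subtends only ~1.3 arcmin at 60 cm, and the inter-pixel gap
    # (the actual "screen door") is a small fraction of that pitch, so it should
    # sit well below what 20/20 vision can resolve at this distance.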
 

64K

Joined
Mar 13, 2014
Messages
6,496 (1.71/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
Retailer and card manufacturer gouging, because it's a given that supplies will be short.
 
Joined
Dec 25, 2020
Messages
5,744 (4.32/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Galax Stealth STL-03
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
The 3000 series in question absolutely can NOT run games at 8K.

I've run quite a few (easier to run) games at 8K with my RTX 3090. If you consider 8K30 an acceptable target, this card will do.

The RTX 4090 is clearly not enough for an 8K60 target yet, and therein lies their master gimmick - DLSS frame generation, which they refuse to enable on Turing and Ampere, probably to try and bolster sales of the RTX 4090. This company is unscrupulous and will always lie about hardware requirements to upsell; it's an NVIDIA tradition, and they have done it numerous times in the past (RTX Voice, etc.), taking advantage of their closed-source, black-box ecosystem to keep the official media from prying too much into it. I must confess, I'm actually developing contempt for NVIDIA's boneheaded approach. People who upgrade every single generation on day one will continue to do so. They'll finance the card over 24 months if they have to, but they will.

The whole circus with the 4090 - the low availability if you're not in a privileged country, the ultra-high prices, the design issues with power delivery that are leading to fires, etc. - just flipped me back to the red side. The RX 7900 XTX will be a monster, compatible with my existing base, and I won't have to deal with NVIDIA's disrespect for their enthusiast customers; it's not the first time I've been burned by buying a high-end NVIDIA card, and AMD is giving us a better deal all-around.

I'll still wait some time for AMD's board partners to come up with their high-end designs; in the meantime, I'll just save so I don't get into any debt for it ;)
 
Joined
Feb 1, 2019
Messages
3,031 (1.50/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Easier for AIBs to meet the baseline when the baseline is inflated.

I've run quite a few (easier to run) games at 8K with my RTX 3090. If you consider 8K30 an acceptable target, this card will do.

The RTX 4090 is clearly not enough for an 8K60 target yet, and therein lies their master gimmick - DLSS frame generation, which they refuse to enable on Turing and Ampere, probably to try and bolster sales of the RTX 4090. This company is unscrupulous and will always lie about hardware requirements to upsell; it's an NVIDIA tradition, and they have done it numerous times in the past (RTX Voice, etc.), taking advantage of their closed-source, black-box ecosystem to keep the official media from prying too much into it. I must confess, I'm actually developing contempt for NVIDIA's boneheaded approach. People who upgrade every single generation on day one will continue to do so. They'll finance the card over 24 months if they have to, but they will.

The whole circus with the 4090 - the low availability if you're not in a privileged country, the ultra-high prices, the design issues with power delivery that are leading to fires, etc. - just flipped me back to the red side. The RX 7900 XTX will be a monster, compatible with my existing base, and I won't have to deal with NVIDIA's disrespect for their enthusiast customers; it's not the first time I've been burned by buying a high-end NVIDIA card, and AMD is giving us a better deal all-around.

I'll still wait some time for AMD's board partners to come up with their high-end designs; in the meantime, I'll just save so I don't get into any debt for it ;)
Deffo have an eye on AMD. £650 for my 3080 wasn't too bad, but I paid for dedicated RT cores I've got no interest in, and the VRAM is starved; I feel AMD have the better direction right now. Just need them to sell their FE equivalent in the UK.

Also, like you said, locking software features to hardware is predatory.
 
Last edited:
Joined
May 2, 2017
Messages
7,762 (2.91/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Deffo have an eye on AMD. £650 for my 3080 wasn't too bad, but I paid for dedicated RT cores I've got no interest in, and the VRAM is starved; I feel AMD have the better direction right now. Just need them to sell their FE equivalent in the UK.
Don't they have an official online store there like for the rest of Europe?

Edit: I see that you're left out like Norway and Switzerland. That's too bad!
 
Joined
Feb 1, 2019
Messages
3,031 (1.50/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Don't they have an official online store there like for the rest of Europe?

Edit: I see that you're left out like Norway and Switzerland. That's too bad!
Yeah, Brexit screwed us on that, I think; AMD probably doesn't want to eat the import costs.
 
Joined
May 2, 2017
Messages
7,762 (2.91/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Yeah, Brexit screwed us on that, I think; AMD probably doesn't want to eat the import costs.
Yeah, it's the same here in Norway. Having just moved back here from Sweden, I'm seriously considering having friends in Sweden buy GPUs for me in the future - AMD's reference cards are always really nice (after they ditched the blower fans!), and they're incredibly hard to get around here. Not that I mind my current PowerColor cards, though.
 