
$700-800 Ideal Price for GeForce RTX 4080: TechPowerUp Poll Surveying 11,000 Respondents

Joined
Jul 10, 2017
Messages
2,671 (1.00/day)
I would give them... tree-fiddy! Nothing more, nothing less! :D
Well, if anything, three-fiddy is the amount you should never give!


My personal problem is that I can't get away from Nvidia GPUs.
Apart from the fact that I do like the RT tech and would never buy an alternative with lower RT performance, the CUDA acceleration in CAD/civil engineering apps cannot be ignored.

If only AMD had something similar that would be useful to most of us, because right now we have no alternative.
AMD has neglected this space for far too long, and that hurts me deeply. The crazy things I had to do in OpenCL just to stay away from proprietary solutions... man!
 
Joined
Dec 22, 2011
Messages
3,890 (0.83/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Clearly a PERFECT opportunity for AMD to pull the rug out from under Nvidia, crush them, and steal tons of market share with their $650 RX 7900 XTX and $575 Radeon RX 7900 XT.
 
Joined
May 3, 2018
Messages
2,881 (1.21/day)
4N must be really expensive
Actually, that's part of the problem Nvidia got themselves into: paying over the odds for the bleeding-edge node, unlike AMD, which stuck with the proven and less expensive N5. Nvidia is screwed on pricing; they simply cannot compete. RDNA3 costs AMD a lot less to manufacture, and even at $899 for the 7900 XT they are making a nice healthy margin. I would not pay more than $799 for a 4080, but even if it were that cheap I would most likely still get a 7900 XT(X). Nvidia doesn't deserve a red cent of mine.
 
Joined
May 11, 2018
Messages
1,213 (0.51/day)
Being OK with a 70% price increase will inevitably lead to this:

2020, RTX 3080 - $700
2022, RTX 4080 - $1200 <- WE ARE HERE
2024, RTX 5080 - $2040
2026, RTX 6080 - $3468
2028, RTX 7080 - $5896
2030, RTX 8080 - $10022
2032, RTX 9080 - $17038
2034, GTX 1080 - $28965
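(For anyone checking the arithmetic: those figures are just the two real MSRPs with the +70% jump compounded forward. A throwaway Python sketch reproduces the table, give or take a dollar of rounding.)

Code:
# Hypothetical extrapolation only: real MSRPs for 2020/2022, then +70% per generation.
prices = {2020: 700, 2022: 1200}
for year in range(2024, 2036, 2):
    prices[year] = round(prices[year - 2] * 1.7)
for year, price in prices.items():
    print(f"{year}: ${price}")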

And remember, before commenting about higher wafer and new-process costs, that this time around an X080 card doesn't even come with a cut-down top-of-the-line processor (the RTX 4090 has AD102, 608 mm²), as used to be normal, but is built around the much smaller AD103, 378.6 mm². Compare this to the Ampere architecture cards:

GeForce RTX 3080 (GA102, 628 mm²)
GeForce RTX 3080 12GB (GA102)
GeForce RTX 3080 Ti (GA102)
GeForce RTX 3090 (GA102)
GeForce RTX 3090 Ti (GA102)

So not only are we seeing unprecedented price inflation within a single generation (+70%), Nvidia is also selling us a lower-tier die, which until now was reserved for X070 cards.
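(Using the die areas quoted above, a quick sketch of how much of its generation's flagship silicon each x80 actually gets:)

Code:
# Die areas as quoted above (mm^2): x80-class die vs the biggest die of its generation.
generations = {
    "RTX 3080 (GA102 vs GA102)": (628.0, 628.0),
    "RTX 4080 (AD103 vs AD102)": (378.6, 608.0),
}
for card, (x80_die, flagship_die) in generations.items():
    print(f"{card}: {x80_die / flagship_die:.0%} of the flagship die area")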

I can only imagine this was all planned during the crypto high, and the coke still hasn't run out at Nvidia headquarters.
 
Joined
Sep 15, 2011
Messages
6,680 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
I think 700 euros or dollars, including taxes and VAT, is a fair price for a custom 4080 card. nGreedia's own card shouldn't be more than $649 MSRP...
 

hat

Enthusiast
Joined
Nov 20, 2006
Messages
21,745 (3.32/day)
Location
Ohio
System Name Starlifter :: Dragonfly
Processor i7 2600k 4.4GHz :: i5 10400
Motherboard ASUS P8P67 Pro :: ASUS Prime H570-Plus
Cooling Cryorig M9 :: Stock
Memory 4x4GB DDR3 2133 :: 2x8GB DDR4 2400
Video Card(s) PNY GTX1070 :: Integrated UHD 630
Storage Crucial MX500 1TB, 2x1TB Seagate RAID 0 :: Mushkin Enhanced 60GB SSD, 3x4TB Seagate HDD RAID5
Display(s) Onn 165hz 1080p :: Acer 1080p
Case Antec SOHO 1030B :: Old White Full Tower
Audio Device(s) Creative X-Fi Titanium Fatal1ty Pro - Bose Companion 2 Series III :: None
Power Supply FSP Hydro GE 550w :: EVGA Supernova 550
Software Windows 10 Pro - Plex Server on Dragonfly
Benchmark Scores >9000
The 8800 Ultra, the top-end card of its day, was less than $850 at launch. The 4090 is $1,600 MSRP, and there's still room for a Ti variant above it.
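(A rough sketch for scale, assuming the 8800 Ultra's roughly $830 launch price and about 43% cumulative US inflation between 2007 and 2022; both figures are approximate:)

Code:
# Approximate comparison: 2007's top card vs 2022's flagship, in 2022 dollars.
ultra_2007_price = 830          # 8800 Ultra launch price, approx.
inflation_2007_to_2022 = 1.43   # ~43% cumulative US CPI inflation, approx.
adjusted = ultra_2007_price * inflation_2007_to_2022
print(f"8800 Ultra in 2022 dollars: ~${adjusted:.0f} vs $1600 for the RTX 4090")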
 
Joined
Feb 23, 2019
Messages
6,015 (2.89/day)
Location
Poland
Processor Ryzen 7 5800X3D
Motherboard Gigabyte X570 Aorus Elite
Cooling Thermalright Phantom Spirit 120 SE
Memory 2x16 GB Crucial Ballistix 3600 CL16 Rev E @ 3800 CL16
Video Card(s) RTX3080 Ti FE
Storage SX8200 Pro 1 TB, Plextor M6Pro 256 GB, WD Blue 2TB
Display(s) LG 34GN850P-B
Case SilverStone Primera PM01 RGB
Audio Device(s) SoundBlaster G6 | Fidelio X2 | Sennheiser 6XX
Power Supply SeaSonic Focus Plus Gold 750W
Mouse Endgame Gear XM1R
Keyboard Wooting Two HE
Joined
Sep 17, 2014
Messages
22,269 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Should I have made it clear that I was referring to the best single gaming card of the gen?
No, I shouldn't...
You've been making perfect sense. The model number within the tier/stack of products does tell us a lot about its positioning, intent, target audience, etc. It's what the whole marketing game is about; denial has no place here, even if there are some outliers to the established norm.

In the same vein, die size is an absolute factor too, but the performance relative to the generation before or after it is up for interpretation, and that is what makes or breaks a price. That's why a 314 mm² 1080 sold well, while current x80s are a harder sell with a much larger die.

That's really the space Nvidia is exploring here, trying to stretch things further to justify higher pricing. Except this is like shock therapy; this isn't just stagnation or a baby step forward, the 4080 is a massive step backwards. Something is going to give, though, and I think this time it's Nvidia, unless they are content with losing market share.

Clearly a PERFECT opportunity for AMD to pull the rug out from under Nvidia, crush them, and steal tons of market share with their $650 RX 7900 XTX and $575 Radeon RX 7900 XT.
I see what you did there :D
 
Joined
Jun 14, 2020
Messages
3,275 (2.05/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Absolutely the die was smaller! But that's entirely the beauty of Pascal. It did so much more with so little. That's a shrink The Way It's Meant to be Played. Part of that is also that Nvidia had been stuck on 28 nm for so long.

Today, a shrink enables an immediate maxing-out of the silicon, and then it is still not enough, so we need ridiculous power targets.
I agree that Pascal was a thing of beauty, but honestly, do we REALLY care how much Nvidia is charging us per mm² of die? I don't, frankly. I check the performance and the price; whether they use a huge or a small die is kind of irrelevant to me.

I also don't understand the outrage over the prices. You think it's expensive? Great, don't buy it, the prices will drop, profit.
 
Joined
Sep 17, 2014
Messages
22,269 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Actually, that's part of the problem Nvidia got themselves into: paying over the odds for the bleeding-edge node, unlike AMD, which stuck with the proven and less expensive N5. Nvidia is screwed on pricing; they simply cannot compete. RDNA3 costs AMD a lot less to manufacture, and even at $899 for the 7900 XT they are making a nice healthy margin. I would not pay more than $799 for a 4080, but even if it were that cheap I would most likely still get a 7900 XT(X). Nvidia doesn't deserve a red cent of mine.
You see it too?

Team green needs to secure the top spot on the e-peen ladder for 2024 by...
- maxing out the silicon right away on the top-end node of the moment
- raising power targets gen over gen since Turing and, despite a shrink, raising them yet again
- enabling a 450-600 W power target on the top end just to extract meaningful OC results... if we call 6% meaningful, which is a stretch
- pushing even an x80 to a price point beyond what used to be the absolute top end

One does start to wonder how they'll proceed from here. But a reduction in ridiculousness would surprise me. What roads do they have left? Surely they're not going to follow AMD's lead on technology now, are they? :laugh:

The longer you look at recent developments, the more clearly the winning and losing technologies emerge. Clearly the chiplet approach is the way forward, and Nvidia and Intel might have stuck with tried-and-tested designs for too long. Both companies keep launching product stacks that only show 'further escalation' of proven approaches, except they're constantly running in the red zone, stacking band-aid upon band-aid to keep it all afloat. DLSS 3 is for Nvidia what PL1=PL2 is for Intel. The similarities are striking, and the trade-offs for that last snippet of performance (or for not destroying latency...) are similar too.

Those ships have most definitely hit the iceberg, but apparently the captain still feels fine and the band keeps playing.

I agree that Pascal was a thing of beauty, but honestly, do we REALLY care how much Nvidia is charging us per mm² of die? I don't, frankly. I check the performance and the price; whether they use a huge or a small die is kind of irrelevant to me.

I also don't understand the outrage over the prices. You think it's expensive? Great, don't buy it, the prices will drop, profit.
Absolutely agreed; the one caveat is that outrage over prices is part of the consumer 'battle'. These things matter; we should applaud that, not criticise it altogether. What also matters is that people put their money where their mouth is (or rather, don't put the money down :)). Don't ever underestimate the power of peer pressure here. It matters, and we need it.

To me, die size versus price is an indicator of how much wiggle room a company has left to flesh out a product stack further, and it can then also tell us a lot about pricing: what's 'fair' and what's not, etc.
 
Joined
May 11, 2018
Messages
1,213 (0.51/day)
I also don't understand the outrage over the prices. You think it's expensive? Great, don't buy it, the prices will drop, profit.

Outrage? It's a discussion on a forum, not a violent protest in the street. But people are clearly "not buying it". Proof? Just look at the availability of an X080 card released two weeks ago. When has this happened before?

RTX 4080 at Geizhals.eu

 
Joined
Apr 2, 2008
Messages
414 (0.07/day)
System Name -
Processor Ryzen 9 5900X
Motherboard MSI MEG X570
Cooling Arctic Liquid Freezer II 280 (4x140 push-pull)
Memory 32GB Patriot Steel DDR4 3733 (8GBx4)
Video Card(s) MSI RTX 4080 X-trio.
Storage Sabrent Rocket-Plus-G 2TB, Crucial P1 1TB, WD 1TB sata.
Display(s) LG Ultragear 34G750 nano-IPS 34" utrawide
Case Define R6
Audio Device(s) Xfi PCIe
Power Supply Fractal Design ION Gold 750W
Mouse Razer DeathAdder V2 Mini.
Keyboard Logitech K120
VR HMD Er no, pointless.
Software Windows 10 22H2
Benchmark Scores Timespy - 24522 | Crystalmark - 7100/6900 Seq. & 84/266 QD1 |
You guys are in the EU; just buy it in any other country. Some stores even do free shipping (not sure about the eastern parts, but it should be the same). Not that you can find it much cheaper, from what I see.
This is pointless for a number of reasons, but the obvious ones are import and VAT charges and no international warranty. Those two issues alone render the suggestion useless.
 
Joined
Jun 14, 2020
Messages
3,275 (2.05/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Outrage? It's a discussion on a forum, not a violent protest in the street. But people are clearly "not buying it". Proof? Just look at the availability of an X080 card released two weeks ago. When has this happened before?

RTX 4080 at Geizhals.eu

I'm not specifically talking about this forum; multiple posters on multiple forums are going bonkers. People are not buying the 4080? Great, prices will drop.
 
Joined
Apr 2, 2008
Messages
414 (0.07/day)
System Name -
Processor Ryzen 9 5900X
Motherboard MSI MEG X570
Cooling Arctic Liquid Freezer II 280 (4x140 push-pull)
Memory 32GB Patriot Steel DDR4 3733 (8GBx4)
Video Card(s) MSI RTX 4080 X-trio.
Storage Sabrent Rocket-Plus-G 2TB, Crucial P1 1TB, WD 1TB sata.
Display(s) LG Ultragear 34G750 nano-IPS 34" utrawide
Case Define R6
Audio Device(s) Xfi PCIe
Power Supply Fractal Design ION Gold 750W
Mouse Razer DeathAdder V2 Mini.
Keyboard Logitech K120
VR HMD Er no, pointless.
Software Windows 10 22H2
Benchmark Scores Timespy - 24522 | Crystalmark - 7100/6900 Seq. & 84/266 QD1 |
Nvidia got rid of SLI on mid-range cards for a reason...
Yeah, that was because it wasn't a viable solution. Off the top of my head: micro-stutter, power usage that didn't match the performance, and, what I suspect was the main reason, Nvidia dumping the responsibility for making games compatible onto game devs. Said devs clearly saw through that BS, which resulted in fewer and fewer games supporting the standard. Nvidia being a for-profit corp, it then became about profit and where best to extract it from, which is why you only see NVLink in the professional space.
 
Joined
May 11, 2018
Messages
1,213 (0.51/day)
I'm not specifically talking about this forum; multiple posters on multiple forums are going bonkers. People are not buying the 4080? Great, prices will drop.

And people are going bonkers about people complaining about the price. Which is, in my opinion, even sillier.

Will the prices drop? When Nvidia released the RTX 2080 with a price increase and almost no performance increase over the GTX 1080 Ti, just the promise of RTX and DLSS games in the future, people weren't very enthusiastic either. But Nvidia persevered, sitting through quite a few quarters in the red in the gaming sector, and tried to improve things with the Ampere launch, which was so inviting on paper. If only the 2020 crypto bull run hadn't happened.

So I wouldn't count on a quick reaction by Nvidia. Right now their coffers are full of dirty crypto money.
 
D

Deleted member 120557

Guest
Absolutely the die was smaller! But that's entirely the beauty of Pascal. It did so much more with so little. That's a shrink The Way It's Meant to be Played. Part of that is also that Nvidia had been stuck on 28 nm for so long.

Today, a shrink enables an immediate maxing-out of the silicon, and then it is still not enough, so we need ridiculous power targets.
The beauty of Pascal was due to 16 nm being such a massive leap on its own, along with some extra transistor work to make it clock faster. The 16 nm process was probably the biggest leap from process work alone since the turn of the century.
The GTX 1080 started at $599; how much of that was the cost of the chip itself? 20%? 30%?
The RTX 4080 is a 20% larger chip at 100% higher cost per chip.

So its contribution to the bill of materials of that $1,200+ asking price is a mere 20-30%, too.
Yeah, probably around the same. I will say that R&D and the initial process investment have skyrocketed too; not enough to cover the cost difference, but it is something.
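(A back-of-the-envelope check of the quoted arithmetic, taking the 20-30% share and the "20% larger, 100% dearer per chip" figures at face value:)

Code:
# Rough check: if the GTX 1080's chip was 20-30% of its $599 MSRP, and the new
# chip costs about twice as much, what share of a $1200 card is it?
gtx1080_msrp, rtx4080_msrp = 599, 1200
for share in (0.20, 0.30):
    chip_then = gtx1080_msrp * share        # rough chip cost inside the GTX 1080
    chip_now = chip_then * 2                # "100% higher cost per chip"
    print(f"{share:.0%} assumption: ~${chip_now:.0f} chip, "
          f"{chip_now / rtx4080_msrp:.0%} of a $1200 card")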

What makes me mad, though, is when people compare AMD's 500-series prices to today's, when AMD was making next to nothing on those GPUs. They were practically giving GPUs away just to keep a presence in the market. Crypto might be the only reason AMD has been net positive on discrete GPUs since they bought ATI; the rest of the time they either lost money or broke even.

Still, I can't stand how much whining there is over high-end luxury toys. I just have to let this out somewhere once... As long as the lower-end GPUs that can play anything at 1080p are reasonably priced, the market is fine by me. We'll see how that shakes out now that crypto is finally where it belongs.
 
Joined
May 17, 2021
Messages
3,005 (2.38/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Perless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
This is pointless for a number of reasons, but the obvious ones are import and VAT charges and no international warranty. Those two issues alone render the suggestion useless.

There are no import charges inside the EU, the price only changes by the VAT difference, and the warranty is exactly the same: two years, and it can be claimed anywhere inside the EU.
 
Joined
Sep 17, 2014
Messages
22,269 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Still, I can't stand how much whining there is over high-end luxury toys. I just have to let this out somewhere once... As long as the lower-end GPUs that can play anything at 1080p are reasonably priced, the market is fine by me. We'll see how that shakes out now that crypto is finally where it belongs.
The last good 1080p GPU with strong perf/$ was the 1060; we've only regressed since. The price increases trickle down through the stack; that's why the whining happens, because it does damage the market. Another aspect is that if people can't 'move up' the stack because it's priced beyond the comfort zone of the larger part of the target audience, you get stagnation in gaming as well. Devs cater to majority groups, not minorities, so this will inevitably hurt the adoption rate of new technology in gaming, such as RT.

Fine whine, indeed ;) You should expand your view a little bit, I think. In this sense, Nvidia is clearly pushing for short-term gain with little care for long-term market conditions. Do they know something we don't about the intermediate future? One might start to think so. They're already pushing harder on markets outside GeForce.

'This is fine,' you say. It's clearly not. It's like all things in life: a world of haves and have-nots is a world in perpetual conflict. That's what you see here; when the gap between those two groups becomes 'unfair' is when things spiral out of control.
 
Joined
May 11, 2018
Messages
1,213 (0.51/day)
There are no import charges inside the EU, the price only changes by the VAT difference, and the warranty is exactly the same: two years, and it can be claimed anywhere inside the EU.

But the "common EU market" is less and less common. Large differences in VAT have forced legislation that stores that sell to other countruies MUST calculate the VAT of the recipient country, so for instance someone from Poland or Hungary (27% VAT!) can't order a product from Germany and expect a German VAT (19%).

Several large tech stores have even stopped shipping items abroad; Mindfactory, the largest one, for instance. You can order through a parcel-forwarding company like Mailboxde (taxes are still calculated based on the cardholder's country), but some stores have forbidden even that (Notebooksbilliger, the only seller of Nvidia Founders Edition cards in the EU).
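(To illustrate why cross-border ordering no longer saves much: the shop must apply the buyer's VAT rate. A sketch with a hypothetical €1,199 pre-tax price and the two rates mentioned above:)

Code:
# Hypothetical net price; the seller must apply the buyer's VAT rate, so ordering
# from a German shop no longer yields German pricing.
net_price = 1199.0
for country, vat in (("Germany", 0.19), ("Hungary", 0.27)):
    print(f"{country}: EUR {net_price * (1 + vat):,.2f} gross")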
 
Joined
Mar 10, 2010
Messages
11,878 (2.22/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
That's on Europe. Nobody forced them to have insane VAT such as Poland's 25%.
Yes, I signed up to pay extra VAT when I was born where I was; it's automagic. We have as much say in it as we do in the weather.

Meanwhile, I laugh at your pricing, Nvidia. The masses have spoken: stick it up your ass until it's price-cut to all hell.
 
Joined
Feb 20, 2019
Messages
8,169 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
I'm strongly considering either a 7900 XT or a 7900 XTX once the reviews come out. I need something better than my RX 6600 now that I have a 4K monitor. I agree that, purely based on the specs and numbers we have so far, the XT at $899 is a bad deal compared to the XTX at $999. However, given that I'm pairing it with an 11th-gen i5, I wonder if I'm better off with the XT, as I'd probably be wasting an XTX?
The problem with the 6600 at 4K is memory bandwidth and an Infinity Cache that's too small for higher resolutions. The 6600 tanks hard even when running FSR Performance (1080p internal render), because it simply lacks the bandwidth to handle the upscaled frame data. You might find that, for the games you're playing, a 6800 XT on clearance will be more than enough. You can pick them up right now for about $520 new, or they regularly sell used for $450 on eBay if you browse by "sold items". Get $175 back for your RX 6600 and that solves your 4K problem for under $300, while avoiding the early-adopter tax and flagship pricing.

By mid-2023 we should have the 4070/4070 Ti and the RX 7800 XT. Presumably those will shake up pricing a lot, in a way these flagship launches have no real hope of doing.
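(The upgrade math from the paragraph above, using the prices quoted there:)

Code:
# Net cost of stepping up to a 6800 XT after selling the RX 6600, prices as quoted above.
rx6600_resale = 175
for route, price in (("used 6800 XT", 450), ("new 6800 XT on clearance", 520)):
    print(f"{route}: ${price - rx6600_resale} net upgrade cost")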
 
Joined
Jan 11, 2013
Messages
1,237 (0.29/day)
Location
California, unfortunately.
System Name Sierra
Processor Core i5-11600K
Motherboard Asus Prime B560M-A AC
Cooling CM 212 Black RGB Edition
Memory 64GB (2x 32GB) DDR4-3600
Video Card(s) MSI GeForce RTX 3080 10GB
Storage 4TB Samsung 990 Pro with Heatsink NVMe SSD
Display(s) 2x Dell S2721QS 4K 60Hz
Case Asus Prime AP201
Power Supply Thermaltake GF1 850W
Software Windows 11 Pro
The problem with the 6600 at 4K is memory bandwidth and an Infinity Cache that's too small for higher resolutions. The 6600 tanks hard even when running FSR Performance (1080p internal render), because it simply lacks the bandwidth to handle the upscaled frame data. You might find that, for the games you're playing, a 6800 XT on clearance will be more than enough. You can pick them up right now for about $520 new, or they regularly sell used for $450 on eBay if you browse by "sold items". Get $175 back for your RX 6600 and that solves your 4K problem for under $300, while avoiding the early-adopter tax and flagship pricing.

By mid-2023 we should have the 4070/4070 Ti and the RX 7800 XT. Presumably those will shake up pricing a lot, in a way these flagship launches have no real hope of doing.
Hmm, that is probably the path I should take, but I'd just be irritated if I spent $500 on something that didn't perform as well as I hoped. Regardless, I'd be keeping my RX 6600 for my secondary PC.
 
Joined
Feb 20, 2019
Messages
8,169 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
You see it too?

Team green needs to secure the top spot on the e-peen ladder for 2024 by...
- maxing out the silicon right away on the top-end node of the moment
- raising power targets gen over gen since Turing and, despite a shrink, raising them yet again
- enabling a 450-600 W power target on the top end just to extract meaningful OC results... if we call 6% meaningful, which is a stretch
- pushing even an x80 to a price point beyond what used to be the absolute top end

One does start to wonder how they'll proceed from here.
I was having a similar discussion on the CPU front the other day, too: specifically, that desktop CPUs are not the only CPUs being made. There's no room for a 350 W CPU in a laptop, and laptops have been steadily overtaking desktops for half a decade at this point. Even on desktop, very few people actually want to spend on crazy CPUs with all the flagship platform costs of a high-end cooler, board, and DDR5.

Progress cannot be made much longer the way Nvidia are currently doing it:
  • The 3090 started the trend of tripping OCP on high-end, 1000W PSUs.
  • The partner-model 4090s don't physically fit in several enthusiast cases that were previously thought to have plenty of space for huge GPUs.
  • The 4090 apparently can't be powered safely, as we go through the second flawed iteration of an ill-conceived connector.
  • How big is the gulf going to be between a 450 W 4090 FE and a laptop 4080(M)? Laptop OEMs were really pushing their luck with 130 W variants last generation; there's only so much physical space you can dedicate to cooling in something that has to be about an inch thick, and water-cooling is, realistically, not an option.

The last good 1080p GPU with strong perf/$ was the 1060; we've only regressed since. The price increases trickle down through the stack; that's why the whining happens, because it does damage the market. Another aspect is that if people can't 'move up' the stack because it's priced beyond the comfort zone of the larger part of the target audience, you get stagnation in gaming as well. Devs cater to majority groups, not minorities, so this will inevitably hurt the adoption rate of new technology in gaming, such as RT.
The RX 6600 might change your opinion on that?

It's built for 1080p in terms of ROPs, bandwidth, and cache.
It's priced about the same as the 1060 if you adjust for inflation and the added US-China tariffs that didn't exist in 2016 when the 1060 launched.
Partner models use ~130 W, which is marginally less than the typical 140-150 W that most non-FE 1060 cards used. There's a fair bit of undervolting headroom, too, just like the 1060.
It's twice the performance of the 1060, five years later, which is roughly equivalent to two generations of ~40% uplift each.
It was the clear performance/Watt leader of its generation.

Unfortunately it was an underrated card because it was unexciting and suffered from low availability and scalping, but now that those issues are gone it's my go-to recommendation for anyone looking to upgrade from a 1060 or lower. You can pick one up used for $175, and if you look for deals you can find them new for less than the 1060's original MSRP.
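(Rough numbers behind those two claims, assuming the 1060 6GB's $249 launch MSRP and roughly 25% cumulative US inflation from 2016 to 2022; two ~40% generational steps compound to about 2x.)

Code:
# Approximate sanity check: inflation-adjusted 1060 pricing and compounded uplift.
gtx1060_msrp = 249                # GTX 1060 6GB launch MSRP
inflation_2016_to_2022 = 1.25     # ~25% cumulative US inflation, approx.
print(f"Inflation-adjusted 1060: ~${gtx1060_msrp * inflation_2016_to_2022:.0f}")
print(f"Two ~40% generational uplifts: {1.4 ** 2:.2f}x")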
 