Processor | Ryzen 7800X3D |
---|---|
Motherboard | MSI MAG Mortar B650 (wifi) |
Cooling | be quiet! Dark Rock Pro 4 |
Memory | 32GB Kingston Fury |
Video Card(s) | Gainward RTX 4070 Ti |
Storage | Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512GB |
Display(s) | LG 32" 165Hz 1440p GSYNC |
Case | Asus Prime AP201 |
Audio Device(s) | On Board |
Power Supply | be quiet! Pure Power M12 850W Gold (ATX 3.0) |
Software | W10 |
System Name | money pit.. |
---|---|
Processor | Intel 9900K 4.8GHz at 1.152V core with a -0.120 offset |
Motherboard | Asus ROG Strix Z370-F Gaming |
Cooling | Dark Rock TF air cooler.. stock VGA air coolers with case side fans to help cooling.. |
Memory | 32GB Corsair Vengeance 3200 |
Video Card(s) | Palit Gaming Pro OC 2080TI |
Storage | 150GB NVMe boot drive partition.. 1TB SanDisk SATA.. 1TB Transcend SATA.. 1TB 970 Evo NVMe M.2.. |
Display(s) | 27" Asus PG279Q ROG Swift 165Hz Nvidia G-Sync, IPS.. 2560x1440.. |
Case | Gigabyte mid-tower.. cheap and nothing special.. |
Audio Device(s) | onboard sounds with stereo amp.. |
Power Supply | EVGA 850 watt.. |
Mouse | Logitech G700s |
Keyboard | Logitech K270 |
Software | Win 10 pro.. |
Benchmark Scores | Fire Strike 29500.. Time Spy 14000.. |
> At this stupid $1,200 price they'll be selling current 4080 stock for 2-4 years

Yeah - that, or else we get a price cut 6 months from now.
> Nvidia got rid of SLI on mid-range cards for a reason.. so that they could double the price for a high-end card as opposed to buying two cheaper cards in SLI mode..

What are you smoking? SLI disappeared because it was a driver-tastic nightmare and relied too heavily on developers. Nvidia didn't drop SLI to charge more for single cards - that's the worst apologist excuse I've ever heard. Are you trolling?
trog
I voted $900, mainly due to inflation correction; I'd say between $800 and $900 is fair.
Processor | Ryzen 7800X3D |
---|---|
Motherboard | ASRock X670E Taichi |
Cooling | Noctua NH-D15 Chromax |
Memory | 32GB DDR5 6000 CL30 |
Video Card(s) | MSI RTX 4090 Trio |
Storage | Too much |
Display(s) | Acer Predator XB3 27" 240 Hz |
Case | Thermaltake Core X9 |
Audio Device(s) | Topping DX5, DCA Aeon II |
Power Supply | Seasonic Prime Titanium 850W |
Mouse | G305 |
Keyboard | Wooting HE60 |
VR HMD | Valve Index |
Software | Win 10 |
If you factor in inflation, $800 seems more like it. The GTX 780, released in 2013, was around that price (in terms of buying power) as well.
The poll poses a trick question - because it does not ask users what to them is a realistic price, but what would they pay for an RTX 4080. To that? 400 USD is a perfectly viable answer, even if NVIDIA actually loses money on selling one.
I just gave in and bought an Asus TUF 3070 V2 for $585, mainly because my youngest daughter needed a new GPU. I gave her my old 2070, which worked great for 1440p @ 144Hz gaming. I was trying to hold out for the 4070, but this card dropped below 600 bucks and I jumped on it. I will not pay the increased prices for the 4xxx cards. These prices are crazy for GPUs. The 4080 should be back to $699 now that Ethereum has moved to proof-of-stake.
We'll see how stubbornly Nvidia insists on its current pricing when the 7900 XTX/XT launches. If AMD is at least somewhat competitive, I doubt the price won't be lowered - unless Nvidia and their shareholders are fine with cards sitting on shelves.
I spent £1,100 on a 2080 Ti.. before that I had two 1070 cards in SLI mode, and before that a pair of 980 Ti cards in SLI mode.. before that it was a pair of 970 cards in SLI mode..
High end or near high end hasn't been cheap for a long time.. Nvidia got rid of SLI on mid-range cards for a reason.. so that they could double the price for a high-end card as opposed to buying two cheaper cards in SLI mode..
trog
System Name | Tower of Power / Delliverance |
---|---|
Processor | i7 14700K / i9-14900K |
Motherboard | ASUS ROG Strix Z690-A Gaming WiFi D4 / Z690 |
Cooling | CM MasterLiquid ML360 Mirror ARGB Close-Loop AIO / Air |
Memory | CORSAIR Vengeance LPX 32GB (2 x 16GB) DDR4 3600 / DDR5 2x 16GB |
Video Card(s) | ASUS TUF Gaming GeForce RTX 4070 Ti / GeForce RTX 4080 |
Storage | 4x Samsung 980 Pro 1TB M.2, 2x Crucial 1TB SSD / NVMe PC801 SK hynix 1TB |
Display(s) | Samsung 32" Odyssey G5 Gaming 144Hz 1440p, 2x LG HDR 32" 60Hz 4K / 2x LG HDR 32" 60Hz 4K |
Case | Phanteks "400A" / Dell XPS 8960 |
Audio Device(s) | Realtek ALC4080 / Sound Blaster X1 |
Power Supply | Corsair RM Series RM750 / 750w |
Mouse | Razer Deathadder V3 Hyperspeed Wireless / Glorious Gaming Model O 2 Wireless |
Keyboard | Glorious GMMK with box-white switches / Keychron K6 Pro with blue switches |
VR HMD | Quest 3 (512gb) + Rift S + HTC Vive + DK1 |
Software | Windows 11 Pro x64 / Windows 11 Pro x64 |
Benchmark Scores | Yes |
> Holy smokes, that price is a ripoff. You could have gotten a 6900 XT for $20 more or a 6800 XT for $520 (ASRock model). More raster performance in either case, and equal RT performance with the 6900 XT. Last-gen AMD cards have been dropping in price much faster than Nvidia's.

Yeah, but my decision was mainly due to the better performance in VR the 3070 has over the 6800/6900. I did look at those cards as well. Also, I forgot to add I had a 60-dollar instant rebate that made the price $525.84. It's more than I wanted to pay, but I don't think I did bad. The price is back up to 650. Clearly Nvidia's pricing isn't going to continue, because prices are coming down, albeit not fast enough. lol
This is part of the reason Nvidia can charge whatever it wants, people buy them regardless. Nvidia's pricing of the 4000 series isn't dumb from a business perspective because this will clearly continue.
Processor | 7800X3D |
---|---|
Motherboard | MSI MAG Mortar b650m wifi |
Cooling | Thermalright Peerless Assassin |
Memory | 32GB Corsair Vengeance 6000 CL30 |
Video Card(s) | ASRock RX7900XT Phantom Gaming |
Storage | Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB |
Display(s) | Gigabyte G34QWC (3440x1440) |
Case | Lian Li A3 mATX White |
Audio Device(s) | Harman Kardon AVR137 + 2.1 |
Power Supply | EVGA Supernova G2 750W |
Mouse | Steelseries Aerox 5 |
Keyboard | Lenovo Thinkpad Trackpoint II |
Software | W11 IoT Enterprise LTSC |
Benchmark Scores | Over 9000 |
> The poll poses a trick question - because it does not ask users what to them is a realistic price, but what would they pay for an RTX 4080. To that? 400 USD is a perfectly viable answer, even if NVIDIA actually loses money on selling one.

Interpretation I think, or semantics - a realistic price is one the market 'will bear'. After all, if people say it's unrealistic, people consider it a bad deal.
> This is an example of how marketing can have such a huge impact on perception. If Nvidia was able to sell past x80 models at $500-$800 and still make well over 50% margins, then the 4080 must include more silicon features to require an almost doubling of price to maintain margins.
> If this is the case, then the 4080 is something different and not directly comparable to past generations. But Nvidia doesn't change the model name (GeForce) or model numbering (4080).
> That's a somewhat cowardly move by Nvidia's marketing department, showing very little initiative to sell something new. AMD is also at fault for not changing its GPU naming with the introduction of RDNA (although they did change their CPUs to Ryzen/Epyc from Athlon/Opteron/FX).
> Let that be a lesson to you all. If you change something so significantly that it costs more, you need to change the name and market it differently. Otherwise past product comparisons will be brutal.

I don't understand what you're saying, I think. Nvidia isn't selling something new, right? It's just an x80 that has an impressive performance gap to the next card up the stack. In that sense, it's not even a great x80. Nvidia's marketing department would have balls of steel if they tried to sell this as something it's not; they have experience with that, and even just a small lie about a number on a spec sheet (4GB) was enough for an outrage.
System Name | Current |
---|---|
Processor | i7 12700k |
Motherboard | Asus Prime Z690-A |
Cooling | Noctua NHD15s |
Memory | 32GB G.Skill |
Video Card(s) | GTX 1070Ti |
Storage | WD SN-850 2TB |
Display(s) | LG Ultragear 27GL850-B |
Case | Fractal Meshify 2 Compact |
Audio Device(s) | Onboard |
Power Supply | Seasonic 1000W Titanium |
System Name | My Current Desktop |
---|---|
Processor | i9 12900KF |
Motherboard | Asus ROG STRIX Z690-E GAMING WIFI |
Cooling | ARCTIC Liquid Freezer II 360 |
Memory | G.Skill Trident Z5 RGB Series 32GB (2 x 16GB) DDR5 6400 F5-6400J3239G16GA2-TZ5RS |
Video Card(s) | RTX 3090 FE |
Storage | SAMSUNG 980 PRO SSD 1TB |
Display(s) | Samsung G80SD |
Case | Fractal Design Torrent White |
Audio Device(s) | Schiit Bifrost2 |
Power Supply | Corsair HX850 |
Mouse | Razer Basilisk v3 pro |
Keyboard | Keychron Q6 Max (brown) |
Software | Win 11 Pro |
Fun fact - if you were to apply "GPUs should cost +60% more because they are +60% faster than last gen" consistently from Nvidia's early RIVA TNT2s through to today, then today's average GPU would cost $3m. Likewise, RAM also used to cost £100/MB at one point. Using the same "Nvidia pricing logic", 2x 16GB sticks of DDR4 'should' cost £3.2m. We should also be thankful that Nvidia doesn't make storage devices, or this chart would be a flat horizontal line that never dipped below $1m per GB, at which point modern 1TB SSDs would cost $1bn each...
Back in the real world, the whole point of progress is "tech gets better at same price", not "price ends up completely divorced from reality because 'tech improved as expected'".
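The $3m figure above is just exponential compounding. A quick sketch of the arithmetic (the ~$150 TNT2-era starting price and a roughly annual product cadence since 1999 are my assumptions, not figures from the post):

```python
# If every generation were priced +60% because it is ~60% faster,
# the price compounds multiplicatively: price_n = price_0 * (1 + uplift)**n.
def compounded_price(start_price: float, uplift: float, generations: int) -> float:
    return start_price * (1 + uplift) ** generations

# ~$150 card in 1999, +60% per step, roughly one step per year through 2022:
price = compounded_price(150, 0.60, 21)
print(f"${price:,.0f}")  # ~ $2.9 million, i.e. the "$3m" ballpark
```

The exact total is sensitive to the assumed cadence, but any plausible step count lands in the millions, which is the post's point: per-generation "performance pricing" cannot compound forever.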
> What are you smoking? SLI disappeared because it was a driver-tastic nightmare and relied too heavily on developers. Nvidia didn't drop SLI to charge more for single cards - that's the worst apologist excuse I've ever heard. Are you trolling?

Is that so, and a fact? I would say SLI disappeared for many more reasons... bit of a double-edged blade imho. The driver nightmare is a fact, of course; money went into that too. For developers, the same thing applies in that sense. (Now they got RT in return.) On the technical side, we've seen a battle over latency, and high refresh is a market demand just the same as perfect frame pacing. SLI doesn't fit in here. And what about G-Sync...
> April 2017 - today
> Ryzen 5 1600 - $220
> Ryzen 5 7600 - $229
> with inflation it should be $267
> May 2016 - today
> Nvidia 1080 - $599
> Nvidia 4080 - $1,199
> with inflation it should be $743
> It certainly has nothing to do with Moore's Law or inflation. Stop using inflation as an excuse.
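The inflation adjustments quoted above are a single compound multiplication. A sketch of the arithmetic (the cumulative-inflation factors are back-derived from the quoted figures and roughly match US CPI over those spans; they are my assumptions):

```python
# Adjusting a launch price for cumulative inflation is one multiply:
# adjusted = original * (1 + cumulative_inflation).
def inflation_adjusted(original: float, cumulative_inflation: float) -> float:
    return original * (1 + cumulative_inflation)

# ~21.2% cumulative US inflation Apr 2017 -> late 2022,
# ~24% cumulative May 2016 -> late 2022 (approximate CPI factors).
print(round(inflation_adjusted(220, 0.212)))  # 267, the quoted Ryzen figure
print(round(inflation_adjusted(599, 0.24)))   # 743, the quoted GTX 1080 figure
```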
First off, I will say I think the 4080 is overpriced, but people are too kind to the 10 series. All the die sizes were tiny relative to their names. The 1080 was 314 mm². The 4080 is actually smaller than I thought at 378 mm², but the wafer cost TSMC is charging for 5nm is 2-3x what 16nm was. Moore's Law is effectively dead because TSMC is charging what the market will bear.
Plus, the 1600 to 7600 is a bad example. Since AMD uses chiplets now, the expensive CPU portion is only ~70 mm² compared to the 1600's much bigger 213 mm², though the 120 mm² I/O die brings the totals much closer. AMD is most likely still taking a margin cut relative to the 1600.
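The wafer-cost point can be made concrete with the standard first-order dies-per-wafer estimate. The wafer prices below are rough public estimates I'm assuming for illustration, not figures from the post:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """First-order estimate: wafer area / die area, minus an edge-loss
    term proportional to the wafer circumference (partial dies at the rim)."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

# GTX 1080 (GP104, 314 mm2 on 16nm) vs RTX 4080 (AD103, ~378 mm2 on 5nm).
# Assumed wafer prices: ~$6k for 16nm, ~$16k for 5nm (rough estimates).
for name, area, wafer_cost in [("GP104 (16nm)", 314, 6_000),
                               ("AD103 (5nm)", 378, 16_000)]:
    n = dies_per_wafer(area)
    print(f"{name}: ~{n} dies/wafer, ~${wafer_cost / n:.0f} per die")
```

Even ignoring yield, the slightly larger die plus a 2-3x wafer price multiplies per-die silicon cost by roughly 3x, which is the comparison the post is gesturing at, though it still doesn't account for the full retail price jump.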
The 6900 XT was out of my budget. The biggest thing now is that it's done and I'm happy with my card. I only paid 26 bucks over MSRP for a card that pretty much matches the performance of the 3070 Ti. Plus, the big picture here is that I used my wife's money, so in reality I got a free 3070. Just to be clear, you have made very valid points that I did look at beforehand.. I just could not find a 6900 XT for $525.84. Let's not keep derailing the thread over my poor GPU decision making.. I do miss the good ol' days when you were with your people and they were just excited you got a new piece of hardware.. lol Now kick me and send me a PM telling me about how E-cores are crap... hehehe j/k don't send me that pm.. I'm happy with my mediocre gaming pc..

Well, the thing is the 3070 has worse performance in VR as well:

- VR Wars: the RTX 3070 vs. the RTX 2080 Ti - FCAT VR Performance benchmarked (babeltechreviews.com)
- VR Wars: The Red Devil RX 6900 XT vs. the RTX 3090 FE - 15 Games Performance benchmarked using the Vive Pro (babeltechreviews.com)
The 6900 XT wins by a large margin.
> First off, I will say I think the 4080 is overpriced, but people are too kind to the 10 series. All the die sizes were tiny relative to their names. The 1080 was 314 mm². The 4080 is actually smaller than I thought at 378 mm², but the wafer cost TSMC is charging for 5nm is 2-3x what 16nm was. Moore's Law is effectively dead because TSMC is charging what the market will bear.
> Plus, the 1600 to 7600 is a bad example. Since AMD uses chiplets now, the expensive CPU portion is only ~70 mm² compared to the 1600's much bigger 213 mm², though the 120 mm² I/O die brings the totals much closer. AMD is most likely still taking a margin cut relative to the 1600.
Absolutely, the die was smaller! But that's entirely the beauty of Pascal. It did so much more with so little. That's a shrink The Way It's Meant to be Played. Part of that is also that Nvidia had been stuck on 28nm for so long.
Today, a shrink enables an immediate maxing-out of the silicon, and then it is still not enough, so we need retarded power targets.
> The retarded power targets are an "innovation" to maximise profits with as little physical input (materials) as possible.
No, they are not; at 608 mm², the 4090 is just about the size of the 28nm Titan, heck, it's even larger.
Processor | AMD Ryzen 7 5800X3D |
---|---|
Motherboard | ASUS B550M-Plus WiFi II |
Cooling | Noctua U12A chromax.black |
Memory | Corsair Vengeance 32GB 3600MHz |
Video Card(s) | Palit RTX 4080 GameRock OC |
Storage | Samsung 970 Evo Plus 1TB + 980 Pro 2TB |
Display(s) | Acer Nitro XV271UM3B IPS 180Hz |
Case | Asus Prime AP201 |
Audio Device(s) | Creative Gigaworks - Razer Blackshark V2 Pro |
Power Supply | Corsair SF750 |
Mouse | Razer Viper |
Keyboard | Asus ROG Falchion |
Software | Windows 11 64bit |
Answers below $500 and above $1,200 are ridiculous.
> Turing dies were larger - 754 mm² for the second-tier chip, 545 mm² for the third-tier chip, 445 mm² for the fourth-tier chip, 284 mm² for the fifth-tier chip.
> 608 mm² is a relatively small chip for Nvidia.

Right, I suppose your definition of relative is different from mine.
If so, why did I buy the mighty Radeon HD 4890, a top-of-the-line GPU at the time, for only $195 brand new back in summer 2009?