
Widespread GeForce RTX 4080 SUPER Card Shortage Reported in North America

Joined
Apr 29, 2014
Messages
4,303 (1.11/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtec ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
Was at Micro Center yesterday picking up a few items, and a guy checking out in front of me was excited that he was able to pick up the last 4080 Super they had in stock (priced at $1,000 for the model he got). Damn... after tax, that guy spent close to $1,100 on a 4080 Super. Good for him and all, but I don't know how you could be excited about dropping that much money on a GPU.

How he wants to spend his money is his prerogative, but if you ask me, a fool and his money are soon parted. The card alone cost almost as much as the rest of the items he was getting (SSD, motherboard, CPU, case, RAM, PSU). In all, he was looking at around $2,300 for everything after taxes, based on the sticker prices (it is possible some items were priced lower than the sticker cost).

To be fair, I asked a guy in the DIY department how many 4080 Supers they see come in, and he said just a couple here and there, and that within a few days the couple that come in do sell. They are selling, but by no means are they coming in by the truckload.
That is why I bet this "shortage" is artificial, designed to make people buy the cards the moment they see them in stock. A card that is constantly sold out helps sales, because people become willing to buy it the moment they see it available. I was at Micro Center in Dallas as well this week, and they told me they got a very low amount of stock.

The problem with that is most buyers end up purchasing Nvidia regardless of the fact that for 95% of games (without RT), AMD offers a better-value product in the midrange. Voting with your "wallet" requires thinking about a purchase rather than looking at a halo product and automatically concluding that everything down the stack is the best you can get. Eventually, consumers are going to price themselves out of PC gaming by accepting whatever ridiculous price Nvidia sets.

It proves time and time again that most people make pretty poor choices about their "needs" when it comes to tech.
I don't disagree with that statement, though I would not say 95% of the time, as I think it's a much closer battle than that depending on which cards you are comparing. That being said, I stand by saying the 7900 XTX at $900 is a better value than an RTX 4080 S at $1,000.
 
Joined
Dec 25, 2020
Messages
6,978 (4.80/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
DRAMExchange shows otherwise. This also lines up with other reports.


Another data point is the price of DDR5 on Digikey. Notice it says 16Gbit. Now we all know that this price is much higher than the actual price of DRAM. DDR5 is now starting at $73 for two 16 GB DIMMs. That is less than a quarter the price of the Digikey quote if they are selling it one IC at a time. The logical conclusion is that the reel, i.e. one unit, doesn't correspond to one DRAM IC.
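The arithmetic behind that conclusion can be sketched quickly (a back-of-the-envelope check using the retail figures above; the result is only an upper bound per IC, since the kit price also covers the PCB, SPD, and retail margin):

```python
# Implied per-IC price ceiling from a retail DDR5 kit:
# $73 for two 16 GB DIMMs built from 16 Gbit (2 GB) ICs.
KIT_PRICE_USD = 73.0
KIT_CAPACITY_GB = 32      # 2 x 16 GB DIMMs
IC_DENSITY_GB = 2         # one 16 Gbit IC holds 2 GB

ics_per_kit = KIT_CAPACITY_GB // IC_DENSITY_GB
ceiling_per_ic = KIT_PRICE_USD / ics_per_kit

print(ics_per_kit)               # 16 ICs across both DIMMs
print(round(ceiling_per_ic, 2))  # ~4.56 USD per IC, at most
```

If a single-unit Digikey listing comes out several times higher than that ceiling, the listing is almost certainly pricing a reel of many ICs rather than one chip.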

It goes down to $13.50 when ordering in large quantities.


This isn't the same model or even type of IC, though. Each IC has a different cost, so it's an apples-to-oranges comparison. Sure, using standard ~16-18 Gbps GDDR6 is cheaper, but Nvidia's cards use the specific bin and chip type of GDDR6X that I referred to in the Digikey argument. You can double-check with W1zzard's reviews.

As Vya pointed out, the 2080 was a 545 mm² die. It also cost $800.



Sort of like how the 2080 isn't going to be doing RT in any modern game either? Yes, you can technically do RT on either the 5700 XT or the 2080, but both are going to give you a terrible experience. The 3070 is already halfway in the same boat as well.



You are trying to make this hypothetical argument that the 2080 WOULD be a midrange card in your hypothetical situation, but it doesn't work out because: 1) it would be even more expensive in that scenario given the increased cost of 7 nm; 2) it'd still be at least an $800+ video card; 3) your hypothetical is not reality. The 2080 in reality has the price tag, name, and die size of a higher-tier card. Let's call a pig a pig.




xx90 cards do FP64 at half the rate of Titan cards (1:64 vs 1:32). Nvidia wanted people doing scientific calculations, one of the major uses of Titan cards, to spend more money, so they got rid of the Titan line. The xx90 class is a more gimped version of the Titan cards, focused specifically on gaming. People didn't get a discount; most people who legitimately needed the features of the Titan cards ended up having to spend more and buy further up the stack.



It entirely depends on whether the consoles are considered average at the point in their lifecycle when you measure. Console performance tends to go down over time as games become more demanding. Game requirements do not remain fixed, and games released later in a console generation often perform more poorly as we reach the end of a cycle.




A couple of factors you are forgetting:

1) Both consoles have additional reserved memory just for the OS

2) Consoles have less overhead

3) Both consoles have proprietary texture decompression chips that allow them to dynamically stream data off their SSDs at high speeds and low latency, drastically reducing the amount of data that has to be stored in memory / VRAM.

I've not forgotten any of this.

1. No, RX 5700 XT cannot do raytracing at all. It's just not compatible with the technology and AMD has made absolutely no effort to support a software DXR driver on this hardware. This was their choice. The RTX 2080 may lack the performance of its higher end and contemporary siblings, but it is fully compatible with DirectX 12 Ultimate. AMD has no claim to this prior to RDNA 2.

2. Regardless of die size and cost, the TU104 served as Turing's midrange offering alongside the low-end TU106 and the high-end TU102. Its larger die area is owed to the older fabrication process and the presence of extra hardware features that the competition plainly didn't support. We know Turing was expensive, but it's the one thing you're completely unwilling to accept: the reality is that midrange GPUs have been costing $800 for some time now, and that's not about to change. The way the market is going - and this includes AMD - the pricing "floor" on cards that are worth buying is consistently being raised generation after generation. There's an interesting video that's been making the rounds recently where the guy approaches exactly this problem:


3. Regarding FP64, frankly, who cares? The ratio may have changed to keep this at more or less the same level, but FP64 is increasingly unimportant across all kinds of GPU computing segments, and has been for many years. Remember 10 years ago, when the Titan X Maxwell removed the dedicated FP64 cores that the Kepler models had? It's not that they were disabled; Maxwell simply didn't support them. There was no demand for that then, and there is no demand for it now. Titan optimizations went beyond FP64: Nvidia simply enabled the optimizations from their enterprise drivers for targeted applications such as SPECviewperf and similar suites, in answer to AMD's Vega Frontier Edition semi-pro GPU. I owned one, and AMD abandoned it, feature-incomplete, well before announcing GCN 5's EOL, because they simply did not care to maintain that product or the promise they had made to its buyers.

4. Neither the PS5 nor the Xbox Series is particularly powerful in comparison to a contemporary PC. Digital Foundry placed an RTX 4080 SUPER at about 3-3.2x the performance of a PS5. The PS5 may have a few tricks up its sleeve, like the aforementioned dedicated decompression engine, but on the other hand, PCs have far more powerful processors with much faster storage and much faster memory available, so it balances out even if you disregard things like DirectStorage.


No, it's intentionally making their products obsolete sooner. Nvidia is very careful about making cards last a certain amount of time, as that's their business strategy for getting people to upgrade. VRAM is an easy way to limit a card's future prospects, as we have seen VRAM usage grow rapidly, and it's a reason Nvidia forced vendors to stop putting out special versions of cards with more VRAM. Only the very top of the product stack has reasonable VRAM amounts that would allow you to use the card effectively for longer. Hence the card had a 192-bit bus and less VRAM at launch (but now we have a newer version with a 256-bit bus and 16 GB near the end of the cycle), because they are trying to gin up a reason for the holdouts to purchase it.

I agree to a point on the optimization argument, as it has become commonplace for developers to just dump all textures into GPU memory to avoid having to work harder at pulling those assets in when needed. But that unfortunately is the reality, and we now have to contend with that fact when purchasing a GPU.

The "intentionally making their products obsolete sooner" argument flies directly in the face of Nvidia's lengthy and comprehensive software support over the years. I'd argue nothing makes a graphics card more obsolete than ceasing its driver support, and AMD has already axed pretty much everything older than the RX 5700 XT.

It's not some big scheme; they just priced the SKUs too far upmarket relative to the processor that powered them. AMD did the same, which is why you ended up with such a massive gap between the 7600 (full Navi 33) and the 7800 XT (full Navi 32). They had to shoehorn SKUs between these, and it resulted in a crippled 7700 XT that's got 12 GB of VRAM and became unpopular because of it, and a 7600 XT that's basically a 16 GB 7600 that no one sane would purchase at the prices being asked, and which was received just as poorly as the 16 GB 4060 Ti.

I don't disagree with that statement, though I would not say 95% of the time, as I think it's a much closer battle than that depending on which cards you are comparing. That being said, I stand by saying the 7900 XTX at $900 is a better value than an RTX 4080 S at $1,000.

For $100 more, you're getting:

- NVIDIA's vastly superior software ecosystem (all the bells and whistles) with a much longer support lifecycle
- Identical raster with 20% extra RT performance
- A more power-efficient GPU

That's very much up to you... personally, I wouldn't touch the XTX if I were asked to choose between it at $900 and the 4080S at $1K. The XTX needs to be priced at $799 to become a clear winner in my eyes.
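Put as numbers, the trade-off looks like this (a quick sketch using only the figures in this thread: identical raster, ~20% extra RT, $900 vs $1,000; the normalized scores are illustrative, not benchmark results):

```python
# Performance-per-dollar from the thread's own figures.
cards = {
    "7900 XTX":   {"price": 900,  "raster": 1.00, "rt": 1.00},
    "4080 SUPER": {"price": 1000, "raster": 1.00, "rt": 1.20},
}

for name, c in cards.items():
    raster_per_k = c["raster"] / c["price"] * 1000  # raster score per $1000
    rt_per_k = c["rt"] / c["price"] * 1000          # RT score per $1000
    print(f"{name}: raster/$k={raster_per_k:.2f}, rt/$k={rt_per_k:.2f}")
```

The XTX wins raster-per-dollar (~1.11 vs 1.00 per $1,000), while RT-per-dollar is nearly a wash (~1.11 vs 1.20), so the decision really does hinge on how much the software ecosystem is worth to the buyer.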
 
Joined
Nov 26, 2021
Messages
1,702 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
This isn't the same model or even type of IC, though. Each IC has a different cost, so it's an apples-to-oranges comparison. Sure, using standard ~16-18 Gbps GDDR6 is cheaper, but Nvidia's cards use the specific bin and chip type of GDDR6X that I referred to in the Digikey argument. You can double-check with W1zzard's reviews.



I've not forgotten any of this.

1. No, RX 5700 XT cannot do raytracing at all. It's just not compatible with the technology and AMD has made absolutely no effort to support a software DXR driver on this hardware. This was their choice. The RTX 2080 may lack the performance of its higher end and contemporary siblings, but it is fully compatible with DirectX 12 Ultimate. AMD has no claim to this prior to RDNA 2.

2. Regardless of die size and cost, the TU104 served as Turing's midrange offering alongside the low-end TU106 and the high-end TU102. Its larger die area is owed to the older fabrication process and the presence of extra hardware features that the competition plainly didn't support. We know Turing was expensive, but it's the one thing you're completely unwilling to accept: the reality is that midrange GPUs have been costing $800 for some time now, and that's not about to change. The way the market is going - and this includes AMD - the pricing "floor" on cards that are worth buying is consistently being raised generation after generation. There's an interesting video that's been making the rounds recently where the guy approaches exactly this problem:


3. Regarding FP64, frankly, who cares? The ratio may have changed to keep this at more or less the same level, but FP64 is increasingly unimportant across all kinds of GPU computing segments, and has been for many years. Remember 10 years ago, when the Titan X Maxwell removed the dedicated FP64 cores that the Kepler models had? It's not that they were disabled; Maxwell simply didn't support them. There was no demand for that then, and there is no demand for it now. Titan optimizations went beyond FP64: Nvidia simply enabled the optimizations from their enterprise drivers for targeted applications such as SPECviewperf and similar suites, in answer to AMD's Vega Frontier Edition semi-pro GPU. I owned one, and AMD abandoned it, feature-incomplete, well before announcing GCN 5's EOL, because they simply did not care to maintain that product or the promise they had made to its buyers.

4. Neither the PS5 nor the Xbox Series is particularly powerful in comparison to a contemporary PC. Digital Foundry placed an RTX 4080 SUPER at about 3-3.2x the performance of a PS5. The PS5 may have a few tricks up its sleeve, like the aforementioned dedicated decompression engine, but on the other hand, PCs have far more powerful processors with much faster storage and much faster memory available, so it balances out even if you disregard things like DirectStorage.




The "intentionally making their products obsolete sooner" argument flies directly in the face of Nvidia's lengthy and comprehensive software support over the years. I'd argue nothing makes a graphics card more obsolete than ceasing its driver support, and AMD has already axed pretty much everything older than the RX 5700 XT.

It's not some big scheme; they just priced the SKUs too far upmarket relative to the processor that powered them. AMD did the same, which is why you ended up with such a massive gap between the 7600 (full Navi 33) and the 7800 XT (full Navi 32). They had to shoehorn SKUs between these, and it resulted in a crippled 7700 XT that's got 12 GB of VRAM and became unpopular because of it, and a 7600 XT that's basically a 16 GB 7600 that no one sane would purchase at the prices being asked, and which was received just as poorly as the 16 GB 4060 Ti.



For $100 more, you're getting:

- NVIDIA's vastly superior software ecosystem (all the bells and whistles) with a much longer support lifecycle
- Identical raster with 20% extra RT performance
- A more power-efficient GPU

That's very much up to you... personally, I wouldn't touch the XTX if I were asked to choose between it at $900 and the 4080S at $1K. The XTX needs to be priced at $799 to become a clear winner in my eyes.
How is it even remotely realistic for Nvidia to be paying 8 times as much as the memory I linked? By the same logic, DDR5 also seems to cost much more on Digikey. I believe you're misunderstanding the packaging as referring to one IC, when all the evidence seems to point to it being more than one IC. A tape and reel doesn't contain one IC.

In tape-and-reel packaging, components are set into specially designed pockets in a long piece of plastic tape (the "tape"). The tape is then sealed to keep the components in place and wound around a central "reel". This method of packaging helps protect the components from damage and dust during storage.

 
Joined
Dec 25, 2020
Messages
6,978 (4.80/day)
How is it even remotely realistic for Nvidia to be paying 8 times as much as the memory I linked? By the same logic, DDR5 also seems to cost much more on Digikey. I believe you're misunderstanding the packaging as referring to one IC, when all the evidence seems to point to it being more than one IC. A tape and reel doesn't contain one IC.


It'll vary by the precise chip type, but I understand what you're trying to get at. I'm sure it's less than even the large-bulk cost on IC suppliers like Digikey or Mouser, but those are about the best leads we've got without being industry insiders with access to these deals.
 
Joined
Nov 26, 2021
Messages
1,702 (1.52/day)
It'll vary by the precise chip type, but I understand what you're trying to get at. I'm sure it's less than even the large-bulk cost on IC suppliers like Digikey or Mouser, but those are about the best leads we've got without being industry insiders with access to these deals.
Yes, we can only guess at upper bounds for cost based on information available from places like DRAMeXchange. Higher speed bins will be more expensive, but not as expensive as one would expect, and then there's the fact that AMD, Intel, and Nvidia enjoy volume discounts.
 
Joined
Apr 14, 2018
Messages
697 (0.29/day)
That is why I bet this "shortage" is artificial, designed to make people buy the cards the moment they see them in stock. A card that is constantly sold out helps sales, because people become willing to buy it the moment they see it available. I was at Micro Center in Dallas as well this week, and they told me they got a very low amount of stock.


I don't disagree with that statement, though I would not say 95% of the time, as I think it's a much closer battle than that depending on which cards you are comparing. That being said, I stand by saying the 7900 XTX at $900 is a better value than an RTX 4080 S at $1,000.

Considering most game time is going into multiplayer games and MOBAs like CoD, Valorant, League, etc., AMD literally offers better or the same rasterized performance at lower price points across the board in the vast majority of games being played (which lack RT, or more often than not have it disabled for high FPS). I stand by that statement 100%.

Tech enthusiasts overestimate what your average "gamer" is doing with their hardware. Chances are they're streaming TFT on Twitch and listening to Spotify, not running CP2077 at 8K with path tracing on a $15K rig.
 
Joined
Apr 29, 2014
Messages
4,303 (1.11/day)
The "intentionally making their products obsolete sooner" argument flies directly in the face of Nvidia's lengthy and comprehensive software support over the years. I'd argue nothing makes a graphics card more obsolete than ceasing its driver support, and AMD has already axed pretty much everything older than the RX 5700 XT.

It's not some big scheme; they just priced the SKUs too far upmarket relative to the processor that powered them. AMD did the same, which is why you ended up with such a massive gap between the 7600 (full Navi 33) and the 7800 XT (full Navi 32). They had to shoehorn SKUs between these, and it resulted in a crippled 7700 XT that's got 12 GB of VRAM and became unpopular because of it, and a 7600 XT that's basically a 16 GB 7600 that no one sane would purchase at the prices being asked, and which was received just as poorly as the 16 GB 4060 Ti.
I checked AMD's website; the newest product with no further driver support would be the R9 Fury series from 2015. Everything after it has a driver as new as 1/24/2024. Nvidia stops major support after the next generation comes out and offers a driver with no improvements or changes to the product after that point; they just keep the driver profile checked off in their downloads. My Titan has seen zero changes since the 2XXX series came out (I am not complaining about that, just pointing it out). Neither company supports its products much after the next gen comes out.

I stand by the argument: Nvidia makes its midrange cards only really good for the current generation, not for much of the future, unlike its top products. Yes, the GPUs are less powerful, but memory is what has held many of these cards back at times. I am mostly aiming this at the second-tier cards like the xx70 series. It's totally reasonable for the 4060 Ti and the cards around and below it to have less memory.
For $100 more, you're getting:

- NVIDIA's vastly superior software ecosystem (all the bells and whistles) with a much longer support lifecycle
- Identical raster with 20% extra RT performance
- A more power-efficient GPU

That's very much up to you... personally, I wouldn't touch the XTX if I were asked to choose between it at $900 and the 4080S at $1K. The XTX needs to be priced at $799 to become a clear winner in my eyes.
I again would argue with your first bullet point, and the definition of "vastly superior", because it all depends on how you look at it. I stand by saying AMD's software suite is superior these days, and neither side has major driver issues anymore.
The raster versus ray tracing argument is a choice. Yes, there is less than a 5% difference overall in raster between the RX 7900 XTX and RTX 4080 S, with the edge going to the 7900 XTX. But that is general gaming versus a technology that is not in every game, which is where my argument lies. It's a choice in the end: do you prefer raster performance (something for every game) or ray tracing performance, which is available in select games and is a performance killer on all GPUs?
The power argument is true, but you're talking about something like a 50-watt difference depending on the load, which in reality does not make much of a difference in someone's electric bill even if they stress the card 24/7. Not to mention people didn't seem to care about that with the RTX 3XXX series (yes, I know people complained, but the products still sold). I generally only use that argument when the difference is over 100 watts for the same performance.
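For scale, the worst-case math on that 50 W gap is easy to run (a sketch; the $0.15/kWh rate is my assumption for illustration, not a figure from the thread):

```python
# Annual cost of an extra 50 W, assuming the card is stressed 24/7
# at an assumed $0.15/kWh electricity rate (varies by region).
EXTRA_WATTS = 50
RATE_USD_PER_KWH = 0.15
HOURS_PER_YEAR = 24 * 365

extra_kwh = EXTRA_WATTS / 1000 * HOURS_PER_YEAR
annual_cost = extra_kwh * RATE_USD_PER_KWH

print(round(extra_kwh))        # 438 kWh per year, worst case
print(round(annual_cost, 2))   # ~65.7 USD per year; far less at typical gaming hours
```

At a few hours of gaming a day the figure drops well under $15 a year, which is why the 50 W delta rarely decides a purchase.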
I mean, that's fine, but I am talking about most gamers when I make these arguments. I am fine if a person says they want more RT performance for the games they play; then the obvious choice is the 4080 S. But most people I talk to and game with couldn't care less. They care about stretching their budget as far as possible to get the most performance they can.
Considering most game time is going into multiplayer games and MOBAs like CoD, Valorant, League, etc., AMD literally offers better or the same rasterized performance at lower price points across the board in the vast majority of games being played (which lack RT, or more often than not have it disabled for high FPS). I stand by that statement 100%.

Tech enthusiasts overestimate what your average "gamer" is doing with their hardware. Chances are they're streaming TFT on Twitch and listening to Spotify, not running CP2077 at 8K with path tracing on a $15K rig.
I am not even referring to RT in my comparison, just saying the overall difference between the two is less than 5%, and once you drop down to different categories, the differences make some cards better values than others.
 
Joined
Mar 7, 2023
Messages
917 (1.40/day)
System Name BarnacleMan
Processor 14700KF
Motherboard Gigabyte B760 Aorus Elite Ax DDR5
Cooling ARCTIC Liquid Freezer II 240 + P12 Max Fans
Memory 32GB Kingston Fury Beast
Video Card(s) Asus Tuf 4090 24GB
Storage 4TB sn850x, 2TB sn850x, 2TB Netac Nv7000 + 2TB p5 plus, 4TB MX500 * 2 = 18TB. Plus dvd burner.
Display(s) Dell 23.5" 1440P IPS panel
Case Lian Li LANCOOL II MESH Performance Mid-Tower
Audio Device(s) Logitech Z623
Power Supply Gigabyte ud850gm pg5
They probably just didn't make many of these. I doubt there were that many people on the fence at this point who were about to drop $1K+ on a video card. Actually, I doubt there are that many people ready to do that, period.
I think there are people this appeals to... there are always people building new systems, and those need video cards, whenever they happen. I can see people who would have gone for a 4090 before going for a 4080S now. I think I would have when I got my 4090; I just refused to pay that spit-in-the-face price of $1,200 (or more like $1,600+ here). That was just too much to swallow for a card that was so much slower than the halo product.

I know this isn't a reliable metric or anything, but I have been seeing a lot of people asking about the 4080S on Facebook and Reddit lately.

Though I do agree that low stock does not necessarily equal high demand.
 
Joined
Apr 14, 2018
Messages
697 (0.29/day)
I think there are people this appeals to... there are always people building new systems, and those need video cards, whenever they happen. I can see people who would have gone for a 4090 before going for a 4080S now. I think I would have when I got my 4090; I just refused to pay that spit-in-the-face price of $1,200 (or more like $1,600+ here). That was just too much to swallow for a card that was so much slower than the halo product.

I know this isn't a reliable metric or anything, but I have been seeing a lot of people asking about the 4080S on Facebook and Reddit lately.

Though I do agree that low stock does not necessarily equal high demand.

The new "MSRP" is a load. It's a trickle-in, low-stock release; give it a few weeks and all of the 4080 SUPER cards are going to sit around $1,200 in the US, +/- $50. Not to mention the FE cards probably won't exist after the first few weeks. It's more along the lines of a brief price cut for some (bad?) publicity. There's next to no appeal at all.

The only marginally good SUPER card was the 4070S, and even that's a stretch with the awful midrange pricing we now have.
 
Joined
Mar 7, 2023
Messages
917 (1.40/day)
The new "MSRP" is a load. It's a trickle-in, low-stock release; give it a few weeks and all of the 4080 SUPER cards are going to sit around $1,200 in the US, +/- $50. Not to mention the FE cards probably won't exist after the first few weeks. It's more along the lines of a brief price cut for some (bad?) publicity. There's next to no appeal at all.

The only marginally good SUPER card was the 4070S, and even that's a stretch with the awful midrange pricing we now have.
You're not wrong that a lot of them are in that price range. That would kind of explain the high interest in the MSRP, or close-to-MSRP, models for those brief moments we see them. I quickly checked the American Newegg, and it looks like you can backorder a Zotac card for $1,050 that is said to be back in stock tomorrow. If you were already planning to spend $1,000, an extra $50 isn't that hard to eat. Also, it's white, and people seem to like that for whatever reason.
 
Joined
Dec 25, 2020
Messages
6,978 (4.80/day)
I checked AMD's website; the newest product with no further driver support would be the R9 Fury series from 2015. Everything after it has a driver as new as 1/24/2024. Nvidia stops major support after the next generation comes out and offers a driver with no improvements or changes to the product after that point; they just keep the driver profile checked off in their downloads. My Titan has seen zero changes since the 2XXX series came out (I am not complaining about that, just pointing it out). Neither company supports its products much after the next gen comes out.

It's quite misleading, actually. AMD no longer updates the drivers for RX Vega and RX 400/500 GPUs; they're strictly on a quarterly maintenance release, just like the R470 security updates Nvidia still releases for Kepler GPUs. They call it "24.1.1", but the driver under the hood is completely different, as RDNA is supported by the current 23.40 branch. GCN cards never received the 23.20 and newer branches; they're still on the 23.19 series, which, if I recall correctly, is the branch the "23.9.1" driver was first released from. I don't blame you for missing this detail; although it's very important, it's not made anywhere near as clear as it should have been.

RDNA:

(attached screenshot of the RDNA driver download page)

GCN 4/5:

(attached screenshot of the GCN 4/5 driver download page)
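For what it's worth, the branch situation described above can be sketched as a tiny lookup. The branch numbers are the ones cited in this post; treat the mapping as illustrative, not an official AMD versioning scheme:

```python
# Illustrative mapping of Adrenalin marketing versions to the internal
# driver branch they shipped from, per the screenshots above.
# The branch numbers are the ones cited in this post, nothing official.
ADRENALIN_BRANCHES = {
    "23.9.1": "23.19",         # last branch GCN 4/5 (RX 400/500, Vega) received
    "24.1.1 (GCN)": "23.19",   # repackaged maintenance release, same old branch
    "24.1.1 (RDNA)": "23.40",  # current branch, RDNA only
}

def is_maintenance_only(card_family: str) -> bool:
    """GCN stays on the frozen 23.19 branch; RDNA gets the new branches."""
    return ADRENALIN_BRANCHES.get(f"24.1.1 ({card_family})") == "23.19"
```

So two drivers both labeled "24.1.1" can come from entirely different branches, which is the point being made here.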


Regarding quality, I'll take your word with a massive grain of salt. I'm well aware of the progress with AMD's drivers, and I must say that so far, I am not yet satisfied; they have much grueling work left to do. But I'm hopeful that by the 8900 XTX, or whatever RDNA 4 is called, they'll have a solid thing going. I doubt it, but I'll probably give them a chance if I manage to upgrade while keeping my 4080.

Needless to say, your Pascal card is still very much supported, and fixes are actively developed for it - most recently, they're aware of and developing a fix for random freezes on configurations running HAGS+SLI. It may miss out on some of the newer RTX features, but it already supports things like HAGS, which only RDNA 3 has come to support on AMD's side. It received all of the other features that don't rely on tensor cores too - image sharpening, integer scaling, the software DXR driver, etc. - which AMD either doesn't support at all or hasn't backported to its older architectures. Going well beyond that, both Maxwell and Pascal are still getting routine bug fixes and game-ready profiles, so there's very little to complain about there, IMHO. You even have a fancy Xp... I took a look at Pascal recently (and for the first time) with a 1070 Ti I scored some time ago.
 
Joined
Apr 14, 2018
Messages
697 (0.29/day)
It's quite misleading, actually. AMD no longer updates the drivers for RX Vega and RX 400/500 GPUs; they're strictly on a quarterly maintenance release, just like the R470 security updates Nvidia still issues for Kepler GPUs. They call it "24.1.1", but the driver under the hood is completely different: RDNA is supported by the current 23.40 branch, while GCN cards never received the 23.20 and newer branches. They're still on the 23.19 series, which, if I recall correctly, is the branch the "23.9.1" driver was first released from. I don't blame you for missing this detail; although it's very important, it's not made anywhere near as clear as it should be.

RDNA:

View attachment 334582

GCN 4/5:

View attachment 334583

Regarding quality, I'll take your word with a massive grain of salt, I'm well aware of the progress with the AMD drivers and I must say that so far, I am not yet satisfied. They have much grueling work to do. But I'm hopeful that by the 8900 XTX or whatever RDNA 4 is called, they'll have a solid thing going on. I doubt it, but I'm probably going to give them a chance if I manage to upgrade while keeping my 4080.

Needless to say, your Pascal card is still very much supported, and fixes are actively developed for it - most recently, they're aware of and developing a fix for random freezes on configurations running HAGS+SLI. It may miss out on some of the newer RTX features, but it already supports things like HAGS, which only RDNA 3 has come to support on AMD's side. It received all of the other features that don't rely on tensor cores too - image sharpening, integer scaling, the software DXR driver, etc. - which AMD either doesn't support at all or hasn't backported to its older architectures. Going well beyond that, both Maxwell and Pascal are still getting routine bug fixes and game-ready profiles, so there's very little to complain about there, IMHO. You even have a fancy Xp... I took a look at Pascal recently (and for the first time) with a 1070 Ti I scored some time ago.

Owning both AMD and Nvidia GPUs currently, from an overall functionality standpoint AMD's drivers are superior, IMO. The user interface and built-in monitoring/OC tools are the icing on the cake.

I haven't run into any significant driver issues on either side (running a 7900 XTX, 3080, and 2070 Super).

For a company so often touted as a software company, Nvidia could really stand to update the UI.
 
Joined
Dec 25, 2020
Messages
6,978 (4.80/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Owning both AMD and Nvidia GPUs currently, from an overall functionality standpoint AMD's drivers are superior, IMO. The user interface and built-in monitoring/OC tools are the icing on the cake.

I haven't run into any significant driver issues on either side (running a 7900 XTX, 3080, and 2070 Super).

For a company so often touted as a software company, Nvidia could really stand to update the UI.

I don't care about the UI; I care about the driver's performance, functionality (and by that I mean the software that is able to run on it), and stability. I'll acknowledge progress primarily in the functionality area (although we could do without gaffes like the Anti-Lag+ thing getting people banned, because some junior dev decided that injecting code haphazardly instead of developing a proper SDK was a great idea!), but the first and last points still stand, IMO.
 
Joined
Apr 14, 2018
Messages
697 (0.29/day)
I don't care about the UI; I care about the driver's performance, functionality (and by that I mean the software that is able to run on it), and stability. I'll acknowledge progress primarily in the functionality area, but the first and last points still stand, IMO.

And you have no hands-on experience with current AMD drivers, so how can you come to this conclusion? Nvidia having superior drivers is just a load of BS that keeps getting parroted around. Both have bugs, installation issues, and bad drivers; hell, the other day someone made a post on this forum about Nvidia drivers not detecting cards or installing properly. Clearly superior…
 
Joined
Dec 25, 2020
Messages
6,978 (4.80/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
And you have no hands-on experience with current AMD drivers, so how can you come to this conclusion? Nvidia having superior drivers is just a load of BS that keeps getting parroted around. Both have bugs, installation issues, and bad drivers; hell, the other day someone made a post on this forum about Nvidia drivers not detecting cards or installing properly. Clearly superior…

The conclusion is about as simple as: "Oh, so I want to enable all the eye candy - get me some DLAA with ray reconstruction and (actually generative) frame generation going, then punch in GameWorks features like HBAO+ - and I can do all that and have my cake and eat it too", all without worrying whether my GPU is going to croak and BSOD because I had VLC or something running in the background. That's been my nightmare all along. Oh, and I almost forgot: actual low-latency support, because, you know, got that fancy OLED and all. And that's from a gamer's point of view, too. I want to play a new game? Driver support is ready day one. Always. Remember the first quarter of last year, when AMD ignored practically every single AAA release and kept the most recent driver exclusive to the 7900 XTX while everyone else lingered? So... I don't think I need to go much further. I mean, damn, I don't think I'm asking for much. They certainly don't need to go as far as releasing an easy-to-use personal LLM engine to run on their GPUs like Nvidia already did, but...
 
Joined
Dec 30, 2019
Messages
145 (0.08/day)
It just seems more and more that nVidia wants to exit the consumer gaming market. They go from one scam to another: crypto mining to AI. When AI goes bust, I wonder what they'll move to.
 
Joined
Dec 25, 2020
Messages
6,978 (4.80/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
It just seems more and more that nVidia wants to exit the consumer gaming market. They go from one scam to another: crypto mining to AI. When AI goes bust, I wonder what they'll move to.

To the next big thing™, just like both Intel and AMD are doing. I mean, both Meteor Lake and Phoenix-HS have their "neural processing capabilities" as the central selling point... it's the big craze right now. Truth be told, it's a bit obvious the tech industry is entering a period of stagnation; we don't really have anything new that is truly groundbreaking, and we haven't had something with a decisive wow factor since Sandy Bridge. Ryzen was responsible for bringing that to the masses, but... I'm sure everyone on a socket AM4 machine with any GPU made in the past 6 years is just dying to upgrade... not.
 
Joined
Apr 19, 2018
Messages
1,227 (0.50/day)
Processor AMD Ryzen 9 5950X
Motherboard Asus ROG Crosshair VIII Hero WiFi
Cooling Arctic Liquid Freezer II 420
Memory 32Gb G-Skill Trident Z Neo @3806MHz C14
Video Card(s) MSI GeForce RTX2070
Storage Seagate FireCuda 530 1TB
Display(s) Samsung G9 49" Curved Ultrawide
Case Cooler Master Cosmos
Audio Device(s) O2 USB Headphone AMP
Power Supply Corsair HX850i
Mouse Logitech G502
Keyboard Cherry MX
Software Windows 11
I think there are people this appeals to... people are building new systems all the time, and those systems need video cards, whenever they happen. I can see people who would have gone for a 4090 before going for a 4080S now. I think I would have when I got my 4090; I just refused to pay that spit-in-the-face price of $1,200 (or more like $1,600+ here). That was just too much to swallow for a card so much slower than the halo product.

I know this isn't a reliable metric or anything, but I have been seeing a lot of people asking about the 4080S on Facebook and Reddit lately.

Though I do agree on the whole "low stock does not necessarily equal high demand" thing.
Erm, a 4080 Super is the same price as a 4080. The price reduction was a two-day myth.

It just seems more and more that nVidia wants exit the consumer gaming market. They go from one scam to another: Crypto mining to AI. When AI goes bust, I wonder what they'll move to.
Hopefully the people going all-in on AI will soon come to realise the tech is cool, but has no clothes. You simply cannot trust a word any of it tells you; you have to fact-check everything.

Then all nGreedia will have left is the gamers they used to be all about, until they shat all over them.
 
Joined
Mar 7, 2023
Messages
917 (1.40/day)
System Name BarnacleMan
Processor 14700KF
Motherboard Gigabyte B760 Aorus Elite Ax DDR5
Cooling ARCTIC Liquid Freezer II 240 + P12 Max Fans
Memory 32GB Kingston Fury Beast
Video Card(s) Asus Tuf 4090 24GB
Storage 4TB sn850x, 2TB sn850x, 2TB Netac Nv7000 + 2TB p5 plus, 4TB MX500 * 2 = 18TB. Plus dvd burner.
Display(s) Dell 23.5" 1440P IPS panel
Case Lian Li LANCOOL II MESH Performance Mid-Tower
Audio Device(s) Logitech Z623
Power Supply Gigabyte ud850gm pg5
Erm, a 4080 Super is the same price as a 4080. The price reduction was a two-day myth.
I already responded to that yesterday. Just look up a couple posts.


What I did was briefly check the American Newegg, and I pretty quickly found a 4080S on backorder, but only by 1 day, which you could buy for $1,050. Not perfect, but not quite as spit-in-my-face as $1,200+. I might have considered it if I hadn't already got a 4090.

(attached screenshot: Newegg 4080S backorder listing)


Though I tend to agree the quantity is being limited.


I was even wondering before the launch: how many 4080s will they be able to make where 0% of the cores are defective? At least that leaves a lot of dies for 4070 Ti Supers, I guess...
 
Joined
Nov 26, 2021
Messages
1,702 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
I already responded to that yesterday. Just look up a couple posts.


What I did was briefly check the American Newegg, and I pretty quickly found a 4080S on backorder, but only by 1 day, which you could buy for $1,050. Not perfect, but not quite as spit-in-my-face as $1,200+. I might have considered it if I hadn't already got a 4090.

View attachment 334768

Though I tend to agree the quantity is being limited.


I was even wondering before the launch: how many 4080s will they be able to make where 0% of the cores are defective? At least that leaves a lot of dies for 4070 Ti Supers, I guess...
Given that N5's defect density is pretty good, we can assume it is no worse than 0.1 defects per square centimeter. AD103 is 379 mm², so even worst-case yields should be about 69% for fully functional dies.
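That estimate can be checked with the simple Poisson yield model (my assumption; the post above doesn't name which model was used):

```python
import math

def poisson_yield(defect_density_per_cm2: float, die_area_mm2: float) -> float:
    """Poisson yield model: Y = exp(-D * A), D in defects/cm^2, A in cm^2."""
    area_cm2 = die_area_mm2 / 100.0  # 100 mm^2 per cm^2
    return math.exp(-defect_density_per_cm2 * area_cm2)

# AD103: 379 mm^2, assumed worst-case D0 of 0.1 defects/cm^2
y = poisson_yield(0.1, 379)
print(f"{y:.1%}")  # ~68.5% fully functional dies under these assumptions
```

That lands right around the ~69% figure quoted above; a Murphy or negative-binomial model would give slightly different numbers, but the ballpark is the same.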
 
Joined
Apr 29, 2014
Messages
4,303 (1.11/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtec ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
It's quite misleading, actually. AMD no longer updates the drivers for RX Vega and RX 400/500 GPUs; they're strictly on a quarterly maintenance release, just like the R470 security updates Nvidia still issues for Kepler GPUs. They call it "24.1.1", but the driver under the hood is completely different: RDNA is supported by the current 23.40 branch, while GCN cards never received the 23.20 and newer branches. They're still on the 23.19 series, which, if I recall correctly, is the branch the "23.9.1" driver was first released from. I don't blame you for missing this detail; although it's very important, it's not made anywhere near as clear as it should be.

RDNA:

View attachment 334582

GCN 4/5:

View attachment 334583
I am aware; they both do that. What I was referencing was zero support, meaning not even a Win 11 driver - something you can install right now and run on a modern machine. Yes, the new drivers don't offer anything and are just repackaged old drivers; I thought you were referencing zero support and zero drivers that could be downloaded for a modern machine. But I think it's going to come down to how we define support, because my reference was seeing bug fixes and performance improvements, which I generally saw more often from AMD for previous-generation cards than from Nvidia. AMD's newest control center and many of its features are added to the older-generation cards with the new driver releases as well, depending on what it is, though many of those features are not that noteworthy. I will say Nvidia is better about bringing some of the bigger software-based features down at least two generations. However, my point is that it comes down to how we define support. I will amend what I said and say both support their cards for a reasonable amount of time in my book.
I already responded to that yesterday. Just look up a couple posts.


What I did was briefly check the American Newegg, and pretty quickly found a 4080s on backlog but only by 1day, which you could buy for $1050. Not perfect, but not quite as spit in my face as $1200+. I might have considered this if I hadn't already got a 4090.

View attachment 334768

Though I tend to agree the quantity is being limited.


I was even wondering before the launch... how many 4080s will they be able to make where 0% of the cores are defective? Least that leaves a lot of dies for 4070 ti supers I guess....
Yeah, they are available. I will say that if at least a few models can hold that price point, that will be fine; I am worried the base price will disappear pretty quickly.
 