
Why does everyone hate the 4080?

nVidia desperately wants to be the Apple of GPUs: a closed-off ecosystem (Hairworks, G-Sync, DLSS, to name a few) and complete control of the manufacturing-to-sales line with insane pricing, with loyal fans lining up to buy the next overpriced GPU like it's a new iPhone.
Except that it doesn't control the manufacturing-to-sales line simply because it doesn't manufacture anything and it sells very few video cards.
As for having differentiators alongside the various standards... that's not something Nvidia invented; it's been around for years. Hell, when they were ahead of the game, even ATI introduced TruForm in a non-standard way. Personally, I'm not worried about these. If they catch on, they'll make their way into DX or Vulkan. If not, you can continue to safely ignore them. I mean, Witcher 3 has Hairworks. Is it a lesser game if you play it on an AMD card without Hairworks? I think not.
 
I feel like this change affects the entire product stack, not just the best tech. Nvidia has moved the entire stack up in price while keeping the incremental hardware performance gains small. We can see this in the smaller (cheaper) dies being used in the higher-priced cards.

I avoid the best of the best due to the fall-off in performance per dollar, but I like gaming on the high end.

My last few GPU cycles have seen $800 for a 1080 Ti, then a move to the 2080 Ti at $1200. That was a tough pill to swallow, but AMD had nothing to compete with that generation. Then the 3080 Ti, also for $1200: a decent but not great generational improvement, but the price stayed the same. Now enter the 4080, also priced at $1200, with a HUGE hardware gap between itself and the 4090. This leads me to believe that the 4080 Ti model will come in at $1400+. Makes me miss the R9 290X for $400, or the 4850 at $200.

I'm skipping this generation as I just don't see the value in it. But as an enthusiast who's been in this hobby for decades, it's very sad to me that the cost of entry for new people is so high. When the RX 480 launched at $200, I thought we were entering a time of reasonable prices for entry-level PC gaming again. The market went the other way, and now everything from motherboards to GPUs has skyrocketed compared to a few generations ago. That's sad to me when I'm trying to raise kids in this hobby.
Yes, I agree. The current times are no better in that respect than before.
That's why I stick with my GTX 970 despite a total makeover of my main rig.

We can never have endless positive progress. Now comes the time of stagnation, and we are on the golden path to regression, much like Intel's 4th-to-9th-gen era.
And just as with that Intel era, NV now makes it very easy NOT to buy the product offered to you.
No one's life is attached to GPUs, despite what some may think or feel (or try to PR to the crowd).
Life can and will still be good without owning top-notch, latest-gen tech.
 
Except that it doesn't control the manufacturing-to-sales line simply because it doesn't manufacture anything and it sells very few video cards.
That's the "wants to be" part, as in in the future they aim to do that :)
 
I'll sum it up for you:

1. Very high prices!
2. Very low generational improvement.
3. Cut-down, harvested chips throughout the entire stack for higher yields (this includes the 4090!)
4. Due to the aforementioned situation, ample room for price increases and slot-in SKUs that perform as they should but are priced even more terribly
5. Utter disregard for previous-generation flagships: an unwillingness to bring them, as a gesture of goodwill, relevant features their owners want, instead introducing artificial locks and claiming those features require new-generation hardware

In short, the entire Ada lineup is an insult to anyone who's been following this industry for some time. It's clearly intended as a mass-market product aimed at consumers plain and simple, not at hobbyists and much less at enthusiasts such as the TPU patronage.
 
Hating a product is, mildly put, stupid. But its stupidly high price makes this GPU irrelevant to any budget-sensible PC owner. A strong reaction to the marketing nVidia pushes through paid reviews is also fair.
 
I can only say one thing. Trying to deduce the profit margin solely from the BoM means nothing. It's not like those materials grow on trees (and even then, you'd factor in the cost of harvesting them). There's design, prototyping, and iteration that go into making a product.
NVIDIA needs no protection; their reports show copious amounts of profit. The medium-sized silicon in the card, next to the rest of the components, should result in a rather mild BoM. The tooling and design of the cards (hi, I work on a PCB design team; we make boards with chips on them) could also be a lot cheaper this time around. The RTX 4080 is quite possibly NVIDIA's largest MSRP-to-overall-cost ratio on a GeForce product in a long while. Again, speculatively.

The consumer backlash over the value of the RTX 4080, and over NVIDIA's behavior in pricing its products, is justified.
 
Hating a product is, mildly put, stupid. But its stupidly high price makes this GPU irrelevant to any budget-sensible PC owner. A strong reaction to the marketing nVidia pushes through paid reviews is also fair.
True. I don't hate the 4080 the same way I don't hate Ferrari. But if someone threw me one, I'd sell it straight away and buy a Mustang and a house. :roll:
 
Oh well, I'm having lots of fun with the 4090: way overkill rasterization performance at 4K (getting 170 FPS average in Days Gone @ 4K Ultra; gonna do some ReShade RTGI to make the lighting look better).

Way too many salty people in forums. OP should just listen to W1zzard's advice :D
 
In short, the entire Ada lineup is an insult to anyone who's been following this industry for some time. It's clearly intended as a mass-market product aimed at consumers plain and simple, not at hobbyists and much less at enthusiasts such as the TPU patronage.
Very well said. On top of the horrible value proposition, no one has even mentioned the new 16-pin power connector debacle.

This launch is just a disaster, which is so sad because Nvidia really does make great GPUs.
 
I'll sum it up for you:

1. Very high prices!
2. Very low generational improvement.
3. Cut-down, harvested chips throughout the entire stack for higher yields (this includes the 4090!)
4. Due to the aforementioned situation, ample room for price increases and slot-in SKUs that perform as they should but are priced even more terribly
5. Utter disregard for previous-generation flagships: an unwillingness to bring them, as a gesture of goodwill, relevant features their owners want, instead introducing artificial locks and claiming those features require new-generation hardware

In short, the entire Ada lineup is an insult to anyone who's been following this industry for some time. It's clearly intended as a mass-market product aimed at consumers plain and simple, not at hobbyists and much less at enthusiasts such as the TPU patronage.
I agree, except for the conclusion. I look at it from a different angle: I'm no victim of NV's way of doing business, and I don't feel insulted despite being a tech enthusiast for many years (ever since my Athlon 3200+ and Nvidia TNT2 GPUs). They've actually done me a big favor with such high prices: my choice to avoid them has never been easier.
 
Oh well, I'm having lots of fun with the 4090: way overkill rasterization performance at 4K (getting 170 FPS average in Days Gone @ 4K Ultra; gonna do some ReShade RTGI to make the lighting look better).

Way too many salty people in forums. OP should just listen to W1zzard's advice :D
Buy me a 4090, and maybe I'll be a bit less salty. Just because you can afford to spend way more on a graphics card than you should, it doesn't mean there aren't any budget-conscious gamers left.
 
Buy me a 4090, and maybe I'll be a bit less salty. Just because you can afford to spend way more on a graphics card than you should, it doesn't mean there aren't any budget-conscious gamers left.

Like, take it from my perspective - I am not exactly "budget conscious". I even like to splurge a bit. But there is a line, and I strongly feel this line has been crossed. AMD's $999 MSRP for their halo part is not only justified - it's fair, too.

The market conditions at Ampere's launch somewhat justified a $1500 GPU of the stature of the RTX 3090. But Ada introduces practically nothing groundbreaking, the pressure from the COVID-19 lockdowns and cryptocurrency mining has completely subsided, and the global strain on component supply has greatly eased as a result. The main remaining roadblocks are within China, with their Covid policy still in effect, and those too will resolve with time.

I agree, except for the conclusion. I look at it from a different angle: I'm no victim of NV's way of doing business, and I don't feel insulted despite being a tech enthusiast for many years (ever since my Athlon 3200+ and Nvidia TNT2 GPUs). They've actually done me a big favor with such high prices: my choice to avoid them has never been easier.

You need to account for something, though. AMD and Intel are both businesses. If they ever close the mindshare gap, or the technical gap with NVIDIA, they will raise their prices accordingly, and NVIDIA will only have blazed the trail by normalizing this, justifying the whole process. We should rebuke these cards, but alas, people still got in line to buy 4090s anyway...

Very well said. On top of the horrible value proposition, no one has even mentioned the new 16-pin power connector debacle.

This launch is just a disaster, which is so sad because Nvidia really does make great GPUs.

The new connector was going to be necessary eventually, so I won't deduct too much over it, even though I still think they should have made models with traditional 8-pin connectors available, even if four of them were required. But the whole thing about them melting down just screams that the specification wasn't ready and was pushed too far. Operating well under lab conditions is one thing; with the variance in people's PSU quality, and even in their AC mains' stability, what we've seen was the obvious result.
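For a rough sense of why the margin is so thin: assuming the commonly cited figures for the 16-pin connector (a 600 W rating, six 12 V current-carrying pins, terminals rated around 9.5 A each; treat these as assumptions here, not spec quotes), a quick sanity check:

```python
# Back-of-the-envelope check on 16-pin connector headroom.
# Assumed figures (not quoted from the official spec): 600 W connector
# rating, six 12 V current-carrying pins, ~9.5 A per-pin terminal rating.
RATED_WATTS = 600
VOLTS = 12.0
POWER_PINS = 6
PIN_RATING_AMPS = 9.5

amps_per_pin = RATED_WATTS / VOLTS / POWER_PINS  # 50 A total over 6 pins
headroom = PIN_RATING_AMPS / amps_per_pin - 1

print(f"{amps_per_pin:.2f} A per pin, ~{headroom:.0%} headroom")
# -> 8.33 A per pin, ~14% headroom: one poorly seated pin that loses
#    contact area pushes the remaining pins straight past their rating.
```

Under those assumed numbers there's barely any margin for imperfect seating or current imbalance, which fits what we've seen in the field.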
 
Talk about greed: all them crypto miners, greedy bastards, got taken to the bank too, LMFAO!
I have no skin in the game with crypto, but the miners don't set the prices for video cards. They may increase the demand for them, but the price is set by the chip manufacturers, board partners, and online markets/sellers. Those guys are the "greedy bastards". Let's not forget Nvidia and AMD have a history of being sued for price fixing.
 
Oh well, I'm having lots of fun with the 4090: way overkill rasterization performance at 4K (getting 170 FPS average in Days Gone @ 4K Ultra; gonna do some ReShade RTGI to make the lighting look better).

Way too many salty people in forums. OP should just listen to W1zzard's advice :D
How about you re-read the OP and figure out what the actual Ffff your reply means to it. Are we debating your precious? No. Pipe down, e-peen queen.
 
Yes, they'll still price them at profitable (and even greedy) levels, but not as high as nVidia, if they can manage it.
AMD is just as greedy as Nvidia, they just sell different products with different demands. Nvidia can sell their products for more because of either performance or perceived value or both. AMD would happily (and have a fiduciary responsibility to) do the same if the tables were turned.
 
You need to account for something, though. AMD and Intel are both businesses. If they ever close the mindshare gap, or the technical gap with NVIDIA, they will raise their prices accordingly, and NVIDIA will only have blazed the trail by normalizing this, justifying the whole process. We should rebuke these cards, but alas, people still got in line to buy 4090s anyway...
Yep, top-end stuff will keep selling at over-the-top prices regardless of what words are written in here, for pretty much as long as people need escapism from reality (that is, for eternity). I have no special antipathy toward NV; Intel and AMD are in the same rotten boat (just in different areas of that boat).
But knowing the situation for what it is will, hopefully, help me make better choices regarding tech purchases.
 
I assume the 4080 12GB was intended to be the "real" 4080, and the 4080 16GB the 4080 Ti? If nVidia had simply called the 4080 16GB version a "4080 Ti", would people not be as pissed as they are?
Incorrect. AD104, which the 4080 12GB was based on, would typically have been used for a 60- or 70-class card.
This is further backed up by the 192-bit bus width on AD104. The 3080 has a 320-bit bus; the 3080 Ti, 3090 and 3090 Ti have 384-bit buses. The 3070 has a 256-bit bus, and the 4080 16GB has a 256-bit bus.
The die size of AD104, which is 295 mm^2, also suggests it would've been used in a lower-end GPU, as it is comparable to the die size of GA106, which is 276 mm^2 and was used for the 3060 and below. The only similar flagship to have a die size so small was the GTX 1080.
Also, when Nvidia posted the performance slides for the RTX 4080, it was only 25% better than the RTX 3080, with performance similar to an RTX 3080 Ti: what we would normally expect of a 70-class card. (GTX 970 ($330) = GTX 780 Ti ($700), GTX 1070 ($349) = GTX 980 Ti ($649), RTX 2070 Super ($499) = GTX 1080 Ti ($699), RTX 3070 (meant to be $499; obviously crypto ruined that) = RTX 2080 Ti ($1199)...)
Need any more evidence?
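For what it's worth, running the launch MSRPs quoted in the post above through a quick script makes the break in the trend plain (all prices are the post's own numbers):

```python
# Each xx70 card roughly matched the previous flagship at a fraction
# of its launch price (MSRPs as quoted in the post above).
pairs = [
    ("GTX 970",        330, "GTX 780 Ti",   700),
    ("GTX 1070",       349, "GTX 980 Ti",   649),
    ("RTX 2070 Super", 499, "GTX 1080 Ti",  699),
    ("RTX 3070",       499, "RTX 2080 Ti", 1199),
]
for new, new_price, old, old_price in pairs:
    print(f"{new} matched {old} at {new_price / old_price:.0%} of its price")

# -> 47%, 54%, 71%, 42% of the old flagship's price.
# The 4080 at $1199 matches a 3080 Ti that launched at $1199: 100%.
```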
 
WHAT? Repeat after me: Size does not matter! This is like saying 100 grams of apples should cost the same as 100 grams of steak, because they share the same weight.

If you want to count dollars per something physical, at least pick something meaningful, such as transistor count.
Transistor count isn't as good a metric as die size, because die size is directly related to the cost of the product. Transistor count is also influenced by how much of the area is devoted to logic versus SRAM or I/O.
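A toy die-cost model illustrates the point. The wafer price and defect density below are placeholder assumptions, not real foundry figures; the die areas are the approximate published sizes. Cost scales with area, yield makes large dies disproportionately expensive, and transistor count never enters the math:

```python
import math

# Toy die-cost model: 300 mm wafer, assumed $15,000 wafer price and
# 0.1 defects/cm^2 (both placeholder values, not real foundry data).

def dies_per_wafer(die_mm2: float, wafer_d_mm: float = 300) -> int:
    # Gross dies from wafer area, minus an edge-loss correction term.
    r = wafer_d_mm / 2
    return int(math.pi * r * r / die_mm2
               - math.pi * wafer_d_mm / math.sqrt(2 * die_mm2))

def cost_per_good_die(die_mm2: float, wafer_cost: float = 15_000,
                      d0_per_cm2: float = 0.1) -> float:
    # Poisson yield model: a larger die is exponentially likelier to
    # catch a defect, so cost rises faster than linearly with area.
    yield_rate = math.exp(-d0_per_cm2 * die_mm2 / 100)
    return wafer_cost / (dies_per_wafer(die_mm2) * yield_rate)

for name, area in [("AD104", 295), ("AD103", 379), ("AD102", 608)]:
    print(f"{name}: {area} mm^2 -> ~${cost_per_good_die(area):,.0f} per good die")
# AD102 is ~2x AD104's area but roughly 3x the cost per good die here.
```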
 
Erm, what? :wtf: AMD's drivers have been rock solid ever since the release of RDNA 2. Only the 5700 series had stability issues. Their control panel is miles ahead of Nvidia's, both in design and functionality! I'll give you the point on PR, though. AMD just can't seem to achieve the same number of wet pants during a product launch that Nvidia does for some reason. We'll see how far Nvidia can keep driving prices up before buyers start to think it's a bit too much. Personally, I think Nvidia is trying to position itself as the 500 HP Ferrari (kept in market by hype), while AMD is the 500 HP Mustang (kept in market by value).
For some reason, the best driver for my R7 260X is 17.12.1, on both Windows 7 and 10, not the latest 22.x one. If I take a 660 Ti or something, or a 1060 (the most recent Nvidia card I've had), there was no such issue: you just update the driver and see no negative results. On price/performance AMD is better, and that's why I'm using a Radeon at the moment. But as it turns out, that comes at a price.

The graphics control panel is designed to contain only the basic, minimal settings, so that developers don't spend their precious, limited time optimizing a shitty UI control panel with a "game optimizer", "upgrade advisor", and other redundant crap. This is exactly why people prefer iPhones over crappy Xiaomi phones. Keep it minimal, keep it essential. Users will download the additional software if they need it.
 
For some reason, the best driver for my R7 260X is 17.12.1, on both Windows 7 and 10, not the latest 22.x one. If I take a 660 Ti or something, or a 1060 (the most recent Nvidia card I've had), there was no such issue: you just update the driver and see no negative results. On price/performance AMD is better, and that's why I'm using a Radeon at the moment. But as it turns out, that comes at a price.

The graphics control panel is designed to contain only the basic, minimal settings, so that developers don't spend their precious, limited time optimizing a shitty UI control panel with a "game optimizer", "upgrade advisor", and other redundant crap. This is exactly why people prefer iPhones over crappy Xiaomi phones. Keep it minimal, keep it essential. Users will download the additional software if they need it.
The 260X is an old card with driver support long gone. It's not relevant in a discussion about currently sold products, and current driver support.

The gimmicks like game optimizer and upgrade advisor can be easily disregarded. I don't use them, either. On the other hand, AMD's drivers include overclocking, fan speed control, undervolting, and all the fine-tuning tools that I think should be basic stuff in every driver package. AMD knows this, Intel knows this, Nvidia still forces you to use third-party software for some reason.

As for the UI, I disagree. Nvidia's version is extremely outdated, like it's straight out of a Windows 98 machine. They also seem to be cramming every new feature under the "3D settings" tab, which was fine up to a point, but it's getting overcrowded now, while other tabs barely contain any options. My complaint against AMD is that they have similar options scattered across different tabs, so it can take you a while to find what you need, but their design is top notch, in my opinion.
 
Has anybody mentioned that they are pricing these this way so they can clear their excess 30-series stock?

Or am I wrong?
 
Has anybody mentioned that they are pricing these this way so they can clear their excess 30-series stock?

Or am I wrong?
You're probably right, but that doesn't make it right by me.
 
The 260X is an old card with driver support long gone. It's not relevant in a discussion about currently sold products, and current driver support.

The gimmicks like game optimizer and upgrade advisor can be easily disregarded. I don't use them, either. On the other hand, AMD's drivers include overclocking, fan speed control, undervolting, and all the fine-tuning tools that I think should be basic stuff in every driver package. AMD knows this, Intel knows this, Nvidia still forces you to use third-party software for some reason.

As for the UI, I disagree. Nvidia's version is extremely outdated, like it's straight out of a Windows 98 machine. They also seem to be cramming every new feature under the "3D settings" tab, which was fine up to a point, but it's getting overcrowded now, while other tabs barely contain any options. My complaint against AMD is that they have similar options scattered across different tabs, so it can take you a while to find what you need, but their design is top notch, in my opinion.
For me, driver quality should be judged by the oldest cards the driver supports. Nvidia drivers have always been fine stability-wise, though not so good performance-wise (Kepler 600 and 700 series, mainly). If you check the GPU popularity list on Steam https://store.steampowered.com/hwsurvey/videocard/, there are a lot of Maxwell-based cards (2015, so they can be considered old).

Yeah, undervolting can be done via Wattman, but don't let this fool you: it's still software-based. I modded the BIOS of my 260X so it pulls 85 W instead of 120 W (Hawaii BIOS Editor; works for any GCN 2-based card), and I can move the card into any system and it will draw 85 W under load, with no need to reconfigure anything. The BIOS is locked on modern AMD/Nvidia cards, so the only option is a software profile, and for that I'd rather use a dedicated tool like MSI Afterburner.

And considering there are more Nvidia cards in existence than AMD ones, the fact that AMD drivers still get much more criticism than Nvidia's already tells us something. AMD just fails to deliver the whole package; mainly, the drivers need more tweaking and optimization. If they skipped all the additional bells and whistles and instead focused on the core stuff, stability should go up.
 
Probably mentioned, but I'll just say this... The 3080 was a huge boost over the 2080, but at the same price. With every successive generation, both AMD and Nvidia offered great performance increases at a lower price. It was not uncommon to see upper-midrange cards beat or equal last gen's flagship cards at almost half the price. The 4080 breaks this trend pretty badly.
 