
NVIDIA Cancels GeForce RTX 4080 12GB, To Relaunch it With a Different Name

Nvidia can jam it. After what happened to EVGA, I bought AMD this time around.
 
Someone explain to me why everyone gets their panties in a twist over a memory bus?
Do you take A and B roads to far-away places, or motorways?

Now take away all the motorways in your mind.

You're there.
 
I suspect your RX 580 was defective at the hardware level, nothing to do with the software from AMD. Or a broken Windows install and browser.
I was going to say the exact same thing; what was described makes no sense otherwise. If it was a launch card, maybe, but it doesn't sound like it was. The other thought I had was that older Windows versions had issues when switching between AMD and Nvidia; you had to use a third-party utility to get all the old drivers out or reinstall Windows from scratch. I think that's been fixed since at least Windows 10. I think that issue was a BSOD on boot, but I may be remembering wrong. I know I used to use a third-party utility to clean out old drivers for some reason.
 
They tried to test people's stupidity: if people were willing to pay $2k+ for a 3080 during the GPU crisis, why not $900 for a 4070? That is also likely why Ngreedia charges $1,200 for the 4080 16GB this gen.
Anti-scalping MSRP. :laugh:
 
Anti-scalping MSRP. :laugh:

There is no blanket threshold amount above which scalping happens and below which scalping stops.

Scalping happens when there are people who are willing to pay above MSRP whether it's a graphics card, concert tickets, or passes to the World Cup finals.

With the launch of the 4090, its quick sellout, and the subsequent eBay activity, there are clearly people willing to buy scalped 4090s above their already porcine MSRPs.

During the pandemic and the Great GPU Shortage, we saw graphics cards (and CPUs) at all price levels scalped. Hell, even prices for used graphics cards went nuts. I bought a Sapphire Pulse Radeon RX 550 2GB card for my non-gaming daily driver PC in September 2020 for $65 (below its $79 launch price). At the peak of the shortage, this card was going for over $200.

Four. Year. Old. Graphics. Card.

People will buy a scalped potato if potatoes are rarities.
 

Instead of allowing themselves to be scalped, these people can simply wait patiently. Buying a graphics card is not a life-saving requirement :D
 
Instead of allowing themselves to be scalped, these people can simply wait patiently. Buying a graphics card is not a life-saving requirement :D

Of course patience is an option. However, not everyone is patient or sensible.

Scalpers exploit those who prioritize instant gratification over patiently waiting for good value. FOMO is a driving influence for many. Look at the 3090: it's frequently available below MSRP now. The 6900 XT can occasionally be found at a ~50% discount from its MSRP, to say nothing of its street prices during the Great GPU Shortage. It's not like 3090s stopped working the moment the 4090s started shipping.

Look at all of those super geniuses who pre-purchased Cyberpunk 2077 at full price, which later repeatedly went on sale at discounts of 50% or more. FOR SOFTWARE. That wasn't even release quality. They paid top dollar for a sh!tty gameplay experience riddled with bugs and performance issues just to be the first on the block to say they owned the game.

Heck, graphics cards generally end up better toward the end of a release cycle thanks to minor hardware improvements (board respins, new chip steppings, etc.) and more mature drivers.
 
Of course patience is an option. However, not everyone is patient.

Scalpers exploit those who prioritize instant gratification over a good value. FOMO is a driving influence for many.

Look at all of those super geniuses who pre-purchased Cyberpunk 2077 at full price, which later repeatedly went on sale at discounts of 50% or more. FOR SOFTWARE.

I hope they deeply regret it afterwards. There is no justification for throwing their hard-earned money to the wind.
 
Plenty of German shops sell the 4090 and have it in stock.

Although the price point is €2,500... which is quite a bit too much, even given the 19% VAT.
 
Of course patience is an option. However, not everyone is patient.

Scalpers exploit those who prioritize instant gratification over a good value. FOMO is a driving influence for many.

Look at all of those super geniuses who pre-purchased Cyberpunk 2077 at full price, which later repeatedly went on sale at discounts of 50% or more. FOR SOFTWARE. That wasn't even release quality. They paid top dollar for a sh!tty gameplay experience riddled with bugs and performance issues just to be the first on the block to say they owned the game.

Even on sale, it's still not even a shadow of what was promised, so even buying it at a 50% discount would be giving too much.
At this point, I'll get it when it's like 8 bucks; it's not worth more.
 
Dropping the 4080 16GB to $900 and the 4070 (aka 4080 12GB) to $600 would make the situation slightly better, but I'm pretty sure they won't do it unless pressured by AMD.
 
I hope they deeply regret it afterwards. There is no justification for throwing their hard-earned money to the wind.

Well, it's their money. They can do with it as they please. They have the opportunity to turn down that $7 espresso drink, that $15 10-oz. beer at the ballpark, that $25 bowl of ramen.

When the renamed 4080 12GB card starts selling, it's up to each person to decide whether or not they will pay what is being charged for it, whether it's the Founders Edition at MSRP or some ridiculously priced scalped offering on fleeceBay.

But for sure, there are those who will pay big bucks for scalped product, including some TPU participants. By doing business with scalpers, they encourage scalping to continue.
 
From a resetera thread, cough:

[attached benchmark chart]
 
From a resetera thread, cough:

[attached benchmark chart]

These are just the results from one game at one resolution, so they only give a glimpse of the performance differences. Again, cherry-picking one game benchmark isn't really productive, since most PC gamers don't play just one title.

There's a +45% performance increase between the 3080 and 4080 (16GB) models.

Amusingly, there's also a +45% increase between the 3070 and 4080 (12GB) models.

So the 4080 (12GB) really does look like a 4070 if one were to expect a similar generational uplift in performance.
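
For anyone who wants to check the math on charts like this, here's a quick back-of-the-envelope sketch. The FPS values below are made-up placeholders chosen only to illustrate the ~45% figures, not the chart's actual numbers:

```python
def percent_uplift(new_fps: float, old_fps: float) -> float:
    """Relative performance gain of the newer card over the older one, in percent."""
    return (new_fps / old_fps - 1) * 100

# Hypothetical FPS values for illustration only -- read the real ones off the chart.
fps = {"3070": 40.0, "3080": 55.0, "4080 12GB": 58.0, "4080 16GB": 80.0}

print(f"3080 -> 4080 16GB: +{percent_uplift(fps['4080 16GB'], fps['3080']):.0f}%")  # +45%
print(f"3070 -> 4080 12GB: +{percent_uplift(fps['4080 12GB'], fps['3070']):.0f}%")  # +45%
```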
 
This is just the results from one game at one resolution
No shit, Watson, but it is also quite in line with expectations, given the CU cutdowns.

cherry picking one game
Oh, I didn't know there are other game benchmarks. Share them please.

PS
Eternal
[attached benchmark chart]


Amusingly, there's also a +45% increase between the 3070 and 4080 (12GB) models.
That's a funny way of saying "the 4080 12GB is quite on par with the 3080".
 
Oh, I didn't know there are other game benchmarks. Share them please.

I could share more gaming benchmarks -- as time permits -- after you show mastery of the term STFW.

Hint: there are plenty of other gaming benchmarks on the Internet. Even TPU's graphics card reviews cover multiple games, so start here:


and once you've read all 833 posts, check back at the beginning because invariably there will be more. That should keep you busy for the weekend.

Enjoy!

:):p:D
 
Why on earth is the 4090 at 106 FPS and the 4080 at 55 FPS? That's ~93% faster with only ~70% more CUDA cores. What is this free performance? Is 16GB not enough already?
 
Why on earth is the 4090 at 106 FPS and the 4080 at 55 FPS? That's ~93% faster with only ~70% more CUDA cores. What is this free performance? Is 16GB not enough already?

It's likely not just one factor but a combination of many, including -- but not limited to -- memory bus width, memory clock frequency, memory bandwidth, GPU clock frequency, and other things. Game performance isn't determined by one number on the spec sheet.

Remember that the 4090 has a 384-bit memory bus and the 4080 has 256-bit.

As mentioned repeatedly in many, many threads, NVIDIA bins silicon and earmarks the better dies for its higher-priced products. Their top GPUs end up in data centers.

Very good GPUs end up in their top-tier graphics cards. They also bin VRMs, VRAM, and other silicon. Not all GDDR6X chips are the same, in the same way that not all DDR4 chips perform equally. Not sure if you've noticed that.

All of these slight improvements add up.

There's also a very real possibility that the driver used for these comparisons was optimized for the 4090. After all, that was the first Ada Lovelace card to be released, so NVIDIA engineers undoubtedly prioritized that GPU.

This is yet another example of why one can't look at a single game benchmark at a single display resolution and make conclusive statements. Some games benefit from more raster cores; some can take advantage of RT and ML cores. Other games might favor fast and wide memory transfers, others just a lot of VRAM. Some games rely more on the CPU. And not every card works equally well with all graphics APIs: some cards are better for DX11 games, others for DX12. And game developers sometimes end up writing code that favors a particular architecture, occasionally because development was sponsored by a GPU manufacturer (AMD, NVIDIA, and now Intel).

So in the end, it's more than counting CUDA cores.
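
To put rough numbers on the memory point: theoretical bandwidth is just bus width times per-pin data rate. A minimal sketch, assuming the commonly cited GDDR6X data rates for these cards (treat the rates as assumptions and check the spec sheets):

```python
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

# Bus widths as announced; GDDR6X data rates are assumed from public specs.
for name, bus_bits, rate_gbps in [("4090", 384, 21.0),
                                  ("4080 16GB", 256, 22.4),
                                  ("4080 12GB", 192, 21.0)]:
    print(f"{name}: {bandwidth_gbs(bus_bits, rate_gbps):.0f} GB/s")
```

That works out to roughly 1008, 717, and 504 GB/s respectively, which is part of why the gap between the 4090 and the 4080s can exceed what core counts alone would predict.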
 
I was going to say the exact same thing; what was described makes no sense otherwise. If it was a launch card, maybe, but it doesn't sound like it was. The other thought I had was that older Windows versions had issues when switching between AMD and Nvidia; you had to use a third-party utility to get all the old drivers out or reinstall Windows from scratch. I think that's been fixed since at least Windows 10. I think that issue was a BSOD on boot, but I may be remembering wrong. I know I used to use a third-party utility to clean out old drivers for some reason.
I literally had a GTX 1070 + A2000 in my rig (with gaming and workstation drivers) and then swapped the 1070 for an RX 580, keeping the A2000 in my system. So I had both Adrenalin and RTX Studio drivers installed. I never did an uninstall/reinstall between cards on either swap. They all worked flawlessly together, although I only ran it that way for a few hours to make sure the cards were working.
 
You didn't realize the chart contains 4080, did you...

Oh, I'm pretty sure I've typed "4080" multiple times in my recent comments in this discussion about the resetera chart. In fact, I even noted that both the 4080 16GB and the 4080 12GB were used.

The point is that using a single game benchmark at one resolution to describe _____ graphics card (regardless of make and model) is not a meaningful way of summarizing its capabilities. Different cards exhibit different performance with different software under different operating conditions (not just display resolution).

Again, cherry-picking one graphics benchmark to argue a point just shows pure desperation and a complete disconnect from reality. That benchmark is only valid for that one game under those specific test conditions.

I don't know about you, but I don't play just one game. Most people don't, which is why these single-game benchmarks are nearly meaningless unless they're part of a larger assessment that includes other benchmarks, data points, and situations.

These single-game benchmarks are actually better for assessing how well a game's code was written, by checking for gross anomalies between various cards. Sometimes gaming benchmarks and performance-tuning guides are good for pinning down optimal settings for better performance.

I realize this is a very difficult concept for you to grasp, and undoubtedly someone else can explain it better than I can. I don't write graphics card reviews for a living.
 
Someone explain to me why everyone gets their panties in a twist over a memory bus?
I don't. I got sand in them over a $900 192-bit bus. For that price I want a 256-bit bus................ whether it helps or not. :)
 
I expected a 4070 with 8704 CUDA cores and 16GB. Whatever they name it, it would still let me down: 42 FPS while the 4090 does 106. Whatever this is now, it should be no more than $599, and $799 for the 4080 16GB.
Nvidia isn't undoing how bad this is. Every time after mining ends, they somehow put up the most ridiculous prices. Let's hope AMD puts a dent in their market share for good.
 