
NVIDIA Cancels GeForce RTX 4080 12GB, To Relaunch it With a Different Name

Joined
Oct 27, 2020
Messages
791 (0.53/day)
Anyone with access to DRAMeXchange (I don't have it) can see the GDDR6 spot price differences and get an indication (just an indication!).
The spot price for 8Gbit GDDR6 is even lower than GDDR5, and there is a chance that 16Gbit GDDR6 is only around 1.5X the 8Gbit price (and Nvidia probably buys below the spot session lows...).
So, for example, the actual difference between eight 8Gbit GDDR6 ICs (256-bit bus case, 8GB total) and six 16Gbit GDDR6 ICs (192-bit bus case, 12GB total) could be as little as $5 in total, depending on the 16Gbit GDDR6 IC price.
There is a reason the Arc A770 8GB is only $20 cheaper than the 16GB version, for example, and this could be it (8 x $5 for 8Gbit ICs vs. 8 x $7.5 for 16Gbit ICs). (I'm not buying Intel, I just want to push the 16GB version.)
Anyone with access can enlighten us!


(attached image: IMG_20221014_232823.jpg)
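For reference, here is a quick back-of-the-envelope sketch of that VRAM cost comparison. The $5 and $7.5 per-IC prices are the illustrative guesses from the post above, not actual DRAMeXchange quotes:

```python
# Rough VRAM bill-of-materials comparison, using the illustrative prices
# from the post above (not actual DRAMeXchange quotes).
PRICE_8GBIT = 5.0    # assumed price per 8Gbit GDDR6 IC, USD
PRICE_16GBIT = 7.5   # assumed price per 16Gbit GDDR6 IC, USD

cost_256bit_8gb = 8 * PRICE_8GBIT    # eight 8Gbit ICs, 256-bit bus -> 8GB
cost_192bit_12gb = 6 * PRICE_16GBIT  # six 16Gbit ICs, 192-bit bus -> 12GB
cost_256bit_16gb = 8 * PRICE_16GBIT  # eight 16Gbit ICs, 256-bit bus -> 16GB (A770-style)

print(f"256-bit /  8GB: ${cost_256bit_8gb:.2f}")   # $40.00
print(f"192-bit / 12GB: ${cost_192bit_12gb:.2f}")  # $45.00 -> only ~$5 more than 8GB
print(f"256-bit / 16GB: ${cost_256bit_16gb:.2f}")  # $60.00 -> ~$20 more, like the A770 gap
```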
 
Joined
Nov 26, 2021
Messages
1,642 (1.51/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Names are PR. Consumers need to inform themselves about the specs. I will not defend people who buy their GPUs based on naming or nicely colored packaging.
The delay is an estimation. Fill in whatever timespan you like.
Estimation means you have no source. Unless AMD has managed to make a properly scaling multi-GPU chiplet card, they are unlikely to have anything competitive with the 4090, so launching around the 4080 would be fine. I would expect the 7900XT to beat the 4080 easily in rasterized games; raytracing performance is unlikely to be as good as Nvidia's.
 
Joined
Aug 21, 2015
Messages
1,723 (0.51/day)
Location
North Dakota
System Name Office
Processor Ryzen 5600G
Motherboard ASUS B450M-A II
Cooling be quiet! Shadow Rock LP
Memory 16GB Patriot Viper Steel DDR4-3200
Video Card(s) Gigabyte RX 5600 XT
Storage PNY CS1030 250GB, Crucial MX500 2TB
Display(s) Dell S2719DGF
Case Fractal Define 7 Compact
Power Supply EVGA 550 G3
Mouse Logitech M705 Marathon
Keyboard Logitech G410
Software Windows 10 Pro 22H2
Names have meaning. For a long time, Nvidia's top tier GPU was the x80. It changed to x80 Ti when AMD surprised them with Hawaii, and remained so until Ampere. I would also like to know your source regarding a delay in RDNA3.

Not exactly. We've had the GTX 295, 590 and 690. Then x90 took a long nap until Ampere.
 

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
12,692 (2.91/day)
Location
Jyväskylä, Finland
System Name 4K-gaming
Processor AMD Ryzen 7 5800X @ PBO +200 -20CO
Motherboard Asus ROG Crosshair VII Hero
Cooling Arctic Freezer 50, EKWB Vector TUF
Memory 32GB Kingston HyperX Fury DDR4-3466
Video Card(s) Asus GeForce RTX 3080 TUF OC 10GB
Storage 3.3TB of SSDs + 3TB USB3.0 HDDs
Display(s) 27" 4K120 IPS + 32" 4K60 IPS + 24" 1080p60
Case Corsair 4000D Airflow White
Audio Device(s) Asus TUF H3 Wireless / Corsair HS35
Power Supply EVGA Supernova G2 750W
Mouse Logitech MX518 + Asus ROG Strix Edge Nordic
Keyboard Roccat Vulcan 121 AIMO
VR HMD Oculus Rift CV1
Software Windows 11 Pro
Benchmark Scores It runs Crysis
More like, they got caught in their own stupidity and now are acting desperate to fix things:

- Pretending there is a scalper/miner shortage on 4090
- "Loyalty program" for buying FE 4090, only for current nvidia owners
- Killing the LHR from 30 series
- And now, cancelling the dumb 4080 12GB.

More "unusual moves" to be expected as Jensen continues to wake up from his hubris enduced coma.
Ah, there are plot twists like that. I have to admit that I haven't even been following the news about these new cards much, as their pricing makes them so uninteresting.

But I wouldn't call this a cancellation; it's rather a renaming to the model it should have been from the beginning.
 
Joined
Jun 5, 2018
Messages
237 (0.10/day)
Ah, there are plot twists like that. I have to admit that I haven't even been following the news about these new cards much, as their pricing makes them so uninteresting.

But I wouldn't call this a cancellation; it's rather a renaming to the model it should have been from the beginning.

Well, it's more than a renaming, because there is no way they will just call it a 4070 but keep the same price. That would reach a new level of rejection.
 
Joined
Sep 13, 2021
Messages
86 (0.07/day)
Wrong.

Per NV's own research (it was in their slides), they shockingly discovered that customers stick with a series (e.g. 970 => 1070 => 2070) rather than sticking with a price bracket.

4080 losing "80" would be a major hit.
From my perspective, it doesn't make a difference to me or to anyone who is informed beforehand. If you want to talk about sales strategies and uninformed consumers, you are right. Whether nVidia takes a major hit depends more on AMD than on the AD104 naming.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.66/day)
Location
Ex-usa | slava the trolls
Estimation means you have no source. Unless AMD has managed to make a properly scaling multi-GPU chiplet card, they are unlikely to have anything competitive with the 4090, so launching around the 4080 would be fine. I would expect the 7900XT to beat the 4080 easily in rasterized games; raytracing performance is unlikely to be as good as Nvidia's.

Why? You seem to think the RTX 4090 is out of reach; why exactly?

The RTX 4090 is only 53% ahead of the RX 6950 XT.

(attached chart: 1665780745605.png)


The performance jump from the previous generation's top dog, the RX 5700 XT, to the RX 6900 XT was 101% in a single move.

(attached chart: 1665780802255.png)


AMD can do it.
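As a rough sketch of the arithmetic behind that argument (the 53% and 101% deltas are the chart figures cited above, treated as exact for the sake of the calculation):

```python
# If the RTX 4090 is 53% faster than the RX 6950 XT, AMD needs roughly a
# +53% generational jump from the 6950 XT to match it. The previous
# top-to-top jump (RX 5700 XT -> RX 6900 XT) was ~101%, per the figures above.
gap_4090_over_6950xt = 0.53   # 4090 ~= 1.53x the 6950 XT
last_gen_jump = 1.01          # 6900 XT ~= 2.01x the 5700 XT

print(f"Jump needed to match the 4090: {1 + gap_4090_over_6950xt:.2f}x the 6950 XT")
print(f"Jump achieved last generation: {1 + last_gen_jump:.2f}x the 5700 XT")
```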
 

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
12,692 (2.91/day)
Location
Jyväskylä, Finland
System Name 4K-gaming
Processor AMD Ryzen 7 5800X @ PBO +200 -20CO
Motherboard Asus ROG Crosshair VII Hero
Cooling Arctic Freezer 50, EKWB Vector TUF
Memory 32GB Kingston HyperX Fury DDR4-3466
Video Card(s) Asus GeForce RTX 3080 TUF OC 10GB
Storage 3.3TB of SSDs + 3TB USB3.0 HDDs
Display(s) 27" 4K120 IPS + 32" 4K60 IPS + 24" 1080p60
Case Corsair 4000D Airflow White
Audio Device(s) Asus TUF H3 Wireless / Corsair HS35
Power Supply EVGA Supernova G2 750W
Mouse Logitech MX518 + Asus ROG Strix Edge Nordic
Keyboard Roccat Vulcan 121 AIMO
VR HMD Oculus Rift CV1
Software Windows 11 Pro
Benchmark Scores It runs Crysis
Well, it's more than a renaming, because there is no way they will just call it a 4070 but keep the same price. That would reach a new level of rejection.
Just wondering what will happen to the cards that have already been made; will manufacturers BIOS-flash those with a 4070-named BIOS?
 
Joined
Jun 21, 2021
Messages
3,121 (2.50/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
Estimation means you have no source. Unless AMD has managed to make a properly scaling multi-GPU chiplet card, they are unlikely to have anything competitive with the 4090, so launching around the 4080 would be fine. I would expect the 7900XT to beat the 4080 easily in rasterized games; raytracing performance is unlikely to be as good as Nvidia's.

AMD knows for sure how the 7900XT stacks up to the 4090 in pure rasterization. The 7900XT launch is placed closer to the 4080 launch because it will be more comparable in performance to that card. It also buys AMD some time to improve their driver software. The hardware is already finished, probably sitting on pallets in some warehouse's finished goods section.

Radeon RT cores will be weaker than GeForce RT cores and there's no indication that AMD will dethrone NVIDIA any time soon in machine learning either.

And one key battleground is the developer environment. NVIDIA stands very tall here.
 
Joined
Nov 26, 2021
Messages
1,642 (1.51/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Why? You seem to think the RTX 4090 is out of reach; why exactly?

The RTX 4090 is only 53% ahead of the RX 6950 XT.

View attachment 265527

The performance jump from the previous generation's top dog, the RX 5700 XT, to the RX 6900 XT was 101% in a single move.

View attachment 265528

AMD can do it.
I think they can do it, but all indications are that they are using chiplets for the larger GPUs. Thus they will take a power hit compared to a monolithic GPU as on-die interconnects will now be inter-chip.

AMD knows for sure how the 7900XT stacks up to the 4090 in pure rasterization. The 7900XT launch is placed closer to the 4080 launch because it will be more comparable in performance to that card. It also buys AMD some time to improve their driver software. The hardware is already finished, probably sitting on pallets in some warehouse's finished goods section.

Radeon RT cores will be weaker than GeForce RT cores and there's no indication that AMD will dethrone NVIDIA any time soon in machine learning either.

And one key battleground is the developer environment. NVIDIA stands very tall here.
The 4080 isn't much faster than a 3090 Ti; again, Nvidia's own benchmarks show a 10 to 25% improvement over the 3090 Ti. AMD has claimed a performance-per-watt increase of over 50%. A 7900XT that is 50% faster than a 6950 XT would be comfortably 35% faster than a 3090 Ti. On the other hand, your points about RT cores, machine learning, and developer relations are all valid.
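A minimal sketch of that chain of estimates. The 0.90 factor for the 6950 XT versus the 3090 Ti is an assumption for illustration, and AMD's +50% claim is taken at face value:

```python
# Chain of estimates from the post above. The 0.90 factor is an assumption
# (6950 XT roughly 10% behind a 3090 Ti); AMD's claim is taken at face value.
r_6950xt_vs_3090ti = 0.90        # assumed: 6950 XT ~= 90% of a 3090 Ti
amd_perf_per_watt_claim = 1.50   # AMD's claimed >50% uplift, applied at similar power
nvidia_4080_vs_3090ti = (1.10, 1.25)  # Nvidia's own 10-25% figures

r_7900xt_vs_3090ti = r_6950xt_vs_3090ti * amd_perf_per_watt_claim
print(f"Hypothetical 7900XT vs 3090 Ti: {r_7900xt_vs_3090ti:.2f}x (~35% faster)")
print(f"Claimed 4080 vs 3090 Ti: {nvidia_4080_vs_3090ti[0]:.2f}x to {nvidia_4080_vs_3090ti[1]:.2f}x")
```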
 
Joined
Jun 21, 2021
Messages
3,121 (2.50/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
Just wondering what will happen to the cards that have already been made; will manufacturers BIOS-flash those with a 4070-named BIOS?

Yes. They are probably sitting in some warehouse in bulk packaging anyway, waiting for final firmware. Even if a few sample units were sent out, they will all likely get reflashed with new code that identifies them as whatever model number NVIDIA decides.

There are product stickers on the PCB with the wrong SKU and some AIB partners might have put the model number on the cooler.

All of the packaging will have to be scrapped of course.

Remember that Apple's iOS software is RTM'ed shortly before the iPhone launch, maybe 10-14 days to give the manufacturer time to flash units for channel distribution and bricks-and-mortar stores.
 
Joined
Dec 12, 2016
Messages
1,819 (0.63/day)
AMD needs to make their drivers more stable. I tried an RX 580 and had driver issues; it was BSODing even while just browsing. I moved to an RTX 2070 and never looked back.

I don't know if they have improved drastically over time; tbh, I am also looking seriously at RDNA3, in addition to the 4080, for my next upgrade... but if picking the GPU with the most stable drivers is gullible, then I am gullible. There is nothing instinctive about picking Nvidia... I've had two AMD GPUs, an ATI 4870 and an RX 580, and both were driver hell; at some point you get tired of DDU, troubleshooting, etc.
There have been no significant tech reviewer comments about AMD driver failures. The only reason anyone thinks AMD drivers are bad is anonymous internet posts like this. Nothing you said can be verified, but it will still make someone casually reading these forums think twice, and it continues to perpetuate this myth.
 
Joined
Oct 15, 2010
Messages
951 (0.18/day)
System Name Little Boy / New Guy
Processor AMD Ryzen 9 5900X / Intel Core I5 10400F
Motherboard Asrock X470 Taichi Ultimate / Asus H410M Prime
Cooling ARCTIC Liquid Freezer II 280 A-RGB / ARCTIC Freezer 34 eSports DUO
Memory TeamGroup Zeus 2x16GB 3200Mhz CL16 / Teamgroup 1x16GB 3000Mhz CL18
Video Card(s) Asrock Phantom RX 6800 XT 16GB / Asus RTX 3060 Ti 8GB DUAL Mini V2
Storage Patriot Viper VPN100 Nvme 1TB / OCZ Vertex 4 256GB Sata / Ultrastar 2TB / IronWolf 4TB / WD Red 8TB
Display(s) Compumax MF32C 144Hz QHD / ViewSonic OMNI 27 144Hz QHD
Case Phanteks Eclipse P400A / Montech X3 Mesh
Power Supply Aresgame 850W 80+ Gold / Aerocool 850W Plus bronze
Mouse Gigabyte Force M7 Thor
Keyboard Gigabyte Aivia K8100
Software Windows 10 Pro 64 Bits
Okay, maybe it doesn't, it's still within the same performance bracket. Ofc the 900 USD price is insane if it doesn't beat the 3090... :)
Have people gone mad? How is $900 justifiable for a 4070? The freaking 3070 beat the 2080 Ti, and for less than half the price! That would translate to less than half of the $1,500 for the 3090, i.e. $650-700, and even that is pushing the price.
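A rough sketch of that pricing analogy, using the well-known launch MSRPs (3070 at $499, 2080 Ti at $999/$1,199 FE, 3090 at $1,499); the result is just proportional reasoning, not a prediction:

```python
# The 3070 launched at $499, less than half of the 2080 Ti's price
# ($999 MSRP / $1,199 Founders Edition). Applying the same
# "less than half of the previous flagship" logic to the 3090's $1,499:
price_3090 = 1499
half_of_3090 = price_3090 / 2        # ~$750 upper bound
argued_range = (650, 700)            # the range argued for in the post

print(f"Half of the 3090's launch price: ${half_of_3090:.0f}")
print(f"Price range argued above for a '4070': ${argued_range[0]}-${argued_range[1]}")
```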
 
Joined
Sep 13, 2021
Messages
86 (0.07/day)
Estimation means you have no source.
Every forecast is an estimate. My argument was that nVidia can cash in until AMD's new GPUs are physically available.
Unless AMD has managed to make a properly scaling multi-GPU chiplet card, they are unlikely to have anything competitive with the 4090, so launching around the 4080 would be fine. I would expect the 7900XT to beat the 4080 easily in rasterized games; raytracing performance is unlikely to be as good as Nvidia's.
I agree. Market price depends on demand and on competitive AMD products. As MSRP loses its meaning, we may have to wait for street prices once the RX 7000 series is available. Soon we will see whether that pressures nVidia to drop AD104 and AD103 prices. I hope so.
 
Joined
Nov 26, 2021
Messages
1,642 (1.51/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Every forecast is an estimate. My argument was that nVidia can cash in until AMD's new GPUs are physically available.
I agree. Market price depends on demand and on competitive AMD products. As MSRP loses its meaning, we may have to wait for street prices once the RX 7000 series is available. Soon we will see whether that pressures nVidia to drop AD104 and AD103 prices. I hope so.
Of course, they'll cash in both before and after the availability of AMD's new GPUs. The uninformed masses will continue to buy Nvidia even when AMD is better. That is why the 3060 has almost the same street price as the far superior 6700 XT in Canada.
 
Joined
Jul 9, 2015
Messages
3,413 (1.00/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
The 7900XT launch is placed closer to the 4080 launch because it will be more comparable in performance to that card

AMD also wants to get into the business of next-gen cards being sold as yet another (more expensive) higher tier.

What year is it, seriously...

all indications are that they are using chiplets for the larger GPUs.
Chiplets are how the underdog trounced Intel.

I doubt Frau Su would do that if it meant losing the flagship competition outright.
 
Joined
Oct 27, 2020
Messages
791 (0.53/day)
Pre-binned Navi21 XTXH could hit 3GHz under the right conditions.
Supposedly Navi31 under the same conditions can hit close to 4GHz; let's say 3.9GHz, which is a +30% speed improvement.
The most power-efficient Navi21 GPU in most cases/resolutions was the RX 6800.
That had a 2105MHz boost; add 30% on top of that (already high, since 15% is the official TSMC figure for the node difference) and maybe we get a ~2735MHz boost for the most efficient Navi31-based GPU (the one they claim has +50% more performance/W?).
With such a low frequency, and if we are talking about a 300W to 335W Navi31-based model, the performance potential could be uneventful (relative to what Nvidia can achieve).
Add to that the pessimistic scenario that the +50% performance/W claim was made using the upcoming FSR 3.0, where RDNA3 may have an advantage, and the conclusions regarding performance potential are even more pessimistic.
Anyway, the above is probably bullshit; I don't believe it, I just examined the possibilities...
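A small sketch that just restates the frequency arithmetic above; the 3GHz, 3.9GHz, and +30% figures are the post's speculation, not confirmed specs:

```python
# Speculative clock scaling restated from the post above; none of these
# numbers are confirmed specifications.
navi21_xtxh_max_ghz = 3.0    # pre-binned Navi21 XTXH under ideal conditions
navi31_rumored_ghz = 3.9     # rumored Navi31 under the same conditions
scaling = navi31_rumored_ghz / navi21_xtxh_max_ghz   # ~1.30, i.e. +30%

rx6800_boost_mhz = 2105      # official RX 6800 boost clock
navi31_efficient_boost = rx6800_boost_mhz * scaling  # ~2736 MHz, i.e. the ~2735MHz above
print(f"Implied frequency scaling: +{(scaling - 1) * 100:.0f}%")
print(f"Implied boost for an efficiency-tuned Navi31 SKU: {navi31_efficient_boost:.0f} MHz")
```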
 
Joined
Nov 26, 2021
Messages
1,642 (1.51/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
AMD also wants to get into the business of next-gen cards being sold as yet another (more expensive) higher tier.


What year is it, seriously...


Chiplets are how the underdog trounced Intel.

I doubt Frau Su would do that if it meant losing the flagship competition outright.
No one doubts AMD's engineering chops; connecting a multi-die GPU would require a massive off-chip interconnect, but it can be done. If they could pull it off, then such a multi-die GPU, with proper scaling, would easily surpass the 4090 in rasterization, but we will see in November.
 
Joined
Jun 21, 2021
Messages
3,121 (2.50/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
AMD also wants to get into the business of next-gen cards being sold as yet another (more expensive) higher tier.

Both NVIDIA and AMD have excess inventory of previous generation cards in the channel as well as unused GPU chips.

As far as I can tell, NVIDIA doesn't have any problem right now selling 4090 cards. The best binned GPU chips will end up in data centers anyhow.

It's really the low to mid-range graphics cards (Ampere and RDNA2) that are the major source of headaches for both companies and have contributed to these weird marketing conundrums.

What year is it, seriously...

A very tricky one for AMD, Intel, NVIDIA, and others.

Chiplets are how the underdog trounced Intel.

Well, the hardware for the 7900XT is already done. It's not like AMD can make it a multi-chiplet GPU in reaction to the 4090. These sorts of architectural decisions need to be made years in advance.

It's important to point out that Intel gave AMD a chance to catch up by failing to transition to a smaller process node in a timely manner. It's not just the chiplet design that helped; TSMC gets a lot of the credit for the success of Ryzen 2, Ryzen 3, and now Ryzen 4.

Both AMD and NVIDIA are using TSMC's foundries for this new generation's family of GPUs. NVIDIA probably gave AMD a little help by using Samsung's foundries for the Ampere generation.
 
Joined
Jul 10, 2020
Messages
383 (0.24/day)
System Name Spaceheater
Processor Ryzen 9 5950X H2O
Motherboard asus x570-e ROG Strix
Cooling Triple radiator + vardar fans D5 for cpu and external MO-RA 420 NF-A20 1200 rpm dual D5 pump for GPU
Memory 4x8 GB G.Skill Trident Z Neo F4-3600C16Q-32GTZN
Video Card(s) Liquid devil 7900 XTX with PTM7950 + AMD 6900 XT in second machine
Storage 2x 980 pro 2tb 2x 860 EVO 1tb raid0 2x mx500 2 tb raid0 2x WD40EZRZ 4 tb raid0
Display(s) LG 38WN95C-W 3840x1600 144hz | 11,9 inch system info display
Case Lian Li PC-O11 Dynamic XL (ROG Certified) White
Audio Device(s) z5500 + steelseries Arctis Nova Pro wireless
Power Supply be quiet! Dark Power Pro 12 1200W
Mouse Logitech G502 X
Keyboard Keychron Q1
Software windows 11 23H2
I bet they changed their mind because of a combination of things: the community realizing the RTX 4080 12GB cards are just old RTX 30 cards and Nvidia not wanting people to find out, and then people finding out that DLSS actually works fine on RTX 30 and RTX 20 cards.
I wish Nvidia treated their own customers properly with new features. Imagine buying a GTX 1080 while all the old cards get cool new features, but then they release the RTX 20 series and can't even release anything new for the GTX 1080, and you have to rely on AMD's FSR instead.
 
Joined
Oct 8, 2015
Messages
769 (0.23/day)
Location
Earth's Troposphere
System Name 3 "rigs"-gaming/spare pc/cruncher
Processor R7-5800X3D/i7-7700K/R9-7950X
Motherboard Asus ROG Crosshair VI Extreme/Asus Ranger Z170/Asus ROG Crosshair X670E-GENE
Cooling Bitspower monoblock ,custom open loop,both passive and active/air tower cooler/air tower cooler
Memory 32GB DDR4/32GB DDR4/64GB DDR5
Video Card(s) Gigabyte RX6900XT Alphacooled/AMD RX5700XT 50th Aniv./SOC(onboard)
Storage mix of sata ssds/m.2 ssds/mix of sata ssds+an m.2 ssd
Display(s) Dell UltraSharp U2410 , HP 24x
Case mb box/Silverstone Raven RV-05/CoolerMaster Q300L
Audio Device(s) onboard/onboard/onboard
Power Supply 3 Seasonics, a Delta Electronics, a Fractal Design
Mouse various/various/various
Keyboard various wired and wireless
VR HMD -
Software W10.someting or another,all 3
Great, now what to do with my $900.... :banghead:
Maybe stop $ feeding the troll?

Maybe this was a PR stunt. Now they can say they listen to their customers’ concerns and do something about it!
Preemptive, perceived company-image damage control?
Don't worry, they will announce the new 4070 Ti 12 gig version tomorrow.
Or, better yet: a Super Ti MX-Q?
They should have simply "re-launched" the 4080 16GB as the 4085, all problems solved. View attachment 265499
Long gone are the days of "GF Fermi", when one could unlock more shader cores with a BIOS mod.

I am 3 pages in so far and calling it a night.
Is it just me "sensing a pattern" with the last couple of graphics card launches from the "big green camp"?
I mean, the first time around (the RTX 30 series) could be seen as a one-off, but it happening all over again can shift one towards another point of view, such as that the "green camp" has turned into a troll (a price-scalping troll).

Not that I couldn't have forked over the buck-a-roos for a 3000 series card at launch (-90/-80/-70), though it would have been down to the last cent back then. It's a similar but different situation now, kind of all over again; my savings are okay after not having to "burn" through them post-surgery. But nope, nVidia's RTX 4090 is out of stock or way north of $2K. I am at peace with skipping this generation of theirs.
 

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
12,692 (2.91/day)
Location
Jyväskylä, Finland
System Name 4K-gaming
Processor AMD Ryzen 7 5800X @ PBO +200 -20CO
Motherboard Asus ROG Crosshair VII Hero
Cooling Arctic Freezer 50, EKWB Vector TUF
Memory 32GB Kingston HyperX Fury DDR4-3466
Video Card(s) Asus GeForce RTX 3080 TUF OC 10GB
Storage 3.3TB of SSDs + 3TB USB3.0 HDDs
Display(s) 27" 4K120 IPS + 32" 4K60 IPS + 24" 1080p60
Case Corsair 4000D Airflow White
Audio Device(s) Asus TUF H3 Wireless / Corsair HS35
Power Supply EVGA Supernova G2 750W
Mouse Logitech MX518 + Asus ROG Strix Edge Nordic
Keyboard Roccat Vulcan 121 AIMO
VR HMD Oculus Rift CV1
Software Windows 11 Pro
Benchmark Scores It runs Crysis
Yes. They are probably sitting in some warehouse in bulk packaging anyway, waiting for final firmware. Even if a few sample units were sent out, they will all likely get reflashed with new code that identifies them as whatever model number NVIDIA decides.

There are product stickers on the PCB with the wrong SKU and some AIB partners might have put the model number on the cooler.

All of the packaging will have to be scrapped of course.

Remember that Apple's iOS software is RTM'ed shortly before the iPhone launch, maybe 10-14 days to give the manufacturer time to flash units for channel distribution and bricks-and-mortar stores.
Relabeling packages isn't too uncommon, though. I remember the RX 480 4GB, for example; they used the same boxes as the 8GB versions, just with a 4GB sticker over it.
 