
NVIDIA Project Beyond GTC Keynote Address: Expect the Expected (RTX 4090)

Joined
Sep 10, 2018
Messages
6,926 (3.05/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Yeah, I think that's spot on. And I also think that if Nvidia were to get their way with this, it would be the end of PC gaming - barring someone else stepping in to fill their shoes. PC gaming as a hyper-expensive niche just isn't sustainable as a business. If anything, the Steam Deck has demonstrated that the future of gaming might lie in the exact opposite direction, and that in a lot of ways we might be reaching a point of "good enough".

I also agree that this situation presents a massive opportunity for AMD. Regardless of where RDNA3 peaks in terms of absolute performance, if they flesh out the $300-700 segment with high-performing, good-value options relatively quickly, they have an unprecedented opportunity to overtake Nvidia - as this pitch from Nvidia (as well as their earnings call) confirms that they've got more Ampere stock than they know what to do with, and that they don't want to stomach the cost of cutting prices unless they really have to. If RDNA3 actually delivers its promised 50% perf/W increase, and die sizes and MCM packaging costs allow for somewhat competitive pricing, this could be the generation where AMD makes a significant move up from their perennial ~20% market share. That's my hope - but knowing AMD and their recent preference for high ASPs and following in Nvidia's pricing footsteps, I'm not that hopeful. Fingers crossed, though.

I think AMD is more focused on CPU than GPU; all their products are made on the same TSMC node, unlike Nvidia, which gets a special node just for its GPUs. I think AMD is happy with 20-25% of the market and just making a profit, on top of being in the PS5 and Series X/S.

That being said, I still expect the 7900 XT to be $1,200 or less and the 7800 XT to be $800 or less. I just don't think AMD cares, as long as it's $50-100 cheaper than the comparable Nvidia SKU, even if it's much cheaper to make. I also wouldn't be surprised if the 7900 XT were $1,500, though... I doubt anything lower than these two cards will launch anytime soon, with maybe a 7700 XT early next year; to me, that will be the more interesting card in terms of how they price it.

Another thing, and this goes for Nvidia as well: TSMC is really the one dictating prices. Not sure if it's true, but the new Ada GPUs are supposedly much more costly to make than Ampere. I really would like to see a breakdown of component costs, though; I know SMDs are getting more expensive as well.
 
Joined
Aug 21, 2015
Messages
1,725 (0.51/day)
Location
North Dakota
System Name Office
Processor Ryzen 5600G
Motherboard ASUS B450M-A II
Cooling be quiet! Shadow Rock LP
Memory 16GB Patriot Viper Steel DDR4-3200
Video Card(s) Gigabyte RX 5600 XT
Storage PNY CS1030 250GB, Crucial MX500 2TB
Display(s) Dell S2719DGF
Case Fractal Define 7 Compact
Power Supply EVGA 550 G3
Mouse Logitech M705 Marathon
Keyboard Logitech G410
Software Windows 10 Pro 22H2
Yeah, I think that's spot on. And I also think that if Nvidia were to get their way with this, it would be the end of PC gaming - barring someone else stepping in to fill their shoes. PC gaming as a hyper-expensive niche just isn't sustainable as a business. If anything, the Steam Deck has demonstrated that the future of gaming might lie in the exact opposite direction, and that in a lot of ways we might be reaching a point of "good enough".

I also agree that this situation presents a massive opportunity for AMD. Regardless of where RDNA3 peaks in terms of absolute performance, if they flesh out the $300-700 segment with high-performing, good-value options relatively quickly, they have an unprecedented opportunity to overtake Nvidia - as this pitch from Nvidia (as well as their earnings call) confirms that they've got more Ampere stock than they know what to do with, and that they don't want to stomach the cost of cutting prices unless they really have to. If RDNA3 actually delivers its promised 50% perf/W increase, and die sizes and MCM packaging costs allow for somewhat competitive pricing, this could be the generation where AMD makes a significant move up from their perennial ~20% market share. That's my hope - but knowing AMD and their recent preference for high ASPs and following in Nvidia's pricing footsteps, I'm not that hopeful. Fingers crossed, though.


This is absolutely true. There's also the simple fact of GPUs largely having been good enough for quite a lot of things for quite a while. We're no longer seeing anyone need a GPU upgrade after even 3 years - unless you're an incorrigible snob, at least. Lots of factors play into this, from slower growth in graphical requirements, to the massive presence of AA/lower-budget/indie games in the current market, to open-source upscaling tech, to lower resolutions in many cases still being fine in terms of graphical fidelity. It could absolutely be that Nvidia is preparing for a future of (much) lower sales, but I don't think they can do that by pushing for $1500 GPUs becoming even remotely normal - the number of people who can afford those isn't going to increase anytime soon. Diversifying into other markets is smart, though, and AMD also needs to do the same (not that they aren't already). But the impending (or already complete) death of the 1-2 year GPU upgrade cycle won't be solved by higher prices and more premium products.

I think it's less about trying to position $1500 GPUs as normal, and more about using the halo effect to normalize $500-700 cards as midrange. $350 was upper midrange not that long ago.
 
Joined
Oct 15, 2010
Messages
951 (0.18/day)
System Name Little Boy / New Guy
Processor AMD Ryzen 9 5900X / Intel Core I5 10400F
Motherboard Asrock X470 Taichi Ultimate / Asus H410M Prime
Cooling ARCTIC Liquid Freezer II 280 A-RGB / ARCTIC Freezer 34 eSports DUO
Memory TeamGroup Zeus 2x16GB 3200Mhz CL16 / Teamgroup 1x16GB 3000Mhz CL18
Video Card(s) Asrock Phantom RX 6800 XT 16GB / Asus RTX 3060 Ti 8GB DUAL Mini V2
Storage Patriot Viper VPN100 Nvme 1TB / OCZ Vertex 4 256GB Sata / Ultrastar 2TB / IronWolf 4TB / WD Red 8TB
Display(s) Compumax MF32C 144Hz QHD / ViewSonic OMNI 27 144Hz QHD
Case Phanteks Eclipse P400A / Montech X3 Mesh
Power Supply Aresgame 850W 80+ Gold / Aerocool 850W Plus bronze
Mouse Gigabyte Force M7 Thor
Keyboard Gigabyte Aivia K8100
Software Windows 10 Pro 64 Bits
If that's what you consider fair pricing logic, then next GPU gen will be $1,499 for the 5080 because it's faster.

Finally, someone with a brain.
/rant
I, for the life of me, don't understand all these people claiming these prices are normal, just because it's faster! IT'S FUCKING SUPPOSED TO BE FASTER; it's a new gen, for God's sake. How on earth is it justifiable that a GPU that launched at $700 is now launching at a freaking $1,100?!

So the fucking next-gen normal price is $2,000 because it's 2x faster than the 4080?!
rant/
 
Joined
Dec 14, 2011
Messages
1,044 (0.22/day)
Location
South-Africa
Processor AMD Ryzen 9 5900X
Motherboard ASUS ROG STRIX B550-F GAMING (WI-FI)
Cooling Corsair iCUE H115i Elite Capellix 280mm
Memory 32GB G.Skill DDR4 3600Mhz CL18
Video Card(s) ASUS GTX 1650 TUF
Storage Sabrent Rocket 1TB M.2
Display(s) Dell S3220DGF
Case Corsair iCUE 4000X
Audio Device(s) ASUS Xonar D2X
Power Supply Corsair AX760 Platinum
Mouse Razer DeathAdder V2 - Wireless
Keyboard Redragon K618 RGB PRO
Software Microsoft Windows 11 - Enterprise (64-bit)
You would think that, after the record-high profits nVidia reaped, they would cut gamers some slack this time around. But it looks like they are just rolling on ahead with giving middle fingers: an RTX 4080 that should be a 4070... and the prices... LOL! GTFO, nGreedia!

Don't buy their stuff unless you NEED a replacement; time to teach them a lesson they won't soon forget.
 
Joined
Aug 25, 2021
Messages
1,170 (0.98/day)
Finally, someone with a brain.
/rant
I, for the life of me, don't understand all these people claiming these prices are normal, just because it's faster! IT'S FUCKING SUPPOSED TO BE FASTER; it's a new gen, for God's sake. How on earth is it justifiable that a GPU that launched at $700 is now launching at a freaking $1,100?!

So the fucking next-gen normal price is $2,000 because it's 2x faster than the 4080?!
rant/
Calm down. It's not normal. If high-end GPU enthusiasts are happy to be milked, allow them the pleasure. It's not a mining crisis anymore, so those expensive cards will have a harder time selling when you can buy a used 3090 for under $700 on eBay.
 
Joined
Jan 20, 2019
Messages
1,560 (0.73/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
I think it's less about trying to position $1500 GPUs as normal, and more about using the halo effect to normalize $500-700 cards as midrange. $350 was upper midrange not that long ago.

I had the same feeling back in 2017, seeing the increase in price for the 1080 Ti, which many (like myself) were induced to invest in. For me it was a tough decision at the time, as my earlier purchases of previous-gen hardware at the GPU level hardly shot north of £350-£400. To justify forking out £600 or so for a 1080 Ti, I dropped the custom liquid cooling route which was pre-planned for both the GPU and CPU... hence I convinced myself it was a worthwhile splurge paired up with a less expensive AIO. Even back then it was apparent that if gamers consciously splurged £600-£750 for admission to the top performance territory (or close behind it), the end result was inevitable: perpetual, immense gen-to-gen price increases, or in more subtle terms, "taking the p-i-s-s". What I didn't anticipate was how far the daylight robbery would stretch, and to what extent the influence would carry over to mid-performance cards. I can't see a 4070/4060 being reasonably priced by any stretch of the imagination, and even worse, I think we're losing any basis for defining what is reasonable, as we're long past sensible/logical price moderation and a large fraction of the added cost already appears baked in. Worst of all, I felt this way at the launch of the 1080 Ti... the current offerings, for me, are just a bad joke gone right for NVIDIA, and soon to follow, AMD.

Anyway, I've made my mind up: I'm not spending a dime over £800 on my next upgrade, which is already an extravagant ask. I'll wait for RDNA3 to drop, too. Whatever becomes available at this price range, if it delivers a sizeable performance uplift over my current graphics card (2080 Ti), I'm pulling the trigger. What I won't be doing is picking up a high-refresh-rate 4K display to replace my current 1440p one, which by now (5 years on) I thought would be economically feasible. NVIDIA, thanks!
 
Joined
Sep 13, 2021
Messages
86 (0.07/day)
I have to correct something: the 4090 will not support NVLink. The 4090 will be more of a gaming card than the 3090 was. Interesting. Now there is room for a new RTX Titan around $2,000 with NVLink. It's also possible nVidia wants to stop cannibalizing Quadro clients. It seems the 4090 is mainly for 4K/RT enthusiasts. Didn't expect it.
 

oneils

New Member
Joined
Sep 21, 2022
Messages
2 (0.00/day)
Hopefully we can have DLSS 3 without RT (RT off), so a 4050/4060 will be just fine for 4K at 60 FPS, without maxed-out ultra settings.



4080 12gb
4080 12gb ti
4080 12gb super
4080 12gb super ti

4080 16gb
4080 16gb ti
4080 16gb super
4080 16gb super ti

4090 20gb/ti/super/super ti

4090 24gb ti/super/ super ti

and you can have every bit of +/-256 shaders you like
:)

Hahaha. Reminds me of the early 2000s. Around 2002-2004. So many tiers...
 
Joined
Nov 26, 2021
Messages
1,648 (1.50/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
I think AMD is more focused on CPU than GPU all their products are made on the Same TSMC node unlike Nvidia that gets a special node just for their gpus. I think AMD is happy with 20-25% of the market and just making a profit on top of being in the PS5 and Series X/S.

That being said I still expect the 7900XT to be 1200 usd or less and the 7800XT to be 800 usd or less I just don't think AMD cares as long as its 50-100 usd cheaper than a comparable Nvidia sku even if its much cheaper to make. I also wouldn't be surprised if the 7900XT was 1500 usd though..... I doubt anything lower than these 2 cards will launch anytime soon though with maybe a 7700XT early next year that to me will be the more interesting card in how they price it.

Another thing and this goes for Nvidia as well is TSMC is really the one dictating prices. Not sure if true but the New Ada gpus are supposedly much more costly to make than Ampere I really would like to see a breakdown of component cost for sure though I know SMD are getting more expensive as well.
TSMC's N5 is significantly more expensive than their own N7 or Samsung's 8 nm node. That isn't the full explanation, though. Given the size of the 4090's die, the 4080 16 GB should be based on a die around 400 mm². That is about the same size as GA104, i.e. the die used for the 3060 Ti, 3070, and 3070 Ti. As the 4080 16 GB is unlikely to be harvested from perfect dies, it will cost Nvidia about $67 to $96 more than GA104. With a 50% margin, this means the 4080 16 GB should be $134 to $192 more than the 3070 based on the die cost alone. I don't know what 16 GB of GDDR6X costs compared to 8 GB of GDDR6; let's say it's an extremely unlikely $220 vs $55. Now the difference is $165 for the memory alone; margins for the retailer and AIB should increase it to $200 or so. Overall, the cost increase should top out at $450 for memory plus GPU, and is probably more realistically around $300 or less. The 4080 16 GB shouldn't be sold for more than $949 based on this very cursory analysis.
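If anyone wants to play with the numbers, here's a rough Python sketch of the arithmetic. The wafer prices, die sizes, and defect density are the assumptions floating around this thread, not confirmed figures, and the simple Poisson yield model deliberately ignores the harvesting of partly defective dies:

import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Gross die count with a simple correction for edge loss.
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def die_yield(die_area_mm2, defects_per_cm2):
    # Poisson yield model: Y = exp(-D0 * A), with the area converted to cm^2.
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

def cost_per_good_die(wafer_cost_usd, die_area_mm2, defects_per_cm2):
    good_dies = dies_per_wafer(die_area_mm2) * die_yield(die_area_mm2, defects_per_cm2)
    return wafer_cost_usd / good_dies

# Assumed: ~$16,000 per N5 wafer, ~$7,000 per Samsung 8 nm wafer,
# a ~400 mm^2 Ada die vs GA104's ~392 mm^2, and 0.09 defects/cm^2 on both.
ada = cost_per_good_die(16_000, 400, 0.09)
ga104 = cost_per_good_die(7_000, 392, 0.09)
print(f"~${ada:.0f} vs ~${ga104:.0f}, a difference of ~${ada - ga104:.0f}")
# Roughly $160 vs $68, i.e. ~$92 extra - within the $67-96 range above.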
 
Joined
Sep 10, 2018
Messages
6,926 (3.05/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
TSMC's N5 is significantly more expensive than their own N7 or Samsung's 8 nm node. That isn't the full explanation, though. Given the size of the 4090's die, the 4080 16 GB should be based on a die around 400 mm². That is about the same size as GA104, i.e. the die used for the 3060 Ti, 3070, and 3070 Ti. As the 4080 16 GB is unlikely to be harvested from perfect dies, it will cost Nvidia about $67 to $96 more than GA104. With a 50% margin, this means the 4080 16 GB should be $134 to $192 more than the 3070 based on the die cost alone. I don't know what 16 GB of GDDR6X costs compared to 8 GB of GDDR6; let's say it's an extremely unlikely $220 vs $55. Now the difference is $165 for the memory alone; margins for the retailer and AIB should increase it to $200 or so. Overall, the cost increase should top out at $450 for memory plus GPU, and is probably more realistically around $300 or less. The 4080 16 GB shouldn't be sold for more than $949 based on this very cursory analysis.

TSMC 5 nm is something like $17,000 per 300 mm wafer vs $9,000 for 7 nm. Not sure if what Nvidia is using with 4N is even more expensive. I'm pretty sure Samsung's 8 nm was much cheaper than TSMC's 7 nm. No idea what the yields are, so it's hard to compare pricing.


I agree, though: pricing seems to suck on both of the 4080s. My guess is that if Ampere were all sold out, they'd price them at $799 and $999, but I'm probably wrong.
 
Joined
Jan 14, 2019
Messages
12,346 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
You would think that, after the record-high profits nVidia reaped, they would cut gamers some slack this time around. But it looks like they are just rolling on ahead with giving middle fingers: an RTX 4080 that should be a 4070... and the prices... LOL! GTFO, nGreedia!

Don't buy their stuff unless you NEED a replacement; time to teach them a lesson they won't soon forget.
Present-day capitalism is about constant growth, not about cutting anyone any slack. If they can sell stuff for more money, they will. Let's see what AMD's got in hand. I hope it's not the same "it's more expensive because it's faster" crap.
 
Joined
May 2, 2017
Messages
7,762 (2.80/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I have to correct something: the 4090 will not support NVLink. The 4090 will be more of a gaming card than the 3090 was. Interesting. Now there is room for a new RTX Titan around $2,000 with NVLink. It's also possible nVidia wants to stop cannibalizing Quadro clients. It seems the 4090 is mainly for 4K/RT enthusiasts. Didn't expect it.
NVLink isn't necessary for most pro applications either - only the ones where memory coherency between cards is a major benefit. Most renderers and similar applications in the prosumer/pro range treat individual GPUs separately regardless of the presence of an NVLink bridge. So the loss of NVLink won't matter much to the types of pros interested in a Titan-class GPU anyhow. Of course, if you're working with tens of GB of data that need loading into VRAM, then NVLink (and ideally 48 GB or higher Quadros) is a must.
 
Joined
Apr 14, 2022
Messages
749 (0.78/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Acer Nitro XV271UM3B IPS 180Hz
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Keyboard Asus ROG Falchion
Software Windows 11 64bit
I don't mind the new naming scheme (two 4080s! like two 3080s, like two 1060s), since I always look at how a card performs at what cost, but... I'm kind of disappointed by the prices.
£949 and £1,269 for the FE?!

I kept reading that you should skip the 3000 series if you have a 2080 Ti, but now that seems to be the only reasonable alternative, price-wise.

Also, if someone is willing to pay the price for a 4080, they should go all in for the 4090. I see the 4080 16 GB as terrible value.
 
Joined
Nov 26, 2021
Messages
1,648 (1.50/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Tsmc 5nm is like $ 17000 vs $9000 for 7nm per 300mm wafer. Not sure if what Nvidia is using with the 4n is even more expensive. I'm pretty sure samsungs 8nm was much cheaper than 7nm tsmc. No idea what the yealds are so hard to compare pricing.


I agree though pricing seems to suck on both the 4080s my guess is if ampere was all sold out they'd price them at 799 and 999 but I'm probably wrong.
I used estimates of $16,000 for N5 and $7,000 for Samsung's 8 nm. Yields are actually much higher than the defect rate indicates, as they aren't selling full dies; they can harvest many, if not all, defective dies. Still, I used a defect density of 0.09 per square cm for both processes, which works out to a yield of 70% for a GA104-sized chip. According to TSMC, N4 will use EUV for more layers, thereby reducing the number of steps and masks and thus offering a cost advantage.
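For anyone who wants to check that 70% figure, it falls straight out of the standard Poisson yield model; the only inputs are the assumed defect density and GA104's roughly 392 mm² die area:

import math

# Y = exp(-D0 * A): 0.09 defects/cm^2 over 3.92 cm^2 (392 mm^2)
print(math.exp(-0.09 * 3.92))  # ~0.70, i.e. the ~70% yield quoted above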
 
Joined
Sep 10, 2018
Messages
6,926 (3.05/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I used estimates of $16,000 for N5 and $7,000 for Samsung's 8 nm. Yields are actually much higher than the defect rate indicates, as they aren't selling full dies; they can harvest many, if not all, defective dies. Still, I used a defect density of 0.09 per square cm for both processes, which works out to a yield of 70% for a GA104-sized chip. According to TSMC, N4 will use EUV for more layers, thereby reducing the number of steps and masks and thus offering a cost advantage.

Apparently prices have gone up at least 20% since TSMC announced 5 nm at $17,000 per wafer back in Q1 2020.

Again, it's hard to gauge what Nvidia is actually paying. They are also apparently not offering discounts on large orders.

That's about all I can find... Again, I'm not defending Nvidia; the two 4080s currently look bad. Flagships are always overpriced, but adjusted for inflation, the 4090 is cheaper than the 3090 (non-Ti) was at launch, at least in the US.

At the same time, I won't be comparing these cards to launch prices. I'll be comparing them to the current 3090 Ti at $1,100 and my 3080 Ti at its current price of around $800 to gauge whether the 4090/4080 16 GB is worth buying.
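For what it's worth, here's the quick inflation check behind that claim. The CPI factor is my own rough assumption; plug in the official index values for exact numbers:

launch_3090 = 1499  # USD, September 2020
launch_4090 = 1599  # USD, October 2022
cpi_factor = 1.14   # assumed cumulative US inflation, late 2020 to late 2022
adjusted_3090 = launch_3090 * cpi_factor
print(f"3090 launch price in 2022 dollars: ~${adjusted_3090:.0f}")
# ~$1709, which is more than the 4090's $1599 launch price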
 
Joined
Jan 14, 2019
Messages
12,346 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I don't mind the new naming scheme (two 4080s! like two 3080s, like two 1060s), since I always look at how a card performs at what cost, but... I'm kind of disappointed by the prices.
£949 and £1,269 for the FE?!

I kept reading that you should skip the 3000 series if you have a 2080 Ti, but now that seems to be the only reasonable alternative, price-wise.

Also, if someone is willing to pay the price for a 4080, they should go all in for the 4090. I see the 4080 16 GB as terrible value.
I agree.

Considering this, my next upgrade will either be a
  • 4080 12 GB if I can find one at a reasonable price (doubtful), or
  • 7700 XT... in fact, I'm toying with the thought of having an all-AMD machine again, or
  • Arc A770. Again, only at a decent enough price. Based on its predicted performance, it should be a sweet deal at around 300 quid.
 
Joined
May 2, 2017
Messages
7,762 (2.80/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Just for reference here, since people are discussing nodes: the differences between N5 and N4 (assuming it's base N4 and not N4P or N4X, which don't seem to be in volume production yet?) aren't huge, so Nvidia isn't likely to have that much of an advantage from this node difference compared to RDNA3. Isn't AMD also said to be using a bespoke N5 process, not the stock one? Or is that just for Ryzen 7000?

From Anandtech:
[Attached image: AnandTech's comparison table for TSMC's 5 nm-family nodes]

This is obviously rather vague - especially the N4 vs N5 comparison - but we can assume that regular N4 has less of a power drop than -22%, and less of a performance increase than +11%. Nvidia can definitely use that power drop, given how Ampere lagged behind RDNA2, and the upcoming RDNA3 gets a boost from moving to 5 nm as well. Still, this is definitely going to be interesting going forwards.
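As a toy illustration of that bound (using TSMC's marketing claims, not measured silicon): if the enhanced N4 variants promise up to 22% lower power at the same performance versus N5, a regular-N4 part should land somewhere between that and no improvement:

n5_board_power = 450       # W, hypothetical N5 baseline for an Ampere-class design
full_n4_power_drop = 0.22  # TSMC's quoted best-case reduction for the N4 family
best_case = n5_board_power * (1 - full_n4_power_drop)
print(f"Same-performance board power on N4: between {best_case:.0f} W and {n5_board_power} W")
# i.e. somewhere between ~351 W and 450 W for regular N4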
 
Joined
Jan 14, 2019
Messages
12,346 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Just for reference here, since people are discussing nodes: the differences between N5 and N4 (assuming it's base N4 and not N4P or N4X, which don't seem to be in volume production yet?) aren't huge, so Nvidia isn't likely to have that much of an advantage from this node difference compared to RDNA3. Isn't AMD also said to be using a bespoke N5 process, not the stock one? Or is that just for Ryzen 7000?

From Anandtech:
[Attached image: AnandTech's comparison table for TSMC's 5 nm-family nodes]
This is obviously rather vague - especially the N4 vs N5 comparison - but we can assume that regular N4 has less of a power drop than -22%, and less of a performance increase than +11%. Nvidia can definitely use that power drop, given how Ampere lagged behind RDNA2, and the upcoming RDNA3 gets a boost from moving to 5 nm as well. Still, this is definitely going to be interesting going forwards.
I've noticed that the advantages of new nodes have been diminishing while the prices of final products have been rapidly growing lately. Rhetorical question: are we coming close to a point when developing new manufacturing nodes just isn't worth it anymore?
 
Joined
Sep 13, 2021
Messages
86 (0.07/day)
Nvlink isn't necessary for most pro applications either - only the ones where memory coherency between cards is a major benefit. Most renderers and similar applications in the prosumer/pro range treat individual GPUs separately regardless of the presence of an Nvlink bridge. So the loss of Nvlink won't matter much to the types of pros interested in a Titan class GPU anyhow. Of course, if you're working with tens of GB of data that needs loading into VRAM, then Nvlink (and ideally 48GB or higher Quadros) is a must.
Thanks. I do offline ray tracing and some content creation using nVidia's Iray engine as a hobby. In my application, the complexity of a scene is limited by each card's discrete VRAM, while the CUDA and RT cores can be pooled, as you mentioned. The professionals I follow in the community do content creation, video rendering, and video editing. They recommended NVLink for video rendering and editing. Anyway, a single 3090 or 4090 is good enough for professional work. For $700 I would buy a used 3090 at once. Unfortunately, there is no eBay in my homeland, and the retail price is still around $1,600. I went from a 970 to a 2070 and am planning to upgrade to a 4090 if there is no inexpensive 4070 12-16 GB version. The 4080 models don't meet my price-performance expectations.
 
Joined
Dec 31, 2020
Messages
986 (0.69/day)
Processor E5-4627 v4
Motherboard VEINEDA X99
Memory 32 GB
Video Card(s) 2080 Ti
Storage NE-512
Display(s) G27Q
Case DAOTECH X9
Power Supply SF450
What is this? The 3060 still starts at $329, no change since January 12th, 2021. The same good old MSRPs of the cards we never saw because of mining, and could not afford because of inflated prices, are now picking up right where they left off.

The 3060 must drop to $199 ASAP, the 3070 to $299, and the 3080 to $499, and they should launch a $599 4070 with 12 GB and 7168 shaders alongside the 4080 12 GB. Nvidia is driving me nuts.
 
Joined
May 2, 2017
Messages
7,762 (2.80/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I've noticed that the advantages of new nodes have been diminishing while the prices of final products have been rapidly growing lately. Rhetorical question: are we coming close to a point when developing new manufacturing nodes just isn't worth it anymore?
I think it's more a case of new nodes being so incredibly expensive to develop that we're seeing a proliferation of specialized and optimized sub-nodes, which in turn leads to smaller perceived changes node-to-node. But in a way, yes - we are definitely reaching a point where for some products, moving to a new node doesn't really make sense, which renders new nodes less useful.
 
Joined
Nov 26, 2021
Messages
1,648 (1.50/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Apparently prices have gone up at least 20% since TSMC announced 5 nm at $17,000 per wafer back in Q1 2020.

Again, it's hard to gauge what Nvidia is actually paying. They are also apparently not offering discounts on large orders.

That's about all I can find... Again, I'm not defending Nvidia; the two 4080s currently look bad. Flagships are always overpriced, but adjusted for inflation, the 4090 is cheaper than the 3090 (non-Ti) was at launch, at least in the US.

At the same time, I won't be comparing these cards to launch prices. I'll be comparing them to the current 3090 Ti at $1,100 and my 3080 Ti at its current price of around $800 to gauge whether the 4090/4080 16 GB is worth buying.
They increased prices by up to 10% for the newer nodes. 20% was for older nodes.
 
Joined
May 31, 2016
Messages
4,437 (1.43/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Jeez, if this is not the worst release and pricing, then I don't know what is. Screw the performance uplift, since it has nothing to do with progress or advancement if you're paying almost $900 for what should have been an affordable mid-range card. It makes sense now what NV has been trying to do with the Turing and Ampere price hikes: slowly, prices creep up for the top and mid-range products, and this is exactly the outcome. I'm guessing it won't stop.
 
Joined
Jan 14, 2019
Messages
12,346 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Jeez, if this is not the worst release and pricing, then I don't know what is. Screw the performance uplift, since it has nothing to do with progress or advancement if you're paying almost $900 for what should have been an affordable mid-range card. It makes sense now what NV has been trying to do with the Turing and Ampere price hikes: slowly, prices creep up for the top and mid-range products, and this is exactly the outcome. I'm guessing it won't stop.
Nvidia in 2014: "Look, here's twice the performance for the same price."
Nvidia in 2022: "You need to pay more to get more. That's just the way it is."
WTF?
 