
NVIDIA: Don't Expect Base 20-Series Pricing to Be Available at Launch

Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
Would you say that if the 2080 is 10% faster than the 2070 for $200 more, and all they have to justify it is gigarays or whatever it is called?
I like buying new-gen cards too. I'm itching to upgrade even though I have stellar performance on the 1080, but pre-ordering RTX is just silly unless you absolutely do not care about perf/price.

The GTX 1080 Ti was like 200€ more than the GTX 1080, for almost a 50% bump in performance. So, there's that... But when someone is looking for ultimate peak performance, 10% for 200€ is something they don't really care about.
 
Joined
Aug 6, 2017
Messages
7,412 (2.77/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
The GTX 1080 Ti was like 200€ more than the GTX 1080, for almost a 50% bump in performance. So, there's that... But when someone is looking for ultimate peak performance, 10% for 200€ is something they don't really care about.
More like 25-30% depending on the resolution, with more VRAM, but still a good perf upgrade. I feel the price of the 2080 is dictated more by gigarays. I feel like the 2080 will be the entry card to run RTX at decent framerates, while the 2070 will be too weak in ray tracing but better perf/price without it.
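Just to make that comparison concrete, here's a quick sketch of euros per percentage point of performance (the ~200€ gap, the 50% vs. 25-30% 1080 Ti uplift and the rumoured ~10% 2070-to-2080 gap are all figures from the posts above, not measured numbers):

[CODE]
# Rough price-per-performance comparison using the figures quoted in this thread.
price_delta_eur = 200  # assumed price gap between the two cards being compared
scenarios = (
    ("1080 -> 1080 Ti, claimed ~50% faster", 50),
    ("1080 -> 1080 Ti, more like 25-30% faster", 27.5),
    ("2070 -> 2080, rumoured ~10% faster", 10),
)
for label, perf_gain_pct in scenarios:
    print(f"{label}: ~{price_delta_eur / perf_gain_pct:.0f} EUR per % of extra performance")
[/CODE]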
 
Last edited:
Joined
Feb 3, 2017
Messages
3,759 (1.32/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Same clocks, same arch, fewer CCs. Do the math. Baloney tricks don't count.
We do not know anything about clocks. Stock clocks have not really mattered for a couple of generations already.
 
Joined
Aug 6, 2017
Messages
7,412 (2.77/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
We do not know anything about clocks. Stock clocks have not really mattered for a couple of generations already.
Core clocks will be similar. I think there's either an improvement in shading performance, or they simply went back to a higher shader clock, like we saw in the past when the shader clock was higher than the core clock.
 
Joined
Feb 3, 2017
Messages
3,759 (1.32/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
From what little has been disclosed about the architecture, Turing's SMs are close to Volta's. When it comes to gaming, that is not a big difference. Separate clock domains are not likely either; that would make things unnecessarily complicated. Also, the distribution of other units like ROPs and TMUs suggests not much has changed.

All Pascal cards run considerably higher than their boost clocks given even remotely adequate cooling. It is probably safe to say Turing cards will not be running at their boost clocks either, but higher. The question is how much higher. Overclocking-wise, all Pascals top out somewhere a little past 2 GHz. If Turing cannot put at least 10-15% on top of Pascal's clocks, there will be a problem.

Anyhow, we'll see in 2 weeks :)
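To put rough numbers on that 10-15%, a quick back-of-the-envelope sketch (the ~2.05 GHz midpoint is my assumption, taken from the "a little past 2 GHz" Pascal ceiling above):

[CODE]
# Clock-headroom arithmetic; assumes Pascal's overclocking ceiling sits around 2.05 GHz.
pascal_oc_ghz = 2.05                 # assumed midpoint of "a little past 2 GHz"
for uplift in (0.10, 0.15):          # the 10-15% uplift argued for above
    target_ghz = pascal_oc_ghz * (1 + uplift)
    print(f"+{uplift:.0%} over Pascal -> ~{target_ghz:.2f} GHz")
# i.e. Turing would need to overclock to roughly 2.25-2.36 GHz to clear that bar.
[/CODE]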
 
Joined
Sep 17, 2014
Messages
22,479 (6.03/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
From what little has been disclosed about the architecture, Turing's SMs are close to Volta's. When it comes to gaming, that is not a big difference. Separate clock domains are not likely either; that would make things unnecessarily complicated. Also, the distribution of other units like ROPs and TMUs suggests not much has changed.

All Pascal cards run considerably higher than their boost clocks given even remotely adequate cooling. It is probably safe to say Turing cards will not be running at their boost clocks either, but higher. The question is how much higher. Overclocking-wise, all Pascals top out somewhere a little past 2 GHz. If Turing cannot put at least 10-15% on top of Pascal's clocks, there will be a problem.

Anyhow, we'll see in 2 weeks :)

Pascal doesn't top out at 2 GHz on air; most of the time you end up around 1900-1950 MHz due to thermals / heat build-up over time. And that is with some of the better open-air versions. Only the x70 and lower can reliably top 2 GHz at full load.

Turing isn't on a fantastic node or anything, and I really don't see how they are going to go much higher, especially when the RT cores are putting out extra heat.
 
Joined
Apr 19, 2018
Messages
1,227 (0.51/day)
Processor AMD Ryzen 9 5950X
Motherboard Asus ROG Crosshair VIII Hero WiFi
Cooling Arctic Liquid Freezer II 420
Memory 32Gb G-Skill Trident Z Neo @3806MHz C14
Video Card(s) MSI GeForce RTX2070
Storage Seagate FireCuda 530 1TB
Display(s) Samsung G9 49" Curved Ultrawide
Case Cooler Master Cosmos
Audio Device(s) O2 USB Headphone AMP
Power Supply Corsair HX850i
Mouse Logitech G502
Keyboard Cherry MX
Software Windows 11
nVidia is very busy transforming itself into an Apple wannabe. They started many years ago with the marketing department, then sales, and soon nVidia will be the only one making and selling its own cards. It's the only way they see to increase profits and control the market with an even tighter grip. Another bonus for them is that they can charge even more and keep prices artificially high, as they can create their own "shortages" much more easily.
 
Joined
Mar 31, 2012
Messages
860 (0.19/day)
Location
NL
System Name SIGSEGV
Processor INTEL i7-7700K | AMD Ryzen 2700X | AMD Ryzen 9 9950X
Motherboard QUANTA | ASUS Crosshair VII Hero | MSI MEG ACE X670E
Cooling Air cooling 4 heatpipes | Corsair H115i | Noctua NF-A14 IndustrialPPC Fan 3000RPM | Arctic P14 MAX
Memory Micron 16 Gb DDR4 2400 | GSkill Ripjaws 32Gb DDR4 3400(OC) CL14@1.38v | Fury Beast 64 Gb CL30
Video Card(s) Nvidia 1060 6GB | Gigabyte 1080Ti Aorus | TUF 4090 OC
Storage 1TB 7200/256 SSD PCIE | ~ TB | 970 Evo | WD Black SN850X 2TB
Display(s) 15,5" / 27" /34"
Case Black & Grey | Phanteks P400S | O11 EVO XL
Audio Device(s) Realtek
Power Supply Li Battery | Seasonic Focus Gold 750W | FSP Hydro TI 1000
Mouse g402
Keyboard Leopold|Ducky
Software LinuxMint
Benchmark Scores i dont care about scores
Seems the price doesn't matter at all to their loyalists. LOL.
Oh NV, you actually can suck their wallets more and more. You're stupid. LOL

INSANE
 
Joined
Sep 15, 2007
Messages
3,946 (0.63/day)
Location
Police/Nanny State of America
Processor OCed 5800X3D
Motherboard Asucks C6H
Cooling Air
Memory 32GB
Video Card(s) OCed 6800XT
Storage NVMees
Display(s) 32" Dull curved 1440
Case Freebie glass idk
Audio Device(s) Sennheiser
Power Supply Don't even remember
Joined
Sep 1, 2009
Messages
1,234 (0.22/day)
Location
CO
System Name 4k
Processor AMD 5800x3D
Motherboard MSI MAG b550m Mortar Wifi
Cooling ARCTIC Liquid Freezer II 240
Memory 4x8Gb Crucial Ballistix 3600 CL16 bl8g36c16u4b.m8fe1
Video Card(s) Nvidia Reference 3080Ti
Storage ADATA XPG SX8200 Pro 1TB
Display(s) LG 48" C1
Case CORSAIR Carbide AIR 240 Micro-ATX
Audio Device(s) Asus Xonar STX
Power Supply EVGA SuperNOVA 650W
Software Microsoft Windows10 Pro x64
I canceled my 2080 pre-order and bought a used 1080 Ti. Should be great for 1440p. After the 3DMark numbers and DLSS having to be baked into a game, I'm not going to pay for nVidia's R&D into ray tracing.
 

AsRock

TPU addict
Joined
Jun 23, 2007
Messages
19,089 (3.00/day)
Location
UK\USA
I don't mean to offend you, but you are the one showing your anger toward the CEO. Sounds like you are angry about paying for the cards over the years.

I think the price is slightly higher but justifiable. The 2080 Ti is for the top end, and for people that don't want to pay that, the 2080 is an option. It is faster than the 1080 Ti but at a higher cost. The problem is people are anchored in the idea that the Ti is the top card and "should" be at a certain price point. I say buy what you can afford to pay if the performance gain gives you satisfaction. Not everyone needs the top-end cards.

So, speaking hypothetically, going by what you say, we should pay even more on top of the cost of a 3080 when it's released, because it's faster, and older cards should not go lower in price.

I canceled my 2080 pre-order and bought a used 1080 Ti. Should be great for 1440p. After the 3DMark numbers and DLSS having to be baked into a game, I'm not going to pay for nVidia's R&D into ray tracing.

Well, why pay for something that you cannot use? lol, screw that shit.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,995 (2.34/day)
Location
Louisiana
Processor Core i9-9900k
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax ETS-T50 Black CPU cooler
Memory 32GB (2x16) Mushkin Redline DDR-4 3200
Video Card(s) ASUS RTX 4070 Ti Super OC 16GB
Storage 1x 1TB MX500 (OS); 2x 6TB WD Black; 1x 2TB MX500; 1x 1TB BX500 SSD; 1x 6TB WD Blue storage (eSATA)
Display(s) Infievo 27" 165Hz @ 2560 x 1440
Case Fractal Design Define R4 Black -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic Focus GX-1000 Gold
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
I canceled my 2080 pre-order and bought a used 1080 Ti. Should be great for 1440p. After the 3DMark numbers and DLSS having to be baked into a game, I'm not going to pay for nVidia's R&D into ray tracing.
It’s excellent at 1440p! It plays that resolution like a 980Ti plays 1080p.
 
Joined
Jan 6, 2013
Messages
350 (0.08/day)
The chip is too big because it was originally aimed at AI. Most of the real estate is not really beneficial for games. AND nVidia PR is really working overtime to push those features as "gaming".

If they had instead just given us a true DX12 card with async compute and used that 750 mm² for good old CUDA cores, the price would be justified.
I truly doubt it is a mistake of a chip. These things are thought out from early on; ray tracing is not an AI feature, and the AI part of the chip doesn't take up a whole lot of space.
But yes, indeed, they are quite expensive, rightfully so, but not worth the money.
 
Joined
Aug 6, 2017
Messages
7,412 (2.77/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
Pascal doesn't top out at 2 GHz on air; most of the time you end up around 1900-1950 MHz due to thermals / heat build-up over time. And that is with some of the better open-air versions. Only the x70 and lower can reliably top 2 GHz at full load.

Turing isn't on a fantastic node or anything, and I really don't see how they are going to go much higher, especially when the RT cores are putting out extra heat.
2152 MHz stable on air here, baby :D on stock voltage.

It’s excellent at 1440p! It plays that resolution like a 980Ti plays 1080p.

Even better than the 980 Ti runs 1080p.
 
Joined
Aug 2, 2011
Messages
1,458 (0.30/day)
Processor Ryzen 9 7950X3D
Motherboard MSI X670E MPG Carbon Wifi
Cooling Custom loop, 2x360mm radiator,Lian Li UNI, EK XRes140,EK Velocity2
Memory 2x16GB G.Skill DDR5-6400 @ 6400MHz C32
Video Card(s) EVGA RTX 3080 Ti FTW3 Ultra OC Scanner core +750 mem
Storage MP600 Pro 2TB,960 EVO 1TB,XPG SX8200 Pro 1TB,Micron 1100 2TB,1.5TB Caviar Green
Display(s) Alienware AW3423DWF, Acer XB270HU
Case LianLi O11 Dynamic White
Audio Device(s) Logitech G-Pro X Wireless
Power Supply EVGA P3 1200W
Mouse Logitech G502X Lightspeed
Keyboard Logitech G512 Carbon w/ GX Brown
VR HMD HP Reverb G2 (V2)
Software Win 11
You should wait for reviews anyway.

I have no horse in this race, except very high hopes of the 2070 matching the 1080 Ti or coming very close. The 2080 is going to be a meh deal compared to the 2070, while the 2080 Ti is just ridiculous at $1,000 even though it will offer amazing performance.

This time around, I don't think the 2070 will match the previous Ti card like it has in the past. The 2080 will probably match the 1080 Ti. Which is sad.

Womp, womp, nvidia seems to be hiding something... Only the biggest huang suckers allowed https://www.hardocp.com/article/2018/08/28/nvidia_controls_aib_launch_driver_distribution/

That is nothing new. AMD does this as well. Kyle has been incredibly anti-NVIDIA ever since the GPP fiasco. He was right to be angry about that, but now he's just hit insanity mode (the NDA thing was blown out of proportion too; NVIDIA's NDA was a bog-standard boilerplate NDA).
 
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
AMD does this as well.
AMD "does this", my arse, but this kind of whitewashing is quite a bit better than "no big deal".

NVIDIA's NDA was a bog-standard boilerplate NDA).
An outright lie.
It wasn't even Kyle who was the first to refuse to sign the bend-over-backward NDA; it was heise.de, a site that is, to put it softly, somewhat bigger than this one.
 
Joined
Aug 6, 2017
Messages
7,412 (2.77/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
AMD pulled their Ryzen and Threadripper samples from a popular Polish tech site because they wanted to impose changes to its testing and the site refused. They do this crap too.
If you believe any of the big three is clean and fair, you'd better have your cataracts operated on ASAP. Nvidia and Intel have done more of it in recent years because they have a grip on the majority of the market, but just wait until AMD wins the majority of the market for themselves...
 
Last edited:
Joined
Mar 18, 2015
Messages
2,963 (0.84/day)
Location
Long Island
Lots of strong opinions for a card that hasn't been released or reviewed yet. I expect to see that in politics, not in the tech sector, but it would seem that AMD / nVidia discussions are more political than technical... and they can't be technical until there's real data to discuss. There's not much difference between the Red and Blue (in US politics) arguing about subjects based upon presumption rather than facts, and many of these red and green discussions.
 
Joined
Aug 2, 2012
Messages
1,987 (0.44/day)
Location
Netherlands
System Name TheDeeGee's PC
Processor Intel Core i7-11700
Motherboard ASRock Z590 Steel Legend
Cooling Noctua NH-D15S
Memory Crucial Ballistix 3200/C16 32GB
Video Card(s) Nvidia RTX 4070 Ti 12GB
Storage Crucial P5 Plus 2TB / Crucial P3 Plus 2TB / Crucial P3 Plus 4TB
Display(s) EIZO CX240
Case Lian-Li O11 Dynamic Evo XL / Noctua NF-A12x25 fans
Audio Device(s) Creative Sound Blaster ZXR / AKG K601 Headphones
Power Supply Seasonic PRIME Fanless TX-700
Mouse Logitech G500S
Keyboard Keychron Q6
Software Windows 10 Pro 64-Bit
Benchmark Scores None, as long as my games runs smooth.
These prices are ****ing insane.
 
Joined
Jul 9, 2016
Messages
1,078 (0.35/day)
System Name Main System
Processor i9-10940x
Motherboard MSI X299 Xpower Gaming AC
Cooling Noctua NH-D15S + Second Fan
Memory G.Skill 64GB @3200MHz XMP
Video Card(s) ASUS Strix RTX 3090 24GB
Storage 2TB Samsung 970 EVO Plus; 2TB Corsair Force MP600; 2TB Samsung PM981a
Display(s) Dell U4320Q; LG 43MU79-B
Case Corsair A540
Audio Device(s) Creative Lab SoundBlaster ZX-R
Power Supply EVGA G2 1300
Mouse Logitech MK550
Keyboard Corsair K95 Platinum XT Brown Switches
Software Windows 10 Pro
Benchmark Scores Cinebench R20 - 6910; FireStrike Ultra - 13241; TimeSpy Extreme - 10067; Port Royal - 13855
So, speaking hypothetically, going by what you say, we should pay even more on top of the cost of a 3080 when it's released, because it's faster, and older cards should not go lower in price.

Well, why pay for something that you cannot use? lol, screw that shit.

Nice try, but no. The prices are set via the same economic principle: supply and demand, and what consumers are willing to pay. During the mining craze all AIBs increased their prices, and people (or miners) still bought them up and out. So they continue to increase the prices, and people continue to buy them up. For businesses, it is important not to leave any money on the table. nVidia did the price increase, of course along with some performance improvement, to see how the market will react. Will they sell out? Will people reject the price increase? Then they will do a price adjustment, or bundle some games. This is more straightforward than you think. Don't like the price increase? Don't like the performance improvement? Don't buy. Can't afford to pay? Like the performance but don't like to pay for it? Don't buy. Simple as that.

Prices of the 10xx series did go down. Perhaps not as much as you want or expect, but they did go down.
 
Joined
Feb 17, 2017
Messages
854 (0.30/day)
Location
Italy
Processor i7 2600K
Motherboard Asus P8Z68-V PRO/Gen 3
Cooling ZeroTherm FZ120
Memory G.Skill Ripjaws 4x4GB DDR3
Video Card(s) MSI GTX 1060 6G Gaming X
Storage Samsung 830 Pro 256GB + WD Caviar Blue 1TB
Display(s) Samsung PX2370 + Acer AL1717
Case Antec 1200 v1
Audio Device(s) aune x1s
Power Supply Enermax Modu87+ 800W
Mouse Logitech G403
Keyboard Qpad MK80
Better to wait for Navi; hopefully it will be slightly slower than this, but the prices won't be this insane.
 

ThisguyTM

New Member
Joined
Sep 1, 2018
Messages
6 (0.00/day)
That is inevitable, sadly. Big die = low yield = big costs.


So let's make a guess at how much Nvidia makes per wafer on Turing, assuming the per-wafer cost plus production is 50k and the yield is bad, at around less than 20% (most definitely not, as Nvidia had TSMC custom-make 12nm for them, and 12nm is basically 16nm improved). The total wafer yield is 68 dies; with 20% yield you are looking at 14 perfect GPU dies.
Those perfect GPU dies will be sold as the RTX 8000, each costing 10k. So 14 x 10k is 140k; Nvidia would have made a significant profit on those.

Not to mention that not all defective GPU dies are completely dead; those go on to become second/third-tier RTX Quadros, and the worst-quality silicon goes on to be the 2080 Ti, or is shared with the third-tier Quadro to meet demand. So Nvidia is making a significant amount of cash per wafer. People like to justify the high price of the 2080 Ti by saying "Oh, it's expensive because the GPU die is massive." No it isn't; it's expensive simply because Nvidia can and will, and fanboys will still buy it regardless. This is what happens when consumers allow this to happen.
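For anyone who wants to sanity-check the arithmetic, here's a minimal sketch using the figures assumed above (the 50k wafer cost, 68 candidate dies, 20% perfect yield and 10k selling price are all the guesses from this post, not confirmed numbers):

[CODE]
# Back-of-the-envelope wafer economics, using the assumed figures from the post above.
wafer_cost_usd   = 50_000   # assumed wafer + production cost (a guess, not a known figure)
dies_per_wafer   = 68       # candidate dies per wafer for a die this size (figure from above)
perfect_yield    = 0.20     # assumed worst-case fraction of fully working dies
quadro_price_usd = 10_000   # assumed selling price of a fully enabled part

perfect_dies = round(dies_per_wafer * perfect_yield)   # 68 * 0.20 = 13.6 -> ~14 dies
revenue_usd  = perfect_dies * quadro_price_usd         # 14 * 10k = 140k
print(f"{perfect_dies} perfect dies -> ${revenue_usd:,} revenue vs ${wafer_cost_usd:,} per wafer")
# Note: this ignores R&D, boards, memory, packaging and everything else (see the reply below).
[/CODE]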
 
Joined
Aug 6, 2017
Messages
7,412 (2.77/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
and 12nm is basically 16nm improved). The total wafer yield is 68 dies; with 20% yield you are looking at 14 perfect GPU dies.
Those perfect GPU dies will
Die size is a factor, no competition is a factor, Pascal overstock is a factor, new memory is a factor, new architecture is a factor... they can price them like this simply because they can afford it. Mining did not help, because we're dealing with Pascal overstock; AMD does not help either when they openly say Vega 20 is for HPC and professionals.
 
Joined
Jul 9, 2016
Messages
1,078 (0.35/day)
System Name Main System
Processor i9-10940x
Motherboard MSI X299 Xpower Gaming AC
Cooling Noctua NH-D15S + Second Fan
Memory G.Skill 64GB @3200MHz XMP
Video Card(s) ASUS Strix RTX 3090 24GB
Storage 2TB Samsung 970 EVO Plus; 2TB Corsair Force MP600; 2TB Samsung PM981a
Display(s) Dell U4320Q; LG 43MU79-B
Case Corsair A540
Audio Device(s) Creative Lab SoundBlaster ZX-R
Power Supply EVGA G2 1300
Mouse Logitech MK550
Keyboard Corsair K95 Platinum XT Brown Switches
Software Windows 10 Pro
Benchmark Scores Cinebench R20 - 6910; FireStrike Ultra - 13241; TimeSpy Extreme - 10067; Port Royal - 13855
So let's make a guess at how much Nvidia makes per wafer on Turing, assuming the per-wafer cost plus production is 50k and the yield is bad, at around less than 20% (most definitely not, as Nvidia had TSMC custom-make 12nm for them, and 12nm is basically 16nm improved). The total wafer yield is 68 dies; with 20% yield you are looking at 14 perfect GPU dies.
Those perfect GPU dies will be sold as the RTX 8000, each costing 10k. So 14 x 10k is 140k; Nvidia would have made a significant profit on those.

Wait, how did you "assume" the cost of the wafer plus production cost is $50k? Your calcuation of profit is based on this "number". Your idea of: retail price - production wafer cost = pure profit is absolutely wrong. The cost of a single card is not just the wafer. It includes so many things - engineering, administration, marketing, employees salaries, pre-production sampling, 10 years of R&D, PCB manufacturing, licensing, etc. The list goes on.

Not to mention that not all defective GPU dies are completely dead; those go on to become second/third-tier RTX Quadros, and the worst-quality silicon goes on to be the 2080 Ti, or is shared with the third-tier Quadro to meet demand. So Nvidia is making a significant amount of cash per wafer. People like to justify the high price of the 2080 Ti by saying "Oh, it's expensive because the GPU die is massive." No it isn't; it's expensive simply because Nvidia can and will, and fanboys will still buy it regardless. This is what happens when consumers allow this to happen.

So a bad die becomes the 2080 Ti? What??? :kookoo: So my 1080 Ti is actually a defective chip? You know, the die could be bad in places that are important. Most bad dies cannot be reused, especially for a high-end chip/card that utilizes most of the die.
 

mnemo_05

New Member
Joined
May 18, 2017
Messages
9 (0.00/day)
Can't wait for the release of this card. And no, I won't be getting an RTX at these prices; that is a very dumb thing to do.

I can't wait for the prices of the 10-series to go down further; my buy-in for a 1080 Ti is at 400-450.
 