Thursday, August 30th 2018

NVIDIA: Don't Expect Base 20-Series Pricing to Be Available at Launch

Tom Petersen, NVIDIA's director of technical marketing, in a webcast with HotHardware, expressed confidence in the value of its RTX 20-series graphics cards - but threw a wrench into consumers' pricing expectations, set by NVIDIA's own MSRP. That NVIDIA's pricing for its Founders Edition graphics cards would give partners leeway to increase their margins was a given - why exactly would they sell at lower prices than NVIDIA when they have increased logistical (and other) costs to support? And this move by NVIDIA might even serve as a small concession to partners - remember that every NVIDIA-manufactured graphics card sold is one that doesn't contribute to its AIBs' bottom lines, so there's effectively another AIB contending for their profits. This way, NVIDIA gives them an opportunity to make some of those profits back (at least relative to the official MSRP).
Tom Petersen had this to say on the HotHardware webcast: "The partners are basically going to hit the entire price point range between our entry level price, which will be $499, up to whatever they feel is the appropriate price for the value that they're delivering. (...) In my mind, really the question is saying 'Am I gonna ever see those entry prices?' And the truth is: yes, you will see those entry prices. And it's really just a question of how are the partners approaching the market. Typically when we launch there is more demand than supply and that tends to increase the at-launch supply price."

Of course, there were some mitigating words left for last: "But we are working really hard to drive that down so that there is supply at the entry point. We're building a ton of parts and it's the natural behaviour of the market," Tom Petersen continued. "So it could be just the demand/supply equation working its way into retail pricing - but maybe there's more to it than that."
Sources: HotHardware Webcast, via PCGamesN

95 Comments on NVIDIA: Don't Expect Base 20-Series Pricing to Be Available at Launch

#51
cucker tarlson
RejZoRGTX 1080Ti was like 200€ more than GTX 1080. For almost 50% bump in performance. So, there's that... But when someone is looking for ultimate peak performance, 10% for 200€ is something they don't really care about.
More like 25-30% depending on the resolution, with more VRAM, but still a good perf upgrade. I feel the price of the 2080 is more dictated by gigarays. I feel like the 2080 will be the entry card to run RTX at decent framerates, while the 2070 will be too weak in ray tracing but better perf/price without it.
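To put numbers on that disagreement, here's a quick perf-per-euro sketch; the percentages and the 200€ delta are this thread's rough figures, not benchmark results:

```python
# Illustrative perf-per-euro arithmetic (assumed numbers, not benchmarks).
def gain_per_100_eur(perf_gain_pct: float, price_delta_eur: float) -> float:
    """Performance gain bought per 100 EUR of extra spend."""
    return perf_gain_pct / price_delta_eur * 100

# The ~50%-for-200-EUR framing:
print(gain_per_100_eur(50, 200))    # 25.0 (% per 100 EUR)

# The ~25-30% correction (midpoint 27.5%) for the same 200 EUR:
print(gain_per_100_eur(27.5, 200))  # 13.75 (% per 100 EUR)
```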
Posted on Reply
#52
londiste
TheGuruStudSame clocks, same arch, less CCs. Do the math. Baloney tricks don't count.
We do not know anything about clocks. Stock clocks have not really mattered for a couple of generations already.
Posted on Reply
#53
cucker tarlson
londisteWe do not know anything about clocks. Stock clocks have not really mattered for a couple of generations already.
Core clocks will be similar. I think there's either an improvement in shading performance, or they simply went back to a higher shader clock, like we saw in the past when the shader clock ran higher than the core clock.
Posted on Reply
#54
londiste
From what little has been disclosed about the architecture, Turing's SMs are close to Volta's. When it comes to gaming, that's not a big difference. Separate clock domains are not likely either; that would make things unnecessarily complicated. Also, the distribution of other units, even ROPs and TMUs, suggests not much has changed.

All Pascal cards run considerably higher than their boost clocks given even remotely adequate cooling. It is probably safe to say Turing cards will not be running at their boost clocks either, but higher. The question is how much higher. Overclocking-wise, all Pascals top out somewhere a little past 2 GHz. If Turing cannot put at least 10-15% on top of Pascal's clocks, there will be a problem.
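As a rough sanity check on that 10-15% figure (the ~2.05 GHz Pascal ceiling below is this thread's estimate, not a measurement):

```python
# Clock targets if Turing must beat Pascal's practical ceiling by 10-15%.
pascal_ceiling_ghz = 2.05  # "a little past 2 GHz", per the post above

for uplift in (0.10, 0.15):
    print(f"+{uplift:.0%} -> {pascal_ceiling_ghz * (1 + uplift):.2f} GHz")
# +10% -> 2.26 GHz
# +15% -> 2.36 GHz
```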

Anyhow, we'll see in 2 weeks :)
Posted on Reply
#55
Vayra86
londisteFrom what little has been disclosed about the architecture, Turing's SMs are close to Volta's. When it comes to gaming, that's not a big difference. Separate clock domains are not likely either; that would make things unnecessarily complicated. Also, the distribution of other units, even ROPs and TMUs, suggests not much has changed.

All Pascal cards run considerably higher than their boost clocks given even remotely adequate cooling. It is probably safe to say Turing cards will not be running at their boost clocks either, but higher. The question is how much higher. Overclocking-wise, all Pascals top out somewhere a little past 2 GHz. If Turing cannot put at least 10-15% on top of Pascal's clocks, there will be a problem.

Anyhow, we'll see in 2 weeks :)
Pascal doesn't top out at 2 GHz on air; most of the time you end up around 1900-1950 MHz due to thermals / heat build-up over time. And that is with some of the better open-air versions. Only the x70 and lower can reliably top 2 GHz at full load.

Turing isn't on a fantastic node or anything, and I really don't see how they are going to go much higher, especially when the RT cores are putting out extra heat.
Posted on Reply
#56
stimpy88
nVidia is very busy transforming itself into an Apple wannabe. They started many years ago with the marketing department, then sales, and soon nVidia will be the only one making and selling its own cards. It's the only way they see to increase profits and control the market with an even tighter grip. Another bonus for them is that they can charge even more and keep prices artificially high, as they can create their own "shortages" much more easily.
Posted on Reply
#57
SIGSEGV
Seems the price doesn't matter at all to their loyalists. LOL.
Oh NV, you actually can suck their wallets more and more. You're stupid. LOL

INSANE
Posted on Reply
#59
dir_d
I canceled my 2080 pre-order and bought a used 1080 Ti. Should be great for 1440p. After the 3DMark numbers and DLSS having to be baked into a game, I'm not going to pay for nVidia's R&D into ray tracing.
Posted on Reply
#60
AsRock
TPU addict
LFaWolfI don't mean to offend you, but you are the one showing your anger toward the CEO. Sounds like you are angry about paying for the cards over the years.

I think the price is slightly higher but justifiable. The 2080 Ti is for the top end, and for people that don't want to pay that, the 2080 is an option. It is faster than the 1080 Ti but at a higher cost. The problem is people are anchored to the idea that the Ti is the top card and “should” be at a certain price point. I say buy what you can afford if the performance gain gives you satisfaction. Not everyone needs the top-end cards.
So, speaking hypothetically, going by what you say, we should pay even more on top of the cost of a 3080 when it's released because it's faster, and older cards should not go down in price.
dir_dI canceled my 2080 pre-order and bought a used 1080 Ti. Should be great for 1440p. After the 3DMark numbers and DLSS having to be baked into a game, I'm not going to pay for nVidia's R&D into ray tracing.
Well, why pay for something that you cannot use lol, screw that shit.
Posted on Reply
#61
rtwjunkie
PC Gaming Enthusiast
dir_dI canceled my 2080 pre-order and bought a used 1080 Ti. Should be great for 1440p. After the 3DMark numbers and DLSS having to be baked into a game, I'm not going to pay for nVidia's R&D into ray tracing.
It’s excellent at 1440p! It plays that resolution like a 980Ti plays 1080p.
Posted on Reply
#62
yeeeeman
sweetThe chip is too big because it was originally aimed at AI. Most of the real estate is not really beneficial for games. AND nVidia PR is really working overtime to push those features as "Gaming".

If they instead just gave us a true DX12 card with async compute and used that 750 mm² for good old CUDA cores, the price would be justified.
I truly doubt it is a mistake of a chip. These things are thought out from early on; ray tracing is not an AI feature, and the AI part of the chip doesn't take up a whole lot of space.
But yes, indeed, they are quite expensive - rightfully so, perhaps - but not worth the money.
Posted on Reply
#63
cucker tarlson
Vayra86Pascal doesn't top out at 2 GHz on air; most of the time you end up around 1900-1950 MHz due to thermals / heat build-up over time. And that is with some of the better open-air versions. Only the x70 and lower can reliably top 2 GHz at full load.

Turing isn't on a fantastic node or anything, and I really don't see how they are going to go much higher, especially when the RT cores are putting out extra heat.
2152 MHz stable on air here, baby :D on stock voltage
rtwjunkieIt’s excellent at 1440p! It plays that resolution like a 980Ti plays 1080p.
Even better than a 980 Ti runs 1080p.
Posted on Reply
#64
Slizzo
cucker tarlsonYou should wait for reviews anyway.

I have no horse in this race except for very high hopes for 2070 matching 1080Ti or coming very close. 2080 is going to be a meh deal compared to 2070 while 2080Ti is just ridiculous at $1000 even though it will offer amazing performance.
This time around, I don't think the 2070 will match the previous Ti card like it has in the past. The 2080 will probably match the 1080 Ti. Which is sad.
TheGuruStudWomp, womp, nvidia seems to be hiding something... Only the biggest huang suckers allowed www.hardocp.com/article/2018/08/28/nvidia_controls_aib_launch_driver_distribution/
That is nothing new. AMD does this as well. Kyle has been incredibly anti-NVIDIA ever since the GPP fiasco. He was right to be angry about that, but now he's just hit insanity mode (the NDA thing was blown out of proportion too; NVIDIA's NDA was a bog-standard boilerplate NDA).
Posted on Reply
#65
medi01
SlizzoAMD does this as well.
AMD "does this", my arse, but this kind of whitewashing is quite a bit better than "no big deal".
SlizzoNVIDIA's NDA was a bog-standard boilerplate NDA.
An outright lie.
It wasn't even Kyle who was the first to refuse to sign the bend-over-backward NDA; it was heise.de, a site that is, to put it softly, somewhat bigger than this one.
Posted on Reply
#66
cucker tarlson
AMD pulled their Ryzen and Threadripper samples from a popular Polish tech site because they wanted to impose changes to its testing and the site refused. They do this crap too.
If you believe any of the big three is clean and fair, you'd better have your cataracts operated on ASAP. Nvidia and Intel have done more of it in recent years because they have a grip on the majority of the market, but just wait until AMD wins the majority of the market for themselves...
Posted on Reply
#67
John Naylor
A lot of strong opinions for a card that hasn't been released or reviewed yet. I expect to see that in politics, not in the tech sector, but it would seem that AMD / nVidia discussions are more political than technical... and they can't be technical until there's real data to discuss. There's not much difference between the Red and Blue (in US politics) arguing about subjects based upon presumption rather than facts, and many of these red and green discussions.
Posted on Reply
#68
TheDeeGee
These prices are ****ing insane.
Posted on Reply
#69
LFaWolf
AsRockSo, speaking hypothetically, going by what you say, we should pay even more on top of the cost of a 3080 when it's released because it's faster, and older cards should not go down in price.

Well, why pay for something that you cannot use lol, screw that shit.
Nice try, but no. The prices are set via the same economic principle - supply and demand, and consumers' willingness to pay. During the mining craze, all AIBs increased their prices and people - or miners - still bought them up and out. So they continued to increase prices, and people continued to buy them up. For businesses, it is important not to leave any money on the table. nVidia did the price increase, along with some performance improvement, to see how the market will react. Will they sell out? Will people reject the price increase? Then they will do a price adjustment. Or bundle some games. This is more straightforward than you think. Don't like the price increase? Don't like the performance improvement? Don't buy. Can't afford to pay? Like the performance but don't want to pay for it? Don't buy. Simple as that.

Prices of the 10xx series did go down. Perhaps not as much as you want or expect, but they did go down.
Posted on Reply
#70
oxidized
Better to wait for Navi; hopefully it will be only slightly slower than this, but prices won't be this insane.
Posted on Reply
#71
ThisguyTM
cucker tarlsonThat is inevitable, sadly. Big die = low yield = big costs.

So let's make a guess at how much Nvidia makes per wafer on Turing, assuming wafer plus production cost is $50k and yield is bad, at less than 20% (most definitely not the case, as Nvidia had TSMC custom-make 12 nm for them, and 12 nm is basically improved 16 nm). Total wafer yield is 68 dies; at 20% yield you are looking at 14 perfect GPU dies.
Those perfect GPU dies will be sold as the Quadro RTX 8000, each costing $10k. So 14 × $10k is $140k - Nvidia would have made a significant profit on those.

Not to mention that not all defective GPU dies are completely dead; those go on to become second/third-tier RTX Quadros, and the worst-quality silicon goes on to be the 2080 Ti, or is shared with the third-tier Quadro to meet demand. So Nvidia is making a significant amount of cash per wafer. People like to say or justify the high price of the 2080 Ti with "Oh, it's expensive because the GPU die is massive" - no, it's not; it's expensive simply because Nvidia can and will charge it, and fanboys will still buy it regardless. This is what happens when consumers allow it.
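For what it's worth, a minimal sketch of that back-of-the-envelope wafer math, using the classic dies-per-wafer approximation; the $50k wafer cost, 20% yield, and $10k RTX 8000 price are the post's assumptions, not official figures:

```python
import math

# Assumptions from the post above (not official figures):
WAFER_DIAMETER_MM = 300      # standard TSMC wafer
DIE_AREA_MM2 = 754           # TU102 is roughly 754 mm^2
WAFER_COST_USD = 50_000      # assumed wafer + production cost
PERFECT_YIELD = 0.20         # assumed worst-case perfect-die yield
RTX8000_PRICE_USD = 10_000   # rough Quadro RTX 8000 launch price

def dies_per_wafer(diameter_mm: float, die_area_mm2: float) -> int:
    """Classic dies-per-wafer approximation with an edge-loss term."""
    area = math.pi * (diameter_mm / 2) ** 2
    edge_loss = math.pi * diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(area / die_area_mm2 - edge_loss)

candidates = dies_per_wafer(WAFER_DIAMETER_MM, DIE_AREA_MM2)  # ~69 dies
perfect = round(candidates * PERFECT_YIELD)                   # ~14 dies
revenue = perfect * RTX8000_PRICE_USD                         # ~$140k

print(f"{candidates} candidate dies, {perfect} perfect -> "
      f"${revenue:,} revenue vs ${WAFER_COST_USD:,} wafer cost")
```

Even on these pessimistic assumptions, the perfect dies alone would bring in roughly 2.8× the assumed wafer cost, which is the post's point.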
Posted on Reply
#72
cucker tarlson
ThisguyTMand 12 nm is basically improved 16 nm). Total wafer yield is 68 dies; at 20% yield you are looking at 14 perfect GPU dies.
Those perfect GPU dies will
Die size is a factor, no competition is a factor, Pascal overstock is a factor, new memory is a factor, new architecture is a factor... they can price them like this simply because they can afford to. Mining did not help, because we're dealing with Pascal overstock, and AMD does not help either when they openly say Vega 20 is for HPC and professionals.
Posted on Reply
#73
LFaWolf
ThisguyTMSo let's make a guess at how much Nvidia makes per wafer on Turing, assuming wafer plus production cost is $50k and yield is bad, at less than 20% (most definitely not the case, as Nvidia had TSMC custom-make 12 nm for them, and 12 nm is basically improved 16 nm). Total wafer yield is 68 dies; at 20% yield you are looking at 14 perfect GPU dies.
Those perfect GPU dies will be sold as the Quadro RTX 8000, each costing $10k. So 14 × $10k is $140k - Nvidia would have made a significant profit on those.
Wait, how did you "assume" the cost of the wafer plus production is $50k? Your calculation of profit is based on this "number". Your idea that retail price - production wafer cost = pure profit is absolutely wrong. The cost of a single card is not just the wafer. It includes so many things - engineering, administration, marketing, employee salaries, pre-production sampling, 10 years of R&D, PCB manufacturing, licensing, etc. The list goes on.
ThisguyTMNot to mention that not all defective GPU dies are completely dead; those go on to become second/third-tier RTX Quadros, and the worst-quality silicon goes on to be the 2080 Ti, or is shared with the third-tier Quadro to meet demand. So Nvidia is making a significant amount of cash per wafer. People like to say or justify the high price of the 2080 Ti with "Oh, it's expensive because the GPU die is massive" - no, it's not; it's expensive simply because Nvidia can and will charge it, and fanboys will still buy it regardless. This is what happens when consumers allow it.
So bad dies become the 2080 Ti? What??? :kookoo: So my 1080 Ti is actually a defective chip? You know, the die could be bad in places that are important. Most bad dies cannot be reused, especially for a high-end chip/card that utilizes most of the die.
Posted on Reply
#74
mnemo_05
Can't wait for the release of this card. And no, I won't be getting an RTX at these prices; that is a very dumb thing to do.

I can't wait for the prices of the 10-series to go down further; my buy-in for a 1080 Ti is at 400-450.
Posted on Reply
#75
Vayra86
mnemo_05Can't wait for the release of this card. And no, I won't be getting an RTX at these prices; that is a very dumb thing to do.

I can't wait for the prices of the 10-series to go down further; my buy-in for a 1080 Ti is at 400-450.
I doubt you will find one at 450. That's probably where the 1080 will land, simply because Turing does not make the same performance 'cheaper' - it just makes more performance equally more expensive.

The reason the Maxwell 980 Ti dropped sharply in price is that Pascal was so fast in comparison, even with the price hike.
Posted on Reply