Thursday, August 30th 2018
NVIDIA: Don't Expect Base 20-Series Pricing to Be Available at Launch
Tom Petersen, NVIDIA's director of technical marketing, expressed confidence in the value of the company's RTX 20-series graphics cards in a webcast with HotHardware - but threw a wrench into consumers' pricing expectations, which were set by NVIDIA's own MSRPs. That NVIDIA's pricing for its Founders Edition graphics cards would give partners leeway to increase their margins was a given - why exactly would they sell at lower prices than NVIDIA when they have increased logistical (and other) costs to support? This move by NVIDIA might even serve as a small helping hand for partners: remember that every NVIDIA-manufactured graphics card sold is one that doesn't add to an AIB's bottom line, so there's effectively another AIB contending for their profits. This way, NVIDIA gives them an opportunity to make some of those profits back (at least relative to official MSRP).

Tom Petersen had this to say on the HotHardware webcast: "The partners are basically going to hit the entire price point range between our entry level price, which will be $499, up to whatever they feel is the appropriate price for the value that they're delivering. (...) In my mind, really the question is saying 'am I gonna ever see those entry prices?' And the truth is: yes, you will see those entry prices. And it's really just a question of how are the partners approaching the market. Typically when we launch there is more demand than supply and that tends to increase the at-launch supply price."
Of course, there were some mitigating words left for last: "But we are working really hard to drive that down so that there is supply at the entry point. We're building a ton of parts and it's the natural behaviour of the market," Tom Petersen continued. "So it could be just the demand/supply equation working its way into retail pricing - but maybe there's more to it than that."
Sources:
HotHardware Webcast, via PCGamesN
Around $50k for production of the silicon alone per wafer (maybe closer to $30k). The bill of materials for a Titan- or Quadro-grade card could be $200 or less (covering the whole card, from the silicon die to the VRM and PCB), and these sell for $1,500 to $10,000 - so how much profit margin again? Oh, not much, only $1k to $9k per card. Say NVIDIA bought 20k wafers from TSMC per month with only 14 perfect dies per wafer (very unlikely - that's a worst case here): that's around 280k perfect-grade Quadro RTX 8000 dies ($10k each, about $9,600 after subtracting the BOM), which works out to roughly $2.7 billion. That's 20k wafers, one month. NVIDIA's R&D was $2 billion, so NVIDIA would only need to sell one or two months' worth of Quadro RTX 8000s to recoup everything. Where do you think the funding for the 2000 series comes from? Right, Pascal - and based on NVIDIA's earnings, Pascal made them a shit-ton of money.
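Spelling that back-of-the-envelope math out - every figure below is the comment's own assumption, not a verified cost - a minimal sketch:

```python
# The comment's back-of-envelope math, spelled out.
# All figures are the commenter's assumptions, not verified costs.
wafers_per_month = 20_000        # assumed wafers bought from TSMC per month
good_dies_per_wafer = 14         # assumed "worst case" perfect dies per wafer
price_per_card = 10_000          # assumed Quadro RTX 8000 price, USD
margin_per_card = 9_600          # the comment's per-card figure after BOM, USD

dies_per_month = wafers_per_month * good_dies_per_wafer   # 280,000 dies
revenue = dies_per_month * price_per_card                 # ~$2.80 billion
gross_margin = dies_per_month * margin_per_card           # ~$2.69 billion

print(f"{dies_per_month:,} dies -> ${revenue/1e9:.2f}B revenue, "
      f"${gross_margin/1e9:.2f}B margin per month")
```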
www.marketwatch.com/investing/stock/nvda/financials - there you go: NVIDIA's net profit was $1.6 billion in 2017 and $3 billion in 2018. Total revenue in 2018 was $9 billion - actually closer to $10 billion. Their operating expenses are in the dozens of millions, administrative and marketing costs are in the hundreds of millions ($800 million in 2018), and R&D is $2 billion - so can you see how much NVIDIA makes? Stop making excuses for them like "oh, it's expensive because yada yada yada."
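And a quick margin check using the rounded figures quoted above (the comment's own inputs, not audited numbers):

```python
# Rough margin check from the rounded FY2018 figures quoted above (not audited numbers).
revenue    = 9.7e9   # total revenue, USD ("$9B, closer to $10B")
net_income = 3.0e9   # net profit, USD
rnd        = 2.0e9   # R&D spend, USD
sga        = 0.8e9   # administrative/marketing etc., USD

print(f"net margin:       {net_income / revenue:.0%}")   # ~31%
print(f"R&D + SG&A share: {(rnd + sga) / revenue:.0%}")  # ~29%
```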
"10 years of R&D" - dude, you seem like exactly the type Huang likes: very easy to get money from. Apart from the new Tensor cores and RT cores, the traditional cores in the 2000 series are not that different from Pascal (not much R&D there). Yes, R&D is a non-recurring cost, but since the 2000-series cores are similar to the Pascal design, NVIDIA didn't have to spend much there; most of the R&D went into new features that few games will take advantage of in the next three years. So how does it feel to pay more every year? In short, it's expensive because NVIDIA says so and there's no competition.
1. Show me the proof that all 1080 Ti chips are defective dies originally destined for the Titan, because I don't believe you.
2. $2B R&D per year. This is a 10-year project. Do the math.
3. For the number of cards and products they sell, net income of $3B is good, but not great. Many companies post much higher figures: Intel is at $9B, Apple at $48B. Don't hate on their success.
4. Turing is a new architecture. Not the same as Pascal.
www.techpowerup.com/gpudb/2863/titan-x-pascal - Titan XP
www.techpowerup.com/gpudb/2877/geforce-gtx-1080-ti - 1080ti
Wow, suspiciously similar. Both are 471 mm² with the exact same transistor count. Yep, they're not the same, you're right :( How could my eyes deceive me - clearly these two GPU chips are not the same. Silly me.
Bonus
www.techpowerup.com/gpudb/2865/quadro-p6000 - Quadro P6000
wow much similar such amazement, oh noooo, I must get my eyes checked, they are not the same. :(
2) "10-year project"? Come on. NVIDIA did not spend 10 years' worth of R&D on RTX - $2 billion per year for a decade? :kookoo: "10 years in the making" does not mean $2 billion in R&D per year; that would be terrible business. NVIDIA makes it sound like they designed this for 10 years, but that's marketing. Ray tracing was done long ago; it isn't new. They had the idea to do it in real time, but the technology (the lithography process) wasn't there yet - otherwise Kepler would have had some inkling of ray tracing in its architecture. By that logic, AMD also "spent 10 years of R&D" on this; their first ray tracing demo was back in 2008, go check it out.
3) Good, good - NVIDIA, listen to this guy: raise your profit margins so you can milk more money out of consumers.
4) Turing's traditional cores are similar to Pascal's (rasterization, which is 99% of games); the only new parts are the ray tracing cores, and Tensor cores were already available in Volta. All of the R&D money went into the new stuff, and again, the R&D funding came from Pascal - and there won't be mass adoption of the new tech in games for another 2-3 years. The 2080 Ti has been shown to be just barely capable of ray tracing at 60 FPS at 1080p - I repeat, 60 FPS at 1080p. I don't know about you, but in competitive games I'd rather have over 100 FPS than pretty shadows. So what about the 2080? The 2070? Hell, the 2060 - ray tracing at 60 FPS at 480p?
1. Still no proof that defective dies became the 1080 Ti. Just because the dies are the same size does not mean they are defective - that's just your argument, your perception. Binned chips != defective chips in every case. Show me the proof.
2. From the financials you linked: from 2014 onward they spent at least $1.3B a year on R&D. Can you do the math? (A rough lower-bound sum is sketched after this list.)
3. Every business will do this given the opportunity. It is a basic principle of economics: leave no money on the table. Do you expect charity?
4. Didn't know you have access to pre-release benchmark. Good for you.
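As a rough lower bound for point 2, using only the "$1.3B a year from 2014 on" floor mentioned above (actual per-year figures would be higher):

```python
# Lower-bound estimate of cumulative R&D spend, using only the "at least $1.3B a year"
# floor from 2014 onward quoted above (actual per-year figures are higher and vary).
annual_floor = 1.3e9              # USD per year
years = list(range(2014, 2019))   # 2014 through 2018
cumulative = annual_floor * len(years)
print(f">= ${cumulative/1e9:.1f}B on R&D over {len(years)} years")  # >= $6.5B
```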
2) Those budgets went into developing the next-generation GPUs: 2014 R&D paid for the 2016 GPUs, 2010 R&D for the 2012 GPUs, and so on. Oh right, those billions in R&D every year over the past decade were all funding the development of the 2000 series ten years later. Wow, such dedication - a company spending $2 billion every year researching tech it will only benefit from a decade later. Mind blown, I didn't know that. NVIDIA's Fermi, Kepler, Maxwell and Pascal apparently didn't need R&D - they were developed by magic. Amazing stuff.
3) This contradicts your own statement: if it's "10 years in the making", you need ROI on it, or else you're a fool. Again, "10 years in the making" does not equal $2.2 billion yearly in R&D; the bulk of the R&D funds the development of the next GPU architecture, with ray tracing probably researched on the side. Again, this is marketing - "look, 10 years in the making" - yet it can only do 1080p at 60 FPS. The fanboys voted with their money even when AMD offered superior tech, so now we get to enjoy the monopoly - look at how the CPU market turned out because of competition. NVIDIA has been steadily increasing its margins simply because there is no competition. At what point do you all wake up? The 2080 Ti at a $1,200 suggested retail price, yet it sold out at $1,700? NVIDIA has the data now; they know people are willing to pay more, so expect the 3000 series to be even more expensive.
4) The ray tracing demo numbers come straight from NVIDIA and select game studios - what, is NVIDIA lying now? The 2080 Ti can just barely do 60 FPS at 1080p with ray tracing - need I repeat myself, 1080p? That's why Huang said "10 years": it took 10 years of GPU horsepower increases to reach this level. Do you think the 2060 can do ray tracing at 1080p 30 FPS? Did you not watch the demo? "Look at those reflections!" - as if anyone is going to stand there staring into their opponent's eyes, marveling at the reflections, instead of shooting them. Must be nice paying for technology you can't use, or see only limited benefits from, for the next three years. I'd be happier if NVIDIA had dedicated the whole ~700 mm² die to traditional compute and we saw a massive 2x performance jump over Pascal - 4K maxed out at 144 FPS is way, way better than barely 60 FPS of low-res ray tracing.
Be it memory controller issues, clock issues with all cores enabled, a leaky die... it was not capable of performing without errors with everything enabled at the required clocks.
There are no fully enabled RTX dies... the Quadro RTX 8000 doesn't even use the full die, and the RTX 2080 Ti is cut down from that.
And Heise was working off a translation - one that appears to be bad. Both AMD and NVIDIA choose who gets access to pre-launch hardware. Like I said, nothing new there.
The only exception is if the cards don't perform 35-50% better (2080 Ti/2080) than the 10-series cards in current titles.
Agreed that we are paying for RT that may not be ready at launch, but I think we will see it get better.
No one likes the price, but it's the new 'norm'.
Oh, and what was the point of linking the guy who:
a) has been caught with shady stuff before;
b) signed the NDA and has every reason to pretend he didn't bend over backwards.
EDIT: Also, watch from here:
Basically, the gist of it? Linus sees no problem with the NDA, nor with NVIDIA controlling drivers pre-launch. He said the NDA was boilerplate, bog-standard NDA language, and that NVIDIA has every right to protect its interests - which is true.
No, it's not a "usual NDA".
No, it isn't "normal".
Yes, it is very ambiguous. I didn't know he was a lawyer, or a source known for his principles.
Yes, it IS a usual NDA according to all credible sources. Yes, it is normal, and it's really not all that ambiguous. But you keep being blindly anti-NVIDIA.
Linus isn't a lawyer, no. But he has had to sign many an NDA in his career, and if he sees no issue with it - especially as a small business owner who relies on getting products in for review in order to make money....