Thursday, August 30th 2018

NVIDIA: Don't Expect Base 20-Series Pricing to Be Available at Launch

Tom Petersen, NVIDIA's director of technical marketing, in a webcast with HotHardware, expressed confidence in the value of its RTX 20-series graphics cards - but threw a wrench into consumers' pricing expectations, set by NVIDIA's own MSRP. That NVIDIA's pricing for its Founders Edition graphics cards would give partners leeway to increase their margins was a given - why exactly would they sell at lower prices than NVIDIA, when they have increased logistical (and other) costs to support? And this move by NVIDIA might even serve as a small concession to partners - remember that every NVIDIA-manufactured graphics card sold is one that doesn't contribute to its AIBs' bottom lines, so there's effectively another AIB contending for their profits. This way, NVIDIA gives them an opportunity to make some of those profits back (at least relative to the official MSRP).
Tom Petersen had this to say on the HotHardware webcast: "The partners are basically going to hit the entire price point range between our entry-level price, which will be $499, up to whatever they feel is the appropriate price for the value that they're delivering. (...) In my mind, really the question is saying 'am I gonna ever see those entry prices?' And the truth is: yes, you will see those entry prices. And it's really just a question of how the partners are approaching the market. Typically when we launch there is more demand than supply and that tends to increase the at-launch supply price."

Of course, some mitigating words were left for last: "But we are working really hard to drive that down so that there is supply at the entry point. We're building a ton of parts and it's the natural behaviour of the market," Tom Petersen continued. So it could be just the demand/supply equation working its way into retail pricing - but maybe there's more to it than that.
Sources: HotHardware Webcast, via PCGamesN

95 Comments on NVIDIA: Don't Expect Base 20-Series Pricing to Be Available at Launch

#76
Unregistered
Me to NVIDIA: Don't expect me to get my wallet out for your 20-series anytime soon (if ever).
#77
ThisguyTM
LFaWolf: Wait, how did you "assume" the cost of the wafer plus production cost is $50k? Your calculation of profit is based on this "number". Your idea of "retail price - production wafer cost = pure profit" is absolutely wrong. The cost of a single card is not just the wafer. It includes so many things - engineering, administration, marketing, employee salaries, pre-production sampling, 10 years of R&D, PCB manufacturing, licensing, etc. The list goes on.

So a bad die becomes the 2080 Ti? What??? :kookoo: So my 1080 Ti is actually a defective chip? You know, the die could be bad in places that are important. Most bad dies cannot be reused, especially for a high-end chip/card that utilizes most of the die.
Yes, your 1080 Ti is a defective chip - the lowest-quality silicon out of a wafer intended to produce Titans. It's not perfect; parts are lasered off to create your 1080 Ti. The perfect chips go on to be the Titan and Quadro (perfect chips are chips without any defects; dies that can't make that cut have no hope of becoming Titan-grade silicon :kookoo:). This is called binning. You mad that you paid money for bad silicon? Awww, should have gone and bought the Titan, or the 1080 instead of the 1070 :(

$50k for production of the silicon alone per wafer (maybe around $30k). The bill of materials for a Titan- or Quadro-grade GPU could be $200 or less (that includes the whole card, from the actual silicon die to the VRM to the PCB), and it sells for $1,500 to $10,000 - so how much profit margin again? Oh, not much, only $1k to $9k per card. Let's say NVIDIA bought 20k wafers from TSMC per month, at 14 perfect dies per wafer (very unlikely, worst case here); that's around 280k perfect-grade RTX 8000 dies ($10k each, $9,800 after subtracting BOM), which is roughly $2.7 billion. That's 20k wafers, one month. NVIDIA's R&D was $2 billion, so in fact NVIDIA just needs to sell one or two months' worth of Quadro RTX 8000s to recoup everything. Where do you think the funding for the 2000 series comes from? Oh right, Pascal - and based on NVIDIA's earnings on Pascal, they made a shitton of money.
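To make the arithmetic above easy to check, here is the same back-of-the-envelope math as a small Python sketch. Every number in it (wafer cost, yield, BOM, selling price) is an assumption from this thread, not an official NVIDIA figure:

```python
# Back-of-the-envelope wafer math using this thread's assumed numbers.
wafers_per_month = 20_000     # assumed wafer buy from TSMC
perfect_dies_per_wafer = 14   # deliberately pessimistic yield of flawless dies
wafer_cost = 50_000           # assumed production cost per wafer, USD
card_price = 10_000           # assumed Quadro RTX 8000 selling price, USD
bom = 200                     # assumed bill of materials per card, USD

dies = wafers_per_month * perfect_dies_per_wafer           # 280,000 dies/month
margin_per_card = card_price - bom                         # $9,800 per card
gross_margin = dies * margin_per_card                      # ~$2.74B per month
after_wafers = gross_margin - wafers_per_month * wafer_cost

print(f"{dies:,} perfect dies/month")
print(f"gross margin: ${gross_margin / 1e9:.2f}B")
print(f"after wafer cost: ${after_wafers / 1e9:.2f}B")
```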

www.marketwatch.com/investing/stock/nvda/financials - there: NVIDIA's clean profit was $1.6 billion in 2017 and $3 billion in 2018. Total revenue in 2018 was $9 fricking billion - actually closer to $10 fricking billion. Their operating expenses are in the dozens of millions; their administrative, marketing, etc. costs are in the hundreds of millions ($800 million in 2018); the R&D is $2 billion. So can you see how much NVIDIA makes? Stop giving people excuses that "oh, it's expensive because yadayadayada".

10 years of R&D - dude, this guy... seems like you're the type Huang likes, very easy to get money from. Apart from the new tensor cores and RT cores and stuff, the traditional cores in the 2000 series are not that different compared to Pascal (not much R&D there). Yes, it's a non-recurring cost, but since the 2000-series cores are similar to the Pascal version, they didn't have to spend much R&D there; they spent most of the R&D on new stuff that few games will take advantage of in the next 3 years. So how does it feel to pay more every year? In short, it's expensive because NVIDIA says so and there's no competition.
Posted on Reply
#78
LFaWolf
ThisguyTM:

Yes, your 1080 Ti is a defective chip - the lowest-quality silicon out of a wafer intended to produce Titans. It's not perfect; parts are lasered off to create your 1080 Ti. The perfect chips go on to be the Titan and Quadro (perfect chips are chips without any defects; dies that can't make that cut have no hope of becoming Titan-grade silicon :kookoo:). This is called binning. You mad that you paid money for bad silicon? Awww, should have gone and bought the Titan, or the 1080 instead of the 1070 :(

$50k for production of the silicon alone per wafer (maybe around $30k). The bill of materials for a Titan- or Quadro-grade GPU could be $200 or less (that includes the whole card, from the actual silicon die to the VRM to the PCB), and it sells for $1,500 to $10,000 - so how much profit margin again? Oh, not much, only $1k to $9k per card. Let's say NVIDIA bought 20k wafers from TSMC per month, at 14 perfect dies per wafer (very unlikely, worst case here); that's around 280k perfect-grade RTX 8000 dies ($10k each, $9,800 after subtracting BOM), which is roughly $2.7 billion. That's 20k wafers, one month. NVIDIA's R&D was $2 billion, so in fact NVIDIA just needs to sell one or two months' worth of Quadro RTX 8000s to recoup everything. Where do you think the funding for the 2000 series comes from? Oh right, Pascal - and based on NVIDIA's earnings on Pascal, they made a shitton of money.

www.marketwatch.com/investing/stock/nvda/financials - there: NVIDIA's clean profit was $1.6 billion in 2017 and $3 billion in 2018. Total revenue in 2018 was $9 fricking billion - actually closer to $10 fricking billion. Their operating expenses are in the dozens of millions; their administrative, marketing, etc. costs are in the hundreds of millions ($800 million in 2018); the R&D is $2 billion. So can you see how much NVIDIA makes? Stop giving people excuses that "oh, it's expensive because yadayadayada".

10 years of R&D - dude, this guy... seems like you're the type Huang likes, very easy to get money from. Apart from the new tensor cores and RT cores and stuff, the traditional cores in the 2000 series are not that different compared to Pascal (not much R&D there). Yes, it's a non-recurring cost, but since the 2000-series cores are similar to the Pascal version, they didn't have to spend much R&D there; they spent most of the R&D on new stuff that few games will take advantage of in the next 3 years. So how does it feel to pay more every year? In short, it's expensive because NVIDIA says so and there's no competition.
Okay, I have to sleep, so I'll make it short -
1. Show me proof that all 1080 Ti are defective chips destined for the Titan, as I don't believe you.
2. $2B R&D per year. This is a 10-year project. Do the math.
3. For the number of cards and products they sell, a net income of $3B is good, but not great. Many companies are much higher. Intel is $9B. Apple is $48B. Don't hate on their success.
4. Turing is a new architecture. Not the same as Pascal.
Posted on Reply
#79
ThisguyTM
LFaWolf: Okay, I have to sleep, so I'll make it short -
1. Show me proof that all 1080 Ti are defective chips destined for the Titan, as I don't believe you.
2. $2B R&D per year. This is a 10-year project. Do the math.
3. For the number of cards and products they sell, a net income of $3B is good, but not great. Many companies are much higher. Intel is $9B. Apple is $48B. Don't hate on their success.
4. Turing is a new architecture. Not the same as Pascal.
1) They came from the same wafer, using the same lithography mask as the Titan - it's called binning, OK? Don't you know that? It's how you can have the RX 580 and the RX 570: the wafer is made to produce only RX 580s, but due to defects not all chips end up becoming the RX 580. You want a prime example? Look at NVIDIA - why was the 1070 Ti released after a year? Two reasons: one, because of Vega 56; and two, the process had improved so much that they had excess chips that were better than the 1070 but could not be the 1080 due to defects, so NVIDIA released the 1070 Ti to make more money. This is the most basic "how to make money" of silicon wafers. A mask used to etch a wafer costs in excess of several million dollars; it would be extremely expensive to design a separate mask for each separate line of chips. The 1080 Ti is the lowest-quality silicon on the wafer, period - all the highest-quality silicon goes on to be sold in the professional market. Ever wonder why AMD's 1st-gen Ryzen Threadripper could clock way higher than its 1st-gen desktop counterpart? It's because Threadripper chips are the top 10% of chips in terms of quality.

www.techpowerup.com/gpudb/2863/titan-x-pascal - Titan XP
www.techpowerup.com/gpudb/2877/geforce-gtx-1080-ti - 1080ti
Wow, suspiciously similar, wow, much wow. Both are 471 mm² with the exact same transistor count, wow. Yep, they are not the same, you're right :( How could my eyes deceive me - clearly these two GPU chips are not the same. Silly me.
Bonus
www.techpowerup.com/gpudb/2865/quadro-p6000 - Quadro P6000
Wow, much similar, such amazement. Oh noooo, I must get my eyes checked - they are not the same :(
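For illustration, here is a toy Python sketch of the binning idea described above. The defect rates and bin cutoffs are entirely made up, and real screening (which SMs fail, clock and leakage behavior) is far more involved:

```python
# Toy model of die binning: one mask, one wafer, several SKUs.
# Defect distribution and bin cutoffs are invented for illustration only.
import random
from collections import Counter

def bin_die(defective_sms: int) -> str:
    """Assign a die to a SKU based on how many SMs failed testing."""
    if defective_sms == 0:
        return "Titan X / Quadro P6000"  # fully enabled die
    if defective_sms <= 2:
        return "GTX 1080 Ti"             # defective units fused (lasered) off
    return "scrap"                       # too damaged to salvage

random.seed(1)
# Pretend wafer: 100 dies, each with a random number of failed SMs.
wafer = random.choices([0, 1, 2, 3, 4], weights=[25, 30, 20, 15, 10], k=100)
print(Counter(bin_die(d) for d in wafer))
```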

2) A 10-year project? Bruh, what the hell? NVIDIA did not spend 10 years of R&D on this - I repeat, they did not spend 10 years' worth of R&D on RTX. $2 billion per year? Lmao :kookoo: :kookoo: :kookoo: "10 years in the making" does not mean $2 billion in R&D per year on it; that would be stupid business. NVIDIA makes it sound like "oh, we designed this for 10 years" - it's marketing. Ray tracing was done long ago; it's not new. They had the idea to do it in real time early on, but the technology (the lithography process) wasn't there yet - otherwise Kepler would have had some inkling of ray tracing in its architecture. By that logic, AMD also spent 10 years of R&D on this - their first ray tracing demo was back in 2008, go check it out.

3) Good, good. NVIDIA, listen to this guy - raise your profit margins so you can milk more money out of consumers.

4) Turing's traditional cores are similar to Pascal's (rasterization, 99% of games); the only new stuff is the ray tracing cores, and tensor cores were already available in Volta. All of the R&D money was spent on the new stuff, and again, all the R&D funding comes from Pascal - new tech which, yet again, will see no mass adoption in games for 2-3 years. The 2080 Ti is shown to be just barely capable of ray tracing at 60 FPS at 1080p - I repeat, 60 FPS at 1080p. Now I don't know about you, but I prefer over 100 FPS in competitive games to "wow, pretty shadows". So what about the 2080? The 2070? Hell, the 2060 - ray tracing at 60 FPS at 480p?
Posted on Reply
#80
LFaWolf
ThisguyTM: 1) They came from the same wafer, using the same lithography mask as the Titan - it's called binning, OK? Don't you know that? It's how you can have the RX 580 and the RX 570: the wafer is made to produce only RX 580s, but due to defects not all chips end up becoming the RX 580. You want a prime example? Look at NVIDIA - why was the 1070 Ti released after a year? Two reasons: one, because of Vega 56; and two, the process had improved so much that they had excess chips that were better than the 1070 but could not be the 1080 due to defects, so NVIDIA released the 1070 Ti to make more money. This is the most basic "how to make money" of silicon wafers. A mask used to etch a wafer costs in excess of several million dollars; it would be extremely expensive to design a separate mask for each separate line of chips. The 1080 Ti is the lowest-quality silicon on the wafer, period - all the highest-quality silicon goes on to be sold in the professional market. Ever wonder why AMD's 1st-gen Ryzen Threadripper could clock way higher than its 1st-gen desktop counterpart? It's because Threadripper chips are the top 10% of chips in terms of quality.

www.techpowerup.com/gpudb/2863/titan-x-pascal - Titan XP
www.techpowerup.com/gpudb/2877/geforce-gtx-1080-ti - 1080ti
Wow, suspiciously similar, wow, much wow. Both are 471 mm² with the exact same transistor count, wow. Yep, they are not the same, you're right :( How could my eyes deceive me - clearly these two GPU chips are not the same. Silly me.
Bonus
www.techpowerup.com/gpudb/2865/quadro-p6000 - Quadro P6000
Wow, much similar, such amazement. Oh noooo, I must get my eyes checked - they are not the same :(

2) A 10-year project? Bruh, what the hell? NVIDIA did not spend 10 years of R&D on this - I repeat, they did not spend 10 years' worth of R&D on RTX. $2 billion per year? Lmao :kookoo: :kookoo: :kookoo: "10 years in the making" does not mean $2 billion in R&D per year on it; that would be stupid business. NVIDIA makes it sound like "oh, we designed this for 10 years" - it's marketing. Ray tracing was done long ago; it's not new. They had the idea to do it in real time early on, but the technology (the lithography process) wasn't there yet - otherwise Kepler would have had some inkling of ray tracing in its architecture. By that logic, AMD also spent 10 years of R&D on this - their first ray tracing demo was back in 2008, go check it out.

3) Good, good. NVIDIA, listen to this guy - raise your profit margins so you can milk more money out of consumers.

4) Turing's traditional cores are similar to Pascal's (rasterization, 99% of games); the only new stuff is the ray tracing cores, and tensor cores were already available in Volta. All of the R&D money was spent on the new stuff, and again, all the R&D funding comes from Pascal - new tech which, yet again, will see no mass adoption in games for 2-3 years. The 2080 Ti is shown to be just barely capable of ray tracing at 60 FPS at 1080p - I repeat, 60 FPS at 1080p. Now I don't know about you, but I prefer over 100 FPS in competitive games to "wow, pretty shadows". So what about the 2080? The 2070? Hell, the 2060 - ray tracing at 60 FPS at 480p?
You make me laugh.
1. Still no proof that defective dies became the 1080 Ti. Just because they are the same size does not mean they are defective. That is just your argument, your perception. Binned chips != defective chips all the time. Show me the proof.
2. From the financials that you linked, from 2014 on, they spent at least $1.3B a year on R&D. Can you do math?
3. Every business will do this given the opportunity. It is a basic principle of economics - leave no money on the table. Do you expect charity?
4. Didn't know you have access to pre-release benchmarks. Good for you.
Posted on Reply
#81
londiste
LFaWolf: You make me laugh.
1. Still no proof that defective dies became the 1080 Ti. Just because they are the same size does not mean they are defective. That is just your argument, your perception. Binned chips != defective chips all the time. Show me the proof.
Dude, this is common knowledge and industry practice. What kind of proof do you expect?
Posted on Reply
#82
ThisguyTM
LFaWolf: You make me laugh.
1. Still no proof that defective dies became the 1080 Ti. Just because they are the same size does not mean they are defective. That is just your argument, your perception. Binned chips != defective chips all the time. Show me the proof.
2. From the financials that you linked, from 2014 on, they spent at least $1.3B a year on R&D. Can you do math?
3. Every business will do this given the opportunity. It is a basic principle of economics - leave no money on the table. Do you expect charity?
4. Didn't know you have access to pre-release benchmarks. Good for you.
1) Holy facking shit, man - the 1080 Ti cannot be a Quadro P6000 nor a Titan; it's defective in that sense. Instead of throwing it away, NVIDIA lasers off the defective parts of the die (hence the lower CUDA core count) and sells it as a 1080 Ti. It's the lowest-quality chip, goddamn it. What the hell. Can a 1080 Ti be a Quadro P6000? Can it? Can those defective CUDA cores found in the 1080 Ti be activated so it can be sold as a Quadro P6000? Wonder why those CUDA cores in the 1080 Ti are lasered off? Oh right, they're defective. Those 1080 GPU dies are of higher quality than those 1080 Ti GPU dies.

2) Those budgets are for developing next-gen GPUs, you turd - 2014 R&D is for 2016 GPUs, 2010 R&D is for 2012 GPUs, and so on. What the F? Oh right, those billions of R&D over the past decade were all used to fund the development of the 2000 series 10 years later. Wow, amazing, such dedication - you, as a company, spend $2 billion every year researching tech that you will only benefit from 10 years later. Mind blown, man, I didn't know that. Wow, super amazing. NVIDIA's Fermi, Kepler, Maxwell, and Pascal didn't need R&D, man; they were developed by magic. Amazing stuff.

3) This contradicts your own statement - 10 years in the making means you've got to get ROI on it, or else you're a dumbass. Again, 10 years in the making does not equal $2.2 billion yearly in R&D on it; the bulk of the R&D funds the development of the next GPU architecture, with ray tracing tech probably researched on the sidelines. Again, this is marketing: "oh look, 10 years in the making" - yet it can only do 1080p at 60 FPS. What the F? Yeah, the fanboys voted with their money even when AMD offered superior tech, so now we get to enjoy the monopoly - look at how the CPU market turned out because of competition. NVIDIA has been steadily increasing their margins simply because there is no competition. At what point do all of you guys wake up? The 2080 Ti at a $1,200 suggested retail price, yet it sold out at $1,700? NVIDIA has the data now; they know people are willing to pay more, so expect the 3000 series to be even more expensive.

4) The ray tracing demo numbers are direct from NVIDIA and select game studios, you turd - what, is NVIDIA lying now? The 2080 Ti can just about barely do 60 FPS at 1080p with ray tracing - need I repeat myself, 1080p? This is why Huang said 10 years: it took 10 years of GPU horsepower increases to reach this level. Do you think the 2060 can do ray tracing at 1080p 30 FPS? Did you not watch the demo? "Look at those reflections, woooooo" - as if someone is just going to stand there, look into their opponent's eyes, and marvel at the reflection instead of shooting the enemy. Must be nice paying for technology you can't use, or see limited benefit from, for the next 3 years. I'd be happier if NVIDIA instead dedicated the whole 700 mm² to traditional compute and we saw a massive 2x performance jump over Pascal - 4K maxed at 144 FPS is way, way, way better than barely 60 FPS at 1080p with low-res ray tracing shit.
Posted on Reply
#83
Patriot
LFaWolf: You make me laugh.
1. Still no proof that defective dies became the 1080 Ti. Just because they are the same size does not mean they are defective. That is just your argument, your perception. Binned chips != defective chips all the time. Show me the proof.
Binning = incapable of performing in that bin = defective.
Be it memory controller issues, clock issues with all cores enabled, a leaky die... the chip was not capable of performing without error with everything enabled at the required clocks.

There are no fully enabled RTX dies... the Quadro RTX 8000 doesn't even use the full die, and the RTX 2080 Ti is cut down from that.
Posted on Reply
#84
LFaWolf
ThisguyTM: 1) Holy facking shit, man - the 1080 Ti cannot be a Quadro P6000 nor a Titan; it's defective in that sense. Instead of throwing it away, NVIDIA lasers off the defective parts of the die (hence the lower CUDA core count) and sells it as a 1080 Ti. It's the lowest-quality chip, goddamn it. What the hell. Can a 1080 Ti be a Quadro P6000? Can it? Can those defective CUDA cores found in the 1080 Ti be activated so it can be sold as a Quadro P6000? Wonder why those CUDA cores in the 1080 Ti are lasered off? Oh right, they're defective. Those 1080 GPU dies are of higher quality than those 1080 Ti GPU dies.

2) Those budgets are for developing next-gen GPUs, you turd - 2014 R&D is for 2016 GPUs, 2010 R&D is for 2012 GPUs, and so on. What the F? Oh right, those billions of R&D over the past decade were all used to fund the development of the 2000 series 10 years later. Wow, amazing, such dedication - you, as a company, spend $2 billion every year researching tech that you will only benefit from 10 years later. Mind blown, man, I didn't know that. Wow, super amazing. NVIDIA's Fermi, Kepler, Maxwell, and Pascal didn't need R&D, man; they were developed by magic. Amazing stuff.

3) This contradicts your own statement - 10 years in the making means you've got to get ROI on it, or else you're a dumbass. Again, 10 years in the making does not equal $2.2 billion yearly in R&D on it; the bulk of the R&D funds the development of the next GPU architecture, with ray tracing tech probably researched on the sidelines. Again, this is marketing: "oh look, 10 years in the making" - yet it can only do 1080p at 60 FPS. What the F? Yeah, the fanboys voted with their money even when AMD offered superior tech, so now we get to enjoy the monopoly - look at how the CPU market turned out because of competition. NVIDIA has been steadily increasing their margins simply because there is no competition. At what point do all of you guys wake up? The 2080 Ti at a $1,200 suggested retail price, yet it sold out at $1,700? NVIDIA has the data now; they know people are willing to pay more, so expect the 3000 series to be even more expensive.

4) The ray tracing demo numbers are direct from NVIDIA and select game studios, you turd - what, is NVIDIA lying now? The 2080 Ti can just about barely do 60 FPS at 1080p with ray tracing - need I repeat myself, 1080p? This is why Huang said 10 years: it took 10 years of GPU horsepower increases to reach this level. Do you think the 2060 can do ray tracing at 1080p 30 FPS? Did you not watch the demo? "Look at those reflections, woooooo" - as if someone is just going to stand there, look into their opponent's eyes, and marvel at the reflection instead of shooting the enemy. Must be nice paying for technology you can't use, or see limited benefit from, for the next 3 years. I'd be happier if NVIDIA instead dedicated the whole 700 mm² to traditional compute and we saw a massive 2x performance jump over Pascal - 4K maxed at 144 FPS is way, way, way better than barely 60 FPS at 1080p with low-res ray tracing shit.
Ah, can't argue properly with proof or facts, so you go on the attack with cussing and insults. How typical. Again, no proof - just you saying it. Anyway, I am out of this thread. No need to take insults on a forum.
Patriot: Binning = incapable of performing in that bin = defective.
Be it memory controller issues, clock issues with all cores enabled, a leaky die... the chip was not capable of performing without error with everything enabled at the required clocks.

There are no fully enabled RTX dies... the Quadro RTX 8000 doesn't even use the full die, and the RTX 2080 Ti is cut down from that.
Again, where is the proof that all 1080 Ti are defective Quadros? Only some can be salvaged for the 1080 Ti, as the die utilization is too big.
Posted on Reply
#85
ThisguyTM
LFaWolf: Ah, can't argue properly with proof or facts, so you go on the attack with cussing and insults. How typical. Again, no proof - just you saying it. Anyway, I am out of this thread. No need to take insults on a forum.
What the heck? Go and ask around; you will get the same answer. I see that you are strong in denial. Have a good day :)
LFaWolf: Again, where is the proof that all 1080 Ti are defective Quadros? Only some can be salvaged for the 1080 Ti, as the die utilization is too big.
In case you don't know, not all of the defective die area is lasered off; some of it is turned off via microcode/BIOS. That's why you can do something like this: www.techpowerup.com/articles/overclocking/vidcard/159 - not recommended, though, because those cores are disabled for a reason; it's a lower-quality chip after all. In NVIDIA's case, they laser off defective areas of the die - usually the whole SM is lasered off.
Posted on Reply
#86
mugatopdub21
Well-put-together post; I daresay most if not all of it is accurate - it's business. We just need to say no. What they've done is effectively reset the bar on graphics in the industry. Yes, they could have dedicated the entire die to CUDA, but then what next time? They need to sell us cards for the foreseeable future, so: keep the same number of CUDA cores (well, +15%), double the die size, and use the rest of the real estate on AI and RT. Next gen, make 75% of the core AI/RT and leave 25% for traditional methods. That will be on the die shrink, so they can get away with it - guaranteed the 3080 will be as fast at rasterization as a 1080 Ti but twice as fast as a 2080 in RT. Watch.
ThisguyTM:

Yes, your 1080 Ti is a defective chip - the lowest-quality silicon out of a wafer intended to produce Titans. It's not perfect; parts are lasered off to create your 1080 Ti. The perfect chips go on to be the Titan and Quadro (perfect chips are chips without any defects; dies that can't make that cut have no hope of becoming Titan-grade silicon :kookoo:). This is called binning. You mad that you paid money for bad silicon? Awww, should have gone and bought the Titan, or the 1080 instead of the 1070 :(

$50k for production of the silicon alone per wafer (maybe around $30k). The bill of materials for a Titan- or Quadro-grade GPU could be $200 or less (that includes the whole card, from the actual silicon die to the VRM to the PCB), and it sells for $1,500 to $10,000 - so how much profit margin again? Oh, not much, only $1k to $9k per card. Let's say NVIDIA bought 20k wafers from TSMC per month, at 14 perfect dies per wafer (very unlikely, worst case here); that's around 280k perfect-grade RTX 8000 dies ($10k each, $9,800 after subtracting BOM), which is roughly $2.7 billion. That's 20k wafers, one month. NVIDIA's R&D was $2 billion, so in fact NVIDIA just needs to sell one or two months' worth of Quadro RTX 8000s to recoup everything. Where do you think the funding for the 2000 series comes from? Oh right, Pascal - and based on NVIDIA's earnings on Pascal, they made a shitton of money.

www.marketwatch.com/investing/stock/nvda/financials - there: NVIDIA's clean profit was $1.6 billion in 2017 and $3 billion in 2018. Total revenue in 2018 was $9 fricking billion - actually closer to $10 fricking billion. Their operating expenses are in the dozens of millions; their administrative, marketing, etc. costs are in the hundreds of millions ($800 million in 2018); the R&D is $2 billion. So can you see how much NVIDIA makes? Stop giving people excuses that "oh, it's expensive because yadayadayada".

10 years of R&D - dude, this guy... seems like you're the type Huang likes, very easy to get money from. Apart from the new tensor cores and RT cores and stuff, the traditional cores in the 2000 series are not that different compared to Pascal (not much R&D there). Yes, it's a non-recurring cost, but since the 2000-series cores are similar to the Pascal version, they didn't have to spend much R&D there; they spent most of the R&D on new stuff that few games will take advantage of in the next 3 years. So how does it feel to pay more every year? In short, it's expensive because NVIDIA says so and there's no competition.
Posted on Reply
#87
Slizzo
medi01: AMD "does this", my arse, but this kind of whitewashing is quite a bit better than "no big deal".


An outright lie.
It wasn't even Kyle who was the first to refuse to sign the bend-over-backwards NDA; it was heise.de, a site that is, to put it softly, somewhat bigger than this one.

And, Heise was working off a translation, one that appears to be bad. And both AMD and NVIDIA choose who has access to prelaunch hardware. Like I said, nothing new there.
Posted on Reply
#88
londiste
mugatopdub21: Well-put-together post; I daresay most if not all of it is accurate - it's business. We just need to say no. What they've done is effectively reset the bar on graphics in the industry. Yes, they could have dedicated the entire die to CUDA, but then what next time? They need to sell us cards for the foreseeable future, so: keep the same number of CUDA cores (well, +15%), double the die size, and use the rest of the real estate on AI and RT. Next gen, make 75% of the core AI/RT and leave 25% for traditional methods. That will be on the die shrink, so they can get away with it - guaranteed the 3080 will be as fast at rasterization as a 1080 Ti but twice as fast as a 2080 in RT. Watch.
It is an exercise in optimization. The shader resources available are OK-ish for 4K gaming. Especially with monitors lagging behind, they need something to sell GPUs with. Ray tracing in one form or another was the obvious choice, and a good one at that.
Posted on Reply
#89
AltCapwn
Prima.Vera: I would definitely NOT call 500€ for a vanilla GTX 1080 card "a bargain" ;)
It's 600 CAD over here, which is 394 EUR as of now.
Posted on Reply
#90
Globespy
Darkoyan: Then don't expect any kind of decent sales numbers. The RTX 20 series is little less than a scam.
Sadly, I think you will be wrong. Pre-orders sold out within 24 hours worldwide... so...

The only exception is if the cards don't perform 35-50% better (2080 Ti/2080) than the 10-series cards in current titles.
Agreed that we are paying for RT which may not be ready at launch, but I think we will see this get better.

No one likes the price, but it's the new 'norm'.
Posted on Reply
#91
Fx
My friends and I are buckled in for the waiting game. We're simply not going to pay absurd prices.
Posted on Reply
#92
medi01
Slizzo: And, Heise was working off a translation, one that appears to be bad
Citation needed. (Jesus freaking Christ, they don't have lawyers who speak English in Germany, eh?)

Oh, and what was the point of linking the dude who:
a) was caught with shit
b) signed the NDA and has every reason to pretend he didn't bend over backwards.
Posted on Reply
#93
Slizzo
medi01: Citation needed. (Jesus freaking Christ, they don't have lawyers who speak English in Germany, eh?)

Oh, and what was the point of linking the dude who:
a) was caught with shit
b) signed the NDA and has every reason to pretend he didn't bend over backwards.
The point of the link was that Steve spoke with a lawyer versed in NDAs, and they found no issues with the NDA that NVIDIA had distributed to reviewers and others. And whoever Heise had looking at the NDA either didn't speak English or was working off of a bad translation, as was stated in the video. Did you not watch it?

EDIT: Also, watch from here:


Basically, the gist of it? Linus sees no problem with the NDA, nor with NVIDIA controlling drivers pre-launch. He said the NDA was boilerplate, bog-standard NDA language, and that NVIDIA has every right to protect its interests, which is true.
Posted on Reply
#94
medi01
Slizzo: The point of the link was that Steve spoke with a lawyer versed in NDAs
The disrespect of Germans (do you know how big heise.de is???) in this thread is astonishing.

No, it's not a "usual NDA".
No, it isn't "normal".
Yes, it is very ambiguous.
Slizzo: Linus sees no problem with the NDA
I didn't know he was a lawyer, or a source known for his principles.
Posted on Reply
#95
Slizzo
medi01: The disrespect of Germans (do you know how big heise.de is???) in this thread is astonishing.

No, it's not a "usual NDA".
No, it isn't "normal".
Yes, it is very ambiguous.

I didn't know he was a lawyer, or a source known for his principles.
I do know how big heise.de is, which is what makes it all the more confusing how wrong they got this.

Yes, it IS a usual NDA according to all credible sources. Yes, it is normal, and it's really not all that ambiguous. But you keep being blindly anti-NVIDIA.

Linus isn't a lawyer, no. But he has had to sign many an NDA in his career, and if he sees no issue with it - especially as a small business owner who relies on getting products in to review in order to make money...
Posted on Reply