Thursday, February 15th 2018
NVIDIA Turing is a Crypto-mining Chip Jen-Hsun Huang Made to Save PC Gaming
When Reuters reported Turing as NVIDIA's next gaming graphics card, we knew something was off: a launch like that would break several of NVIDIA's naming conventions. It now turns out that Turing, named after British scientist Alan Turing, who is credited with leading the team of mathematicians that broke the Nazi "Enigma" cipher, is a crypto-mining and blockchain compute accelerator. It is designed to be compact, efficient, and ready for large-scale deployment by amateur miners and crypto-mining firms alike, on a quasi-industrial scale.
NVIDIA Turing could be manufactured at a low enough cost relative to GeForce-branded products, and in high enough volumes, to help bring down their prices and save the PC gaming ecosystem. It could have an ASIC-like disruptive impact on the graphics card market, making mining with graphics cards less viable and, in turn, lowering graphics card prices. With performance-segment and high-end graphics cards seeing 200-400% price inflation in the wake of the cryptocurrency mining wave, PC gaming is threatened as gamers are lured to the still-affordable new-generation console ecosystems, led by premium consoles such as the PlayStation 4 Pro and Xbox One X. There's no word on which GPU architecture Turing will be based on ("Pascal" or "Volta"). NVIDIA is expected to launch its entire family of next-generation GeForce GTX 2000-series "Volta" graphics cards in 2018.
Source:
DigitalTrends
124 Comments on NVIDIA Turing is a Crypto-mining Chip Jen-Hsun Huang Made to Save PC Gaming
AMD cards went out of stock, and then Nvidia cards went out of stock because people wanted anything they could get. The one thing I will say is that the GTX 1070 @$350 was a solid competitor if you were doing large-scale mining. They would do ~28 MH/s out of the box and consume slightly less energy than a stock RX 580. That's important if you are setting up 1000 cards and don't have time to tweak them. But the second the 1070 got close to 1.5-2x the price of an RX 580, it made absolutely no sense.
Heck, even right now I just built yet another rig - it is 3 x RX 560's and 3 x used R9 290's. After BIOS mods the $150 RX 560's are all pulling 14.5 MH/s @ 60 W, and the $275 28nm R9 290's are doing 29 MH/s @ 185 W. That's not bad pal! People are crazy to pay $500 for a 1070 or 580 lol. 1) So for some reason Nanopool's Pascal servers are under maintenance or something, but I can show you what I do when not dual mining. 48 MH/s should shut up the nay-sayers lol, and I would be happy to show my dual mining whenever the Pascal servers are back up. So message me later for that if you want.
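The comparison above is easy to sanity-check. A minimal sketch, using only the prices, hashrates, and wattages the commenter quotes (their figures, not verified), that works out MH/s per watt and dollars per MH/s for each card:

```python
# Efficiency comparison from the figures quoted in the comment above.
# All numbers are the commenter's claims, not independently verified.
cards = {
    "RX 560 (BIOS-modded)": {"price_usd": 150, "mhs": 14.5, "watts": 60},
    "R9 290 (used)":        {"price_usd": 275, "mhs": 29.0, "watts": 185},
}

for name, c in cards.items():
    per_watt = c["mhs"] / c["watts"]          # hashing efficiency
    per_mhs = c["price_usd"] / c["mhs"]       # upfront cost per MH/s
    print(f"{name}: {per_watt:.3f} MH/s per watt, ${per_mhs:.2f} per MH/s")
```

By these numbers the RX 560 is the more power-efficient card (~0.242 vs ~0.157 MH/s per watt), while the R9 290 is slightly cheaper per unit of hashrate (~$9.48 vs ~$10.34 per MH/s).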
2) Brag all you want about your 1060's, but my point still stands. I paid less money than you a LONG time ago to get the same hashrate. Those R9 380's aren't as good as the 1060, but I wouldn't brag about a substantially newer 16nm card narrowly beating Tonga on 28nm lol.
Man it really never ceases to amuse me when people can't believe the performance of AMD cards. They are almost always the stupidly better choice for people who are good at tweaking hardware, and care about price/perf. Nvidia remains the brand both for people with money in need of burning, and for those who just don't have a lot of hardware knowledge. Not trying to pick a fight with that last statement either, facts are facts as I have shown with my $500 card putting out 48 MH/s...
What I am trying to say is that people buy cards that are in stock, or pay $1k for a Vega card, and it also depends on the algorithm you are mining.
What I really resent though is when people start spreading FUD so they can convince themselves they were "smart" to buy one thing over the other, when in reality they simply couldn't buy the best. If all cards were currently at MSRP, almost no informed person would be buying Nvidia for mining. For instance, I am not going to tell people here that my new rig with R9 290's and RX 560's is better at mining than 1060's, but then again they cost less than half as much and they are honestly pretty close (or they were - now RX 560's and 290's are sold out too lol).
Vega is the indisputable king at mining by ALL metrics, after that Polaris is the clear choice for price/perf, and even after that 28nm Fury cards are about as good as 16nm Nvidia. After all of those options are gone: then you buy 1070's, and after that you only should buy 1060's as a last resort. I cannot fathom why some people have actually convinced themselves 1080 Ti rigs make any sense at all...
Turing did it fast, using methods unseen at that time.
I also wouldn't call half the power consumption "narrowly" beating Tonga. That's a real improvement, and considering the 1060 is a 192-bit card, I really don't find its performance bad at all. I'll pay $30 more a card for half the wattage any day; that makes a huge difference when scaling up.
All of my 1070's were purchased for under $350, so that might color my vision on them. They all also do 32 MH/s, minus the one Samsung-memory card I have that does 33.
I'm impressed with that Vega. I have one; it does 42 MH/s and pisses me off to the point that it currently sits in my HTPC. That's another reason I prefer Nvidia. I don't like having to BIOS mod cards, run a registry edit for a driver to work, enable compute modes, pray the driver doesn't crash, or find out the voltage didn't actually set in the BIOS or config file and have to manage it via software. For ease of use? Nvidia all day.
Not to mention, let's go mine NeoScrypt, Blake2s, Equihash, Lyra2REv2, or any of about half a dozen algorithms other than CryptoNight and Ethash, and watch what the mining looks like. That being said, I ordered more 580's and 560's.
Second, those 380's are not using double the energy of your 1060's. You can BIOS mod and undervolt them to the point that the fans barely even turn on. Again though, I don't know why you are bragging about a two-year-newer card kinda beating an old 28nm GCN card. Sure, it's cool you supposedly got them for only 30% more money than I paid, but most people paid $250-$300 for a card that mines the same as something they could have got for less than half the price. Oh, and my cards were mining before the 1060 launched lol, and they still are.
Finally, I am kinda sick of people acting like most of the coins outside of Monero and Ethereum are worth bragging about. You could argue that most of them are only being mined because they intentionally pander to Nvidia's architecture, and even then they are wholly less profitable than the real coins (especially if you HODL). Throw in how AMD cards vastly outperform Nvidia's in dual-mining, and it's laughable to suggest Nvidia's cards are anywhere near as profitable. Simply put: AMD mines the most profitable coins better, and then they also dual-mine more efficiently as well.
$150 vs $130 also isn't 30% :roll: and you can brag on 380's all you want; they were never that great of a card. I have had a few myself. There is a reason I sold them.
There are quite a few that would argue this last point.
ZEC, BTG, and others are quite profitable coins, and depending on market values they have doubled Ethereum's profitability on multiple occasions. Until the crash in December they all exceeded Ethereum. "Simply put," you are fake news. Nvidia makes a good product; so does AMD. The benefit for Nvidia is that it is more power efficient and, yet again, not let down by the driver.
I do like the mention of dual mining though. Which coin do you currently dual mine and how has it responded to the ASIC miners that dropped for everything outside of Blake 2S and keccak?
With their voltage-locked, crippled GPU processors, overpriced...
He could be called by different names, the way we call the Devil by different names...
Lucifer, Diablo, Satan, Antichrist... but Savior? No.
A company without such influence couldn't pull that off... Are you aware that you could now buy a brand-new 2017 Kawasaki KLX140 144cc off-road bike for the price of one graphics card in the gaming section?
Unfortunately for Nvidia even their buttloads of cash aren't enough to build/buy a foundry of their own.
In addition to that, even if they had one, there is no way they would be able to keep up. Companies like TSMC and Samsung pour billions into existing and new facilities each year just to remain competitive.
That being said, not even Intel has that much cash, so you can see how the addition of a foundry would likely bury them. AMD didn't ditch their foundry for no reason.
However I am guessing Nvidia could pull it off if they wanted to, but it would take a lot of effort in addition to money.
AMD had to spin off their own foundries too, despite Jerry Sanders saying "real men have fabs."
Also, I liked Vya's post when he clarified for me how much money Nvidia actually makes, and I did not post again until you called me out just now. But sure thing mate, you've got all your facts straight about me.