Tuesday, March 13th 2018

NVIDIA P102-100 Cryptomining Graphics Card Surfaces

The truth of the matter is that the cryptocurrency boom isn't losing strength yet, and graphics card manufacturers are more than happy to profit off this phenomenon. However, one must have the right tools to mine cryptocurrency effectively, and NVIDIA is keen to provide cryptominers with the necessary hardware as it readies its latest P102-100 card for launch. The P102-100 should sound familiar to most enthusiasts since it employs the same GP102 chip used in the consumer GeForce GTX 1080 Ti and GTX Titan Xp graphics cards. Nevertheless, NVIDIA made a few modifications to maintain the hash rate while lowering the price. In other words, we're basically dealing with a cut-down version of the GP102 chip with fewer memory channels and shader cores, and less memory.

The Inno3D P102-100 is the first model to show up in the wild. The card packs 3,200 CUDA cores clocked at 1582 MHz and 5 GB of GDDR5X memory operating at 10 Gbps across a 320-bit memory interface. Inno3D cools the card with its Twin X2 solution, which features five heatpipes to draw heat away from the GPU and two fans for active cooling. Being a mining-oriented model, the Inno3D P102-100 lacks a display bracket and outputs, and it only requires four PCIe lanes to function. The card has a 250 W TDP rating and draws power from a pair of 8-pin PCIe power connectors. According to Inno3D, it is capable of mining Ethereum at 47 MH/s, which easily beats a GeForce GTX 1080 Ti or GTX Titan Xp. Pricing is unknown at the time of writing.
Inno3D P102-100 Specifications
  • GPU: P102-100
  • CUDA Cores: 3200
  • Base Clock: 1582 MHz
  • Memory Clock: 10 Gbps
  • Physical Memory Size: 5 GB
  • Memory Type: GDDR5X
  • Memory Interface Width: 320-bit
  • Memory Bandwidth: 400 GB/s
  • Bus Support: PCIe Gen3 x 4
  • Card Size: 21.5 cm length, 12.5 cm height, dual slot
  • Max TDP: 250 Watt
  • Power Connectors: 2 x 8-pin PCI-E
Inno3D P102-100 Hashrate
  • ETH: ~47 MH/s
  • ZEC: ~660 Sol/s
  • XMR: ~879 H/s
Source: Hardware.Info

33 Comments on NVIDIA P102-100 Cryptomining Graphics Card Surfaces

#1
Fleurious
I wonder how the heatsink would fare on a 1080 Ti, if compatible. One would expect them to account for the card running full tilt 24/7 when speccing a thermal solution.

Edit: They say it uses the Twin X2 cooler. Maybe that is their standard cooler for the 1080 Ti <shrug>.
Posted on Reply
#2
the54thvoid
Super Intoxicated Moderator
Curious - how does it do better at mining than a 1080 Ti if it is a cut-down version of one? Does it have any other mining-specific hardware on the PCB?
Posted on Reply
#3
Valantar
That cooler looks like a carbon copy of the Zotac 1080Ti mini cooler, right down to the different sized fans. Weird.
Posted on Reply
#4
Tom.699
the54thvoid: Curious - how does it do better at mining than a 1080 Ti if it is a cut-down version of one? Does it have any other mining-specific hardware on the PCB?
Maybe they somehow reduced the latency of the GDDR5X; that would improve ETH mining. Maybe having less memory does that.
ZEC is less sensitive to memory latency, and there it's slower than a 1080 Ti - a 1080 Ti can do around 730 Sol/s, more when overclocked.
Posted on Reply
#5
jabbadap
5 GB of GDDR5X on a 320-bit bus - that would be 10 × 4Gb chips. I don't remember Micron ever making 4Gb GDDR5X chips (they've always been marketed as 8Gb, or the 16Gb parts that never materialized). Well, of course, it should be relatively easy to make those on demand, so it's possible. I very much doubt NVIDIA would use the available 8Gb chips and disable half the RAM via vBIOS.

Edit: and either the memory clock or the bandwidth is wrong:
  1. 11 Gbps × 320 bit / (8 bit/byte) = 440 GB/s
  2. 400 GB/s × (8 bit/byte) / 320 bit = 10 Gbps
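
For anyone who wants to replay that arithmetic, here's a quick Python sketch of the same conversion (the function name is made up):

```python
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Effective memory bandwidth in GB/s from per-pin data rate and bus width."""
    return data_rate_gbps * bus_width_bits / 8  # 8 bits per byte

print(bandwidth_gb_s(11, 320))  # 440.0 -> what an 11 Gbps clock would give
print(bandwidth_gb_s(10, 320))  # 400.0 -> matches the 400 GB/s spec line
```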
Posted on Reply
#6
cdawall
where the hell are my stars
I guess the thread I started on these Feb 12th wasn't good enough. These cards have already started shipping.

www.techpowerup.com/forums/threads/p102-100-mining-cards.241473/
jabbadap: 5 GB of GDDR5X on a 320-bit bus - that would be 10 × 4Gb chips. I don't remember Micron ever making 4Gb GDDR5X chips (they've always been marketed as 8Gb, or the 16Gb parts that never materialized). Well, of course, it should be relatively easy to make those on demand, so it's possible. I very much doubt NVIDIA would use the available 8Gb chips and disable half the RAM via vBIOS.

Edit: and either the memory clock or the bandwidth is wrong:
  1. 11 Gbps × 320 bit / (8 bit/byte) = 440 GB/s
  2. 400 GB/s × (8 bit/byte) / 320 bit = 10 Gbps
It still has 11 GB on the cards; the frame buffer is adjusted to use less of it.
Posted on Reply
#7
Valantar
cdawall: It still has 11 GB on the cards; the frame buffer is adjusted to use less of it.
That cannot possibly make any kind of economic sense.
Posted on Reply
#8
cdawall
where the hell are my stars
Valantar: That cannot possibly make any kind of economic sense.
It doesn't have to make economic sense for your view. This is how they were able to get the GDDR5X chips to do well in Ethereum. That is what these are targeted at.
Posted on Reply
#9
Octavean
I thought miners didn't want cards like this because of the lower resale potential.

When miners buy gaming cards, they know they can resell them if they want to hit the panic button and get out of the market. Selling their video card investment to both gamers and miners maximizes their buyer pool. If they can only sell to other miners, those miners can smell the desperation when buying used cards like this.

As far as I can tell, miners don't want cards like this because they don't dry up the gaming market's supply, and because they can't resell them to the very gamers they're depriving of cards.
Posted on Reply
#10
Valantar
cdawall: It doesn't have to make economic sense for your view. This is how they were able to get the GDDR5X chips to do well in Ethereum. That is what these are targeted at.
I mean for the manufacturer (what on earth else would I mean? "In my view"? I don't even understand what that means): how could it be cheaper for them to buy (very, very expensive!) high-end GDDR5X chips and then partially disable them through software, rather than, say, not populating all the channels? Now, my understanding of Eth performance is virtually nonexistent, but I get the impression (feel free to correct me!) that memory latency and VRAM size are what matter, not memory bandwidth? If so, wouldn't they save quite a bit by simply not populating those pads and using the chips for other cards?
Posted on Reply
#11
jabbadap
Valantar: That cannot possibly make any kind of economic sense.
Agreed - double the capacity, roughly double the price. I haven't seen pricing for GDDR5X, but I don't believe it differs much from GDDR5. If my memory serves me right, 8Gb 7 Gbps GDDR5 costs about $15 per chip and the 4Gb version $8 per chip (at a minimum order of 1,000 pcs.). That would make $150 vs. $80 - an awful lot of money lost per card.
Posted on Reply
#12
Valantar
jabbadap: Agreed - double the capacity, roughly double the price. I haven't seen pricing for GDDR5X, but I don't believe it differs much from GDDR5. If my memory serves me right, 8Gb 7 Gbps GDDR5 costs about $15 per chip and the 4Gb version $8 per chip (at a minimum order of 1,000 pcs.). That would make $150 vs. $80 - an awful lot of money lost per card.
Especially if you flip it around: eleven 8Gb chips firmware-limited to 5 GB usable costs you ~$165 and gets you one card - or the same memory spend covers 2.2 cards with no firmware shenanigans, for zero cost increase.

AFAIK there are no 4Gb GDDR5X chips on the market, so they'd have to use the same 8Gb chips, just fewer of them.

Again: unless Eth is far more dependent on bandwidth than my impression suggests, this is throwing money out the window for the OEM.
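
Putting rough numbers on the cost argument - a back-of-the-envelope sketch using the per-chip prices jabbadap quoted above (thread speculation, not confirmed BOM costs):

```python
# Assumed prices from the discussion above, not confirmed BOM costs.
PRICE_8GB_CHIP = 15.0  # USD per 8Gb GDDR5 chip
PRICE_4GB_CHIP = 8.0   # USD per hypothetical 4Gb chip

full_board = 11 * PRICE_8GB_CHIP    # 11 x 8Gb = 11 GB, as on a 1080 Ti PCB
native_board = 10 * PRICE_4GB_CHIP  # 10 x 4Gb = 5 GB on a 320-bit bus

print(f"11 GB of 8Gb chips: ${full_board:.0f}")   # $165
print(f"5 GB of 4Gb chips:  ${native_board:.0f}") # $80
print(f"Cards per 11 GB of memory at 5 GB usable: {11 / 5}")  # 2.2
```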
Posted on Reply
#13
LightningJR
Valantar: I mean for the manufacturer (what on earth else would I mean? "In my view"? I don't even understand what that means): how could it be cheaper for them to buy (very, very expensive!) high-end GDDR5X chips and then partially disable them through software, rather than, say, not populating all the channels? Now, my understanding of Eth performance is virtually nonexistent, but I get the impression (feel free to correct me!) that memory latency and VRAM size are what matter, not memory bandwidth? If so, wouldn't they save quite a bit by simply not populating those pads and using the chips for other cards?
You only need enough memory to fit the DAG, which grows over time but very slowly. It's almost exclusively the bandwidth and latency of the VRAM that determine ETH mining performance. That's why people love Samsung RAM on their 1070s: the overclocking headroom is immense, giving you more bandwidth and in turn a higher hashrate.
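
For context on how slowly the DAG grows: per the ethash design, the dataset starts at 1 GiB and grows by 8 MiB per epoch of 30,000 blocks. A simplified Python sketch (real DAG sizes are additionally adjusted down to a prime number of entries, which this skips):

```python
DATASET_BYTES_INIT = 2**30    # 1 GiB at epoch 0
DATASET_BYTES_GROWTH = 2**23  # +8 MiB per epoch
EPOCH_LENGTH = 30_000         # blocks per epoch

def approx_dag_gib(block_number: int) -> float:
    """Approximate ethash DAG size in GiB (skips the prime adjustment)."""
    epoch = block_number // EPOCH_LENGTH
    return (DATASET_BYTES_INIT + DATASET_BYTES_GROWTH * epoch) / 2**30

# Around block 5,200,000 (roughly March 2018):
print(f"{approx_dag_gib(5_200_000):.2f} GiB")  # ~2.35 GiB - well under 5 GB
```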
Posted on Reply
#14
R-T-B
Octavean: I thought miners didn't want cards like this because of the lower resale potential.
Now now, that's just silly. I've always told you: serious miners don't sell cards.
Posted on Reply
#15
bug
the54thvoid: Curious - how does it do better at mining than a 1080 Ti if it is a cut-down version of one? Does it have any other mining-specific hardware on the PCB?
The base clock on this thing is the same as the boost clock on the 1080 Ti.
Fewer memory channels means more of the TDP can go to the GPU.
Posted on Reply
#16
jabbadap
Valantar: Especially if you flip it around: eleven 8Gb chips firmware-limited to 5 GB usable costs you ~$165 and gets you one card - or the same memory spend covers 2.2 cards with no firmware shenanigans, for zero cost increase.

AFAIK there are no 4Gb GDDR5X chips on the market, so they'd have to use the same 8Gb chips, just fewer of them.

Again: unless Eth is far more dependent on bandwidth than my impression suggests, this is throwing money out the window for the OEM.
Well yes, not in any known form or stock. But the JEDEC standard does not rule them out:
GRAPHICS DOUBLE DATA RATE (GDDR5X) SGRAM STANDARD
JESD232A, Aug 2016
The purpose of this standard is to define the minimum set of requirements for JEDEC standard compatible 4 Gb through 16 Gb x32 GDDR5X SGRAM devices. System designs based on the required aspects of this standard will be supported by all GDDR5X SGRAM vendors providing JEDEC standard compatible devices. Some aspects of the GDDR5X standard such as AC timings were not standardized. Some features are optional and therefore may vary among vendors. In all cases, vendor data sheets should be consulted for specifics. Item 1827.99C
Graphics memory is usually ordered in quite big quantities, so if NVIDIA wants ~1M pcs. of 4Gb GDDR5X chips, Micron might as well manufacture and sell them.
Posted on Reply
#17
Casecutter
I wouldn't see this as a "winning" deal for any "Johnny-come-lately" wishing to cash in. Sure, 47 MH/s seems good from 250 W, although the lack of any resale value when it goes bust isn't for the faint of heart. I would want two 570s versus getting this: two 570s cost perhaps $700, offer around 48 MH/s together at a total of roughly 240 W, and have some resale value down the road.

It would appear NVIDIA is saying the GP102 is dead to gamers anyway (no one is touching them at the inflated cost), and the ones already used in mining will flood the used market when running them stops delivering ROI... So let's finish off production with bulk sales to the big mining operators. Not a bad plan.
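
To put that efficiency comparison in numbers - a sketch using the figures above, which are estimates rather than measurements:

```python
# Figures quoted in the comment above; treat them as rough estimates.
cards = {
    "P102-100 (single)": {"mhs": 47, "watts": 250},
    "RX 570 (pair)":     {"mhs": 48, "watts": 240},
}

for name, c in cards.items():
    print(f"{name}: {c['mhs'] / c['watts']:.3f} MH/s per watt")
# P102-100 (single): 0.188 MH/s per watt
# RX 570 (pair):     0.200 MH/s per watt
```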
Posted on Reply
#18
Valantar
LightningJR: You only need enough memory to fit the DAG, which grows over time but very slowly. It's almost exclusively the bandwidth and latency of the VRAM that determine ETH mining performance. That's why people love Samsung RAM on their 1070s: the overclocking headroom is immense, giving you more bandwidth and in turn a higher hashrate.
Thanks for clearing that up :) So does that mean the hashing algorithm is constantly checking the DAG for something? That would explain the need for bandwidth, at least.

I wouldn't be surprised if we saw 4Gb GDDR5X chips showing up in the next year, then, if crypto demand keeps up. Has to be cheaper, after all, and it's not like they need the capacity. Yet.
Posted on Reply
#19
bug
Casecutter: I wouldn't see this as a "winning" deal for any "Johnny-come-lately" wishing to cash in. Sure, 47 MH/s seems good from 250 W, although the lack of any resale value when it goes bust isn't for the faint of heart. I would want two 570s versus getting this: two 570s cost perhaps $700, offer around 48 MH/s together at a total of roughly 240 W, and have some resale value down the road.

It would appear NVIDIA is saying the GP102 is dead to gamers anyway (no one is touching them at the inflated cost), and the ones already used in mining will flood the used market when running them stops delivering ROI... So let's finish off production with bulk sales to the big mining operators. Not a bad plan.
Why would you care about resale value when the card is supposed to pay for itself in virtual cash anyway?
Also, between Amazon and Newegg, the cheapest RX 570 is $389.99.
Posted on Reply
#20
the54thvoid
Super Intoxicated Moderator
bug: The base clock on this thing is the same as the boost clock on the 1080 Ti.
Fewer memory channels means more of the TDP can go to the GPU.
Ah, I forget - I overclock to 2 GHz. I lose track of boost...
Posted on Reply
#21
Casecutter
bug"supposed" to pay for itself
Until the power needed to mine puts you upside-down and there's no point running it...
bug: RX 570 is $389.99
Okay, I last checked a couple of days ago... but does that mean NVIDIA will offer these at a similar $/MH/s, i.e. around a $780 MSRP? Or that the GTX 1080 Ti will get back to its $699 MSRP - better yet, be back in stock on NVIDIA's website? I doubt it.
Posted on Reply
#22
trog100
two of my 1070 cards push out 62 MH/s... 8 of them push out 240 MH/s and pull 1000 watts from the wall...

not much point in these things unless they are cheap... but two 1070 cards make more financial sense than one 1080 Ti.. if they didn't, I would be mining on 1080 Ti cards instead of 1070 cards..

but either way, at current payout levels mining is a waste of space.. I have 10 x 1070 cards running, producing less than 15 dollars per day, and it's dropping.. ROI at current hardware prices would be maybe two years, but as I say, the returns are dropping day by day..

trog
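
Taking those figures at face value, a rough payback sketch (the card price and electricity rate are assumptions, not given above):

```python
DAILY_REVENUE = 15.0  # USD/day for the whole rig, per the post (and falling)
RIG_WATTS = 1250      # ~1000 W per 8 cards, scaled to 10
POWER_PRICE = 0.10    # USD per kWh, assumed
NUM_CARDS = 10
CARD_PRICE = 550.0    # USD, assumed early-2018 GTX 1070 street price

power_cost = RIG_WATTS / 1000 * 24 * POWER_PRICE  # ~$3.00/day
net_daily = DAILY_REVENUE - power_cost            # ~$12.00/day
days = NUM_CARDS * CARD_PRICE / net_daily
print(f"Payback: ~{days:.0f} days")  # ~458 days, before the falling returns
# The ~2-year estimate above bakes in the declining payouts mentioned.
```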
Posted on Reply
#23
GrandLine
I heard there's already dedicated mining hardware available (ASIC miners), and some of it can even double as a room heater...
so why are miners still using consumer graphics cards?

(My point is, I hope miners will just buy their own dedicated mining hardware so I can buy graphics cards at normal prices.)
Posted on Reply
#24
ZeDestructor
GrandLine: I heard there's already dedicated mining hardware available (ASIC miners), and some of it can even double as a room heater...
so why are miners still using consumer graphics cards?

(My point is, I hope miners will just buy their own dedicated mining hardware so I can buy graphics cards at normal prices.)
Because after BTC went ASIC-only, every other coin that wasn't marginalized for being ASIC-viable was designed specifically to be ASIC-resistant. In fact, there are coins out there designed to be GPU-resistant too.
Posted on Reply