Raevenlord
News Editor
Hold on to your ETH hats: you will still be able to cash in on the ETH mining craze for a while. However, you should regard your 3 GB and 4 GB graphics cards with slight distrust, for reasons you should already know, since you have surely studied your mining cryptocurrency of choice. Examples are the GTX 1060 3 GB, or one of those shiny new 4 GB RX 480 / RX 580 cards that are going for ridiculously premium prices right now. And as a side note, don't you just love the mechanics of pricing and demand?
The problem here stems from ETH's own design for its current PoW (Proof of Work) implementation (which is what allows the currency to be mined at all). In a bid to make ETH mining unwieldy for the specialized silicon that drove Bitcoin difficulty through the roof, ETH requires a large data set for your GPU to work with as you mine, stored in your GPU's memory (the DAG, which stands for Directed Acyclic Graph). This is one of the essential differences between Bitcoin mining and Ethereum mining: Ethereum mining was designed to be memory-intensive, so as to prevent the use of ASICs and other specialized hardware. As a side note, this also helps (at least theoretically) with ETH's decentralization, something Bitcoin has more at risk because of the inherent centralization that results from the higher hardware costs associated with its mining.
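To make the memory-hard idea concrete, here is a deliberately simplified toy sketch in Python. This is not the actual Ethash algorithm, and the dataset is only a few megabytes instead of gigabytes; it only illustrates the principle that the hash repeatedly performs pseudo-random reads into a large dataset, so throughput is bound by memory accesses rather than raw arithmetic, which is exactly what makes fixed-function ASICs far less attractive.

```python
# Toy illustration of a memory-hard hash (NOT Ethash): throughput is dominated
# by pseudo-random reads into a large dataset rather than by computation.
import hashlib
import struct

DATASET_WORDS = 4 * 1024 * 1024 // 4   # toy 4 MB dataset (the real DAG is gigabytes)

def build_dataset(seed: bytes, n_words: int) -> list:
    """Deterministically fill the toy dataset from a seed (the real DAG is built per epoch)."""
    words = []
    h = seed
    while len(words) < n_words:
        h = hashlib.sha256(h).digest()
        words.extend(struct.unpack("<8I", h))   # 8 x 32-bit words per digest
    return words[:n_words]

def memory_hard_hash(dataset, header: bytes, nonce: int, rounds: int = 64) -> bytes:
    mix = int.from_bytes(hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()[:8], "little")
    for _ in range(rounds):
        idx = mix % len(dataset)                                        # pseudo-random, cache-unfriendly read
        mix = ((mix * 0x100000001B3) ^ dataset[idx]) & 0xFFFFFFFFFFFFFFFF  # FNV-style fold, 64-bit
    return hashlib.sha256(mix.to_bytes(8, "little")).digest()

if __name__ == "__main__":
    ds = build_dataset(b"toy-seed", DATASET_WORDS)
    print(memory_hard_hash(ds, b"block-header", nonce=42).hex())
```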
As the number of blocks in the blockchain increases (at a rate of roughly one block every 14 seconds), so does Ethereum's epoch level, which determines the DAG size and thus the memory footprint your GPU must dedicate to the calculation. Every 30,000 blocks a new epoch begins, more costly in memory footprint than the previous one, which means that the memory requirements for ETH mining keep increasing as time passes. As the workload's memory footprint grows, it can (and will) overflow the GPU's memory and spill into main system memory, which, as you know, is much slower to access than the GPU's VRAM (this reminds me of AMD's SSG graphics solutions; a neat almost-solution to this problem, no?). Slower random memory accesses (on which ETH mining is extremely dependent) result in penalties to your mining hash rate. You can see where this is going.
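For a rough feel of the numbers involved, the sketch below works through the arithmetic under common approximations: a new epoch every 30,000 blocks, roughly 14 seconds per block, and a DAG that starts near 1 GiB and grows by about 8 MiB per epoch (the real Ethash size is additionally rounded to a prime number of rows, so treat these as estimates rather than exact figures).

```python
# Rough arithmetic for epoch length and approximate DAG size growth.
BLOCK_TIME_S = 14          # ~14 seconds per block (from the article)
BLOCKS_PER_EPOCH = 30_000
DAG_BASE_MB = 1024         # ~1 GiB at epoch 0 (approximation)
DAG_GROWTH_MB = 8          # ~8 MiB added per epoch (approximation)

def approx_dag_mb(epoch: int) -> float:
    return DAG_BASE_MB + DAG_GROWTH_MB * epoch

def days_per_epoch() -> float:
    return BLOCKS_PER_EPOCH * BLOCK_TIME_S / 86_400

def epochs_until_dag_exceeds(vram_mb: int, current_epoch: int) -> int:
    e = current_epoch
    while approx_dag_mb(e) <= vram_mb:
        e += 1
    return e - current_epoch

if __name__ == "__main__":
    print(f"one epoch lasts ~{days_per_epoch():.1f} days")
    print(f"epoch 129 DAG is roughly {approx_dag_mb(129):.0f} MB")
    for vram in (3 * 1024, 4 * 1024):
        n = epochs_until_dag_exceeds(vram, 129)
        print(f"{vram // 1024} GB card: DAG outgrows VRAM in ~{n} epochs (~{n * days_per_epoch():.0f} days)")
```

Keep in mind, as discussed below, that hash rates start to suffer well before the DAG literally outgrows the card's VRAM.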
We are currently at epoch #129. Based on current estimates for DAG size and memory requirements, mining with an RX 470 4 GB should yield the hash rates listed below:
- DAG epoch 130: 27.4 MH/s
- DAG epoch 140: 25.1 MH/s
- DAG epoch 150: 22.5 MH/s
- DAG epoch 160: 20.1 MH/s
- DAG epoch 199: 10.0 MH/s
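Projecting the figures above onto a calendar is straightforward arithmetic. The short sketch below assumes roughly 4.5 days per epoch and epoch 129 as the starting point, and it reproduces the "almost 30% in around five months" estimate discussed next.

```python
# Quick projection of the estimated RX 470 4 GB hash rates onto a calendar,
# assuming ~4.5 days per epoch and epoch 129 as "today".
DAYS_PER_EPOCH = 4.5
CURRENT_EPOCH = 129

estimates_mhs = {130: 27.4, 140: 25.1, 150: 22.5, 160: 20.1, 199: 10.0}

baseline = estimates_mhs[130]
for epoch, mhs in estimates_mhs.items():
    days_out = (epoch - CURRENT_EPOCH) * DAYS_PER_EPOCH
    drop_pct = (1 - mhs / baseline) * 100
    print(f"epoch {epoch}: ~{days_out:.0f} days out, {mhs:.1f} MH/s "
          f"({drop_pct:.0f}% below epoch 130)")
```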
Currently, an epoch takes about 4 to 5 days to conclude before a new one starts. That means that in around five months' time, a user with an RX 470 4 GB will see an almost 30% decrease in hash rate at the same power consumption as today. And as you know, in mining, the power/performance ratio is all that counts for profitability. NVIDIA cards currently in use, such as the GTX 1060 3 GB, are expected to see decreases as well. There are other performance-affecting details that originate from the growing working set, too, such as TLB (Translation Lookaside Buffer) thrashing, which can cause performance degradation even before the memory pool of your graphics card of choice is fully occupied, provided the TLB itself overflows. With Polaris, AMD implemented a TLB cache that was previously absent from the GCN architecture. The small size of this TLB cache means that Polaris graphics cards will likely see performance penalties from TLB thrashing before the DAG grows to their memory limits. Just another point for you to consider.
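To see why the power/performance ratio is the whole game, here is a minimal profitability sketch. Every figure in it (revenue per MH/s, card power draw, electricity price) is a hypothetical placeholder; the only point is that revenue falls with hash rate while the power bill at fixed consumption stays constant.

```python
# Minimal profitability sketch. All numbers are hypothetical placeholders,
# chosen only to illustrate that revenue scales with hash rate while the
# electricity cost at fixed power consumption does not.
REVENUE_PER_MHS_PER_DAY = 0.08   # USD per day per MH/s (hypothetical)
CARD_POWER_W = 120               # card power draw in watts (hypothetical)
PRICE_PER_KWH = 0.12             # electricity price in USD/kWh (hypothetical)

def daily_profit(hashrate_mhs: float) -> float:
    revenue = hashrate_mhs * REVENUE_PER_MHS_PER_DAY
    power_cost = CARD_POWER_W / 1000 * 24 * PRICE_PER_KWH   # constant per day
    return revenue - power_cost

for mhs in (27.4, 20.1, 10.0):   # sample hash rates from the epoch estimates above
    print(f"{mhs:4.1f} MH/s -> ~{daily_profit(mhs):.2f} USD/day")
```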
Now granted, if you know anything about Ethereum, you probably won't even care about this: the transition from PoW to PoS (Proof of Stake) is expected to occur by November 1st of this year. This means that ETH mining will simply cease to be a thing (though the implementation could see some delays, unlikely as that is). And it lines up nicely with the five-month, 30% hash-rate decrease estimate above. So maybe you don't have to worry that much about ETH mining ceasing to be profitable in five months' time. But if you are looking to buy into the mining craze and invest in hardware, you should study this market, and this technology, first (and pay attention to this article as well). Likewise, if you have just recently bought into the mining hardware market with those exorbitantly priced RX 400 and RX 500 series cards, do the math and be ready to look for alternatives, either in cryptocurrencies or in mining solutions. Don't let yourself get burned just because you want to follow the train.
View at TechPowerUp Main Site