cdawall
where the hell are my stars
- Joined: Jul 23, 2006
- Messages: 27,680 (4.12/day)
- Location: Houston
| System Name | All the cores |
|---|---|
| Processor | 2990WX |
| Motherboard | ASRock X399M |
| Cooling | CPU: XSPC RayStorm Neo, 2x240mm+360mm, D5 PWM+140mL; GPU: 2x360mm, 2x Bykski, D4+D5+100mL |
| Memory | 4x16GB G.Skill 3600 |
| Video Card(s) | (2) EVGA SC BLACK 1080 Ti's |
| Storage | 2x Samsung SM951 512GB, Samsung PM961 512GB |
| Display(s) | Dell UP2414Q 3840x2160 @ 60 Hz |
| Case | CaseLabs Mercury S5 + pedestal |
| Audio Device(s) | Fischer HA-02 -> Fischer FA-002W High Edition/FA-003/Jubilate/FA-011 depending on my mood |
| Power Supply | Seasonic Prime 1200W |
| Mouse | Thermaltake Theron, Steam Controller |
| Keyboard | Keychron K8 |
| Software | W10P |
I will say that is complete BS. The only advantage Nvidia has (besides Volta lol) is that a 1070 at stock settings is more efficient than AMD. But if you tweak AMD cards heavily, they completely smash Nvidia.
- My $500 Vega cards do 47.2 MH/s Ethereum + 1100 Pascal dual mining. That is the equivalent of 2.5 1070's!
- My $120 R9 380's do 23 MH/s. That's as good as a 1060 that would cost at least 2x as much (and it's 28nm lol).
- And of course RX 580's mine as well as 1070's if properly tweaked.
There is a reason RX 580's eventually started selling for the same price as 1070's, and in general AMD cards sold out WAY before Nvidia...
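For a rough sense of scale, here's a quick per-dollar sketch using the figures quoted above; the 1070 price and the 1070/1060 Ethereum hashrates are assumed ballpark numbers for illustration, not measured results:

```python
# Back-of-the-envelope check of the hashrate-per-dollar claims in the thread.
# Vega and R9 380 prices/hashrates and the $140-150 1060 price come from the
# posts; the 1070 price and the 1070/1060 hashrates are rough assumptions.

cards = {
    # name: (price_usd, eth_mhs)
    "Vega (tweaked)":   (500, 47.2),  # figures quoted in the post
    "R9 380 (tweaked)": (120, 23.0),  # figures quoted in the post
    "GTX 1070 (stock)": (400, 27.0),  # assumed price and hashrate
    "GTX 1060 (used)":  (145, 20.0),  # $140-150 from the reply; hashrate assumed
}

for name, (price, mhs) in cards.items():
    print(f"{name:18s}  {mhs:5.1f} MH/s  ${price:4d}  "
          f"{mhs / price * 100:5.2f} MH/s per $100")
```

On those assumed numbers the per-dollar figures favor the tweaked AMD cards, though that says nothing about power draw.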
You mind showing your Vega card doing 47.2 MH/s + 1100 Pascal sustained for any period of time?
And my 1060's ran $140-150 USD and use less than half the power those 380's do...
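To put the power point in rough numbers, here's a similar sketch of MH/s per watt and daily electricity cost; the wall-power draws and the $/kWh rate are assumptions for illustration, not measured figures:

```python
# Rough MH/s-per-watt comparison behind the "half the power" point.
# Hashrates are the ones quoted in the thread; the wall-power draws and
# the electricity rate are assumed ballpark numbers, not measurements.

KWH_PRICE = 0.12  # assumed $/kWh

cards = {
    # name: (eth_mhs, watts_at_the_wall)
    "R9 380 (tweaked)": (23.0, 180),  # 23 MH/s from the post; wattage assumed
    "GTX 1060 (tuned)": (20.0, 90),   # hashrate and wattage assumed
}

for name, (mhs, watts) in cards.items():
    per_watt = mhs / watts
    day_cost = watts / 1000 * 24 * KWH_PRICE
    print(f"{name:18s}  {per_watt:.3f} MH/s per W   ~${day_cost:.2f}/day in power")
```

With those assumed draws the 1060 comes out well ahead of the 380 in hashrate per watt, which is the efficiency gap being argued about.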