When I built my miner, I initially chose the mining software mostly
arbitrarily. Wanting to rectify that, I next performed comparative
benchmarking to help determine which mining software to use, and whether or not
dual-mining DCR was worthwhile. Benchmarking yielded the following data:
Comma-separated values are ETC/DCR pairs.
Claymore charges a "DevFee" of 1% when mining ETC alone, and 2% when
dual-mining ETC+DCR.
"Adjusted" rates account for losses accrued by the DevFee.
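The adjustment itself is simple arithmetic: scale each raw hashrate by (1 − fee). A minimal sketch of that calculation (the hashrates below are illustrative placeholders, not my benchmark numbers):

```python
# Adjust raw hashrates for Claymore's DevFee.
# Fee values match the text: 1% for ETC-only, 2% for dual-mining.
# The raw hashrates are made-up placeholders, not measured data.

def adjusted_rate(raw_mhs: float, dev_fee: float) -> float:
    """Return the effective hashrate after the DevFee's share is removed."""
    return raw_mhs * (1.0 - dev_fee)

single_etc = adjusted_rate(30.0, 0.01)   # ETC-only: 1% fee
dual_etc = adjusted_rate(29.0, 0.02)     # ETC side of dual-mining: 2% fee
dual_dcr = adjusted_rate(870.0, 0.02)    # DCR side of dual-mining: 2% fee

print(f"{single_etc:.2f} {dual_etc:.2f} {dual_dcr:.2f}")
```

Comparing single-mining against dual-mining on these adjusted numbers is what makes the "was DCR worth it" question answerable at all, since the two modes lose different percentages to the fee.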
By the numbers, yes - DCR was (barely) worth mining. However, beyond the
numbers, I had some concerns.
I had been dual-mining ETC+DCR for weeks before making the above benchmarks.
When I checked my mining pool, however, I was surprised to see how
little DCR I had earned. Investigating why, I discovered that the pool reported
my DCR hashrate to be less than half of what Claymore reported. (I still don’t
know which number was correct, or how the disparity came to be.)
I had additional concerns beyond the dubious hashrate numbers.
When measuring power consumption, I noticed that dual-mining was very “peaky” -
draw would swing +/-100 watts perhaps a dozen times per minute. I feared that
the temperature fluctuations that (likely) accompanied those swings would
eventually damage the GPUs via thermal expansion and contraction.
Lastly, I found Claymore somewhat unpleasant to use. Configuration seemed
awkward to me, in that it read configuration from text files (with a
proprietary syntax) in its application directory. Likewise, it continuously
downloaded .bin files (containing I-know-not-what) and logged output (again
into text files) into the same directory. This all felt terribly disorganized,
and inconsistent with sound software engineering practices.
In light of all of the above, I decided to forgo dual-mining entirely, and to
simply “single-mine” ETC using Ethminer.
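For reference, an Ethminer invocation for that setup might look like the following. This is a sketch, not my actual configuration: the pool hostname, port, wallet address, and worker name are all placeholders, and the stratum scheme your pool expects may differ.

```shell
# Single-mine ETC with Ethminer (placeholder pool and wallet values).
# -G selects OpenCL; use -U instead for CUDA GPUs.
# -P takes the pool as a URI: scheme://wallet.worker@host:port
ethminer -G -P stratum1+tcp://0xYOUR_ETC_ADDRESS.rig1@etc-pool.example.com:4444
```

Compared to Claymore's text-file configuration, everything here lives on the command line, which I found easier to keep organized.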
Thus, with a revised mining strategy, I next turned my attention to optimizing
my miner’s computational performance. In Part 4, I’ll discuss how I
was able to (dramatically) reduce power consumption without impacting hashrate.