1) Message boards : Number crunching : BitCoin Utopia went crazy credit-wise (Message 37262)
Posted 3547 days ago by Profile Bryan



While you may be able to donate the ASIC's computational power forever, the value of that computational power decreases over time as the computational power needed to unlock additional bitcoins increases. At some point you are better off just making a direct contribution to the charity rather than sending the money off to your local electric utility.


Following your line of reasoning ... since my current CPUs/GPUs will be producing far less work for the project than the CPUs/GPUs available 5 years from now, I should shut them down now and just send the project the money I spend monthly on electricity. Is that correct?

The arguments everyone is making are identical to those made when GPUs first came on the scene. It was the same trauma for those who didn't have a GPU: "It's going to ruin BOINC!" At least with ASIC mining, anyone with a USB port can get into the game with a very small investment: $20 versus the hundreds of dollars required for the first (and current) GPUs.

The benefit of BitCoin Utopia versus Donate@home is that it primarily uses technology that doesn't take CPU/GPU resources away from other projects. Donate@home used the same resources, and therefore it can be argued that it hurt the overall BOINC effort.

Typically, projects award credits based on the amount of work being accomplished. A GPU will gain you more credits at "most" projects than running a CPU alone. The GPU produces more work than a CPU and therefore should be rewarded more highly. The same is true of BU. As someone mentioned, their 4.8 Gh/s ASIC is producing more work than 6 HD 7970s. Shouldn't they be compensated for the work they are producing?
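As a back-of-the-envelope check of that comparison, here is a minimal sketch in Python. The ~685 Mh/s SHA-256 rate for a single HD 7970 is an assumption added for illustration, not a figure from the post:

```python
# Illustrative only: compare a small USB ASIC's SHA-256 throughput to a
# stack of GPUs. The 0.685 Gh/s per HD 7970 is an assumed figure.
asic_hashrate_ghs = 4.8        # the 4.8 Gh/s ASIC mentioned above
gpu_hashrate_ghs = 0.685       # assumed SHA-256 rate of one HD 7970
gpus = 6

gpu_total = gpus * gpu_hashrate_ghs
print(f"6x HD 7970: {gpu_total:.2f} Gh/s")
print(f"ASIC vs GPU-stack work ratio: {asic_hashrate_ghs / gpu_total:.2f}")
```

Under that assumption the six GPUs deliver roughly 4.1 Gh/s, so the single ASIC does indeed edge out the whole stack, which is the credit argument being made: more work done, more credit earned.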

You guys are incorrect about what an ASIC is. It stands for Application-Specific Integrated Circuit. They ARE processors, but they have been designed to perform a limited number of tasks ... very quickly.

I worked for HP for 32 years, and we designed and used many ASICs. One ASIC I'm very familiar with did FFTs and inverse FFTs ... very, very quickly. The heart of the SETI WU is performing FFTs. So if someone were to design an ASIC that plugged into your USB port and could compute a SETI WU thousands of times faster than a GPU, would you be complaining that it isn't fair? Particularly since it would use a handful of watts versus hundreds of watts to do the same job. Of course, the market is so small that no one would ever produce one for SETI ... you couldn't sell enough to cover the development costs.
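For readers unfamiliar with the operation such a chip would hard-wire, here is a textbook radix-2 Cooley-Tukey FFT sketched in plain Python. It is purely illustrative; it is not SETI's actual signal-processing code:

```python
import cmath

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])          # FFT of even-indexed samples
    odd = fft(x[1::2])           # FFT of odd-indexed samples
    out = [0] * n
    for k in range(n // 2):
        # Twiddle factor combines the two half-size transforms.
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

# A pure tone at bin 3 shows up as a single spectral spike, which is
# exactly the kind of narrowband feature a SETI search looks for.
signal = [cmath.exp(2j * cmath.pi * 3 * t / 8) for t in range(8)]
spectrum = fft(signal)
print([round(abs(v), 3) for v in spectrum])  # spike of magnitude 8 at index 3
```

An ASIC would implement the butterfly loop above directly in silicon, which is why it can be orders of magnitude faster and more power-efficient than running the same arithmetic on a general-purpose CPU or GPU.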

I agree with the person who suggested that projects should start their own ASIC mining sub-projects if they need funding. BU has approached MilkyWay@home about contributing to their program; MW lost their grant and is in need of funds to continue.
2) Message boards : Graphics cards (GPUs) : Loading 2 WU (Message 10905)
Posted 5387 days ago by Profile Bryan
I'm running a GTS 250 for the GPU. My BOINC version is 6.6.31 and it has always performed as you stated. FIFO unless it panics and goes into "high priority" mode.

That is what surprised me, the GPU workunits all had the same deadline since they had just been downloaded.
3) Message boards : Graphics cards (GPUs) : Loading 2 WU (Message 10898)
Posted 5387 days ago by Profile Bryan
I just started running the project a couple of days ago and managed to process the original 4 units without problems. Tonight I downloaded new units and the GPU started one. About 2 minutes later it loaded a 2nd WU and began crunching on it. The 1st unit went to "Waiting to run".

I aborted the one that was waiting, and then another unit was loaded onto the GPU and began running, forcing the one that had been running back to "Waiting to run".

Any ideas?

