
Message boards : Graphics cards (GPUs) : Power Consumption?

Author Message
Coby Walker
Send message
Joined: 16 Jan 10
Posts: 11
Credit: 155,536
RAC: 0
Message 16700 - Posted: 1 May 2010 | 1:15:16 UTC

I leave BOINC running for GPUGrid all the time when I am on, and it uses 100% of my computer's GPU for 5-10 hours a day.

I was wondering, does my computer use more power when BOINC is running? I ask because I always try to save power (turning off screens, unplugging TVs, turning the thermostat up) and I want to know if I am working against myself.

I am not asking whether it costs money to run the program; I am asking whether the computer always draws full power, or draws less when it is doing less.

(Like I said, my computer is at 100% use all the time for GPUGrid, even when I'm on.)

<Also sorry if this is the wrong category, I was unsure where to post>

Profile Paul D. Buck
Send message
Joined: 9 Jun 08
Posts: 1050
Credit: 37,321,185
RAC: 0
Message 16701 - Posted: 1 May 2010 | 1:49:36 UTC

OK, if you run BOINC, you essentially run the computer at full tilt 24/7 unless you make other setting adjustments ... if you run GPU projects, including GPUGrid, you will use even more power because the GPU's speed comes from the juice ... more speed, more juice ...

So, if you are totally into power saving, then BOINC is not for you ...

If you are into saving where you can so you can run BOINC, then, do what you have been doing ...

Or be like me and just live your life ... :)

I don't heat in the winter; though Sacramento does not get THAT cold, it does get to freezing once or twice a year ...

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Send message
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 16712 - Posted: 1 May 2010 | 9:04:48 UTC

There's no free lunch in computing: calculating requires more power than not calculating. Hence all the talk about energy-efficient computing over the last few years.

GPU-Grid will not run your PC at 100% because it doesn't fully use the CPU - it loads your GPU to 100% but leaves the CPU practically idle. You've got a 9800GT, which draws about 25 W when idle (i.e. just showing your desktop). Running GPU-Grid should add about 60 - 70 W to that. If you load your CPU with some project, that adds about another 80 W. And it also depends on your power supply: if a machine uses 200 W internally and the PSU runs at 75% efficiency (an old and/or cheap model), you'd draw about 267 W from the wall.
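To make the arithmetic concrete, here is a small sketch using the rough component figures above. The wattages are the estimates from this post, not measurements, and the efficiency value is just illustrative:

```python
# Rough power figures from the post (watts; estimates, not measurements)
IDLE_SYSTEM = 25       # 9800GT machine just showing the desktop
GPUGRID_LOAD = 65      # extra draw while crunching GPU-Grid (midpoint of 60-70 W)
CPU_PROJECT = 80       # extra draw with a CPU project loading the cores
PSU_EFFICIENCY = 0.75  # old and/or cheap power supply

# Power the components actually consume
internal = IDLE_SYSTEM + GPUGRID_LOAD + CPU_PROJECT  # 170 W in this sketch

# The PSU loses some power as heat, so the wall outlet sees more
wall = internal / PSU_EFFICIENCY

print(f"internal: {internal} W, at the wall: {wall:.0f} W")
```

The key point is the division by efficiency: whatever the components use, the wall draw is always higher by that factor.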

Running GPU-Grid is certainly more beneficial to mankind than having a TV sit in standby ... but the 1-2 W a device draws in standby pales in comparison to what a modern GPU uses under load.

Also note that one of your recent WUs was returned too late, so it was sent out to someone else and returned before your result was in. This doesn't give you credit and practically wastes the power you put into crunching this WU.

MrS
____________
Scanning for our furry friends since Jan 2002

Coby Walker
Send message
Joined: 16 Jan 10
Posts: 11
Credit: 155,536
RAC: 0
Message 16736 - Posted: 2 May 2010 | 1:54:30 UTC - in response to Message 16712.

Thanks for the response. I wasn't sure if the power supply just did its part and delivered the steady stream of power it was rated at.

So there is a direct correlation between computer usage and power usage. Thanks.

BTW, yeah, I had Docking@home for the CPU one.

"Also note that one of your recent WUs was returned too late, so it was sent out to someone else and returned before your result was in. This doesn't give you credit and practically wastes the power you put into crunching this WU."
well that sucks

I will try my best to get this one in on time and will continue telling others to do this, to help out.

Profile Paul D. Buck
Send message
Joined: 9 Jun 08
Posts: 1050
Credit: 37,321,185
RAC: 0
Message 16738 - Posted: 2 May 2010 | 4:00:39 UTC

Direct but not necessarily linear ... :)

In some cases the speed goes up by two, power by four ...

It also depends on how you get there from here ...
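The "speed times two, power times four" case comes from how dynamic power scales in silicon: roughly proportional to clock frequency times voltage squared, and reaching a much higher clock usually means raising the voltage too. A hypothetical illustration (the base wattage and scaling factors are made up for the example):

```python
def dynamic_power(base_power, freq_scale, volt_scale):
    """Dynamic CMOS power scales roughly as P proportional to f * V^2."""
    return base_power * freq_scale * volt_scale ** 2

base = 100.0  # watts at stock clocks (hypothetical)

# Doubling the clock at the same voltage only doubles power ...
print(dynamic_power(base, 2.0, 1.0))  # 200.0

# ... but if the higher clock needs ~41% more voltage, power quadruples
print(dynamic_power(base, 2.0, 2 ** 0.5))
```

So "direct but not linear" is exactly right: the relationship between speed and power depends on how the extra speed is obtained.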

Adding a GPU project with a mid- to high-end card will just about double your computer's power draw (a GTX260 is rated at about 170 W, a GTX295 at 312 W, and so on), but credit production will go up by a factor of 4-10 (or more) ... as the other side of the coin ...

For example, I probably make 2-40K a day from my CPUs running a lot of NCI projects, for which the pay is great for the CPU load involved ... but I also run standard CPU projects at the same time, earning about 1-5K per computer on the CPU side ... all told, I am making about 700K a day because of the 7 GPUs I run ...

It has been so long since I have run without a GPU in the mix that I have no idea what I make on pure CPU, though I could make an estimate ...


Let's see, for the numbers in BOINCstats as of right now:

NCI Projects: 13,630 (FreeHAL and WUProj)
GPU Projects: 668,081 Collatz, DNETC, MW
-----------------
Total: 681,711

Daily Total: 705,173
Delta = 23,462

So, I have five computers (three i7s, one quad, and an 8-core Mac) earning about 5K each ... all told, about 3.3% of my daily earnings comes from CPU work ... roughly 95% comes from the 7 GPUs ...
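Those BOINCstats figures check out against each other; the delta between the daily total and the listed projects is what the standard CPU projects contribute. A quick verification using the numbers copied from above:

```python
# Daily credit figures from the post
nci = 13_630           # FreeHAL and WUProj
gpu = 668_081          # Collatz, DNETC, MW
daily_total = 705_173

tracked = nci + gpu                    # the projects itemized above
cpu_standard = daily_total - tracked   # the delta: standard CPU projects

print(f"tracked: {tracked:,}")                         # matches 681,711
print(f"standard CPU delta: {cpu_standard:,}")         # matches 23,462
print(f"CPU share: {cpu_standard / daily_total:.1%}")  # about 3.3%
print(f"GPU share: {gpu / daily_total:.1%}")
```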

Of course the flip side is that not all projects' programs are amenable to what we call vectorization, which is what GPUs do best: processing vectors of numbers ... if the programs are, then, well, Bob's your uncle and you can run the application 60-100 times faster (or more) on a GPU than on the CPU ... but most problems are not vector problems, so GPU processing is out ... this is one of the reasons the Cray computers, as wonderful as they were, did not monopolize the super-computer marketplace in their day ... there are some problems that would not run efficiently on the architecture ... but for those that did ... they ran like a scalded cat ...

On the other hand, every project we can get to use GPUs frees up the CPU cores now doing that work ... personal example: I don't run MW or Collatz on the CPU at all ... why bother running a task for hours on the CPU when I have GPUs that do the same task in a minute and change (MW) or a few minutes (mostly less than ten, Collatz) ... so those CPU cores run other projects that don't have GPU applications yet, or never will ...

Oh, and I have GPUs with about the same power draw that do work at rates about 4x apart: a GTX280 against an HD5870, for example, draw about the same power, but the ATI card is about 4 times faster at the work ... of course the ATI card does not work here at all (yet), so faster yes, always usable, no ... tradeoffs abound ...

Heck if it was simple or easy, anyone could play ... :)
