
Message boards : Graphics cards (GPUs) : Most cost effective setup?

matlock
Joined: 12 Dec 11
Posts: 34
Credit: 86,423,547
RAC: 0
Message 33564 - Posted: 20 Oct 2013 | 19:59:45 UTC

I currently have a RAC of 292k+ with a single GTX 660, running on Linux. The GPU was under $200 CAD, and I am currently paying 6.9 cents (CAD) per kWh. I haven't measured the draw at the wall, but I feel like this is a pretty good setup. Just curious :)

kingcarcas
Joined: 27 Oct 09
Posts: 18
Credit: 378,626,631
RAC: 0
Message 33566 - Posted: 21 Oct 2013 | 3:51:23 UTC - in response to Message 33564.

Yes it is. I'm still trying to figure things out in terms of performance per dollar/watt while upgrading my old C2D/GT 240/430 rigs. My main rig had a GTX 460, and I'm also looking at either a 660 or 660 Ti, since I heard they were getting even more price drops in November.

Microcenter used to have CPU/mobo combo deals for around $150; now it seems it's only AMD 4- and 6-cores, and the i3s are more expensive for some reason...

Duane Bong
Joined: 21 Feb 10
Posts: 16
Credit: 736,625,284
RAC: 534,863
Message 33567 - Posted: 21 Oct 2013 | 4:25:37 UTC

I'm interested in this too. My current rig has a GTX 460 1GB @ 800MHz. I'm thinking of upgrading to either a 660 / 660 Ti / 750 Ti (assuming the latter turns out to be a 660 with 256-bit memory and higher clocks).

In particular, how does electricity cost affect the total cost over a 2-3 year lifespan? I've seen reviews saying the 660 Ti is very energy efficient (lower clocks + more shader units vs. higher clock speed + fewer shader units). Also, going to a 256-bit memory bus and having more ROPs doesn't make a difference for GPUGrid.

Profile skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 33570 - Posted: 21 Oct 2013 | 15:09:02 UTC - in response to Message 33564.
Last modified: 21 Oct 2013 | 22:55:59 UTC

Due to the 'MSU overhead' (Motherboard, CPU, RAM, Drives, fans), in terms of credits/Watt the best systems have multiple GPU's.
However, the power consumption of the other MSU components and the efficiency of the PSU (80%, 85%, 89% or 93%) is also important.

For this project, having a high-end CPU (that crunches CPU projects) and one entry-level GPU is the worst setup.

Then there is the Operating System to consider. Linux and XP are around 11% faster than Vista, Windows 7 and Windows 8. The 2008 and 2012-ish servers are between 3 and 8% slower, last time I checked. This basically means that if you are not using Linux or XP you are losing 11% performance.

I specifically built a GPU centric system to run GPUGrid work on the cheap. I just used a ~£50 Intel Pentium G2020 CPU (runs @ 2.90GHz), a ~£65 motherboard (3 PCIE slots), 8GB RAM for £40, a used HDD (Free) and a ~£75 PSU (~88% efficient). At present I have a GTX670 ~£180 and a GTX650TiBoost ~£150 in that system. It pulls 300W at the wall. I don't run any CPU intensive apps on it. When it's stable (which it has been for the last couple of days) it gets 660K/day.

Unfortunately for me I have to pay about three times more than Matlock for electricity (~£0.15/kWh), and for others it's a lot more.

Anyway, buying and running it over 2 years would cost ~£1350, and I would get ~481M credits. The £790 running costs would be around $300 to $350 for Matlock, which demonstrates that in the UK running costs matter more than component costs. In some parts of the US and Canada electricity is a lot cheaper, but in Germany it's even more important to get an efficient system. Due to the running costs, cards such as the GTX 580 would still be very good in parts of Canada and the US, but a poor choice in Denmark, Germany or Switzerland (on average).
Electricity Pricing
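The cost-per-credit arithmetic above can be sketched in a few lines (figures taken from the post; the hardware breakdown is the approximate prices quoted, not an exact invoice):

```python
# Rough 2-year cost model for the GPU-centric build described above.
# All figures come from the post; treat this as a sketch, not an exact account.
HARDWARE_GBP = 50 + 65 + 40 + 75 + 180 + 150  # CPU, board, RAM, PSU, GTX670, GTX650TiBoost
DRAW_W = 300                                  # measured at the wall
PRICE_GBP_PER_KWH = 0.15                      # UK rate quoted in the post
HOURS = 2 * 365 * 24                          # two years of 24/7 crunching

energy_kwh = DRAW_W / 1000 * HOURS
running_gbp = energy_kwh * PRICE_GBP_PER_KWH
total_gbp = HARDWARE_GBP + running_gbp
credits = 660_000 * 2 * 365                   # ~660K/day sustained

print(f"{energy_kwh:.0f} kWh, running £{running_gbp:.0f}, total £{total_gbp:.0f}")
print(f"~{credits / 1e6:.0f}M credits, £{total_gbp / (credits / 1e6):.2f} per million")
```

This reproduces the ~£1350 total and ~481M credits quoted above.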
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

Hype
Joined: 21 Nov 11
Posts: 10
Credit: 8,509,903
RAC: 0
Message 33575 - Posted: 21 Oct 2013 | 17:53:05 UTC

Hi,

I'm trying to build a very cost effective machine at the moment.
I'm from Germany and I can't believe that you're only paying 7 cents per kWh in Canada, holy ******* :)
I have to pay 25 cents (EUR) in Germany!
That's 35 cents (CAD) or 0.21 pounds.

Running a machine with 300 watts 24/7 is just too expensive.
Can you recommend a setup with around 100-150 watts?
Maybe some older stuff?

Profile skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 33579 - Posted: 21 Oct 2013 | 21:12:20 UTC - in response to Message 33575.
Last modified: 21 Oct 2013 | 21:12:57 UTC

If you really wanted to keep to 100W you would need to get a GTX650 (65W TDP) and run it in a mini-ITX setup. Perhaps an Atom (though Bay Trail might not show up until Feb/Apr '14), an AMD E2-1800 (18W TDP), a Celeron J1850 (10W) or a Pentium J2850 (10W), but possibly even up to an Intel i3-3220T 3rd-gen 2.6GHz LV processor (35W TDP), so long as you didn't crunch CPU work on it.

However, I wouldn't buy anything lower than a GTX650Ti (110W TDP, but probably going to use ~100W, or could be made to). That would leave you with ~50W to play with to stay under 150W, so you would be stuck with a low-end CPU (as above). With one module of RAM, an SSD and no CD/DVD/Blu-ray drive you should draw under 150W.
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

TJ
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Message 33580 - Posted: 21 Oct 2013 | 21:17:14 UTC - in response to Message 33570.

Skgiven, if you run it for 2 years you would get more credits, I think... calculation error?

Nice overview of the electricity prices, but for me in the Netherlands it's not right (anymore). I pay just under 17 euro cents per kilowatt-hour. That is the all-in price, including meter costs, taxes, network costs, invoice costs, etc. During night time and weekends prices are lower, but I have taken that into account in my calculations. And I am not with the cheapest provider, but one with good service and no power outages, which is more important to me.

And Hype, don't start with older stuff; it consumes more energy, is more error prone, and produces more heat. It will give you a lot of trouble, as I can tell you from my own experience.
If you want to crunch for the good cause, build one decent system that you can use for other things as well. There are some good shops in Germany, where I have bought parts myself. Even Amazon.de (via this project's link) has good prices.
You can also check EVGA.eu; they have a shop in Germany with B-stock parts, which are new but without any accessories.

____________
Greetings from TJ

Duane Bong
Joined: 21 Feb 10
Posts: 16
Credit: 736,625,284
RAC: 534,863
Message 33581 - Posted: 22 Oct 2013 | 2:28:40 UTC

Thanks for the responses. I'm using my CPU for WCG projects, while the (single) GPU runs GPUGrid. The setup uses hyperthreading, so the GPU doesn't consume a whole CPU core. I suppose my view of efficiency is to find an efficient GPU and put it to work, rather than building a dedicated multi-GPU setup.

In Singapore, at residential rates, we pay ~£0.13/kWh (prices valid until the end of Q4 2013). Just a bit cheaper than in Europe, whereas Canada is really cheap compared to all of us!

On the flip side, I remember using my PC for room heating when I was living in London, so the heat didn't go to waste. In sunny Singapore, however, I need to turn on the air conditioning to remove it from the room.

matlock
Joined: 12 Dec 11
Posts: 34
Credit: 86,423,547
RAC: 0
Message 33585 - Posted: 23 Oct 2013 | 5:33:44 UTC

I should clarify that 7 cents is the base rate in British Columbia. With fees and taxes it comes to about 9 cents per kWh.

I agree with skgiven, that a single GPU doesn't take advantage of the system overhead enough. I plan to add a second GTX 660 at some point and hopefully will get close to a 600k RAC. This will draw maybe between 400-450W at load?

Profile skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 33590 - Posted: 23 Oct 2013 | 13:05:44 UTC - in response to Message 33585.
Last modified: 23 Oct 2013 | 13:07:06 UTC

My G2020 supports a GTX670 and a GTX650TiBoost and is presently drawing 315W (240V), also on Linux. Two GTX660's would have the same crunching performance, which roughly matches a Titan. The power usage of two GTX660's is going to be about the same as a 670 plus a 650TiBoost, but your AMD FX-6300 has a TDP of 95W, whereas my G2020's TDP is only 55W and it's probably not drawing more than 30W. My guess is your system with an extra GTX660 would use around 350 to 380W, assuming you are not overclocking, not crunching CPU projects (which you don't appear to do) and you keep the power usage in check with good cooling. Your theoretical RAC would also be around 660K/day, but you're probably going to get the odd failure, which would bring it down to around 600K/day.

GoodFodder
Joined: 4 Oct 12
Posts: 53
Credit: 333,467,496
RAC: 0
Message 33604 - Posted: 24 Oct 2013 | 15:22:28 UTC
Last modified: 24 Oct 2013 | 15:23:41 UTC

I'll throw in my tuppence, as I'm running a budget machine:

G2020 on a MSI Z77 MATX MB, an old slow 160GB 2.5" HDD, 1x 4GB DIMM
300W 80+ PSU, Win XP x86, 2x gtx650ti (1GB) Mild OC (1058,3000), 2x 120mm fans

It pulls at most 192W at the wall, around 150W used internally, which is an optimal 50% of the PSU rating.

If I were to build a new machine now, I'd be looking for two 600-series cards (GTX 650 Ti Boost 2GB or above) in a bargain bucket.

Jim1348
Joined: 28 Jul 12
Posts: 819
Credit: 1,591,285,971
RAC: 0
Message 33605 - Posted: 24 Oct 2013 | 22:13:33 UTC - in response to Message 33585.

I agree with skgiven, that a single GPU doesn't take advantage of the system overhead enough. I plan to add a second GTX 660 at some point and hopefully will get close to a 600k RAC. This will draw maybe between 400-450W at load?

I run two GTX 660s on an Ivy Bridge i7-3770 board, with two virtual cores devoted to the GPUs and the other six running CEP2 on World Community Grid. It draws 345 watts from the plug, using a Seasonic bronze 85+ power supply. (These days I buy the Rosewill Gold 90+ supplies from Newegg, if you can get them in Canada.) It is not overclocked, and I think I was getting around 550k RAC, maybe a little more, but I just started up again and don't have the statistics yet.

matlock
Joined: 12 Dec 11
Posts: 34
Credit: 86,423,547
RAC: 0
Message 33607 - Posted: 25 Oct 2013 | 1:57:01 UTC - in response to Message 33590.

I'm curious about CPU power consumption when not using all of the cores. My AMD FX-6300 has 3 modules (6 threads), and my Kepler GPU will keep one thread going. I assume the consumption is higher than simply dividing 95W by 3. As a guess, if it uses 40W with one GPU and maybe 70W with two GPUs, then the gap versus a G2020 utilizing its 2 cores may not be as big?

Looking at the TDPs of Ivy Bridge i5's vs i7's, they are the same, so I assume that hyper-threading in the i7's adds very little power draw over the i5. I understand that the performance gain from virtual cores is really only seen when the physical core count is exceeded: if you run 8 threads on an i7, the individual threads will likely be slower than 4 threads running on dedicated cores. This is a loose explanation, but it hopefully highlights why I think CPU power consumption should mainly consider physical cores.

Duane Bong
Joined: 21 Feb 10
Posts: 16
Credit: 736,625,284
RAC: 534,863
Message 33608 - Posted: 25 Oct 2013 | 2:04:17 UTC - in response to Message 33605.

two virtual cores devoted to the GPUs and the other six running CEP2 on World Community Grid.


Interesting... was the optimum to use two HT cores for the GPUs? Would you get more total output if you dedicated just 1 HT core to the GPUs and 7 to CEP2 (WCG)?

Jim1348
Joined: 28 Jul 12
Posts: 819
Credit: 1,591,285,971
RAC: 0
Message 33610 - Posted: 25 Oct 2013 | 7:35:45 UTC - in response to Message 33608.

Interesting... was the optimum to use two HT cores for the GPUs? Would you get more total output if you dedicated just 1 HT core to the GPUs and 7 to CEP2 (WGC)?

I take a hit in output if I use only a single CPU core for both, though at the moment I don't remember how bad it is. I think it depends on the type of work unit; some can live with only half a virtual core, and those don't fare so badly, but even there you get a small loss. Since the GPU is the higher-output component, I like to keep it fed properly.

By the way, I think the Seasonic is actually called an 80+, though it runs at about 85 percent efficiency at my load (it is rated at 550 watts).

TJ
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Message 33614 - Posted: 25 Oct 2013 | 12:10:06 UTC - in response to Message 33610.

There are 80+ Bronze, Gold and Platinum ones, and reading through the specs of several brands, it seems that Platinum is the most efficient and Bronze the least.

Jim1348
Joined: 28 Jul 12
Posts: 819
Credit: 1,591,285,971
RAC: 0
Message 33615 - Posted: 25 Oct 2013 | 13:34:11 UTC - in response to Message 33614.

There are 80+ Bronze, Gold and Platinum ones, and reading through the specs of several brands, it seems that Platinum is the most efficient and Bronze the least.

True. The Golds are now usually around 90% efficient over most of their range, and the platinums are slightly higher insofar as their rated minimum values are concerned. But in practice, the last time I was shopping, the Rosewill Golds were in reality as good as the Platinums over most of their range, and there was no point in paying the cost difference. It depends on the brand of course.

But remember that increasing the efficiency by only 1 percent (i.e., from 90% to 91%) decreases the heat the power supply has to dissipate by about 10%, so its fan will run less and can be quieter, and the components will tend to last longer. Also, a good brand will actually deliver its rated value, unlike lesser brands. I often see people buying ridiculously large power supplies for fairly ordinary upper mid-range cards, which should not be necessary at all.
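The one-point-of-efficiency-to-ten-percent-of-heat relationship checks out numerically; a quick sketch (the 400 W load is an arbitrary example figure):

```python
# Heat dissipated inside a PSU for a fixed DC load at a given efficiency:
# AC input (load / efficiency) minus DC output (load) is turned into heat.
def psu_heat_w(dc_load_w, efficiency):
    return dc_load_w / efficiency - dc_load_w

h90 = psu_heat_w(400, 0.90)   # heat at 90% efficiency
h91 = psu_heat_w(400, 0.91)   # heat at 91% efficiency
print(f"90%: {h90:.1f} W, 91%: {h91:.1f} W, heat reduced by {(h90 - h91) / h90:.0%}")
```

So a single point of efficiency trims the PSU's internal heat by roughly a tenth, regardless of the exact load chosen.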

Hype
Joined: 21 Nov 11
Posts: 10
Credit: 8,509,903
RAC: 0
Message 33618 - Posted: 25 Oct 2013 | 18:50:20 UTC

I'm running 2 x GTX 570 and an OC'd i5-4670K with a 550W be quiet! PSU at the moment.
No problems so far; the machine draws about 450-500W under full load.
I don't understand why some people on Google recommend an 850W PSU for this setup.

Profile skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 33620 - Posted: 25 Oct 2013 | 20:34:11 UTC - in response to Message 33610.
Last modified: 26 Oct 2013 | 4:28:46 UTC

Interesting... was the optimum to use two HT cores for the GPUs? Would you get more total output if you dedicated just 1 HT core to the GPUs and 7 to CEP2 (WCG)?
I take a hit in output if I use only a single CPU core for both, though at the moment I don't remember how bad it is. I think it depends on the type of work unit; some can live with only half a virtual core, and those don't fare so badly, but even there you get a small loss. Since the GPU is the higher-output component, I like to keep it fed properly.

By the way, I think the Seasonic is actually called an 80+, though it runs at about 85 percent efficiency at my load (it is rated at 550 watts).


My i7-3770K (still @4.2GHz) supports 3 Kepler GPU's (for which the GPUGrid apps dedicate a full CPU/thread), a GTX 770, 660Ti and 660.

With a 70% CPU usage configuration, one of several reasonably happy mediums (typically 49% to 76%), Boinc runs 4 CPU tasks and 3 GPU tasks. The CPU usage is around 90% meaning that 7 of the 8 CPU threads are fully used and the 8th thread is largely available (but gets some use).
However, even with this setup my 770 struggles a bit; GPU usage is only ~74%. The 660 is 91% and the 660Ti is ~87%.
The 770 struggles the most as it's the best GPU of the three and the most reliant on the CPU, but this depends on the CPU projects and the GPU task types being run (utilization is not always the same).

When I set CPU usage in Boinc to 51% Boinc runs 3 CPU tasks and 3 GPU tasks, uses just over 75% of the CPU (6/8), GPU usages are:
GTX770 ~80%
GTX660Ti ~93%
GTX660 ~89%

On the face of it that's only a 6% improvement for the 770 and the 660Ti (and a 2% drop for the 660). However, runtimes tend to show a greater improvement.

When I let Boinc run at 100% CPU usage (the default GPUGrid/Boinc setup) my WU's say they will use ~0.59 CPU cores per GPUGrid task. The scheduler sees this as ~1.77 CPUs for the three tasks, and since that is less than two logical CPUs (threads in my case) it will allow 7 CPU tasks to run alongside the 3 GPU tasks. With such a BAD setup my GPU utilization fell to ~55% on average and was also very spiky (dropping to 30% at times and occasionally peaking above 80%). Also worth noting: some CPU projects suffer extremely badly with this setup; it can impact badly on VMs and force Boinc into panic mode...

Clocks also drop - my GTX660Ti drops from 1189/1202 to 1058MHz and my 770 sometimes drops from 1202 to 1045. My 660 for some reason remained oblivious to just about everything and stayed exactly at 1071MHz.

When I only run GPUGrid work my GPU usage rises the most: my GTX770 goes up to 92% and my GTX660 rises to 96%. For the GTX770, that optimum is far better than with the CPU usage set to 100%.

Unfortunately, we just can't have it all our own way and utilize every resource to 100%; there is always contention. If you choose to favour your CPU that's up to you, but I tend to favour my GPU's, as they do more work and cost a lot to buy and run.

The best setup for GPUGrid is 4 Titans supported by an overclocked i7 with HT off on Linux and only running GPUGrid work. Anything close to that is a great setup.

TJ
Joined: 26 Jun 09
Posts: 815
Credit: 1,470,385,294
RAC: 0
Message 33626 - Posted: 26 Oct 2013 | 15:12:58 UTC - in response to Message 33618.

I'm running 2 x GTX 570 and a OC'd i5 4670k with a 550W be quiet! psu at the moment.
No problems so far, the machine draws about 450-500W under full load.
I don't understand why some people on Google recommend a 850W psu for this setup.

If your system is pulling 500W from a 550W PSU, it will run less efficiently and hotter, with a shorter lifespan, and it may result in some GPU WU errors.
The advice is to use a PSU with plenty of headroom. Working at 50-70% load (so an 850-1000W PSU in your case) is best for performance, efficiency and heat production.

Profile skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 33633 - Posted: 27 Oct 2013 | 10:58:19 UTC - in response to Message 33626.
Last modified: 27 Oct 2013 | 10:59:57 UTC

450W from a 550W PSU might be pushing it a bit, but it really depends on the PSU: some are perfectly capable of supplying >80% of their maximum load continuously and efficiently, others are not.

Take a TX650M for example. From a 230V input its maximum efficiency is ~88% (50% load is its optimum), but at ~75% load its efficiency has only fallen to 87%, and even at 100% load (not recommended) it is still 85%. A better model would hold 90% efficiency at 90% load, and a top model could be 92% efficient at full load.

If you are worried about the 550W PSU, look up its efficiency. If you think you need to reduce the load, stick with stock CPU clocks.

BTW, Haswell needs a PSU that can deliver as little as 0.05A on the 12V rail to support its C6 and C7 power states (but you can configure a system to stay at a high clock).

matlock
Joined: 12 Dec 11
Posts: 34
Credit: 86,423,547
RAC: 0
Message 33643 - Posted: 27 Oct 2013 | 18:38:25 UTC - in response to Message 33633.

Nice PSU graphs. I have an HX850, so I feel it should be pretty efficient when I get two GTX 660s in there. It looks like a target of 50% is great for budget PSUs, but a high end PSU like the AX860i would more easily handle additional load.

Profile dskagcommunity
Joined: 28 Apr 11
Posts: 456
Credit: 817,865,789
RAC: 0
Message 33644 - Posted: 27 Oct 2013 | 19:16:04 UTC

Hm, OK, has anything changed in the technology? I heard years ago that PSUs with 80/83/85% certificates only ran at those rates in the upper usage range. So, for example, I bought a 530W PSU for my dual GTX 570 setup here. The PSUs in my other machines don't have much headroom either. But as I read here, that is not the case anymore; good to know O.o
____________
DSKAG Austria Research Team: http://www.research.dskag.at



Hype
Joined: 21 Nov 11
Posts: 10
Credit: 8,509,903
RAC: 0
Message 33645 - Posted: 27 Oct 2013 | 21:20:06 UTC - in response to Message 33626.

I'm running 2 x GTX 570 and a OC'd i5 4670k with a 550W be quiet! psu at the moment.
No problems so far, the machine draws about 450-500W under full load.
I don't understand why some people on Google recommend a 850W psu for this setup.

If your system is pulling 500W on a 550W PSU, it will run less efficient, it will run hotter, with a shorter lifespan and it may result in some errors of GPU WU's.
The advice is to use a PSU with plenty of head room. If it works at 50-70%, so a 1000-850W PSU in your case, is the best for performance, efficiency and heat production.


It's not a 24/7 crunching machine, so it doesn't matter.
500W is the peak, with the CPU at 100% and both GPUs at 99%.
I only do GPU WUs every few days, and while crunching with just the CPU it draws ~150W.

Duane Bong
Joined: 21 Feb 10
Posts: 16
Credit: 736,625,284
RAC: 534,863
Message 33650 - Posted: 28 Oct 2013 | 4:16:49 UTC

Looking at those PSU efficiency charts, there's around a 3% difference in efficiency between the peak at ~45% load and 90% load.

The bigger difference is probably to make sure you get a good power supply in the first place. There's a 5% difference between 80Plus Bronze and 80Plus Gold if you live in a country with 115V power. It widens to 7% difference if you're in a 230V country. There are also Platinum and Titanium grades of power supplies... though price at that level is another issue.

Of course, once you have the quality of the PSU pinned down, the second most important thing is to find a PSU that you'll be operating at around 50% load...

More details here:
http://en.wikipedia.org/wiki/80_Plus
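As a sketch of what those grade differences cost, the 50%-load minimums from that page (115V internal ratings) can be turned into an annual electricity bill for a constant load. The 300 W load and £0.15/kWh rate below are arbitrary example figures:

```python
# 80 Plus minimum efficiency at 50% load, 115 V internal (per the Wikipedia page above).
EFF_AT_50PCT_LOAD = {"80 Plus": 0.80, "Bronze": 0.85, "Silver": 0.88,
                     "Gold": 0.90, "Platinum": 0.92}

def annual_cost(dc_load_w, efficiency, price_per_kwh):
    """Yearly cost of drawing dc_load_w continuously through a PSU of this efficiency."""
    return dc_load_w / efficiency / 1000 * 24 * 365 * price_per_kwh

bronze = annual_cost(300, EFF_AT_50PCT_LOAD["Bronze"], 0.15)
gold = annual_cost(300, EFF_AT_50PCT_LOAD["Gold"], 0.15)
print(f"Bronze: {bronze:.0f}/yr, Gold: {gold:.0f}/yr, saving: {bronze - gold:.0f}/yr")
```

Whether that yearly saving covers the price gap between a Bronze and a Gold unit depends on your load and your tariff.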

Profile X1900AIW
Joined: 12 Sep 08
Posts: 74
Credit: 23,566,124
RAC: 0
Message 33654 - Posted: 28 Oct 2013 | 13:09:22 UTC

An example with my four-year-old PSU: 20-50% load means the range between 130-310 watts. A new one at 90% efficiency (4-6% better than now) starts at 75 Euro (for a PSU of the same rated power), which equals 300 kWh at 0.25 Euro per kWh. At my current draw (~170 watts), 5% better efficiency saves 170 × 5% ≈ 8.5, call it ~10 watts, bringing the draw down to ~161 watts. Spending the equivalent of 300 kWh to save 10 watts requires a minimum runtime of 30,000 hours, or (at my current daily runtime of at most 14 hours) 2,143 days or 5.8 years. That costs more than it's worth. Maybe I should switch to a Platinum PSU. Or simply reduce the DC runtime altogether.
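That payback arithmetic as a sketch (figures from the post; the post rounds the ~8.5 W saving up to ~10 W, which gives its 30,000-hour figure):

```python
# Payback time for a more efficient PSU: express the upgrade price in kWh,
# then divide by the energy saved per hour of runtime.
upgrade_eur = 75.0
price_eur_per_kwh = 0.25
saved_w = 170 * 0.05                            # ~8.5 W (rounded to ~10 W in the post)

upgrade_kwh = upgrade_eur / price_eur_per_kwh   # 300 kWh worth of electricity
payback_hours = upgrade_kwh / (saved_w / 1000)
print(f"{payback_hours:.0f} h, or {payback_hours / 14 / 365:.1f} years at 14 h/day")
```

With the unrounded 8.5 W the payback stretches slightly past the post's 5.8 years; with 10 W saved it comes out at exactly 30,000 hours.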

Profile skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 33664 - Posted: 29 Oct 2013 | 13:48:04 UTC - in response to Message 33654.

For a good crunching system (say 2 high-end GPUs or 3 mid-to-high-end GPUs) with a power draw of say 500W, you would notice the difference more. The more energy-efficient PSU would result in a cooler system, and might save you money in the long run.
To work on my examples and compare a decent TX650M with the high-end AX860i: at 500W (230V) the efficiencies are 86% and 94%, a difference of 8%.
8% of 500W is 40W. Running 24/7 you would save 0.96 kWh per day.
At a cost of £0.16/kWh that's about 15 pence a day, or £56/year.
These PSUs cost about £90 (TX650M) and £144 (AX860i) in the UK, so the better one would roughly pay for itself in a year.
Obviously with a lower power draw and cheaper electricity it would take longer, but in some countries the better PSU would pay for itself within 6 months or less.
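The saving works out as quoted; step by step (mirroring the post's arithmetic, which treats the 8-point efficiency gap as a straight fraction of the 500 W draw):

```python
# Daily and yearly saving from an 8-point PSU efficiency difference at a 500 W draw.
draw_w = 500
eff_diff = 0.94 - 0.86                              # AX860i vs TX650M at this load
saved_kwh_per_day = draw_w * eff_diff / 1000 * 24   # ~0.96 kWh/day
cost_per_day_gbp = saved_kwh_per_day * 0.16         # at £0.16/kWh
print(f"£{cost_per_day_gbp:.2f}/day, £{cost_per_day_gbp * 365:.0f}/year")
```

Swap in your own draw and tariff to see where the break-even moves.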
