Message boards : Graphics cards (GPUs) : power consumption of contemporary graphics cards

Author Message
ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 16245 - Posted: 11 Apr 2010 | 22:04:32 UTC
Last modified: 11 Apr 2010 | 22:05:11 UTC

Measured by XBit-Labs. What sets them apart from others is that they measure the cards' power consumption directly, not that of the entire system. The Crysis workload should be approximately comparable to GPU-Grid ... or maybe a bit higher.
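To see why direct measurement matters, here is a minimal sketch of the arithmetic. All numbers below (the wall readings, the 80% PSU efficiency) are hypothetical assumptions for illustration, not XBit-Labs data: subtracting two at-the-wall readings still includes the power supply's conversion loss, so it overstates what the card itself draws.

```python
# Illustrative only: why "at the wall" measurements overstate a card's draw.
# All numbers are made-up assumptions, not XBit-Labs data.

def card_power_from_wall(wall_idle_w, wall_load_w, psu_efficiency):
    """Estimate the extra DC power the card pulls under load.

    The difference between two wall (AC) readings still contains the
    PSU's conversion loss, so it must be scaled by the efficiency to
    approximate what the card itself consumes.
    """
    ac_delta = wall_load_w - wall_idle_w   # extra AC power drawn under load
    return ac_delta * psu_efficiency       # approximate DC power to the card

# Hypothetical system: 150 W at the wall idle, 400 W under GPU load,
# with a PSU that is 80% efficient at this load point.
naive = 400 - 150                              # 250 W "card power" from wall readings
direct = card_power_from_wall(150, 400, 0.80)  # ~200 W actually at the card
print(naive, direct)
```

The 50 W gap between the two figures is pure PSU loss, which is exactly the error a whole-system measurement folds into the card's number.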

MrS
____________
Scanning for our furry friends since Jan 2002

Paul D. Buck
Joined: 9 Jun 08
Posts: 1050
Credit: 37,321,185
RAC: 0
Message 16379 - Posted: 17 Apr 2010 | 21:21:33 UTC

For only a slight increase in power the HD5870 does far more than the GTX260 ... though sadly still only 3 projects have good support for ATI cards. Though if rumor be true, progress is being made on SaH Beta at long last for an ATI version ... something I have been meaning to look into ...

Though my next upgrade would be to get one of the new i7s with 12 logical CPUs and 4 slots for GPU cards ... then slowly replace the Nvidia GPUs with ATI versions ...

Maybe by then there will be an ATI version here also ...

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 16393 - Posted: 18 Apr 2010 | 11:01:05 UTC - in response to Message 16379.

Just don't forget that "running on ATI" is different from "running efficiently on ATI". The HD5870 actually has 320 VLIW shaders, each consisting of 5 scalar but not independent units. So if your algorithm is awkward, or you program it in a bad way, you may extract only 1/5th of the peak performance.
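A back-of-the-envelope sketch of where that 1/5th figure comes from. The shader count, 5-wide VLIW bundles, 850 MHz clock and 2-FLOP multiply-add are the HD5870's published specs; the assumption that the compiler fills either all 5 slots or only 1 per instruction is the illustrative part.

```python
# Back-of-the-envelope VLIW5 utilization for an HD5870 (illustrative).
# 320 VLIW shaders x 5 ALUs each = 1600 scalar units at 850 MHz; a fused
# multiply-add counts as 2 FLOPs. The slot-fill counts are assumptions.

SHADERS = 320          # VLIW bundles
ALUS_PER_SHADER = 5    # co-issued ALUs, not independently schedulable
CLOCK_GHZ = 0.85
FLOPS_PER_ALU = 2      # multiply-add = 2 FLOPs per cycle

def peak_gflops(alus_filled_per_bundle):
    """Theoretical single-precision GFLOP/s when the compiler manages to
    fill `alus_filled_per_bundle` of the 5 slots in each VLIW instruction."""
    return SHADERS * alus_filled_per_bundle * FLOPS_PER_ALU * CLOCK_GHZ

full   = peak_gflops(5)  # well-vectorized code: ~2720 GFLOP/s (the spec-sheet peak)
scalar = peak_gflops(1)  # a pure serial dependency chain: ~544 GFLOP/s
print(full, scalar, scalar / full)
```

So an algorithm whose operations all depend on each other keeps 4 of the 5 slots idle, which is exactly the 1/5th-of-peak case described above.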

MrS
____________
Scanning for our furry friends since Jan 2002

Beyond
Joined: 23 Nov 08
Posts: 1112
Credit: 6,162,416,256
RAC: 0
Message 16406 - Posted: 18 Apr 2010 | 14:26:05 UTC - in response to Message 16393.
Last modified: 18 Apr 2010 | 14:27:14 UTC

Just don't forget that "running on ATI" is different from "running efficiently on ATI". The HD5870 actually has 320 VLIW shaders, each consisting of 5 scalar but not independent units. So if your algorithm is awkward, or you program it in a bad way, you may extract only 1/5th of the peak performance.

MrS

But much of the time the system works well. On MilkyWay, Collatz and DNETC the ATI cards are several times faster than NVidia. To top it off, the equivalent ATI card currently draws much less power as it crunches 3x - 6x the WUs on the above projects. I'm running 6 NVidia and 8 ATI now. It was originally all NVidia, but the times they are a-changing. Maybe it will swing back, but I'd guess not in the near future.

Zydor
Joined: 8 Feb 09
Posts: 252
Credit: 1,309,451
RAC: 0
Message 16408 - Posted: 18 Apr 2010 | 16:17:51 UTC
Last modified: 18 Apr 2010 | 16:19:53 UTC

ATI cards have it hands down in many ways, but on power consumption they really stand out a mile now; there is just no contest. The work that my 5970 pumps out for its power consumption beggars belief. Across the rest of the ATI range, for $100 less you get vastly more output and lower consumption than comparable NVidia cards. There is little choice, of course, if individuals are wedded to the CUDA road, but otherwise ....

I recently bought my first ATI card since NVidia hit the consumer market many many moons ago - 15-20 yrs? - so I'm hardly an ATI fanboy, and I did not make the move lightly. ATI have it in the bag for at least three years now, could be more; maybe it will swing back after that, who knows. For now, though, on this buying cycle (I usually upgrade graphics every 3-4 years) ATI have it by a wide, wide margin on every set of criteria.

Regards
Zy

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 16410 - Posted: 18 Apr 2010 | 16:53:20 UTC - in response to Message 16406.

On MilkyWay, Collatz and DNETC the ATI cards are several times faster than NVidia.


Which reinforces my point: it's only the three of them ;)
And there are 2 reasons that other projects don't follow in large numbers ... and won't, for some time to come:

- programming in CAL instead of CUDA
- making the algorithm work efficiently with the ATI architecture

MrS
____________
Scanning for our furry friends since Jan 2002

Paul D. Buck
Joined: 9 Jun 08
Posts: 1050
Credit: 37,321,185
RAC: 0
Message 16425 - Posted: 19 Apr 2010 | 12:03:43 UTC - in response to Message 16410.

On MilkyWay, Collatz and DNETC the ATI cards are several times faster than NVidia.


Which reinforces my point: it's only the three of them ;)
And there are 2 reasons that other projects don't follow in large numbers ... and won't, for some time to come:

- programming in CAL instead of CUDA
- making the algorithm work efficiently with the ATI architecture

SaH is also working on an ATI application, for AP at least ... I guess it works well enough that some are using it on the production side via the anonymous platform.

PG may also be working on ATI apps, though the focus there is presently on CUDA, especially since the one CUDA application, AP26, essentially allowed them to complete that sub-project in record time.
