
Message boards : Graphics cards (GPUs) : GT240x2

liveonc
Message 16686 - Posted: 30 Apr 2010 | 18:59:11 UTC
Last modified: 30 Apr 2010 | 19:02:23 UTC

Yesterday I was told by a hardware salesman that there was a GT240x2. He wasn't sure about the manufacturer & said that it "might" be POV. I checked out their site & googled, but couldn't find anything that wasn't Chinese. I don't speak Chinese.

Does anyone know who makes the GT240x2 & how much it costs? I'm curious to know if it's cheaper to get 2 of these instead of 1 GTX295, if it'll get as much done, & maybe use less electricity.
____________

Betting Slip
Message 16688 - Posted: 30 Apr 2010 | 19:11:09 UTC - in response to Message 16686.

Well, 2 GT240s will not do as much as 1 GTX295.

I've not heard anything about anyone putting 2 GT240 chips on one card or sandwiching 2 PCBs together.
____________
Radio Caroline, the world's most famous offshore pirate radio station.
Great music since April 1964. Support Radio Caroline Team -
Radio Caroline

Paul D. Buck
Message 16689 - Posted: 30 Apr 2010 | 19:11:53 UTC - in response to Message 16686.
Last modified: 30 Apr 2010 | 19:14:01 UTC

Yesterday I was told by a hardware salesman that there was a GT240x2. He wasn't sure about the manufacturer & said that it "might" be POV. I checked out their site & googled, but couldn't find anything that wasn't Chinese. I don't speak Chinese.

Does anyone know who makes the GT240x2 & how much it costs? I'm curious to know if it's cheaper to get 2 of these instead of 1 GTX295, if it'll get as much done, & maybe use less electricity.

This page for the GTX295 shows benchmark scores of about 17K, 18K and 16K...

This page shows the GT240 benchmark at about 6K, i.e. 1/3 or less of the GTX295 ...

Power consumption will be lower as well by a large margin... The 295 is hard to find as I understand it and will be in the $500-600 range, at least it was when I bought two ...

Here is one source, Tiger Direct, for the card ...

{edit}
The sources you were looking at [b]may[/b] have been a dual card with two GT240s on one frame, in which case you would see the cost more than double and the performance double as well, or nearly so ...

skgiven
Message 16691 - Posted: 30 Apr 2010 | 20:08:11 UTC - in response to Message 16689.

The GT240 is designed to fit into smaller cases, so that is its place.
Sticking two together would make it perform slightly better than a GTX260, but only just. There would be a place for such a card, as there is a gap between the GT240 and the GTX260 in terms of cards using a G200 core that work well here, and it would be more economical to run than a GTX260. That said, it is only CC 1.2, and there are likely to be lesser Fermi cards that would fill the role better (a GTX 440 or something), and Fermi is CC 2.0.

liveonc
Message 16694 - Posted: 30 Apr 2010 | 20:20:11 UTC - in response to Message 16689.

It's not two GT240s that I'm looking for, it's two dual GT240s. If there is one out there, it would need a PCI-E 12V connector, which would mean that I could try to OC it. If so, then maybe a dual GT240 could perform as well as a GTX275, which would mean that 2 dual GT240s could get as much done as a GTX295.

That at least was what I had in mind, & if the price of a dual was 1/3 the price of a GTX295 & it consumed less power, then it starts to make sense.
____________

skgiven
Message 16695 - Posted: 30 Apr 2010 | 20:28:08 UTC - in response to Message 16694.

I thought that.
Actually my quad GT240 system almost does the same amount of work as two GTX275's, slightly more than a GTX295 :)
It does not need any additional power connectors, but a dual card would be better with one, as it would give you more room to clock.

ExtraTerrestrial Apes
Message 16711 - Posted: 1 May 2010 | 8:51:26 UTC

Never heard of such a card. It probably wouldn't make much sense for the manufacturer, as these things are made for games. In games a dual GT240 would have to face the less-than-ideal scaling via SLI (which doesn't affect GPU-Grid) and the memory overhead due to replicating data for both chips (one larger chip wouldn't have to do this). So unless a GT200 chip is considerably more expensive than 2 GT215 chips (the GT240's chip) I can't see manufacturers even considering such an option.

Performance-wise it should be quite good at GPU-Grid, though: it brings along 2 x 96 = 192 shaders at 1.35 GHz, whereas the old GTX260 ran at 1.25 GHz. To tie a GTX275 (240 shaders) it would have to run at ~1.75 GHz, though. 4 GT240 should need approximately 1.56 GHz to tie a stock GTX295 while consuming an estimated 4 x 80 W = 320 W versus 289 W for the GTX295; going by TDP numbers and increasing the GT240 base value of 69 W at 1.34 GHz linearly for an OC of 1.56 GHz. Actual GPU-Grid power consumption will be lower for both cards. And since the architectures are quite similar we can assume a similar reduction factor for both.
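To make that explicit, here is the same estimate as a few lines of Python. The card figures are the ones quoted in this post; the GTX295's 1.242 GHz stock shader clock and the linear power-with-clock scaling are the working assumptions:

```python
# Clock needed for 4 GT240s to tie a stock GTX295, and the resulting power,
# assuming performance and power both scale linearly with shader clock
# (the assumption stated in this post).

GT240_SHADERS = 96
GT240_CLOCK_GHZ = 1.34       # stock shader clock, the 69 W base point
GT240_TDP_W = 69.0
GTX295_SHADERS = 2 * 240     # two GPUs on one card
GTX295_CLOCK_GHZ = 1.242     # stock GTX295 shader clock (assumed here)
GTX295_TDP_W = 289.0

# Shader clock at which 4 GT240s match the GTX295's raw shader throughput:
required_ghz = GTX295_SHADERS * GTX295_CLOCK_GHZ / (4 * GT240_SHADERS)
print(f"required shader clock: {required_ghz:.2f} GHz")   # ~1.55 GHz, i.e. the ~1.56 quoted above

# GT240 TDP scaled linearly from the 69 W / 1.34 GHz base point:
oc_tdp = GT240_TDP_W * required_ghz / GT240_CLOCK_GHZ
print(f"4x GT240 OC'd: {4 * oc_tdp:.0f} W vs GTX295: {GTX295_TDP_W:.0f} W")  # ~320 W vs 289 W
```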

MrS
____________
Scanning for our furry friends since Jan 2002

Snow Crash
Message 16721 - Posted: 1 May 2010 | 15:03:27 UTC - in response to Message 16695.
Last modified: 1 May 2010 | 15:05:07 UTC

I thought that.
Actually my quad GT240 system almost does the same amount of work as two GTX275's, slightly more than a GTX295 :)
It does not need any additional power connectors, but a dual card would be better with one, as it would give you more room to clock.


I took a look through the stats and they tell a different story: my properly running 295 at 1476 will outperform a quad of 240s by at least 2 and maybe even 3 WUs per day. If I crank it up, the gap would be even larger (I can run it at 1656 stable). For the 275 and 295 to produce less they would have to be at stock (or lower), but you have your 240s OC'd to 1620, so I wonder about the comparison you are making. I'm not saying they are not good cards, they are; I'm just trying to make sure we present accurate stats to the best of our ability to help people make their own decisions.
____________
Thanks - Steve

MarkJ
Message 16722 - Posted: 1 May 2010 | 15:42:54 UTC - in response to Message 16686.

Yesterday I was told by a hardware salesman that there was a GT240x2. He wasn't sure about the manufacturer & said that it "might" be POV. I checked out their site & googled, but couldn't find anything that wasn't Chinese. I don't speak Chinese.

Does anyone know who makes the GT240x2 & how much it costs? I'm curious to know if it's cheaper to get 2 of these instead of 1 GTX295, if it'll get as much done, & maybe use less electricity.


I notice that Google mentions a Galaxy card. They have a dual-GPU GTS250, which may be causing some confusion. Certainly no dual GT240 is mentioned on the Galaxy web site.
____________
BOINC blog

skgiven
Message 16723 - Posted: 1 May 2010 | 16:00:22 UTC - in response to Message 16721.
Last modified: 1 May 2010 | 16:27:07 UTC

As usual my opinions are somewhat tunnel-visioned; a dual GT240 would not make for an especially good gaming card, but the G200 cores are on their way out. Last I heard, manufacturing was all but over for most of those cards.
So that just leaves the gap between the GT240 and GTX260 to be filled with older 55nm or 65nm cards such as the GTS250.
I can't see NVidia wanting to keep building with such old manufacturing techniques while at the same time phasing out the G200 GPUs.

The TDP of a GT240 is 69W, so that is 276W for 4 cards.
When natively clocked, I found that my two GT240s (GDDR5) did slightly more work than my slightly overclocked GTX260 sp216, which in turn did slightly more than the reported work of a natively clocked GTX295 (on similar operating systems).
Running 4 GPU tasks, as opposed to zero GPU tasks with zero GPUs in the system, I measured a difference of about 243W. So, as you said MrS, actually running the cards "as a system" works out at less, about 61W per card.
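That per-card figure is just the wall delta split across the cards; a minimal sketch using the numbers measured above:

```python
# Per-card power from the wall-socket measurement: 243 W for 4 cards crunching,
# relative to the same system with no GPUs, split evenly across the cards.
wall_delta_w = 243.0
cards = 4
tdp_w = 69.0
per_card_w = wall_delta_w / cards
print(f"{per_card_w:.0f} W per card ({per_card_w / tdp_w:.0%} of the 69 W TDP)")  # ~61 W (~88%)
```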

Steve, you are quite right to point out that I was talking about my overclocked GT240s and a natively clocked GTX295. However if you look at my GT240 system, and take the RAC, it is misleading because it has not yet levelled out since I moved the cards about (still rising). Another thing to take into consideration is the operating system (your streamlined XP vs my heavily used Vista system).
Although my GT240s are now overclocked, I am using stock voltages and the shaders are at 1.625GHz, so my OC is limited, and there is not an absolute need to rise to 80W per card. Of course a dual card using about 2 X 80W would be nice and would surely bring the reference clocks close to 1.7GHz as well as raising the GPU and RAM.

My 1.625GHz shaders brought the BOINC GFlops up from 281 to 312 peak (note that DDR3 cards are 257, 9% less than the native GDDR5 version, while my OC'd GT240s' 312 is 21% more than that DDR3 figure):
01/05/2010 14:35:14 NVIDIA GPU 0: GeForce GT 240 (driver version 19621, CUDA version 3000, compute capability 1.2, 512MB, 312 GFLOPS peak)
01/05/2010 14:35:14 NVIDIA GPU 1: GeForce GT 240 (driver version 19621, CUDA version 3000, compute capability 1.2, 512MB, 312 GFLOPS peak)
01/05/2010 14:35:14 NVIDIA GPU 2: GeForce GT 240 (driver version 19621, CUDA version 3000, compute capability 1.2, 512MB, 312 GFLOPS peak)
01/05/2010 14:35:14 NVIDIA GPU 3: GeForce GT 240 (driver version 19621, CUDA version 3000, compute capability 1.2, 512MB, 312 GFLOPS peak)
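Those peak figures fit a simple relation, shaders x shader clock x 2 flops per clock. This is a reconstruction from the numbers in this thread, not anything from BOINC's documentation:

```python
# "GFLOPS peak" as reported in the log lines above appears to be:
# shaders x shader clock (GHz) x 2 flops/clock (reconstructed, not official).
def peak_gflops(shaders, shader_clock_ghz, flops_per_clock=2):
    return shaders * shader_clock_ghz * flops_per_clock

print(peak_gflops(96, 1.625))  # 312.0  -> the OC'd value in the log lines
print(peak_gflops(96, 1.340))  # ~257   -> the DDR3 GT240 figure
print(peak_gflops(96, 1.463))  # ~281   -> the "native" GDDR5 value implies ~1.46 GHz shaders
```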

I would guess 4 GT240s (GDDR5) could do about 5% more than a GTX295 (at reference clocks), or at least come very close to matching a GTX295, and if you can run the cards with 1.625GHz shaders then that could rise to anywhere between 10 and 25% (if compared on the same operating system and to a reference GTX295, which of course might very well clock better). Four GT240 DDR3 cards @ 257 BOINC GFlops peak would not fare so well; about 5% less than the native GTX295, and substantially less than a highly overclocked GTX295. So it is an apples and oranges comparison.

Anyway, GT240’s are easier to find and cheaper to buy than GTX295’s, or Fermi’s for that matter!
Also, if a GT240 breaks you still have 3 running and it would only be £50 to £70 to replace one, whereas a GTX295 is not so readily replaceable.
On the other hand you could have 4 highly overclocked GTX295’s in the one box, and blow everything else away, right up until there is a highly optimized Fermi app.

Snow Crash
Message 16726 - Posted: 1 May 2010 | 18:34:15 UTC

I was not relying on RAC or the BOINC benchmarks; I looked at the time it took to process individual WUs. At the end of the day, what it comes down to is simply how long it takes to process a WU. Clock for clock, a 295 will process more than a quad of 240s.
You can play with the numbers all you want: who is OC'd and by how much, my 480 shaders vs your 384, your GDDR5 being faster than my DDR3, OS, CPU, HT on/off ... it just does not matter. A 295, clock for clock, will beat a quad of 240s for output. Finally, like I said before, I think the 240s are good cards; I just think you are needlessly trying to paint the output of 4 of them as better than a 295's.
____________
Thanks - Steve

skgiven
Message 16727 - Posted: 1 May 2010 | 19:04:23 UTC - in response to Message 16726.
Last modified: 1 May 2010 | 19:05:02 UTC

At the minute there are many different WUs. Some long, some short and some Betas.
Linux is clearly faster at the minute.

My four OC'd GT240s are already bringing back about 43K RAC per day, still rising and should even out at over 50K; One of my other GT240s brings back over 14K per day, so I could reach 56K.
From what I can see, your 1 OC'd GTX285 + OC'd GTX295 system brings back 75K daily, and a GTX285 does more work than 50% of a GTX295.

liveonc
Message 16728 - Posted: 1 May 2010 | 19:21:28 UTC - in response to Message 16711.
Last modified: 1 May 2010 | 20:04:52 UTC

Never heard of such a card. It probably wouldn't make much sense for the manufacturer, as these things are made for games. Here a dual GT240 would have to face the less-than-ideal scaling via SLI (contrary to GPU-Grid) and the memory overhead due to replicating data for both chips (one larger chip woudln't have to do this). So unless a GT200 chip is considerably more expensive than 2 GT215 chips (for GT240) I can't see manufacturers even considering such an option.

Performance-wise it should be quite good at GPU-Grid, though: it brings along 2 x 96 = 192 shaders at 1.35 GHz, whereas the old GTX260 ran at 1.25 GHz. To tie a GTX275 (240 shaders) it would have to run at ~1.75 GHz, though. 4 GT240 should need approximately 1.56 GHz to tie a stock GTX295 while consuming an estimated 4 x 80 W = 320 W versus 289 W for the GTX295; going by TDP numbers and increasing the GT240 base value of 69 W at 1.34 GHz linearly for an OC of 1.56 GHz. Actual GPU-Grid power consumption will be lower for both cards. And since the architectures are quite similar we can assume a similar reduction factor for both.

MrS


Looks like nobody knows of such a card; pity. I don't agree, though, that there wouldn't be a place for such a card: the G92b was still sold even after the GT200 came out. You compare the power consumption of GT240s (69W each) directly against a GTX295 (289W), but look at two GTX275s (219W x 2 = 438W) instead: going from two GTX275s to one GTX295 cut power to roughly 66%. So if a dual GT240 scaled the same way, two GT240s' 138W would come down to roughly 91W.

Matrox gets away with selling GPUs that you can attach many monitors to. If an affordable dual GT240 could service 4 screens & you could put 4 of them in a modern PC, you'd have the potential to attach 16 screens to one PC, save power, crunch, & save money.

If 4 dual GT240s could get as much done as 2 GTX295s & used 91W per card, that's 364W compared to 578W, i.e. roughly 63% of the power for the same amount of work (see the sketch below). That's the opposite direction to the one Fermi is going in, & might be that "hole" which Nvidia could fill to argue the validity of such a card.
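Spelling that scaling argument out (TDP numbers as quoted in the thread; the proportional dual-card saving is the assumption being made):

```python
# The assumption: a dual GT240 would shave power in the same proportion as the
# GTX295 did relative to two GTX275s. All TDP figures are from the posts above.
gtx275_tdp, gtx295_tdp, gt240_tdp = 219.0, 289.0, 69.0

dual_ratio = gtx295_tdp / (2 * gtx275_tdp)     # ~0.66
dual_gt240_tdp = dual_ratio * (2 * gt240_tdp)  # ~91 W for a hypothetical dual GT240

quad_dual = 4 * dual_gt240_tdp                 # ~364 W for four dual cards
two_295 = 2 * gtx295_tdp                       # 578 W for two GTX295s
print(f"dual GT240 ~{dual_gt240_tdp:.0f} W; 4 of them ~{quad_dual:.0f} W "
      f"vs {two_295:.0f} W, i.e. {quad_dual / two_295:.0%} of the power")  # ~63%
```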
____________

Snow Crash
Message 16731 - Posted: 1 May 2010 | 20:28:50 UTC - in response to Message 16727.
Last modified: 1 May 2010 | 20:31:04 UTC

At the minute there are many different WUs. Some long, some short and some Betas.
Linux is clearly faster at the minute.

My four OC'd GT240s are already bringing back about 43K RAC per day, still rising and should even out at over 50K; One of my other GT240s brings back over 14K per day, so I could reach 56K.
From what I can see, your 1 OC'd GTX285 + OC'd GTX295 system brings back 75K daily, and a GTX285 does more work than 50% of a GTX295.


I did not include the returns from my 285.
I compared multiple WUs of the same types for both of us.
I threw out highs and lows for both of us.
I came up with an estimate of 2-3 WUs per day better for a 295.

But that does not support your initial thought, so now, after saying RAC was not a good indicator, you use *projections* of what your RAC *may*, *might*, *should*, and *could* be, and compare those fuzzy numbers to my actual RAC? I was not running the project for a few days, so your assumption that my RAC has maxed out is not correct.

If we go back in time to when my 295 was all by itself, my RAC was up over 68k per day. Not a projection; it actually was. So let's compare ... a fuzzy 50+k vs 68k real ... hmmm ... yup, that would be the 2-3 WU estimate I provided, based on real, actual, valid WUs we both returned.

The 240s are good cards but 4 of them cannot match a 295.
Time for me to move on.
____________
Thanks - Steve

liveonc
Message 16732 - Posted: 1 May 2010 | 21:25:27 UTC - in response to Message 16731.
Last modified: 1 May 2010 | 21:26:15 UTC

I did not include the returns from my 285.
I compared multiple WUs of the same types for both of us.
I threw out highs and lows for both of us.
I came up with an estimate of 2-3 WUs per day better for a 295.

But that does not support your initial thought, so now, after saying RAC was not a good indicator, you use *projections* of what your RAC *may*, *might*, *should*, and *could* be, and compare those fuzzy numbers to my actual RAC? I was not running the project for a few days, so your assumption that my RAC has maxed out is not correct.

If we go back in time to when my 295 was all by itself, my RAC was up over 68k per day. Not a projection; it actually was. So let's compare ... a fuzzy 50+k vs 68k real ... hmmm ... yup, that would be the 2-3 WU estimate I provided, based on real, actual, valid WUs we both returned.

The 240s are good cards but 4 of them cannot match a 295.
Time for me to move on.


When comparing GPUs, you have to include the same CPU, RAM, & OS. I've just recently moved two GTX260s from a Windows 7 PC to one running Linux. That box is RAM intensive & used to put GPUGRID.net tasks on hold several times a day while waiting for more RAM, so I moved the RAM over from the Windows 7 PC as well. I had some heat issues & forgot to increase the voltage of the NB, but now it's crunching, & soon I hope to get a RAC of 54-56K on it, while the RAC on Windows 7 with 4GB RAM & two GTX260s maxed out at 48K. That's a difference of up to 8K, just from running on Linux. skgiven, I saw, used Vista for his quad GT240 setup, and as I remember that's worse than Windows 7.
____________

=[PULSAR]=
Message 16733 - Posted: 1 May 2010 | 21:54:03 UTC

Maybe he was mistaken; there is a GTS250x2 coming out, though. That was announced at the beginning of March; there has been no update on a release date, and it is supposed to be released by Zotac.

skgiven
Message 16734 - Posted: 2 May 2010 | 1:27:31 UTC - in response to Message 16733.

Snow Crash, all you had to do was actually say what your daily return was, but you chose not to; you kept your cards to your chest. Why?
If you have a point, just spit it out and say it. You are not helping!

By the way, it was you that argued that XP was much better than Vista and W7; selective memory perhaps?

Oh, and perchance, are you now comparing your overclocked card, and not a natively clocked card? You know, just to keep everyone informed!
Voltages?

As I clearly stated, 2 GT240s (GDDR5) do more work than a GTX260 sp216 - measured on the exact same system and setup!

Your argument that CPU, operating system, and x86/x64 do not matter is just wrong - so who is misleading whom now?


Bikermatt
Message 16737 - Posted: 2 May 2010 | 1:56:59 UTC

While we're talking about GT240s, have any of you compared these cards based on credit per watt? I am running two GT240s right now and getting around 25K per day. I have my system on a Kill A Watt, and each card only adds 50 watts to my system when crunching. So I guess you could say I'm getting 25K per 100 watts? Anyway, I am kind of an efficiency freak, and I am running an MSI 790FX-GD70, so I can run 2 more cards if I want to. The last GT240 I bought was only US $48 shipped, so they seem like an incredible value for crunching right now if you don't mind dealing with rebates. I am planning on picking up another 240 next time I see a great deal on one, but I was wondering if I should hold off on my fourth slot. Anyone have any idea how Fermi is going to compare on a credit-per-watt basis?
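For anyone comparing the same way: the credit-per-watt figure here is just daily output over added wall draw, a trivial sketch using this post's measurements:

```python
# Credits per watt-day: daily credit divided by the extra wall power the cards add.
def credits_per_watt_day(credits_per_day, added_watts):
    return credits_per_day / added_watts

print(credits_per_watt_day(25_000, 2 * 50))  # 250.0 for two GT240s at ~50 W each
```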

Matt

skgiven
Message 16742 - Posted: 2 May 2010 | 12:09:54 UTC - in response to Message 16737.

I have a GTX470. Its TDP is 215W and its idle power consumption is 33W.
The GTX480 has a TDP of 250W and an idle power consumption of 47W.
For comparison,
- GTX275 TDP 219W
- GTX285 TDP 236W
- GTX295 TDP 289W

As for real world usage (on GPUGrid):
When the Fermi in my system is crunching, the system uses about 375W.
When the Fermi is not crunching it uses about 275W.
So when crunching it uses an extra 100W (in its present form), compared to being idle, which suggests the card only uses 137W to crunch on GPUGrid (mind you I'm having issues with the performance of this card, so no guarantees until I test it in another system).

I could work out what the idle power usage is if I swap it with a GT240:
The GT240 card itself only uses about 3.5 to 6W when idle, but the system uses an additional 10W in total, which is what we really need to know.

Initial figures suggest that the GTX480 is about 35% faster than a GTX285.
The GTX470 is about 21% slower than a GTX480, which would make it about 14% faster than a GTX285, at the minute!

So, a GTX470 can already do more work than a GTX285 (should tasks turn up) and uses less power :)
However, the present application has not been refined and developed for Fermi yet.
I expect they will do some basic refinements and release a rudimentary Fermi app, and then work on Fermi refinements over several months.
It might take CUDA refresh releases before we see a very efficient Fermi app.


Going back to comparing a GTX295 with 4 GT240s (much less fun):
Paul's GTX295 (at stock & reference) on a Q9300 and W7 64bit:
Run Time 42,886.53
Credit 11,931.63
http://www.gpugrid.net/workunit.php?wuid=1377976
http://www.gpugrid.net/show_host_detail.php?hostid=52239
24 x 60 x 60 = 86,400 seconds in a day.
86,400 / 42,886.53 = 2.0146 tasks per day.
Since the GTX295 has 2 cores, that is 4.029 tasks per day.
Or, 11,931.63 x 4.029 = 48K credits per day.

My overclocked GT240 with GDDR3 (now one of the 4 cards in the same system), on a Phenom II 940 and W7 64bit:
82,240.24 sec for 11,931.63 points on a GT240.
86,400 / 82,240.24 = 1.0505 tasks per day.
Or 12,535 points per day for that one card, or 50,140 points per day for four of them.

Another one of my overclocked GT240s, this time with GDDR5 (also one of the 4 in the same system):
30,686.46 sec for 5,144 credits.
86,400 / 30,686.46 = 2.8156 tasks per day.
14,484 credits per day for that one card.
Or 57,939 points per day for 4 of these GDDR5 cards.


Overclocked GTX295 running on a faster operating system (XP) and supported by a faster CPU (i7-920):
12,861.44 sec for 4,247.95 credits per core, and the GTX295 has 2 cores:
57,073 points per day.
http://www.gpugrid.net/workunit.php?wuid=1422132
http://www.gpugrid.net/result.php?resultid=2254916

I think the RAC is more informative because some tasks perform better than others. There is likely to be variation between card types and relative task crunching performance; a GT240 might crunch one task more quickly than another, but get the same points, for example.
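All of the task arithmetic above follows one pattern; here it is wrapped up as a small helper, fed with the runtimes and credits quoted in this post:

```python
# Throughput arithmetic used throughout this post: tasks per day from runtime,
# then credits per day (times 2 for the GTX295's two cores).
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

def credits_per_day(runtime_s, credit, gpu_cores=1):
    tasks_per_day = SECONDS_PER_DAY / runtime_s   # per core
    return tasks_per_day * credit * gpu_cores

print(credits_per_day(42_886.53, 11_931.63, gpu_cores=2))  # ~48K: stock GTX295
print(credits_per_day(82_240.24, 11_931.63) * 4)           # ~50K: 4x OC'd GDDR3 GT240
print(credits_per_day(30_686.46,  5_144.00) * 4)           # ~58K: 4x OC'd GDDR5 GT240
print(credits_per_day(12_861.44,  4_247.95, gpu_cores=2))  # ~57K: OC'd GTX295 on XP
```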

ExtraTerrestrial Apes
Message 16745 - Posted: 2 May 2010 | 13:52:53 UTC
Last modified: 2 May 2010 | 13:57:57 UTC

Guys.. come on, don't let this discussion over hard facts turn into personal insults. And let's get a few things straight:

- OS, driver, CPU and whatever do influence the real world performance in some way. But they do so in a similar way for all cards within the same system, so as long as we're not discussing RAC values this doesn't matter (I don't intend to).

- The GT200 and GT215 chips have a very similar architecture (except no double precision for GT215, which doesn't matter here) and thus their shaders perform similarly, on a per clock basis.

- Neither of these cards is particularly memory bandwidth limited (not talking about the DDR3 GT240). The GT240 with GDDR5 has 386 GFlops and 54.4 GB/s bandwidth, i.e. 7.09 GFlop/GB. The GTX295 has 2*894 GFlops and 2*119.9 GB/s, i.e. 7.46 GFlop/GB. That means the GTX295 has a little more memory bandwidth per unit of raw compute capability, but both are clearly in the same league. To put things into perspective: a GT240 with DDR3 would be at 12.06 GFlop/GB, which I think we established costs about 15% GPU-Grid performance compared to the GDDR5 version. Based on linear interpolation one can derive a correction factor of 1.1% for the GTX295, but I'll omit this (the ratios are spelled out in the sketch below).
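The ratios, for anyone who wants to rerun them (the 32 GB/s DDR3 bandwidth is implied by the 12.06 figure rather than quoted directly):

```python
# GFlop-per-GB ratios from the bullet above; GFlops and GB/s as quoted in the post.
def gflop_per_gb(gflops, gb_per_s):
    return gflops / gb_per_s

print(gflop_per_gb(385.9, 54.4))           # ~7.09  GT240 GDDR5
print(gflop_per_gb(2 * 894.2, 2 * 119.9))  # ~7.46  GTX295
print(gflop_per_gb(385.9, 32.0))           # ~12.06 GT240 DDR3 (bandwidth implied)
```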

Summing this up, we can clearly say that in order to judge the cards' performance potential it's sufficient to look at maximum theoretical GFlops. Real world performance will drop by a factor x with 0 < x < 1, but it will be the same x for both chips / cards (it's different for compute capability 1.1 cards, though).

Now where does that get us?

We've got the GTX295 at stock speeds bringing in 2*894.2 = 1788.4 GFlops. Four GT240 GDDR5 land at 4*385.9 = 1543.6 GFlops, i.e. a factor of 1.16 lower. Conversely, the GT240 has to be clocked 16% higher to tie a stock GTX295 and make up for the lower number of shaders. Make it 15% if you want to account for the memory bandwidth differences. 16% would be the 1.56 GHz shader clock for the GT240 I mentioned in my previous post. OC the GTX295 and you quickly reach a clock speed region the GT240 cannot reach. So I think we can all agree on:

I'm not saying they [GT240] are not good cards, they are, I'm just trying to make sure we present accurate stats to the best of our ability to help people make their own decisions.


Regarding power consumption: let's stick to nVidias numbers. They do not represent what you'll get under GPU-Grid, but at least they're somewhat consistent among the different cards. Power draw differences measured at the wall are nice for each specific system, but they include PSU efficiency and idle power draw and are thus not very comparable. Think of it this way: actual power draw under GPU-Grid will be some factor x lower than nVidias number and again this x will be similar for both cards due to their architectural similarities.

Now where does that get us?

GTX295 is rated at 289 W, whereas the GT240 is rated at 69 W, i.e. 276 W for 4 of them. But then we're not comparing stock cards: remember that the GT240 has to be clocked ~15% higher to reach similar performance. Power consumption increases linearly with frequency (I could elaborate on this, but I suppose it's not necessary here), i.e. 69 W * 1.15 = 79 W (or 80 W if you use 16%). So there is a need to factor in the power consumption increase due to overclocking, since now the 4 GT240 cards need 316 W (or 320 W).

There's one big uncertainty in this, though: nVidia tends to rate their dual GPU cards more conservatively than the single chip cards. Which makes sense for games, since under SLI both chips are always going to be used less than a single one. Taking a look at actual card power consumption here shows clearly what I mean:
Under Crysis Warhead a GTX275 consumes 220 W (rated at 219 W) and a GTX295 consumes 312 W (rated at 289 W), whereas under OCCT (which uses all chips to the same extent) the GTX275 draws 218 W and the GTX295 draws 401 W. This does make more sense, as the GTX295 is actually 2 slightly lower clocked GTX275s with less memory / fewer channels. Notice that the GT240 is only measured at 45 W here, though I'm not sure if driver throttling is kicking in for newer measurements (= newer drivers).

Summarizing everything I'd say:
- quad GT240 performance is quite good, but can not match GTX295 if both are OC'ed
- GT240s should be more readily available and even 4 of them are probably cheaper
- power consumption is at worst approximately tied between both solutions, but I tend to think the 4 GT240 will have a significant advantage

MrS

EDIT: whenever I wrote 285 I probably meant 295 :p
And there's not much point dicussing power efficiency of Fermi yet. Performance will increase, we just don't know yet when and by how much.
____________
Scanning for our furry friends since Jan 2002

Paul D. Buck
Message 16747 - Posted: 2 May 2010 | 15:46:41 UTC

Assuming I am the Paul referenced...:)

None of my GPUs are over-clocked beyond what they were set to by the MFGR. Part of that is philosophy (since I am interested in accurate science I have trouble with the idea of running my system's components in potentially unstable portions of the "flight-envelope" of that component), and the other is quite simply heat ... I already have a situation where the room gets too hot as it is ... OC would just make a bad situation worse ...

So, it is quite probable, as ETA notes/implies, that I could get higher performance as well had I chosen to do so ...

So, the comparison is of OC vs. stock ... let the argument continue ... :)

skgiven
Message 16755 - Posted: 2 May 2010 | 21:13:00 UTC - in response to Message 16747.
Last modified: 2 May 2010 | 21:13:17 UTC

Measured the actual idle power consumption of my GTX470 to only be 28W (this includes the power used by the motherboard to support the card).

skgiven
Message 16761 - Posted: 3 May 2010 | 1:06:32 UTC - in response to Message 16755.
Last modified: 3 May 2010 | 1:08:12 UTC

Measured the actual power usage difference of 4 GT240s between natively clocked and OC'd to GPU 610MHz, GDDR5 1800MHz, shaders 1625MHz:
9W total (yes, that is for all 4 cards combined)!

Measured actual power usage of the above 4 overclocked GT240s when running GPUGrid tasks:
System used 340W when crunching GPUGrid tasks
System used 190W when not crunching GPUGrid tasks

Did that while crunching 3 CPU tasks, and then without CPU tasks:
299W crunching only GPUGrid tasks
145W crunching no tasks of any kind

Each card costs the system 10W when idle, so the total Watt usage of 4 OC'd GT240s crunching on GPUGrid is 192W (the ~152W crunching delta plus 4 x 10W of idle draw).
Four natively clocked cards use about 183W.
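A reconstruction of how the 192W figure follows from the wall measurements above:

```python
# GPU load delta (averaged over the two measurement pairs) plus the
# 10 W idle cost per card, all values from the measurements above.
delta_with_cpu_tasks = 340 - 190     # 150 W, measured alongside 3 CPU tasks
delta_without_cpu_tasks = 299 - 145  # 154 W, no CPU tasks running
load_delta_w = (delta_with_cpu_tasks + delta_without_cpu_tasks) / 2  # ~152 W
idle_cost_w = 4 * 10                 # 10 W of system draw per idle card
print(f"4x OC'd GT240 crunching: ~{load_delta_w + idle_cost_w:.0f} W")  # ~192 W
```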

These figures should transfer well to other systems. But people should be more concerned about overall system efficiency than the card alone. A GT240 in an i7-980X system is poorly balanced.

I'm going to look at the actual performance difference from one of the cards when natively clocked to the other overclocked cards, to see what I get for that extra 9W.

skgiven
Message 16795 - Posted: 3 May 2010 | 22:47:20 UTC - in response to Message 16761.

Well,
it would appear that under Vista, with optimized conditions and using four natively clocked GT240s (GDDR5), you could get 53K per day at GPUGrid, as is.

Four overclocked GT240s could bring 56K, as is (several restarts may have held these figures back), and the Beta app (6.22) could see 66K with 4 overclocked GT240s (again on Vista).

If XP is 11% faster, as reported:
4 natively clocked GT240s could get 58K per day on XP,
and 4 OC'd GT240s could get 62K per day (72K with the new app).

WhiteFireDragon
Message 16796 - Posted: 4 May 2010 | 2:07:44 UTC

I'd like to say hi to everyone; it's my first post on this forum.

I'm going to chime in with my numbers, but I don't want to add any more fuel. I was looking for a crunching card only and didn't know what to pick. I should have seen these forums and this thread sooner, as it would have saved me a lot of time. I did my own research and, based on the specs, power consumption, and price, concluded myself that the GT240 is the best value.

When I plugged it in, and after it returned a few WUs, the results were better than I expected. On Win7 64-bit, and while the CPU was crunching at 100% for WCG as well, this card finished a WU in exactly 8 hours for 5,145 points. At this rate, it's at least 15K/day. After slightly OC'ing it and running beta units only, I finish a WU in 21.7 minutes for 281 points, which projects to 18.7K points per day.

With 4 of these, that's 60K/day for regular WUs, and 75K/day for beta WUs. If I OC higher, use Linux/XP, or pull off WCG to dedicate a CPU thread per GPU, then that will be a huge performance boost also. I don't have a GTX295 to compare using the same exact parameters, but from the results posted so far, 4 of these budget cards beat the 295 in price (got mine for $50 AR), power consumption (mine loads at 45W), and computational power (15K/day per card).

skgiven
Message 16805 - Posted: 4 May 2010 | 11:50:40 UTC - in response to Message 16796.

Welcome to the forum,

Like your numbers :)

You are bringing some good systems with you.

Your GTX480 should do well in that i7-930, especially if you leave a CPU thread free for it.

ExtraTerrestrial Apes
Message 16878 - Posted: 6 May 2010 | 20:15:13 UTC

SK, the additional ~45W under GPU-Grid load you measured for a GT240 matches very well with the power consumption number posted in that article - very nice!

MrS
____________
Scanning for our furry friends since Jan 2002

bigtuna
Message 17080 - Posted: 15 May 2010 | 23:09:53 UTC

My position on 4x GT240 vs 1x GTX295:

Looks like roughly a tie except for one thing...

The GTX295 is about $500 while you can pick up a GDDR5 GT240 for $45-$60 each (after rebate).

Just ordered an XFX GT240 for $85 with 2 free software titles and a $30 rebate. Total cost after rebate: $55 (free shipping, no tax).

Unfortunately rebates tend to be limited per household so I only got one.

My Gigabyte GDDR5 GT240 was only $45 after rebate.

At $55 each, 4 GT240 cards would cost $220.

Then again, the GTX295 will fit in a motherboard with a single graphics slot, and those are cheaper than the 4-slot main boards it would take to run 4x GT240s. Still, 4x GT240s would end up being quite a bit cheaper...

If anyone is interested in the XFX GT240 @ $55 (after rebate):

http://www.newegg.com/Product/Product.aspx?Item=N82E16814150452

ExtraTerrestrial Apes
Message 17082 - Posted: 16 May 2010 | 0:35:53 UTC - in response to Message 17080.

Several GT240s being so much cheaper is probably why the comparison was brought up in the first place. And I think real GPU-Grid power consumption also favors the newer cards, by quite a margin.

MrS
____________
Scanning for our furry friends since Jan 2002

skgiven
Message 17084 - Posted: 16 May 2010 | 10:23:58 UTC - in response to Message 17082.

My four GT240s are now crunching v6.72 tasks (I had to use an older driver on Vista 64bit). With only the shaders clocked high (1600MHz; the GPU core and RAM are at native clocks), these 4 cards could get over 70K credits per day, on Vista! 86,400 / 33,000 x 6,756 x 4 ≈ 70.7K

- Task 2336422 (WU 1476441), sent 15 May 2010 17:56:50 UTC, reported 16 May 2010 3:12:30 UTC: Completed and validated; run time 32,999.91 s, CPU time 3,178.07 s, claimed credit 4,503.74, granted credit 6,755.61; Full-atom molecular dynamics v6.72 (cuda)

