Message boards : Graphics cards (GPUs) : GTX 960

Francois Normandin
Joined: 8 Mar 11
Posts: 71
Credit: 654,432,613
RAC: 0
Message 39360 - Posted: 1 Jan 2015 | 23:54:07 UTC



http://www.digitaltrends.com/computing/nvidia-expected-to-unveil-geforce-gtx-960-on-january-22/

skgiven
Volunteer moderator
Volunteer tester
Joined: 23 Apr 09
Posts: 3968
Credit: 1,995,359,260
RAC: 0
Message 39370 - Posted: 2 Jan 2015 | 18:28:49 UTC - in response to Message 39360.
Last modified: 2 Jan 2015 | 19:37:09 UTC

My guess is that a GTX 960 (GM206) will turn up towards the end of January.
It looks like it will be 4GB GDDR5, 256-bit, 993/6008MHz, and I'm expecting 1280 CUDA cores.
In some respects a 3GB version would make more sense, but NVidia will likely fill out the GF 900 range a bit more around March and then again over the summer, so in due course there will be a greater variety.
____________
FAQ's

HOW TO:
- Opt out of Beta Tests
- Ask for Help

ExtraTerrestrial Apes
Volunteer moderator
Volunteer tester
Joined: 17 Aug 08
Posts: 2705
Credit: 1,311,122,549
RAC: 0
Message 39375 - Posted: 2 Jan 2015 | 22:53:24 UTC - in response to Message 39370.
Last modified: 2 Jan 2015 | 22:54:01 UTC

Count another guess for 192 bit memory bus with 3 GB GDDR5.

MrS
____________
Scanning for our furry friends since Jan 2002

eXaPower
Joined: 25 Sep 13
Posts: 293
Credit: 1,897,601,978
RAC: 0
Message 39379 - Posted: 2 Jan 2015 | 23:40:05 UTC - in response to Message 39375.

A GM206 might be a 1024-core/8-SMM part, or the GTX960 could be a 10-SMM variant of the GTX970M (GM204/1280 CUDA/80 TMU/48 ROP/192-bit/3GB), with a GTX960Ti variant matching the GTX980M at GM204/1536 CUDA/96 TMU/64 ROP/256-bit/4GB.

Maxwell's SMM comprises 4 ROPs and 8 TMUs, while a GPC is 4 SMM/32 TMU/16 ROP.
Every x6 card NVidia has released has come in numerous configurations. For example, Kepler's GTX 660 had three different ones (960 CUDA/80 TMU/24 ROP/192-bit, 1152 CUDA/96 TMU/32 ROP/256-bit, and the GTX660Ti with 1344 CUDA/112 TMU/24 ROP/192-bit bus), and the GTX760 also has three different dies: two with a 256-bit bus and another at 192-bit. Fermi was the same, as was the GT200 series, along with the 860M: one is a Kepler 1152 CUDA/96 TMU/16 ROP, the other a 640 CUDA/40 TMU/16 ROP Maxwell.
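The per-SMM arithmetic above can be sketched quickly. This is a rough check: the 128 CUDA cores and 8 TMUs per SMM are Maxwell's published figures, while counting 4 ROPs per SMM follows the post's framing (ROPs actually sit with the memory partitions, so treat that column as an approximation):

```python
# Rough check of the per-SMM arithmetic for Maxwell parts.
CORES_PER_SMM = 128
TMU_PER_SMM = 8
ROP_PER_SMM = 4  # approximation, per the post; ROPs really follow memory partitions

def maxwell_totals(smm: int) -> dict:
    """Unit counts implied by an SMM count."""
    return {
        "cuda_cores": smm * CORES_PER_SMM,
        "tmus": smm * TMU_PER_SMM,
        "rops": smm * ROP_PER_SMM,
    }

for smm in (8, 10, 12):
    print(smm, maxwell_totals(smm))
```

8, 10 and 12 SMMs reproduce the 1024-, 1280- and 1536-core configurations being discussed.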

skgiven
Message 39381 - Posted: 3 Jan 2015 | 14:13:55 UTC - in response to Message 39379.
Last modified: 3 Jan 2015 | 14:40:42 UTC

I was going by this Zauba shipment,

    GRAPHIC CARD GTX 960 4GB DDR5,256 BIT,993/6008 HDCP,DUAL DVI,HDMI,DP P PAKC (PCI,PCB POPULATED,VGA CARD,COMP. ACCESS)

https://www.zauba.com/import-GTX960-hs-code.html
Originally there was speculation of a Sept/Oct 960 release, followed by an apparent U-turn due to good 980/970 sales. On reflection, that '960' was probably an engineering sample of a GTX970 with 384 shaders disabled, and probably represents only what might have been. I'm not sure NV ever intended to start shipments by October, but if they did change their minds (after some sales modelling, say if 960s would eat into 970 sales/profits), the specs that appeared once, over three months ago, might be a long way from what appears in a few weeks. Odd that they didn't call it a 965 (akin to the 465). While the above might still see the light of day as a GTX965 or 960Ti, a 1280-shader, 3GB, 192-bit GM206 would make more sense as a scaled-down GM204.

In terms of spec speculation, just about every realistic combination of bus, memory, ROP, TMU and shader count has been predicted.

I've seen 960 speculation as low as a 128-bit bus and 2GB GDDR5, but to me that sounds more like the specs of a would-be GTX950Ti.

Given that 1280:80:48 mobile versions already exist (the 970M), I would fully expect something like that to turn up. The 970M comes with 3GB or 6GB GDDR5 and a 192-bit bus. Its big brother is 1536:96:64, with either 4GB or 8GB GDDR5, but a 256-bit bus. Both of these are GM204, but a discrete GM206 model without disabled units could well have 1280 shaders. As for memory and bus, it and its lesser siblings (GM207, GM208) could have anywhere from 2GB to 8GB of GDDR5 and anything from a 64-bit bus up to 256-bit.

As soon as we know the actual 960 specs we should have a realistic idea of performance.
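As a rough guide to what the speculated bus/memory combinations mean in practice, peak bandwidth is just bus width in bytes times effective memory clock. A quick sketch (the pairings are illustrative, not confirmed GTX 960 specs):

```python
# Peak memory bandwidth for some bus/clock combinations under discussion.
# bandwidth (GB/s) = bus width in bytes * effective clock in GHz
def bandwidth_gbs(bus_bits: int, effective_mhz: int) -> float:
    return bus_bits / 8 * effective_mhz / 1000.0

for bus, mhz in [(256, 7000), (192, 7000), (192, 6008), (128, 7000)]:
    print(f"{bus}-bit @ {mhz} MHz effective: {bandwidth_gbs(bus, mhz):.1f} GB/s")
```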

Francois Normandin
Message 39382 - Posted: 3 Jan 2015 | 14:22:46 UTC
Last modified: 3 Jan 2015 | 14:23:05 UTC

How would two GTX 960s perform against a GTX 980? If the price, performance and energy efficiency can match (using just one 6/8-pin connector each), with some overclocking room, I'm in.

skgiven
Message 39383 - Posted: 3 Jan 2015 | 14:46:28 UTC - in response to Message 39382.
Last modified: 3 Jan 2015 | 14:55:39 UTC

Until we see the actual specifications we can only speculate. A 128-bit bus or a boost cap could cripple it as an SP compute card, or it could come with a 256-bit bus, 3 or 4GB of RAM, and boost really well. A dud or a lean, mean crunching machine...
If it's got 1280 shaders and a decent bus, then two GTX960s are likely to outperform a GTX980.
As for pricing, I've seen a ~13% 980 price drop recently for some cards: from £449 to £389 (for new stock).

ExtraTerrestrial Apes
Message 39384 - Posted: 3 Jan 2015 | 18:19:25 UTC

I agree: that Zauba shipment from last fall was probably an engineering sample based on GM204, hence the 4GB and 256-bit bus. It probably represents the performance nVidia was targeting for the GTX960 back then. The unusually low memory clock could well simulate highly clocked GDDR5 on a 192-bit bus.

Performance can be expected to land roughly midway between a GTX750 and a GTX980, so 2× 960 vs. 1× 980 is still undecided.

MrS

eXaPower
Message 39393 - Posted: 5 Jan 2015 | 14:43:00 UTC

A 128-bit bus is confirmed for the "GTX960", with decent clocks. Also, GTX960Ti and GTX965Ti variants will be released soon after.

http://www.guru3d.com/news-story/nvidia-geforce-gtx-960-starts-listing.html

ExtraTerrestrial Apes
Message 39395 - Posted: 5 Jan 2015 | 15:40:00 UTC - in response to Message 39393.

128-bit is barely enough for GM107 - things are certainly getting interesting! It could also be misinformation from those shops, or speculation on their part. nVidia can implement 192-bit buses with 2 and 4GB, as they did with the GTX660Ti.

MrS

skgiven
Message 39403 - Posted: 6 Jan 2015 | 20:22:46 UTC - in response to Message 39395.
Last modified: 7 Jan 2015 | 2:58:12 UTC

Today at CES, Aorus announced a forthcoming (Q2) laptop with dual GTX965M GPUs. It comes with a 15.6" 3840×2160 (4K) screen. Basically, it's a £2K games console!

http://www.kitguru.net/laptops/anton-shilov/nvidia-geforce-gtx-965m-spotted-in-drivers/

965M Specs:
GM204, 1024 CUDA cores, 924MHz GPU clock + boost, 128-bit, 5GHz GDDR5, 2 to 4GB.
http://www.notebookcheck.net/Nvidia-GeForce-GTX-965M-joins-the-product-portfolio.134107.0.html



It wouldn't surprise me if a GM206 GTX960 turned up with similar specs, followed by a Ti in a couple of months.
With only 1024 shaders it wouldn't suffer as much from a 128-bit bus, but it would be well off the performance of a GTX970 - about two thirds of it.
That would be similar to the relative performance of the GTX670 and GTX660.

ExtraTerrestrial Apes
Message 39405 - Posted: 6 Jan 2015 | 21:59:57 UTC - in response to Message 39403.

Wow, they're cutting GM204 in half? That's a seriously expensive proposition (for nVidia). I suppose this 965M will transition to GM206 as quickly as possible.

MrS
____________
Scanning for our furry friends since Jan 2002

skgiven
Message 39406 - Posted: 7 Jan 2015 | 3:34:31 UTC - in response to Message 39405.
Last modified: 8 Jan 2015 | 18:50:52 UTC

Yes, that would be half the CUDA cores and half the bus of a 980, so I expect performance to be around half. The 760 had half the cores of a 780, but used the same bus width.

videocardz.com said of the 960, "The only certain thing is the memory configuration of 2GB GDDR5 and 128-bit interface".
http://videocardz.com/54214/exclusive-msi-geforce-gtx-960-gaming-2g-and-gtx-960-100-million-edition-pictured

They also say the 960 will be GM206-300 and, going by pictures of the die, its size will likely allow between 8 and 10 SMMs: 1024-1280 CUDA cores, i.e. 1024, 1152 or 1280.
http://videocardz.com/54201/exclusive-nvidia-maxwell-gm206-pictured
That in itself suggests a would-be 960Ti would not be GM206.

With 1024 cores, its performance (if it scales) might be around that of a GTX660Ti or a GTX670, but for a third less power (or better).

Whatever turns up, it will give people an option between a 750Ti and a 970, and that's a good thing.

skgiven
Message 39435 - Posted: 10 Jan 2015 | 5:19:49 UTC - in response to Message 39406.
Last modified: 10 Jan 2015 | 7:00:44 UTC

Well, the GTX960 WILL be 2GB, 128-bit, with a 1177-1240MHz reference boost clock and 3500MHz (7GHz effective) GDDR5:

    Date: 8-Jan-2015 | HS Code: 84733030
    Description: GTX 960 2GB DDR5 128BIT 1177-1240/7010 HDCP - ZT-90301-10M (PCI, PCB POPULATED, VGA CARD, COMPUTER ACCESS)
    Origin: China | Port of Discharge: Delhi Air Cargo | Quantity: 160 NOS | Value: INR 2,290,154 | Per Unit: INR 14,313
https://www.zauba.com/import-gtx960-hs-code.html

Looks like a ZOTAC card, so possibly not reference GPU clocks.
No messing this time - 160 units sent!
Cost: INR 14,313 = £150, $226, €191, excluding import duty & VAT.

Performance should be ~half that of a GTX980, possibly slightly more if it boosts higher (assuming no boost lock).
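The per-unit cost conversion can be checked from the manifest figures. The exchange rates below are assumptions (roughly early-2015 values), not something from the listing:

```python
# Per-unit cost from the Zauba manifest figures above.
unit_price_inr = 2_290_154 / 160  # declared value / quantity
rates_inr_per_unit = {"GBP": 95.0, "USD": 63.0, "EUR": 75.0}  # assumed rates

print(f"per unit: INR {unit_price_inr:,.0f}")
for currency, rate in rates_inr_per_unit.items():
    print(f"  ~{unit_price_inr / rate:,.0f} {currency}")
```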

eXaPower
Message 39477 - Posted: 14 Jan 2015 | 15:10:14 UTC - in response to Message 39406.

With 1024 cores its performance if it scales might be around that of a GTX 660Ti or a GTX 670 but for 1/3rd less power (or better).

GTX960 benchmarks (3DMark Fire Strike) have appeared. If they're not fake, the GTX960's [1024 CUDA/64? TMU/32 ROP] performance is near a GTX770's. Certain boards will feature a single 6- or 8-pin connector for the possibly 90-125W GTX960.

http://wccftech.com/nvidia-geforce-gtx-960-reference-overclocked-performance-revealed-performs-slightly-faster-radeon-r9-280/

skgiven
Message 39499 - Posted: 16 Jan 2015 | 16:37:07 UTC - in response to Message 39477.
Last modified: 16 Jan 2015 | 16:52:12 UTC

An ASUS OC version has since appeared on a shipping manifest:

STRIX-GTX960-DC2OC-2GD5//GTX960,DVI,HDMI,DP 3, 2G,D5 90YV07N0 - M0NA00 ( ASUS ) VGA

One 6-pin power connector (75W) plus 75W from the slot provides for up to 150W. An 8-pin model makes no sense; this card is only half the size of a 980, which has a 165W TDP, so it doesn't need a 'physical' 225W TDP.
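The power-budget arithmetic here follows from the PCIe limits (75W from the slot, 75W per 6-pin, 150W per 8-pin); a quick sketch:

```python
# Board power budget from the PCIe slot plus auxiliary connectors,
# using the spec limits: 75 W slot, 75 W per 6-pin, 150 W per 8-pin.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def board_budget(six_pin: int = 0, eight_pin: int = 0) -> int:
    return SLOT_W + six_pin * SIX_PIN_W + eight_pin * EIGHT_PIN_W

print(board_budget(six_pin=1))    # slot + one 6-pin
print(board_budget(eight_pin=1))  # slot + one 8-pin
```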

My guess is that a reference-design version would be somewhere around 95W.
I fully expect these to OC-boost to 1350MHz+ and then some. 1500MHz might well be true, but we will have to wait and see.

eXaPower
Message 39501 - Posted: 16 Jan 2015 | 17:16:56 UTC - in response to Message 39499.

The STRIX GTX970/980s sell really well. The fan-control design is clever; EVGA's non-blower models also stop their fans below 60°C.

My guess is that a reference design version would be somewhere around 95W. Fully expect these to OC-boost to 1350+ and then some. 1500MHz might well be true but for here we will have to wait and see.

Yes - with Ti variants filling in the ~100-145W gap. Most GTX970/980s have a 125% power limit (Zotac is limited to 111% on their overclocked GPUs). If the GTX960 (full GM206?) is rated at 75-100W with a 125% power limit, 1500MHz is very possible. Full dies always overclock better than cut-down ones; for example, the GK110 GTX780 can't reach GTX780Ti speeds, nor the GTX760 GTX770 speeds. Maxwell GTX980s have higher reference clocks than the GTX970.

skgiven
Message 39517 - Posted: 18 Jan 2015 | 8:59:42 UTC - in response to Message 39501.
Last modified: 18 Jan 2015 | 9:06:19 UTC

GTX 960 AMPI EDITION 2GB DDR5 128BIT 1266-1329/7010 HDCP - ZT-90303-10M (PCI,PCB POPULATED,VGA CARD,COMPUTER ACCESS)

Probably a factory overclock, but if it's a base clock, 1266MHz is 12.4% higher than a reference GTX980's (1126MHz), and I'm running a 970 at 1350MHz, so that scales (1350 × 1.124) to >1500MHz.
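That scaling estimate is a purely proportional extrapolation; sketched out (real boost behaviour depends on power and thermal limits, so treat it as an optimistic upper bound):

```python
# Proportional clock extrapolation, as in the post: if a 970 reaches
# 1350 MHz from the 980's 1126 MHz reference base, a card with a
# 1266 MHz base might scale similarly.
ref_980_base_mhz = 1126
oc_970_reached_mhz = 1350
ampi_960_base_mhz = 1266

factor = ampi_960_base_mhz / ref_980_base_mhz
estimate_mhz = oc_970_reached_mhz * factor
print(f"scaling factor {factor:.3f} -> ~{estimate_mhz:.0f} MHz")
```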

ExtraTerrestrial Apes
Message 39545 - Posted: 20 Jan 2015 | 22:08:14 UTC - in response to Message 39501.

Full dies always overclock better than a Cut down.

Empirically this is correct, but all else being equal a full die would clock worse, simply because it produces more heat. And the cut-down version can have its slowest part disabled, which can yield more frequency headroom.

The reason you see the premium cards clock higher is that chips which clock better are more likely to be promoted to premium cards (if they have no defects).

MrS

Francois Normandin
Message 39566 - Posted: 21 Jan 2015 | 22:50:45 UTC - in response to Message 39545.
Last modified: 21 Jan 2015 | 22:55:27 UTC

http://www.ginjfo.com/actualites/composants/cartes-graphiques/gigabyte-gtx-960-g1-gaming-se-fait-photographier-20150119

Jim1348
Joined: 28 Jul 12
Posts: 819
Credit: 1,591,285,971
RAC: 0
Message 39567 - Posted: 21 Jan 2015 | 23:27:16 UTC - in response to Message 39545.

Full dies always overclock better than a Cut down.

Empirically this is correct, but with everything else being equal a full die would clock worse simply because it produces more heat. And the cut-down version can have the slowest part disabled, which can yield more frequency headroom.

MrS

It is a bit of a puzzle why the cut-down chips don't do better. I expect it is because they disable portions of the active circuitry that are not functioning correctly, but leave behind the clock lines. That just maintains the capacitive load without the means to drive it at full speed.

eXaPower
Message 39577 - Posted: 22 Jan 2015 | 15:49:46 UTC

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-960,4038-8.html

Includes PCIe/PEG measurements and power targets (120-160W) for most GTX960s. EVGA provides an 8-pin, the Gigabyte G1 has two 6-pins, and the remaining boards have one 6-pin. Looking at Guru3D's thermal shots, Galax has the coolest VRM and core temps.
Newegg lists the GTX960 at $199-209.

ExtraTerrestrial Apes
Message 39702 - Posted: 25 Jan 2015 | 19:12:56 UTC - in response to Message 39567.

It is a bit of a puzzle why the cut-down chips don't do better. I expect it is because they disable portions of the active circuitry that are not functioning correctly, but leave behind the clock lines. That just maintains the capacitive load without the means to drive it at full speed.

Well, I'll stand by the explanation given in my last post.

The point you're making has one weak spot: we're talking about many millions of transistors here. If the capacitance of fused-off transistors hampered the performance of transistors in a neighbouring SM, we'd have a huge capacitive problem: at usual transistor densities (packed as closely as possible) the capacitive load would be prohibitively high, and our entire approach to chip design, scaling and technology development would have to change. Luckily it's not that bad :)

On topic: it will be interesting to see how the GTX960 actually performs here. From the shader count, 8/5 or 60% better than a GTX750Ti is expected, but the memory speed doesn't scale as well as the crunching power (only 30% higher bandwidth, from the higher clock speed). The difference doesn't sound dramatic, so we may well see performance in the range of 50-60% higher.
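The two ratios behind that estimate can be checked directly. The 1024 vs 640 shader counts are from this thread; the effective memory clocks (7010MHz for the GTX960, 5400MHz reference for the GTX750Ti) are assumed figures:

```python
# The two scaling ratios behind the 50-60% estimate.
compute_ratio = 1024 / 640     # GTX 960 vs GTX 750 Ti shader count
bandwidth_ratio = 7010 / 5400  # same 128-bit bus, higher memory clock

print(f"compute: +{(compute_ratio - 1) * 100:.0f}%")
print(f"bandwidth: +{(bandwidth_ratio - 1) * 100:.0f}%")
```

Real-world gains would likely land somewhere between the two ratios, as the post argues.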

MrS

Jim1348
Message 39705 - Posted: 25 Jan 2015 | 20:40:41 UTC - in response to Message 39702.
Last modified: 25 Jan 2015 | 20:41:03 UTC

The point you're making has one weak spot: we're talking about many millions of transistors here. If the capacitance of fused-off transistors hampered the performance of transistors in a neighbouring SM, we'd have a huge capacitive problem: at usual transistor densities (packed as closely as possible) the capacitive load would be prohibitively high, and our entire approach to chip design, scaling and technology development would have to change. Luckily it's not that bad :)

No, I am referring to the "clock lines": the metallic conductors that carry the clock signals across the chip. They would not be so easy to disconnect when you do a chip repair; it is easier just to turn off transistors, so the clock lines might be left in place. They would then be a large load on the clock drivers that remain operational. That is speculation, of course; it could involve various other portions of the circuitry, but it appears to be something basic. Nvidia would not want to lose performance on the cut-down chips if they didn't have to.

ExtraTerrestrial Apes
Message 39709 - Posted: 25 Jan 2015 | 22:18:31 UTC - in response to Message 39705.

Ah, now I get your point. Power consumption of the clock signal is indeed a serious issue in modern chips. But in recent years there has often been talk of better "clock gating" when new chips are presented. I always understood this as not delivering the clock signal to regions of the chip which are currently not in use, i.e. power gated. If they can do this, they can also clock-gate deactivated SMMs.

MrS

Jim1348
Message 39710 - Posted: 25 Jan 2015 | 22:33:37 UTC - in response to Message 39709.

I always understood this as not delivering the clock signal to regions of the chip which are currently not in use, i.e. power gated. If they can do this, they can also clock-gate deactivated SMMs.

Possibly so. But there are clock lines and then there are clock lines. Some are "local", which would be easier to turn off, and some are "global", which might not be; some are intermediate between the two. What you do for repair is probably different from what you do in normal operation, but beyond that is beyond the scope of this discussion, I am sure.

Dave
Joined: 12 Jun 14
Posts: 12
Credit: 166,790,475
RAC: 0
Message 39827 - Posted: 29 Jan 2015 | 10:25:18 UTC

My GTX 980 was getting too noisy for me, so I swapped it out for an EVGA GTX 960 SSC (mainly to test the ACX 2.0+ cooler). The money I've saved will probably go towards a Titan 2, or I might just stick with the 960 until Pascal comes out.

I still have to monitor GPUGrid performance, but I don't expect much. Power draw has decreased by 45W (-15%). OK, I guess - but only if it doesn't fail crunching :/ The card is dead silent: the fans spin at 670 rpm under full load (Noelia), and the temperature never goes beyond 67°C. CPU temp has gone up by 1-3°C due to the cooler not exhausting heat out of my µATX case, but it's barely noticeable.

The only things I can hear now while crunching are my three Noctua 120mm fans spinning at a maximum of 850 rpm (CPU fan). It's really quiet now - no comparison to the blower with its 2000+ rpm and 80°C. Fun fact: the reference 980 got quite toasty on its surface and heated up other components around the GPU as well.

I did this mainly to test whether my case could handle a non-reference card.

ExtraTerrestrial Apes
Message 39943 - Posted: 31 Jan 2015 | 21:38:43 UTC - in response to Message 39827.

A quick look at your results shows the GTX960 being a bit faster than half the performance of your GTX980 - which is nice, since it's only got half the raw power. But saving only 45W? Given the TDPs this number is plausible, but is it worth it? You could have simply lowered the power target on the GTX980, or lowered the fan speed, which would have made it boost less while staying at 80°C.

Anyway, I'm sure you got some good money for your GTX980 and if you're happy with the GTX960, so be it :)

MrS

Dave
Message 39992 - Posted: 1 Feb 2015 | 23:07:42 UTC

Hi Extra,

my GTX 980's power target was already lowered quite significantly (to around 74%) so that it stayed at around 1240MHz (dropping to 1216MHz occasionally). I probably could have taken it further, but I didn't want to get too far from stock clocks.

My GTX 960 is now tweaked the same way, only further: I've lowered it to a 55% power target (!), voltage is at 1.10-1.13V, and clocks stay at around 1340MHz.

After taking four Noelia WUs into account, I can now say:
GTX 980, avg: 6.4 hrs
GTX 960, avg: 10.7 hrs

Hm, I don't know if my math is right, but this seems pretty disappointing. It now takes ~67% more time to complete a WU, yet I only save around 15% power. So actual efficiency is quite low, isn't it?
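Those numbers can be turned into energy per work unit, which is the efficiency figure that matters. A quick sketch using the measured, system-level values from this thread (wall power, not GPU-only draw):

```python
# Energy per work unit from the measured system-level numbers:
# average hours per Noelia WU and wall power draw.
t_980_h, t_960_h = 6.4, 10.7
p_980_w, p_960_w = 320.0, 275.0

extra_time = t_960_h / t_980_h - 1
wh_980 = p_980_w * t_980_h
wh_960 = p_960_w * t_960_h
print(f"extra time per WU: {extra_time * 100:.0f}%")
print(f"energy per WU: {wh_980:.0f} Wh (980) vs {wh_960:.0f} Wh (960)")
```

By this measure the 960 system actually uses more wall energy per completed WU, which matches the disappointment expressed above.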

What does matter, however, is that the card is now very quiet even at full load, and less heat is exhausted into my case (yep, you heard me right: less heat, even though this is not a DHE design).

Saving money for big Maxwell now :) This card will eventually go into my dad's PC.

ExtraTerrestrial Apes
Message 40006 - Posted: 2 Feb 2015 | 20:46:03 UTC - in response to Message 39992.
Last modified: 2 Feb 2015 | 20:47:01 UTC

Thanks for the more precise numbers, Dave!

They still look good: your GTX960 has somewhat higher clock speeds, and average GPU utilization is generally higher for smaller cards (which explains the remaining performance advantage of the GTX960).

Be careful when judging power draw, though: many cards have power targets set in their BIOS which differ from nVidia's recommended values. You cannot easily see this, as all you're given are percentages; "MaxwellBiosTweaker" can read the value out (and modify it, if one wants). Your GTX960 should currently draw about 100W, whereas a typical power target for GTX980 cards is 180W. A card with a blower-style cooler probably has no higher setting than this, but it may still be more than nVidia's stock value of 165W. So I estimate you're now saving 60-80W, which is fine considering the performance. Hard measurements would be better, for sure, but you'd need a power meter for that.

Edit: or look into your GPU BIOS, so you know exactly what power draw xx% translates into.
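The percent-to-watts conversion described here is straightforward once the BIOS base target is known. A sketch, assuming a 180W base target for both cards (an assumption that reproduces MrS's ~100W estimate for the GTX960 at 55%, not a value read from either BIOS):

```python
# Converting a driver/BIOS power-target percentage into watts.
def target_watts(bios_base_w: float, percent: float) -> float:
    return bios_base_w * percent / 100.0

print(target_watts(180, 74))  # Dave's GTX 980 setting, assumed 180 W base
print(target_watts(180, 55))  # Dave's GTX 960 setting, assumed 180 W base
```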

MrS

Beyond
Joined: 23 Nov 08
Posts: 1112
Credit: 6,162,416,256
RAC: 0
Message 40039 - Posted: 4 Feb 2015 | 23:33:06 UTC

The only valid way to determine power draw is to measure it: use a power meter such as the very affordable Kill-A-Watt or equivalent. The theories are fine, but until you measure it you just don't know.

Dave
Message 40047 - Posted: 5 Feb 2015 | 11:06:12 UTC

Huh? I did measure power draw.

It went from 320W down to 275W, approx. -15%, compared to my GTX 980. Both with lowered power limits.

Beyond
Message 40051 - Posted: 5 Feb 2015 | 20:13:09 UTC - in response to Message 40047.

I know - I was referring to all the theory going on elsewhere. Sorry for the misunderstanding; I should have been clearer. Theorizing is fine to a point, but actual measurement, such as you did, is the only real way to know. I've been fooled more than once by assuming things about power draw and then finding out I was all wet after measuring.

skgiven
Message 40056 - Posted: 5 Feb 2015 | 22:56:59 UTC - in response to Message 40047.

You appear to be crunching Einstein on your CPU.
Ask yourself a question - Why?

Huh? I did measure power draw.

It went from 320W down to 275W. Approx. -15% compared to my GTX 980. Both with lower power limit.



Dave
Message 40126 - Posted: 9 Feb 2015 | 11:40:26 UTC - in response to Message 40056.
Last modified: 9 Feb 2015 | 11:43:43 UTC

You appear to be crunching Einstein on your CPU.
Ask yourself a question - Why?

Huh? I did measure power draw.

It went from 320W down to 275W. Approx. -15% compared to my GTX 980. Both with lower power limit.



Well: the universe, space. Look at the stars on a clear night ;)

Funny you should ask, though, because I've actually stopped crunching E@H and switched my CPU over to Rosetta@Home. R@H seems to need all the CPU power it can get.

Retvari Zoltan
Joined: 20 Jan 09
Posts: 2343
Credit: 16,201,255,749
RAC: 6,169
Message 40127 - Posted: 9 Feb 2015 | 13:36:56 UTC - in response to Message 40126.
Last modified: 9 Feb 2015 | 13:38:00 UTC

You appear to be crunching Einstein on your CPU.
Ask yourself a question - Why?
Huh? I did measure power draw.
It went from 320W down to 275W. Approx. -15% compared to my GTX 980. Both with lower power limit.


Well, universe, space. Look at the stars at a clear night ;)

Funny you asked though. Because in fact I stopped crunching E@H and switched to Rosetta@Home for my CPU. R@H seems to need all the CPU power it can get.

I think the subject of skgiven's question was not your motivation for crunching Einstein@home, but the reason for the power drop you measured.
That reason is that other projects' applications cannot utilize the latest GPUs (regardless of any GPU-utilization readings from different tools) as much as GPUGrid does, partly because the other projects use older CUDA versions.

Dave
Message 40133 - Posted: 9 Feb 2015 | 17:54:15 UTC
Last modified: 9 Feb 2015 | 17:54:45 UTC

Oh no, the power drop is real. Everything else stayed the same; all I did was swap out the GTX 980 for a 960. The CPU utilization remained unchanged.
Some minor misunderstanding going on here I guess, haha - but thanks anyway for the input.

skgiven
Message 41428 - Posted: 27 Jun 2015 | 22:43:01 UTC - in response to Message 40133.
Last modified: 27 Jun 2015 | 22:44:22 UTC

The power drop from 320W to 275W is simply explained by the drop in GPU TDP/usage: 165W to 120W is 45W.
Relative to the GPUs this is a 37.5% drop (45/120), but relative to the system's power usage it's ~14% (275/320 = 0.86).
A ~14% reduction in system power usage versus a >40% performance loss isn't good.

kingcarcas
Joined: 27 Oct 09
Posts: 18
Credit: 378,626,631
RAC: 0
Message 41648 - Posted: 11 Aug 2015 | 8:26:50 UTC

Every time I peek at my 960 it's doing a short run - is it just me?
Retvari Zoltan
Message 41650 - Posted: 11 Aug 2015 | 18:31:18 UTC - in response to Message 41648.

Every time I peek at my 960 it's doing a short run - is it just me?

It's probably set in your preferences that you accept work only from the short queue (perhaps it's the default setting and you haven't changed it).

kingcarcas
Message 41898 - Posted: 25 Sep 2015 | 5:39:35 UTC

OK, that seems to have been fixed; now, oddly, my 750Ti is not getting anything, lol.
Snow Crash
Joined: 4 Apr 09
Posts: 450
Credit: 539,316,349
RAC: 0
Message 41900 - Posted: 25 Sep 2015 | 7:43:42 UTC - in response to Message 41898.

It looks like you have it set up to only get short runs, and that queue has gone dry - have you tried longs?
____________
Thanks - Steve
