
Gigabyte GTX 950 OC Graphic Card Review

  1. Introduction
  2. Packaging and Contents
  3. Closer Look
  4. Overclocking Impressions
  5. Test Bench and Testing Methodology
  6. Futuremark Benchmark- 3DMark (2013)
  7. Futuremark Benchmark- 3DMark 11
  8. Futuremark Benchmark- 3DMark Vantage
  9. OpenGL Benchmark: Cinebench 11.5 and R15
  10. Unigine Benchmark Heaven 3.0 and 4.0
  11. Batman Arkham City
  12. Hitman Absolution
  13. Metro 2033
  14. Shadow of Mordor
  15. Sleeping Dogs: Definitive Edition
  16. Sniper Elite V2
  17. GPU Computation Benchmark
  18. Folding at Home and LuxMark OpenCL Benchmark
  19. Overclocking Performance
  20. Conclusion
  21. Online Purchase Links


As expected, Nvidia has launched its GeForce GTX 950. The GTX 950 uses the same GM206 core as the GTX 960, but with two of its SMM units disabled, giving it a lower CUDA core count and lower base/boost clock speeds. So we should expect 'close enough' performance when compared to the GTX 960, in this case the Zotac GTX 960 AMP! Edition.

Nvidia lists a reference specification on its specs page, but each of its AiB partners will have its own variant. For this review, I will be looking at the Gigabyte GTX 950 WindForce OC Edition graphic card.

Judging by the specifications (duh!), it should sit somewhere between the GTX 750/750 Ti and the GTX 960, though on paper it looks awfully close to the GTX 960. It should be noted that the GTX 750/750 Ti first-generation Maxwell cards support DirectX 11.2, while the 900 series supports DirectX 12. It will be interesting to see whether the disabled cores make any significant difference. Just like the GTX 960, the GTX 950 seems to be aimed squarely at MOBA players. Spec-wise, it uses a 128-bit memory interface paired with 2GB of memory. It's unclear if manufacturers will offer a 4GB variant; only time will tell.

Nvidia is stressing MOBA titles like DOTA 2, League of Legends and others, much like Intel does with the onboard graphics in its desktop processors. MOBA titles have a large userbase, and if Intel is at some point able to cater to these gamers the way discrete cards do today, Nvidia (and AMD) will lose a large chunk of that userbase when those players make the jump to integrated graphics. But Nvidia is also highlighting some of the newer titles, claiming the experience will be better than on the most powerful console. Fair enough…

Where does that leave the GTX 960? I don't know! But the GTX 950 should at least be cheaper, with prices starting from US$140, UK £129 and Rs. 14,000 in India.

| Model | GTX 950 | GTX 960 | GTX 750 | GTX 650 |
|---|---|---|---|---|
| **GPU Engine Specs** | | | | |
| GPU | GM206 | GM206 | GM107 | GK107 |
| CUDA Cores | 768 | 1024 | 512 | 384 |
| Base Clock | 1024 MHz | 1127 MHz | 1020 MHz | 1058 MHz |
| Boost Clock | 1188 MHz | 1178 MHz | 1085 MHz | NA |
| Texture Fill Rate | 49.2 GigaTexels/s | 72 GigaTexels/s | NA | 33.9 GigaTexels/s |
| **Memory Specs** | | | | |
| Memory Speed | 6600 MHz | NA | 5.0 Gbps | 5.0 Gbps |
| Memory Amount | 2GB | 2GB | 1GB | 1GB |
| Memory Interface | 128-bit GDDR5 | 128-bit GDDR5 | 128-bit GDDR5 | 128-bit GDDR5 |
| Max Bandwidth | 105.5 GB/s | 112 GB/s | 80 GB/s | 80 GB/s |
| **Card Features** | | | | |
| SLI Ready | Yes | Yes | No | No |
| DirectX | 12 API | 12 API | 11.2 | 11 |
| OpenGL | 4.5 | 4.4 | 4.4 | 4.3 |
| Bus Type | PCIe 3.0 | PCIe 3.0 | PCIe 3.0 | PCIe 3.0 |
| **Specifications** | | | | |
| Maximum Digital Resolution | 5120 x 3200 | 5120 x 3200 | 4096 x 2160 | 4096 x 2160 |
| Max VGA Resolution | 2048 x 1536 | 2048 x 1536 | 2048 x 1536 | 2048 x 1536 |
| Media Connection | Dual-Link DVI-I, DisplayPort, HDMI | Dual-Link DVI-I, DisplayPort, HDMI | Dual-Link DVI-I, Dual-Link DVI-D, Mini HDMI | Dual-Link DVI-I, Dual-Link DVI-D, Mini HDMI |
| Multi-Display | Yes | Yes | Yes | Yes |
| HDCP | Yes | Yes | Yes | Yes |
| HDMI | Yes | Yes | Yes | Yes |
| HDMI Audio | Internal | Internal | Internal | Internal |
| **Dimensions** | | | | |
| Height | 11.16 cm / 4.376 in | 11.13 cm / 4.38 in | 11.13 cm / 4.38 in | 11.13 cm / 4.38 in |
| Length | 25.24 cm / 9.938 in | 24.13 cm / 9.5 in | 14.48 cm / 5.7 in | 14.48 cm / 5.7 in |
| Width | Dual-Slot | Dual-Slot | Dual-Slot | Dual-Slot |
| **Power and Temperature** | | | | |
| Maximum Temperature | 90 C | 98 C | 95 C | 98 C |
| Power Consumption | 90 W | 120 W | 55 W | 64 W |
| Minimum Power Requirement | 350 W | 400 W | 300 W | 400 W |
| Power Connections | 6-pin | 6-pin | NA | 6-pin |

The GTX 950 has 768 CUDA cores, 256 fewer than the GTX 960. It still requires extra power from a 6-pin PCIe connector, since the card draws up to 90 W while the PCIe slot is limited to 75 W. The combination of the PCIe slot and a 6-pin PCIe power cable can deliver a total of 150 W.
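As a quick sanity check on that power budget, a minimal sketch in Python (the 75 W figures are the standard PCIe slot and 6-pin connector limits):

```python
# Power budget for the GTX 950: PCIe slot plus one 6-pin connector.
SLOT_LIMIT_W = 75      # max power a PCIe x16 slot can deliver
SIX_PIN_LIMIT_W = 75   # max power from a single 6-pin PCIe cable
CARD_TDP_W = 90        # GTX 950 rated power consumption

total_budget = SLOT_LIMIT_W + SIX_PIN_LIMIT_W
print(total_budget)               # 150 W total available
print(CARD_TDP_W > SLOT_LIMIT_W)  # True: the slot alone is not enough, hence the 6-pin
print(total_budget - CARD_TDP_W)  # 60 W of headroom left for overclocking
```

That 60 W of spare budget is part of why these factory-overclocked variants are possible at all.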

That's not the only change Nvidia is bringing along with the GTX 950: GeForce Experience is getting an update too.


By September, Nvidia will be launching a new suite of features in beta within GeForce Experience. The new release lets users start and stop recording via an in-game overlay, similar to what many DVR suites do. In a way that's pretty cool, because it takes the capture card out of the picture. Nvidia is also adding the ability to broadcast gameplay via Twitch, and one shouldn't be surprised if Nvidia plans to do the same with YouTube at some point.

Apart from making the in-game options for Nvidia graphic cards concise and easy to access, the update will introduce GameStream. According to a few news sites, Nvidia GameStream works similarly to Sony's Share Play on the PlayStation 4. There are three modes you can use with a friend: the first lets your friend view your gameplay in real time; the second lets your friend play the game on your behalf, mirroring the control set of the host PC; and the third lets your friend play co-op, though this mode largely depends on the game.

But Nvidia already had all the tools it needed for GameStream: GRID. For a long time, Nvidia has offered a feature that lets users stream gameplay from their computer to a Shield device; GameStream just takes it a few steps further. In beta, it will stream at 720p and 60 FPS using H.264 encoding, and it requires a 7 Mbps upload and download connection.

Back to the card…

Gigabyte GTX 950 WindForce OC Edition


With the exception of the card's dimensions, cooling setup and base/boost clock speeds, it's the same as stock. The Gigabyte GTX 950 WindForce OC Edition's base clock is 1102 MHz, a modest ~7.6% bump over the 1024 MHz reference clock. As with every Nvidia launch, each AiB manufacturer will have its own variants of the card, with different cooling setups, dimensions and clock speeds.
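The factory overclock works out as follows, a quick check using the reference base clock from Nvidia's spec table:

```python
REFERENCE_BASE_MHZ = 1024   # Nvidia reference GTX 950 base clock
GIGABYTE_BASE_MHZ = 1102    # Gigabyte WindForce OC Edition base clock

# Percentage increase over the reference clock
bump = (GIGABYTE_BASE_MHZ - REFERENCE_BASE_MHZ) / REFERENCE_BASE_MHZ * 100
print(f"{bump:.1f}% factory overclock")  # 7.6% factory overclock
```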

Oh, and the Gigabyte GTX 950 comes with both dual-link DVI-I and DVI-D, whereas Nvidia's reference specs do not list DVI-D.


