
Gigabyte GTX 950 OC Graphic Card Review

  1. Introduction
  2. Packaging and Contents
  3. Closer Look
  4. Overclocking Impressions
  5. Test Bench and Testing Methodology
  6. Futuremark Benchmark- 3DMark (2013)
  7. Futuremark Benchmark- 3DMark 11
  8. Futuremark Benchmark- 3DMark Vantage
  9. OpenGL Benchmark: Cinebench 11.5 and R15
  10. Unigine Benchmark Heaven 3.0 and 4.0
  11. Batman Arkham City
  12. Hitman Absolution
  13. Metro 2033
  14. Shadow of Mordor
  15. Sleeping Dogs: Definitive Edition
  16. Sniper Elite V2
  17. GPU Computation Benchmark
  18. Folding at Home and LuxMark OpenCL Benchmark
  19. Overclocking Performance
  20. Conclusion
  21. Online Purchase Links

As expected, Nvidia has launched its GeForce GTX 950. The difference between the GeForce GTX 950 and the GTX 960 is that it uses the same GM206 core, but with two of its SMM units disabled, which lowers the CUDA core count, and with a lower base/boost clock speed. So we should expect 'close enough' performance compared to the GTX 960; in this case, the comparison card is the Zotac GTX 960 AMP! Edition.

Nvidia does list a reference specification on its specs page, but each of its AIB partners will have its own variant. I will be reviewing the Gigabyte GTX 950 WindForce OC Edition graphics card for this purpose.

Judging by the specifications (duh!), it should sit somewhere between the GTX 750/750 Ti and the GTX 960, though on paper it looks awfully close to the GTX 960. It should be noted that the GTX 750/750 Ti first-generation Maxwell cards support DirectX 11.2, while the 900 series supports DirectX 12. It will be interesting to see whether the disabled units on the GPU make any significant difference. Just like the GTX 960, the GTX 950 seems to be aimed mostly at MOBA players. Spec-wise, it uses a 128-bit memory interface with 2GB of memory. It's unclear whether manufacturers will release a 4GB variant; only time will tell.

Nvidia is stressing MOBA titles like DOTA 2, League of Legends and others, just as Intel is doing with the onboard graphics in its desktop processors. MOBA titles have a large userbase, and if Intel at some point in the future is able to cater to gamers the way discrete cards do today, Nvidia (and AMD) will lose a large chunk of that userbase when those players make the jump to integrated graphics. But Nvidia also highlighted some newer titles, saying that the experience will be better than on the most powerful console. Fair enough…

What's the point of the GTX 960, then? I don't know! The GTX 950 should be cheaper, with prices starting from US$140, £129 in the UK, and Rs 14,000 in India.

| Model | GTX 950 | GTX 960 | GTX 750 | GTX 650 |
|---|---|---|---|---|
| **GPU Engine Specs** | | | | |
| GPU Core | GM206 | GM206 | GM107 | GK107 |
| CUDA Cores | 768 | 1024 | 512 | 384 |
| Base Clock | 1024 MHz | 1127 MHz | 1020 MHz | 1058 MHz |
| Boost Clock | 1188 MHz | 1178 MHz | 1085 MHz | NA |
| Texture Fill Rate | 49.2 GigaTexels/s | 72 GigaTexels/s | NA | 33.9 GigaTexels/s |
| **Memory Specs** | | | | |
| Memory Speed | 6600 MHz | NA | 5.0 Gbps | 5.0 Gbps |
| Memory Amount | 2GB | 2GB | 1GB | 1GB |
| Memory Interface | 128-bit GDDR5 | 128-bit GDDR5 | 128-bit GDDR5 | 128-bit GDDR5 |
| Max Bandwidth | 105.5 GB/s | 112 GB/s | 80 GB/s | 80 GB/s |
| **Card Features** | | | | |
| SLI Ready | Yes | Yes | No | No |
| DirectX | 12 API | 12 API | 11.2 | 11 |
| OpenGL | 4.5 | 4.4 | 4.4 | 4.3 |
| Bus Type | PCIe 3.0 | PCIe 3.0 | PCIe 3.0 | PCIe 3.0 |
| Maximum Digital Resolution | 5120 x 3200 | 5120 x 3200 | 4096 x 2160 | 4096 x 2160 |
| Max VGA Resolution | 2048 x 1536 | 2048 x 1536 | 2048 x 1536 | 2048 x 1536 |
| Display Connectors | Dual Link DVI-I, DisplayPort, HDMI | Dual Link DVI-I, DisplayPort, HDMI | Dual Link DVI-I, Dual Link DVI-D, Mini HDMI | Dual Link DVI-I, Dual Link DVI-D, Mini HDMI |
| Multi-Display | Yes | Yes | Yes | Yes |
| HDMI Audio | Internal | Internal | Internal | Internal |
| Height | 11.16 cm / 4.376 in | 11.13 cm / 4.38 in | NA | NA |
| Length | 25.24 cm / 9.938 in | 24.13 cm / 9.5 in | 14.48 cm / 5.7 in | 14.48 cm / 5.7 in |
| Width | Dual-slot | Dual-slot | Dual-slot | Dual-slot |
| **Power and Temperature** | | | | |
| Maximum Temperature | 90 °C | 98 °C | 95 °C | 98 °C |
| Power Consumption | 90 W | 120 W | 55 W | 64 W |
| Minimum Power Requirement | 350 W | 400 W | 300 W | 400 W |
| Power Connections | 6-pin | 6-pin | NA | 6-pin |

The GTX 950 has 768 CUDA cores, 256 fewer than the GTX 960. It still requires extra power from a 6-pin PCIe connector, since the card draws up to 90 W and the PCIe slot is limited to 75 W. The combination of the PCIe slot and a 6-pin PCIe power cable can provide a total of 150 W.
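As a rough sketch of the power budget described above (the 75 W figures are the limits from the PCIe specification; the 90 W figure is the card's rated board power from the review):

```python
# Power budget sketch for the GTX 950 (values as stated in the review).
PCIE_SLOT_W = 75     # max power a PCIe x16 slot can deliver
SIX_PIN_W = 75       # max power a 6-pin PCIe power cable can deliver
GTX950_TDP_W = 90    # rated board power of the GTX 950

total_available = PCIE_SLOT_W + SIX_PIN_W    # 150 W in total
headroom = total_available - GTX950_TDP_W    # 60 W of margin for overclocking

print(f"Available: {total_available} W, headroom: {headroom} W")
```

That 60 W margin is part of why a modest factory overclock on this card is unproblematic.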

That's not the only change Nvidia is bringing along with the GTX 950. GeForce Experience is getting an update as well.


By September, Nvidia will be launching a new suite of features in beta within GeForce Experience. It allows users to start and stop recording via an in-game overlay, similar to what many DVR suites do. In a way, that's pretty cool because it takes the capture card out of the picture. Nvidia is also providing the ability to broadcast gameplay via Twitch, and one shouldn't be surprised if Nvidia plans to do the same with YouTube at some point.

Apart from making the in-game options for Nvidia graphics cards concise and easy to access, the update will introduce GameStream. According to a few news sites, Nvidia GameStream works similarly to Sony's Share Play on the PlayStation 4. There are three modes you can use with a friend. The first lets your friend view your gameplay in real time; the second lets your friend play the game on your behalf, mirroring the control setup of the host PC; the third lets your friend play co-op, though that mode largely depends on the game.

But Nvidia already had all the tools it needs for GameStream: GRID. For a long time, Nvidia has had a feature that lets users stream gameplay from their computer to a Shield device. GameStream is just a few steps ahead of that. In its beta form, it will provide 720p at 60 FPS via H.264 encoding, and it requires 7 Mbps of upload and download bandwidth.

Back to the card…

Gigabyte GTX 950 WindForce OC Edition


With the exception of the card's dimensions, cooling setup and base/boost clock speeds, it's the same as stock. The Gigabyte GTX 950 WindForce OC Edition's base clock is 1102 MHz, a modest 7.6% bump over the 1024 MHz reference clock. As with any Nvidia launch, all AIB manufacturers will have their own variants of the card, each with a different cooling setup, dimensions and clock speeds.
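The factory-overclock percentage quoted above is just the relative gain over Nvidia's reference base clock. A hypothetical helper to compute it (the function name is my own, not anything from Nvidia's tooling):

```python
def oc_gain_pct(card_clock_mhz: float, reference_clock_mhz: float) -> float:
    """Return a factory overclock as a percentage over the reference clock."""
    return (card_clock_mhz - reference_clock_mhz) / reference_clock_mhz * 100

# Gigabyte WindForce OC base clock vs Nvidia's reference GTX 950 base clock
gain = oc_gain_pct(1102, 1024)
print(f"{gain:.1f}% factory overclock")  # roughly 7.6%
```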

Oh, and the Gigabyte GTX 950 comes with both dual-link DVI-I and DVI-D ports, while Nvidia's reference specification does not include DVI-D.

