
NVIDIA GeForce GTX 960 Specs Confirmed

Color compression is only used for transferring from and to memory; it doesn't have an effect on memory capacity, or am I missing something?
 
Yeah, I think it works both when writing to/retrieving from system RAM, and also when going from client vRAM to the GPU's texture address units.

Ouch! Sounds like some serious pre-release price gouging (unless all other cards are carrying the same kind of mark-up).
If they're on the shelves, how about some quick phone pictures for us?



I didn't take it myself. Grabbed it from the store's FB page. No unboxed pictures though.

PS: The same store sells the Gigabyte GTX 970 G1 at around $410.
 
Color compression is only used for transferring from and to memory; it doesn't have an effect on memory capacity, or am I missing something?
Not sure who you're addressing, but xorbe and I have both asserted that delta colour compression is used in the former (transferring to/from memory). Not sure how colour compression would be confused with capacity, since the measure of its effectiveness is a percentage of GB/sec or Gbps (bandwidth), not GB (capacity). This was explained at GM204's launch.

The same store sells the Gigabyte GTX 970 G1 at around $410.
Well, the 970 G1 is a $360 part at Newegg, so assuming no price gouging on pre-launch sales (which might be unlikely), that would make the 960 G1 a $260 card by comparison. Not overly scientific, I'll grant.
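Spelling out that proportional estimate (a small sketch; the $360 and $410 figures are from the posts above, while the implied local GTX 960 G1 listing is back-calculated here rather than quoted anywhere in the thread):

```python
# Proportional price estimate (sketch): assume the local store applies the same
# markup to a GTX 960 G1 as it does to the GTX 970 G1.
newegg_970 = 360.0      # USD - Newegg listing quoted above
local_970 = 410.0       # USD - local store listing quoted above
markup = local_970 / newegg_970                  # ~1.14, i.e. roughly a 14% markup

estimated_960_us = 260.0                         # the figure arrived at in the post above
implied_local_960 = estimated_960_us * markup    # back-calculated local listing (hypothetical)
print(f"markup: {markup:.2f}x, implied local 960 G1 listing: ${implied_local_960:.0f}")
# -> markup: 1.14x, implied local 960 G1 listing: ~$296
```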
 
Sure it has an effect on effective vram size. The textures are 75% of original size, so 1.33x transfer rate, 1.33x texture storage (max -- some vram is used for frame buffers of course).
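For what it's worth, the arithmetic behind that 1.33x figure is just the reciprocal of the compression ratio; a minimal sketch, assuming the flat 75% ratio claimed above (real-world delta compression varies per scene):

```python
# Effective-throughput multiplier from a fixed compression ratio (sketch).
# If compressed data occupies 75% of its original size, the same physical
# bandwidth (or capacity) moves/holds 1 / 0.75 ~ 1.33x as much logical data.
compressed_fraction = 0.75                 # size after compression, as claimed above
multiplier = 1.0 / compressed_fraction
print(f"effective multiplier: {multiplier:.2f}x")          # -> 1.33x

# Applied to the GTX 960's published 112 GB/s of raw bandwidth (best case only):
raw_bandwidth_gbs = 112.0
print(f"best-case effective bandwidth: {raw_bandwidth_gbs * multiplier:.0f} GB/s")  # -> ~149 GB/s
```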
 
Sure it has an effect on effective vram size. The textures are 75% of original size, so 1.33x transfer rate, 1.33x texture storage (max -- some vram is used for frame buffers of course).
That 1.33 is a very variable number; as I'm sure you're aware, not all textures can be compressed - and that's aside from the vRAM capacity set aside for storage buffers, post-process effects, etc. Compressing the data surely adds to the effective capacity of the framebuffer, but the fluidity of the workload would surely make the actual gains variable. At the opposite end of the scale, a highly compressed workload might meet a bottleneck at the texture address units. Fillrate seems to fall dramatically compared to Kepler due to the reduced TMUs available (compared with an otherwise ~equally performing GK110).
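On the fillrate point: texture fillrate is essentially TMU count times core clock. A rough sketch using published base clocks, with a GTX 780 standing in as the GK110 reference purely for the TMU-count contrast (the exact clocks are taken from public spec sheets, and boost clocks would shift the absolute numbers somewhat):

```python
# Texture fillrate = TMU count x core clock (sketch; published base clocks).
def texture_fillrate_gtexel_s(tmus: int, core_clock_mhz: float) -> float:
    return tmus * core_clock_mhz / 1000.0

gtx_960 = texture_fillrate_gtexel_s(64, 1127)   # Maxwell GM206: 64 TMUs @ ~1127 MHz base
gtx_780 = texture_fillrate_gtexel_s(192, 863)   # Kepler GK110 (GTX 780): 192 TMUs @ ~863 MHz base

print(f"GTX 960: ~{gtx_960:.0f} GTexel/s vs GTX 780: ~{gtx_780:.0f} GTexel/s")
# -> roughly 72 GTexel/s vs 166 GTexel/s
```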
 
Not sure who you're addressing

I wasn't addressing anyone specifically. It was a general question, because what GhostRyder wrote puzzled me:

I agree, mostly it's the color compression that makes the 2gb enough for a card like this which is going to be a sweet 1080p card.
 
I wasn't addressing anyone specifically. It was a general question, because what GhostRyder wrote puzzled me.
I see why you made the query. The compressibility would allow more data to be stored in the existing vRAM. Assuming the gains are what Nvidia says they are (!), the extra ~700MB of effective capacity might make a difference, but I'm figuring the low TMU count (64) might end up wiping out some of that theoretical gain. Adding vRAM doesn't always translate into a tangible benefit (see 4GB vs 8GB R9 290X, 1.5GB vs 3GB GTX 580, 3GB vs 6GB GTX 780, for example*); the graphics pipeline just moves to the next choke point.

Having said that, a 2GB card should still suffice at 19x10 resolution for the image quality settings likely to be used. Masochists aside, I doubt many people would deliberately amp up the game I.Q. and play at sub-optimal framerates just to make a point.

* The larger framebuffers only create separation when the smaller vRAM capacity cards are deliberately overwhelmed (e.g. use of high res texture packs) - so less a gain by the large framebuffer card than a deliberate hobbling of the standard offering.
 
I wasn't addressing anyone specifically. It was a general question, because what GhostRyder wrote puzzled me:
Color compression is only used for transferring from and to memory; it doesn't have an effect on memory capacity, or am I missing something?
Because that's exactly what I meant by it: the color compression technique now employed by both AMD and Nvidia in their (new-generation) desktop GPUs helps alleviate a little of the vRAM bottleneck that can surface on top of everything else. It's not the most significant amount, but it helps, and at 1080p games don't really blow past the vRAM, so 2GB is normally enough for a card aimed at the mid-range; with that help, 2GB will feel like a little more, or at least enough to alleviate some potential bottlenecks (not all, mind you, just some).
 
Having said that, a 2GB card should still suffice at 19x10 resolution for the image quality settings likely to be used. Masochists aside, I doubt many people would deliberately amp up the game I.Q. and play at sub-optimal framerates just to make a point.

I don't understand very well how video cards work. Is the complaint about low vram quantity coming from new games storing higher-res textures in vram? Is this somehow separable from the card's processing speed? And if so, is it *necessary* to store so much in vram, or is it just the way the game coders did it... i.e. assuming that the card would have xGB of vram available?

I have a GTX 750 1GB. So many were saying the low vram would hobble it, but I researched it before buying and decided this wouldn't be the case... and I've not experienced a problem yet. Possibly I will. As you said, if I turned up the settings, it might be an issue, but no one would want to play at 10-15 fps anyway.

The GTX 960 looks to be coming in at ~2x the speed of a GTX 750, so I don't expect the 2GB to necessarily be a problem, but the bandwidth is only 40% greater. And overclocking my vram helped quite a lot, so I think that will be a restriction on the 960, i.e. it could run considerably faster if it had more bandwidth. Well... on the other hand, maybe not, since it is already twice as fast using 2x the shaders, TMUs, and ROPs.
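Putting rough numbers on that 750-vs-960 comparison (a sketch built from the published spec-sheet figures; the 1.33x value is Nvidia's best-case compression claim, and treating it as applying only to the 960 is an assumption, so read the last number as an upper bound):

```python
# GTX 750 vs GTX 960 scaling (sketch, published spec-sheet numbers).
specs = {
    "GTX 750": {"shaders": 512,  "bandwidth_gbs": 80.0},
    "GTX 960": {"shaders": 1024, "bandwidth_gbs": 112.0},
}
shader_ratio = specs["GTX 960"]["shaders"] / specs["GTX 750"]["shaders"]              # 2.0x
bw_ratio = specs["GTX 960"]["bandwidth_gbs"] / specs["GTX 750"]["bandwidth_gbs"]      # 1.4x

# Best-case delta-compression claim applied to the 960 only (assumption - the
# 750's GM107 may already compress to some degree, so this is an upper bound).
effective_bw_ratio = bw_ratio * 1.33                                                  # ~1.86x

print(f"shaders: {shader_ratio:.1f}x, raw bandwidth: {bw_ratio:.2f}x, "
      f"claimed best-case effective bandwidth: {effective_bw_ratio:.2f}x")
```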
 
Uuh, what? Many ppl like that game, including myself, so ppl will care. o_O

PE____S this card will be less than 10% better than the 760, which is why they use a 660 for comparison.
 
Like I said, the GTX 960 WILL supersede both the GTX 750 & GTX 750 Ti, as those two cards somewhat failed to deliver on expectations when they came out. In my opinion, I can safely say that the GTX 960 and its $200-or-below price point is going to turn some heads, among both PC builders and budget-conscious PC gamers around the world. Over here in Malaysia, the GTX 960 will become a sensational product of the year, as many PC users see the GTX 970 as a "high-end VGA card" thanks to its $350 price tag, with vendor kits like the ASUS Strix, Gigabyte G1 Gaming, MSI Gaming and others hitting nearly MYR1800 a piece.
 

MxPhenom 216

Sure it has an effect on effective vram size. The textures are 75% of original size, so 1.33x transfer rate, 1.33x texture storage (max -- some vram is used for frame buffers of course).

If it drops the quality of the textures, how are they able to advertise it as lossless?
 
Remember: the GTX 960 IS A BUDGET 1080p VGA CARD catered specifically to those who can't afford the GTX 970, or even the top-of-the-line GTX 980. Besides, not everyone has the money to buy and build a rig that pushes games at max settings @ 1080p, let alone 1440p or 4K. 2GB on a 128-bit bus is no issue, as we're talking about a Maxwell chip running under the hood, not the full-blown GK110 used by the GTX 780 Ti. Using a single 6-pin connector is also no problem, as this card is not like those power-hungry cards whose requirements are a little too high for budget gamers. Couple the 960 with a cool-running Core i3 and a cheap 8GB DDR3 kit and you'll get a very efficient rig that uses very little energy while being able to play games at 1080p comfortably on High without running into problems like low fps.
 
Like I said, the GTX 960 WILL supersede both the GTX 750 & GTX 750 Ti, as those two cards somewhat failed to deliver on expectations when they came out.

Actually, it's my impression that these cards sold like crazy. And check the reviews from people who bought them... overwhelmingly positive. Probably more so than for any other model. That's likely due to good QC and drivers, but still... many are very satisfied with the performance.

The 960 is in a different league altogether. I'm sure a GTX 960 looks pathetic to someone with SLI 980s, but it's reported to be literally 2x the speed of a 750, and ~1.8x the speed of a 750 Ti. At a retail price of $199, the 960 will sell like crazy.
 
I don't understand very well how video cards work. Is the complaint about low vram quantity coming from new games storing higher-res textures in vram? Is this somehow separable from the card's processing speed? And if so, is it *necessary* to store so much in vram, or is it just the way the game coders did it... i.e. assuming that the card would have xGB of vram available?
As an analogy, the more complicated and larger the picture the GPU has to paint, the more paint and the wider the range of colours required (textures, geometry, tessellation, etc.). The picture has to be painted in one sitting, and you can only use the paint you can fit on your palette. You can increase your painting speed (framerate), but the amount* and colour range of paint you can put on your palette depends upon the palette's size (vRAM framebuffer).
* Using a higher-quality (denser) paint would allow for more coverage (delta compression).
I have a GTX 750 1GB. So many were saying the low vram would hobble it, but I researched it before buying and decided this wouldn't be the case... and I've not experienced a problem yet. Possibly I will. As you said, if I turned up the settings, it might be an issue, but no one would want to play at 10-15 fps anyway.
Yup. Just a simple case of adjusting the workload ( screen resolution, gaming image quality settings) to fit the available hardware.
 
Far Cry 4, Very High preset @1440p


The GTX 960 has 112 GB/s, so it's not quite enough to run it at 1440p during peak gameplay. Once I've done my 1080p benchmarks, we'll see what the figures are.

These are rough and approximate figures based on some educated extrapolation with a test version of GPU-Z that W1zzard sorted out for me. The values could be entirely wrong. I'll explain in detail once the full article is up.
 
Isn't it tricky, since one of the benefits is delta-based, leading to variable outcomes depending on the source being compressed and the output?

Running benchmark X / game X / scene X as opposed to running Bx / Cx / Sy: games A-Z will never share similar results, while scene A-Z within game X will always vary the outcome.

I don't know how you're running it, but wouldn't it be a comparison of Kepler & Maxwell at similar performance with the same frame buffer?

Am I totally misinterpreting it?
 
Isn't it tricky, since one of the benefits is delta-based, leading to variable outcomes depending on the source being compressed and the output?

Running benchmark X / game X / scene X as opposed to running Bx / Cx / Sy: games A-Z will never share similar results, while scene A-Z within game X will always vary the outcome.

I don't know how you're running it, but wouldn't it be a comparison of Kepler & Maxwell at similar performance with the same frame buffer?

Am I totally misinterpreting it?

You're correctly interpreting the fact that it's altogether a nightmare to measure accurately. But somebody needs to do it, to prove or disprove once and for all whether this 128-bit, 112 GB/s memory bandwidth is actually an issue at 1080p.
I've got a separate benchmark for non-compressed memory bandwidth usage, as well as some graphs to show how bandwidth usage correlates (or doesn't) with other usage on the GPU hardware (vRAM, PCIe bus, GPU load). The best possible thing to do is to take the highest bandwidth usage figure and go by that, to be utterly and completely sure it won't be a bottleneck. I'm also attempting to cover four games that represent a few different types: a VRAM hog, a CPU-limited title, a GPU-limited title, and a generally well-rounded title. Once I've finished the full write-up, people are welcome to request benchmarks on other games.

I'd ideally like a 770, as it shares identical bandwidth with the 970, the difference being the Maxwell compression technique. As it stands, I'm having to assume it's 30%. In reality it varies a lot.

I must stress at this point, though, that even Nvidia has mentioned that the available tools for measuring such a thing are not particularly accurate. (They actually said the software application available is not 100% representative, just that the values are similar by proxy.)
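The GPU-Z test build mentioned above isn't public, so purely as a hypothetical illustration of the general approach (poll the memory-controller utilisation counter alongside GPU load and VRAM use, then treat utilisation times peak bandwidth as a coarse bandwidth estimate), here is a sketch using the standard nvidia-smi query interface; the 112 GB/s peak and the one-second polling interval are assumptions, and the same accuracy caveat Nvidia gives applies:

```python
# Rough bandwidth-usage logger (sketch). utilization.memory is the percentage of
# time the memory controller was busy over the sample window, so multiplying it
# by the card's peak bandwidth gives only a coarse estimate of GB/s in flight.
import csv, subprocess, sys, time

PEAK_BW_GBS = 112.0      # assumed: GTX 960 published peak memory bandwidth
INTERVAL_S = 1.0         # assumed polling interval

QUERY = "timestamp,utilization.gpu,utilization.memory,memory.used"

def sample():
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
        text=True)
    # One CSV line per GPU; take the first GPU only.
    ts, gpu_pct, mem_ctrl_pct, vram_mib = [f.strip() for f in out.splitlines()[0].split(",")]
    return ts, float(gpu_pct), float(mem_ctrl_pct), float(vram_mib)

writer = csv.writer(sys.stdout)
writer.writerow(["timestamp", "gpu_load_%", "mem_ctrl_%", "vram_MiB", "est_bandwidth_GBs"])
try:
    while True:
        ts, gpu, memc, vram = sample()
        writer.writerow([ts, gpu, memc, vram, round(PEAK_BW_GBS * memc / 100.0, 1)])
        time.sleep(INTERVAL_S)
except KeyboardInterrupt:
    pass
```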
 
The picture has to be painted in one sitting, and you can only use the paint you can fit on your palette.

Thanks, that makes sense to me. For instance, say I was able to get 30fps max, and I happen to be at 99.9% of my 1GB of vram capacity. If I doubled the speed (say with 100% efficient SLI), I could double my fps, but I wouldn't be able to increase the quality settings at all without causing problems, like swapping to system memory and causing stutters. True? Is the issue as simple as having more digital information in a single frame than the vram capacity, or is it more complicated? Because 1GB seems like quite a lot for one frame.
 
Thanks, that makes sense to me. For instance, say I was able to get 30fps max, and I happen to be at 99.9% of my 1GB of vram capacity. If I doubled the speed (say with 100% efficient SLI), I could double my fps, but I wouldn't be able to increase the quality settings at all without causing problems, like swapping to system memory and causing stutters. True?
Correct insofar as vRAM is concerned. You can still increase image quality settings that aren't directly texture based. In SLI (and CrossfireX) the cards work in parallel with the same resources mapped into their individual on-board graphics memory - that is to say the vRAM is mirrored across each card. Quick illustration: these benchmarks run by the system builder Digital Storm show that the usage for two GTX 780 Tis in SLI is the same as that for a single card:

Is the issue as simple as having more digital information in a single frame than the vram capacity, or is it more complicated? Because 1GB seems like quite a lot for one frame.
Not all the vRAM is allocated to a single frame. The vRAM is portioned into buffers (whose size is dependent upon the application), holding multiple frames at varying stages of completion: one frame is sent to the monitor (or to the primary graphics card and then the monitor, if the card is the 2nd, 3rd, or 4th in an SLI setup) from the front buffer, while the next is held in the back buffer - which then becomes the new front buffer as the newly vacated former front buffer assumes back-buffer duty (there are also triple-buffering options). There are also many other buffers to take into consideration, such as the depth buffer. Basically, the 1GB of vRAM you have isn't dedicated to drawing a single frame at a time.
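To make the front/back buffer flip concrete, here is a toy sketch of the swap described above; it is an illustration only, not how a real driver or swap chain is implemented:

```python
# Toy double-buffering model (sketch): the GPU renders into the back buffer
# while the display scans out the front buffer; a "flip" just swaps the roles,
# so no frame data is copied and both buffers stay resident in vRAM.
class SwapChain:
    def __init__(self):
        self.front = bytearray()   # frame currently being scanned out
        self.back = bytearray()    # frame currently being rendered

    def render(self, frame_data: bytes) -> None:
        self.back = bytearray(frame_data)            # GPU writes the next frame here

    def present(self) -> bytes:
        self.front, self.back = self.back, self.front  # the flip
        return bytes(self.front)                     # what the display now reads

chain = SwapChain()
for n in range(3):
    chain.render(f"frame {n}".encode())
    print(chain.present().decode())                  # frame 0, frame 1, frame 2
```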
 
You can still increase image quality settings that aren't directly texture based.

What settings would those be?

In SLI (and CrossfireX) the cards work in parallel with the same resources mapped into their individual on-board graphics memory - that is to say the vRAM is mirrored across each card.

My understanding of SLI was that the cards sort of took turns rendering frames. Yes, no?

And the main thing I've been wondering about is all the buzz about how new games need 4GB+ of vram, if not this year then the next. I can fully believe that a top card needs that kind of vram, because it's capable of processing information fast enough to be limited with less. I've not seen anything yet that convinces me that things have changed; vram requirements seem to scale with processing speed about the same as several years ago. Are newer, higher-res textures something that will impact vram quantity more than processing speed? Even if so, would turning down the texture detail enough to solve the problem make the game look ugly?
 

mxp02

That should be plenty in the 1080p segment this card is meant for.
New games in 2015 will use 2GB or more of vRAM @720p. There were already several games last year that used more than 2GB of vRAM @1080p.
 
What settings would those be?
Well, for starters, any compute-shader post-process - effects applied after the scene has been rendered. Common examples would be depth of field and motion blur.
My understanding of SLI was that the cards sort of took turns rendering frames. Yes, no?
Yes. Each card holds its frame in its vRAM's front buffer and flips the contents to the primary card - the one that is connected to the video display.
And the main thing I've been wondering about is all the buzz about how new games need 4GB+ of vram, if not this year then the next. I can fully believe that a top card needs that kind of vram, because it's capable of processing information fast enough to be limited with less. I've not seen anything yet that convinces me that things have changed; vram requirements seem to scale with processing speed about the same as several years ago. Are newer, higher-res textures something that will impact vram quantity more than processing speed? Even if so, would turning down the texture detail enough to solve the problem make the game look ugly?
Voxel-based global illumination
Path tracing
Larger texture packs as screen resolution increases
Improved physics/particle models (fog, water, smoke, interactive/destructible environments), and a host of other graphical refinements. For further reading I'd suggest Googling upcoming/future rendering techniques. The yearly SIGGRAPH is a good place to start, since it is independent.

You'll always have the opportunity to lower game image quality. How good/bad it looks will depend on the game engine, and how far the options are dialled down (few PC gamers would willingly choose static lighting for instance).

At this point we're straying pretty far from the actual topic at hand, the GTX 960.
 
But do those things increase vRAM requirements *more* than the card's computing requirements? Will we need larger amounts of vRAM even on slow cards? That's what many people seem to believe, but I don't know if there's any truth to it.

At this point we're straying pretty far from the actual topic at hand, the GTX 960.

Not necessarily... because the 960 is the fastest new card to come out that still uses 2GB of vram. I've been looking at some of the reports where the specs are listed, and the comments are overwhelmingly of this variety: "This card is an immediate fail with 2GB, games need 4GB, Nvidia are idiots, it shouldn't cost more than $100, it's already obsolete, I feel sorry for anyone dumb enough to buy it", etc. I tend to think that Nvidia knows what they're doing and the card will be balanced and perform well, but I seem to be in the minority.
 