
Newer GTX driver series gripe

Joined
Jan 24, 2019
Messages
11 (0.01/day)
Hey, hopefully someone can explain Nvidia's move here. They have a history of preventing users, gamers, etc. from running lower-tier video cards (e.g. the GTX 950, GTX 750 Ti, and probably others) in an SLI configuration by removing the bridge connectors. Understandably so: who would buy a faster video card if you could get away with a multi-GPU setup of older cards? Driver issues are not the problem here. I discovered that they are still providing very recent driver support (NVIDIA GeForce 55x.xx WHQL) for the lower-tier GT 745, GTX 750, and GTX 750 Ti, but that these same drivers will not work with the upper-tier cards (760, 770, 780, 780 Ti). When I tried to install the NVIDIA GeForce 556.12 WHQL driver for my GTX 780 Ti, it failed as incompatible; one has to settle for the less versatile NVIDIA GeForce 46x.xx WHQL series drivers. Just curious, but does it sound like Nvidia did this deliberately (again) to prevent users from benefiting from AI rendering on their older but capable hardware? (Yes, I can AI render with a GT 745 OEM card, but it's slooow.) Thanks for any feedback.
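In case anyone wants to reproduce the check, here's a minimal sketch (assuming Python and an installed NVIDIA driver; nvidia-smi ships with the driver and these query flags are part of its documented interface) that reports which card and driver version the system actually sees:

```python
import subprocess

# Ask nvidia-smi for the GPU name and the installed driver version.
result = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,driver_version", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
for line in result.stdout.strip().splitlines():
    name, driver = (field.strip() for field in line.split(","))
    print(f"{name}: driver {driver}")
```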
 
Joined
Feb 22, 2010
Messages
227 (0.04/day)
Location
Kent, UK
System Name Cannon
Processor Intel Core i7 13700K
Motherboard Asus Z690-P D4
Cooling Corsair H100i PRO XT w/ Corsair ML120 White LED x2
Memory 32GB (4x8GB) Corsair Vengeance Pro RGB 3600MHz
Video Card(s) Palit RTX 3060 Ti Dual 8GB
Storage 1TB 980 Pro w/Heatsink + 500GB 970 Evo Plus NVMe + 2TB MX500 + 500GB 850 Evo
Display(s) 34” Dell S3422DWG 3440x1440 144hz Curved + 27" Dell U2713H IPS 1440p + Dell P2219H IPS 1080p
Case Corsair 4000X RGB - 3xSP120 ELITE RGB (Front Intake) + 1xML120 White LED (Rear Exhaust)
Audio Device(s) Creative Sound Blaster Z SE + Logitech z623 2.1
Power Supply Corsair RM850x SHIFT
Mouse Logitech G602
Keyboard Corsair K70 RGB
VR HMD Oculus Quest 2
Software Windows 11 Pro x64 || Linux Mint
Benchmark Scores None. Primary uses are browsing, music, gaming (mostly Battlefield & Doom) and virtualization!
Driver support for the 750 Ti etc. continues because they are Maxwell v1 cards (Maxwell also underpins the 9xx series, which is also still supported), whereas the higher-end 7xx cards, such as your 780 Ti, are Kepler-based.

(I own a 780 Ti too, and wish it had better support…)
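For reference, the chip behind each of the cards mentioned (the GPU-to-chip pairings are public spec-sheet facts; the little table itself is just an illustration):

```python
# Chip and architecture behind each card mentioned in this thread.
# This is why the 745/750/750 Ti ride along with the 900 series in
# current drivers while the rest of the 700 series stays on Kepler
# legacy branches.
ARCHITECTURE = {
    "GT 745":     ("GM107", "Maxwell 1"),
    "GTX 750":    ("GM107", "Maxwell 1"),
    "GTX 750 Ti": ("GM107", "Maxwell 1"),
    "GTX 760":    ("GK104", "Kepler"),
    "GTX 770":    ("GK104", "Kepler"),
    "GTX 780":    ("GK110", "Kepler"),
    "GTX 780 Ti": ("GK110", "Kepler"),
}

for card, (chip, arch) in ARCHITECTURE.items():
    print(f"{card}: {chip} ({arch})")
```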
 
Joined
Jan 24, 2019
Messages
11 (0.01/day)
Driver support for the 750 Ti etc. continues because they are Maxwell v1 cards (Maxwell also underpins the 9xx series, which is also still supported), whereas the higher-end 7xx cards, such as your 780 Ti, are Kepler-based.

(I own a 780 Ti too, and wish it had better support…)
Thank you, that was driving me crazy until now :kookoo::)
 
Joined
Nov 27, 2023
Messages
2,111 (6.17/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (23H2)
Just curious, but does it sound like Nvidia did this deliberately (again) to prevent users from benefiting from AI rendering on their older but capable hardware?
Even putting aside the Maxwell thing, calling Kepler “capable” for AI or, frankly, anything else in 2024 is ridiculous. Not only is the architecture outdated for modern games (it simply can't run newer rendering methods efficiently), but AI workloads run well on Ampere and up (and to a lesser extent Turing) primarily thanks to Tensor cores and the doubled FP32 throughput introduced with Ampere and carried into Ada. None of that is present in Kepler.
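If anyone wants to verify the Tensor core point on their own card, here's a minimal sketch assuming PyTorch with CUDA (note that recent PyTorch builds no longer even ship Kepler kernels). Tensor cores first appeared at compute capability 7.0 (Volta), with Turing at 7.5 and Ampere/Ada at 8.x, while Kepler is 3.x, Maxwell 5.x, and Pascal 6.x:

```python
import torch

# Tensor cores exist only from compute capability 7.0 (Volta) onward:
# Turing is 7.5, Ampere 8.0/8.6, Ada 8.9. Kepler (3.x), Maxwell (5.x)
# and Pascal (6.x) all predate them.
if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability(0)
    print(f"{torch.cuda.get_device_name(0)}: sm_{major}{minor}, "
          f"tensor cores: {(major, minor) >= (7, 0)}")
else:
    print("No CUDA device visible.")
```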
 
Joined
Jan 24, 2019
Messages
11 (0.01/day)
Even putting aside the Maxwell thing, calling Kepler “capable” for AI or, frankly, anything else in 2024 is ridiculous. Not only is the architecture outdated for modern games (it simply can't run newer rendering methods efficiently), but AI workloads run well on Ampere and up (and to a lesser extent Turing) primarily thanks to Tensor cores and the doubled FP32 throughput introduced with Ampere and carried into Ada. None of that is present in Kepler.
Thanks for that. It makes sense, since the upper-end cards are the first to be marketed (in this case Kepler) and therefore the oldest. As dhdude mentioned, the 745, 750, etc. are Maxwells, so they must have come along later. Upper-end doesn't always equate to newest. I'm learning. :D
 
Joined
Dec 25, 2020
Messages
6,526 (4.63/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Thanks for that. It makes sense, since the upper-end cards are the first to be marketed (in this case Kepler) and therefore the oldest. As dhdude mentioned, the 745, 750, etc. are Maxwells, so they must have come along later. Upper-end doesn't always equate to newest.

There's no grand conspiracy as your OP implies... SLI support was removed because the industry pivoted away from multi-GPU (AMD GPUs no longer support it either), and it was never officially supported in low-end GPU configurations from either brand. As explained previously, the GTX 750 Ti (as well as the 750 and 745) uses the GM107 processor, a Maxwell 1 design, as opposed to the Kepler GK-series chips in the other 600 and 700 cards. Those are functionally obsolete today (Maxwell 1 tops out at DirectX 12 feature level 11_0), but the driver code path is similar enough to the GTX 900 series that NVIDIA decided to keep support in the driver.

However, I must stress that these, alongside the 10 series (Pascal), are very much on the chopping block, and you should expect support to be cut any day now. Maxwell is now over a decade old, Pascal is eight years old, and only the strongest cards from those architectures are still viable, and then only if you are happy to compromise heavily on image quality; they don't hold up even at low resolutions with FSR and low frame-rate targets. NV has been very generous in supporting this hardware for as long as it has: AMD discontinued driver support for its equivalent GPUs long ago, and its competitors to the GTX 900 and 10 series have either been completely abandoned (R9 300, R9 Fury) or put on the back burner, with periodic public releases based on an old driver branch (RX 400, 500, Vega series).

Needless to say, trying to run any sort of machine learning on Maxwell and Pascal amounts to utter futility. Don't even bother.
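As a rough cheat sheet for where each generation sits (the compute-capability pairings are public and fixed; the support notes are just the situation described above as of mid-2024, nothing official):

```python
# CUDA compute capability (major) -> architecture, plus the driver
# situation described in this thread (unofficial, mid-2024 snapshot).
GENERATIONS = {
    3: ("Kepler (GTX 600 / most 700)", "legacy since the R470 branch"),
    5: ("Maxwell (745/750/900 series)", "current branch, on the chopping block"),
    6: ("Pascal (GTX 10 series)", "current branch, on the chopping block"),
    7: ("Volta / Turing (GTX 16, RTX 20)", "fully supported"),
    8: ("Ampere / Ada (RTX 30/40)", "fully supported"),
}

for major, (arch, status) in GENERATIONS.items():
    print(f"sm_{major}x: {arch} -> {status}")
```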
 
Joined
Jan 24, 2019
Messages
11 (0.01/day)
Needless to say, trying to run any sort of machine learning on Maxwell and Pascal amounts to utter futility. Don't even bother.
I'm not sure what you mean here, as I've had good success interpolating and enhancing videos using consumer AI software (that I didn't write) on Maxwell and Pascal based cards. So you must be referring to writing AI software for those architectures?
 
Joined
Dec 25, 2020
Messages
6,526 (4.63/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
I'm not sure what you mean here, as I've had good success interpolating and enhancing videos using consumer AI software (that I didn't write) on Maxwell and Pascal based cards. So you must be referring to writing AI software for those architectures?

What software exactly? I have an exceedingly hard time picturing any AI software that works at even remotely acceptable speed on a GeForce GT-class GPU, and that includes the 1030. Even well into the GTX 10 series (1070 Ti and up) they should be excruciatingly slow, to the point that it's probably not practical to perform any kind of matrix operations on them.
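For anyone curious what "not practical" looks like in numbers, here's a minimal matmul timing sketch, assuming PyTorch with CUDA (recent builds won't even load on Kepler, which rather makes the point):

```python
import time
import torch

# Time one large FP32 matrix multiply; AI video tools spend most of
# their GPU time in operations like this. Cards with Tensor cores run
# the same work in reduced precision many times faster.
n = 4096
a = torch.randn(n, n, device="cuda")
b = torch.randn(n, n, device="cuda")

torch.cuda.synchronize()
start = time.perf_counter()
c = a @ b
torch.cuda.synchronize()
elapsed = time.perf_counter() - start

flops = 2 * n ** 3  # multiply-adds in an n x n matmul
print(f"{elapsed * 1e3:.2f} ms, ~{flops / elapsed / 1e12:.2f} TFLOP/s")
```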
 
Joined
May 22, 2024
Messages
408 (2.47/day)
System Name Kuro
Processor AMD Ryzen 7 7800X3D@65W
Motherboard MSI MAG B650 Tomahawk WiFi
Cooling Thermalright Phantom Spirit 120 EVO
Memory Corsair DDR5 6000C30 2x48GB (Hynix M)@6000 30-36-36-76 1.36V
Video Card(s) PNY XLR8 RTX 4070 Ti SUPER 16G@200W
Storage Crucial T500 2TB + WD Blue 8TB
Case Lian Li LANCOOL 216
Power Supply MSI MPG A850G
Software Ubuntu 24.04 LTS + Windows 10 Home Build 19045
Benchmark Scores 17761 C23 Multi@65W
What software exactly? I have an exceedingly hard time picturing any AI software that works at even remotely acceptable speed on a GeForce GT-class GPU, and that includes the 1030. Even well into the GTX 10 series (1070 Ti and up) they should be excruciatingly slow, to the point that it's probably not practical to perform any kind of matrix operations on them.
I would assume it's not AI as conventionally understood and hyped today, but older techniques like ML-powered deblocking and super-resolution. I suppose those might still run faster on those cards than on any same-generation CPU.

I'm not sure what you mean here, as I've had good success interpolating and enhancing videos using consumer AI software (that I didn't write) on Maxwell and Pascal based cards. So you must be referring to writing AI software for those architectures?
I'd advise getting a new card if possible. Even a 3060 would be several times faster than any of the cards mentioned above. Generational turnover for graphics cards has slowed, but several-year-old cards still struggle. It's unlikely that NVIDIA or anyone else will improve driver support for them, as unfortunate as that is.
 
Joined
Jan 24, 2019
Messages
11 (0.01/day)
What software exactly? I have an exceedingly hard time picturing any AI software that works at even remotely acceptable speed on a GeForce GT-class GPU, and that includes the 1030. Even well into the GTX 10 series (1070 Ti and up) they should be excruciatingly slow, to the point that it's probably not practical to perform any kind of matrix operations on them.
I'm not breaking any records with speed either; I thought hours of processing was just a given with AI rendering. I thought you were referring to video quality. I've been using Topaz AI for enhancing and interpolating, which is painfully slow on all of a 960, 970, 980 Ti, 1050, and 1660 Ti (I have many computers, bad habit :ohwell:). I also use Flowframes just for interpolating, as that is all it does; it's much faster and it denoises beautifully. My only gripe is that when I film auto racing I get flickering in many areas of the interpolated video, so I end up leaving the original footage in at those points. The author hasn't updated his software for some time now.
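For the flickering sections, it might be worth comparing against ffmpeg's classical (non-AI) motion interpolation, which behaves differently on fast-moving footage. A sketch, with placeholder filenames (minterpolate and the options shown are real ffmpeg filter parameters):

```python
import subprocess

# Classical motion-compensated interpolation to 60 fps with ffmpeg's
# minterpolate filter -- no AI involved, so it can be a useful
# comparison point on footage where the AI interpolators flicker.
subprocess.run([
    "ffmpeg", "-i", "race_footage.mp4",          # placeholder input
    "-vf", "minterpolate=fps=60:mi_mode=mci:mc_mode=aobmc",
    "-c:v", "libx264", "-crf", "18",
    "race_60fps.mp4",                            # placeholder output
], check=True)
```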
 
Joined
Nov 27, 2023
Messages
2,111 (6.17/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (23H2)
@CloggedVenuole
For Topaz, even the “lowly” 4060 would be significantly (we're talking hundreds of percent) faster than any of the cards you've listed, and it only scales up from there. These are workloads in which new architectures are disproportionately superior, and it's these same workloads, along with something like Blender, that really show the fairly massive gen-on-gen leaps in smart optimization and architectural tweaks. Games are kiddy stuff by comparison, and the fact that a lot of them are dogshit in terms of how they're made doesn't help matters.
 
Joined
Jan 24, 2019
Messages
11 (0.01/day)
I'd advise getting a new card if possible. Even a 3060 would be several times faster than any of the cards mentioned above. Generational turnover for graphics cards has slowed, but several-year-old cards still struggle. It's unlikely that NVIDIA or anyone else will improve driver support for them, as unfortunate as that is.
I would love to get a 30xx-series card; even a 2080 would do nicely in my books. Still waiting for the "used" prices to come down a little more :rolleyes:
 