
NVIDIA GeForce RTX 50 Series Faces Compute Performance Issues Due to Dropped 32-bit Support

AleksandarK

News Editor
Staff member
Joined
Aug 19, 2017
Messages
2,834 (1.03/day)
PassMark Software has identified the root cause behind unexpectedly low compute performance in NVIDIA's new GeForce RTX 5090, RTX 5080, and RTX 5070 Ti GPUs. The culprit: NVIDIA has silently discontinued support for 32-bit OpenCL and CUDA in its "Blackwell" architecture, causing compatibility issues with existing benchmarking tools and applications. The issue manifested when PassMark's DirectCompute benchmark returned the error code "CL_OUT_OF_RESOURCES (-5)" on RTX 5000 series cards. After investigation, developers confirmed that while the benchmark's primary application has been 64-bit for years, several compute sub-benchmarks still utilize 32-bit code that previously functioned correctly on RTX 4000 and earlier GPUs. This architectural change wasn't clearly documented by NVIDIA, whose developer website continues to display 32-bit code samples and documentation despite the removal of actual support.
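For illustration, the failure reportedly surfaces as a status code returned by ordinary OpenCL host calls. A minimal sketch of the kind of check involved, assuming a command queue and kernel already exist (the function name is illustrative):

```c
#include <CL/cl.h>
#include <stdio.h>

/* Enqueue a kernel and inspect the status code. On RTX 50 series cards,
   32-bit hosts reportedly get CL_OUT_OF_RESOURCES (-5) at this point. */
void launch_and_check(cl_command_queue queue, cl_kernel kernel, size_t global_size)
{
    cl_int err = clEnqueueNDRangeKernel(queue, kernel, 1, NULL,
                                        &global_size, NULL, 0, NULL, NULL);
    if (err != CL_SUCCESS)
        fprintf(stderr, "kernel launch failed: %d%s\n", err,
                err == CL_OUT_OF_RESOURCES ? " (CL_OUT_OF_RESOURCES)" : "");
}
```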

The impact extends beyond benchmarking software. Applications built on legacy CUDA infrastructure, including technologies like PhysX, will experience significant performance degradation as computational tasks fall back to CPU processing rather than utilizing the GPU's parallel architecture. While this fallback mechanism allows older applications to run on the RTX 40 series and prior hardware, the RTX 5000 series handles these tasks exclusively on the CPU, resulting in substantially lower performance. PassMark is currently working to port the affected OpenCL code to 64-bit, allowing proper testing of the new GPUs' compute capabilities. However, they warn that many existing applications containing 32-bit OpenCL components may never function properly on RTX 5000 series cards without source code modifications. The benchmark developer also notes this change doesn't fully explain poor DirectX 9 performance, suggesting additional architectural changes may affect legacy rendering pathways. PassMark updated its software today, but legacy benchmarks could still suffer. Below is an older benchmark run without the latest PassMark V11.1 build 1004 patches, showing just how much the newest generation suffers without proper software support.
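For developers triaging affected software, the first question is simply whether the host process is a 32-bit build, since that is the failure mode described above. A minimal sketch (the warning text is illustrative):

```c
#include <stdio.h>

int main(void)
{
    /* sizeof(void *) is 4 in a 32-bit process and 8 in a 64-bit one. */
    if (sizeof(void *) == 4)
        fprintf(stderr, "warning: 32-bit build; GPU compute may be unavailable "
                        "on RTX 50 series - consider rebuilding as 64-bit\n");
    else
        printf("64-bit build (%zu-byte pointers)\n", sizeof(void *));
    return 0;
}
```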



 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
13,109 (2.47/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | LG 24" IPS 1440p
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Corsair K65 Plus 75% Wireless - USB Mode
Software Windows 11 Pro 64-Bit
Nvidia really didn't think this generation through. If you are going to drop support for this stuff, at least make it known several months prior to launch so software devs can adapt, or just don't do it at all. FFS
 
Joined
Nov 14, 2021
Messages
149 (0.12/day)
Why did Nvidia skip a deprecation notice and a timeline for removing support? It seems not to have been thought out well. If the 5000 series was going to kill 32-bit, they should have announced the deprecation back in the 3000 or 4000 series era to give developers time to migrate, and to let people avoid buying a 5000 series if they still need 32-bit support for a while.

They need to backpedal on this. Offer at least two drivers, with one containing 32-bit support, even if they don't update it as often as the 64-bit one, and support that driver for at least two years or something.

Hell, they could probably even make some money, like MS does with ESU on Windows: offer an ESU driver that companies can pay for if they need to maintain 32-bit support.
 
Joined
Feb 18, 2005
Messages
6,179 (0.84/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) Dell S3221QS(A) (32" 38x21 60Hz) + 2x AOC Q32E2N (32" 25x14 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G604
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Nvidia really didn't think this generation through. If you are going to drop support for this stuff, at least make it known several months prior to launch so software devs can adapt, or just don't do it at all. FFS
Or write a compatibility/translation layer to transparently handle 32-bit as 64-bit.

I agree with you that the 5000-series has been a disaster. I really get the impression NVIDIA's A-team is all working on "AI" chips and the B-team was left to do consumer graphics, with the inevitable result that we got a B-grade product and a lot of unnecessary foot-shooting. Huang is not doing his job very well.
 
Joined
Nov 30, 2019
Messages
31 (0.02/day)
Location
LV-426 Approx. 39ly from Earth
System Name X64
Processor i9-10920X
Motherboard ASUS Pro WS X299 SAGE II
Cooling Noctua NH-D15S
Memory Corsair Dominator Platinum RGB 64GB (4x16GB) DDR4 3600 CL16
Video Card(s) MSI GTX 1070 Gaming
Storage Samsung 850 Pro SSD
Display(s) Dell U2417H
Case CM C700M
Power Supply Seasonic Prime TX 1300W ATX 3.0
You'd have to have an IQ below 50 to buy this marvelous product for $2,000+ and be happy. People should just skip this series.
 
Joined
Jul 31, 2024
Messages
974 (4.51/day)
The customer buys graphics card hardware and graphics card software, which usually works in Windows. Since when has 32-bit support in Windows been obsolete? In the GNU/Linux world, 32-bit was obsolete years ago.

At some point you have a transition away from the old technology. With MS-DOS we were lucky to have DOSBox.

I wonder how long Nvidia will support those obsolete features in their driver packages, especially when a driver package is only valid for certain graphics card generations. Obsolete features: 32-bit OpenCL? PhysX? 32-bit CUDA?

--

The whole benchmark is faulty - P*ssmark


 
Joined
Jan 17, 2022
Messages
89 (0.08/day)
There was no real reason to remove this support in this gen other than penny-pinching on cards that are already much more expensive than their predecessors, truly earning their nickname of "Ngreedia" this time.

The architecture is nearly identical to the previous gen, judging by the performance "improvements" that scale pretty much linearly with transistor count and power consumption, so there was no real reason to have done this.
 
Joined
Jul 31, 2024
Messages
974 (4.51/day)
If the 5000 series was going to kill 32-bit, they should have announced the deprecation back in the 3000 or 4000 series era to give developers time to migrate.

Quoting the tech article: "This architectural change wasn't clearly documented by NVIDIA, whose developer website continues to display 32-bit code samples and documentation despite the removal of actual support."

I think I read recently that 32-bit CUDA has already been obsolete for 7 years.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
13,109 (2.47/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | LG 24" IPS 1440p
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Corsair K65 Plus 75% Wireless - USB Mode
Software Windows 11 Pro 64-Bit
Or write a compatibility/translation layer to transparently handle 32-bit as 64-bit.

I agree with you that the 5000-series has been a disaster. I really get the impression NVIDIA's A-team is all working on "AI" chips and the B-team was left to do consumer graphics, with the inevitable result that we got a B-grade product and a lot of unnecessary foot-shooting. Huang is not doing his job very well.
Come on, AMD. The door is wide open for you this time.
 
Joined
Jun 19, 2024
Messages
528 (2.05/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks
This says more about PassMark than Nvidia. 32-bit code was deprecated seven years ago by Nvidia, and apparently PassMark didn't know their own code base well enough to realize they were still running 32-bit code.

Did they say when they are going to fix their software?

Edit: OpenCL? Even more irrelevant. Is anyone updating their OpenCL drivers? OpenCL was dead a decade ago.
 
Joined
Jan 8, 2017
Messages
9,727 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
There was no real reason to remove this support in this gen
Or write a compatibility/translation layer to transparently handle 32-bit as 64-bit.
I suspect there may be something about Blackwell's ISA that hinders 32-bit software from running natively. GPUs don't work like CPUs: you can't really write native software for them, because GPU makers change the ISA often, a lot of the time from architecture to architecture, so everything that runs on the GPU has to be compiled first. That's why I suspect there might be a hardware limitation somewhere, so that perhaps some 32-bit instructions can't be issued, or something like that.

On top of this, Nvidia has the shitty habit of disclosing absolutely nothing about their ISA, like it's some sort of national secret.
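For what it's worth, the compile-first model is visible in the CUDA driver API: the host hands the driver portable PTX, and the driver JIT-compiles it to the installed GPU's native ISA at load time. A minimal sketch under that assumption (the `noop` PTX kernel is made up for illustration):

```c
#include <cuda.h>
#include <stdio.h>

/* Trivial PTX kernel; the driver compiles this into native machine code
   (SASS) for whatever GPU architecture is actually present. */
static const char *ptx_src =
    ".version 6.0\n"
    ".target sm_50\n"
    ".address_size 64\n"
    ".visible .entry noop() { ret; }\n";

int main(void)
{
    CUdevice dev;
    CUcontext ctx;
    CUmodule mod;
    CUfunction fn;

    cuInit(0);
    cuDeviceGet(&dev, 0);
    cuCtxCreate(&ctx, 0, dev);

    /* The JIT step: portable PTX in, architecture-specific code out. */
    if (cuModuleLoadData(&mod, ptx_src) != CUDA_SUCCESS) {
        fprintf(stderr, "driver could not JIT-compile this PTX\n");
        return 1;
    }
    cuModuleGetFunction(&fn, mod, "noop");
    cuLaunchKernel(fn, 1, 1, 1, 1, 1, 1, 0, NULL, NULL, NULL);
    cuCtxSynchronize();

    cuModuleUnload(mod);
    cuCtxDestroy(ctx);
    return 0;
}
```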
 
Joined
Dec 21, 2023
Messages
16 (0.04/day)
Processor Ryzen 5950X
Motherboard MSI MPG X570 GAMING PRO CARBON WIFI
Memory 2 x 16GB DDR4 3600MHz
Video Card(s) ASUS PRIME GeForce RTX 5080
Storage 2 x 4TB WD SN850X
Case Fractal Design Define C
I needed a GPU and was lucky enough to pick up an RTX 5080 for RRP on launch day; there had been no 4080 Super cards to buy for weeks before and nothing from team red, so I was lucky to get even that, and I wouldn't have paid more than FE RRP. All the same, this must be the most disappointing generation of cards I can remember, and I can remember back to EGA launching in '84...
 

valicu2000

New Member
Joined
Nov 19, 2024
Messages
1 (0.01/day)
The gaming segment means less and less to nGreedya, so why bother? Fewer chips for consumers, more chips for data centers, and more profit...



 
Joined
Sep 6, 2013
Messages
3,603 (0.86/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500
Motherboard X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2)
Cooling Aigo ICE 400SE / Segotep T4 / Νoctua U12S
Memory Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200
Video Card(s) ASRock RX 6600 / Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
I can understand Nvidia dropping 32-bit support. With chips designed for servers and AI, Nvidia doesn't really care that much about compatibility with old software. In any case, the solution for anyone wanting to run DirectX 9 games at 4K with 500 fps, or games supporting hardware PhysX, is to get a 3000/4000 series Nvidia GPU and add it in a second PCIe x16 slot. Someone paying $200-$500 over MSRP can probably pay a little more for 32-bit support.





PS
....and a year from now people will keep saying how bad AMD hardware, drivers, features, support, everything/you name it, are.......

And to let people avoid buying a 5000 series if they still need 32-bit support for a while.
Aaaaaaaa...........that's why......

Huang is not doing his job very well.
Huang is having a huge problem filling all those AI orders, and he is sending the chips that somewhat fail quality checks, but can still be called functional, to gamers.
 
Joined
Jun 19, 2024
Messages
528 (2.05/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks
Come on, AMD. The door is wide open for you this time.

The last time AMD passed OpenCL compliance was in 2015.


 
Joined
May 10, 2023
Messages
660 (0.99/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
I suspect there may be something about Blackwell's ISA that hinders 32-bit software from running natively. GPUs don't work like CPUs: you can't really write native software for them, because GPU makers change the ISA often, a lot of the time from architecture to architecture, so everything that runs on the GPU has to be compiled first. That's why I suspect there might be a hardware limitation somewhere, so that perhaps some 32-bit instructions can't be issued, or something like that.

On top of this, Nvidia has the shitty habit of disclosing absolutely nothing about their ISA, like it's some sort of national secret.
That's not about the GPU not supporting "32-bit"; it's about Nvidia's compiler dropping support for building new 32-bit (i.e., x86 32-bit) binaries, and about their newest runtime (which runs on the CPU to dispatch work to the GPU) no longer running such 32-bit software on their newest µarch.

Anyhow, as others have already said, Nvidia has had 32-bit support deprecated since CUDA 9 (almost 8 years ago):
CUDA Tools
  • 32-bit Linux CUDA Applications. CUDA Toolkit support for 32-bit Linux CUDA applications has been dropped. Existing 32-bit applications will continue to work with the 64-bit driver, but support is deprecated.

Nvidia also never had proper backwards compatibility. Newer products always required an update to the CUDA version, so you'd have to rebuild your application to support the newest architectures anyway.
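A small sketch of what that looks like in practice, using two standard CUDA runtime API queries (the warning logic is illustrative):

```c
#include <cuda_runtime.h>
#include <stdio.h>

int main(void)
{
    int driver = 0, runtime = 0;

    /* Version of the installed driver and of the runtime this binary links. */
    cudaDriverGetVersion(&driver);
    cudaRuntimeGetVersion(&runtime);
    printf("driver CUDA version: %d, runtime CUDA version: %d\n",
           driver, runtime);

    /* An application built against a newer runtime than the driver supports
       will not run; supporting a new GPU generation has historically meant
       rebuilding against a newer CUDA version. */
    if (driver < runtime)
        fprintf(stderr, "warning: driver older than the runtime this "
                        "application was built against\n");
    return 0;
}
```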
 
Joined
Jan 2, 2019
Messages
207 (0.09/day)
>>...Returned the error code "CL_OUT_OF_RESOURCES (-5)"...

PassMark's developers should not accuse NVIDIA of any wrongdoing; they simply should not allocate blocks of memory greater than 2 GB, or significantly exceeding 2 GB, in 32-bit applications.

Many companies have already dropped full support for 32-bit applications.

PassMark's developers stated that some tests are 32-bit, and it is not clear why they did not port these tests to 64-bit.

PS: None of my internal HPC-related OpenCL 32-bit tests allocate more than 2 GB of memory, and I actually have not paid attention to these 32-bit verifications for a long time.
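A minimal sketch of that guard, assuming an existing OpenCL context and device (the helper name is illustrative):

```c
#include <CL/cl.h>
#include <stdio.h>

/* Clamp a requested buffer size to the device's reported per-allocation
   limit; a 32-bit build additionally stays under a 2 GB ceiling. */
cl_mem create_capped_buffer(cl_context ctx, cl_device_id dev, size_t requested)
{
    cl_ulong max_alloc = 0;
    clGetDeviceInfo(dev, CL_DEVICE_MAX_MEM_ALLOC_SIZE,
                    sizeof(max_alloc), &max_alloc, NULL);
    if (sizeof(void *) == 4 && max_alloc > ((cl_ulong)1 << 31))
        max_alloc = (cl_ulong)1 << 31;   /* 2 GB cap for 32-bit processes */

    size_t size = requested < max_alloc ? requested : (size_t)max_alloc;

    cl_int err = CL_SUCCESS;
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE, size, NULL, &err);
    if (err != CL_SUCCESS)
        fprintf(stderr, "clCreateBuffer failed: %d\n", err);
    return buf;
}
```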

Don't blame somebody else; look at what is wrong on your own side first.

Also, in a 64-bit world it is not possible to mix 64-bit and 32-bit code. Microsoft had a similar problem many years ago when trying to mix 32-bit and 16-bit code in Windows 3.1 with the Win32s extension, Windows 95, Windows for Workgroups, and other operating systems. Microsoft's solution was very complex, unreliable, and based on a thunking technique.

In the mid-'90s we had a problem on a financial project with mixing 32-bit and 16-bit code, and a Microsoft DDE (Dynamic Data Exchange Win32 API) client-server solution was used to "execute" 16-bit code from a client DLL in a 32-bit server application. It was very complex (I personally worked on it), and the solution was abandoned after the provider of the 16-bit cryptography DLL finally released a 32-bit version.

I would not worry about PassMark's problems, and I would not blame NVIDIA for not clearly informing PassMark's developers about the dropped support.

NVIDIA always puts critical notes in its Release Notes, and companies involved in software development should read them on NVIDIA's website, for example for a driver ABC for a GPU card XYZ.

PassMark developers: just port all these 32-bit tests to the 64-bit world and forget about it.
 
Joined
Jun 19, 2024
Messages
528 (2.05/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks
I suspect there may be something about Blackwell's ISA that hinders 32-bit software from running natively. GPUs don't work like CPUs: you can't really write native software for them, because GPU makers change the ISA often, a lot of the time from architecture to architecture, so everything that runs on the GPU has to be compiled first. That's why I suspect there might be a hardware limitation somewhere, so that perhaps some 32-bit instructions can't be issued, or something like that.

On top of this, Nvidia has the shitty habit of disclosing absolutely nothing about their ISA, like it's some sort of national secret.

I suspect you haven’t ever coded in your life. Do you even know what OpenCL is?

Hell, it was created by Apple and even they don’t support it anymore.
 
Joined
Jan 8, 2017
Messages
9,727 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
That's not about the GPU not supporting "32-bit"; it's about Nvidia's compiler dropping support for building new 32-bit (i.e., x86 32-bit) binaries
One of the reasons for dropping support for something in a compiler is that the hardware itself no longer supports it; without knowing what's actually happening under the hood, it's hard to tell why they did this.

GPUs work in a regime where almost everything is compiled before running anyway, so it's very bizarre to drop support for something unless there was a technical reason to do so. It's one thing to drop support for development and a different thing to remove the ability to run certain software entirely.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
13,109 (2.47/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | LG 24" IPS 1440p
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Corsair K65 Plus 75% Wireless - USB Mode
Software Windows 11 Pro 64-Bit
The last time AMD passed OpenCL compliance was in 2015.


That's not what I'm referring to. With all the shit popping up this gen for Nvidia, the bar is extremely low for AMD to capitalize on.
 
Joined
Jun 24, 2017
Messages
199 (0.07/day)
The current implementation of PassMark probably depicts a more realistic picture of the performance the user will obtain from the product.

Things must move on? Sure. This way? Not at all.

Imagine Intel or AMD dropping support for some 32-bit instructions or instruction sets without prior warning... any warning.
 
Joined
Jun 19, 2024
Messages
528 (2.05/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks
That's not what I'm referring to. With all the shit popping up this gen for Nvidia, the bar is extremely low for AMD to capitalize on.
Capitalize on what? Software they haven’t supported in a decade?
 
Joined
Jan 8, 2017
Messages
9,727 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
>>...Returned the error code "CL_OUT_OF_RESOURCES (-5)"...

PassMark's developers should not accuse NVIDIA of any wrongdoing; they simply should not allocate blocks of memory greater than 2 GB, or significantly exceeding 2 GB, in 32-bit applications.
That error can come up for many different reasons; besides, if that were the case it would also crash on other cards, which do have 32-bit support.
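For reference, a tiny decoder of the sort many OpenCL applications carry; the constants come from the standard OpenCL headers, and the selection shown is just illustrative:

```c
#include <CL/cl.h>

/* Map a few common OpenCL status codes to readable names. */
const char *cl_err_name(cl_int err)
{
    switch (err) {
    case CL_SUCCESS:                       return "CL_SUCCESS";             /*   0 */
    case CL_DEVICE_NOT_FOUND:              return "CL_DEVICE_NOT_FOUND";    /*  -1 */
    case CL_MEM_OBJECT_ALLOCATION_FAILURE: return "CL_MEM_OBJECT_ALLOCATION_FAILURE"; /* -4 */
    case CL_OUT_OF_RESOURCES:              return "CL_OUT_OF_RESOURCES";    /*  -5 */
    case CL_OUT_OF_HOST_MEMORY:            return "CL_OUT_OF_HOST_MEMORY";  /*  -6 */
    case CL_INVALID_BUFFER_SIZE:           return "CL_INVALID_BUFFER_SIZE"; /* -61 */
    default:                               return "unrecognized OpenCL error";
    }
}
```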
 