
AMD's Radeon RX 9070 XT Shatters Sales Records, Outperforming Previous Generations by 10X

Joined
Jun 19, 2024
Messages
680 (2.29/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks
CUDA is great and all, but virtually all video game code is DirectX, DirectX Raytracing or DirectCompute.

The exceptions are Vulkan, Vulkan Raytracing and Vulkan compute.

CUDA (and HIP) is for AI and Prosumer (ex: DaVinci Resolve, Blender Cycles) applications. Not video games.

Games have a ton of NVAPI calls in them, which is essentially CUDA.

 
Joined
Nov 4, 2005
Messages
12,161 (1.71/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
If they had released a GPU with 50% more performance at the same price-to-performance ratio, it wouldn't have sold as well, which proves their point and hits their stated goal.
> "And frankly, 10x the first week sales of the 7900 XT and XTX is like taking candy from a baby, nobody wanted those things compared to the 4080 and 4090.

What’s the 4080 better at than the 7900XTX for 90% of gamers out there? I play the games I want at 4K 60+ FPS with all high settings. I have played with RT/PT but it doesn’t move the scale for me any, it’s still a mediocre implementation of a new technology. Higher resolution textures in games make a far larger difference in visual fidelity.
The 7900XTX vs 4080 when I bought mine was closer to 30% more performance per dollar.

 
Joined
Jul 23, 2024
Messages
33 (0.13/day)
When we look at DirectX 12, all DirectX 12 technologies are automatically compatible with DirectX 12 GPUs; the latest cards support it as standard, whether they are AMD, Intel, or Nvidia.
If I owned an AMD 7000 or 6000 series card right now, I would use OptiScaler; there are numerous tutorials on YouTube. It lets you enable DLSS/XeSS/FSR 4 in both older and newer games, even though it is beta software.

While the 7900 XTX is a very powerful GPU with much more VRAM, its AI capabilities only go up to FP16, and yes, it is more powerful than the 9070 XT at FP16. But the 9000 series also does FP8, and there it is more powerful than any previous Radeon GPU, because the 7000/6000 series cannot do FP8 at all.

The RX 9000 series uses no chiplets; these are monolithic GPUs, like Nvidia's.
The 9070 XT that I own holds its own at 4K depending on what game you want to play; there are games where, at 4K, not even a 5080 can get more than 50 FPS. In those cases you drop the render resolution to 1440p or 1080p and upscale, whether the GPU is a 5080, a 5070 Ti, or a 9070 XT, and performance will be higher.

Frame multipliers on both AMD and Nvidia really only work well if you already have a minimum average of 30 FPS, and even then I would not use them: first, they put extra load on the GPU; second, they increase latency; third, they drastically increase the GPU's power consumption.

Translated with DeepL.com (free version)

If you already have 30 fps or 60 fps natively, you don't need more; if you can get 80 fps, that would be awesome. If you have an 80 Hz or higher display, 60 fps is actually excellent; any additional generated frames only add latency.

What you need to worry about is whether your GPU can run textures not only at ultra, but also at cinematic quality, which is a higher standard than ultra, at the resolution you're playing at, and whether your GPU is designed for that.

People think ray tracing or path tracing makes their game look better.
To explain: cinematic-quality textures have nothing to do with ray tracing or path tracing. But let's clarify something here: there are games created and optimized for NVIDIA where cinematic-quality textures were removed as an option and only the ray tracing option was included, to justify the technology and imply that ray tracing improves the game's textures. That's not the case. In games not sponsored by NVIDIA, cinematic-quality textures remain an additional option, and ray tracing does nothing more than cast specific light rays and create shadow paths.

Ray tracing and path tracing don't improve textures; they simply add extra lighting to the game. But that won't make a square Minecraft texture into a work of art like the Mona Lisa.

Games sponsored by Nvidia will always be optimized for higher performance on Nvidia, and games sponsored by AMD will be optimized for AMD: Cyberpunk 2077 for Nvidia, for example, and Black Ops 6 for AMD, where the 9070 XT gets +30 FPS over the 5070 Ti and equals a 5080.
 
Joined
Nov 15, 2020
Messages
997 (0.62/day)
System Name 1. Glasshouse 2. Odin OneEye
Processor 1. Ryzen 9 5900X (manual PBO) 2. Ryzen 9 7900X
Motherboard 1. MSI x570 Tomahawk wifi 2. Gigabyte Aorus Extreme 670E
Cooling 1. Noctua NH D15 Chromax Black 2. Custom Loop 3x360mm (60mm) rads & T30 fans/Aquacomputer NEXT w/b
Memory 1. G Skill Neo 16GBx4 (3600MHz 16/16/16/36) 2. Kingston Fury 16GBx2 DDR5 CL36
Video Card(s) 1. Asus Strix Vega 64 2. Powercolor Liquid Devil 7900XTX
Storage 1. Corsair Force MP600 (1TB) & Sabrent Rocket 4 (2TB) 2. Kingston 3000 (1TB) and Hynix p41 (2TB)
Display(s) 1. Samsung U28E590 10bit 4K@60Hz 2. LG C2 42 inch 10bit 4K@120Hz
Case 1. Corsair Crystal 570X White 2. Cooler Master HAF 700 EVO
Audio Device(s) 1. Creative Speakers 2. Built in LG monitor speakers
Power Supply 1. Corsair RM850x 2. Superflower Titanium 1600W
Mouse 1. Microsoft IntelliMouse Pro (grey) 2. Microsoft IntelliMouse Pro (black)
Keyboard Leopold High End Mechanical
Software Windows 11
Poor old MSI. I bet they regret not releasing an AMD GPU, particularly this generation, given how the Nvidia and AMD launches have gone.
 
Joined
Dec 16, 2024
Messages
33 (0.28/day)
"The 9070 XT has been a fantastic success—it's the No. 1 selling AMD Radeon GPU in its first week, with sales 10 times higher than past generations."
Maybe AMD cards simply sold poorly at launch in the past. Maybe this time, after the launch was postponed, many people waited... and it also helped that this was the moment the RTX 50 series disappeared from stores, and the cards that were still available cost several dozen to several hundred percent more.
 
Joined
Aug 12, 2022
Messages
292 (0.30/day)
There are no GPUs in the 9070's performance class in consistent stock in stores, so every 9070 will sell, and the launch was delayed by about 10 weeks. 10x sales in the first week isn't news; it's just what we all assumed.
 
Joined
Nov 14, 2021
Messages
155 (0.12/day)
> "It's the No. 1 selling AMD Radeon GPU in its first week, with sales 10 times higher than past generations."

This is just the exact same thing as Nvidia's "double the amount" claim, comparing your mid-tier launches to your previous flagship launches. And frankly, 10x the first week sales of the 7900 XT and XTX is like taking candy from a baby, nobody wanted those things compared to the 4080 and 4090.
It does say "past generations". Not previous generation. So, if this statement is accurate, we would want to at least include RDNA series. RDNA2 launched the 6800/6800XT first. RDNA the 5700/5700XT launched first.

So there could be some truth. If it did 10x better than the 6800/6800XT, that would be pretty nice.
 
Joined
Dec 9, 2024
Messages
355 (2.86/day)
Location
Namek (actually, missouri)
System Name The
Processor Ryzen 7 5800X
Motherboard ASUS PRIME B550-PLUS AC-HES
Cooling Thermalright Peerless Assassin 120 SE
Memory Silicon Power 32GB (2 x 16GB) DDR4 3200
Video Card(s) RTX 2080S FE | 1060 3GB & 1050Ti 4GB In Storage
Display(s) Gigabyte G27Q (1440p / 170hz DP)
Case SAMA SV01
Power Supply Firehazard in the making
Mouse Corsair Nightsword
Keyboard Steelseries Apex Pro
It does say "past generations". Not previous generation. So, if this statement is accurate, we would want to at least include RDNA series. RDNA2 launched the 6800/6800XT first. RDNA the 5700/5700XT launched first.

So there could be some truth. If it did 10x better than the 6800/6800XT, that would be pretty nice.
I assume they're probably throwing the sales of RDNA 3 in there to make it seem better than it realistically is, compared to RDNA 2, Polaris, etc. But probably not as bad (nor as egregious, I'd assume) as NVIDIA's claims, since AMD seems to like to be intentionally more transparent and honest than NVIDIA to one-up them.

I still think any improvement in sales (like 2x or 4x over RDNA 3) is a huge success though, considering RDNA 3 didn't sell super well compared to previous AMD generations, IIRC. At least not at launch.
 
Joined
May 13, 2024
Messages
69 (0.21/day)
Processor Ryzen 7 5800X3D
Motherboard MSI Pro B550M-VC Wifi
Cooling Thermalright Peerless Assassin 120 SE
Memory 2x16GB G.Skill RipJaws DDR4-3600 CL16
Video Card(s) Asus DUAL OC RTX 4070 Super
Storage 4TB NVME, 2TB SATA SSD, 4TB SATA HDD
Display(s) Dell S2722DGM 27" Curved VA 1440p 165hz
Case Fractal Design Pop Air MIni
Power Supply Corsair RMe 750W 80+ Gold
Mouse Logitech G502 Hero
Keyboard GMMK TKL RGB Black
VR HMD Oculus Quest 2
What’s the 4080 better at than the 7900XTX for 90% of gamers out there? I play the games I want at 4K 60+ FPS with all high settings. I have played with RT/PT but it doesn’t move the scale for me any, it’s still a mediocre implementation of a new technology. Higher resolution textures in games make a far larger difference in visual fidelity.
The 7900XTX vs 4080 when I bought mine was closer to 30% more performance per dollar.
I didn't say anything about performance. I'm talking about sales. And the 7900 XTX and 7900 XT sold HORRIBLY against the 4090 and the 4080 and even the underwhelming 4070-Ti. Even 10x the sales wouldn't have come close to matching the 40 series first week. Same thing with the RX 6000 series. Nobody wanted the RX 6800 and 6800 XT compared to the 3080.

This launch *SHOULD* be different, since they're supposed to be competing in a different price bracket (cheaper cards should mean more sales), AND they actually have a competitive product this time that can match Nvidia on most features.

So when AMD says "10x the first week sales," that's a very, VERY low bar to clear given their past launches. Frankly, only 10x the sales of the 7900 XTX / XT or the 6800 / 6800 XT is worryingly low.

It does say "past generations". Not previous generation. So, if this statement is accurate, we would want to at least include RDNA series. RDNA2 launched the 6800/6800XT first. RDNA the 5700/5700XT launched first.

So there could be some truth. If it did 10x better than the 6800/6800XT, that would be pretty nice.
A comparison to the RDNA 2 launch is even less encouraging! The 6800 / 6800 XT sold worse! The RX 6800 / 6800 XT didn't even show up on the Steam Hardware Survey until 2022, 2 years after they were launched! And that was only because they were literally dirt-cheap and stock was clearing out for the next-gen launch. The RX 6000 flagship launch card literally didn't sell until it was put on fire sale.
 
Joined
Dec 25, 2020
Messages
8,172 (5.21/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000 (5090 shipping to me soon™)
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Sony MDR-V7 connected through Apple USB-C
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
It does say "past generations". Not previous generation. So, if this statement is accurate, we would want to at least include RDNA series. RDNA2 launched the 6800/6800XT first. RDNA the 5700/5700XT launched first.

So there could be some truth. If it did 10x better than the 6800/6800XT, that would be pretty nice.

It's likely accurate. Outside of the Linux community, and perhaps tech forums such as TPU or Guru3D, you would find little love for Radeon, and the only geographical region where AMD has held a consistent level of market presence is Europe. Although definitely not a universal truth, European customers tend to have lighter pockets and are willing to forgo the latest and greatest if it means saving a few euros; it will not have escaped the observant that the most vocal complainers tend to give themselves away by using the € and £ signs.

The timing is right, the product is decent, the pricing is (at least on paper) fair, and perhaps even more importantly, gamers are fed up with Nvidia pushing the boundary on giving customers the least it can for the most money possible. Exceptionally few people are both passionate enough about games to drop 2k+ (realistically 3k+) on an RTX 5090 and wealthy enough to do so, especially given that the AAA market no longer takes any risks. Production costs are sky-high, publishers are risk-averse, and as a result pretty much every new release has become "safe" and, in trying to offend no one, manages to please nobody either.

I wish AMD a well-earned and well-deserved success. We need their horse in this race.

I still think any improvement in sales (like 2x or 4x over RDNA 3) is a huge success though, considering RDNA 3 didn't sell super well compared to previous AMD generations, IIRC. At least not at launch.

RDNA 3 sales picked up after DeepSeek launched; it can utilize the Navi 31/32 hardware very well, and you get the kind of performance AMD was hoping to get in games from the get-go. That makes the 7900 XTX a more than adequate competitor for the 4090... in this workload.
 
Joined
Jun 19, 2024
Messages
680 (2.29/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks
Hint: do you know what the D3D12 in those API calls means?

[attached screenshot of the API calls in question]

Do you know what HLSL is? You might want to Google that. You literally linked to a DirectX API.
Facepalm.

I linked you to the NVAPI documentation, not the DirectX documentation. You didn’t know DirectX is extensible?
 
Joined
Apr 24, 2020
Messages
2,857 (1.57/day)
Facepalm.

I linked you to the NVAPI documentation, not the DirectX documentation. You didn’t know DirectX is extensible?

CUDA is great and all, but virtually all video game code is DirectX, DirectX Raytracing or DirectCompute.

The exceptions are Vulkan, Vulkan Raytracing and Vulkan compute.

CUDA (and HIP) is for AI and Prosumer (ex: DaVinci Resolve, Blender Cycles) applications. Not video games.

What the hell are you arguing with me for? Did you even read my post?
 
Joined
Jun 19, 2024
Messages
680 (2.29/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks
What the hell are you arguing with me for? Did you even read my post?

You're clueless, dude. CUDA is used all over the place in games. Just look at the Nvidia branch of UE to start with. It's open, so assuming you know how to read code, you'll be able to see all the NVAPI calls.

The above assumes you know what an API does of course.
 
Joined
Apr 24, 2020
Messages
2,857 (1.57/day)
There is not a single lick of CUDA in the entire page you linked to. Not one lick.

But don't mind me. I'm just someone who can read CUDA and DirectX.

Do YOU know what cuda<<<>>> calls look like? Because they don't look like the page you selected randomly.
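For readers following along, here is a minimal sketch of what CUDA device code and its <<<>>> launch syntax actually look like (a toy example with invented names, not taken from any game); note there is no ID3D12Device, HLSL, or NVAPI anywhere in it:

    #include <cuda_runtime.h>
    #include <cstdio>

    // Trivial device kernel: adds two arrays element-wise.
    __global__ void addArrays(const float* a, const float* b, float* out, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) out[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1024;
        const size_t bytes = n * sizeof(float);
        float *a, *b, *out;
        cudaMallocManaged(&a, bytes);   // unified memory keeps the sketch short
        cudaMallocManaged(&b, bytes);
        cudaMallocManaged(&out, bytes);
        for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

        // The triple-angle-bracket launch is what marks this as CUDA:
        // it is compiled by nvcc, not by the Direct3D runtime.
        const int threads = 256;
        const int blocks  = (n + threads - 1) / threads;
        addArrays<<<blocks, threads>>>(a, b, out, n);
        cudaDeviceSynchronize();

        printf("out[0] = %f\n", out[0]);
        cudaFree(a); cudaFree(b); cudaFree(out);
        return 0;
    }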
 
Joined
Dec 25, 2020
Messages
8,172 (5.21/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000 (5090 shipping to me soon™)
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Sony MDR-V7 connected through Apple USB-C
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
Since video games are generally a commercial endeavor and must support the existing install base, developers choose to support a baseline and that baseline has almost always been what older AMD hardware with much older drivers can do. Just throwing it out there that video games not using these compute runtimes isn't necessarily a positive: it's a compatibility constraint.
 
Joined
Apr 24, 2020
Messages
2,857 (1.57/day)
Since video games are generally a commercial endeavor and must support the existing install base, developers choose to support a baseline and that baseline has almost always been what older AMD hardware with much older drivers can do. Just throwing it out there that video games not using these compute runtimes isn't necessarily a positive: it's a compatibility constraint.

DirectCompute and Vulkan Compute offer plenty of cross-GPU compute APIs.

If your programmers are already masters of HLSL (aka: DirectX Shader language), it's just easier to write HLSL than to switch everyone to CUDA.

DirectX is a beast. And for all other video game purposes, Vulkan is acceptable (mostly Linux / SteamOS).

If your video game is already cross-GPU compatible because it's HLSL DirectX and you want a big compute shader, the answer is simply DirectCompute (a subcomponent of DirectX). Besides, all your data is already in DirectX Buffer Objects so it's much more natural.
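To make that concrete, here is a minimal sketch of the kind of DirectCompute kernel being described; the HLSL is held in a C++ string purely for illustration (buffer names invented), and the dxc command in the comment is one common way to compile it to DXIL:

    // A minimal HLSL compute shader, as used via DirectCompute. It is compiled
    // to DXIL, e.g. with:  dxc -T cs_6_0 -E main add.hlsl -Fo add.dxil
    // and dispatched through the D3D11/D3D12 compute pipeline - no CUDA involved.
    static const char* kAddKernelHlsl = R"hlsl(
    StructuredBuffer<float>   A   : register(t0);
    StructuredBuffer<float>   B   : register(t1);
    RWStructuredBuffer<float> Out : register(u0);

    [numthreads(256, 1, 1)]
    void main(uint3 id : SV_DispatchThreadID)
    {
        Out[id.x] = A[id.x] + B[id.x];
    }
    )hlsl";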
 
Joined
Dec 25, 2020
Messages
8,172 (5.21/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000 (5090 shipping to me soon™)
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Sony MDR-V7 connected through Apple USB-C
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
DirectCompute and Vulkan Compute offer plenty of cross-GPU compute APIs.

If your programmers are already masters of HLSL (aka: DirectX Shader language), it's just easier to write HLSL than to switch everyone to CUDA.

DirectX is a beast. And for all other video game purposes, Vulkan is acceptable (mostly Linux / SteamOS).

If your video game is already cross-GPU compatible because it's HLSL DirectX and you want a big compute shader, the answer is simply DirectCompute (a subcomponent of DirectX). Besides, all your data is already in DirectX Buffer Objects so it's much more natural.

Here's where my bone to pick with this lies: AMD almost never implements the optional extensions to DirectX in their driver, either. So that point becomes moot pretty fast; the game industry today is almost universally bound to AMD's limitations.
 
Joined
Jun 19, 2024
Messages
680 (2.29/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks
There is not a single lick of CUDA in the entire page you linked to. Not one lick.

But don't mind me. I'm just someone who can read CUDA and DirectX.

Do YOU know what cuda<<<>>> calls look like? Because they don't look like the page you selected randomly.
Dude. NVAPI is implemented in CUDA. How many times do you have to be told?

DirectCompute and Vulkan Compute offer plenty of cross-GPU compute APIs.
Sure, that’s why everyone uses them, right?

Everyone as in nobody of significance.
 
Joined
Apr 24, 2020
Messages
2,857 (1.57/day)
Dude. NVAPI is implemented in CUDA. How many times do you have to be told?

Oh no. You've told me plenty about your ignorance on this subject. That's why this is so amusing to me.

CUDA doesn't need ID3D12Device btw. I'm just waiting to see if you ever notice. At this point, it seems like you're beyond my help though.

FYI: CUDA compiles to PTX. DirectX/HLSL compiles down to DXIL, a completely different technology. If your GPU code is in HLSL, it CANNOT be in CUDA. There are a few interop extensions that let data be shared between the two if you need to, but mixing DirectX and CUDA code is not common at all.

It's one or the other. But please, tell me how little GPU programming I know. It's amusing to me.
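For the curious, here is a rough sketch of what that interop path looks like on the CUDA side, assuming the D3D12 side has already created a shared buffer and exported an NT handle for it with ID3D12Device::CreateSharedHandle (the helper function name is invented, error handling is omitted; illustrative, not production code):

    #include <cuda_runtime.h>
    #include <cstddef>

    // Import an already-shared D3D12 buffer into CUDA and return a device
    // pointer that kernels can use; the same bytes stay visible to D3D12.
    void* mapD3D12BufferIntoCuda(void* sharedNtHandle, size_t sizeInBytes) {
        cudaExternalMemoryHandleDesc memDesc = {};
        memDesc.type                = cudaExternalMemoryHandleTypeD3D12Resource;
        memDesc.handle.win32.handle = sharedNtHandle;   // from CreateSharedHandle
        memDesc.size                = sizeInBytes;
        memDesc.flags               = cudaExternalMemoryDedicated;

        cudaExternalMemory_t extMem = nullptr;
        cudaImportExternalMemory(&extMem, &memDesc);

        cudaExternalMemoryBufferDesc bufDesc = {};
        bufDesc.offset = 0;
        bufDesc.size   = sizeInBytes;

        void* devPtr = nullptr;
        cudaExternalMemoryGetMappedBuffer(&devPtr, extMem, &bufDesc);
        return devPtr;
    }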
Here's where my bone to pick with this lies: AMD almost never implements the optional extensions to DirectX in their driver, either. So that point becomes moot pretty fast; the game industry today is almost universally bound to AMD's limitations.

Hmmm. My experience is that AMD does implement them but they tend to be slower (ex: DirectX Raytracing).
 
Joined
Dec 25, 2020
Messages
8,172 (5.21/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000 (5090 shipping to me soon™)
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Sony MDR-V7 connected through Apple USB-C
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
Hmmm. My experience is that AMD does implement them but they tend to be slower (ex: DirectX Raytracing).

RT isn't optional; without it, hardware won't qualify for DirectX 12 Ultimate. Granted, the situation has greatly improved of late, but some of their legacy API support still suffers quite a bit.
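As a concrete illustration, this is roughly how an engine queries whether the installed driver and hardware reach the raytracing tier that DirectX 12 Ultimate requires, using the standard D3D12 feature-check API (the device pointer is assumed to have been created already; a sketch, not production code):

    #include <windows.h>
    #include <d3d12.h>

    // Returns true if the device reports Raytracing Tier 1.1, the level required
    // (alongside mesh shaders, VRS tier 2 and sampler feedback) for the
    // DirectX 12 Ultimate badge.
    bool SupportsDx12UltimateRaytracing(ID3D12Device* device) {
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
        if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                               &opts5, sizeof(opts5)))) {
            return false;
        }
        return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_1;
    }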
 
Joined
Jun 19, 2024
Messages
680 (2.29/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks
FYI: CUDA compiles to PTX. DirectX/HLSL compiles down to DXIL, a completely different technology.
Those are both intermediate languages - DXIL and PTX are virtual ISAs and do not contain native code. Nvidia's native assembly is SASS; PTX is compiled down to it by the driver's JIT compiler.

It's one or the other. But please, tell me how little GPU programming I know. It's amusing to me.
I don't have to; you're doing perfectly fine on your own. For example, anyone working in PTX would know about SASS, because PTX breakpoints resolve to SASS addresses.
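For anyone who wants to look at both representations themselves, here is a quick way to do it for a throwaway kernel, assuming a standard CUDA toolkit install (exact flags can vary slightly between toolkit versions):

    // saxpy.cu - trivial kernel, used only to inspect the generated code
    __global__ void saxpy(int n, float a, const float* x, float* y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    // Emit the PTX virtual ISA (what the driver later JIT-compiles):
    //   nvcc -arch=sm_86 --ptx saxpy.cu -o saxpy.ptx
    //
    // Build a native cubin for a specific GPU, then dump its SASS machine code:
    //   nvcc -arch=sm_86 --cubin saxpy.cu -o saxpy.cubin
    //   cuobjdump --dump-sass saxpy.cubin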
 
Joined
Jul 9, 2015
Messages
3,563 (1.00/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Good stuff, now please backport FSR 4 to more existing titles.



RT isn't optional, without it, hardware won't qualify for DirectX 12 Ultimate.
And who will give a flying F about it?
 