> What is wrong with the 6500 XT for gaming?

It would be great if you could run Cyberpunk with ray tracing on High at 35 FPS.
System Name | Nebulon B |
---|---|
Processor | AMD Ryzen 7 7800X3D |
Motherboard | MSi PRO B650M-A WiFi |
Cooling | be quiet! Dark Rock 4 |
Memory | 2x 24 GB Corsair Vengeance DDR5-4800 |
Video Card(s) | AMD Radeon RX 6750 XT 12 GB |
Storage | 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2 |
Display(s) | Dell S3422DWG, 7" Waveshare touchscreen |
Case | Kolink Citadel Mesh black |
Audio Device(s) | Logitech Z333 2.1 speakers, AKG Y50 headphones |
Power Supply | Seasonic Prime GX-750 |
Mouse | Logitech MX Master 2S |
Keyboard | Logitech G413 SE |
Software | Bazzite (Fedora Linux) KDE |
> It would be great if you could run Cyberpunk with ray tracing on High at 35 FPS.

That's not gonna happen with a budget graphics card. Not in 2022, anyway.
> 3DMark performance shows possible potential. Games show current reality.

Why would anyone expect anything different, to be honest? When talking about gaming performance, you need to factor in the drivers, which I doubt even Intel has the resources to pull off for a brand-new architecture.

In any case, Intel will be selling millions of these to OEMs, to be used in their prebuilt systems, meaning that cards like Nvidia's MX line and AMD's RX 6400/6500 XT are out of Intel-based systems. And that's what Intel cares about. Those 3DMark scores are enough to convince consumers that they are getting a fast card.

Now, if only someone could clear things up about Arc's hardware compatibility, that would be nice. Let's hope Intel doesn't start a new trend of cards being incompatible with some systems. If they do start that kind of trend, then I'd wish they had never re-entered the market, and that they fail completely.
System Name | Nebulon B |
---|---|
Processor | AMD Ryzen 7 7800X3D |
Motherboard | MSi PRO B650M-A WiFi |
Cooling | be quiet! Dark Rock 4 |
Memory | 2x 24 GB Corsair Vengeance DDR5-4800 |
Video Card(s) | AMD Radeon RX 6750 XT 12 GB |
Storage | 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2 |
Display(s) | Dell S3422DWG, 7" Waveshare touchscreen |
Case | Kolink Citadel Mesh black |
Audio Device(s) | Logitech Z333 2.1 speakers, AKG Y50 headphones |
Power Supply | Seasonic Prime GX-750 |
Mouse | Logitech MX Master 2S |
Keyboard | Logitech G413 SE |
Software | Bazzite (Fedora Linux) KDE |
> Why would anyone expect anything different, to be honest? When talking about gaming performance, you need to factor in the drivers, which I doubt even Intel has the resources to pull off for a brand-new architecture.

It's not brand new. It's based on the current-gen Xe, which you can find in Rocket Lake / Alder Lake CPUs.
System Name | 3 desktop systems: Gaming / Internet / HTPC |
---|---|
Processor | Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500 |
Motherboard | X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) |
Cooling | Aigo ICE 400SE / Segotep T4 / Noctua U12S |
Memory | Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200 |
Video Card(s) | ASRock RX 6600 + GT 710 (PhysX) / Vega 7 integrated / Radeon RX 580 |
Storage | NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage |
Display(s) | Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5 |
Case | Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard |
Audio Device(s) | onboard |
Power Supply | Chieftec 850W / Silver Power 400W / Sharkoon 650W |
Mouse | CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech |
Keyboard | CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech |
Software | Windows 10 / Windows 10&Windows 11 / Windows 10 |
> Why would anyone expect anything different, to be honest? […]

Have I said something different?

> It's not brand new. It's based on the current-gen Xe, which you can find in Rocket Lake / Alder Lake CPUs.

It's not the same. For example, consider a case where an Intel iGPU has huge bugs when feature A is enabled in a game. If that feature also reduces the framerate from 20 fps to 10 fps, gamers will just avoid enabling it because of the performance hit, not because of the bugs. If a gamer wants to enable it anyway, a tech support person can still insist that the solution is simply to "disable feature A so the game runs at reasonable framerates". Also, a game running at low fps because of a lack of optimization will probably pass unnoticed, with the majority thinking it's normal for a slow iGPU to perform like that.
System Name | Nebulon B |
---|---|
Processor | AMD Ryzen 7 7800X3D |
Motherboard | MSi PRO B650M-A WiFi |
Cooling | be quiet! Dark Rock 4 |
Memory | 2x 24 GB Corsair Vengeance DDR5-4800 |
Video Card(s) | AMD Radeon RX 6750 XT 12 GB |
Storage | 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2 |
Display(s) | Dell S3422DWG, 7" Waveshare touchscreen |
Case | Kolink Citadel Mesh black |
Audio Device(s) | Logitech Z333 2.1 speakers, AKG Y50 headphones |
Power Supply | Seasonic Prime GX-750 |
Mouse | Logitech MX Master 2S |
Keyboard | Logitech G413 SE |
Software | Bazzite (Fedora Linux) KDE |
> It's not the same. For example, consider a case where an Intel iGPU has huge bugs when feature A is enabled in a game. […]

I'll just say what I have said in many other related threads: there's no reason to be overly negative or positive - we'll see when it comes out.

But when someone is trying to be competitive in the discrete GPU market, they can't avoid situations like this. They will have to fix the bugs, and they will have to optimize performance. While Intel has been building GPUs, and drivers for them, for decades, I doubt they have thrown the necessary resources at optimization and bug fixing. That "heavy optimization and fixing ALL bugs" situation is probably "brand new" for Intel's graphics department.
Processor | AMD Ryzen 9 5900X ||| Intel Core i7-3930K |
---|---|
Motherboard | ASUS ProArt B550-CREATOR ||| Asus P9X79 WS |
Cooling | Noctua NH-U14S ||| Be Quiet Pure Rock |
Memory | Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz |
Video Card(s) | MSI GTX 1060 3GB ||| MSI GTX 680 4GB |
Storage | Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB |
Display(s) | Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24" |
Case | Fractal Design Define 7 XL x 2 |
Audio Device(s) | Cambridge Audio DacMagic Plus |
Power Supply | Seasonic Focus PX-850 x 2 |
Mouse | Razer Abyssus |
Keyboard | CM Storm QuickFire XT |
Software | Ubuntu |
> But when someone is trying to be competitive in the discrete GPU market, they can't avoid situations like this. They will have to fix the bugs, and they will have to optimize performance. […]

If you take a GPU architecture that works reasonably well and scale it up, say, 10x, but the performance doesn't scale accordingly, then you have a hardware problem, not a driver problem. The driver actually does far less than you think, and has fairly little to do with the scale of the GPU. Nvidia and AMD scale fairly consistently from low-end GPUs with just a few "cores" up to massive GPUs on the very same driver - Pascal, for example, spans from the GT 1010 with 256 cores up to the Titan Xp with 3840. The reason this works is that the management of hardware resources is done by the GPU scheduler: allocating (GPU) threads, queuing memory operations, and so on. If these things were done by the driver, the CPU overhead would grow with GPU size, and large GPUs would simply not perform at all.
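To put that division of labour in concrete terms, here is a minimal C sketch - a toy model with invented names, not real driver code. The CPU-side cost grows with the number of commands recorded, while the GPU's core count never appears anywhere; fanning the work out across 256 or 3840 cores is the hardware scheduler's job.

```c
#include <stdio.h>

/* Toy model: the driver's CPU-side job is to record commands into a
   buffer. The GPU's core count appears nowhere below - distributing
   the recorded work across the shader cores is done by the GPU's own
   hardware scheduler, not by this code. */
typedef struct {
    unsigned vertex_count;
    unsigned first_vertex;
} DrawCmd;

int main(void)
{
    DrawCmd buffer[1024];
    unsigned n = 0;

    /* CPU cost: O(number of draw calls), O(1) per command -
       the same whether the GPU has 256 cores or 3840. */
    for (unsigned i = 0; i < 1024; i++) {
        buffer[n].vertex_count = 3000;
        buffer[n].first_vertex = 0;
        n++;
    }
    printf("recorded %u commands for the GPU scheduler to consume\n", n);
    return 0;
}
```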
System Name | 3 desktop systems: Gaming / Internet / HTPC |
---|---|
Processor | Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500 |
Motherboard | X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) |
Cooling | Aigo ICE 400SE / Segotep T4 / Noctua U12S |
Memory | Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200 |
Video Card(s) | ASRock RX 6600 + GT 710 (PhysX) / Vega 7 integrated / Radeon RX 580 |
Storage | NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage |
Display(s) | Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5 |
Case | Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard |
Audio Device(s) | onboard |
Power Supply | Chieftec 850W / Silver Power 400W / Sharkoon 650W |
Mouse | CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech |
Keyboard | CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech |
Software | Windows 10 / Windows 10&Windows 11 / Windows 10 |
> If you take a GPU architecture that works reasonably well and scale it up, say, 10x, but the performance doesn't scale accordingly, then you have a hardware problem, not a driver problem. […]

I wasn't describing what you understood. You didn't understand my point, and probably my English is the problem here.
My point is, Intel's architecture is not fundamentally new and they have a working driver from their integrated graphics, so if they have problems with scalability then it's a hardware issue.
I'm not saying there can't be minor bugs and tweaks to the driver, but the bigger problem lies in hardware, and will probably take them a couple more iterations to sort out.
Don't buy a product expecting the drivers to suddenly add performance later, that has not panned out well in the past.
Processor | AMD Ryzen 9 5900X ||| Intel Core i7-3930K |
---|---|
Motherboard | ASUS ProArt B550-CREATOR ||| Asus P9X79 WS |
Cooling | Noctua NH-U14S ||| Be Quiet Pure Rock |
Memory | Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz |
Video Card(s) | MSI GTX 1060 3GB ||| MSI GTX 680 4GB |
Storage | Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB |
Display(s) | Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24" |
Case | Fractal Design Define 7 XL x 2 |
Audio Device(s) | Cambridge Audio DacMagic Plus |
Power Supply | Seasonic Focus PX-850 x 2 |
Mouse | Razer Abyssus |
Keyboard | CM Storm QuickFire XT |
Software | Ubuntu |
> I wasn't describing a scaling problem. I was saying that building graphics drivers for low-performing iGPUs is probably very different from building drivers for discrete GPUs. You can bypass/ignore some driver issues when you support "free" and slow iGPUs; you can't when you support expensive discrete GPUs.

I know. I was trying to show you (and others in this thread) who assume that a driver for an integrated GPU and a driver for a dedicated GPU would be fundamentally different that, in reality, they are mostly the same. The main difference lies in the hardware and the firmware that controls it. That's why I mentioned that Nvidia has low-end GPUs performing roughly comparably to integrated GPUs, and high-end GPUs, all running the very same driver - and the same goes for AMD, which also runs the same driver for its integrated GPUs. So it's important to understand that this scaling has little to nothing to do with the driver.
/* The kind of per-draw API call sequence the driver receives and must translate: */
glBindTexture(GL_TEXTURE_2D, ...);          /* bind a texture object */
glBindBuffer(GL_ARRAY_BUFFER, ...);         /* bind vertex data */
glVertexAttribPointer(...);                 /* describe the vertex layout */
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ...); /* bind index data */
glDrawElements(GL_TRIANGLES, ...);          /* submit the draw call */
> Let's say that Intel is producing only iGPUs, and the iGPUs are performing poorly in game title A and also have a bug (image corruption) with graphics setting X in that game. Do you throw resources at optimizing the driver for game title A to move the fps from 20 to 22, and also fix graphics setting X, especially when enabling that setting means dropping the framerate from 20 fps to 12 fps? Probably not. If that game is a triple-A title, you might spend resources to optimize it, but at the same time the solution for graphics setting X will simply be to ask gamers to keep it disabled (if the bug is difficult to fix). If that game is a less advertised one, you probably wouldn't even spend resources to move that fps counter from 20 to 22.

Drivers aren't really optimized for specific games, at least not the way you think. When you see driver updates offering up to X% more performance in <selected title>, it's usually tweaking the game profiles or sometimes overriding shader programs. These aren't really optimizations so much as "cheating": trying to reduce image quality very slightly to get a few percent more performance in benchmarks.
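For what it's worth, such a "game profile" usually amounts to little more than a lookup table of per-title workarounds, not per-game rendering code. Here is a hypothetical C sketch of the idea (the structure and all names are invented for illustration; no vendor publishes its real profile format):

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical per-title profile: workaround flags keyed by executable
   name. All names below are invented. */
typedef struct {
    const char *exe_name;        /* executable to match                    */
    int allow_lower_precision;   /* trade a bit of image quality for speed */
    const char *shader_override; /* replacement shader blob, if any        */
} GameProfile;

static const GameProfile profiles[] = {
    { "game_a.exe", 1, "game_a_fastpath.bin" },
    { "game_b.exe", 0, NULL },
};

static const GameProfile *find_profile(const char *exe)
{
    for (size_t i = 0; i < sizeof profiles / sizeof profiles[0]; i++)
        if (strcmp(profiles[i].exe_name, exe) == 0)
            return &profiles[i];
    return NULL; /* no profile: the generic driver path is used */
}

int main(void)
{
    const GameProfile *p = find_profile("game_a.exe");
    printf("profile found: %s\n", p ? "yes" : "no");
    return 0;
}
```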
System Name | 3 desktop systems: Gaming / Internet / HTPC |
---|---|
Processor | Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500 |
Motherboard | X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) |
Cooling | Aigo ICE 400SE / Segotep T4 / Noctua U12S |
Memory | Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200 |
Video Card(s) | ASRock RX 6600 + GT 710 (PhysX) / Vega 7 integrated / Radeon RX 580 |
Storage | NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage |
Display(s) | Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5 |
Case | Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard |
Audio Device(s) | onboard |
Power Supply | Chieftec 850W / Silver Power 400W / Sharkoon 650W |
Mouse | CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech |
Keyboard | CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech |
Software | Windows 10 / Windows 10&Windows 11 / Windows 10 |
> AMD have offered horrible OpenGL support for ages.

Why? Probably because it was not their priority? Just asking. How many games out there need OpenGL? Probably very few. On the other hand, I guess there are pro apps using OpenGL. Intel was targeting office PCs, so OpenGL could be more important for them.
> The overall quality and stability of Intel's drivers have been better than AMD's for years.

If you don't try to optimize for every game, app, or probable scenario out there, and don't implement a gazillion features in your drivers, I guess you have a better chance of offering something more stable. Much simpler, but more stable. Also, people who have an integrated Intel GPU but do all their work on a discrete Nvidia or AMD GPU will, I bet, have no problems with their Intel iGPUs. Because, well, they are disabled.
> Drivers aren't really optimized for specific games, […]

Well, in Intel's driver FAQ you will read about games crashing and image quality problems. So Intel might have thrown resources at their media engine, OpenGL performance, and driver stability in office applications, but it doesn't look like they cared about games. They have to now that they are trying to become a discrete graphics card maker. That's what I have been saying all along, and while you started your post by saying you understand my point, I am not sure you do.
System Name | RyzenGtEvo/ Asus strix scar II |
---|---|
Processor | Amd R5 5900X/ Intel 8750H |
Motherboard | Crosshair hero8 impact/Asus |
Cooling | 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK |
Memory | Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB |
Video Card(s) | Powercolour RX7900XT Reference/Rtx 2060 |
Storage | Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme |
Display(s) | Samsung UAE28"850R 4k freesync.dell shiter |
Case | Lianli 011 dynamic/strix scar2 |
Audio Device(s) | Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset |
Power Supply | corsair 1200Hxi/Asus stock |
Mouse | Roccat Kova/ Logitech G wireless |
Keyboard | Roccat Aimo 120 |
VR HMD | Oculus rift |
Software | Win 10 Pro |
Benchmark Scores | 8726 vega 3dmark timespy/ laptop Timespy 6506 |
Processor | AMD Ryzen 9 5900X ||| Intel Core i7-3930K |
---|---|
Motherboard | ASUS ProArt B550-CREATOR ||| Asus P9X79 WS |
Cooling | Noctua NH-U14S ||| Be Quiet Pure Rock |
Memory | Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz |
Video Card(s) | MSI GTX 1060 3GB ||| MSI GTX 680 4GB |
Storage | Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB |
Display(s) | Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24" |
Case | Fractal Design Define 7 XL x 2 |
Audio Device(s) | Cambridge Audio DacMagic Plus |
Power Supply | Seasonic Focus PX-850 x 2 |
Mouse | Razer Abyssus |
Keyboard | CM Storm QuickFire XT |
Software | Ubuntu |
> Why? Probably because it was not their priority? Just asking. How many games out there need OpenGL?

While DirectX is certainly more widespread, there are "minor" successes such as Minecraft (the original version), most indie games, and most emulators. Considering that AMD has really struggled to maintain market share for the past decade, and has had decent value options, this should have been pretty low-hanging fruit to gain some extra percentage points of market share. And as for the stability issues of AMD drivers, those are not limited to OpenGL, and have been a persistent problem for over a decade (we keep hearing about it every time there is new hardware).
> If you don't try to optimize for every game, app, or probable scenario out there, and don't implement a gazillion features in your drivers […]

Well, the answer is they don't; that's the point you still can't grasp.
System Name | 3 desktop systems: Gaming / Internet / HTPC |
---|---|
Processor | Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500 |
Motherboard | X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) |
Cooling | Aigo ICE 400SE / Segotep T4 / Noctua U12S |
Memory | Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200 |
Video Card(s) | ASRock RX 6600 + GT 710 (PhysX) / Vega 7 integrated / Radeon RX 580 |
Storage | NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage |
Display(s) | Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5 |
Case | Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard |
Audio Device(s) | onboard |
Power Supply | Chieftec 850W / Silver Power 400W / Sharkoon 650W |
Mouse | CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech |
Keyboard | CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech |
Software | Windows 10 / Windows 10&Windows 11 / Windows 10 |
> Well, the answer is they don't; that's the point you still can't grasp.

Well, don't worry, I can see where you're going - or, to be more accurate, where you're standing.
System Name | Nebulon B |
---|---|
Processor | AMD Ryzen 7 7800X3D |
Motherboard | MSi PRO B650M-A WiFi |
Cooling | be quiet! Dark Rock 4 |
Memory | 2x 24 GB Corsair Vengeance DDR5-4800 |
Video Card(s) | AMD Radeon RX 6750 XT 12 GB |
Storage | 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2 |
Display(s) | Dell S3422DWG, 7" Waveshare touchscreen |
Case | Kolink Citadel Mesh black |
Audio Device(s) | Logitech Z333 2.1 speakers, AKG Y50 headphones |
Power Supply | Seasonic Prime GX-750 |
Mouse | Logitech MX Master 2S |
Keyboard | Logitech G413 SE |
Software | Bazzite (Fedora Linux) KDE |
> Why does Arc perform on par with the competition in 3DMark but lose badly in games?

I'm not a programmer by far, but from an average user's point of view, I'd say 3DMark stresses a very specific part of your hardware. I don't know what it is, but I see all of my graphics cards behaving very differently under 3DMark compared to games in terms of clock speed, power consumption, etc. The part of an Arc GPU that 3DMark stresses the most must be strong, while other parts of it fall behind the competition. Games, on the other hand, use a much broader range of your hardware's capabilities. To put it simply: 3DMark is designed to stress a specific part of your hardware; games are designed to use whatever you have.

> Why are most bugs in Arc bugs that lead to a crash of the application, or to texture corruption?

Does it really do that? Do you have sources? If so, I believe it must be some bug in the driver that can be ironed out - not an issue of optimisation. But I'm curious about a proper answer, as I don't know much about driver code myself.
System Name | 3 desktop systems: Gaming / Internet / HTPC |
---|---|
Processor | Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500 |
Motherboard | X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) |
Cooling | Aigo ICE 400SE / Segotep T4 / Noctua U12S |
Memory | Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200 |
Video Card(s) | ASRock RX 6600 + GT 710 (PhysX) / Vega 7 integrated / Radeon RX 580 |
Storage | NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage |
Display(s) | Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5 |
Case | Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard |
Audio Device(s) | onboard |
Power Supply | Chieftec 850W / Silver Power 400W / Sharkoon 650W |
Mouse | CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech |
Keyboard | CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech |
Software | Windows 10 / Windows 10&Windows 11 / Windows 10 |
> Do you have sources?

Go to AMD's, Intel's, and Nvidia's pages, navigate to the latest version of the driver, and don't download the driver - just read the release notes.
System Name | Nebulon B |
---|---|
Processor | AMD Ryzen 7 7800X3D |
Motherboard | MSi PRO B650M-A WiFi |
Cooling | be quiet! Dark Rock 4 |
Memory | 2x 24 GB Corsair Vengeance DDR5-4800 |
Video Card(s) | AMD Radeon RX 6750 XT 12 GB |
Storage | 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2 |
Display(s) | Dell S3422DWG, 7" Waveshare touchscreen |
Case | Kolink Citadel Mesh black |
Audio Device(s) | Logitech Z333 2.1 speakers, AKG Y50 headphones |
Power Supply | Seasonic Prime GX-750 |
Mouse | Logitech MX Master 2S |
Keyboard | Logitech G413 SE |
Software | Bazzite (Fedora Linux) KDE |
> Go to AMD's, Intel's, and Nvidia's pages, navigate to the latest version of the driver, and don't download the driver - just read the release notes.

A fair point. Personally, I think that's down to how the driver communicates with the API, and the specific portions of the API the game uses. Like I said: bugs that can be ironed out. It's not an "optimisation" thing.
Processor | AMD Ryzen 9 5900X ||| Intel Core i7-3930K |
---|---|
Motherboard | ASUS ProArt B550-CREATOR ||| Asus P9X79 WS |
Cooling | Noctua NH-U14S ||| Be Quiet Pure Rock |
Memory | Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz |
Video Card(s) | MSI GTX 1060 3GB ||| MSI GTX 680 4GB |
Storage | Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB |
Display(s) | Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24" |
Case | Fractal Design Define 7 XL x 2 |
Audio Device(s) | Cambridge Audio DacMagic Plus |
Power Supply | Seasonic Focus PX-850 x 2 |
Mouse | Razer Abyssus |
Keyboard | CM Storm QuickFire XT |
Software | Ubuntu |
> Why does Arc perform on par with the competition in 3DMark but lose badly in games?

AusWolf's reply is pretty good in layman's terms.

> Why are most bugs in Arc bugs that lead to a crash of the application, or to texture corruption? In AMD's and Nvidia's driver FAQs you will read about strange behaviours when doing very specific stuff. In the Arc FAQ, half the bugs are about an application crash, or broken textures, just from running the game.

If there is texture corruption across multiple games, and the same games don't have the same problem on other hardware, then it means the driver doesn't behave according to spec. Finding the underlying reason would require more details, though; it could be either the driver or the hardware. This might surprise you, but when it comes to software bugs it's actually better if a bug occurs across many use cases. That usually means the bug is easier to reproduce and precisely locate. Such bugs are usually caught and fixed once there are enough testers. A rare and obscure bug is in many ways worse, as it leads to very poor bug reports, which in turn leads to large efforts to find it.
System Name | 3 desktop systems: Gaming / Internet / HTPC |
---|---|
Processor | Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500 |
Motherboard | X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) |
Cooling | Aigo ICE 400SE / Segotep T4 / Noctua U12S |
Memory | Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200 |
Video Card(s) | ASRock RX 6600 + GT 710 (PhysX) / Vega 7 integrated / Radeon RX 580 |
Storage | NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage |
Display(s) | Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5 |
Case | Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard |
Audio Device(s) | onboard |
Power Supply | Chieftec 850W / Silver Power 400W / Sharkoon 650W |
Mouse | CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech |
Keyboard | CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech |
Software | Windows 10 / Windows 10&Windows 11 / Windows 10 |
> I don't think the performance scores here have any use to consumers.

Those are the numbers that will be printed on advertisement material. That's why Intel is concentrating on those apps. While you say optimization is a myth, it seems Intel is focusing on that myth.

> If there is texture corruption across multiple games, and the same games don't have the same problem on other hardware, then it means the driver doesn't behave according to spec. […]

I guess I have to provide a link after all:
DRIVER VERSION: 30.0.101.1736
DATE: June 14, 2022
GAMING HIGHLIGHTS:
• Launch driver for Intel® Arc™ A380 Graphics (Codename Alchemist).
• Intel® Game On Driver support for Redout 2*, Resident Evil 2*, Resident Evil 3*, and Resident Evil 7: Biohazard* on Intel® Arc™ A-Series Graphics.
• Get a front row pass to gaming deals, contests, betas, and more with Intel Software Gaming Access.
FIXED ISSUES:
• Far Cry 6* (DX12) may experience texture corruption in water surfaces during gameplay.
• Destiny 2* (DX11) may experience texture corruption on some rock surfaces during gameplay.
• Naraka: Bladepoint* (DX11) may experience an application crash or become unresponsive during training mode.
KNOWN ISSUES:
• Metro Exodus: Enhanced Edition* (DX12), Horizon Zero Dawn* (DX12), Call of Duty: Vanguard* (DX12), Tom Clancy’s Ghost Recon Breakpoint* (DX11), Strange Brigade* (DX12) and Forza Horizon 5* (DX12) may experience texture corruption during gameplay.
• Tom Clancy’s Rainbow Six Siege* (DX11) may experience texture corruption in the Emerald Plains map when ultra settings are enabled in game. A workaround is to select the Vulkan API in game settings.
• Gears 5* (DX12) may experience an application crash, system hang or TDR during gameplay.
• Sniper Elite 5* may experience an application crash on some Hybrid Graphics system configurations when the Windows® “Graphics Performance Preference” option for the application is not set to “High Performance”.
• Call of Duty: Black Ops Cold War* (DX12) may experience an application crash during gameplay.
• Map textures may fail to load or may load as blank surfaces when playing CrossFire*.
• Some objects and textures in Halo Infinite* (DX12) may render black and fail to load. Lighting may also appear blurry or overexposed in the multiplayer game menus.
System Name | Nebulon B |
---|---|
Processor | AMD Ryzen 7 7800X3D |
Motherboard | MSi PRO B650M-A WiFi |
Cooling | be quiet! Dark Rock 4 |
Memory | 2x 24 GB Corsair Vengeance DDR5-4800 |
Video Card(s) | AMD Radeon RX 6750 XT 12 GB |
Storage | 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2 |
Display(s) | Dell S3422DWG, 7" Waveshare touchscreen |
Case | Kolink Citadel Mesh black |
Audio Device(s) | Logitech Z333 2.1 speakers, AKG Y50 headphones |
Power Supply | Seasonic Prime GX-750 |
Mouse | Logitech MX Master 2S |
Keyboard | Logitech G413 SE |
Software | Bazzite (Fedora Linux) KDE |
> Those are the numbers that will be printed on advertisement material. That's why Intel is concentrating on those apps. While you say optimization is a myth, it seems Intel is focusing on that myth.

Even if a certain architecture performs better in one app than another, there's nothing to suggest that it's due to a magical driver rather than the hardware itself.

> I guess I have to provide a link after all:

No one said that there can't be bugs in the driver-API communication. AMD is notorious for leaving bugs in for a long time. The argument was that these bugs in no way mean that games are "optimised" for a certain architecture or, god forbid, manufacturer.

What doesn't surprise me is how the glass is half empty or half full, depending on the situation.
System Name | 3 desktop systems: Gaming / Internet / HTPC |
---|---|
Processor | Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500 |
Motherboard | X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) |
Cooling | Aigo ICE 400SE / Segotep T4 / Noctua U12S |
Memory | Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200 |
Video Card(s) | ASRock RX 6600 + GT 710 (PhysX) / Vega 7 integrated / Radeon RX 580 |
Storage | NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage |
Display(s) | Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5 |
Case | Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard |
Audio Device(s) | onboard |
Power Supply | Chieftec 850W / Silver Power 400W / Sharkoon 650W |
Mouse | CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech |
Keyboard | CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech |
Software | Windows 10 / Windows 10&Windows 11 / Windows 10 |
> Even if a certain architecture performs better in one app than another, there's nothing to suggest that it's due to a magical driver rather than the hardware itself.

A driver does play a role. It's not a myth. When a new driver fixes performance in a game or in multiple games, then something was changed in that driver. What was that? I am NOT a driver developer. Are you? A lack of knowledge doesn't mean that the phrase "nothing to suggest" has any real value here. A man from 100 BC would insist that there is "nothing to suggest" that a 10-tonne helicopter stays in the air by pushing air down with its rotor blades, lacking all the necessary knowledge of physics.

> AMD CPUs have been famous for being better at productivity apps, while Intel is (or used to be) better at games. Is this due to some driver magic as well?

AMD CPUs have been famous for being better at productivity apps because they had more cores, until Alder Lake. On the other hand, Intel almost always had the advantage in IPC, and many apps were also optimized for Intel CPUs, not AMD CPUs.

> No one said that there can't be bugs in the driver-API communication. AMD is notorious for leaving bugs in for a long time. The argument was that these bugs in no way mean that games are "optimised" for a certain architecture or, god forbid, manufacturer.

I am not going to comment on the "notorious" AMD. It's boring after so many years of reading the same stuff. People having the need to bash AMD even when using its products is not my area of expertise. I am also not going to play with words with someone who will never accept anything different. I have been reading for decades, even from Intel/AMD/Nvidia representatives, about app and game optimizations, and about apps/games being developed on specific platforms. I have seen how Nvidia's perfect image was ruined for a year or two, somewhere around 2014 I think, when games were optimized for the consoles, meaning GCN, and the PC versions had a gazillion problems - especially those games paid by Nvidia to implement GameWorks in their PC versions.
System Name | Nebulon B |
---|---|
Processor | AMD Ryzen 7 7800X3D |
Motherboard | MSi PRO B650M-A WiFi |
Cooling | be quiet! Dark Rock 4 |
Memory | 2x 24 GB Corsair Vengeance DDR5-4800 |
Video Card(s) | AMD Radeon RX 6750 XT 12 GB |
Storage | 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2 |
Display(s) | Dell S3422DWG, 7" Waveshare touchscreen |
Case | Kolink Citadel Mesh black |
Audio Device(s) | Logitech Z333 2.1 speakers, AKG Y50 headphones |
Power Supply | Seasonic Prime GX-750 |
Mouse | Logitech MX Master 2S |
Keyboard | Logitech G413 SE |
Software | Bazzite (Fedora Linux) KDE |
> A driver does play a role. It's not a myth. When a new driver fixes performance in a game or in multiple games, then something was changed in that driver. […]

I'm not a driver developer either, but I'm willing to learn from someone who knows a lot more about the topic than I do. For example:

> Drivers aren't really optimized for specific games, at least not the way you think. […] When they do real performance optimizations, it's usually one of these:
>
> a) General API overhead (tied to the internal state machine of an API) - will affect anything that uses this API.
>
> b) Overhead of a specific API call or parameter - will affect anything that uses this API call.
>
> So therefore, I reject your premise of optimizing performance for a specific title.

This. @efikkan presented a clear explanation with technical details as to why his claim is right. You didn't.
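To illustrate what category a) can look like in practice, here is a small hypothetical C sketch (function and variable names invented) of one classic state-machine optimization: filtering out redundant binds before they reach the expensive validation path. Note that it benefits every application using the API, not one particular game.

```c
#include <GL/gl.h>

/* Hypothetical sketch of a category-a) optimization: skip binds that
   would not change the API's internal state, saving per-call validation
   work for everything that uses the API. Names are invented. */
static GLuint currently_bound_texture = 0;

void bind_texture_cached(GLuint tex)
{
    if (tex == currently_bound_texture)
        return;                        /* redundant bind: nothing to do   */
    glBindTexture(GL_TEXTURE_2D, tex); /* real state change goes through  */
    currently_bound_texture = tex;
}
```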
> AMD CPUs have been famous for being better at productivity apps because they had more cores, until Alder Lake. On the other hand, Intel almost always had the advantage in IPC, and many apps were also optimized for Intel CPUs, not AMD CPUs.

There you go. That's down to differences in the hardware, isn't it?
> I am not going to comment on the "notorious" AMD. It's boring after so many years of reading the same stuff. […]

1. You clearly misread my point. I never intended to criticise AMD. I merely stated the fact that bugs CAN be found in a driver, like in any software. It's not proof that drivers are specifically optimised for certain games.
System Name | 3 desktop systems: Gaming / Internet / HTPC |
---|---|
Processor | Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500 |
Motherboard | X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) |
Cooling | Aigo ICE 400SE / Segotep T4 / Noctua U12S |
Memory | Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200 |
Video Card(s) | ASRock RX 6600 + GT 710 (PhysX) / Vega 7 integrated / Radeon RX 580 |
Storage | NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage |
Display(s) | Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5 |
Case | Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard |
Audio Device(s) | onboard |
Power Supply | Chieftec 850W / Silver Power 400W / Sharkoon 650W |
Mouse | CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech |
Keyboard | CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech |
Software | Windows 10 / Windows 10&Windows 11 / Windows 10 |
> This. @efikkan presented a clear explanation with technical details as to why his claim is right. You didn't.

No, he didn't. He just wrote a lot of stuff that isn't necessarily on topic or correct. If you know NOTHING about driver development, how can you assume that what he wrote is in fact correct? You can't. And he was trying to support a specific argument while constantly changing the point of view, which in my book doesn't make him objective or his arguments correct. You can give him all the credit you want, seeing that he supports your idea of a notorious AMD, but I am someone who needs more specific and more concrete arguments than five lines of code.
System Name | Tiny the White Yeti |
---|---|
Processor | 7800X3D |
Motherboard | MSI MAG Mortar b650m wifi |
Cooling | CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3 |
Memory | 32GB Corsair Vengeance 30CL6000 |
Video Card(s) | ASRock RX7900XT Phantom Gaming |
Storage | Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB |
Display(s) | Gigabyte G34QWC (3440x1440) |
Case | Lian Li A3 mATX White |
Audio Device(s) | Harman Kardon AVR137 + 2.1 |
Power Supply | EVGA Supernova G2 750W |
Mouse | Steelseries Aerox 5 |
Keyboard | Lenovo Thinkpad Trackpoint II |
VR HMD | HD 420 - Green Edition ;) |
Software | W11 IoT Enterprise LTSC |
Benchmark Scores | Over 9000 |
Well, I suppose it would be more accurate to say that it cripples performance on non-Nvidia platforms, but the end result is the same.
> Well, don't worry, I can see where you're going - or, to be more accurate, where you're standing.
>
> Anyway, let's keep the questions simple here:
>
> Why does Arc perform on par with the competition in 3DMark but lose badly in games?
>
> Why are most bugs in Arc bugs that lead to a crash of the application, or to texture corruption? In AMD's and Nvidia's driver FAQs you will read about strange behaviours when doing very specific stuff. In the Arc FAQ, half the bugs are about an application crash, or broken textures, just from running the game.

The fact that we don't have ready-made answers, only guesses, for these questions is quite simply because we don't know for sure. Perhaps there are monkeys disguised as humans building their code. Perhaps they have hardware issues that they are working around as we speak. Workarounds are going to be inefficient.
> A driver does play a role. It's not a myth.

It does. Here's a car analogy: the driver is the DRIVER, but the car is the car. It has limits; it can accelerate to 100 in a defined number of seconds. But if the driver of the car is crap at shifting gears, it certainly won't meet that spec. A better driver - or, to use the at-one-point-implemented shader cache as an example, a more experienced driver who has driven the car a few times in that situation - will know exactly when to shift gears, and therefore meets the spec.
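The shader cache mentioned above boils down to a simple idea: hash the shader source once, and on later runs reuse the stored compiled binary instead of paying the compilation cost again. A minimal, hypothetical C sketch of that idea (all names invented):

```c
#include <stddef.h>

/* Hypothetical shader-cache sketch: key = hash of the shader source,
   value = the binary compiled on an earlier run. On a hit, the costly
   compilation step is skipped entirely. */
#define CACHE_SIZE 64

typedef struct {
    unsigned long key;   /* hash of the shader source text */
    const void *binary;  /* compiled blob from a past run  */
} CacheEntry;

static CacheEntry cache[CACHE_SIZE];

static unsigned long hash_source(const char *src)
{
    unsigned long h = 5381;                  /* djb2 string hash */
    while (*src)
        h = h * 33 + (unsigned char)*src++;
    return h;
}

const void *lookup_compiled(const char *src)
{
    unsigned long h = hash_source(src);
    for (int i = 0; i < CACHE_SIZE; i++)
        if (cache[i].binary && cache[i].key == h)
            return cache[i].binary;          /* hit: no recompilation      */
    return NULL;                             /* miss: compile, then store  */
}
```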
System Name | 3 desktop systems: Gaming / Internet / HTPC |
---|---|
Processor | Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500 |
Motherboard | X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) |
Cooling | Aigo ICE 400SE / Segotep T4 / Noctua U12S |
Memory | Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200 |
Video Card(s) | ASRock RX 6600 + GT 710 (PhysX) / Vega 7 integrated / Radeon RX 580 |
Storage | NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage |
Display(s) | Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5 |
Case | Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard |
Audio Device(s) | onboard |
Power Supply | Chieftec 850W / Silver Power 400W / Sharkoon 650W |
Mouse | CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech |
Keyboard | CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech |
Software | Windows 10 / Windows 10&Windows 11 / Windows 10 |
> Calling either of it optimization is not really accurate, is it?

Considering we are not programmers, we might use words that are not really accurate. But most of the time we will be describing the same thing, considering most of us have had the same teachers and the same books (YouTube, forums, tech sites).