Thursday, September 1st 2022
Arc A770 Ray Tracing Competitive to or Better Than the RTX 3060: Intel at IFA Berlin
Intel Graphics, in an interview with PC Gamer on the sidelines of IFA Berlin 2022, claimed that the real-time ray tracing performance of the Xe-HPG graphics architecture in the Arc A770 "Alchemist" graphics card is "competitive or better than" the NVIDIA GeForce RTX 3060, and that the company plans to launch the card at an attractive price-point, to grab a slice of the very top of the gaming graphics market bell-curve. The RTX 3060 is a very successful GPU, especially with graphics card prices cooling off, and AMD is already competing with its Radeon RX 6650 XT, which can be spotted at lower prices. The RTX 3060 has been in the crosshairs of Intel Graphics marketing in recent performance reveals for the A770.
"When you have a title that is optimized for Intel, in the sense that it runs well on DX12, you're gonna get performance that's significantly above an [RTX] 3060. And this is A750 compared to a 3060, so 17%, 14%, 10%. It's going to vary of course based on the title," said Tom Petersen of Intel Graphics. "We're going to be a little bit faster but depending on your game and depending on your settings, it's trading blows, and that's the A750. Obviously, A770 is going to be a little bit faster. So when you add in DX11, you're gonna see our performance is a little less trading blows, and we're kind of behind in some cases, ahead in some cases, but more losses than wins at DX11," he added. Intel is still non-committal about a launch date, although it stated that the Arc A770 and A750 will launch with attractive "introductory pricing."
Sources:
PC Gamer, Wccftech
Comments
Hardware Unboxed never found these problems on a clean system.
I was a little bit nervous to leave my Nvidia bubble but I've been very impressed so far with the stability of the drivers. Granted, I only use a single 1440p monitor over DP, but still. Even after my extensive meddling with driver/FreeSync/resolution settings I have yet to experience a crash or black screen (except when my performance-greedy ass tries pushing the VRAM timings way too low :p). AMD has definitely tightened up their driver stability.
Also, a lot of the stability of recent drivers is due to WDDM. Starting with Vista (yeah, I know), Microsoft was able to get enough useful telemetry to address driver crashes at the source. Everybody's reaping the results of that now.
In a similar way, the bugs on Intel aren't worrying because of their frequency but rather their nature. It's about elemental things: scheduling (remember frame pacing on AMD?), boost/clock behaviour and cooling (again...), general consistency and overall support. If small things go wrong, okay. If elemental things are missing... goodbye market share. This must also be underlined: praise companies for doing a sound job. RDNA2 is in a great place. If I was buying, I would go there.
Again, outputting to a monitor is a fundamental, core, basic thing. I'm not talking about driver bugs with inconsistent performance or BSODs. I'm talking it straight up doesn't work. And I'm not the only one with that problem: at least one other Newegg reviewer reports the same experience.
Although it's possible I just got unlucky with a defective unit. When a different model becomes available I will try again; I am hoping for an A770. For now I bought an RX 6600 and it's perfectly stable and, hey, actually works!
- New World bricking cards
- "Space Invaders" artifacts on 2000-series cards
They had VR stuttering issues for 6 months and just recently patched flickering issues in the latest hotfix driver.
In terms of screwups I'd say they are about even. Yes, I always found it interesting that reviewers did not observe many of the reported issues with RDNA 1, even Steve from HWUB, who used an RDNA 1 card as a daily driver. I really wish someone had done a video investigating the situation, because as it stood you had a bevy of unverified claims from Reddit on one side and reviewers saying they've had no issues on the other. If I remember correctly, it wasn't until about 6 months after launch, around January, that people started reporting issues at a higher frequency, after AMD released its yearly driver update.
The A750 has 7 render slices active versus the A380's 2, so roughly 3.5× the theoretical ray tracing output (plus the clock difference). And what's the difference in ray tracing throughput between the GA106-based RTX 3060 and the RTX 3050?
Regarding the performance loss when you enable ray tracing: logically, the A750 should show a smaller delta versus the RTX 3060 than the A380 did versus the RTX 3050.
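The slice-count reasoning above is simple back-of-envelope arithmetic. A minimal sketch, assuming RT unit count scales linearly with render slices and using illustrative clock figures (~2450 MHz for the A380 and ~2400 MHz for the A750, which are assumptions, not numbers from the post):

```python
def rt_throughput_ratio(slices_a: int, clock_a: float,
                        slices_b: int, clock_b: float) -> float:
    """Ratio of theoretical ray-tracing throughput of card A over card B,
    assuming RT hardware scales linearly with render slice count."""
    return (slices_a * clock_a) / (slices_b * clock_b)

# A750 (7 slices) vs A380 (2 slices), with hypothetical boost clocks in MHz.
ratio = rt_throughput_ratio(7, 2400, 2, 2450)
print(f"A750 vs A380 theoretical RT throughput: ~{ratio:.2f}x")
```

With these assumed clocks the ratio lands near the 3.5× slice ratio quoted above, slightly lower because the bigger chip clocks a bit lower.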
Even the A770 is midrange at best. Alchemist is far from an enthusiast-class GPU.
Also, it's not the first attempt. DG1 was the first, and was meant for developers and for Intel itself to perfect drivers before Alchemist.
It's just like VR: there is no killer app, or there aren't enough to justify the expense or investment. Even Alyx wasn't enough; it's the exception to the rule. And the momentum for a steady stream of killer apps isn't there either. It's the same chicken-and-egg situation, and ain't nobody got time for that except rich nerds.
Other than that, technically, you are obviously correct. Absolutely: New World affected EVGA's top-end cards only, and Space Invaders turned out to be a storm in a teacup that apparently touched a limited batch. One was caused by the VRM/board and the other by memory. My popcorn is always ready; love the struggles.
The subject here is drivers, not hardware faults/DOA units. All companies have issues; it's all about frequency/severity/impact, the basics of incident priority :D But we all have our own perspectives on this, YMMV as always. The numbers and market shares, however, don't lie and do reflect these notions. And once more... here comes Intel, where you often don't even know whether the cause is a DOA card, just shitty drivers, or some horrible combo.