
ARC "driver issues" turning out to be actually hardware deficiencies - Battlemage reveal

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,252 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte B650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RGB Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Arctic P12s
Benchmark Scores 4k120 OLED Gsync bliss
Really keen to see what Battlemage can BRR up to; it's shaping up to be a more exciting gen than RDNA4 at this point. In some ways Intel are showing more interest and promise than AMD are.
 
Joined
May 25, 2022
Messages
130 (0.14/day)
SSDs -> Solidigm
NUC -> Asus
Optane -> Binned
1st-party Intel boards -> Binned
Gelsinger is known for cutting cruft out of companies to make them leaner and meaner. He did it with VMware and was very successful. The categories above aren't known for high margins, and he trained under Andy Grove, one of Intel's co-founders, famous for transitioning Intel from a memory company to a CPU company. He has said in the past, "I never want to be in memory", which automatically cuts out SSDs and Optane.

Now, under heavy financial pressure, the dGPU division could be cut, but they won't be able to cut the iGPU division anyway.
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Gelsinger is known for cutting cruft out of companies to make them leaner and meaner. He did it with VMware and was very successful. The categories above aren't known for high margins, and he trained under Andy Grove, one of Intel's co-founders, famous for transitioning Intel from a memory company to a CPU company. He has said in the past, "I never want to be in memory", which automatically cuts out SSDs and Optane.

Now, under heavy financial pressure, the dGPU division could be cut, but they won't be able to cut the iGPU division anyway.
dGPU was always an extremely strange segment for Intel to try to play in, given the incumbents and the abject failures that are the company's previous attempts in this space. I also feel like they've intentionally set up their dGPUs to fail, given that they always seem to launch them at the same time as their competitors launch theirs. Unless BM is something really special, I foresee its launch playing out exactly like Alchemist's: strong initial market penetration that drops off to negligible within a year. If that happens, Gelsinger will without a doubt kill the dGPU business, regardless of how much capital has already been plowed into it.

This is especially true because when Arc was conceptualised, the expectation was that dGPU and iGPU would be able to share hardware and drivers and thus save money. That quickly proved impossible, and I doubt it ever will be possible, because the iGPU team has a completely different focus from the dGPU team. The former exists to provide the most minimal hardware that can do basic desktop tasks in the smallest physical, power and thermal envelope; the latter to do... basically the opposite. And of course Intel is unlikely to be willing to rock the iGPU boat when they already have issues with heat and power on their CPUs; the last thing they need is to make that problem worse by trying to shove more dGPU-focused tech in there.
 
Joined
Oct 6, 2021
Messages
1,605 (1.36/day)
You're all very unrealistically positive. Someone has to say something negative to balance it out; I accept the burden:
Intel is just losing money on the GPU business, billions on top of billions. If this generation doesn't work out, it will be the last to come in the form of dGPUs for desktops. Bye bye.
 
Joined
Aug 12, 2020
Messages
1,207 (0.75/day)
Hardware has been bottlenecking ARC
NOT
Driver has been bottlenecking ARC
How about both? Clearly, ARC numbers got a whole lot better with successive driver updates, as they fixed many broken games. This does NOT mean there is no hardware bottleneck, just that it's not an either/or.
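
A toy model makes the "both" point concrete (my sketch, with entirely made-up numbers, nothing measured): a frame takes roughly as long as its slowest stage, so driver fixes help right up until the hardware becomes the limit.

# Toy frame-time model - every number here is invented for illustration.
def fps(driver_ms, gpu_ms):
    # A frame takes roughly as long as its slowest stage.
    return 1000.0 / max(driver_ms, gpu_ms)

print(fps(12.0, 8.0))  # driver-bound: ~83 FPS
print(fps(6.0, 8.0))   # driver update lands: 125 FPS, now GPU-bound
print(fps(3.0, 8.0))   # further driver gains: still 125 FPS - hardware is the wall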
 
Joined
Apr 2, 2011
Messages
2,851 (0.57/day)
dGPU was always an extremely strange segment for Intel to try to play in, given the incumbents and the abject failures that are the company's previous attempts in this space. I also feel like they've intentionally set up their dGPUs to fail, given that they always seem to launch them at the same time as their competitors launch theirs. Unless BM is something really special, I foresee its launch playing out exactly like Alchemist's: strong initial market penetration that drops off to negligible within a year. If that happens, Gelsinger will without a doubt kill the dGPU business, regardless of how much capital has already been plowed into it.

This is especially true because when Arc was conceptualised, the expectation was that dGPU and iGPU would be able to share hardware and drivers and thus save money. That quickly proved impossible, and I doubt it ever will be possible, because the iGPU team has a completely different focus from the dGPU team. The former exists to provide the most minimal hardware that can do basic desktop tasks in the smallest physical, power and thermal envelope; the latter to do... basically the opposite. And of course Intel is unlikely to be willing to rock the iGPU boat when they already have issues with heat and power on their CPUs; the last thing they need is to make that problem worse by trying to shove more dGPU-focused tech in there.

So I read this...and I look at this differently.

Yes, a lot of the "low impact, high return" moonshot items failed to materialize. Likewise, the dGPU division started out crippled in performance by some design choices. Despite this, Intel plowed resources into the dGPU market to be a lower-middle-tier option... for many games. None of that inherently guarantees success, but it showed Intel was willing to keep this on life support despite knowing it'd be a loss leader in costs and returns, with a primarily middle-market target.

Where we strongly diverge is that I think Intel is invested in the dGPU market not as an end, but as a means to an end. Think like a corporate schmuck, and you'll see that the opportunity is to copy Nvidia. Release a dGPU as an experimental platform, get it working right, then plow your resources into novel accelerators. Think RT cores, AI cores, etc. By focusing on a market able to bear products priced substantially over cost, you get guinea pigs who'll spend $269 for a 3060 in 2024 (as of 6/12/24 on Newegg) or $289 for a 4060 with two-thirds the VRAM. It's relatively easy to experiment with that kind of fat... plow it into research losses, then turn around and release an AI accelerator card with all of the lessons learned. As such, I view this as a "win" on paper whether Intel becomes the next big dGPU force... or whether they package this as a grand experiment and cash in on the AI craze.


Side note... 300+ watts for a dGPU, whereas current processors that pull that much require liquid cooling. I think they've side-stepped the need for power management with this too... so win-win-win in the old Intel playbook.
 
Joined
Mar 7, 2023
Messages
923 (1.39/day)
System Name BarnacleMan
Processor 14700KF
Motherboard Gigabyte B760 Aorus Elite Ax DDR5
Cooling ARCTIC Liquid Freezer II 240 + P12 Max Fans
Memory 32GB Kingston Fury Beast
Video Card(s) Asus Tuf 4090 24GB
Storage 4TB sn850x, 2TB sn850x, 2TB Netac Nv7000 + 2TB p5 plus, 4TB MX500 * 2 = 18TB. Plus dvd burner.
Display(s) Dell 23.5" 1440P IPS panel
Case Lian Li LANCOOL II MESH Performance Mid-Tower
Audio Device(s) Logitech Z623
Power Supply Gigabyte ud850gm pg5
I swear I've known about this since the beginning, but I can't quite seem to remember where I learned it. Perhaps it was just speculation that turned out to be right. idk.

So I read this...and I look at this differently.

Yes, a lot of the "low impact, high return" moonshot items failed to materialize. Likewise, the dGPU division started out crippled in performance by some design choices. Despite this, Intel plowed resources into the dGPU market to be a lower-middle-tier option... for many games. None of that inherently guarantees success, but it showed Intel was willing to keep this on life support despite knowing it'd be a loss leader in costs and returns, with a primarily middle-market target.

Where we strongly diverge is that I think Intel is invested in the dGPU market not as an end, but as a means to an end. Think like a corporate schmuck, and you'll see that the opportunity is to copy Nvidia. Release a dGPU as an experimental platform, get it working right, then plow your resources into novel accelerators. Think RT cores, AI cores, etc. By focusing on a market able to bear products priced substantially over cost, you get guinea pigs who'll spend $269 for a 3060 in 2024 (as of 6/12/24 on Newegg) or $289 for a 4060 with two-thirds the VRAM. It's relatively easy to experiment with that kind of fat... plow it into research losses, then turn around and release an AI accelerator card with all of the lessons learned. As such, I view this as a "win" on paper whether Intel becomes the next big dGPU force... or whether they package this as a grand experiment and cash in on the AI craze.


Side note... 300+ watts for a dGPU, whereas current processors that pull that much require liquid cooling. I think they've side-stepped the need for power management with this too... so win-win-win in the old Intel playbook.
Yeah, I agree. Intel has realized the advancements in silicon technology have moved away from CPUs and into GPUs; they really can't afford to miss another boat like this (as they did with smartphones). They nearly missed this boat too, but they may just have caught it in time if they work their butts off. Sure, they are spending $500 to make a GPU that sells for $300 now, but in a few years, perhaps they'll be selling that GPU to a tech firm running LLMs that's willing to pay thousands of dollars. I don't think Intel is ducking out of this one. That would essentially be suicide.

And of course Intel is unlikely to be willing to rock the iGPU boat when they already have issues with heat and power on their CPUs; the last thing they need is to make that problem worse by trying to shove more dGPU-focused tech in there.
Well, isn't that exactly what they are doing with Lunar Lake? Having those Xe cores in there? Or perhaps that's just marketing speak for their existing iGPU technology. Or perhaps you mean only on desktop chips? I'm not really sure, but I thought Arrow Lake had Arc technology in there too.

Besides, we don't yet know if the next gen of Intel chips will have the same heat/power issues the last five or so gens have had, LOL.

I just hope Arrow Lake won't take up too much of the die with AI cores that are useless to me. That space could have gone to more cache or cores; anything other than AI would have been fine. Maybe they could put a little cigarette lighter in there. Now that would be way more interesting than AI cores. A little dangerous perhaps, but definitely eye-catching.
 
Joined
Sep 26, 2022
Messages
2,150 (2.61/day)
Location
Brazil
System Name G-Station 2.0 "YGUAZU"
Processor AMD Ryzen 7 5700X3D
Motherboard Gigabyte X470 Aorus Gaming 7 WiFi
Cooling Freezemod: Pump, Reservoir, 360mm Radiator, Fittings / Bykski: Blocks / Barrow: Meters
Memory Asgard Bragi DDR4-3600CL14 2x16GB
Video Card(s) Sapphire PULSE RX 7900 XTX
Storage 240GB Samsung 840 Evo, 1TB Asgard AN2, 2TB Hiksemi FUTURE-LITE, 320GB+1TB 7200RPM HDD
Display(s) Samsung 34" Odyssey OLED G8
Case Lian Li Lancool 216
Audio Device(s) Astro A40 TR + MixAmp
Power Supply Cougar GEX X2 1000W
Mouse Razer Viper Ultimate
Keyboard Razer Huntsman Elite (Red)
Software Windows 11 Pro
Well, isn't that exactly what they are doing with Lunar Lake? Having those Xe cores in there? Or perhaps that's just marketing speak for their existing iGPU technology. Or perhaps you mean only on desktop chips? I'm not really sure, but I thought Arrow Lake had Arc technology in there too.
Not only LNL and ARL; it's already inside MTL. Without it there would be no MSI Claw.
A great upgrade from their earlier iGPUs, but it has the same weaknesses as their desktop Arc, exacerbated by fewer cores and a smaller power budget.
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Where we strongly diverge is that I think Intel is invested in the dGPU market not as an end, but as a means to an end. Think like a corporate schmuck, and you'll see that the opportunity is to copy Nvidia. Release a dGPU as an experimental platform, get it working right, then plow your resources into novel accelerators. Think RT cores, AI cores, etc. By focusing on a market able to bear products priced substantially over cost, you get guinea pigs who'll spend $269 for a 3060 in 2024 (as of 6/12/24 on Newegg) or $289 for a 4060 with two-thirds the VRAM. It's relatively easy to experiment with that kind of fat... plow it into research losses, then turn around and release an AI accelerator card with all of the lessons learned. As such, I view this as a "win" on paper whether Intel becomes the next big dGPU force... or whether they package this as a grand experiment and cash in on the AI craze.
Wrong. The only reason Intel wanted to get into dGPUs is that their MBAs wanted to tap the lucrative margins enabled by the crypto bubble. Except that bubble burst before Intel's dGPUs launched, leaving them with a product lacking a market, so they quickly pivoted it to the consumer space instead of throwing it away. The AI hype bubble is irrelevant to Intel's dGPUs because (a) they didn't predict it and thus didn't design their GPUs for it, and (b) like AMD, they lack the software ecosystem that would allow them to compete with NVIDIA.

Except there is no opportunity to compete with NVIDIA, because no other company has invested as much in building a rich and compelling ecosystem around GPU compute, and no other company will be able to build a competing ecosystem before the AI bubble bursts. So once again Intel will find itself with a product that really doesn't justify the amount of resources they've ploughed into it, a product that's almost certainly never going to provide a positive return on investment (never mind the kind of ROI that their MBAs originally hoped for) - and Pat Gelsinger is very much a positive ROI kinda guy.

He also understands Intel: it's a CPU company, not a little-bit-of-everything company. It's good at CPUs when it's allowed to be, and for the past decade it hasn't been allowed to. That's why he's cut so many projects already, to bring the focus back to the thing the company does best. Whether he can accomplish that given how much ground Intel has already lost to its competitors is still up for debate, but if he thinks the dGPU division has to go because those resources will be better spent on CPU - he'll swing the axe without a second thought, sunk costs be damned. dGPUs can always be a fourth-time's-the-charm project for someone else, but right now he has a company to save.
 
Joined
Apr 2, 2011
Messages
2,851 (0.57/day)
Wrong. The only reason Intel wanted to get into dGPUs is that their MBAs wanted to tap the lucrative margins enabled by the crypto bubble. Except that bubble burst before Intel's dGPUs launched, leaving them with a product lacking a market, so they quickly pivoted it to the consumer space instead of throwing it away. The AI hype bubble is irrelevant to Intel's dGPUs because (a) they didn't predict it and thus didn't design their GPUs for it, and (b) like AMD, they lack the software ecosystem that would allow them to compete with NVIDIA.

Except there is no opportunity to compete with NVIDIA, because no other company has invested as much in building a rich and compelling ecosystem around GPU compute, and no other company will be able to build a competing ecosystem before the AI bubble bursts. So once again Intel will find itself with a product that really doesn't justify the amount of resources they've ploughed into it, a product that's almost certainly never going to provide a positive return on investment (never mind the kind of ROI that their MBAs originally hoped for) - and Pat Gelsinger is very much a positive ROI kinda guy.

He also understands Intel: it's a CPU company, not a little-bit-of-everything company. It's good at CPUs when it's allowed to be, and for the past decade it hasn't been allowed to. That's why he's cut so many projects already, to bring the focus back to the thing the company does best. Whether he can accomplish that given how much ground Intel has already lost to its competitors is still up for debate, but if he thinks the dGPU division has to go because those resources will be better spent on CPU - he'll swing the axe without a second thought, sunk costs be damned. dGPUs can always be a fourth-time's-the-charm project for someone else, but right now he has a company to save.

Except in one sentence you decide that history doesn't matter, and that it isn't a predictor of the future. I do wish I could live in your interpretation of reality where both the past and the future don't matter.

Let me bottom-line this, since you seem to want to take the narrowest of views, one that excludes both past and future. Say I'm an Intel investor. The company theoretically came late to the crypto bubble... but if you look at Arc's scheduler and crypto-mining performance, it was released as a pretty trash option for crypto even if it wasn't a bubble. Maybe I write that off as an experiment... but if that's the case, I don't plow huge amounts of resources into developing a card that started from the word go with heavy disadvantages. I also fry the CEO and board for plowing money and resources into Battlemage... because it's throwing more money at a problem. Now put me inside Intel... I see my investors are angry... I've cut the smaller investments which were easy to write off as experimental losses... but I need a future.

What do I see?
dGPUs can suck down enormous amounts of power.
PCIe offers plenty of bandwidth to feed secondary processors.
AI is basically LLMs... so the more accelerators you pack into an area, the better it performs its dog-and-pony show.
Nvidia is making bank... and they are officially no longer a silicon company; they call themselves a software company, and demonstrate it by releasing worse-specified products at higher prices and still making bank.
Nvidia can also be trusted to show investors the way forward... because blue monkey copies green monkey. Separate cards with novel accelerators baked in make bank.


You want to pretend that they were too late to the dGPU gouge-fest... and I agree. You want to pretend that AI will collapse as a bubble before Intel gets its foot in the door... because. Not because of reasons; you simply expect, without knowing how long it'll take Intel, that it's guaranteed to take too long. Funny... I believe that's a pretty stupid crystal ball when Intel has plenty of stuff out there already. Gaudi 3 being "50% faster than H100" is a questionable claim... but if true, slapping that sucker on an add-in card is a license to print money, just like Nvidia.



So... I have to ignore the past, pretend the future is bleak, and ignore the present where Intel is lining up AI successes that look like a win if they can slap them on any board within a few months (Gaudi 3, April 2024), all so that I can say that Intel is going to axe its GPU division. Kinda seems like you have a hate boner for their dGPUs that will only be sated by their destruction. For my part... I just think it's Intel failing upward somehow... and they seem to have done so because of a confluence of failures instead of despite it.
 
Joined
May 25, 2022
Messages
130 (0.14/day)
He also understands Intel: it's a CPU company, not a little-bit-of-everything company. It's good at CPUs when it's allowed to be, and for the past decade it hasn't been allowed to. That's why he's cut so many projects already, to bring the focus back to the thing the company does best. Whether he can accomplish that given how much ground Intel has already lost to its competitors is still up for debate, but if he thinks the dGPU division has to go because those resources will be better spent on CPU - he'll swing the axe without a second thought, sunk costs be damned. dGPUs can always be a fourth-time's-the-charm project for someone else, but right now he has a company to save.
That's not what he said. He said dGPUs are what let Intel make money on something that significant development cost and time is being spent on anyway - iGPUs. Right now it may not seem like it, but if they are able to get harmony between the two and have good products, it'll essentially work out that way.

Also, he said he regrets that Intel wasn't able to continue with Larrabee and GPU development as a major goal. He's not just some accountant schmuck.
 
Joined
Oct 15, 2011
Messages
2,491 (0.52/day)
Location
Springfield, Vermont
System Name KHR-1
Processor Ryzen 9 5900X
Motherboard ASRock B550 PG Velocita (UEFI-BIOS P3.40)
Memory 32 GB G.Skill RipJawsV F4-3200C16D-32GVR
Video Card(s) Sparkle Titan Arc A770 16 GB
Storage Western Digital Black SN850 1 TB NVMe SSD
Display(s) Alienware AW3423DWF OLED-ASRock PG27Q15R2A (backup)
Case Corsair 275R
Audio Device(s) Technics SA-EX140 receiver with Polk VT60 speakers
Power Supply eVGA Supernova G3 750W
Mouse Logitech G Pro (Hero)
Software Windows 11 Pro x64 23H2
If Arc gets axed, then we're in trouble!
 
Joined
Dec 25, 2020
Messages
7,079 (4.83/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
If Arc gets axed, then we're in trouble!

What gave you the idea that it would be? It's probably one of the most promising divisions within Intel right now. They have momentum and a golden chance - AMD's graphics division has been blundering non-stop for the entire generation, and Nvidia's all but given up on the entry and midrange markets. You should expect Battlemage to be a very competitive product indeed.
 
Joined
May 25, 2022
Messages
130 (0.14/day)
You should expect Battlemage to be a very competitive product indeed.
It's going to be better than Alchemist's situation for sure.

But the die with A770-level performance is going to be the first one released, in a few months. The one that'll perform at RTX 4070 Ti level is a year away - summer of 2025.
 
Joined
Jan 29, 2023
Messages
1,523 (2.18/day)
Location
France
System Name KLM
Processor 7800X3D
Motherboard B-650E-E Strix
Cooling Arctic Cooling III 280
Memory 16x2 Fury Renegade 6000-32
Video Card(s) 4070-ti PNY
Storage 500+512+8+8+2+1+1+2+256+8+512+2
Display(s) VA 32" 4K@60 - OLED 27" 2K@240
Case 4000D Airflow
Audio Device(s) Edifier 1280Ts
Power Supply Shift 1000
Mouse 502 Hero
Keyboard K68
Software EMDB
Benchmark Scores 0>1000
But Intel did make the best motherboards long ago, right?!
 
Joined
Dec 25, 2020
Messages
7,079 (4.83/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
But Intel did make the best motherboards long ago, right?!

Wouldn't call them the best - but they were reliable and generally of high quality. None of the Intel boards I've owned sucked.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,746 (6.69/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64

Deleted member 229121

Guest
I'm Intel's biggest cheerleader in this space; we need as much competition as we can get...

But Intel did make the best motherboards long ago, right?!
Wouldn't call them the best - but they were reliable and generally of high quality. None of the Intel boards I've owned sucked.
System integrator boards; they were OK...

Yeah, nothing special apart from Skulltrail maybe, but those were not for most.

They also had some weird policies that affected CPU warranties while those were still a thing, in that any warranty claims submitted for CPUs needed to be validated on Intel-branded boards.

Only reason I even found that out was due to a friend's bad luck.
He had a Core 2 Duo with some borked temp sensors; tested it in both his rig and mine (different boards, same issue).
 
Joined
Oct 15, 2011
Messages
2,491 (0.52/day)
Location
Springfield, Vermont
System Name KHR-1
Processor Ryzen 9 5900X
Motherboard ASRock B550 PG Velocita (UEFI-BIOS P3.40)
Memory 32 GB G.Skill RipJawsV F4-3200C16D-32GVR
Video Card(s) Sparkle Titan Arc A770 16 GB
Storage Western Digital Black SN850 1 TB NVMe SSD
Display(s) Alienware AW3423DWF OLED-ASRock PG27Q15R2A (backup)
Case Corsair 275R
Audio Device(s) Technics SA-EX140 receiver with Polk VT60 speakers
Power Supply eVGA Supernova G3 750W
Mouse Logitech G Pro (Hero)
Software Windows 11 Pro x64 23H2
He had a Core 2 Duo with some borked temp sensors; tested it in both his rig and mine (different boards, same issue).
Yeah, some 45 nm Core 2s, especially early ones (C0 in particular!), are likely to have the minimum reported temp stuck at or around 45 °C.

The E0 steppings are more likely to actually be able to report lower core temps.
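
If you model the broken DTS naively (my toy sketch, assuming the sensor simply clamps at a floor; the real silicon behavior is messier), the symptom looks like this:

# Toy model of a stuck-at-floor thermal sensor - the 45 C floor is an
# assumption matching the behavior described above, not a spec value.
def reported_temp(actual_c, floor_c=45.0):
    return max(actual_c, floor_c)

print(reported_temp(30.0))  # idle: reads 45 even though the core is cooler
print(reported_temp(60.0))  # under load: reads correctly once above the floor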
 
Joined
May 25, 2022
Messages
130 (0.14/day)
We can see from Lunar Lake reviews that it's already doing a lot better in games relative to its Time Spy results. This is the result of micro-architectural changes in Xe2, aka Battlemage.

Apparently Celestial is the much bigger change though, the one that'll dwarf the improvements in Battlemage. It is said this is what can get them competitive. Interestingly, the same sources are saying that they'll skip Xe3/Celestial, which is a puzzling decision. If anything, Battlemage should be skipped in favor of Celestial if the big change in Celestial is real. Ideally they shouldn't skip anything, if only for software stability.

A two-year GPU development cadence means 2024-2025 for BMG and 2026-2027 for Celestial. If there's no dGPU for Celestial, the successor to BMG lands in 2028-2029. I would classify that as a mini failure of strategy. Intel often makes these almost brain-dead decisions and then wonders why it can't get into a new market.
 
Joined
Mar 13, 2021
Messages
480 (0.35/day)
Processor AMD 7600x
Motherboard Asrock x670e Steel Legend
Cooling Silver Arrow Extreme IBe Rev B with 2x 120 Gentle Typhoons
Memory 4x16Gb Patriot Viper Non RGB @ 6000 30-36-36-36-40
Video Card(s) XFX 6950XT MERC 319
Storage 2x Crucial P5 Plus 1Tb NVME
Display(s) 3x Dell Ultrasharp U2414h
Case Coolermaster Stacker 832
Power Supply Thermaltake Toughpower PF3 850 watt
Mouse Logitech G502 (OG)
Keyboard Logitech G512
I suspect there are issues with Celestial and Druid is progressing along nicely.
Similar to how Intel abandoned 20A and focused on 18A when the timelines started blurring together.
 
Joined
May 25, 2022
Messages
130 (0.14/day)
Similar to how Intel abandoned 20A and focused on 18A when the timelines started blurring together.
And losing 10% performance.

I think they just rebranded 20A to 18A. 20A was supposed to be 15% over Intel 3, and 18A another 10% over 20A. Now they are saying 15% for 18A over Intel 3, which is just the 20A claim. And just 30% density improvement.
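
Taking those public claims at face value, the compounding works out to roughly that 10% (my arithmetic on the numbers quoted above, nothing official):

# Compound the per-node claims quoted above (relative to Intel 3 = 1.0).
intel3 = 1.00
old_20a = intel3 * 1.15        # old claim: 20A = +15% over Intel 3
old_18a = old_20a * 1.10       # old claim: 18A = +10% over 20A -> ~1.265
new_18a = intel3 * 1.15        # new claim: 18A = +15% over Intel 3
print(old_18a)                 # ~1.265 over Intel 3 on the old roadmap
print(1 - new_18a / old_18a)   # ~0.091 -> roughly the "losing 10%" above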

The real 18A seems to be 18A-P, a year away.
 
Joined
May 25, 2022
Messages
130 (0.14/day)
It's interesting how the move to SIMD16, improving compatibility, was the most noticeable part of Battlemage. It still underperforms in UE5.

Also, it still needs quite a bit of driver work:
- Driver overhead at 1080p and on slower CPUs, arguably worse on the B580 than on its predecessor. They need to "fix" it before B770 arrives, or by B770, because it'll get worse. When Celestial comes out it'll become even worse, because the faster the GPU, the harder a fixed CPU-side cost caps it.
- The promised DX11 driver with performance optimizations that don't require whitelisting has not launched yet.
- When VR support? Probably de-prioritized. That was a promised feature too.

It still has ReBar performance/compatibility issues and still has high idle power draw.

Changing from Arc Control to new software shows a bit of dysfunction within the group. Hopefully this is their low point.

Battlemage overall is a good improvement. There seem to be fewer glitches and graphics errors overall than on Alchemist. They need to keep at it.
 