Tuesday, August 29th 2023

Intel Arc Alive & Well for Next-gen - Battlemage GPU Spotted During Malaysia Lab Tour

Hardwareluxx editor Andreas Schilling was invited to attend a recent Intel Tech Tour press event at the company's manufacturing facility and test labs in Malaysia. Invited media representatives were allowed to observe ongoing work on next-generation client and data center-oriented products. He posted a short summary of these observations via social media: "I've seen wafers with Emerald Rapids XCC on them, that were being cut. Not a surprise at all, but still...also MTL682_C0, so Meteor Lake with 6 P-Cores, 8 E-Cores and GT2 Graphic Tile tested in a C0 stepping and finally the Failure Lab already saw BMG G10 - Battlemage is real." We have been hearing mixed mutterings about the status of Team Blue's next-gen Arc GPU technology, with more concrete evidence of its existence popping up around mid-August—namely in the shape of two Battlemage interposers, BGA2362-BMG-X2 and BGA2727-BMG-X3, uploaded to Intel's DESIGN-iN Tools website.

Schilling elaborated further in his full report: "In the Failure Analysis Lab, we came across a tray that evidently contained chips from the next Arc generation - at least, there were already corresponding chips in the analysis, which were clearly labeled as BMG G10." This chip looks to be lined up to succeed the current Alchemist ACM-G10 GPU, as seen on Intel Arc A750 and A770 discrete graphics cards. A leaked product roadmap shows Intel targeting a Battlemage launch around Q2 - Q3 2024, with the aforementioned G10 having a TDP rating of <225 W, as well as another variant—G21—rated for a maximum power consumption of 150 W.
Sources: Hardwareluxx, VideoCardz, Wccftech, Andreas Schilling Tweet

29 Comments on Intel Arc Alive & Well for Next-gen - Battlemage GPU Spotted During Malaysia Lab Tour

#1
Onyx Turbine
I sincerely hope that Intel succeeds in creating a unique competitive-advantage synergy effect next year with their Battlemage / CPU combo, if sold as an attractively priced bundle.
This could be a tempting offering instead of the tedious process of handpicking a CPU and combining it with a different GPU.
My suggestion is to give Battlemage generous VRAM, at least 16 GB, and make it exceed the 4060 Ti; that way people in this price class will not wait for the 5000 series.
So: lowest-spec BMG GPU + lowest-spec 6-performance-core CPU and a Steam voucher instead of a forced game - what a dream!
Posted on Reply
#2
phanbuey
Yeah, another competitor would really be amazing, especially as there's likely to be another GPU shortage with the boom in AI and potential supply-chain changes.
Posted on Reply
#3
ARF
Chippendale: My suggestion is to give Battlemage generous VRAM, at least 16 GB, and make it exceed the 4060 Ti; that way people in this price class will not wait for the 5000 series.
My suggestion for these people is:
Radeon RX 6700 XT 12GB
Radeon RX 6750 XT 12GB
Radeon RX 7700 XT 12GB
Radeon RX 6800 16GB
Radeon RX 7800 XT 16GB
Radeon RX 6800 XT 16GB
Posted on Reply
#4
Onyx Turbine
ARF: My suggestion for these people is:
Radeon RX 6700 XT 12GB
Radeon RX 6750 XT 12GB
Radeon RX 7700 XT 12GB
Radeon RX 6800 16GB
Radeon RX 7800 XT 16GB
Radeon RX 6800 XT 16GB
Yes, and then later cry about the power bill.
Posted on Reply
#5
ARF
Chippendale: Yes, and then later cry about the power bill.
You will undervolt.

Amd/comments/vh4xjw
I was playing around in Afterburner today and decided to see how far I could push the undervolt on my card, for science and a quieter/cooler office.

Card model is an XFX 6700XT SWFT309.

I used the Borderlands 3 built-in benchmark as my testing game. All settings Ultra, with the following exceptions: Texture Streaming (High), Shadows (High), Material Quality (High), Foliage (Medium), Volumetric Fog (Medium), and AO (Medium).

Base at 1200 mV: 117.06
-25 mV: 118.62
-50 mV: 117.74
-75 mV: 117.75
-100 mV: 116.29, crashed once but not repeatable
-125 mV: DNF, repeatable hard crash to desktop
-120 mV: 119.23, 118.79, 118.97
+10 mV for stability = ending voltage -110 mV

Base results:
Peak temperatures: 61 C edge, 105 C junction
Power usage: ~190 W on GPU
Fan speed: 69% (nice)

-110 mV results:
Peak temperatures: 61 C edge, 90 C junction
Power usage: ~145 W on GPU
Fan speed: 45%
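
A quick sanity check of those numbers (a minimal sketch in Python; the fps and wattage figures are copied from the quoted post, with the fps average taken from the -120 mV runs, so treat it as approximate):

# Perf-per-watt check of the quoted undervolt results (figures from the post above).
base_fps, base_w = 117.06, 190.0          # stock run at 1200 mV, ~190 W
uv_fps = (119.23 + 118.79 + 118.97) / 3   # average of the three -120 mV runs
uv_w = 145.0                              # reported board power after settling at -110 mV
print(f"stock: {base_fps / base_w:.2f} fps/W")    # ~0.62 fps/W
print(f"undervolted: {uv_fps / uv_w:.2f} fps/W")  # ~0.82 fps/W
print(f"gain: {(uv_fps / uv_w) / (base_fps / base_w) - 1:.0%}")  # roughly +33% perf/W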
Posted on Reply
#6
Onyx Turbine
ARF: You will undervolt.

Amd/comments/vh4xjw
OK, that is something interesting, as I can accept hovering around 150 W, which is approximately a 4060 Ti.
Could you also list a CPU, both Intel and AMD, for around 150 EUR that is power efficient at idle and can do some gaming with medium aspirations? Possibly combined with your 6700 XT.
Posted on Reply
#7
Mr. Perfect
I hope Intel can slowly get closer to the Nvidia/AMD launch cycle. If they're constantly launching mid-generation, then Arc will always be spending half its life competing against the wrong generation. Alchemist is selling next to Ada and RDNA3 instead of its peers in Ampere and RDNA2, while later Battlemage will have to fight it out with RDNA4 and Blackwell.
Posted on Reply
#8
TheinsanegamerN
Chippendale: OK, that is something interesting, as I can accept hovering around 150 W, which is approximately a 4060 Ti. Could you also list a CPU, both Intel and AMD, for around 150 EUR that is power efficient at idle and can do some gaming with medium aspirations? Possibly combined with your 6700 XT.
Chippendale: Yes, and then later cry about the power bill.
If you are crying about power use, you cannot afford any GPU. Period.

We've gone over this in many threads before. The difference between a 3090 Ti and the 6900 XT amounted to $50 a year, assuming 8 hours of full-throttle use 365 days a year.

You CANNOT afford $300+ GPUs if $50 in electricity a year is a major issue for you.

Buy the performance you want. If power use is a concern, stop playing videogames and get a job that pays $50 more a year so you can afford to game.
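
For reference, a rough check of that $50-a-year figure (a sketch in Python; the ~150 W gap between those cards and the $0.12/kWh rate are assumptions for illustration, not numbers from the post):

# Back-of-the-envelope electricity cost for the claimed GPU power gap.
watt_gap = 450 - 300            # assumed 3090 Ti vs. 6900 XT board power, in watts
hours_per_year = 8 * 365        # the post's "8 hours of full throttle" pattern
kwh = watt_gap / 1000 * hours_per_year    # ~438 kWh per year
rate_usd_per_kwh = 0.12         # assumed US-style rate; EU rates can be 2-3x higher
print(f"${kwh * rate_usd_per_kwh:.0f} per year")  # ~$53, so the claim is plausible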
Posted on Reply
#9
Jism
Chippendale: Yes, and then later cry about the power bill.
Yes, because all AMD hardware consumes 6 gigawatts.

Undervolting, V-sync... there's another feature in the drivers that actually helps tone the power down. I mean, who really needs 400 FPS?
Posted on Reply
#10
Tropick
Jism: Yes, because all AMD hardware consumes 6 gigawatts.

Undervolting, V-sync... there's another feature in the drivers that actually helps tone the power down. I mean, who really needs 400 FPS?
Undervolt, drop the power target, and set an FRTC cap at your monitor's max refresh minus 3 Hz, then watch your GPU wattage melt away. You can run AMD cards really lean if you spend some time poking around in Adrenalin.
Posted on Reply
#11
Onyx Turbine
TheinsanegamerN: If you are crying about power use, you cannot afford any GPU. Period.

We've gone over this in many threads before. The difference between a 3090 Ti and the 6900 XT amounted to $50 a year, assuming 8 hours of full-throttle use 365 days a year.

You CANNOT afford $300+ GPUs if $50 in electricity a year is a major issue for you.

Buy the performance you want. If power use is a concern, stop playing videogames and get a job that pays $50 more a year so you can afford to game.
In Europe, power prices are different, and my personal electricity bill is very high; that's why every ten watts counts. The 4060 is simply way more power efficient than the 6700 XT. When a PC for mixed tasks (office, video streaming, and gaming) is on for a minimum of 12 hours a day, if not sometimes 14, 365 days a year, why stop me from being power efficient in my next build?

Not to mention that electricity prices could very well double again in the future. If you calculate the difference between just a medium-spec PC and your jet-turbine (staying in full-throttle jargon) high-end setup, the difference is not small, unless you are so rich that you don't care about saving between 1,000 and 2,000 dollars over, say, 10 years. Not to mention easily saving another 1,000 dollars by not needing, say, your 'premium' setup.
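
For what it's worth, a rough check of that 10-year estimate (a sketch in Python; the 100 W gap, 12 h/day usage, and EUR 0.35/kWh rate are all assumptions for illustration):

# Back-of-the-envelope 10-year cost of a 100 W efficiency gap at EU prices.
watt_gap = 100                   # assumed average power gap between two builds, in watts
hours = 12 * 365 * 10            # 12 hours a day over 10 years
kwh = watt_gap / 1000 * hours    # 4,380 kWh
rate_eur_per_kwh = 0.35          # assumed EU-style rate
print(f"EUR {kwh * rate_eur_per_kwh:.0f} over 10 years")  # ~EUR 1,533, within the claimed 1,000-2,000 range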
Posted on Reply
#13
Onyx Turbine
AnotherReader: All of the GPUs mentioned by @ARF are more efficient than the A770, despite being made on a slightly inferior process. The image below is edited from this one in TPU's review of the 4070 TUF.

These are nice graphs, but plenty of people have a 60 Hz monitor, won't be using the full power needed to render, and are not willing to invest in a 4080 or 90.
Posted on Reply
#14
TheoneandonlyMrK
Chippendale: These are nice graphs, but plenty of people have a 60 Hz monitor, won't be using the full power needed to render, and are not willing to invest in a 4080 or 90.
So, Battlemage: are you buying or not?

All cards have V-sync; no card has to waste power. That's optional.

Personally, I don't care what you buy, but I do hope Battlemage is good.

It's definitely nice to hear it's not canned, but I do hope it won't take a further three years of press releases before it releases.
Posted on Reply
#15
SSGBryan
I'll be buying one.

RTX 4080 performance in Blender & Stable Diffusion and RTX 4070 performance in games?

Sign me up.
Posted on Reply
#16
ZoneDymo
I like how you are all still so positive about Intel joining the GPU market, so full of hope... yet doesn't the pricing of the current lineup, the very first untested GPUs that needed so much work, tell you anything about how they want to approach the market?

I have yet to see pricing that says "this is the only GPU to consider for this budget, nothing comes close for this price," and if it's not happening now, it most definitely won't happen with a better product like Battlemage; they will just get more cocky.
Posted on Reply
#17
RedBear
I'm going to repost this just for the lulz; maybe next time people will take the whole salt shaker when reading the nonsense of a certain leaker.
Posted on Reply
#18
phanbuey
RedBear: I'm going to repost this just for the lulz; maybe next time people will take the whole salt shaker when reading the nonsense of a certain leaker.
Yeah, that guy is quickly becoming the Jim Cramer of leaks.
Posted on Reply
#19
crysty86
Intel must improve drivers first!!!
Posted on Reply
#20
boomstik360
TheinsanegamerN: If you are crying about power use, you cannot afford any GPU. Period.

We've gone over this in many threads before. The difference between a 3090 Ti and the 6900 XT amounted to $50 a year, assuming 8 hours of full-throttle use 365 days a year.

You CANNOT afford $300+ GPUs if $50 in electricity a year is a major issue for you.

Buy the performance you want. If power use is a concern, stop playing videogames and get a job that pays $50 more a year so you can afford to game.
Best. :roll:
Posted on Reply
#21
Easo
All I hope is that Intel does not give up; we really do need a third player for the good of all.
P.S.
Yeah, I am NEVER going to understand people with high/top-end GPUs complaining about electricity costs. Sorry, just no.
Posted on Reply
#22
kondamin
That's a relief; Nvidia and AMD noping out of the market and having us use whatever waste trickles down from inference cards doesn't really sit well.
Posted on Reply
#23
Scrizz
crysty86: Intel must improve drivers first!!!
They have been.
Posted on Reply
#24
Onyx Turbine
TheoneandonlyMrK: So, Battlemage: are you buying or not?

All cards have V-sync; no card has to waste power. That's optional.

Personally, I don't care what you buy, but I do hope Battlemage is good.

It's definitely nice to hear it's not canned, but I do hope it won't take a further three years of press releases before it releases.
V-sync is normally also not an issue, as you can limit the frame rate yourself.
Posted on Reply
#25
gurusmi
Easo: Yeah, I am NEVER going to understand people with high/top-end GPUs complaining about electricity costs. Sorry, just no.
If one builds a pool, one also has to afford the water. ;)
Posted on Reply