
Intel Arc Alive & Well for Next-gen - Battlemage GPU Spotted During Malaysia Lab Tour

T0@st

News Editor
HardwareLuxx's editor, Andreas Schilling, was invited by Intel Tech Tour to attend a recent press event at the company's manufacturing facility and test labs in Malaysia. Invited media representatives were allowed to observe ongoing work on next generation client and data center-oriented products. He posted a short summary of these observations via social media: "I've seen wafers with Emerald Rapids XCC on them, that were being cut. Not a surprise at all, but still...also MTL682_C0, so Meteor Lake with 6 P-Cores, 8 E-Cores and GT2 Graphic Tile tested in a C0 stepping and finally the Failure Lab already saw BMG G10 - Battlemage is real." We have been hearing mixed mutterings about the status of Team Blue's next-gen Arc GPU technology, with more concrete evidence of its existence popping up around mid-August—namely in the shape of two Battlemage interposers, BGA2362-BMG-X2 and BGA2727-BMG-X3, uploaded to Intel's DESIGN-iN Tools website.

Schilling elaborated further in his full report: "In the Failure Analysis Lab, we came across a tray that evidently contained chips from the next Arc generation - at least, there were already corresponding chips in the analysis, which were clearly labeled as BMG G10." This chip looks to be lined up to succeed the current Alchemist ACM-G10 GPU, as seen on Intel Arc A750 and A770 discrete graphics cards. A leaked product roadmap shows Intel targeting a Battlemage launch around Q2 - Q3 2024, with the aforementioned G10 having a TDP rating of <225 W, as well as another variant—G21—rated for a maximum power consumption of 150 W.



View at TechPowerUp Main Site | Source
 
I sincerely hope Intel succeeds in creating a unique competitive advantage next year with their Battlemage / CPU combo, if it's sold as an attractively priced bundle.
This could be a tempting offer instead of the tedious process of handpicking a CPU and combining it with a different GPU.
My suggestion is to give Battlemage generous VRAM, at least 16 GB, and make it exceed the 4060 Ti, so people in this price class won't wait for the 5000 series.
So: lowest-spec BM GPU + lowest-spec 6-performance-core CPU and a Steam voucher instead of a forced game, what a dream!
 
Yeah, another competitor would really be amazing, especially as there's likely to be another GPU shortage with the AI boom and potential supply chain changes.
 
My suggestion is to give Battlemage generous VRAM, at least 16 GB, and make it exceed the 4060 Ti, so people in this price class won't wait for the 5000 series.

My suggestion for these people is:
Radeon RX 6700 XT 12GB
Radeon RX 6750 XT 12GB
Radeon RX 7700 XT 12GB
Radeon RX 6800 16GB
Radeon RX 7800 XT 16GB
Radeon RX 6800 XT 16GB
 
My suggestion for these people is:
Radeon RX 6700 XT 12GB
Radeon RX 6750 XT 12GB
Radeon RX 7700 XT 12GB
Radeon RX 6800 16GB
Radeon RX 7800 XT 16GB
Radeon RX 6800 XT 16GB
Yes, and then later cry about the power bill.
 
Yes, and then later cry about the power bill.

You will undervolt.


I was playing around in Afterburner today and decided to see how far I could push the undervolt on my card, for science and a quieter/cooler office.

Card model is an XFX 6700XT SWFT309.

I used the Borderlands 3 built-in benchmark as my testing game. All settings Ultra, with the following exceptions: Texture Streaming (High), Shadows (High), Material Quality (High), Foliage (Medium), Volumetric Fog (Medium), and AO (Medium).

Base at 1200mV: 117.06

-25mV 118.62

-50mV 117.74

-75mV 117.75

-100mV 116.29, crashed once but not repeatable

-125mV DNF, repeatable hard crash to desktop

-120mV 119.23, 118.79, 118.97

+10mV for stability

=ending voltage -110mV

Base results


Peak temperatures: 61C edge, 105C junction

Power usage: ~190W on GPU

Fan speed: 69% (nice)

-110mV results

Peak temperatures: 61C edge, 90C junction

Power usage: ~145W on GPU

Fan speed: 45%
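
For anyone curious what that drop from ~190 W to ~145 W works out to, here's a rough sketch; the 4 hours of gaming per day and the EUR 0.30/kWh rate are assumed numbers, not something I measured:

# Rough yearly savings from the -110 mV undervolt above.
# Assumptions: 4 hours of gaming per day, EUR 0.30 per kWh.
baseline_watts = 190      # GPU power at stock (from the run above)
undervolted_watts = 145   # GPU power at -110 mV (from the run above)
saved_kwh = (baseline_watts - undervolted_watts) / 1000 * 4 * 365   # ~66 kWh/year
print(f"{saved_kwh:.0f} kWh/year -> EUR {saved_kwh * 0.30:.0f}/year")  # ~EUR 20/year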
 
I hope Intel can slowly get closer to the Nvidia/AMD launch cycle. If they're constantly launching mid-generation, then Arc will always be spending half its life competing against the wrong generation. Alchemist is selling next to Ada and RDNA3 instead of its peers in Ampere and RDNA2, while later Battlemage will have to fight it out with RDNA4 and Blackwell.
 
OK, that is something interesting, as I can accept hovering around 150 W, which is roughly 4060 Ti territory.
Could you also list a CPU, both Intel and AMD, for around 150 EUR that is power efficient at idle and can do some gaming with medium aspirations? Possibly combined with your 6700 XT.
Yes, and then later cry about the power bill.
If you are crying about power use, you cannot afford any GPU. Period.

We've gone over this in many threads before. The difference between a 3090 Ti and a 6900 XT amounted to $50 a year, assuming 8 hours of full-throttle use 365 days a year.

You CANNOT afford $300+ GPUs if $50 in electricity a year is a major issue for you.

Buy the performance you want. If power use is a concern, stop playing videogames and get a job that pays $50 more a year so you can afford to game.
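
For reference, that $50 figure roughly checks out; here is a quick back-of-the-envelope sketch (the ~450 W vs ~300 W draw and the $0.11/kWh rate are assumptions, not measured values):

# Quick check of the "$50 a year" claim above.
# Assumptions: ~450 W (3090 Ti) vs ~300 W (6900 XT) at full throttle,
# 8 hours a day, 365 days a year, at an assumed $0.11/kWh.
watt_gap = 450 - 300
kwh_per_year = watt_gap / 1000 * 8 * 365      # about 438 kWh
cost_per_year = kwh_per_year * 0.11           # about $48
print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.0f}/year")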
 
Yes, because all AMD hardware consumes 6 gigawatts.

Undervolting, V-sync, and there's another feature in the drivers that actually helps tone the power down. I mean, who really needs 400 FPS?

Undervolt, drop power target, and set FRTC cap @ monitor max refresh - 3Hz and then watch your GPU wattage melt away. You can run AMD cards really lean if you spend some time poking around Adrenalin.
 
If you are crying about power use, you cannot afford any GPU. Period.

We've gone over this in many threads before. The difference between a 3090 Ti and a 6900 XT amounted to $50 a year, assuming 8 hours of full-throttle use 365 days a year.

You CANNOT afford $300+ GPUs if $50 in electricity a year is a major issue for you.

Buy the performance you want. If power use is a concern, stop playing videogames and get a job that pays $50 more a year so you can afford to game.

In Europe power prices are different, and my personal electricity bill is very high; that's why every ten watts counts.
The 4060 is simply way more power efficient than the 6700 XT. When a PC for mixed tasks (office, video streaming and gaming) is on
for a minimum of 12 hours a day, sometimes 14, 365 days a year, why stop me from being power efficient in my next build?

Not to mention that electricity prices could very well double again in the future.
If you calculate the difference between just a medium-spec PC and your jet-turbine-setting (to stay in full-throttle jargon) high-end one,
the difference is not small, unless you are so rich that over, say, 10 years you don't care about saving between 1,000 and 2,000 dollars.
Not to mention saving another easy 1,000 dollars by not needing that 'premium' setup.
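
To put a rough number on that 10-year figure, here is a sketch under assumed conditions; the 50 W average difference across mixed use, the 12 hours a day, and the EUR 0.40/kWh rate are all illustrative guesses:

# Rough 10-year running-cost gap between a frugal build and a thirstier one.
# Assumptions: 50 W average difference over mixed use, 12 h/day,
# 365 days/year, EUR 0.40 per kWh (high-ish European rate).
avg_watt_gap = 50
kwh_10_years = avg_watt_gap / 1000 * 12 * 365 * 10   # about 2190 kWh
print(f"EUR {kwh_10_years * 0.40:.0f} over 10 years")  # about EUR 876
# Double the average gap or the rate and you land in the
# 1,000 - 2,000 range mentioned above.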
 
Yes, and then later cry about the power bill.
All of the GPUs mentioned by @ARF are more efficient than the A770 despite being made on a slightly inferior process. The image below is edited from this one in TPU's review of the 4070 TUF.

[Attached: energy-efficiency chart, edited from TPU's RTX 4070 TUF review]
 
These are nice graphs, but plenty of people have a 60 Hz monitor, won't be using the full power needed to render, and aren't willing to invest in a 4080 or 90.
So, Battlemage: are you buying or not?

All cards have V-sync; no card has to waste power, that's optional.

Personally I don't care what you buy, but I do hope Battlemage is good.

It's definitely nice to hear it's not canned, but I do hope it won't take a further three years of press releases before it actually launches.
 
I'll be buying one.

RTX 4080 performance in Blender & Stable Diffusion and RTX 4070 performance in games?

Sign me up.
 
I like how you are all still so positive about Intel joining the GPU market, so full of hope... yet doesn't the pricing of the current lineup, the very first untested GPUs that needed so much work, tell you anything about how they want to approach the market?

I have yet to see pricing that says "this is the only GPU to consider for this budget, nothing comes close for this price", and if it's not happening now, it most definitely won't happen with a better product like Battlemage; they will just get more cocky.
 
I'm going to repost this just for the lulz, maybe next time people will take the whole salt shaker when reading the nonsense of a certain leaker.
 
I'm going to repost this just for the lulz, maybe next time people will take the whole salt shaker when reading the nonsense of a certain leaker.

Yeah, that guy is quickly becoming the Jim Cramer of leaks.
 
If you are crying about power use, you cannot afford any GPU. Period.

We've gone over this in many threads before. The difference between a 3090 Ti and a 6900 XT amounted to $50 a year, assuming 8 hours of full-throttle use 365 days a year.

You CANNOT afford $300+ GPUs if $50 in electricity a year is a major issue for you.

Buy the performance you want. If power use is a concern, stop playing videogames and get a job that pays $50 more a year so you can afford to game.
Best. :roll:
 
All I hope is that Intel does not give up; we really do need the third player for the good of all.
P.S.
Yeah, I am NEVER going to understand people with high/top-end GPUs complaining about electricity costs. Sorry, just no.
 
That's a relief; Nvidia and AMD noping out of the market and having us use whatever waste trickles down from inference cards doesn't really sit well.
 
So, Battlemage: are you buying or not?

All cards have V-sync; no card has to waste power, that's optional.

Personally I don't care what you buy, but I do hope Battlemage is good.

It's definitely nice to hear it's not canned, but I do hope it won't take a further three years of press releases before it actually launches.
V-sync is normally also not an issue, as you can limit the frame rate yourself.
 