Friday, February 4th 2022

Intel Arc "Alchemist" PCB Closeup Shows Up on Intel Graphics Discord

Intel has put out a clear, close-up picture of an engineering sample of its Arc "Alchemist" gaming graphics card. It matches a rear shot of the PCB that surfaced in a report by "Moore's Law is Dead." The picture reveals a PCB that is about three-quarters the length of the cooling solution, with the remainder of the cooler's length used to vent airflow directly out the back.

The PCB reveals a rectangular pad for the GPU, which corresponds with that of the larger "Alchemist" GPU. This is surrounded by what look like eight GDDR6 memory pads for a 256-bit wide memory interface; at least 10 VRM phases of an unknown configuration; and a power input made up of one 8-pin and one 6-pin PCIe power connector (capable of delivering 300 W including slot power). Traces connect the GPU to all 16 lanes of the PCI-Express finger. Display outputs include three full-size DisplayPorts and one HDMI port. This particular variant of "Alchemist" is rumored to feature 512 execution units (4,096 unified shaders), and at least in SiSoft SANDRA it allegedly outperforms the GeForce RTX 3070 Ti "Ampere."
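The figures above can be sanity-checked with a little arithmetic. A minimal sketch, assuming the usual 32-bit interface per GDDR6 package and the standard PCIe power limits (the pad count and connector layout come from the photo, not from any official spec):

```python
# Back-of-the-envelope check of the figures above (assumed, not official specs).
GDDR6_BUS_BITS_PER_PACKAGE = 32   # each GDDR6 package exposes a 32-bit interface
memory_packages = 8               # pads visible around the GPU in the photo

bus_width = memory_packages * GDDR6_BUS_BITS_PER_PACKAGE
print(f"Memory bus: {bus_width}-bit")          # 256-bit

# Power budget: 8-pin (150 W) + 6-pin (75 W) + PCIe slot (75 W)
power_budget = 150 + 75 + 75
print(f"Max board power: {power_budget} W")    # 300 W
```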
Source: SquashBionic (Twitter)

54 Comments on Intel Arc "Alchemist" PCB Closeup Shows Up on Intel Graphics Discord

#26
arni-gx
So, what is their GPU series name when it's finally released on the GPU market?
Posted on Reply
#27
lexluthermiester
Jism: We want benchmarks and visual quality, not a SiSoft Sandra test, which doesn't say anything at all.
Not true, SiSoft Sandra says a lot. The key is understanding the results.
Posted on Reply
#28
Kanan
Tech Enthusiast & Gamer
Drivers will be key to whether this will be good or not; they will make or break Intel Arc. Nobody will care if it has theoretically great performance if the drivers are shoddy; then it will most likely end up as a miner card instead. The funny thing is, in the current market, this GPU cannot fail.
Posted on Reply
#29
Bones
Kanan: Drivers will be key to whether this will be good or not; they will make or break Intel Arc. Nobody will care if it has theoretically great performance if the drivers are shoddy; then it will most likely end up as a miner card instead. The funny thing is, in the current market, this GPU cannot fail.
I'd put money on it: when these are released for sale, they will be available and in stock for all of about 2 seconds. Eh, I'll be generous and make it 2.5 seconds.
Posted on Reply
#30
looniam
Bones: I'd put money on it: when these are released for sale, they will be available and in stock for all of about 2 seconds. Eh, I'll be generous and make it 2.5 seconds.
and drivers will be 2.5 years later... (hi Raja!)
Posted on Reply
#31
boomheadshot8
8-pin + 6-pin + 16 VRM phases, and I'm seeing 16 memory pads; so 16 GB of memory?
I'm a little excited about this card, but the price...
Posted on Reply
#32
usiname
lexluthermiester: Not true, SiSoft Sandra says a lot. The key is understanding the results.
Your Sandra says that the Radeon VII is faster than the RTX 3090.
Posted on Reply
#33
mechtech
lexluthermiester: And it looks surprisingly high end. I really hope Intel delivers on their claims.
I think everyone does. However, if they are using TSMC, I don't know how many will make it to shelves or how much they will cost. I am hoping for good supply at MSRP (if there is one), but I am expecting the same pricing and availability as AMD and NVIDIA.
Posted on Reply
#34
bonehead123
ALL HAIL DA MIGHTY ARC :)

But seriously, until I have one installed in my rig, it's just friggin V.A.P.O.R.W.A.R.E.Z..........

And the fact that engineering samples apparently exist means absolutely NUTHIN until it lands on store shelves/webstorez for moar than the above mentioned 2.5 seconds !
Posted on Reply
#35
aQi
It's a PCB that could come from any company. What's surprising is that it has a connector close to the main core. That's a clue for others that the game is on.
Posted on Reply
#36
80-watt Hamster
Kanan: Drivers will be key to whether this will be good or not; they will make or break Intel Arc. Nobody will care if it has theoretically great performance if the drivers are shoddy; then it will most likely end up as a miner card instead. The funny thing is, in the current market, this GPU cannot fail.
The silver lining there is that even if the launch drivers are bad, Intel will have a period of nigh-guaranteed revenue to sort them out.
Posted on Reply
#37
TheoneandonlyMrK
80-watt Hamster: The silver lining there is that even if the launch drivers are bad, Intel will have a period of nigh-guaranteed revenue to sort them out.
Oh absolutely. They just got a free pass to go old school and cut deals with OEMs to remove the competition from OEM builds; it'll be hard to buy a laptop that isn't all-Intel in two years, IMHO.
Posted on Reply
#38
Gundem
Kanan: Drivers will be key to whether this will be good or not; they will make or break Intel Arc. Nobody will care if it has theoretically great performance if the drivers are shoddy; then it will most likely end up as a miner card instead. The funny thing is, in the current market, this GPU cannot fail.
Agreed. Performance aside, the internet will destroy this launch if the drivers give issues in the days and weeks after launch. Please, Intel, get the drivers right so that you can successfully launch as the third player in the GPU game.
Posted on Reply
#39
seth1911
A 192-bit 6 GB card for $150-200 would be the sweet spot these days, not like the garbage 64-bit 4 GB XT, my ass.
Kanan: Drivers will be key to whether this will be good or not; they will make or break Intel Arc. Nobody will care if it has theoretically great performance if the drivers are shoddy; then it will most likely end up as a miner card instead. The funny thing is, in the current market, this GPU cannot fail.
Intel has AMD as a counterpart, and Intel iGPU drivers are still better than AMD's.
Posted on Reply
#40
Kanan
Tech Enthusiast & Gamer
seth1911: Intel has AMD as a counterpart, and Intel iGPU drivers are still better than AMD's.
No they aren't; Intel iGPU drivers are complete garbage, and AMD's work pretty much perfectly. You couldn't have said anything more wrong.
Posted on Reply
#41
seth1911
No, AMD drivers are garbage. :laugh:

Best is NVIDIA, second Intel.
Posted on Reply
#42
Kanan
Tech Enthusiast & Gamer
seth1911: No, AMD drivers are garbage. :laugh:

Best is NVIDIA, second Intel.
You're only in these forums to troll, right? I've seen a lot of your comments. Go play a game instead.
Posted on Reply
#43
seth1911
Nah, the ignore list is for people like you, right? :laugh:

Ah, two guys who liked your comment are still on that list :toast:
Posted on Reply
#44
Watermelon5
Ferrum Master: Actually it really lacks some screws.
Intel took out the woodscrews for the purpose of the photo. Then they go right back in.
Posted on Reply
#45
AlwaysHope
Looking forward to retail availability and price, first and foremost. Everything before then is just FUD; some things in this industry never change.
Posted on Reply
#46
Vayra86
TheoneandonlyMrK: Wow, unicorns exist.
It's about time they started shitting rainbows by now. Otherwise it's just a white horse.
chstamos: It's been hyped to kingdom come
Time for Deliverance ;)
Posted on Reply
#47
thelawnet
seth1911: A 192-bit 6 GB card for $150-200 would be the sweet spot these days, not like the garbage 64-bit 4 GB XT, my ass.
Not possible.

The DAG size of Ethereum is 4.7 GB. It won't reach 6 GB till 2024.

The 6500 XT is already a faster card (for gaming) than the 1060 6 GB. And you think it's terrible.

A hypothetical 6 GB 6500 XT would earn around $550, whereas the 4 GB version is useless, because, 4 GB.

There is absolutely zero chance of a 6 GB card for $150-200.

You can have a hopefully slightly better 4 GB card, or you can have something costing $400++. Those are your choices.
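The 2024 estimate above is easy to reproduce. A rough sketch, assuming the approximate Ethash growth rule of ~8 MiB of DAG per epoch, with one epoch being 30,000 blocks (≈4.6 days at ~13 s block times) — approximations for illustration, not consensus-exact values:

```python
# Estimate when Ethereum's DAG would grow from ~4.7 GB to 6 GB.
MIB_PER_EPOCH = 8        # approximate DAG growth per Ethash epoch
DAYS_PER_EPOCH = 4.6     # 30,000 blocks at ~13 s each

current_gib, target_gib = 4.7, 6.0
epochs_needed = (target_gib - current_gib) * 1024 / MIB_PER_EPOCH
years = epochs_needed * DAYS_PER_EPOCH / 365

print(f"~{epochs_needed:.0f} epochs, ~{years:.1f} years")  # roughly two years out, i.e. 2024
```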
Posted on Reply
#48
phanbuey
thelawnet: Not possible.

The DAG size of Ethereum is 4.7 GB. It won't reach 6 GB till 2024.

The 6500 XT is already a faster card (for gaming) than the 1060 6 GB. And you think it's terrible.

A hypothetical 6 GB 6500 XT would earn around $550, whereas the 4 GB version is useless, because, 4 GB.

There is absolutely zero chance of a 6 GB card for $150-200.

You can have a hopefully slightly better 4 GB card, or you can have something costing $400++. Those are your choices.
They could have made parts of it just a bit better, even for a 4 GB card. It does seem like a laptop part that was glued onto a desktop PCB and clocked to the moon at the last minute. It's fine for what it is, but I feel very bad for anyone who has to buy one.
Posted on Reply
#49
lexluthermiester
usiname: Your Sandra says that the Radeon VII is faster than the RTX 3090.
And if you really believe that, I have a bridge in Brooklyn, New York to sell you...
Posted on Reply
#50
watzupken
thelawnet: Not possible.

The DAG size of Ethereum is 4.7 GB. It won't reach 6 GB till 2024.

The 6500 XT is already a faster card (for gaming) than the 1060 6 GB. And you think it's terrible.

A hypothetical 6 GB 6500 XT would earn around $550, whereas the 4 GB version is useless, because, 4 GB.

There is absolutely zero chance of a 6 GB card for $150-200.

You can have a hopefully slightly better 4 GB card, or you can have something costing $400++. Those are your choices.
To me, it is a bad card as well. Performance-wise, only if the system you are using it in has a PCIe 4.0 slot will you get the full performance of the card. And even if you get 100% of what you paid for, you are comparing against a GPU that is two generations back, launched 6+ years ago, with likely no driver optimisation by now. The improvement in performance is "meh" to say the least. And the 2 GB difference in VRAM will also hurt performance at higher graphical settings. The story is worse when you compare both cards on a system that only has PCIe 3.0.
Objectively, the performance is decent if certain conditions are met. But this is a card I will only recommend if someone must buy a GPU because their existing one is dead or dying. Users of the GTX 1060 6 GB and RX 570/580 should just continue to use their card until the opportunity presents itself to buy a better GPU at their target price, or until their GPU bites the dust. From an attractiveness standpoint, the RX 6500 series is a fail.
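The PCIe 3.0 penalty described above is easy to quantify. A quick sketch, assuming the RX 6500 XT's x4 link and the 128b/130b line coding used by both PCIe 3.0 and 4.0:

```python
def link_bandwidth_gb_s(transfer_rate_gt_s: float, lanes: int) -> float:
    """Raw one-direction PCIe link bandwidth in GB/s (128b/130b encoding)."""
    return transfer_rate_gt_s * (128 / 130) / 8 * lanes

pcie3_x4 = link_bandwidth_gb_s(8.0, 4)    # PCIe 3.0: 8 GT/s per lane
pcie4_x4 = link_bandwidth_gb_s(16.0, 4)   # PCIe 4.0: 16 GT/s per lane

print(f"PCIe 3.0 x4: {pcie3_x4:.1f} GB/s, PCIe 4.0 x4: {pcie4_x4:.1f} GB/s")
```

With only four lanes wired up, dropping from a 4.0 to a 3.0 slot halves the card's host bandwidth, which is why the generation of the slot matters so much for this particular GPU.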
Posted on Reply