Wednesday, December 21st 2022

Intel Reorganises its Graphics Chip Division, Raja Koduri Seemingly Demoted

Big things are afoot at Intel's graphics chip division once again, as the company has just broken up its Accelerated Computing Systems and Graphics (AXG) business unit. For starters, Raja Koduri has been, for lack of a better word, demoted: he returns to being chief architect rather than heading the AXG business unit. Some of his staff will move to other business units inside Intel as AXG ceases to exist. This doesn't mean Intel will stop making discrete consumer GPUs; at the very least, the Battlemage/Arc B-series launch is still planned to take place sometime in 2023.

At the same time, it looks like Raja Koduri will be out of action for what is likely to be at least a month, since he posted on Twitter that he has had emergency back surgery while on a business trip. How this will affect his transition back to the chief architect role is anyone's guess at this point. However, he will not be focusing solely on GPUs in the future, but on Intel's broader range of products, particularly the integration of GPU, CPU and AI architectures. We've posted Intel's official statement, which the company provided to Tom's Hardware, after the break. We also wish Raja a speedy recovery!
Discrete graphics and accelerated computing are critical growth engines for Intel. With our flagship products now in production, we are evolving our structure to accelerate and scale their impact and drive go-to-market strategies with a unified voice to customers. This includes our consumer graphics teams joining our client computing group, and our accelerated computing teams joining our datacenter and AI group.

In addition, Raja Koduri will return to the Intel Chief Architect role to focus on our growing efforts across CPU, GPU and AI, and accelerating high priority technical programs.
Sources: Bloomberg (paywall), Tom's Hardware, Raja Koduri (on Twitter)

131 Comments on Intel Reorganises its Graphics Chip Division, Raja Koduri Seemingly Demoted

#51
nguyen
Totally: And your point? Whether he was in charge of a project with a 2-person team or 5,000 people doesn't change the fact that he over-promised and under-delivered, nor was this a one-off thing.
Nah, the over-promising and under-delivering is just an AMD thing, like what they are doing with the 7900 XTX without Raja.

Raja has been quietly working at Intel; hopefully they can make a breakthrough soon. The GPU market needs a competent competitor to Nvidia.
Posted on Reply
#52
SSGBryan
Xex360: They are losing lots of money on this, plus there's no way they can come close to nVidia or AMD; better to kill it and move on to something else.
The A770 is around a 3060 Ti in performance, cheaper, with more VRAM for texture packs (and 3D rendering), and still in the early days of driver development.

I am getting myself one in January.
Posted on Reply
#53
Jism
ThrashZone: Hi,
Intel and software/firmware have always been pretty bad.
But Raja, well, when with Asus, fried lots of Haswell-E and Broadwell-E chips and boards and just said it was because of weak chips on the ROG forum :laugh:

Can his ego handle it?
I had an Intel NIC sporadically dying on a perfectly good Asus AM4 board. Once it disappeared, it took the whole board with it two weeks later. Incredible how such hardware isn't tested properly before being put out.

As for Raja: he's still key in designing compute-based hardware. That's what that whole line of Intel graphics cards really is. It's just a derivative of compute hardware that didn't meet quality guidelines as a compute or professional card, just like Vega and Instinct, or GeForce and Quadro.
Posted on Reply
#54
Bwaze
SSGBryan: The A770 is around a 3060 Ti in performance, cheaper, with more VRAM for texture packs (and 3D rendering), and still in the early days of driver development.

I am getting myself one in January.
It also has, in some ways, higher hardware capabilities which may potentially be unlocked in future driver and software updates.

On the other hand, Intel might announce they are exiting discrete gaming cards, which would mean you bought abandonware that won't work properly with new games, etc.

It's a bit of a blow that the A770 isn't even on TechPowerUp's review charts for the RTX 4080 and RX 7900 XT, because they only extend down to the RTX 3070 and RX 6800. I hope it will be included in upcoming midrange card reviews.
Posted on Reply
#55
Patriot
mclaren85: Did you really get triggered by that opinion? It's called freedom of speech.


Could you provide us with scientific proof?
Don't really think Nvidia is that stupid.
They have done it before; it's just been a while, and people are quick to forget.
GTX 275 era. It is also not about stupidity but pride.

Huh, it appears they have done it at least three times over the decades. I was only aware of the one I experienced directly on the GTX 275, which made me sell it and go back to my 4850.
I use a 3080 Ti currently, btw.

forums.anandtech.com/threads/nvidia-caught-cheating-again-new-aquamark.1154090/
www.se7ensins.com/forums/threads/nvidia-once-again-cheating-benchmarks.1353515/

That said, this is about Intel, and currently Arc doesn't perform well enough to merit its use in modern games, doesn't properly support old games... and requires ReBAR, sooo.
Posted on Reply
#56
MrMeth
ARF: Starts his own firm, or joins as an executive in a smaller, less-known company?!



They are indirectly on topic.
Given the current state of affairs at AMD, maybe Lisa Su should follow because of the terrible graphics decisions... :rolleyes: Lower market share, dark forecast, bad product lineup, etc...
Dude, Lisa Su has brought stability to the CPU division!! The reason you can buy a $150 R5 5600 six-core CPU is partially because of her and what the rest of the team at AMD has done since 2017. Her track record in terms of execution, release schedule, and even performance has been spot on. Yes, they gambled on Vega and HBM and lost, but the 7900 XTX isn't a failure. I think/hope/pray they are on the cusp of coming up to par with Nvidia next gen. Put some respect on Lisa's name!! Lol
Posted on Reply
#57
ARF
elon_trump: That's not how it works, the drivers need to be WHQL certified.
But most of the drivers that AMD releases are NOT WHQL certified. Which opens another question: what exactly is this certification for, and does AMD even want to comply with it? :D
Posted on Reply
#58
qubit
Overclocked quantum bit
v12dock: I've been using Arc for a month now and the drivers are abysmally bad, and updates are not coming at nearly a quick enough pace. Software is the only thing that will save the GPU division, and it looks grim.
eidairaman1: Now you understand why Dr. Lisa Su fired him; he is pathetic.
Now one can see why AMD's graphics cards have always been playing second fiddle to NVIDIA. Why Intel thought it was a good idea to bring him on board, I don't know. Perhaps they should poach engineers off NVIDIA if they can and then make something decent? Intel graphics card releases have been a disaster and not worth the materials they're made with.

I'd be embarrassed to release something as dysfunctional as that and would make an excuse, any excuse, not to release it.

Hope he gets better soon, though. One should still wish people better health regardless, especially at Christmas.
Posted on Reply
#59
Unregistered
chrcoluk: Were they expecting to strike gold after just one product release? If yes, that was a very unrealistic expectation.

Optane was abandoned too early, and what a wasted opportunity that was; if they do the same here, people won't invest in first-gen Intel products, as they might become abandonware quickly.

All they needed to do was loss-lead with a very attractive price point, and wait until drivers were ready so it made a better first impression, but the entire Arc thing smacks of impatience from the top.
They have been making GPUs forever now; drivers shouldn't have been an issue, nor should native support for DX11/10/9. They didn't fail where we expected them to (on performance), but on basic stuff. Killing it makes total sense, as they'll never compete with nVidia or AMD on performance or price, since those two can just lower prices on the previous generation of GPUs and crush anything Intel comes up with.
SSGBryan: The A770 is around a 3060 Ti in performance, cheaper, with more VRAM for texture packs (and 3D rendering), and still in the early days of driver development.

I am getting myself one in January.
Price-wise it is the same as a 3060 Ti/6700 XT, while being destroyed by both.
#60
SSGBryan
Xex360: They have been making GPUs forever now; drivers shouldn't have been an issue, nor should native support for DX11/10/9. They didn't fail where we expected them to (on performance), but on basic stuff. Killing it makes total sense, as they'll never compete with nVidia or AMD on performance or price, since those two can just lower prices on the previous generation of GPUs and crush anything Intel comes up with.

Price-wise it is the same as a 3060 Ti/6700 XT, while being destroyed by both.
3D rendering is an important part of my workflow, and with that, the 16 GB matters.
Posted on Reply
#61
MrMeth
SSGBryan: 3D rendering is an important part of my workflow, and with that, the 16 GB matters.
Intel really can't do anything with their 3D rendering because CUDA is the industry standard; OpenCL isn't really supported by anyone, and where it is supported, the performance isn't as good as with CUDA. AMD, Intel and Apple need to push CUDA alternatives by paying software devs to support them.
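As a rough illustration of that gap, here is a minimal sketch, not taken from the post above, that simply lists the compute devices an OpenCL-based renderer could target on a given machine. It assumes the third-party pyopencl package and a vendor OpenCL runtime are installed; with Intel's compute runtime in place an Arc card should appear in the list, even though most commercial render engines only look for CUDA devices.

```python
# Minimal sketch: enumerate the OpenCL platforms and devices visible on this machine.
# Assumes the third-party "pyopencl" package and a vendor OpenCL runtime are installed.
import pyopencl as cl

for platform in cl.get_platforms():
    print(f"Platform: {platform.name} ({platform.vendor})")
    for device in platform.get_devices():
        print(f"  Device: {device.name}")
        print(f"    Type: {cl.device_type.to_string(device.type)}")
        print(f"    Global memory: {device.global_mem_size / 2**30:.1f} GiB")
```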
Posted on Reply
#62
ReallyBigMistake
Intel i740 all over again?
I guess we will find out pretty soon
Posted on Reply
#63
lexluthermiester
Wow! We can just feel the love in this thread...

Personally, I think Raja has done good. He took Intel's GPU efforts from passable IGPs to full-on, competitive, world-class discrete GPUs. They managed that in the middle of a pandemic and an economic recession. And all many of you can do is whine, complain and throw insults?

Sorry folks, but that's a fail for YOU, not Raja and his teams. Take a step back, use your brains for something more than a seat cushion and see reality for what it is! :shadedshu:
Posted on Reply
#64
Bomby569
lexluthermiester: Wow! We can just feel the love in this thread...

Personally, I think Raja has done good. He took Intel's GPU efforts from passable IGPs to full-on, competitive, world-class discrete GPUs. They managed that in the middle of a pandemic and an economic recession. And all many of you can do is whine, complain and throw insults?

Sorry folks, but that's a fail for YOU, not Raja and his teams. Take a step back, use your brains for something more than a seat cushion and see reality for what it is! :shadedshu:
I also think he did his actual job (as an engineer) well, but he was also the head of the project, and we can't forget the rest: the drivers and the communication were a disaster. Intel (which is in a bad spot) burned a lot of money on the project; you'd expect a bit more from it.
Posted on Reply
#65
Laykun
Lol, what is wrong with some people in this thread? I'd hate to be in a room with half of you trying to have a civil conversation. A lot of sad boys in armchairs throwing rocks at a giant. Raja could out-engineer and out-manage any one of you; the fact that he has the chops to be a lead architect and released a product AT ALL is astounding. I love that people are just making stuff up like "oh, he got fired", just because you don't like him and that's what you wished had happened.

Then we have the tin-foil-hat image quality argument, which apparently started as a "joke", but when people say that later it's usually just to save face. "nvidia did it 20 yers ago! I dun trust em!" There's no image quality difference between the two manufacturers, and anything that would produce image quality savings these days wouldn't give an appreciable performance boost (unless we're talking DLSS vs FSR, which no doubt the goalposts will now be moved to). Which goes to show how little most people understand about how far GPUs have moved in the last 10-20 years. Anisotropic filtering cheating isn't what it used to be, and you can't cheat floating-point precision in your shaders like you used to; there are hard standards you have to conform to in order to be DX12 certified. Even then, GPU testing is even MORE rigorous now than it used to be, and much more empirical; any image quality difference would be reported far and wide through scientific testing.

Go back under your bridges.
Posted on Reply
#66
Dirt Chip
Salty people will salt. That's the way it is.
Forums will see occasional rants.
Flamers will flame. They don't know otherwise.

I need to Haiku this.
Posted on Reply
#67
sLowEnd
His batting average hasn't been very good, but hopefully he can hit a home run this time.
Posted on Reply
#68
TumbleGeorge
MrMeth: Intel really can't do anything with their 3D rendering because CUDA is the industry standard; OpenCL isn't really supported by anyone, and where it is supported, the performance isn't as good as with CUDA. AMD, Intel and Apple need to push CUDA alternatives by paying software devs to support them.
You seem to know better than the owner of a product, who has chosen it for certain reasons, what is better for them? As far as I know, Intel's discrete graphics driver issues are more related to performance when playing PC games. My impression is that in a workflow with content creation and editing programs, the performance is better.
Posted on Reply
#69
lexluthermiester
Bomby569: the drivers... were a disaster
I disagree. No, they weren't perfect, but no one expected them to be. For an introductory product, they not only got a lot right, but they have greatly improved and optimized what they didn't.
Posted on Reply
#70
Bomby569
lexluthermiester: I disagree. No, they weren't perfect, but no one expected them to be. For an introductory product, they not only got a lot right, but they have greatly improved and optimized what they didn't.
Software was a disaster. They clearly over-promised on the GPU side of things, on performance, but especially on the drivers, which were not ready for what we were told the product would be, and the old DX9 and DX10 compatibility should never have been a surprise for us to uncover like an Easter egg. It was all very disastrous and a brand/PR nightmare.
Posted on Reply
#71
AnarchoPrimitiv
ARF: I suspect nvidia is cheating at the driver level, providing lower image quality for higher framerates.
While on this topic, I feel this is an important distinction... as consumers, our entire ability to compare video cards is based upon the assumption that all frames are created equal. It's the common unit of measurement that allows us to compare across brands, BUT with the increasing encroachment of software manipulation through DLSS and FSR, and with Nvidia's frame generation as the most egregious form, can we truly rest assured that a frame generated by an Nvidia card and one generated by an AMD card are exactly the same?

If they are not, and this may sound dramatic, then basically every review based upon the FPS unit of measurement (which is literally every review) is worthless. Obviously the average consumer doesn't have the capability to verify that the frames are equivalent, so if some professional reviewers would tackle this issue, I feel it would be of paramount importance to the entire community going forward.
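For what it's worth, the kind of check described above isn't out of reach: capture the same scene on both cards with identical settings and compare the images directly. Below is a minimal sketch, not any reviewer's actual methodology, assuming numpy and Pillow are installed and both captures are lossless PNGs at the same resolution; the file names are hypothetical placeholders.

```python
# Minimal sketch: compare two lossless frame captures for pixel-level equivalence.
# Assumes both captures are PNGs of the same scene at identical resolution/settings;
# the file names below are hypothetical placeholders.
import numpy as np
from PIL import Image

def psnr(a: np.ndarray, b: np.ndarray) -> float:
    """Peak signal-to-noise ratio between two 8-bit images (higher = more similar)."""
    mse = np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # bit-identical frames
    return 10 * np.log10((255.0 ** 2) / mse)

frame_a = np.asarray(Image.open("capture_vendor_a.png").convert("RGB"))
frame_b = np.asarray(Image.open("capture_vendor_b.png").convert("RGB"))

if frame_a.shape != frame_b.shape:
    raise SystemExit("Captures differ in resolution; comparison is meaningless.")

diff = np.abs(frame_a.astype(np.int16) - frame_b.astype(np.int16))
print(f"PSNR: {psnr(frame_a, frame_b):.2f} dB")
print(f"Pixels differing by more than 2/255: {np.mean(diff.max(axis=-1) > 2) * 100:.3f}%")
```

Bit-identical output is unrealistic, since vendors have some leeway in how shader math is rounded, but a large PSNR gap or big clusters of differing pixels would be exactly the kind of evidence reviewers could report.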
Posted on Reply
#72
pavle
lexluthermiester: ...an introductory product,...
Intel is the most ubiquitous graphics maker of them all; look at how many of their GPUs are integrated all over the world. The only thing different here is that there are additional and more numerous units inside and it's a stand-alone card, which isn't a first for them either. No excuses with the drivers (they've been bad for ages).

Reorganisation might bring something good, or it is simply a must, because I don't buy the "he had emergency back surgery on a business trip" story for one minute.
Posted on Reply
#73
Upgrayedd
Sounds like he just wants back in the shop?

But I hope the back problem goes away for him, that's no fun.
Posted on Reply
#74
mukumi
MrMeth: Dude, Lisa Su has brought stability to the CPU division!! The reason you can buy a $150 R5 5600 six-core CPU is partially because of her and what the rest of the team at AMD has done since 2017. Her track record in terms of execution, release schedule, and even performance has been spot on. Yes, they gambled on Vega and HBM and lost, but the 7900 XTX isn't a failure. I think/hope/pray they are on the cusp of coming up to par with Nvidia next gen. Put some respect on Lisa's name!! Lol
Jim Keller is the one who saved AMD. Lisa afterwards did a good job keeping it going.
Posted on Reply
#75
L'Eliminateur
Gungar"Raja Koduri demoted" what a surprise...
I've ALWAYS said that Raja is a smoke and snake oil salesman, he ALWAYS overhypes, underdelivers HARD and always late. He did that in AMD and effed up hard, now he's doing the same in Intel, overpromising and underdelivering.

How he landed in intel is a mystery to me, he reminds me of those "liquidator CEOs" that corporations on the brink of bankruptcy bring that end up firing everyone, closing everything ti make the company sweeter for a sell/takeover.
Posted on Reply