
Intel Reorganises its Graphics Chip Division, Raja Koduri Seemingly Demoted

...an introductory product,...
Intel is the most ubiquitous graphics maker among all; look at how many of their GPUs are integrated into machines all over the world. The only thing different here is that there are more (and some additional) units inside and it's a stand-alone card, which isn't a first for them either. No excuses with the drivers (they've been bad for ages).

Reorganisation might bring something good, or it is simply unavoidable; I don't buy the "he had emergency back surgery on a business trip" story for one minute.
 
Sounds like he just wants back in the shop?

But I hope the back problem goes away for him, that's no fun.
 
Dude, Lisa Su has brought stability to the CPU division!! The reason you can buy a $150 R5 5600 6-core CPU is partially because of her and what the rest of the team at AMD has done since 2017. Her track record in terms of execution, release schedule, and even performance has been spot on. Yes, they gambled on Vega & HBM and lost, but the 7900XTX isn't a failure. I think / hope / pray they are on the cusp of coming up to par with Nvidia next gen. Put some respect on Lisa's name!! Lol
Jim Keller is the one who saved AMD. Lisa afterwards did a good job keeping it going.
 
"Raja Koduri demoted" what a surprise...
I've ALWAYS said that Raja is a smoke and snake oil salesman: he ALWAYS overhypes, underdelivers HARD and is always late. He did that at AMD and effed up hard; now he's doing the same at Intel, overpromising and underdelivering.

How he landed at Intel is a mystery to me. He reminds me of those "liquidator CEOs" that corporations on the brink of bankruptcy bring in, who end up firing everyone and closing everything to make the company sweeter for a sale/takeover.
 
I disagree. No, they weren't perfect, but no one expected them to be. For an introductory product, they not only got a lot right, but they have greatly improved and optimized what they didn't.
Absolute rubbish. Intel has been making iGPUs longer than AMD has, and that's not counting their decade+ of chipset GPUs.

There is zero excuse for a corporation that spends over $10 billion a year on R&D, with decades of experience making display adapters, not to get basic things right, like the ability to plug into monitors without freaking out, or a control panel that functions. These are driver-101 basics that Intel screwed up. What they are "greatly improving" are basic features that should have worked at launch, not four months later.

This is the same line of copium that No Man's Sky fans have been huffing for half a decade to justify a product being released in an alpha state at full price.
 
Exactly,
it's not like Intel NEVER made anything with graphics, they're new to the DISCRETE GPU market, that's it.
But it does look like the teams that make the iGPUs/software for them might as well be in a different company, as they tried to do everything from scratch and reinvent the wheel.

What I don't understand is how they could fuck up basic stuff like a driver installer, or the driver UI.
 
Intel... new to the DISCRETE GPU market

If they want to succeed, they have to offer something that AMD or Nvidia don't. But looking at their lineup, instead of offering something better, they are offering something much worse.
 
They have done it before, it's just been a while; people are quick to forget.
GTX275 era. It is also not about stupidity but pride.

Huh, it appears they have done it at least 3 times over the decades. I was only aware of the one I experienced directly on the GTX275, which made me sell it and go back to my 4850.
I use a 3080 Ti currently, btw.


That said, this is about Intel, and currently ARC doesn't perform well enough to merit its use in modern games, and doesn't properly support old games... and requires ReBAR, sooo.
That must be the biggest scandal in tech history. Shame on Nvidia!
 
Intel really can't do anything with their 3D rendering because CUDA is the industry standard; OpenCL isn't really supported by anyone, and where it is supported, the performance isn't as good as on CUDA. AMD, Intel and Apple need to push a CUDA alternative by paying software devs to support it.
You might want to share that with the Blender Foundation. The Arc series can do GPU rendering in Blender.
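For what it's worth, here is a rough sketch (not from the thread) of pointing Cycles at an Arc card from Blender's Python console. It assumes a Blender build with oneAPI support (3.3 or newer) and has to run inside Blender, since bpy is Blender's bundled module:

```python
# Rough sketch: select an Intel Arc GPU for Cycles rendering via Blender's Python API.
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "ONEAPI"  # the Arc path; "CUDA"/"OPTIX"/"HIP" serve the other vendors
prefs.get_devices()                   # refresh the detected device list

for dev in prefs.devices:
    dev.use = (dev.type == "ONEAPI")  # enable only the oneAPI (Arc) devices

bpy.context.scene.cycles.device = "GPU"
print([d.name for d in prefs.devices if d.use])  # sanity check: which devices are active
```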
 
If Intel can't compete with NVIDIA/AMD in discrete GPUs then nobody can. The entire project does seem mismanaged but we should all be hoping for it to be a success and that they continue the development and release of updated drivers and B- and C-Series Arc cards.
 
^^ This 100% ^^

In case anybody's not paying attention, the current generation of cards released so far by AMVidia are priced at $900, $1000, $1200, and $1600.

Two of them are currently sold out at $1000 and $1600. Prices will only continue to go up if this duopoly doesn't get a decent challenger.
 
Why did Intel hire Raja?

How many people in the world can do what he does?

The facts indicate that he is very good at designing compute GPUs. AMD's compute chips are still derivative of the GPU architecture developed when Raja was there. Arc does seem to be a solid compute design.

One of Raja's shortcomings is making that compute performance show up in games. Guess which market, professional compute or consumer gaming, is more important to AMD, Intel, and Nvidia. Hint: it's not most of us on this forum.

Another of Raja's problems is consistently overhyping whatever he is working on, apparently not just to press and consumers, but also within the company.

Intel NEEDS GPU compute, and Raja delivers there. Gaming consumers can only hope that Intel sticks it out in our market and brings in better competition. Raja seems to be one of those people who are best utilized behind closed doors and not allowed to talk to the press.
 
Then there'll be none of us left.
Hey, stop putting pressure on my blue gray yellow stars!

Exactly,
it's not like Intel NEVER made anything with graphics, they're new to the DISCRETE GPU market, that's it.
But it does look like the teams that make the iGPUs/software for them might as well be in a different company, as they tried to do everything from scratch and reinvent the wheel.

What I don't understand is how they could fuck up basic stuff like a driver installer, or the driver UI.
But that's just the thing. Intel's GPU project started off on the wrong notions:
Early Xe for consumers was just more execution units of what they already had. They were literally making their iGPUs wider and bigger.
The Xe that followed was a massive tile for enterprise.
The real Xe is a cut-down, gaming-oriented result of the above two elements, and in the meantime they never went deep into the gaming driver regime. Instead, they went for the quick win of emulating anything pre-DX12 and then optimizing DX12 titles one by one. You can't even make it up. They've made a GPU 'work for some games', and then started adding a massive featureset of which the better half isn't even properly functional, up to and including simple things like actually showing something on a display. The baseline DX12 performance per square mm of die space AND the perf/watt are below par, which translates to not being competitive in the market, and only their best-optimized titles are somewhat competitive in performance.
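(Side note on the pre-DX12 point: Intel said at launch that Arc runs DX9 through Microsoft's D3D9On12 mapping layer rather than a native DX9 driver. A rough, hypothetical way to see which path a given game is on is to list the Direct3D DLLs its process has loaded; the sketch below assumes Windows, the psutil package, a made-up process name, and enough privileges to inspect the process.)

```python
# Rough sketch: list the Direct3D DLLs a running game has loaded, to check
# whether DX9 calls are being routed through the d3d9on12 mapping layer.
import psutil

TARGET = "somegame.exe"  # hypothetical; replace with the actual process name

for proc in psutil.process_iter(["pid", "name"]):
    if (proc.info["name"] or "").lower() == TARGET:
        try:
            paths = {m.path.lower() for m in proc.memory_maps()}
        except psutil.AccessDenied:
            continue  # inspecting other users' processes may need an elevated prompt
        for p in sorted(paths):
            if "d3d9" in p or "d3d12" in p or "dxgi" in p:
                print(p)  # d3d9on12.dll showing up means the 9-on-12 layer is in use
```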

In the meantime the GPU design contains lots of expanded pieces of 'stuff' that are hardly, if ever, utilized in games. There are interviews of Raja boasting all about how future-proof all of that really is. Too bad you do need a future for that. In the tiny handful of titles where those resources can be utilized, Arc is still barely competitive.

It's Raja all the way:
Horrifying project management skills,
terrible PR to oversell it,
no grasp on time to market,
ending up with an inefficient, unmarketable product at the very last stage, past the planned release date, at which point there is no turning back.

Vega = Arc. It's a perfect match, sans HBM, or put differently, this time Raja didn't even NEED HBM to fuck it up.

If Intel can't compete with NVIDIA/AMD in discrete GPUs then nobody can. The entire project does seem mismanaged but we should all be hoping for it to be a success and that they continue the development and release of updated drivers and B- and C-Series Arc cards.

Well honestly, let's just pass this project over to Samsung, maybe they can do better. I remember the rumors about them wanting to buy RTG :)

Intel & gaming just aren't a match. Even their 'gamur' CPUs now aren't really gamer-oriented, even if they 'work fine' for the purpose. But the real gaming CPUs are the X3Ds now: a simple step away from a proven design, but specifically oriented at gaming tasks, and damn good at exactly the most problematic CPU loads in gaming.
 
I remember reading that Intel had issues scaling Xe (can't remember if it was the whole architecture or just the driver side) from the iGPU business to the dedicated cards. With Raja at the helm (not anymore), that speaks volumes.
 
Only warning.
Stay on topic.
Stop the insults.
Stop the trolling.
Follow the guidelines!

If I missed any post that is off topic, trolling, insulting, etc. put in a report and they will be dealt with.
DO NOT post and become part of the problem, or you will be dealt with as a problem.
It is like this... for all of you who think we are watching every post: we do not, and we are not here 24/7.
And mods are assigned to specific forums and have limited access in other sections.

Thank You and have a good day/night.
 
They have been making GPUs forever now; drivers shouldn't have been an issue, nor should native support for DX11/10/9. They didn't fail where we expected them to (performance), but on the basics. Killing it makes total sense, as they'll never compete with Nvidia or AMD on performance or price; those two can just lower prices on the previous generation of GPUs and crush anything Intel comes up with.

Price-wise it is the same as a 3060 Ti/6700 XT, while being destroyed by both.
Deciding after one year that it must be killed is the impatience I spoke of.

What GPUs have they been making for years? iGPUs don't count.

The prices are way too high and the drivers are/were not ready; it's as simple as that, really. It smacks of engineers saying "not ready" and executives saying the deadline is up.

If Nvidia and AMD gave up after one bad generation we would have no discrete GPU manufacturers today.
 
And your point? Whether he was in charge of a project with a 2-person team or 5,000 people doesn't change the fact that he over-promised and under-delivered, nor was this a one-off thing.
And your point? Whether his expectations and results were closely aligned or not, people like you ignore all the hard work and only offer one-dimensional criticisms.
 
Was I trying to make a point there? Your "none of you have walked in his shoes, therefore you shouldn't say anything" rant is nonsense. No one here has to do what he does to see an individual who isn't learning from his past mistakes and is repeating them. Just because he's doing that on a larger scale doesn't mean he's above reproach.
 
^^ This 100% ^^

In case anybody's not paying attention, the current generation of cards released so far by AMVidia are priced at $900, $1000, $1200, and $1600.

Two of them are currently sold out at $1000 and $1600. Prices will only continue to go up if this duopoly doesn't get a decent challenger.

Yeah, I mean I bought the "ripoff" 7900XT and honestly it's a great card in every way. I love it and have no complaints, except that it should have been $799, not $899. If Intel cannot compete in the next couple of years, prices will continue to go up and performance per dollar will continue to go down. If Intel can compete, things will get better, or at least stay the same as they were in the past, where each gen delivered significantly better performance and efficiency for the same price or cheaper.
 
Why did Intel hire Raja?

How many people in the world can do what he does?

The facts indicate that he is very good at designing compute GPUs. AMD's compute chips are still derivative of the GPU architecture developed when Raja was there. Arc does seem to be a solid compute design.

One of Raja's shortcomings is making that compute performance show up in games. Guess which market, professional compute or consumer gaming, is more important to AMD, Intel, and Nvidia. Hint: it's not most of us on this forum.

Another of Raja's problems is consistently overhyping whatever he is working on, apparently not just to press and consumers, but also within the company.

Intel NEEDS GPU compute, and Raja delivers there. Gaming consumers can only hope that Intel sticks it out in our market and brings in better competition. Raja seems to be one of those people who are best utilized behind closed doors and not allowed to talk to the press.


It's like hiring a diesel engine specialist and tasking him with designing a high-RPM racing engine that runs on gasoline.

Raja is talented; let's not forget that. But they should just focus on putting out chips built specifically for gaming and chips built specifically for compute, not hacked-up compute chips rebranded as graphics chips (Vega?).
 
Software was a disaster. They clearly over-promised on the GPU side of things, performance
No it wasn't, and no they didn't. The drivers had problems, sure, but a disaster they were not. They didn't over-promise. They stated clearly what people should expect, and that level of performance was delivered. EVERY review clearly showed that fact.

Intel is the most ubiquitous IGP maker among all
Fixed that for you to be in proper context.
Reorganisation might bring something good
Perhaps. The economic slump is likely the reason.

Yup, there it is. An IGP is a completely different beast when compared to discrete GPUs.
 
Deciding after one year that it must be killed is the impatience I spoke of.

What GPUs have they been making for years? iGPUs don't count.

The prices are way too high and the drivers are/were not ready; it's as simple as that, really. It smacks of engineers saying "not ready" and executives saying the deadline is up.

If Nvidia and AMD gave up after one bad generation we would have no discrete GPU manufacturers today.
You have to look at reality: Intel couldn't compete with 2-year-old GPUs (the 770 is similar to the 3070 in die size). Intel came late to the party and knows they won't be able to compete for several generations; sales will be abysmal unless the cards are sold at a loss (as is probably the case with the 770).

iGPUs do count: they output to HDMI, DP, DVI, etc., decode video, and are compatible with all DX versions, hence they play games. You could argue it's difficult to scale them up, but it seems Intel's new GPU division didn't use that expertise. Performance could be excused, but the lack of compatibility with earlier DX versions and all the issues they have now cannot.
 
This is very interesting, but maybe they should launch A-series Arc first? I mean, properly.
 
Nah, in early 2023 we'll get Nvidia's and AMD's midrange and low-end cards. And as bad a value as they will present (no price/performance increase), they will highlight even more that Intel's cards were designed to compete with the 2018-2020 GPU generation.
 
Ah yes, Raja Koduri! He continues his tradition of overblown claims followed by abysmal failures.

His work on Vega got him kicked from ATi and now his work on ARC has him demoted by Intel. The fact that Intel is keeping him shows how desperate they are.
 