Thursday, September 10th 2015
AMD Clumps Various Graphics Divisions as Radeon Technologies Group Under Koduri
AMD this Wednesday announced a major internal re-organization, merging its various visual computing divisions into a monolithic entity called the Radeon Technologies Group. It will be led by Raja Koduri as Senior Vice President. Koduri previously served as VP of the company's visual computing division. He will now report directly to CEO Lisa Su, and the various other graphics-related divisions (e.g., the professional graphics business, led by Sean Burke) will report to him. AMD's lucrative semi-custom business - the one responsible for the SoCs that drive popular game consoles such as the Xbox One and PlayStation 4 - will also come under the unified Radeon Technologies Group.
"We are entering an age of immersive computing where we will be surrounded by billions of brilliant pixels that enhance our daily lives in ways we have yet to fully comprehend," said Lisa Su. "AMD is one of the few companies with the engineering talent and IP to make emerging immersive computing opportunities a reality," said Koduri, adding "Now, with the Radeon Technologies Group, we have a dedicated team focused on growing our business as we create a unique environment for the best and brightest minds in graphics to be a part of the team re-defining the industry."Many Thanks to Dorsetknob for the tip.
"We are entering an age of immersive computing where we will be surrounded by billions of brilliant pixels that enhance our daily lives in ways we have yet to fully comprehend," said Lisa Su. "AMD is one of the few companies with the engineering talent and IP to make emerging immersive computing opportunities a reality," said Koduri, adding "Now, with the Radeon Technologies Group, we have a dedicated team focused on growing our business as we create a unique environment for the best and brightest minds in graphics to be a part of the team re-defining the industry."Many Thanks to Dorsetknob for the tip.
22 Comments on AMD Clumps Various Graphics Divisions as Radeon Technologies Group Under Koduri
EDIT** Plus the bad name the ATI brand got from AMD's repeated fall-on-their-face bullshit: Phenomenally shit, Bulldozer of trash, Excavating your septic system at low, low prices.
This may also be the beginning of a bankruptcy for AMD, with the parent company left to spin all the bad debts off to and run into the ground before they all jump ship.
Finally realizing it, eh?
It might be a good thing that someone who is experienced in GPUs (especially mobile ones, where the rapid progress really happens these days) is at the head of Radeon, but it just seems like AMD is throwing their CPUs away. This can't be good for AMD and it can't be good for Intel either.
Who knows? AMD/ATI has a reputation for throwing away things that suddenly become valuable. Look at Adreno. Qualcomm's having a great time with Adreno (although they're having a shitty time with ARMv8).
They can't get their shit together as it is. As a separate company, the collaboration timing will be an even bigger failure, no?
Attention anyone who is still foolish enough to own shares in AMD: sell them. Now. Before they're worth nothing.
it wasn't too early. it was right on time. it is the logical next step after the stacked-cores concept.
it is the software industry that failed to catch on and utilize all that bulldozer can offer. 4 years have passed and most software still struggles to use 2 cores properly, not to mention 4 or more.
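For what it's worth, the "software has to catch up" argument is concrete: extra cores only help if code is explicitly restructured to use them. A minimal C++ sketch of my own (not from anyone in this thread) summing an array serially and then split across worker threads:

```cpp
// Minimal sketch: the same reduction done on one core, then split across
// N threads. Unless software is rewritten along these lines, the extra
// Bulldozer cores simply sit idle.
#include <algorithm>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    const std::size_t n = 1 << 24;
    std::vector<int> data(n, 1);

    // Serial version: one core does all the work.
    const long long serial = std::accumulate(data.begin(), data.end(), 0LL);

    // Parallel version: each thread sums its own slice, partials are combined.
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<long long> partial(workers, 0);
    std::vector<std::thread> pool;
    const std::size_t chunk = n / workers;
    for (unsigned t = 0; t < workers; ++t) {
        const std::size_t lo = t * chunk;
        const std::size_t hi = (t + 1 == workers) ? n : lo + chunk;
        pool.emplace_back([&, t, lo, hi] {
            partial[t] = std::accumulate(data.begin() + lo, data.begin() + hi, 0LL);
        });
    }
    for (auto& th : pool) th.join();
    const long long parallel = std::accumulate(partial.begin(), partial.end(), 0LL);

    std::cout << serial << " == " << parallel << '\n';
}
```

Trivial reductions like this parallelize cleanly; most real application logic does not, which is why so much software was still effectively dual-threaded four years after Bulldozer shipped.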
This RTG must place its predominant focus on professional: that's where the R&D could give the best returns, although it entails big expenditures on drivers and compliance. That R&D, for the most part, is still relevant to consumer/gaming parts. Meanwhile, the SoC and console business can in many ways set the pace and direction PC gaming evolves in; while not lucrative short term, it can pay dividends long term.
Professional: the new group would be smart to focus on a given task, where a small on-interposer CPU works in coordination with the GPU and HBM to execute that particular task expediently with reduced (overall system) power needs; they might find a good niche in HPC. Any good win in a fairly high-profit area like HPC would be a boon to cash flow.
I don't know how soon HBM2 can be in consoles, but if you can supply an APU with compact shared HBM, ultimately providing improved efficiencies in power, performance, construction, and package size... there's value in that for all parties, and with it improved returns. Also, the group should keep a footing in graphics/advertising signage systems. It's a cursory area they can hold onto with existing rudimentary technology, and there's an appreciable, constant cash flow.
As to gaming… stay true to timelines, and stop sugar-coating or pretending you best the competition across all segments. When you have an upstanding offering in a strong market position, just provide review samples to keep it at the forefront. I don't see social media as the predominant path to promotion, and never should an executive/engineer "out" any snippets or barbs. Got something that needs to be said? Vet it through PR and release it to the various media at the same time. Want to drive buzz... reviews, and build on them by posting on social media saying "so-and-so" put up a review. Possibly tie winning free game(s) to folks who "like" the post. Then finally (for later, when there's cash flow) set up local gaming contests/conventions that "break out" through the use of social media to land an invite.
They'll need to control their message, especially moving forward to Arctic Islands: either it's a released statement (as stated above), or you come out and say you "neither confirm nor deny" anything; leaks will not gain traction if you stick to that and do it early. Don't say anything about being XX% ahead of the competition; heck, don't even say it's better than your XX product by some amount. Just say you're confident it will find plenty of customers at the intended price point.
there are tons of such examples, and quite a few apps that can put all those cores to use.
what i said in my previous post is valid for intel cpus too. @Frick's comment i quoted was about bulldozer, so i gave an example with it.
I built a few machines with them; they were good middle-of-the-road chips, but considering that 90% of today's workloads will run faster on an older C2D with an insane overclock, they are still irrelevant.
AMD/ATI has had a long-standing love of making/using a few new instruction sets and presenting the benchmarks they enable as "new numbers," when it's a standard that will not see the light of day before the hardware is too old, or before someone comes along and does it better. Their occasional home runs over the last 6 or so years have come from others' failures, not their own successes.
Is it inappropriate to develop a new way of implementing ideas (in this case an architecture) because patents prevent you from replicating what the competition does? Benchmarks are perhaps the only way to stimulate such new technologies; how else do you intend to promote a new solution against a competitor's lock on anything? Even when the benchmarking shows real merit, the software developers willing to invest the time could, in various fashions, feel pressure from the three-hundred-pound gorilla, who can probably intimidate in various ways.
It's in a way like saying: stay in the back, don't rock the boat, all while everyone continues to pass tribute forward to the almighty. That's not how you advance new ideas in any market, especially in technology. There can be better ways to do something, but the status quo/dominant player can put a powerful stranglehold on emerging ideas when those rulers feel vulnerable.
Why did AMD bring out Mantle to showcase a low-level API and true asynchronous shaders? I almost want to think MS reneged on AMD; MS coveted it, hoping to use such gains for the Xbox only, as it was something they didn't want to fully offer to PC gaming. AMD pressed them: adopt it, or others might pick it up. If others did (Ubuntu, SteamBox, Apple?), MS might watch their DirectX dominance become undermined, and with it their last real vestige propping up their OS, as competition presented an alternate (perhaps better) path for PC gaming to grow along.
If you would like to blindly ignore their track record, that is your prerogative, but save for a few very good cards they have had little success producing performance winners in their main CPU and GPU businesses.
With that said, it appears you couldn't comprehend my post at all, ignoring everything that was said (and it wasn't meant to be specific). Now, with this retort, you appear to have offered your tribute forward on faith, hopeful those on high will continue to sustain your wants.
They have continued to showcase mediocre-performing products, failed to materialize the improvements alluded to, and I am certain they were aware of the lack of process shrinks at the same time their competitor was.
At the end of the day owning a fab or not doesn't mean jack to the end product when you have a competitor who is able to produce and perform with the same constraints.
All I want is competition so prices stay at the sane level, and performance continues to improve, and companies remain profitable so they can innovate and move forward.
Simply put, AMD sacrificed too much by going to BD IMHO when they could have invested that R&D into making the Phenom II more efficient.
I'll give AMD props for looking forward, but we all found out quickly that a CPU is only as good as its weakest link.
AMD hadn't been all that "out of the ballpark" in performance/Watt with GCN (they led in the 6XXX-series/Fermi era) until Maxwell (GM204) in Sept 2014; it's been just a year... although a bad year. AMD's response being late, and seemingly lackluster, comes down to the plain fact that they have little money to invest in R&D and production. Nvidia's R&D found they could discard much of the non-gaming functionality to provide such efficiency (though little performance increase). Nvidia, being so big now, is probably looking to separate chips for gaming and professional more than they had ever been able to in the past. I don't really keep up with the professional market, so I'm not sure how that's affecting them, but I believe they still rely mainly on Kepler in many professional SKUs.
Beating down AMD's 2015 graphics response for woes caused by past BoD decisions that left them short of capital... seems like just ganging up. They're in a tough situation, to be sure. Would everyone have liked AMD to have had the cash flow to re-spin, say, Pitcairn and Hawaii to use GCN 1.2? Perhaps; although consider what they could have gained for such an expenditure. We saw how Tahiti-to-Tonga gave a mediocre bump, and honestly the percentage gain over Pitcairn couldn't have made sense against the competitive/price landscape of "entry" cards; and since Hawaii's GCN 1.1 already had much of GCN 1.2, that would have been even less a return on investment.

It was smarter for AMD to conserve their resources and put the bulk of the effort toward 14/16nm FinFET, seeing (at least on the graphics side) the stagnation of 28nm; honestly, I still believe it wasn't a great process node for the industry. TSMC upped prices in the beginning, they had production issues at the start, and while the node had efficiency gains, those seem to have been forfeited to lower clocks. Nvidia, by stripping out the non-essential bits that don't improve gaming, finally found huge untapped potential for efficiency and smaller dies (~30%), which helped their costs. We don't know whether AMD could have done the same culling, as it might have affected the architecture they're building on true asynchronous shader pipelines.
All we can do is watch and see whose path plays out better. Will AMD find new traction with emerging DX12 titles? Can AMD get to 14nm FinFET and HBM2 across more SKUs before Nvidia? Will Nvidia's coming driver, intended to emulate running graphics and compute workloads concurrently, be as good as AMD, who have kept the ACE units that are central to GCN?
And this brings the conversation full circle: "AMD/ATI has had a long standing love of making/using a few new instruction sets." AMD has been waiting for its investment in graphics with concurrent compute workloads to pay off since probably before 2010, basically since they started looking at next-gen console parts, and now with DX12 it's seeing fruition. Whether it works out for them, we'll have to wait and see. As you said, "when it's a standard that will not see the light of day before the hardware is too old, or before someone comes along and does it better." In this case, Nvidia is calculating they can hold out with Maxwell as regards async compute: it won't really start to hurt them for a while, and once it does, it won't matter because they'll have Pascal. Marketing can then show it as a reason to move off Maxwell parts. The part I wonder about is how many Nvidia-sponsored DX12 games will be released in the next year where asynchronous workloads are fudged with, so those games look decent on Maxwell cards?
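A hedged aside on what "asynchronous compute" actually looks like from the application side: under Direct3D 12, a game simply creates a second, compute-only command queue next to its graphics queue and submits work to both. A minimal C++ sketch (error handling omitted; `device` is assumed to be an already-created ID3D12Device):

```cpp
// Sketch of async-compute setup in Direct3D 12: a graphics (DIRECT) queue
// plus a separate COMPUTE queue. Whether work on the two queues actually
// runs concurrently is up to the GPU and driver, which is exactly the
// Maxwell-vs-GCN question above.
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue) {
    // The usual graphics queue: accepts draw, compute, and copy work.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

    // A second, compute-only queue: command lists submitted here *may*
    // overlap with rendering on the graphics queue.
    D3D12_COMMAND_QUEUE_DESC cmpDesc = {};
    cmpDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    device->CreateCommandQueue(&cmpDesc, IID_PPV_ARGS(&computeQueue));
}
```

The API only expresses the opportunity for overlap; GCN's ACE units can execute the two queues concurrently in hardware, while a driver is free to serialize or emulate the overlap instead, which is why the same DX12 code can behave so differently across vendors.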