Monday, December 10th 2018

Ex-Hardware.fr GPU Editor Damien Triolet Jumps Ship from AMD RTG to Intel

Oh hey, remember this news post from July last year? Damien Triolet's work history of late has been one of many such recent stories. These tend to begin with AMD, and RTG in particular, getting a cash infusion and growing in 2016 and 2017 to the point where they hired some of the best engineers and marketing personnel in the industry, media or otherwise. This was followed by a more stagnant GPU division in 2017-2018, Intel deciding to dip its toes back into the discrete GPU market, and, in turn, persuading many to cross over to the blue side.

According to Damien's LinkedIn and Facebook profiles, he started working for Intel on November 26, 2018 in a technical marketing position in their Gaming and Graphics division, a role analogous to the one he held at AMD. Presumably, he joins Raja Koduri and the many others who have followed this exact path of late, and everyone remains curious as to what the finished retail product will be. In the meantime, we here at TechPowerUp wish him the best again for his new venture. We had the pleasure of interacting with Damien on multiple occasions in the past, some as colleagues in the media giving hardware manufacturers a hard time, and others when he was hosting us as an AMD employee. His tenure at Hardware.fr has been inspiring to us, with excellent reviews that no doubt were what caught the eye of AMD in the first place, and Intel will definitely gain from his presence.
Source: LinkedIn

41 Comments on Ex-Hardware.fr GPU Editor Damien Triolet Jumps Ship from AMD RTG to Intel

#26
lexluthermiester
xkm1948AMD should never have purchased ATi in the first place, as it doomed the creative spirit of both companies.
While I personally would have preferred ATI stay separate from AMD, financially it was a great move for both companies and there have been a number of great innovations and advancements as a result. And before anyone calls me a fanboy, I'm an RTX2080 owner and love the card.
Posted on Reply
#27
Xaled
HTCSmall point you seem to be missing: from creating an architecture to have it on sale for consumers takes roughly 4 to 5 years. This means Intel's Core 2 arch was created in 2001 / 2002, way before AMD bought ATI in 2006, so it wasn't buying ATI that precipitated AMD's problems: it was the other way around.
Yes, sure. And that's the reason I say AMD would've made Zen 8 years ago (2010), not 12 years ago (2006). AMD could've had an answer to Intel's Core CPUs in that period IF it hadn't acquired ATI. Instead they made a refresh as a quick answer; then came the disastrous Bulldozer. Just imagine the money and time AMD wasted trying to embed GPUs with CPUs, and it never worked!
HTCMuch like the Zen arch caught Intel off guard, Intel's Core 2 arch caught AMD off guard. Much like Intel is having its competition issues magnified by its 10 nm problems, AMD had to contend with the sudden, unexpected performance boost from Intel's Core 2 arch precisely after it had endangered itself financially by buying ATI: both cases with "rotten" timing.
That's the way it always used to be: AMD and ATI always had answers to Intel and Nvidia. ATI had an answer almost every year! Taking almost 12 years to answer (in CPUs) and 3-4 years (in GPUs) only happened after AMD acquired ATI.
Posted on Reply
#28
davideneco
So... he closed the hardware.fr website to join Intel??
Posted on Reply
#30
AsRock
TPU addict
XaledAMD designed Zen before the crypto bandwagon. Actually, AMD would've made Zen 8 years ago if it hadn't bought ATI. The money that went to ATI's ex-owners should've been spent on CPU development, not on buying ATI.
But could AMD afford to let someone else like Intel or NVIDIA have it? There was a very good reason they needed ATI. Like @eidairaman1 said, getting their hands on ATI allowed them to get into the consoles.
R-T-BThey immediately spun off GloFo after acquiring ATI, IIRC, just to help afford it, meaning that would've kept them afloat for some time if cash became an issue.

Best case, they may have done better AND even had more fab options.
Thing is, none of us know if that would have been enough, and why would anyone pay top dollar for a company losing value?

What I do know is AMD is still around. Even though it has been a hell of a bumpy ride, they are still here. No one knows if they would have survived if they hadn't bought ATI, but one thing is for sure: they would have fewer options today.
Posted on Reply
#31
Vya Domus
R-T-BAND even had more fab options.
Eventually everyone will have to go fabless; you reach a point where you don't have a choice, because your competition is using better nodes that come out of the gate quicker, since companies like TSMC can pour all their money into R&D. Even Intel has come to feel that pressure.

For AMD, turning their fabs into GloFo was another lifeboat that they managed to get on just in time. The fact of the matter is that AMD is the only company with both high-performance CPUs and GPUs in its portfolio, and yes, getting there nearly killed them, but they somehow managed. Nothing comes without a cost.
Posted on Reply
#32
renz496
HTCBut without the chipsets / GPUs gained by buying ATI, AMD could possibly have gone under fighting Intel.
There is no way AMD would have gone under even without ATI. In fact, if AMD had utilized the "other option" they probably would have been able to keep up with Intel just fine while developing their own GPU solution.
lexluthermiesterWhile I personally would have preferred ATI stay separate from AMD, financially it was a great move for both companies and there have been a number of great innovations and advancements as a result. And before anyone calls me a fanboy, I'm an RTX2080 owner and love the card.
The thing is, both companies probably could have gotten so much more than they have so far if the merger had never happened. Yeah, there was some innovation behind the actual idea of the APU, but in the end the APU never became what AMD had imagined. It was just a CPU with an integrated GPU in it. Putting a GPU and a CPU in a single package? NVIDIA did it well before AMD with their Tegras.
Posted on Reply
#33
I No
Vya DomusEventually everyone will have to go fabless; you reach a point where you don't have a choice, because your competition is using better nodes that come out of the gate quicker, since companies like TSMC can pour all their money into R&D. Even Intel has come to feel that pressure.

For AMD, turning their fabs into GloFo was another lifeboat that they managed to get on just in time. The fact of the matter is that AMD is the only company with both high-performance CPUs and GPUs in its portfolio, and yes, getting there nearly killed them, but they somehow managed. Nothing comes without a cost.
The deal they made with GloFo almost crippled them... what are you talking about? If that was a lifeboat, it leaked... badly... The revised deal they made that didn't tie them down to GloFo is what actually saved their skins, and that was only done a while back. Spinning off GloFo and keeping that "exclusiveness" with them is what kept them under Intel's boot all these years. The only thing GloFo did for AMD was manage to make them pay for their incompetence... twice... They managed to get AMD to pay for every wafer it buys from other foundries. If this was a lifeboat, they were better off taking their chances swimming with the sharks...
Posted on Reply
#34
medi01
VSGtechnical marketing position
#couldntcareless
XaledConsoles HURT AMD, they didn't save it at all.
Consoles were nearly half of AMD's business; when the GPU business was shrinking and the CPU products were laughable, they were the only thing that kept AMD afloat before the Ryzen business came along.
Consoles are also the only reason its GPUs are relevant to game developers.
Consoles are also the main reason multicore support went mainstream in games.

The CPU + GPU combo is so strong that Intel figured it had to jump on the bandwagon.

And fabs... With TSMC and Samsung beating Intel to 7 nm, what on Earth are you on about? The underdog with mediocre market share would be able to afford R&D on its own fabs?!?!?!

Oh, boy, how delusional some people are.
renz496but in the end the APU never became
Merely became what's inside the majority of notebooks and desktops.
XXL_AIAll will be done with pretrained datasets, like Nvidia showcased with the RTX platform.
Dear god, let this be sarcasm.
Posted on Reply
#35
renz496
Merely became what's inside the majority of notebooks and desktops.
AMD did not create the APU simply to make that CPU + GPU combo, but that's what the APU ended up being. Just about anyone can do that: Intel, NVIDIA, Samsung, Qualcomm, Apple, Huawei, and MediaTek are all able to make their own version of it. The APU was supposed to be a specialized processor that could do things no CPU or GPU can. So from one perspective AMD tried to do something different, but never got it to where it should be. Not even a demo ever existed to showcase these specialized APU capabilities.
Posted on Reply
#36
Xaled
renz496AMD did not create the APU simply to make that CPU + GPU combo, but that's what the APU ended up being. Just about anyone can do that: Intel, NVIDIA, Samsung, Qualcomm, Apple, Huawei, and MediaTek are all able to make their own version of it. The APU was supposed to be a specialized processor that could do things no CPU or GPU can. So from one perspective AMD tried to do something different, but never got it to where it should be. Not even a demo ever existed to showcase these specialized APU capabilities.
That was like Apple buying Samsung to make new mobile phones that are better than the iPhone! (or the opposite)
Posted on Reply
#38
comtek
TheGuruStudYes, yes, purge the losers. Intel can have em all.
So, are you saying he is a loser?
Posted on Reply
#39
sergionography
AsRockAnd who's to say it would have turned out OK? You're just presuming that 7b would have been enough; maybe it would have been, but who really knows? Maybe they knew they were going to have issues in the near future.

Hell, Intel had already turned the market against them for the most part; they knew they could not do that with the graphics side of things, so they took a risk.

Sure, it's been rough on them, but they are finally coming back.
Read what I said carefully. I think AMD merging with a graphics company was a good choice long term, as that is an apparent need these days (look how serious Intel is about it too). However, the AMD-ATI deal was simply terrible and a huge risk that cost them a lot. Also, the second part was that AMD management at the time was simply terrible, and I pointed out Jen-Hsun Huang because, from his track record, I'm absolutely certain that he would have handled the company much better.

Also, from a business standpoint, the initial goals of the buyout mostly failed because they did it on the basis of "the future is fusion", lol. They then went on about the whole Heterogeneous System Architecture (HSA), which hardly picked up and simply died as a term once Ryzen came out. The whole "GPUs are better at floating-point operations while CPUs are better at integer" argument; well, that's the mindset that resulted in the Bulldozer architecture.
renz496AMD did not create the APU simply to make that CPU + GPU combo, but that's what the APU ended up being. Just about anyone can do that: Intel, NVIDIA, Samsung, Qualcomm, Apple, Huawei, and MediaTek are all able to make their own version of it. The APU was supposed to be a specialized processor that could do things no CPU or GPU can. So from one perspective AMD tried to do something different, but never got it to where it should be. Not even a demo ever existed to showcase these specialized APU capabilities.
HSA failed because it was a high-level idea on paper that was impossible in practice, especially back then. Just a typical bad business decision made by business people with little or no consultation with engineers. Lisa Su being an engineer herself just shows how big a difference it makes to know your stuff when heading a computing company.


And as for those who keep saying ATI saved AMD all these years because its revenues kept AMD alive, well, that in itself is a bad thing, and the situation with RTG today is proof of that. Basically, we don't have competitive Radeon graphics today because any additional revenue in the past went to the failing CPU side. Especially at first, it was like two companies operating separately, with one covering for the other failing company instead of investing in its own R&D.
Posted on Reply
#40
Fluffmeister
comtekSo, are you saying he is a loser?
The cult of AMD is strong; when you leave, you lose and become a loser, apparently.
Posted on Reply
#41
AsRock
TPU addict
FluffmeisterThe cult of AMD is strong; when you leave, you lose and become a loser, apparently.
I believe they all have some kind of cult going on.
sergionographyRead what I said carefully. I think AMD merging with a graphics company was a good choice long term, as that is an apparent need these days (look how serious Intel is about it too). However, the AMD-ATI deal was simply terrible and a huge risk that cost them a lot. Also, the second part was that AMD management at the time was simply terrible, and I pointed out Jen-Hsun Huang because, from his track record, I'm absolutely certain that he would have handled the company much better.

Also, from a business standpoint, the initial goals of the buyout mostly failed because they did it on the basis of "the future is fusion", lol. They then went on about the whole Heterogeneous System Architecture (HSA), which hardly picked up and simply died as a term once Ryzen came out. The whole "GPUs are better at floating-point operations while CPUs are better at integer" argument; well, that's the mindset that resulted in the Bulldozer architecture.


HSA failed because it was a high-level idea on paper that was impossible in practice, especially back then. Just a typical bad business decision made by business people with little or no consultation with engineers. Lisa Su being an engineer herself just shows how big a difference it makes to know your stuff when heading a computing company.


And as for those who keep saying ATI saved AMD all these years because its revenues kept AMD alive, well, that in itself is a bad thing, and the situation with RTG today is proof of that. Basically, we don't have competitive Radeon graphics today because any additional revenue in the past went to the failing CPU side. Especially at first, it was like two companies operating separately, with one covering for the other failing company instead of investing in its own R&D.
We will just have to keep hoping they pull it all off. I don't believe AMD will have anything until 2020, or even a bit later on the GPU side; I can't see what they are releasing next year being anything really special, although it might not have the NVIDIA price tags.
Posted on Reply