# "Vishera" End Of The Line for AMD FX CPUs: Roadmap



## btarunr (Dec 3, 2013)

We'd feared something like this would happen for some time now, but leaked AMD product roadmaps confirm that AMD FX "Vishera" is the last line of CPUs from AMD. The company will focus only on APUs from here onward, and at the very most, one could expect CPU core counts to go up from the quad-core stale-meat we've had since the A-Series "Llano," which will continue into the 2014 A-Series "Kaveri," too.

The alleged AMD roadmap slide, leaked to the web by ProHardver.hu, points out that socket AM3+ "Vishera" will exist on AMD's product stack for as far as AMD's eye can see - looking deep into even 2015. Unless AMD were planning on throwing in the towel with AM3+, it wouldn't mark out its roadmap slide in this way. 2015 will see the introduction of "Carrizo," an APU that succeeds "Kaveri," which will be based on the future-generation "Excavator" CPU micro-architecture and a future-generation GPU architecture, along with a full HSA programming model implementation. "Kabini" will have its spell running into mid-2014, at which point "Beema" will succeed it. 






Unless AMD is planning 6-core and 8-core APUs with "Carrizo" (we know that "Kaveri" is neither), the roadmap reveals that AMD has given up on making processors pricier than $150. The company could focus its client products division on APUs and GPUs, while multi-core processors could be kept alive by the enterprise products division under the Opteron banner, although we've not seen roadmaps to back that theory.

*View at TechPowerUp Main Site*


----------



## Batou1986 (Dec 3, 2013)

Thanks for the dead socket AMD, never again.


----------



## buildzoid (Dec 3, 2013)

Please AMD, I would buy a 10-core Kaveri FX the day it came out, even if it cost $350.


----------



## Frick (Dec 3, 2013)

Batou1986 said:


> Thanks for the dead socket AMD, never again.



Meh, it lasted three generations. Let's see Intel do that. 

But yeah, it's sort of sad. I can totally understand them, but still.


----------



## Batou1986 (Dec 3, 2013)

Frick said:


> Meh, it lasted three generations. Let's see Intel do that.
> 
> But yeah, it's sort of sad. I can totally understand them, but still.



3 generations with little to no improvement over the first generation, plus higher power consumption.

I understand why, because they can't even remotely compete with Intel in desktop CPUs, but it still makes me mad.


----------



## Exceededgoku (Dec 3, 2013)

buildzoid said:


> Please AMD, I would buy a 10-core Kaveri FX the day it came out, even if it cost $350.



Agreed, but I think enthusiasts may have to stick to Intel CPUs and AMD GPUs.


----------



## EpicShweetness (Dec 3, 2013)

Exceededgoku said:


> Agreed, but I think enthusiasts may have to stick to Intel CPUs and AMD GPUs.



I came to that reality when something like the Q9550 came out. At that point 45nm was Intel-only, and the frequency boost overcame any architectural inferiority. By Nehalem... well, have you ever played Monopoly when you're losing? It's a painful spiral, my friends.


----------



## Debat0r (Dec 3, 2013)

Business-wise it's a great move for AMD, as their CPU division wasn't that profitable except for the APUs. Too bad they're losing a great part of the growing PC gaming market, especially now that they're shifting more toward it (Radeon is gaming, etc). Too bad Intel's prices are gonna go up in a few years, but I guess with a little price cut the FX series can still compete performance-wise with Broadwell...


----------



## xtremesv (Dec 3, 2013)

The reality is sad: AMD is practically saying goodbye to performance and enthusiast PC builders. We won't see anytime soon an APU with 8 physical cores, Haswell-level IPC and R9 290X-like performance. They just threw in the towel. The immediate result: no competition, high prices, slow innovation.


----------



## Debat0r (Dec 3, 2013)

xtremesv said:


> The reality is sad: AMD is practically saying goodbye to performance and enthusiast PC builders. We won't see anytime soon an APU with 8 physical cores, Haswell-level IPC and R9 290X-like performance. They just threw in the towel. The immediate result: no competition, high prices, slow innovation.


<sarcasm>And Intel is already innovating at a staggering rate</sarcasm>


----------



## Frick (Dec 3, 2013)

Batou1986 said:


> 3 generations with little to no improvement over the first generation, plus higher power consumption.
> 
> I understand why, because they can't even remotely compete with Intel in desktop CPUs, but it still makes me mad.



Oh, the improvements were great, if you used it "correctly". The FX-83xx is still a great chip, competing with i5s in *some* cases. Which is the problem.

This all ties in with Mantle as well. And APUs are the future; most Intel CPUs have IGPs too.


----------



## xtremesv (Dec 3, 2013)

Frick said:


> Oh, the improvements were great, if you used it "correctly". The FX-83xx is still a great chip, competing with i5s in *some* cases. Which is the problem.
> 
> This all ties in with Mantle as well. And APUs are the future; most Intel CPUs have IGPs too.



Yes, APUs are probably the future, but until they are more powerful and fully flexible they are not viable for enthusiasts.

And speaking of Mantle, I don't really think it will become relevant; the only way it could succeed is if Nvidia ceases to exist.


----------



## Solidstate89 (Dec 3, 2013)

Honestly, it makes sense that they're abandoning pure CPUs for APUs - what with their investment into HSA capabilities. What annoys me is that they're just ceding the high-end to Intel. I needed to decide with my latest build whether to go with Haswell or wait to see if AMD would introduce a Steamroller FX CPU. I'm glad I didn't wait, because clearly, I'd be waiting forever.


----------



## swaaye (Dec 3, 2013)

xtremesv said:


> Yes, APUs are probably the future, but until they are more powerful and fully flexible they are not viable for enthusiasts.
> 
> And speaking of Mantle, I don't really think it will become relevant; the only way it could succeed is if Nvidia ceases to exist.



Considering how CPUs aren't scaling up considerably in size anymore, since Intel and AMD see the "quad core" as a good spot to stay at for consumers, I can perhaps see APUs reaching parity with the new consoles at some point in the future as manufacturing and memory tech progresses.  If that happens it's possible discrete cards will be less interesting in general to most people since APUs will offer adequate capability to play the latest gaming trends.

Yeah I don't know about Mantle either.  If it were to be supported by everyone it would succeed but that doesn't make business sense for NVIDIA and Intel. You don't go out of your way to help a competitor gain mind-share and power. Mantle is certainly an interesting move for AMD though.


----------



## cadaveca (Dec 3, 2013)

I guess AMD's "The Future is Fusion" campaign that ran for at least a year failed to make this obvious.

That was back in 2010-2011, if I recall correctly, which makes this really old news...other than setting a date for the real transition.

Given that info, this is nothing but good news out of AMD.


----------



## ensabrenoir (Dec 3, 2013)

Nobody should be surprised by this.... Intel will be slower about it, but for them to survive they too must walk this path.


----------



## ironwolf (Dec 3, 2013)

bulldozer, piledriver, excavator...  When are we going to get devastator?


----------



## Frick (Dec 3, 2013)

ironwolf said:


> bulldozer, piledriver, excavator...  When are we going to get devastator?



ATOM!


----------



## eidairaman1 (Dec 3, 2013)

AM4? Socket G34?

I think AMD would be smart to integrate all classes of CPUs into a single socket. G34 would be ideal: no FM2, no AM4, etc., like it was with Socket A (462).


----------



## Fiery (Dec 3, 2013)

ironwolf said:


> bulldozer, piledriver, excavator...  When are we going to get devastator?



Devastator and Scrapper were the codenames of the iGPUs in the Trinity APUs.


----------



## Assimilator (Dec 3, 2013)

"quad-core stale-meat"? Y'all mofos need a proofreader.

If this leak is genuine, it means AMD is not betting on new tech, but betting against Intel. They are betting that Intel takes its time releasing Haswell-E and hence support for DDR4. They are betting that DDR3 will still be the mainstream memory technology in 2015. They are betting that Intel's integrated graphics won't reach parity with their own APUs. Most of all, they are betting that Intel will sit back, become complacent, and milk the consumer; after all, without competition, why innovate?

Except what if Intel doesn't sit back, but instead decides to deliver the coup de grace to AMD? Perhaps by delivering Skylake - a mainstream desktop part with DDR4 support - in early 2015.

Let's also not forget that AMD is shooting itself in the foot by creating such long-lived chipsets and sockets. Motherboard manufacturers don't like long-lived chipsets and sockets because it means they sell fewer products, whereas Intel's tick-tock approach means that every 12 to 18 months, the manufacturers get a new chipset (and hence boards) to sell.

tl;dr AMD doesn't seem to have a roadmap or strategy, so much as a hope and a prayer. I fear that by 2015 their processor business will be in an even sorrier state than today.


----------



## Dent1 (Dec 3, 2013)

Batou1986 said:


> Thanks for the dead socket AMD, never again.


 
Intel has changed sockets like two, three, four times in the last few years; I lose count.



-----------------

One thing I don't understand about this roadmap is it appears to be all APU based. Will AMD have any non-APUs?


----------



## Vario (Dec 3, 2013)

Hinted at in earlier article a year ago: http://www.techpowerup.com/174962/no-new-fx-processor-from-amd-in-2013.html


----------



## FreedomEclipse (Dec 3, 2013)

It's truly sad to see AMD getting 'chased out' of the CPU market like this. It's almost depressing, as back in '95-'05 AMD seemed to flourish; it was like everything they touched turned to gold. For a company that was so successful, they ended up leaving on a very, very low note.

With that said, APUs are the future for portable and low- to mid-range setups. AMD will benefit greatly from the console market, and even more as gaming tablets become more and more popular with casual gamers, so who knows where AMD will go. They could become much stronger now that they have directed focus away from the failing side of the business, got rid of the cancer that was eating them from the inside, and can move forward with one of their most successful products, which will continue to grow and grow.


----------



## v12dock (Dec 3, 2013)

Give AMD a few years to restructure their CPU departments and I can see top-end processors returning in the future. The FX lines were kind of an embarrassment for AMD anyway; give them a few years to fix themselves.


----------



## NC37 (Dec 3, 2013)

If AMD can get serious with APU designs and bring in L3, up the core count, and resolve the performance hit under DDR3 RAM...then this isn't a bad thing. But that is going to require a lot of tricky wrangling. Good thing AMD brought back that CPU designer from Apple. The next few years are going to be his time to shine.


----------



## RCoon (Dec 3, 2013)

Dent1 said:


> One thing I don't understand about this roadmap is it appears to be all APU based. Will AMD have any non-APUs?



Say hello to the Athlon 750K and 760K, the APUs with the GPU torn out. Less TDP to deal with, more overclocking potential. That isn't going to stop me from switching to Intel now, but I'm going to sit back and wait for their next iteration of CPUs before I switch MoBo sockets.


----------



## Dent1 (Dec 3, 2013)

FreedomEclipse said:


> It's truly sad to see AMD getting 'chased out' of the CPU market like this. It's almost depressing, as back in '95-'05 AMD seemed to flourish; it was like everything they touched turned to gold. For a company that was so successful, they ended up leaving on a very, very low note.


 
I don't think AMD are being chased out. They made an educated decision to follow the money. That is leaving voluntarily.


We all need to keep in mind that APUs are the way forward. Even Intel's high-end CPUs are really APUs. So AMD and Intel are heading in the same direction.


----------



## newtekie1 (Dec 3, 2013)

I'm glad AMD is focusing on APUs; I just wish they would put out some higher-end ones. I can live with AM3+ dying. I think they should combine everything into one socket: go with socket C33, and have that one socket and motherboard support anything from dual-core APUs with no L3 to 16-core APUs with huge amounts of L3.



RCoon said:


> Say hello to the Athlon 750K and 760K, the APUs with the GPU torn out. Less TDP to deal with, more overclocking potential. That isn't going to stop me from switching to Intel now, but I'm going to sit back and wait for their next iteration of CPUs before I switch MoBo sockets.



The GPU is still there; it is just deactivated. And since that doesn't lower the rated TDP any, I'd guess there is actually still some power flowing through it.


----------



## TheHunter (Dec 3, 2013)

It's basically what Intel has been doing since SB in the mainstream market: onboard GPU + CPU.


Good to see there will be a Steamroller and an Excavator, but yeah, only in this CPU+GPU hybrid mode. I hope it's gonna be 8+ cores though.

But if we look at hUMA, it can change everything. IMO it would have been pointless to reuse the old FX design now when there is such good potential in hUMA. I wish Intel would start doing this too: use the iGPU more for general CPU processing and mix both.


----------



## FreedomEclipse (Dec 3, 2013)

Dent1 said:


> I don't think AMD are being chased out. They made an educated decision to follow the money. That is leaving voluntarily.
> 
> 
> We all need to keep in mind that APUs are the way forward. Even Intel's high-end CPUs are really APUs. So AMD and Intel are heading in the same direction.




If they weren't chased out then they were hounded out. Hounded out by anti-competitive shenanigans; Intel are all over that shit. Boutique system builders dropping AMD CPUs from their custom builds and inventories, review sites that no longer bench Intel vs. AMD CPUs because they think it's a waste of time, fewer review samples being sent out, and AMD preferring to have their own PR team over-hype processors, only for reviewers to tell the honest truth about the chips when they were released. Everyone knows the score. It's an educated decision, but how long did they have to flog a dead horse before deciding to depart from the mainstream desktop CPU market? Hats off to them for trying, though, and to be honest, a lot of this probably could have been avoided if they had pulled back the release date of the first Phenoms and tweaked them a little more, so they wouldn't have looked so bad when they were thrown against Intel chips. They knew what was coming all right, but I don't think they knew they were gonna be hit so hard. Maybe things might have been different if they had held back and revised their designs. 

They tried to roll with the punches, but they never recovered. There's really nothing more to say on this matter. At least now they will have some level of success, even though they're no longer competing in the enthusiast market.


----------



## TheoneandonlyMrK (Dec 3, 2013)

v12dock said:


> Give AMD a few years to restructure their CPU departments and I can see top-end processors returning in the future. The FX lines were kind of an embarrassment for AMD anyway; give them a few years to fix themselves.


and a bit more node parity.

I may have to accept the demise of AM3+, but AMD will be back when they have the right things in place and fully tested. The roadmap has slipped, as have most bar Intel's and ARM's, but most of these are node-based issues and process obstacles. I'm expecting the next FX arch to rival a mid-range gaming PC: an 8-core CPU plus GPU, HSA-combined, with DDR4 support; a true step up from AM3+ and all of Intel's mobos.

Many decry the lack of new features on AM3+, but I've had 4 usable PCIe slots since day 1 and nothing gimped, all features present. All Intel did was unravel their chipset piecemeal over years and many sockets, at great cost to the yearly updaters. So what am I missing out on? PCIe 3.0 and Thunderbolt, oh my; how are us AMD AM3 users getting by?

Also, Excavator, the core made for HSA, isn't out yet and hence is fairly untested (and it's the core worth most in investment terms, IP and future). While they maybe could have done a Steamroller FX, it would not have made good business sense, since the channel is still loaded with FX and Intel have a lead in nodes and power efficiency, i.e. they wouldn't have sold. AMD have been very busy on HSA/software and SoC designs, and the effort required for a Steamroller FX was not worth it in reality. However, once TSV (3D/2.5D) production really kicks in on a decent node size below 28nm and DDR4 finally hits the consumer market... well, then we'll see.


----------



## dwade (Dec 3, 2013)

AMD was Intel's old rival in the pre-PC era. Intel's main rival now is ARM, which won't last long either.


----------



## Eukashi (Dec 4, 2013)

We need more cores. Please, a 4M/8C 4CU/256SP Kaveri, PLEASE!


----------



## badtaylorx (Dec 4, 2013)

How quickly we all forget that SB, IB, and Haswell are all technically APUs.


----------



## GLD (Dec 4, 2013)

Well my 125w PII 940 that isn't being used may just have increased in resale value?   Good solid CPU as we all know.

Maybe now Biostar will make another batch of AM3+ motherboards. I want a new Biostar 990FX board, to make me want to upgrade from my Biostar TA790XE that I have been using since Feb. of 2010.


----------



## xenocide (Dec 4, 2013)

This sure puts a kink in all those people saying AM3+/AMD offer better upgradability xD


----------



## Sempron Guy (Dec 4, 2013)

Surely we are not moving backwards to single-threaded performance, so with better multi-core support in both apps and games, the current FX line can still pack a punch.


----------



## TRWOV (Dec 4, 2013)

I don't really mind that no new architecture will be developed for the AM3+ platform, but I hope we at least get to see a 28/22 nm Vishera. A 22nm Vishera at >5GHz and a 125W TDP would be a nice swan song for the AM3+ platform.

I suppose AMD is going to take a page from Intel's book and direct us power users to their server platforms in the future.


----------



## ompak5 (Dec 4, 2013)

When ATI was handed to AMD, they said they would not compete with Nvidia at the high end of the video card market because there's no money at the enthusiast level, but now they have the R9 290X for enthusiasts. Today AMD throws in the towel and won't compete at the high end of the CPU war with Intel. I think there is something cooking in AMD's design room? GO ON AMD!!!!


----------



## beautyless (Dec 4, 2013)

Why are FX CPUs not profitable? They're more expensive than their APUs.

I'm dreaming of a 22nm, 4.5-5GHz, 12-core Excavator CPU with 12MB of L3 cache and no graphics part, supporting PCI-E 3.0, dual-channel DDR4-3200 up to 64 gigabytes of capacity, and the latest storage interface for faster SSDs. Also one that uses the same socket as the cheap APUs. But after reading this news, I'd better pin my hopes on Intel in 2015.


----------



## ensabrenoir (Dec 4, 2013)

Can't say I'm mad at them.... they've been making some smart survival moves, redefining themselves instead of the old "we're the cheap Intel/Nvidia" strategy. Us desktop enthusiasts are probably looking like ham radio operators.... in a cell phone age.... It's a great hobby... but there's more money in cell phones, so.....


----------



## xenocide (Dec 4, 2013)

ompak5 said:


> When ATI was handed to AMD, they said they would not compete with Nvidia at the high end of the video card market because there's no money at the enthusiast level, but now they have the R9 290X for enthusiasts. Today AMD throws in the towel and won't compete at the high end of the CPU war with Intel. I think there is something cooking in AMD's design room? GO ON AMD!!!!


 
I'm almost positive AMD never said they wouldn't compete with Nvidia--their GPUs have always been competitive with Nvidia's.  I know for a fact they did say they wouldn't compete with Intel at the highest end about a year and a half ago, when the FX-x3xx lines (Vishera) came out.  Apparently this is what they meant.


----------



## Fourstaff (Dec 4, 2013)

AMD is not going to lose a lot of business because of this: the only group alienated through this decision is the budget HPC crowd.


----------



## medo (Dec 4, 2013)

I don't mind them killing the AM3+ socket; my issue is with AMD's slowness in progressing with their plans.

AM3+ was truly bulky with its north bridge and south bridge, but of course having hexa or octa cores is a great benefit, especially for cheap, and the FX-6300 was truly amazing.

Here is my problem:

1) AMD will ditch big cores, favor GPU computing, and push towards that. Note that their slides show the future CPUs will all have a 65W TDP, which is hard to achieve in hexa- or octa-core form, but who knows, maybe manufacturing will advance by then.

2) AMD will create an APU version with hexa or octa cores, but it will take a very long time, and AMD's plans will keep getting delayed.

I truly hope AMD works ASAP, unifies their sockets into a single one as many have stated, pushes forward with their plans, and makes the 8-core Athlon. AMD states that up to 47 percent of the die space is used for the GPU, and that's with a quad-core, so with another shrink or optimization I reckon AMD can pull off an 8-core Athlon; the question is how much longer it will take.


----------



## NC37 (Dec 4, 2013)

medo said:


> 1) AMD will ditch big cores, favor GPU computing, and push towards that. Note that their slides show the future CPUs will all have a 65W TDP, which is hard to achieve in hexa- or octa-core form, but who knows, maybe manufacturing will advance by then.



ATI did the same thing on the GPU end. Nvidia for the longest time ran with powerful monolithic GPUs while ATI switched over to a smattering of smaller cores to handle the same load. Trouble was, they were behind for about 2-3 gens, but they did finally catch up. 

It's a novel idea, but execution is always an issue. And this isn't GPU company vs. GPU company; this is GPU company vs. CPU company. GPU computing is an area AMD could beat Intel in - there is no question there - but AMD really, really has to nail it. I suspect it would come down to drivers and Windows implementation. AMD could do it, but if nothing can utilize it, then Intel just sits there while it gets its own R&D caught up.


----------



## HammerON (Dec 4, 2013)

Fourstaff said:


> AMD is not going to lose a lot of business because of this: the only group alienated through this decision is the budget HPC crowd.


Yep - ^^^ AMD is going to be providing the hardware for many, many PS4 and Xbox One users for the next x years. They haven't been able to compete in the high-end CPU market for many years now. Their APUs are selling really well right now, so this is a good business decision in my book.


----------



## kn00tcn (Dec 4, 2013)

I remember reading something about how Dirk Meyer resigned due to 'not liking the direction AMD wanted to go', or something along those lines...

Given that processor designs take years, could that have been an early hint that high-end, high-power desktop CPUs didn't have a future? In the classic sense at least; nothing wrong with a good mainstream APU in the vein of the Q6600 / Q9550 / one of those $300 i7s / 2600K / 3770K / 4770K. I can totally see an APU conflict with him, since he led the team that created the Athlon.

EDIT: the PS4/XB1/WiiU stuff is hardly a big deal; it only gets the brand into consumers' minds. Why would AMD get lots of profit from it? It's nothing like their own retail products, like a CPU or graphics card.


----------



## joyman (Dec 4, 2013)

Most people forget the purpose of the APU: it is merely a stage on the way to hybrid processors. The old AMD roadmap had HSA-enabled chips by 2015, and looking at their updated roadmap, it appears they are a little ahead of that. So in a few years the GPU cores in the CPU will also be used as an FPU, because they have been designed for that task ever since the first scalar GPU architectures. This is why the AMD Bulldozer design is oriented toward more integer modules and less FPU: the GPU will help there. And Intel is far behind this kind of tech, IMHO. They have more resources, yes, but history tells us that most great design features come from AMD. Also, in well-threaded apps (games) a 2-module/4-thread AMD CPU fares well against other 4-core processors. The limiting factor for smooth gameplay is almost always the GPU (people game at big resolutions, where the CPU is not so important).


----------



## rvalencia (Dec 4, 2013)

Batou1986 said:


> Thanks for the dead socket AMD, never again.


Intel Haswell is not compatible with Intel Sandy Bridge/Ivy Bridge sockets. 

PS: My PC has an Intel Core i5-2500K.


----------



## jigar2speed (Dec 4, 2013)

xtremesv said:


> The reality is sad, practically AMD is saying goodbye to performance and enthusiast PC builders. We won't see anytime soon an APU with 8 physical cores, Haswell-level IPC and R9 290X-like performance. They just threw in the towel. The immediate result: no competition, high prices, slow innovation.



Can you please name a CPU from Intel which has no GPU on the chip?


----------



## xenocide (Dec 4, 2013)

rvalencia said:


> Intel Haswell is not compatible with Intel Sandybridge/Ivybridge sockets.
> 
> PS; My PC has Intel Core i5-2500K.



The difference is that Intel never claimed socket retention, and they kept LGA775 around for what, like 5-6 years?  That's not bad.  AMD were the ones that insisted on keeping Socket AM2/AM3 around for nearly a decade (AM2 was originally released in 2006) while reassuring people they had no interest in changing sockets.



jigar2speed said:


> Can you please name a CPU from Intel which has no GPU on the chip?



Anything with a P at the end of the model number has the GPU deactivated.  It's unreasonable to expect them to make 100,000 of a certain SKU and design a completely different manufacturing process for 5-10,000 CPUs because some people are afraid of an iGPU.


----------



## jihadjoe (Dec 4, 2013)

EpicShweetness said:


> I came to that reality when something like the Q9550 came out. At that point 45nm was Intel-only, and the frequency boost overcame any architectural inferiority. By Nehalem... well, have you ever played Monopoly when you're losing? It's a painful spiral, my friends.



What architectural inferiority? The only time Intel had to resort to frequency was during Netburst.

Clock-for-clock, Core 2 was much faster than the Athlon 64 and Phenom.
Just go back to the old reviews: the 2.4GHz E6600 completely dominated the 3GHz Athlon 64 X2 6000+.


----------



## Dent1 (Dec 4, 2013)

jihadjoe said:


> What architectural inferiority? The only time Intel had to resort to frequency was during Netburst.
> 
> Clock-for-clock, Core 2 was much faster than the Athlon 64 and Phenom.
> Just go back to the old reviews: the 2.4GHz E6600 completely dominated the 3GHz Athlon 64 X2 6000+.


 

Bit of a blanket statement. There were too many revisions of the Athlon 64 series and the Core 2 series to make that generalisation. Yes, Core 2 was faster; "much faster" depends on which iteration you're talking about. Yes, the E6600 was faster than the Athlon 64 X2 6000+, but I wouldn't say it dominated.

You are cherry-picking too, because when AMD moved to the "Kuma" Athlon revision, i.e. the Athlon X2 7750 BE, the performance was very competitive with the Core 2 Duo Conroe.

When AMD moved to _Callisto_, i.e. the Phenom II X2 560, it was neck-and-neck with the E8400. The Athlon II X2 could also compete with, and sometimes outcompete, the E8400. (Also, the Athlon II X3 was much cheaper than an E8400 and handily dominated the entire Core 2 Duo series.)

But we have selective memories of those events.


----------



## Slomo4shO (Dec 4, 2013)

I am surprised that there will be no DDR4 support on their APU lineup in 2015...


----------



## NeoXF (Dec 4, 2013)

Dent1 said:


> Intel has changed sockets like 2,3,4 times in the last few years I lose count.
> 
> 
> 
> ...


Not to mention, next year they're planning a new chipset... for the same socket... and that's the only way you'll be able to run the desktop Haswell refresh and/or Broadwell-K...

So please, people, a little less of the uneducated, biased and/or hateful comments that have nothing to do with reality. Think.

AMD would show a lack of strength and confidence in their own products if they released FX processors on the side. Again, people, think, god damn it.

Too bad that, for the time being, APUs still sound good only on paper... but we shall see.


----------



## HumanSmoke (Dec 4, 2013)

Dent1 said:


> You are cherry-picking too, because when AMD moved to the "Kuma" Athlon revision, i.e. the Athlon X2 7750 BE, the performance was very competitive with the Core 2 Duo Conroe.


Accuse someone of cherry picking then compare a 7750BE with a Conroe? Even though Wolfdale and Yorkfield 45nm Penryn-based CPUs had already been in the marketplace for nearly a year?


Dent1 said:


> When AMD moved to _Callisto_, i.e. the Phenom II X2 560, it was neck-and-neck with the E8400.


If all the Phenom II actually had to compete with was an ageing Core 2 Duo, then AMD would have been peachy. Pity Intel already had Lynnfield-based i5s and i7s in the consumer channel by the time the X2 560 arrived, isn't it?


Dent1 said:


> But we have selective memories of those events


Indeed.


----------



## Casecutter (Dec 4, 2013)

xenocide said:


> It's unreasonable to expect *them* to make _X-amount_ of a certain SKU and design a completely different manufacturing process for _X-amount_ CPU's because some people are afraid of an iGPU/APU.


Fixed this... as now it's more poignant: Intel and AMD are just more aligned now.

Rats, I was hoping AMD might spin one more iteration for the AM3+ socket. Nothing all that earth-shattering; I just wanted something like a 6-core 95W part that had a little more oomph, or a "Black". My boys are on 870-based machines, and I was hoping to stretch one more upgrade out of them; looks like the FX-6300 is all I'll have to look at.

AMD... start offering tray parts for $20 less! Please...


----------



## Dent1 (Dec 4, 2013)

HumanSmoke said:


> Accuse someone of cherry picking then compare a 7750BE with a Conroe?


 
Well, yes, because jihadjoe was referring to the E6600, which is a Conroe.


----------



## HumanSmoke (Dec 5, 2013)

Dent1 said:


> Well yes because jihadjoe was referring to the E6600 which is Conroe.


The E6600 came out in July 2006; the 7750BE didn't launch until December 2008. Why bother comparing an AMD processor to an Intel CPU that *1.* was two and a half years old, and *2.* *had been EOL'ed nearly a full year before the 7750BE even arrived*???


----------



## Dent1 (Dec 5, 2013)

HumanSmoke said:


> The E6600 came out in July 2006, the 7750BE didn't launch until December 2008. Why bother comparing an AMD processor to an Intel CPU that *1*. Was two and a half years old, and *2.* *Had been EOL'ed nearly a full year before the 7750BE even arrived *???


 
OK, so now there is an EOL clause. Conroe became unchallenged because it went EOL; I didn't realise that stipulation.

The Wolfdale Core 2 Duos were around when the Phenom II X2 560 and the Athlon II X2/X3 were competing with them; either way you look at it, jihadjoe's statement wasn't 100% accurate.


----------



## itsakjt (Dec 5, 2013)

If AMD tries hard, I am pretty sure they can make better CPUs on the AM3+ socket. The socket is absolutely fine, but the architecture needs refinement. Better L1, L2 and L3 caches, a better IMC, and better IPC are all they need. Instead of core count and cache capacity, they need to focus on performance per core and cache speed. 
For that, they will need to keep the socket constant and refine the architecture, or start it over from scratch. As for Vishera, the architecture is good enough, especially the 63xx and the 83xx.


----------



## NeoXF (Dec 5, 2013)

itsakjt said:


> Better L1, L2 and L3 caches, a better IMC and better IPC are all they need.



*facepalm*


----------



## TheRagnarok (Dec 5, 2013)

Intel's infamous "NetBurst" architecture comes to mind, and their anticompetitive nature sealed AMD's fate.


----------



## TRWOV (Dec 5, 2013)

AMD doesn't really need a new AM3+ architecture at this point. Refine Vishera, get it to 28nm (22nm is a pipe dream ATM) and release it at >4.5GHz. The strategy has worked well with their GPUs, so I think they should try that.


----------



## alucasa (Dec 5, 2013)

Too bad to see AMD losing out in the CPU race, but I am sure the majority of us saw this coming.

I've used APUs for HTPCs, but to be honest even Intel HD is good enough for an HTPC unless you are going for 4K or something, in which case you would need a dedicated GPU anyway.


----------



## itsakjt (Dec 5, 2013)

For cost-effective solutions, AMD is way ahead and always will be. An AMD A10 APU combined with an entry-level GPU like the 6670 delivers excellent performance at 720p. Come on, not all of us play at 1080p with 8x MSAA and so on. Most people from poor countries like the one I live in cannot afford high-end systems for gaming. For them, an APU is the perfect choice, and computer hardware is damn costly here. At least people can play and get 40-50 FPS on 720p screens in the majority of games. Had it been an Intel system, the HD graphics could not be used alongside a discrete GPU and the cost would have been higher. An A10 and a Core i3 are priced almost the same where I live, the i3 being a bit more expensive. APUs aren't just for HTPCs; they are meant for entry-level gaming too.
Coming back to FX, I would say they are good for the price, apart from the 4xxx series. I don't find any reason to get a Core i3 instead of an FX 6xxx.
I am not a fanboy; I am just stating the reality. Here where I live, at this moment, an FX 8350 and a Core i5 3570 cost the same. Both are good in their fields and perform more or less the same overall.
And the reason I said that FX needs better caches is that if you bench using the AIDA cache and memory benchmark, the FX L2 and L3 caches perform really badly. You can check it out yourself.
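The cache-latency effect that AIDA's benchmark reports can be eyeballed with a crude pointer-chase, the standard trick latency benchmarks use: once the working set outgrows a cache level, each dependent load gets slower. The toy Python sketch below only illustrates the technique (it is not AIDA's method; the sizes and hop count are arbitrary, and Python's interpreter overhead swamps the absolute numbers, so treat differences qualitatively):

```python
# Toy pointer-chase: each load depends on the previous one, so the CPU
# cannot hide memory latency. Bigger working sets should cost more ns/hop.
import random
import time

def chase_ns(n_slots, hops=200_000):
    """Average ns per dependent load over a shuffled cycle of n_slots indices."""
    order = list(range(n_slots))
    random.shuffle(order)
    nxt = [0] * n_slots
    # Link the shuffled indices into a single cycle.
    for a, b in zip(order, order[1:] + order[:1]):
        nxt[a] = b
    i = 0
    start = time.perf_counter()
    for _ in range(hops):
        i = nxt[i]          # serial chain of dependent loads
    return (time.perf_counter() - start) / hops * 1e9

for slots in (1 << 10, 1 << 15, 1 << 20):   # small, mid, large working sets
    print(f"{slots:>8} slots: {chase_ns(slots):6.1f} ns per hop")
```

On a real machine with a compiled version of this loop, the jumps in ns/hop line up roughly with the L1/L2/L3/DRAM boundaries.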


----------



## fullinfusion (Dec 5, 2013)

That's a sad thing; it makes me want to go grab a high-end AMD system again just for the memorabilia.


----------



## TRWOV (Dec 5, 2013)

I was thinking that myself.


----------



## techtard (Dec 8, 2013)

Apparently this was a rumour that turned out to be false.


----------



## de.das.dude (Dec 8, 2013)

NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO


----------



## TRWOV (Dec 8, 2013)

techtard said:


> Apparently this was a rumour that turned out to be false.



The new slide doesn't debunk the report; if anything, it confirms it:

Both roadmaps are the same; the only difference is that the top one includes 2012 and 2015. Both say that Piledriver (Vishera) will be the only architecture available for the AM3+ platform for the coming years.


----------



## TheoneandonlyMrK (Dec 9, 2013)

de.das.dude said:


> NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO


why no??

Weird, D. I don't think a future without AMD high-end CPUs is a cheap one, or one I would like.

This end-of-the-CPU twaddle has been spouted ad nauseam for the last few years, all regardless of the FACT that binning will retain pure-CPU parts even when the GPU is integrated, because salvaging the dies with broken GPUs is better than binning them (actual dustbin). And we're not even in that yard yet, since AMD and Intel still need server parts, and the resultant cast-offs become consumer parts.

FXs will continue to evolve in a positive way, as mine has, with better software, APIs and OSes.


----------



## Gezzer (Dec 11, 2013)

I really don't understand all the doom and gloom this announcement is producing.  If you really think about it, it was going to have to happen eventually. Let's take a long look at the events that have brought AMD to this point.

AMD has historically always suffered under Intel's shadow. The only times they've been able to out-maneuver Intel is when they've gone for the "hail mary pass," so to speak. Think about it: what was the last truly innovative CPU technology that Intel produced on its own? If you don't count instruction set extensions, which can be of dubious use I'd say, Hyper-Threading. The one before that, on-die L2. That's it. The L2 would have been pretty obvious to anyone. There could be some debate about Hyper-Threading, but everyone knew multi-core CPUs were eventually coming; Hyper-Threading was just a step in between single-core and multi-core. On the other hand, AMD has produced a lot of innovations that are in common use today, and a lot of people (even me) saw dubious worth in them at the time.

When Intel's NetBurst was first on the drawing table it was all about clock speed. AMD and Intel had been trading blows and neither really had the upper hand, even with AMD being the first to reach 1GHz. So while Intel pursued their failed NetBurst architecture (it never scaled in the manner Intel thought it would) and ever-climbing FSB speeds, AMD introduced a much more efficient core architecture, making overall clock speed less important. They then added the following: first, an on-die memory controller, making the FSB less of an issue. Second, a 64-bit extension to the x86 instruction set. I'll admit I was even one to say "so what, other than memory limits, why do we need 64-bit?" OK, and I was wrong - how many people still run a 32-bit OS? Third, they introduced the first true dual core. And lastly, they introduced on-die GPUs. All innovations that AMD pioneered and Intel later adopted.

AMD became king of the mountain around the time of the 64-bit extensions. In fact, that's why they first introduced the FX line of processors: insanely priced, but the best you could get. As it later came out, and was well documented, the only way Intel could even compete was by forcing OEMs to only carry Intel products. Intel almost lost the race except for a strange turn of events: enthusiasts had started to use the Pentium M processor (part of the Centrino brand) for desktop use. While Intel may not be as innovative as AMD, that doesn't mean they're not smart, and they saw the potential the Pentium M had. I'd say Intel's biggest strengths are its almost inexhaustible resources and its ability to turn on a dime and refocus on more promising avenues. Which is exactly what they did. Intel has taken the Pentium M and over time added all of AMD's innovations, to the point where they totally dominate the CPU market. In truth, AMD's only recourse is to offer deep discounts on their flagship products. It's sad, but it's also true.

So what can AMD do to once more get out of Intel's shadow? It's pretty obvious that no matter what AMD does with their current processors, they will always be playing second fiddle to Intel's. They'll never be a threat. In fact, as shown with Intel's last generation, Intel doesn't even consider AMD a threat anymore. Instead of increasing CPU performance they seem to be concentrating on reducing power consumption and heat. And that makes sense because of the rise of mobile computing and how ARM has risen to become a major player. Intel has its sights set on taking on ARM, not AMD. But this gives AMD an advantage they haven't had since the Athlon 64 days: room to breathe and maneuver.

So what can they do with this breathing room? Well, they could keep trying to improve traditional processor performance, but we all know where that will lead. No, AMD needs to do what AMD does best: come out of left field with a new technology that no one thinks is viable, just like they did in the past. So what does AMD have that Intel just can't touch? GPU performance on their APU dies. While Intel has made great leaps in this area, they'll never be able to touch AMD; they simply have too much of a lead. So AMD needs to leverage that advantage in the best way it can. How? While it's true that more and more applications are taking advantage of multi-core CPUs, simply adding more and more cores is eventually a dead-end solution. Instead, why not throw that "hail mary pass"? Produce an entirely different approach, something like a heterogeneous CPU: http://en.wikipedia.org/wiki/Heterogeneous_computing
AMD already has many of the parts in place on the CPU die, and if their heterogeneous initiative ( http://hsafoundation.com/ ) succeeds, they might just pull it off. All they really need once it's up and running is a "killer app" and AMD will quite likely make an end run around Intel.
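The heterogeneous idea boils down to routing each kind of work to the core type that suits it. As a toy illustration only (this is not AMD's HSA runtime; the function names, the 8-worker pool and the dispatch rule are all made up for the sketch), data-parallel work fans out across many simple workers while branchy serial work stays on one fat core:

```python
# Toy heterogeneous dispatcher: a thread pool stands in for a GPU's many
# small cores; plain in-order execution stands in for the CPU's fat core.
from concurrent.futures import ThreadPoolExecutor

GPU_POOL = ThreadPoolExecutor(max_workers=8)   # pretend "GPU-like" cores

def run_task(kind, fn, data):
    if kind == "data_parallel":
        # Fan the elements out across workers, the way a GPU kernel
        # runs per-element, then gather and re-order the results.
        chunks = [data[i::8] for i in range(8)]
        parts = GPU_POOL.map(lambda c: [fn(x) for x in c], chunks)
        return sorted(x for part in parts for x in part)
    # Serial/branchy work stays on the single fat core, in order.
    return [fn(x) for x in data]

print(run_task("data_parallel", lambda x: x * 2, list(range(10))))
# [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```

The real win HSA promises over this sketch is shared memory: CPU and GPU operating on the same data without copying it across a bus.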

With all that in mind, AMD can't afford to be split into an APU company and a CPU company. This is an all-or-nothing play, a true "hail mary pass". So eventually AMD was going to have to go this route anyway. I'm just hoping it means AMD's heterogeneous system architecture isn't that far away. It just might be one of the biggest game changers for desktop computing we've seen yet, and save AMD's bacon in the process.


----------



## GreiverBlade (Dec 11, 2013)

ironwolf said:


> bulldozer, piledriver, excavator...  When are we going to get devastator?


Devastator is the codename of the IGP, at least on the A10-5800K; my 7660D was nicknamed Devastator.


----------



## techtard (Dec 13, 2013)

It would be nice if they released something more efficient on AM3+; my 8320 is a power hog. I have to plug the hot-air ducting from my furnace or my computer room becomes stupid hot, even in the winter.


----------



## ensabrenoir (Dec 14, 2013)

....all this Intel this and AMD that. If AMD made most of the improvements most of their users want... the chips would probably cost as much as an Intel chip. Hence why they won't. There is a shift from computers to "devices" happening which suits AMD better. They are following the right path though.


----------



## cyneater (Dec 14, 2013)

The last round AMD won was with the Athlon 64 - a great chip and a Pentium 4 killer. Ever since then they have dropped the ball over and over, although the 1060-T wasn't too bad. The current line of FX chips sucks though.

Although, saying this, if AMD used their ARM license and made an ARM CPU and motherboard with 4 or more SATA ports, many people would start using them. Hint hint... low power, high performance and loads of SATA ports would make a great NAS. Mine and many other people's servers/NASes don't run Windows.


----------



## techtard (Dec 14, 2013)

They didn't drop the ball so much as get locked out of the market by Intel's illegal business practices while they had the better product.
The billion-dollar fine Intel was hit with was a slap on the wrist for basically crippling their competition.

ARM might be interesting, but I suspect that it will take forever for developers to switch over from x86. It could be a good thing if it happens though, a clean break for the next gen of computing.


----------



## TheoneandonlyMrK (Dec 14, 2013)

cyneater said:


> The last round AMD won was with the Athlon 64 - a great chip and a Pentium 4 killer. Ever since then they have dropped the ball over and over, although the 1060-T wasn't too bad. The current line of FX chips sucks though.
> 
> Although, saying this, if AMD used their ARM license and made an ARM CPU and motherboard with 4 or more SATA ports, many people would start using them. Hint hint... low power, high performance and loads of SATA ports would make a great NAS. Mine and many other people's servers/NASes don't run Windows.


What kind of idiot calls the FX line a failure for not beating Intel's best HEDT CPU, then pulls ARM chips out of the back pocket as an example of a win? An Intel fanboi. AMD's CPUs are not defective, just less powerful in some applications. Most of us, including me, do 2-200 things at the same time on a PC, and I can tell you FXs manage fine. Yeah, I may not get quite the FPS some do, but I've just been out on the beer with the money I saved, so happy days.


----------



## HumanSmoke (Dec 14, 2013)

techtard said:


> They didn't drop the ball so much as get locked out of the market by Intel's illegal business practices while they had the better product.
> The billion-dollar fine Intel was hit with was a slap on the wrist for basically crippling their competition.


Too simplistic to say that it was all Intel's fault. AMD are as much an architect of their present position as Intel.
Even before Intel bribed Dell et al., AMD had issues with fabrication capacity. Under the cross-license agreement, AMD could outsource up to ~20% of their production to other foundries. Even with markets denied to them, AMD could not satisfy the demands of the customers they had. It wasn't until the situation became acute that AMD approached Chartered Semi, and even then did not utilize the full 20% outsource allocation available (~7% IIRC). Why the reluctance to use non-AMD foundries at the expense of market share? Answer: W. Jerry "real men have fabs" Sanders. It is no coincidence that AMD only explored the use of third-party foundries to add capacity once Sanders stepped down.
That is likely the primary reason that Intel settled with AMD for a relatively paltry $1bn (remember that Nvidia's settlement was $1.25bn, and the EU antitrust fine was $1.45bn by way of comparison). The secondary reason was just as likely AMD's desperate need to pay for debt servicing (see below) which is why the low-ball $1bn was accepted.

So, Intel is cause #1, Sanders' hubris is cause #2, and of course AMD's own lack of strategic planning is cause #3... What other company overpays by 100% for an acquisition ($5.4bn total paid for ATI - $1.7bn in cash from AMD, $2.5bn borrowed from lending institutions, $1.2bn in AMD shares) only to write down $1.77bn less than a year later, and another $880 million six months after that? Note that the money borrowed for the ATI buyout (which has served as a millstone around AMD's neck ever since) is actually less than the write-down associated with AMD's initial overvaluation of ATI. Also note that AMD was the only company interested in buying ATI in 2006.



techtard said:


> ARM might be interesting, but I suspect that it will take forever for developers to switch over from x86. It could be a good thing if it happens though, a clean break for the next gen of computing.


That is why AMD acquired SeaMicro. Investing in a company that has an existing knowledge base of ARM and its implementation is easier and less resource-hungry than bootstrapping AMD into the ARM environment.


----------



## Melvis (Dec 14, 2013)

techtard said:


> It would be nice if they released something more efficient on AM3+; my 8320 is a power hog. I have to plug the hot-air ducting from my furnace or my computer room becomes stupid hot, even in the winter.



But your 8320 is running at 5GHz, isn't that sorta expected? :S

A stock-speed 8320 runs quite cool - like 50°C cool.

@HumanSmoke Also note that AMD approached Nvidia first to buy them out before they turned to ATi, which would have cost, at the time, another $6 billion.


----------



## TheoneandonlyMrK (Dec 14, 2013)

I have my FX going between a steady 5GHz down to 1.5GHz eco-style, with most eco features on, yet I can obviously force a permanent max clock, so it's possible to run 5GHz cheaply. Many don't, but some do.


----------



## HumanSmoke (Dec 14, 2013)

Melvis said:


> @HumanSmoke Also note that AMD approached Nvidia first to buy them out before they turned to ATi, which would have cost, at the time, another $6 billion.


One has little, if anything, to do with the other. Nvidia was (and is) worth substantially more than ATI. If the AMD/Nvidia deal had gone ahead, Jen-Hsun Huang would have been CEO of the new company. Do you think a combined AMD/Nvidia under JHH would have made a company weaker than what we got from AMD/ATI under Hector Ruiz?

A failed buyout/merger attempt does not mitigate the fact that AMD overpaid for ATI, and that the overpayment resulted in the company pouring income into debt servicing rather than R&D and maintaining its foundry business. Nor does it mitigate the fact that AMD were slow to realize that the market for x86 was increasing substantially faster than their own estimates.
AMD have always been a reactive company that has allowed the current state of the market to dictate their product lines, rather than think strategically and actually shape or create the market. Hardly surprising when you consider that AMD was formed by salesmen, as opposed to Intel and Nvidia, which were formed by engineers.


----------



## Melvis (Dec 15, 2013)

HumanSmoke said:


> One has little, if anything, to do with the other. Nvidia was (and is) worth substantially more than ATI. If the AMD/Nvidia deal had gone ahead, Jen-Hsun Huang would have been CEO of the new company. Do you think a combined AMD/Nvidia under JHH would have made a company weaker than what we got from AMD/ATI under Hector Ruiz?
> 
> A failed buyout/merger attempt does not mitigate the fact that AMD overpaid for ATI, and that the overpayment resulted in the company pouring income into debt servicing rather than R&D and maintaining its foundry business. Nor does it mitigate the fact that AMD were slow to realize that the market for x86 was increasing substantially faster than their own estimates.
> AMD have always been a reactive company that has allowed the current state of the market to dictate their product lines, rather than think strategically and actually shape or create the market. Hardly surprising when you consider that AMD was formed by salesmen, as opposed to Intel and Nvidia, which were formed by engineers.



That's right, as I said - $6 billion more. ATi was $5.x billion to buy compared to Nvidia at $11 billion. And the reason it didn't go through was that the CEO of Nvidia wanted to be the CEO of both companies, which was a pipe dream for him - as if the CEO of AMD would step aside? AMD at the time was worth between $25-30 billion. I wish it had gone through back then, as I liked Nvidia a lot more than ATi, but the CEO was an idiot. To answer your question: YES, I think it would have made the company weaker if JHH had taken over as CEO - what the hell does he know about CPUs? Nvidia has been going downhill ever since and ATi/AMD have been growing ever since; Nvidia's loss, I say. I don't think they overpaid for ATI, as it was less than half the cost of Nvidia, and that saving has most likely helped a lot in recent years with their struggles as it is - just imagine if they had spent that extra $6 billion? :S Well, I think that's untrue, as AMD has said for years now that the future is Fusion, and let's face it, they have been right; their APU lineup has been selling like hot cakes. Anyway, I'm hungry - lunch!!


----------



## HumanSmoke (Dec 15, 2013)

Melvis said:


> That's right, as I said - $6 billion more. ATi was $5.x billion to buy compared to Nvidia at $11 billion. And the reason it didn't go through was that the CEO of Nvidia wanted to be the CEO of both companies, which was a pipe dream for him - as if the CEO of AMD would step aside? AMD at the time was worth between $25-30 billion.


Well firstly, as I pointed out, this has nothing to do with AMD's current state of affairs, since, y'know, the deal never happened.
Secondly, if you're gonna pull facts out of your arse:
On July 21st 2006 the market valued NVIDIA at $6.2 billion; on the day AMD announced they were going to buy ATI, NVIDIA's market cap increased to $6.9 billion... and AMD sat at around $10.5 billion.
Given that AMD (over)paid twice ATI's effective value, I could see how you'd think that Hector would also overpay for Nvidia.


Melvis said:


> Nvidia has been going downhill ever since and ATi/AMD have been growing ever since; Nvidia's loss, I say.


Why am I not surprised. AMD's market cap is a quarter of its FY 2006 value, and they've lost market share in x86 and GPU (discrete and overall) since the ATI acquisition. Are you Hector Ruiz's biographer by any chance?


----------



## Melvis (Dec 15, 2013)

HumanSmoke said:


> Well firstly, as I pointed out, this has nothing to do with AMD's current state of affairs, since, y'know, the deal never happened.
> Secondly, if you're gonna pull facts out of your arse:
> On July 21st 2006 the market valued NVIDIA at $6.2 billion; on the day AMD announced they were going to buy ATI, NVIDIA's market cap increased to $6.9 billion... and AMD sat at around $10.5 billion.
> Given that AMD (over)paid twice ATI's effective value, I could see how you'd think that Hector would also overpay for Nvidia.
> ...




Firstly, I didn't say it was, but if it had happened then yes, it would have been very much so.
Secondly, I'm not pulling anything out of my arse. I remember very clearly reading all about it back then, on this very forum - the whole AMD-might-buy-Nvidia-but-didn't-and-bought-ATi saga. The only part I did get wrong was what AMD was worth; I confused it with market share at the time.

Here are three articles that also show what Nvidia was worth at the time AMD was thinking about buying. If these are wrong then don't blame the reader, blame the site for posting false information.

http://www.tomsguide.com/us/amd-nvidia-merger,review-1061-4.html

http://www.forbes.com/sites/brianca...nvidia-about-acquisition-before-grabbing-ati/

http://www.neowin.net/news/rumor-amd-tried-to-buy-nvidia-before-buying-ati

Not from what I read on this forum. Once again, AMD is now closer to 40-50% market share; yes, it has lost in its CPU division, but no way have they lost in the GPU division - if anybody thinks that, they're totally mad. And how can they have lost market share in GPUs since 2006? That's impossible, as they never owned any GPU division until after they bought ATI. :S Nope, I'm not - what gives you that stupid idea?


----------



## HammerON (Dec 15, 2013)

Alright folks. Stay on the topic at hand:  "Vishera" End Of The Line for AMD FX CPUs: Roadmap


----------



## qubit (Dec 17, 2013)

This outcome is hardly surprising. Since the brutal disappointment a couple of years ago that was Bulldozer, the writing has been on the wall.

Now we all play Intel's monopoly tune on CPU upgrades. What fucking joy.


----------



## eidairaman1 (Dec 17, 2013)

I still refuse to get Intel personally.


----------



## TRWOV (Dec 17, 2013)

Would it be possible to get AMD to specify whether they're going to re-release Vishera at 28nm? Kabini (Jaguar) and Kaveri (Steamroller) are going 28nm; it would be interesting if Vishera did as well. An FX-8550 at 5GHz @ 125W would be a good swan song for AM3+, IMO.


----------



## xkche (Dec 20, 2013)

If you know Spanish:

http://www.chw.net/2013/12/amd-alista-futuros-cpus-fx-series-con-controlador-de-memoria-ddr4/


----------



## itsakjt (Dec 20, 2013)

xkche said:


> If you know spanish:
> 
> http://www.chw.net/2013/12/amd-alista-futuros-cpus-fx-series-con-controlador-de-memoria-ddr4/



We have this


----------



## MxPhenom 216 (Dec 20, 2013)

Good news, I think. Intel is kind of in a league of its own with their enthusiast chips that cost an arm and a leg (Sandy Bridge-E and Ivy Bridge-E). Since Intel's APUs are pretty much Sandy/Ivy/Haswell, what I want to see is really tight competition between AMD and Intel APUs. AMD has some work to do on the CPU side of things, whereas Intel has a lot of work to do on the GPU side. If that got evened out, we, as consumers, could get some pretty good chips at competitive prices. Things could get interesting in the future.


----------



## newtekie1 (Dec 20, 2013)

qubit said:


> This outcome is hardly surprising. Since the brutal disappointment a couple of years ago that was Bulldozer the writing was on the wall.
> 
> Now we all play Intel's monopoly tune on CPU upgrades. What fucking joy.



The only thing that really made Bulldozer a disappointment was AMD's hype. I believe if they hadn't hyped the crap out of Bulldozer and touted it as the next CPU god that was going to wipe the floor with Intel, and instead just told the truth - "It's going to be close enough to Intel in single-threaded apps, matching or bettering Intel in multi-threaded apps, and cheaper than Intel, with more features" - it wouldn't have been such a huge disappointment.



eidairaman1 said:


> i still refuse to get Intel personally,



While I don't refuse to buy Intel, AMD has mostly been receiving my money lately. The only Intel I've bought was my recent laptop purchase, and that simply came down to the Intel laptop being on sale for less than the AMD one. The week before, when I was looking, the AMD was cheaper, so I would have bought AMD.

However, in terms of value for the money in the class of machines I've been building, AMD has been the winner, especially since I don't have to spend $250+ just to get a processor I can overclock. I like that I can still buy a cheap processor and get a few more horsepower out of it by overclocking with an AMD.



TRWOV said:


> Would it be possible to get AMD to specify whether they're going to re-release Vishera at 28nm? Kabini (Jaguar) and Kaveri (Steamroller) are going 28nm; it would be interesting if Vishera did as well. An FX-8550 at 5GHz @ 125W would be a good swan song for AM3+, IMO.



In the official slide released in the other thread on this topic, they confirmed that the AM3+ socket processors would stay on 32nm through at least 2014.  I was kind of hoping for a 28/22nm refresh too, but it doesn't look hopeful.


----------



## TheoneandonlyMrK (Dec 20, 2013)

What, so AMD are still going to make FX chips AND possibly still evolve them short-term on AM3+, until DDR4 is in the wild and PCIe 3.0 really matters, before we see AM4? Never.


----------



## qubit (Dec 21, 2013)

newtekie1 said:


> The only thing that really made Bulldozer a disappointment was AMD's hype. I believe if they hadn't hyped the crap out of Bulldozer and touted it as the next CPU god that was going to wipe the floor with Intel, and instead just told the truth - "It's going to be close enough to Intel in single-threaded apps, matching or bettering Intel in multi-threaded apps, and cheaper than Intel, with more features" - it wouldn't have been such a huge disappointment.


Agreed. Even the name implies something that's gonna smash the competition wide open. Give that name to something that clearly can't, then hype it up, and you make AMD a laughing stock worthy of the stinging user criticism and string of disappointed reviews that they got. I hope someone high up in marketing was fired for pulling this stunt.

AMD eventually went the value route with their CPUs, and they do give you a lot of CPU for your money, but it looks like this isn't profitable enough for them, as they need to invest the money back into R&D and these things aren't cheap to make.

I think there's no getting around the fact that each generation of products from one manufacturer should generally leapfrog the performance of the competition in order to stay in business - keeping your market fresh and vibrant, and your customers dissatisfied enough with their current systems to want to upgrade. Clearly that hasn't been happening.

With all this so-called "good enough" performance, it becomes a race to the bottom on prices that can't be sustained forever. The fact that we're also approaching the end of Moore's law really isn't helping, either.

If clock speeds had continued to scale past the Pentium 4's 3GHz+ of 2003 - to, say, around 20GHz+ now - along with architectural improvements, I reckon our PCs would be capable of many more fancy and, importantly, useful functions that we're not seeing today.


----------



## HumanSmoke (Dec 21, 2013)

qubit said:


> If clock speeds had continued to scale past the Pentium 4's clock speeds of 3GHz+ back in 2003, to say around 20GHz+ now, along with architectural improvements, I reckon our PCs would be capable of many more fancy and importantly, useful, functions that we're not seeing today.


Wouldn't happen. Couldn't happen.
Raw clockspeed = higher branch misprediction costs, increased heat and power.
More numerous, shorter pipelines, forsaking absolute speed for an actual increase in throughput in a lower power envelope, have proven to be the way to go. If NetBurst taught anyone anything, it's that straight-line speed from a deep pipeline has limited growth potential.
There doesn't seem to be a paradigm shift in material usage in the offing in the short term like the one that made the "gigahertz race" a spectator sport. Moving from aluminium to copper interconnects was a huge leap; moving to more esoteric materials (indium/gallium compounds) doesn't look like it will net the same revolutionary jump.
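The misprediction point can be put in rough numbers with the textbook effective-CPI estimate: each mispredicted branch costs roughly a pipeline refill, so the penalty grows with depth. The figures below are purely illustrative (the branch fraction and miss rate are assumed, not measured), but they show why a deep pipe needs a big clock bump just to break even:

```python
# Back-of-envelope: effective cycles-per-instruction with branch stalls.
def effective_cpi(base_cpi, branch_frac, mispredict_rate, flush_penalty):
    """flush_penalty ~ pipeline depth: every miss refills the pipe."""
    return base_cpi + branch_frac * mispredict_rate * flush_penalty

# Same assumed workload (20% branches, 5% mispredicted), two pipe depths:
short = effective_cpi(1.0, 0.20, 0.05, 14)   # shallower pipeline
deep  = effective_cpi(1.0, 0.20, 0.05, 31)   # NetBurst-Prescott-class depth
print(short, deep)   # 1.14 vs 1.31

# Throughput ~ clock / CPI, so the deep design must clock ~15% higher
# just to match - before heat and power even enter the picture.
```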


----------



## qubit (Dec 21, 2013)

HumanSmoke said:


> Wouldn't happen. Couldn't happen.
> Raw clockspeed = branch misprediction increases, increased heat and power.
> More numerous shorter pipelines forsaking absolute speed for an actual increase in throughput in a lower power envelope has proven to be the way to go. If NetBurst taught anyone anything, it's that straight line speed from a deep pipeline has limited growth potential.
> There doesn't seem to be a paradigm shift in material usage in the offing in the short term that made the "Gigahertz race" a spectator sport. Moving from aluminium to copper interconnects was a huge leap. Moving to more esoteric materials (Indium/Gallium compounds) doesn't look like it will net the same revolutionary jump.


I think you've missed my point. I know they can't make CPUs run at 20GHz; I'm talking about what could have been. Mix product competition and architectural efficiency improvements with a blistering clock speed, and performance would have been _waay_ better than we see now.

This would have quite likely enabled new functions and features that we can't even think of now, because we're effectively in a "box" that we can't see out of. Raw speed has enabled many things we take for granted today, so revving it right up would have likely given us many more things like this than we have now. Better artificial intelligence would have probably been one of them.


----------



## TRWOV (Dec 21, 2013)

theoneandonlymrk said:


> What, so AMD are still going to make FX chips AND possibly still evolve them short-term on AM3+, until DDR4 is in the wild and PCIe 3.0 really matters, before we see AM4? Never.



As newtekie mentioned, based on the official AMD slide it's unlikely that Vishera is going to be improved. I was hoping for a 28nm Vishera but it seems that it will remain 32nm until AM3+ is retired. Of course, plans change and we could be talking about the 28nm FX-8550 in a few months *crosses fingers*


----------



## HumanSmoke (Dec 21, 2013)

qubit said:


> I think you've missed my point. I know they can't make CPUs run at 20GHz. I'm talking about what could have been.


That was my point. It couldn't have been - it would be physically impossible... unless you think that a single core with the energy budget of a POWER7 module was feasible for a desktop CPU. Theoretically, I think it's closer to Galaxy Quest or Farscape than actual real life...
If you're musing on what might have been, there are plenty of "what if" scenarios that actually could have happened and would have substantially more impact on the industry:
1. Bob Noyce doesn't invest and supply start up capital for W. Jerry Sanders III. Without Noyce's investment, other backers shy away (as it was, Sanders only made the investor deadline with five minutes and $5K to spare). AMD kaput before it starts, IBM's second source for 8088 processors likely falls to National Semi, Motorola, or Zilog.
2. Gary Kildall actually gives a shit about running a company and keeps his appointment with IBM's reps rather than disappearing to fly his plane. IBM chooses CP/M for the Model 5150... Bill Gates and MS-DOS don't get a look in.
3. Jim Harris, Bill Murto, and Rod Canion decide to sink their money into a Mexican restaurant instead. Compaq doesn't happen, the IBM ROM-BIOS isn't reverse engineered, and the IBM PC clone business either doesn't happen or is stalled past the tipping point where anyone can undercut Big Blue. More to the point, IBM would then have realized that personal computing's growth warranted more attention and budget than was being lavished on its mainframe and minicomputer business.

These things all could have happened quite easily, just as Hewlett-Packard could have listened to Steve Wozniak when he approached them about building a personal computer.


qubit said:


> Better artificial intelligence would have probably been one of them.


Actual intelligence (i.e. the brain) uses parallelization. Speed is pretty much a constant AFAIK limited by chemical and electromagnetic action. Boosting the latter seems to lead to erratic behaviour (analogous to cache misses ?), losing parallelization (lowering core count ?) leads to Alzheimer's and a new found love of reality TV.


----------



## qubit (Dec 21, 2013)

I'm not really sure what you're arguing about? I was just musing dude.


----------



## HumanSmoke (Dec 21, 2013)

qubit said:


> I'm not really sure what you're arguing about? I was just musing dude.


Ah, okay. For musing, 20GHz seemed quite conservative. How about 1024 cores @ 1THz with a TDP of 5 watts and an MSRP of $9.99 ?


----------



## qubit (Dec 21, 2013)

2THz


----------



## Steevo (Dec 21, 2013)

Soon we will be limited by the speed of electricity through the semiconductor traces and wire. Then we are on to optical multiplier processors or to emi processors. Then possibly to quantum processors or quantum bit check emi processors.


----------



## HumanSmoke (Dec 21, 2013)

Steevo said:


> Soon we will be limited by the speed of electricity through the semiconductor traces and wire. Then we are on to optical multiplier processors or to emi processors. Then possibly to quantum processors or quantum bit check emi processors.


I hope you get due recognition when AMD incorporate all this into their 2014-15 roadmap PPS next month 

/Waits for WCCF to repackage this as front page article


----------



## TheoneandonlyMrK (Dec 21, 2013)

HumanSmoke said:


> I hope you get due recognition when AMD incorporate all this into their 2014-15 roadmap PPS next month
> 
> /Waits for WCCF to repackage this as front page article


Yeah, using old-style switching logic on graphene they could have 20GHz or more in the bag without tickling quantum's tum.


----------



## BiggieShady (Dec 21, 2013)

Steevo said:


> Soon we will be limited by the speed of electricity through the semiconductor traces and wire.



Speed of electricity is actually the speed of light (the one we are always limited by) - interestingly enough, the movement and speed of electrons in a (semi)conductor have nothing to do with this - it's the disturbance in the electromagnetic field that travels. Use the force.


----------



## Ravenas (Dec 21, 2013)

> "AMD will continue to supply AM3+ and AMD FX processors for the foreseeable future, as per AMD's official roadmap update at APU'13 [above]. Recently, AMD launched the FX-9000 series, AMD's fastest desktop processor to date. As AMD's business continues to evolve, AMD will focus on the areas of growth including support for the desktop PC enthusiast leveraging AMD's world-class processor design IP, including heterogeneous compute. AMD's FX branded products will continue to evolve and we look forward to sharing those updates in the future," said James Prior, an AMD manager of APU/CPU product reviews, in a conversation with Gamer’s Nexus web-site.





The roadmap supplied by the OP seems like a stretch of the imagination. FX processors are a core business of AMD.


----------



## Steevo (Dec 22, 2013)

BiggieShady said:


> Speed of electricity is actually the speed of light (the one we are always limited by) - interestingly enough, the movement and speed of electrons in a (semi)conductor have nothing to do with this - it's the disturbance in the electromagnetic field that travels. Use the force.


http://en.wikipedia.org/wiki/Speed_of_electricity

Well understood years ago. But take a trace operating at 70% of the speed of light:

209,854,721 m/s

At a 5GHz switching rate, it can only travel

0.0419709442 m / s

and that is 1.6524 inches per second, so this is the longest any trace can be, assuming everything works perfectly - and you have to be able to read and write data out of caches at this incredible rate too, unless you want a significantly higher percentage of time in wait states... which causes timing issues at speed as well.

So we are getting close to how fast we can make processors switch, unless they start accounting for capacitive roll-off at every switch and transistor, and the logic to do that is cost prohibitive. Intel was the first to find this theoretical limit when trying to reach the absurd speeds they thought possible for the P4; they wrote papers on it, and it, along with the poor performance and other issues, was the reason they moved to a shorter pipeline and higher IPC instead of higher clock speed. A horse AMD now seems stuck beating aimlessly.


----------



## xorbe (Dec 24, 2013)

Steevo said:


> and that is 1.6524 inches per second



Close, 1.6525 inches per _clock cycle_.  Less when you consider the propagation through transistors.
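The back-of-the-envelope figure is easy to re-derive. A quick sanity check in Python (assuming, as the posts above do, signal propagation at 70% of c and a 5GHz clock - the real propagation fraction depends on trace geometry and dielectric):

```python
# Distance a signal can cover in one clock cycle, assuming it propagates
# at 70% of the speed of light (a common rule of thumb for copper traces).

C = 299_792_458        # speed of light in vacuum, m/s
FRACTION = 0.70        # assumed propagation speed as a fraction of c
CLOCK_HZ = 5e9         # 5 GHz switching rate

speed_m_s = C * FRACTION              # ~209,854,721 m/s
per_cycle_m = speed_m_s / CLOCK_HZ    # metres covered per clock cycle
per_cycle_in = per_cycle_m / 0.0254   # same distance in inches

print(f"{per_cycle_m:.10f} m per cycle")   # 0.0419709441 m
print(f"{per_cycle_in:.4f} in per cycle")  # 1.6524 in
```

Which bears out the correction: roughly 1.65 inches is the distance covered per clock cycle, not per second.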


----------



## TheHunter (Dec 28, 2013)

All they need to do is make those APUs 8-core and all will be good.

I see the newest one is already Steamroller.


----------



## NeoXF (Dec 30, 2013)

FX APUs anyone? (As in beyond "A10s").


----------



## eidairaman1 (Dec 31, 2013)

NeoXF said:


> FX APUs anyone? (As in beyond "A10s").


The 8-core APUs will probably be A10s, A8s will be hex-cores, A6s will be quads, and A4s will be duals.

It would only make sense if they lined them up according to the number - so A10s are ten-core, etc.


----------



## TRWOV (Dec 31, 2013)

I suppose 6-core APUs are a strong possibility once AMD moves to 28/22nm.


----------



## NeoXF (Dec 31, 2013)

What I meant was: APUs with more cores for software/workloads that just won't scale well with HSA, with L3 and maybe eDRAM (L4?) as well, and higher TDPs. I imagine this probably won't happen on 28nm - 20nm is a big contender, especially since AMD already announced its TDPs for the stock family of APUs will top out at 65W. And I don't know about you guys, but 95-100W sounds about right to me; hell, looking at Intel, 130-150W for an HPC-targeted platform sounds just about right.

6M/12C Excavator w/ L3, triple/quad DDR4 IMC and 16 PI GCN compute units @ 20nm @ 125W TDP please? 

In any case, I don't think AMD will release a 6-core Kaveri anytime soon unless they change up the roadmap (it wouldn't make sense with 20nm 65W Carrizo replacing 28nm 95W Kaveri, I guess). But I think we can expect a Kaveri clock refresh later next year, though.


----------



## TheoneandonlyMrK (Dec 31, 2013)

Where's bta? Guy needs a head sort.

AMD retorted to this tale of woe not long after it was released, stating:

FX is not dead, is not being sidelined or forgotten, and will progress into the future.

AM3+ wasn't mentioned positively or negatively, but likely won't get beyond 2014,

and the slide in question is BS, not AMD's.

Also, its one-year view doesn't state what comes after FX because they are not discussing that at this time - not that when the chart ends, so does FX. That's just not a fact.



8-core APUs (or even 6) won't arrive until TSV 3D chips become easy (and cheap) to make (2015-16), and you can bank that opinion as fact. Until then, server scrap parts will make up future FX chips (also bankable), and IMHO before 2015 AM3+ WILL see Steamroller cores (I'm banking on this one, though it is just my opinion, based on future server upgrade options for the big-data crowd).


----------



## TRWOV (Dec 31, 2013)

theoneandonlymrk said:


> Where's bta? Guy needs a head sort.
> 
> AMD retorted to this tale of woe not long after it was released, stating:
> 
> ...




Huh? Nobody said the FX line is dead, just that Vishera is the last core developed for AM3+. In fact, AMD's retort just confirmed it: AM3+ won't get Steamroller; it won't even get a 28nm refresh. 

Now what bta should do is change the title: I didn't know this many people were dyslexic - lots are reading "End of the Line" as "End of Life".


----------



## TheoneandonlyMrK (Dec 31, 2013)

Many on here are, that's... ahh, feck it.


"AMD will continue to supply AM3+ and AMD FX processors for the foreseeable future, as per AMD's official roadmap update at APU'13"   



No end point or date was given for FX or AM3+.



Just because a chart ends does not mean anything; the next chart will dictate what's next and how long AM3+ is here.

No mention of Vishera's reign or any future plans was made at all - in fact, the only Vishera comment was that they had just brought out the FX-9xxx parts.

How would it be wise to talk up what's next while you are still clearing stock of what's here?


----------



## xenocide (Jan 1, 2014)

theoneandonlymrk said:


> "AMD will continue to supply AM3+ and AMD FX processors for the foreseeable future, as per AMD's official roadmap update at APU'13"


 
There's a huge difference between supply and support.  Sure, they will continue to manufacture and sell AM3+ parts, but will they develop a new CPU to put in there?  No.  The big concern is AMD either consolidating the FM2+ socket or developing a whole new socket for FX-series CPUs.  People have grown very accustomed to AMD continually supporting upgrade paths; removing them might piss some folks off.


----------



## NeoXF (Jan 1, 2014)

xenocide said:


> There's a huge difference between supply and support.  Sure, they will continue to manufacture and sell AM3+ parts, but will they develop a new CPU to put in there?  No.  The big concern is AMD either consolidating the FM2+ socket or developing a whole new socket for FX-series CPUs.  People have grown very accustomed to AMD continually supporting upgrade paths; removing them might piss some folks off.


Well, FM2+ is here to stay until mid-2015 at least, when I predict AMD will *have to* update its APU platform for the DDR4 version of Carrizo.

That's also the best time, I think, for AMD to make a real successor to 990FX and AM3+, since DDR4 will be around for a while. Sure, one can speculate about SATA Express, PCI Express 4.0, maybe support for more than two 64-bit buses, and so on.
Edit: So they might as well merge the two.


----------



## micropage7 (Jan 1, 2014)

TRWOV said:


> I suppose 6-core APUs are a strong possibility once AMD moves to 28/22nm.


6 cores with a better performance-per-watt ratio


----------



## itsakjt (Jan 1, 2014)

I want a Phenom III X8 with all new advanced instruction sets, high clock speeds and an awesome IMC.


----------



## TheoneandonlyMrK (Jan 2, 2014)

itsakjt said:


> I want a Phenom III X8 with all new advanced instruction sets, high clock speeds and an awesome IMC.


How about you have a think? Many went from a Phenom II to an FX, as I did, and I can resolutely say your suggestion is daft. FX easily bests any Phenom, and in scenarios that DO use all supplied cores - Folding@home, crunching, some games (BF3/4, for example) - the FX chips keep up with Intel's similar parts easily. I suppose AMD could just call it a Phenom III to please you, though that wouldn't make sense.


And micropage etc.: AMD would wisely use that extra space for MOOOAR shaders, not two more x86 cores - though maybe "multi-core" will truly be multi-(type)-core in that future too.


----------



## enkidu_WM (Feb 1, 2014)

You know, one thing I have to say is that AMD has been given the short end of the stick.  It has really been the true innovator all along, and each processor has been capable of greater performance gains over its Intel counterpart; it is just an issue of how code is processed.  Intel cannot truly attain independent processing with its cores, relying heavily on 4 cores with one process and virtualizing it all.  AMD is able to perform in a manner befitting individual cores; however, programs send information in a format that benefits Intel. 4 people can do one thing faster, but 4 people can also do four things individually, thereby getting more done.
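The "4 people, one thing versus four things" trade-off the post gestures at is usually formalized as Amdahl's law (speedup of a single task) versus plain throughput scaling (independent tasks). A minimal sketch, not tied to any particular AMD or Intel design:

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Speedup of ONE task when only part of it can run in parallel."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

def throughput_speedup(cores: int) -> float:
    """N fully independent tasks finish N at a time: linear scaling."""
    return float(cores)

# A single task that is 80% parallelizable gains only 2.5x on 4 cores...
print(amdahl_speedup(0.8, 4))    # 2.5
# ...while 4 independent tasks get the full 4x in aggregate throughput.
print(throughput_speedup(4))     # 4.0
```

This is why "more things done" (throughput) scales with core count far more easily than "one thing done faster" (latency).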


----------



## karakarga (Jan 6, 2015)

AMD 990FX chipset is PCI Express 2.0 compliant. Except Asus Sabertooth 990FX/Gen3 R2.0 mainboard, there is no PCI Express 3.0 mainboard by AMD!

Intel is planning to build new PCI Express 4.0 compliant mainboards at the last quarter of 2015 this year.

If AMD do not wish/need to ready a PCI Express 3.0 mainboard for high segment, this means that, they will not ready any PCI Express 4.0 chipset nor mainboard. So this means, if AMD can not jump to a lower level branch, they can not possibly reach to a higher level.

AMD is dying. Again, if they can not reach enthusiasts, they can not earn much money. Low level APU's are not much profitable. This means, they are close to high profit marge.

Sooner or later, they will close the shop! Their graphics card serie is not so good. This 2014 summer and the earlier 2013 summer, AMD did not managed to release a new graphics driver for 3-4 months. nVidia released 2 or more at the summer period for gamers.

If you have lately bought an AMD graphics card, all summer long in 2 years, you couldn't manage to load a new driver for your graphics card, which is a bad thing!

AMD graphics cards are working louder, compared with same level nVidia counterpart.

Shortly, AMD is no longer a good choice at all....


----------



## TRWOV (Jan 6, 2015)

New user *check*
Year old thread *check*
Inflammatory comment *check*
Broken English *check*

I smell something...


----------



## eidairaman1 (Jan 7, 2015)

Yeah its a shit stain


----------



## Steevo (Jan 7, 2015)

We should have a new-member thread, where you have to post at least once before anyone approves you.


And if it's a failure like these... human spam machines? Their IP and email should be published, you know, for the lulz.


----------



## xvi (Jan 7, 2015)

I sometimes come across an old thread, not realizing it's an old thread, and I try to come up with a reply that contributes to the conversation only to be stopped by the "Click this checkbox if you _really_ want to necro this". I struggle a little, but the checkbox always wins. I suppose that for some, it doesn't.

An introduction thread on TPU is something worth considering, I think, but it could hurt a bit. I've seen some people register just to ask a question and stick around afterwards because the community is pretty cool around here. Having an extra hoop to jump through would likely deter new members.

If a user gets out of hand, there's always the report button and the mods here have always been great (from what I've seen, at least). To quote Churchill, "democracy is the worst form of government, except for all the others."


----------



## de.das.dude (Jan 7, 2015)

I had a talk with AMD a week ago. They will give us something in the FX lineup in 1.5-2 years' time, and it will be really something - not a rushed job like the FX.


much amd many wow
such me 
amaze


----------



## FreedomEclipse (Jan 7, 2015)

de.das.dude said:


> I had a talk with AMD a week ago. They will give us something in the FX lineup in 1.5-2 years' time, and it will be really something - not a rushed job like the FX.
> 
> 
> much amd many wow
> ...



Or maybe it was one of AMD's infamous PR machines you were talking to, which does nothing but bang the drum for AMD and get people hyped before letting them down gently...

To coin a phrase - 'Phenomenal' - then Intel turned around with a worn rolled-up newspaper and beat 50 shades of green out of AMD.


--- You can 'talk' to AMD folk, but unless they're from the top, I wouldn't believe a word from any of the lackeys they send to trade shows or exhibitions. Apart from showcasing their hardware and innovations and answering questions, their job is to hype future products.


----------



## de.das.dude (Jan 7, 2015)

FreedomEclipse said:


> Or maybe it was one of AMD's infamous PR machines you were talking to, which does nothing but bang the drum for AMD and get people hyped before letting them down gently...
> 
> To coin a phrase - 'Phenomenal' - then Intel turned around with a worn rolled-up newspaper and beat 50 shades of green out of AMD.
> 
> ...




It wasn't a PR meet - a meetup with sales people.
Can't be arsed about what AMD's shit PR dept does; actually made quite a few complaints regarding them.


----------



## GLD (Jan 9, 2015)

karakarga said:


> AMD 990FX chipset is PCI Express 2.0 compliant. Except Asus Sabertooth 990FX/Gen3 R2.0 mainboard, there is no PCI Express 3.0 mainboard by AMD!
> 
> Intel is planning to build new PCI Express 4.0 compliant mainboards at the last quarter of 2015 this year.
> 
> ...



Lame.


----------

