Monday, August 27th 2012

AMD "Vishera" FX-Series CPU Specifications Confirmed
A leaked AMD document for retail partners spelled out the specifications of the first three FX "Vishera" processors. The new CPUs incorporate AMD's "Piledriver" architecture, and much like the first-generation "Zambezi" chips, will launch as one each of eight-core, six-core, and four-core parts. The eight-core FX-8350 is confirmed to ship with a 4.00 GHz nominal clock speed and a 4.20 GHz TurboCore speed. The six-core FX-6300 ships with 3.50 GHz nominal and 4.10 GHz TurboCore speeds. The quad-core FX-4320, on the other hand, ships with the same clock speeds as the FX-8350. In addition, the document confirmed clock speeds of several socket FM2 A-series APUs, such as the A10-5700 and the A8-5500.
Source: Expreview
493 Comments on AMD "Vishera" FX-Series CPU Specifications Confirmed
Maxwell (2014) vs ? (2013-2014?)
I think AMD's graphics division is the only one I would bet on
seriously lol
Yes, APUs are a BIG focus as it's the future and AMD is headed there quickly, but that in NO WAY means lesser-quality dedicated graphics, none whatsoever.
now. STOP TROLLING!!!!!!!!!!!!
Can you provide a link or a source for what you have just said?
I am waiting to see what they have coming in the next-gen GPUs.
They would not just abandon a product that has had great success.
Why would anybody want AMD to fail? You know as well as everybody that we need competition so we have a choice.
As Super XP said - we need competition.
If AMD were to go away, Intel would be the only x86 manufacturer left. That would mean there would be nothing stopping Intel from, say, raising their prices by 500%. There is also an even more horrible consequence: there would be no need for Intel to spend money developing better chips. They could simply sell the current design for years, as there would be no alternative competing against them by making better chips.
It's like AMD does not even exist.
If AMD were true competition for Intel, Intel's prices would drop even more.
I don't really understand you.
Excavator, Lawn Mower, Weed Trimmer, Electric Toothbrush :D :laugh:
On Topic- I have 4 AM3+ sockets waiting for some Vishera silicon so bring it on!!! :cool:
I think a valid point that some should note is just how good a chip Sandy Bridge was; Ivy Bridge isn't a bad follow-up, but Sandy stepped up the game more. So to me, if this chip is as good as a 2600K and at the right price, then all's good. Oh, but it has to be 5 GHz capable; then it's another reasonably cheap upgrade :D
AMD launched the Athlon and tweaked it for years until it became the Athlon 64, before moving on to Phenom. They tweaked that for several years, then they released FX.
They will tweak that for several years, and then we'll get something new.
FX chips appeal quite well to servers, not so much to desktops, but AMD focusing on the server side of things is not new either.
In fact, it's the same old story, only better, because AMD is selling basically every chip it can make right now, CPU and GPU alike. They are very much a success business-wise. Performance-wise... well, not everyone needs a BMW.
That was my whole point in posting my pic of the FM2 APU, an unreleased chip that was minted like nine months ago. They needed that time to build stock while they sold out on fab time. AMD is back on the way up, but I don't think high performance is worth the risk they would have to take, so really, I find no fault at all with AMD or their products.
I just wish Eyefinity and multi-GPU CrossFire got a bit more support.
See below:
Then the last few drivers have had some minor issues, and 12.8 has caused a couple of issues with my card, which has been rock solid for years now.
If they can't figure out a way to deliver a solid product across the platforms I use, I don't see a good reason to use them anymore. I'm not disappointed in my 1100T, nor my 5870, in terms of longevity, but the newer "upgrades" I have to choose from make me sick. So either I stay with an older CPU and buy a new GPU, or I build a whole new system, and AMD is still not playing in the same game as Intel, how many years after the Phenom bullshit?
Real men use real cores......FFS. Marketing a crap sandwich.
Nvidia even publicly admitted that the Fermi problems were 100% related to using automated tools to design the chip, and that those tools, or the failure to use an updated version of the tools, led to the Fermi problem.
corrected