Allow me to make the performance arguments, so this thread doesn't become a pissing contest.
1) AMD chips have a higher TDP than Intel, so they must be less efficient.
No. AMD and Intel measure TDP differently. Between the differing measurement methods and the completely different architectures, an efficiency case can be made for either side. There is no clear winner here.
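If you genuinely want to compare efficiency, the honest way is measured wall power against measured throughput, not the TDP printed on the box. A minimal sketch of that calculation, with made-up placeholder scores and wattages rather than real measurements:

```python
# Rough performance-per-watt comparison. Every number below is a
# hypothetical placeholder -- plug in your own benchmark scores and
# wall power measured under load.
chips = {
    "Chip A": {"benchmark_score": 9000, "load_watts": 140.0},
    "Chip B": {"benchmark_score": 8200, "load_watts": 110.0},
}

for name, data in chips.items():
    points_per_watt = data["benchmark_score"] / data["load_watts"]
    print(f"{name}: {points_per_watt:.1f} points per watt")
```

Notice that TDP never enters the calculation, which is exactly the point.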
2) Intel chips don't clock as high as AMD ones, so AMD makes better chips.
This one is generally true. If you're looking for bragging rights about the highest clock, then AMD wins. The reality is that both manufacturers' chips take huge amounts of power to do this. You don't run a CPU at peak frequencies constantly unless you want a huge power bill and a rapidly deteriorating chip. For everyday use, either manufacturer produces a solidly performing chip.
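To see why peak clocks cost so much power: dynamic power scales roughly with capacitance times voltage squared times frequency, and higher clocks generally need more voltage, so power climbs far faster than the clock does. A back-of-the-envelope sketch, where the capacitance, voltage, and frequency figures are illustrative guesses rather than real chip data:

```python
# Dynamic power roughly follows P = C * V^2 * f. Because higher clocks
# usually require higher voltage, power rises much faster than frequency.
# All values below are illustrative placeholders, not measurements.
C = 2.0e-8  # effective switched capacitance (farads), arbitrary scale

operating_points = [
    (3.5e9, 1.10),  # (frequency in Hz, core voltage in volts)
    (4.5e9, 1.30),
    (5.0e9, 1.45),
]

for freq, volts in operating_points:
    power = C * volts ** 2 * freq
    print(f"{freq / 1e9:.1f} GHz @ {volts:.2f} V -> roughly {power:.0f} W")
```

In this toy example, going from 3.5 GHz to 5.0 GHz is about a 43% clock bump for roughly 2.5x the power, which is why nobody sane runs at the ragged edge all day.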
3) Intel and AMD don't count cores the same way.
Absolutely. Intel uses traditional cores, while AMD's module design shares resources, most notably the floating-point unit, between pairs of cores. A four-core Intel chip doesn't match a four-core AMD chip, a dual-core chip with Hyper-Threading doesn't match a four-core AMD chip, and none of this matters much. The module approach isn't aimed at the consumer CPU market, where only a handful of programs use more than a couple of cores. People who actually need more cores are doing server-related work, crunching, or running encoding software.
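For anyone who wants to see how muddy "core count" already is on their own machine, here is a quick sketch that reports logical versus physical core counts. It leans on the third-party psutil package for the physical count, since the standard library only exposes the logical number:

```python
import os

import psutil  # third-party: pip install psutil

logical = os.cpu_count()                    # hardware threads the OS schedules on
physical = psutil.cpu_count(logical=False)  # physical cores as the OS reports them

print(f"Logical CPUs:   {logical}")
print(f"Physical cores: {physical}")
if logical and physical and logical > physical:
    print("The logical count includes SMT (Hyper-Threading) siblings.")
```

On a Hyper-Threaded Intel chip the two numbers typically differ by a factor of two; comparing either number across vendors with different designs is exactly where the arguments above come from.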
Now that the silly arguments have been made, can we get back on topic? AMD looks to be aiming squarely at the server market, without any bashfulness. Assuming that is the case, it seems like they are taking a large step back into competing with Intel. This bodes well for more reasonably priced servers, but more importantly it could be parlayed into something interesting on the desktop CPU front. Anyone care to comment on that, rather than on how awesome or terrible they think the current parts are?