Friday, May 24th 2013

Xbox One Chip Slower Than PlayStation 4

After bagging chip-supply deals for all three new-generation consoles -- Xbox One, PlayStation 4, and Wii U -- things are looking up for AMD. While Wii U uses older-generation hardware technologies, Xbox One and PlayStation 4 use the very latest AMD has to offer: the "Jaguar" 64-bit x86 CPU micro-architecture, and the Graphics Core Next GPU architecture. The chips that run the two consoles have a lot in common, but also a few less-than-subtle differences.

The PlayStation 4 chip, which came to light this February, is truly an engineer's fantasy. It combines eight "Jaguar" 64-bit x86 cores clocked at 1.60 GHz with a fairly well-spec'd Radeon GPU, which features 1,152 stream processors, 32 ROPs, and a 256-bit wide unified GDDR5 memory interface clocked at 5.50 GHz (effective). At these speeds, the system gets a memory bandwidth of 176 GB/s. Memory is handled UMA-style (unified memory architecture): there is no partition between system and graphics memory. The two are treated as parts of the same 8 GB pool, and either can use up a majority of it.
The Xbox One chip is a slightly different beast. It uses the same eight "Jaguar" cores at 1.60 GHz, but a slightly smaller Radeon GPU that packs 768 stream processors, and a quad-channel DDR3-2133 memory interface, which offers a memory bandwidth of 68.3 GB/s and holds 8 GB of memory. Memory is shared between the two subsystems in a similar way to PlayStation 4, with one small difference: the Xbox One chip uses a large 32 MB embedded SRAM cache, which operates at 102 GB/s, but at significantly lower latency than GDDR5. This cache cushions data transfers for the GPU. Microsoft engineers are spinning this as "200 GB/s of memory bandwidth," by somehow adding up the bandwidths of the various memory types in the system.
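For reference, both headline numbers fall out of the same formula: bus width in bytes multiplied by the effective data rate. Below is a minimal sketch of that arithmetic in Python, using only the figures quoted in this article; exactly how Microsoft arrives at "200 GB/s" has not been disclosed, so the combined figure at the end is an assumption about their math.

```python
# Memory bandwidth (GB/s) = bus width in bytes * effective data rate in GT/s
def bandwidth_gbs(bus_width_bits: int, data_rate_gtps: float) -> float:
    return (bus_width_bits / 8) * data_rate_gtps

ps4_gddr5 = bandwidth_gbs(256, 5.5)      # 256-bit GDDR5 @ 5.5 GT/s -> 176.0 GB/s
xb1_ddr3 = bandwidth_gbs(4 * 64, 2.133)  # quad-channel (4 x 64-bit) DDR3-2133 -> ~68.3 GB/s
xb1_esram = 102.0                        # 32 MB embedded SRAM path, per this article

# Even summing the DDR3 and ESRAM paths -- which is presumably what the
# marketing figure does -- yields only ~170 GB/s, short of "200 GB/s".
xb1_combined = xb1_ddr3 + xb1_esram

print(f"PS4: {ps4_gddr5:.1f} GB/s")           # PS4: 176.0 GB/s
print(f"Xbox One DDR3: {xb1_ddr3:.1f} GB/s")  # Xbox One DDR3: 68.3 GB/s
print(f"Combined: {xb1_combined:.1f} GB/s")   # Combined: 170.3 GB/s
```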

The two consoles also differ in software. While PlayStation 4 runs a Unix-derived operating system with the OpenGL 4.2 API, Xbox One uses software that developers are more familiar with: a 64-bit operating system based on the Windows NT 6.x kernel, running the DirectX 11 API. Despite these differences, the chips in the two consoles should greatly reduce multi-platform production costs for game studios, as both have a lot in common with the PC.
Source: Heise.de

148 Comments on Xbox One Chip Slower Than PlayStation 4

#51
RejZoR
They'll ruin the console world themselves with their obsession with POWEEEERRR instead of making good, fun games. It's strange how they haven't learned a thing from the PC segment, which got into this mess largely because of a constant obsession with more power and never really optimizing anything, since there was always an excuse to upgrade around the corner...
Posted on Reply
#52
Jorge
It's all good for AMD and consumers. :toast:
Posted on Reply
#53
Easy Rhino
Linux Advocate
newtekie1This is actually the most exciting bit of this news to me. I hope this means that games coming over to the PC will be better than the really bad ports we've seen in the past, and I also hope the PS4 running on a Unix/OpenGL base means we will see more hit games make it to the Linux/OSX side of PCs.

And what advertised feature has Sony removed in the past?
OtherOS.
Posted on Reply
#54
Mindweaver
Moderato®™
Hilux SSRGI don't see Sony pulling for msrp $599 at launch; MS may offer an optional subsidized plan.
I wouldn't be surprised if they price both consoles around $700-$800, with Nvidia asking $650 for the new GTX 780.. Nvidia has set the new pricing for everyone.. Who else can put out a video card for an unheard-of amount and sell completely out of stock.. Then a couple months down the road put out another card that's better than their last card for 3/4 the price, and people say, "WOW, that's a great price! Take my money!"..lol People are easily misguided.. lol The new consoles will be around the old NeoGeo prices, and people will buy it.
Posted on Reply
#55
Frick
Fishfaced Nincompoop
newtekie1And backwards compatibility was in the consoles it was advertised for; when backwards compatibility was removed, so was the advertising. In fact, the advertising had ended well before they removed PS2 backwards compatibility from the console.

The only advertisement for SACD support was on the box itself, and it was only on boxes for units that had it.

OtherOS was never advertised. It was talked about in some of the presentations, but never advertised. I even have a launch PS3, and OtherOS isn't even mentioned on the box. It was a bonus that they decided to remove; it was never an advertised feature.
I knew you were going to say that. :p
Posted on Reply
#56
CrAsHnBuRnXp
tiggerI like this, I think maybe Nvidia is sore 'cause both MSFT and Sony have gone with ATI --

www.nextpowerup.com/news/991/nvidia-ps4-gpu-is-too-cheap-and-too-weak.html
Let them have ATI. Nvidia, despite what anyone wants to admit/see, is a PC platformer. More games are designed around Nvidia because it's "the way it's meant to be played." At least with Sony and Microsoft going AMD, it will keep them in business longer and won't lead to them going bankrupt, which would send Nvidia's prices skyrocketing.
Posted on Reply
#57
midnightoil
CrAsHnBuRnXpLet them have ATI. Nvidia, despite what anyone wants to admit/see, is a PC platformer. More games are designed around Nvidia because it's "the way it's meant to be played." At least with Sony and Microsoft going AMD, it will keep them in business longer and won't lead to them going bankrupt, which would send Nvidia's prices skyrocketing.
NVIDIA lost that war long ago. Far more games are developed in partnership with AMD than NVIDIA these days.
Posted on Reply
#58
TheoneandonlyMrK
For all those bemoaning the pace of the Jaguar cores, it's worth noting the fine argument that consoles are optimised, so they do more with less useless background overhead.

Sony, M$, and AMD are all HSA buddies. These consoles will work like nothing that has come before in many ways, and it's way too early to cast doubt on the performance. Mid-tier PCs these are not.
Posted on Reply
#59
KainXS
RejZoRThey'll ruin the console world themselves with their obsession with POWEEEERRR instead of making good, fun games. It's strange how they haven't learned a thing from the PC segment, which got into this mess largely because of a constant obsession with more power and never really optimizing anything, since there was always an excuse to upgrade around the corner...
Sony did do this (and ultimately failed), but let's be honest, MS did it also last gen and they did it RIGHT. The 360's CPU was, in raw power, more powerful than just about every consumer-level CPU at launch, and the graphics card was also very powerful and had no real equivalent on the market in terms of features at the time. They had a really good mix of power and focus towards a goal, and that goal was a solid gaming machine (even though the console had too long of a life cycle).

Fast forward to now: the specs are nothing special here, and there's more of a shift towards what Sony did at the PS3's launch, with their do-everything console at an insane launch price (seems like it) that everyone made fun of thereafter. It could be a real reversal: Microsoft is attempting this do-it-all console (if you can call it that), and Sony is focusing on power again but is also looking more at games.

I have never seen anything take advantage of HSA to any real notable extent yet. Has anyone here seen that?

It's going to be very interesting to see how this turns out.
Posted on Reply
#60
NinkobEi
MindweaverI wouldn't be surprised if they price both consoles around $700-$800, with Nvidia asking $650 for the new GTX 780.. Nvidia has set the new pricing for everyone.. Who else can put out a video card for an unheard-of amount and sell completely out of stock.. Then a couple months down the road put out another card that's better than their last card for 3/4 the price, and people say, "WOW, that's a great price! Take my money!"..lol People are easily misguided.. lol The new consoles will be around the old NeoGeo prices, and people will buy it.
Mark my words, these consoles will be $300-400 tops for the basic editions. They should be a lot cheaper to fab than their predecessors, and AMD can pump out nearly the same design for both systems.
Posted on Reply
#61
librin.so.1
FordGT90ConceptAlso, keep in mind that Xbox One will have an advantage of using DirectX over OpenGL. DirectX sets the hardware requirements and OpenGL adapts them. Less is more when efficiency makes up the difference.
I'd say OGL is only a disadvantage when on Windoze. And it's not even OpenGL's fault per se.
I'd best describe it in the words one game developer said to me not long ago (not exact words; greatly shortened): "Working with OpenGL is great. OpenGL is also lighter on the CPU and helps to keep the framerate up when running on weaker CPUs. But OpenGL implementations on Windows just suck and are much slower than they could be."

Also, what midnightoil said.
Posted on Reply
#62
Dent1
RejZoRThey'll ruin the console world themselves with their obsession with POWEEEERRR instead of making good, fun games. It's strange how they haven't learned a thing from the PC segment, which got into this mess largely because of a constant obsession with more power and never really optimizing anything, since there was always an excuse to upgrade around the corner...
We all want fun games; that should be the priority. Yeah, optimisation is poor, but PC gaming isn't in a mess. PC gaming has been steady; last I read, it was increasing!

Nobody forces you to upgrade at every iteration. You have an HD 7950 and a Core i7 920, and those components aren't even needed to run games. I've been running my Athlon II X4 for over 3 years and there ain't a single game it can't play well.
Posted on Reply
#63
Jstn7477
Dent1We all want fun games; that should be the priority. Yeah, optimisation is poor, but PC gaming isn't in a mess. PC gaming has been steady; last I read, it was increasing!

Nobody forces you to upgrade at every iteration. You have an HD 7950 and a Core i7 920, and those components aren't even needed to run games. I've been running my Athlon II X4 for over 3 years and there ain't a single game it can't play well.
While Phenoms and Athlons can still certainly run games, good luck maintaining 120 FPS in games that aren't bottlenecked by your GPU. Far Cry 3 destroys my 7970 with 100% GPU usage and 65% CPU usage, my i7-3770K @ 4.3GHz struggles to maintain even 80-100 FPS in the biggest Team Fortress 2 fights with 15-40% GPU usage, my CPU bottlenecks my overclocked HD 7970 in Planetside 2 on quite a few occasions, and even Skyrim and Minecraft take a hit because they are limited in how many cores they can utilize. 120Hz makes competitive TF2 a lot smoother for everyone, and there are people out there with i7 chips and high-end cards running this almost 6-year-old game in DX8.1 mode just to have the highest framerates.
Posted on Reply
#64
newtekie1
Semi-Retired Folder
midnightoilNVIDIA lost that war long ago. Far more games are developed in partnership with AMD than NVIDIA these days.
Have any proof of that? I've seen a pretty even distribution as of late.
Posted on Reply
#65
librin.so.1
Jstn7477my i7-3770K @ 4.3GHz struggles to maintain even 80-100 FPS in the biggest Team Fortress 2 fights with 15-40% GPU usage
lolwut
That is really odd, especially since it's an i7-3770K we are talking about.
On my FX-8320 @ 4 GHz, TF2 hardly ever goes below 100 fps, despite the fact that I have 8 BOINC threads crunching while gaming. With BOINC off, I have to turn on vsync, as it starts pointlessly sizzling at over 150 fps at all times, most of the time near 300.
Posted on Reply
#66
Dent1
Jstn7477While Phenoms and Athlons can still certainly run games, good luck maintaining 120 FPS in games that aren't bottlenecked by your GPU. Far Cry 3 destroys my 7970 with 100% GPU usage and 65% CPU usage, my i7-3770K @ 4.3GHz struggles to maintain even 80-100 FPS in the biggest Team Fortress 2 fights with 15-40% GPU usage, my CPU bottlenecks my overclocked HD 7970 in Planetside 2 on quite a few occasions, and even Skyrim and Minecraft take a hit because they are limited in how many cores they can utilize. 120Hz makes competitive TF2 a lot smoother for everyone, and there are people out there with i7 chips and high-end cards running this almost 6-year-old game in DX8.1 mode just to have the highest framerates.
So the benchmark for smooth gaming is 120 FPS? All you need is about 40-60 FPS average, and that can be achieved even on old hardware like mine. 120 FPS is for show-offs, and it's unnecessary in most if not all cases.
Posted on Reply
#67
SetsunaFZero
We should get our first test consoles in a few months, can't wait to test the PS4 :rockout:
Posted on Reply
#68
Jstn7477
Dent1So the benchmark for smooth gaming is 120 FPS? All you need is about 40-60 FPS average, and that can be achieved even on old hardware like mine. 120 FPS is for show-offs, and it's unnecessary in most if not all cases.
I have no problem paying for hardware that makes my games run smoothly, considering I have a $300 monitor that functions best at 120Hz. Playing games at 40 FPS is something I did years ago: with an X2 4400+ and 7800GS in 2008, then an X4 9750 and a 9800 GT, and then a 4GHz 955BE and HD 5770, before I got my 2600K and HD 6950 in late 2011. My minimum framerate in TF2 almost doubled when I got the i7 (before you call out the video card differences, my 5770 was never fully stressed in TF2 to begin with). Without VSYNC, TF2 runs in the 200s, but in the largest fights on 24-28 player servers my framerate dips down to around 100 with shadows off, sometimes less in extreme situations. My main work computer, with a 2.5GHz Phenom X3 8550 and a 3850 AGP, hangs around in the mid 30s-40s in the same situations with an under-utilized GPU.
Posted on Reply
#69
fullinfusion
Vanguard Beta Tester
Who cares, XB1 is going to kick ass! We can't wait to get it!

But boooo on you, MS, for not allowing 360 games to work in the new box :slap:
Posted on Reply
#70
Fluffmeister
newtekie1Have any proof of that? I've seen a pretty even distribution as of late.
No proof, he just pulled it straight from his arse.
Posted on Reply
#71
Mussels
Freshwater Moderator
SeventhReignNot much of a gamer, are you? Higher CPU clock speeds have been proven time and time again to have zero effect on gaming after a certain point. It is the GPU that you want to be faster, not the CPU.
Play something other than FPS games, and you'll find that even the highest of today's CPUs can choke.
Posted on Reply
#72
newtekie1
Semi-Retired Folder
Easy RhinoOtherOS.
Was never an advertised feature.
Posted on Reply
#73
Tigershark8700
This thread definitely caught my attention, being a software engineer for a video game company myself in Silicon Valley (Palo Alto, CA).

Although we develop desktop and mobile applications, not console games, I can tell you that game development studios typically don't build a game around high-resolution textures at first; they essentially get the guts of the game in first (including low-poly textures and animations).

There is enough data analytics in the industry to show that games need to accommodate low-end machines first, so that they can get as many people on the game as possible. The last thing you want to do is create a game where only 15% of your market can actually play it; this is bad business, and it can lead to dramatically decreased revenue and community morale issues.

In our company, for example, we are currently working on an RPG for the desktop (a spiritual successor to a famous 90s game by Konami). We built the initial structure of the game to run on a low-end system, and then build out the rest of the textures and animations for progressively higher-end systems.

How this normally works (at least in our company) is that, from our growing list of publishers and content providers, we are able to establish a picture of hardware requirements and their usage. For instance, right now over 50% of our customers would not be able to run current-generation games such as Far Cry 3, Crysis 3, Battlefield 4, etc. For that reason, we build our games around what can reasonably run on a standard low-end system, then prepare requirements for higher-end machines and build out from there.

At the end of the day (for our products), we have four optimal levels of graphics experience, ranging from low / no AA to ultra / 16x AA.

Many companies handle this differently, but being a software engineer who has been in the game industry for a little over 8 years, this is the best practice I've seen.

Thanks,
Phil
Posted on Reply
#74
entropy13
"an RPG game for the Desktop (a spiritual successor to a famous 90s game by Konami)"


LOL I know what that would be. I think, anyway. That Konami game was for the PlayStation...
Posted on Reply
#75
Lionheart
SeventhReignNot much of a gamer, are you? Higher CPU clock speeds have been proven time and time again to have zero effect on gaming after a certain point. It is the GPU that you want to be faster, not the CPU.

Where do I begin :wtf:

Yeah, I'm not much of a gamer, even though I have a decently high-end rig :shadedshu and all I do on it is play FarmVille :twitch: I've also got an Xbox 360, PS3, PS Vita, GameCube & PS2, and all I do is stare at them :shadedshu

"Higher CPU clock speeds have been proven time and time again to have zero effect on gaming" :confused:

Well, tell that to my i7 920 2.66GHz OCed to 4GHz :twitch: or my i7 970 3.2GHz OCed to 4GHz, or even my old AMD X2 6000+ pushed from 3GHz to 3.4GHz, all of which felt a whole lot smoother in 3D applications/games & even on the desktop, because the overclock removed bottlenecks :rolleyes: Hell, even the PSP got a CPU speed increase, 222MHz to 333MHz :confused: and the God of War series came out on it because of that extra speed :toast:

Anyways, IMO increased CPU speeds do help, but they reach a certain point where you're not really getting anything out of it. I do agree with you about the faster GPU, though :toast:
Posted on Reply