
Xbox One Chip Slower Than PlayStation 4

Not much of a gamer, are you? Higher CPU clock speeds have been proven, time and time again, to have zero effect on gaming past a certain point. It is the GPU that you want to be faster, not the CPU.



Where do I begin:wtf:

Yeah, I'm not much of a gamer, even though I have a decently high-end rig:shadedshu and all I do on it is play FarmVille:twitch: Also got an Xbox 360, PS3, PS Vita, GameCube & PS2, and all I do is stare at them:shadedshu

"Higher CPU clock speeds have been proven time and time again, to have zero effect on gaming" :confused:

Well, tell that to my i7 920 2.66GHz OCed to 4GHz :twitch: or my i7 970 3.2GHz OCed to 4GHz, or even my old AMD X2 6000+ taken from 3GHz to 3.4GHz, all of which felt noticeably smoother in 3D applications/games and even on the desktop, and removed bottlenecks:rolleyes: Hell, even the PSP got a CPU speed increase, 222MHz to 333MHz :confused: and the God of War series came out on it because of that extra speed :toast:

Anyways, IMO increased CPU speeds do help, but they reach a point where you're not really getting anything more out of them. I do agree with you on the faster GPU though :toast:
 
"an RPG game for the Desktop (a spiritual successor to a famous 90s game by Konami)"


LOL I know what that would be. I think, anyway. That Konami game was for the PlayStation...

Actually it was developed for the Super Nintendo, so we might be thinking of different titles, unless it was ported over (but not sure on that).
 
Got three words into that video and realized he had absolutely nothing worth listening to. :slap: Clicked the X and moved on; not even worth reading the comments.

What I find funny about it is how much they went on about watching movies on it. Sorry, but the last thing I would get is an Xbox to watch movies, and then pay MS to go online and pay other people as well to use the service.

It's just that you'd be better off with a PS4, but they need exclusives to make people like me even think about getting an Xbox.

And if movies and stuff are what you're going to do mostly with it, get a frigging Roku 3, as that will beat the pants off it in every way, even more so on power usage: the unit only draws 3.2W under load and supports 3rd-party stuff too.

I would have to go PS4 for a few reasons, like Heavy Rain (if there is ever another of those) and Uncharted, and then the free online, so no monthly fees.

Again, if you're just watching movies, the Roku 3 has more than enough to keep you happy for a long time.
 
I have no problem paying for hardware that makes my games run smoothly, considering I have a $300 monitor that functions best at 120Hz. Playing games at 40 FPS was something I did a couple years ago with an X2 4400+ and 7800GS in 2008, then X4 9750 and a 9800 GT, and then a 4GHz 955BE and HD 5770 before I got my 2600K and HD 6950 in late 2011. My minimum framerate in TF2 almost doubled when I got the i7 (before you call out the video card differences, my 5770 was never fully stressed in TF2 to begin with). Without VSYNC, TF2 runs in the 200s but in the largest fights on 24-28 player servers, my framerate dips down to around 100 with shadows off, sometimes less in extreme situations. My main work computer with a 2.5GHz Phenom X3 8550 and a 3850 AGP hangs around in the mid 30s-40s in the same situations with an under-utilized GPU.

Maybe you don't have a problem paying for hardware, but the person I was originally responding to (RejZoR) does.

Just because you have a 120Hz monitor doesn't mean your experience is diminished if 120FPS isn't achieved. Most console games are capped between 25 and 30 FPS, yet high-end HD TVs can support up to 120Hz. I'm not saying 25-30FPS is something PC gamers should get accustomed to, as I wouldn't tolerate such a low frame rate, but I see nothing wrong with playing a game at 50-70+ FPS on my almost four-year-old CPU/GPU. I'm not going to drop money just to watch Fraps read 120FPS instead of 70FPS at the same detail settings for a negligible difference.
 
Now, the Xbox One has an underclocked (850MHz -> 800MHz) HD 6770 inside and the PS4 has an underclocked (900MHz -> 800MHz) HD 6870. The memory bandwidth appears to be accurate for both examples.

All in all, it's an entry-level/mid-range computer.

What? The PS4's GPU is an 1152-core GCN v2.0 part @ 800MHz with shared system memory of colossal size and bandwidth.

And I'm pretty sure the XBO's GPU side is no slouch either.

The PS4's CPU part is @ 2GHz; that's been confirmed time and time again in the weeks after its launch. Not sure about the XBO. All in all, yes, probably slightly above half the i7-2600's performance, but specific coding and optimizations should bring results well above what the i7 can do on a straight-up PC platform... of course, in the following years, not at launch. However, taking into account AMD's HSA, the CPU part plus the GPU grunt work, its computational power is going to be way above anything we see in today's PCs, heck, more FLOPs than a 4-CPU 10-core Ivy Bridge server... (setting aside the fact that the GPU, as a pure graphics processing unit, would be starved of resources in that scenario).


All in all, it's more like a mid-range or above gaming system, with vast untapped capabilities we aren't aware of yet... (cue AMD's Kaveri/Kaveri+ HSA demonstrations...)
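The FLOPs claim above can be sanity-checked with back-of-envelope arithmetic. This sketch uses only the figures quoted in the post (1152 GCN cores @ 800MHz); the 2-ops-per-cycle factor is the usual fused multiply-add assumption for GCN-class GPUs, not something stated in the thread:

```python
def peak_gflops(cores: int, clock_mhz: float, ops_per_cycle: int = 2) -> float:
    """Theoretical single-precision GFLOPS: cores * clock (MHz) * ops per cycle."""
    return cores * clock_mhz * ops_per_cycle / 1000.0

# PS4 GPU as quoted in the post: 1152 cores @ 800MHz
ps4 = peak_gflops(1152, 800)
print(f"PS4 GPU peak: {ps4:.0f} GFLOPS (~{ps4 / 1000:.2f} TFLOPS)")  # ~1843 GFLOPS
```

Theoretical peak, of course; real workloads never sustain it, which is the whole point of the "starved of resources" caveat above.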
 
Maybe you don't have a problem paying for hardware, but the person I was originally responding to (RejZoR) does.

Just because you have a 120Hz monitor doesn't mean your experience is diminished if 120FPS isn't achieved. Most console games are capped between 25 and 30 FPS, yet high-end HD TVs can support up to 120Hz. I'm not saying 25-30FPS is something PC gamers should get accustomed to, as I wouldn't tolerate such a low frame rate, but I see nothing wrong with playing a game at 50-70+ FPS on my almost four-year-old CPU/GPU. I'm not going to drop money just to watch Fraps read 120FPS instead of 70FPS at the same detail settings for a negligible difference.

Good point, but what exactly has 120Hz/FPS gaming got to do with consoles anyway? This is an argument for 2025-2030, when the PS5 is due.
 
Good point, but what exactly has 120Hz/FPS gaming got to do with consoles anyway? This is an argument for 2025-2030, when the PS5 is due.

Not likely... more like 3840x2160@30Hz... or 60, if "we're" lucky. Seeing as they (Sony/the developers) don't see the point of 60fps and "think" they'd rather squeeze in some extra eye-candy and stay at 30fps, I don't see 120Hz ever happening in the console world... but then again, HDTV/PC display tech might have a big revolution down the road; who knows what will make more sense then.
 
The GPU and RAM are both slightly slower; however, the Xbox will most likely have a better price point.
 
So can we install Windows 7 (not 8)?

I imagine the OS being stored on a device other than the 500GB HDD, and the system's EFI being prevented from booting from USB/ODD. It won't be long before Linux geeks figure out a way though.
 
The GPU and RAM are both slightly slower; however, the Xbox will most likely have a better price point.

Maybe, until you add up the monthly fee for the damn thing to be online.
 
Our PCs will shine on all the new titles... can't wait...
 
Our PCs will shine on all the new titles... can't wait...

OpenGL and DX11 consoles are looking good for the glorious master race.
 
All in all yes, probably slightly above half the i7-2600's performance, but specific coding and optimizations should bring results of something way above what the i7 can do on a straight-up PC platform... of course, this in the following years, not at launch.

Way above? Really? For it to be true, it would mean that more than half of i7-2600 performance is regularly lost on OS + DirectX + Driver overheads, which is simply ridiculous.
 
Way above? Really? For it to be true, it would mean that more than half of i7-2600 performance is regularly lost on OS + DirectX + Driver overheads, which is simply ridiculous.

Actually, it's not that ridiculous. No one has solid numbers, but coding for fixed hardware is a MASSIVE benefit.
 
PlayStation historically has always had the best exclusive titles. This generation will not be any different. It has to do with which continent the console is developed on. MS can try to appeal to Japanese developers, but in the end there will always be that language barrier.


If you read the Anandtech report, the PS4 should run a lot hotter/higher power. Seems unlikely that their fan will be quieter.

That's an opinion
 
Maybe you don't have a problem paying for hardware, but the person I was originally responding to (RejZoR) does.

Just because you have a 120Hz monitor doesn't mean your experience is diminished if 120FPS isn't achieved. Most console games are capped between 25FPS and 30 FPS yet the high end HD TVs can support up to 120Hz. I'm not saying 25-30FPS is something PC gamers should be accustomed to as I wouldn't tolerate such a low frame rate, but I see nothing wrong with playing a game at 50-60-70+ FPS on my almost 4 year old GPU/GPU. I'm not going to drop money to see Fraps @ 120FPS vs 70FPS to run at the same detail settings to see a negligible difference.

I will agree that some games do play fine at 60 FPS (which is what the more poorly optimized games tend to drop to on my machine on occasion, despite none of my CPU cores or my GPU being maxed out), but if you play multiplayer FPS games competitively, e.g. Team Fortress 2, Counter-Strike, Quake, etc., there is quite a difference in smoothness between 60Hz and 100-120Hz. Some people have hung onto their CRTs for years and play at stupid resolutions like 1024*768 and 100Hz for these particular games (if they have yet to purchase a 120Hz 1080p LCD), because your local frame rate determines how many snapshots are sent back to the server. Again, many single-player games (especially slower-paced ones) play nicely at 60Hz, but in multiplayer FPS games where people tweak the hell out of their net settings, reduce their interp settings and whatnot, it's hard to be part of the 60Hz norm. Call of Duty (not that I play it) supposedly has the best hit registration at 125 or 250 client FPS, from what I've heard.
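The smoothness argument above boils down to frame-time arithmetic. A quick sketch, with nothing engine-specific assumed, just converting frame rates to per-frame time budgets:

```python
def frame_time_ms(fps: float) -> float:
    """Per-frame time budget in milliseconds at a given frame rate."""
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):5.1f} ms per frame")
# Going from 60 to 120 FPS halves the per-frame time (16.7 ms -> 8.3 ms),
# which is where the extra perceived smoothness in fast games comes from.
```

The same math explains why dips matter more than averages: a single 100 ms frame in a stream of 8 ms frames is a visible hitch even though the average FPS barely moves.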
 
That's an opinion

It really is.
If I had to pick the best exclusives, it's Nintendo without a doubt. Sony tends to keep many exclusives in the Japanese market (and some very good ones end up unknown over here); MS tends to do multiplatform games better, but when there is an exclusive it's usually pretty good, minus the Kinect games.
 
I'd say OGL is only a disadvantage when on Windoze. And it's not even OpenGL's fault per se.
I'd best describe it in the words one game developer said to me not long ago (not exact words; greatly shortened): "Working with OpenGL is great. OpenGL is also lighter on the CPU and helps keep the framerate up when running on weaker CPUs. But OpenGL implementations on Windows just suck and are much slower than they could be."

Also, what midnightoil said.
Apples to apples, Direct3D will always be faster because it's hardware + software, not just software. Case in point: Direct3D created the unified shader model, and OpenGL later adopted it into its own specification. Whenever there is a performance hit on Direct3D, it is because it is doing something extra (e.g. post-processing).

As to what midnightoil said, bear in mind that Windows on Xbox isn't the same as Windows on an IBM-PC compatible. Xbox developers likely have direct access to the hardware to squeeze every drop of performance out of it. On consumer Windows, developers have to go through layer after layer of software to reach the hardware, which means it is slower, but less likely to crash the computer (among other undesirable outcomes). The reason there isn't direct hardware access in consumer Windows is that it has to account for the hundreds of graphics devices out there.

I have no doubt that Sony would have used DirectX if they didn't have to license it from Microsoft.

OpenGL 3.# requires Direct3D 10 hardware
OpenGL 4.# requires Direct3D 11 hardware
 
Apples to apples
On the other hand, comparing D3D to OGL in real-life usage scenarios as "apples to apples" is not possible. D3D is only implemented on Windoze [and MS devices]. There, OGL is either greatly neglected by the implementers or simply non-existent. Say what You want, but comparing D3D to the pitiful excuses for OGL implementations found on Windoze simply cannot be called "apples to apples".
On *nix, for example, the implementations are much better.
But comparing D3D on Windoze to OGL on *nix cannot be called "apples to apples" either, due to external factors [the obvious one: a different friggin' OS].

Direct3D will always be faster because it's hardware + software, not just software.
What the hell were You smoking?
 
Direct3D is emulated on *nix. OpenGL and Direct3D both have to go through the layers of protection on Windows, so Windows is the closest to apples-to-apples available. The fact that most professional software renders using OpenGL attests that it is well implemented on Windows.

There are a lot of engines out there which run on Windows that support Direct3D and OpenGL render paths and the performance is more or less the same when trying to achieve the same degree of visuals.

A lot of EA titles (The Sims 3 and Spore, for example) are DirectX on Windows and OpenGL on Mac OS X. If DirectX was as terrible as you claim it is, why would EA go out of their way to use DirectX on Windows instead of OpenGL on both?
 
why would EA go out of their way to use DirectX on Windows instead of OpenGL on both?
Already said the reason many times. Do I really need to repeat myself again?

Direct3D is emulated on *nix. OpenGL and Direct3D both have to go through the layers of protection on Windows.
same - what the hell were You smoking?
 
You do know that virtually all professional software (e.g. AutoCAD, 3ds Max, Photoshop, etc.) uses an OpenGL renderer, correct? OpenGL's implementation on Windows is good (much better than you claim), it just isn't up to par with the purpose-built DirectX. Most x86-compatible games are released on Windows because of DirectX, not in spite of it. DirectX was created because Bill Gates wasn't satisfied with OpenGL at the time. Hell, about the only game developer that loves him some OpenGL is John Carmack (id Tech engine), and that's mostly because he resents the Microsoft empire.

And don't expect a further reply from me on this topic. The discussion is circular.
 
...
All in all, yes, probably slightly above half the i7-2600's performance, but specific coding and optimizations should bring results well above what the i7 can do on a straight-up PC platform... of course, in the following years, not at launch. However, taking into account AMD's HSA, the CPU part plus the GPU grunt work, its computational power is going to be way above anything we see in today's PCs, heck, more FLOPs than a 4-CPU 10-core Ivy Bridge server... (setting aside the fact that the GPU, as a pure graphics processing unit, would be starved of resources in that scenario).


All in all, it's more like a mid-range or above gaming system, with vast untapped capabilities we aren't aware of yet... (cue AMD's Kaveri/Kaveri+ HSA demonstrations...)
:roll: Are you new here?!? Ahahaha! That's the dumbest dumb comment I've read in a while. :laugh:

Come on, man! :shadedshu You seriously think these shitty consoles can beat any current PC, let alone next-generation ones? A Haswell + Titan will crush any of these so-called "gaming machines" hands down. Even the developers themselves (on both platforms) have said they won't take on high-end PCs head on, instead focusing on "good" (read: not great, let alone the best) middle-of-the-road performance for a gaming console. This time they focused more on entertainment, making the consoles a "media hub" for the living room, not raw power.

By the time the consoles launch and developers get experience coding for them, Broadwell + Maxwell will be out, so these consoles stand NO CHANCE of beating PCs. Mark my words on that. :rockout:
 
By the time the consoles launch and developers get experience coding for them, Broadwell + Maxwell will be out, so these consoles stand NO CHANCE of beating PCs. Mark my words on that.

Consoles aren't here to take over PC gaming; they did that already without having superior graphics, so what's your point? PC gamers lately have been a slowly dying niche, which is a shame.

Go ahead and pay 1000 USD for your Titan. Some person who couldn't care less will get an Xbox One, pay a mere fraction of the cost of a full gaming rig, and still enjoy it just as much as you, and not know the difference, because the general user really doesn't care as much as we do here at TPU.

I guess that depends on how you look at "winning." Image quality wise PCs will be better. Cost effectiveness, market penetration and profits wise, I think consoles are winning by a pretty large margin.

Are you new here?!? Ahahaha! That's the dumbest dumb comment I've read in a while.

Don't call someone else's comments stupid when your post is just as bad. :shadedshu
 
Consoles aren't here to take over PC gaming; they did that already without having superior graphics, so what's your point? PC gamers lately have been a slowly dying niche, which is a shame.

Go ahead and pay 1000 USD for your Titan. Some person who couldn't care less will get an Xbox One, pay a mere fraction of the cost of a full gaming rig, and still enjoy it just as much as you, and not know the difference, because the general user really doesn't care as much as we do here at TPU.

I guess that depends on how you look at "winning." Image quality wise PCs will be better. Cost effectiveness, market penetration and profits wise, I think consoles are winning by a pretty large margin.

I bet some of us TPU'ers could build a sub-$600 budget PC that is on par with the Xbone, and that will be doubly true in a couple of years. And the best thing about PC is we don't have to spend $60 per game; it is very common to buy last year's triple-A titles for $10 during a sale.

As for PC gaming being a dying niche, well, never has that statement been less true than right now.

The PC gaming market reached $20 billion in 2012, a healthy increase of eight percent over the previous year, the PC Gaming Alliance (PCGA) revealed this week at a news conference held in San Francisco.

Now let us compare that to console sales in 2012
Video game and console sales plunged 22 percent in 2012, according to NPD Group data published by Home Media Magazine. As consumers focused their dollars on a few high-profile titles and opted for new digital services, and publishers just released fewer titles, revenue for the year totaled $13.3 billion compared to $17.0 billion in 2011. The decline more than doubled the 9 percent decrease between 2010 and 2011, reported the Los Angeles Times.
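The ~22 percent decline quoted above checks out against the revenue figures in the same quote:

```python
def pct_decline(old: float, new: float) -> float:
    """Year-over-year percentage decline."""
    return (old - new) / old * 100.0

# Console revenue per the NPD figures quoted: $17.0B (2011) -> $13.3B (2012)
print(f"Decline: {pct_decline(17.0, 13.3):.1f}%")  # ~21.8%, i.e. the ~22% reported
```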
 