Friday, November 2nd 2012
Sony PlayStation 4 "Orbis" Kits Shipping to Developers, Powered by AMD A10 APU
According to a VG 24/7 report, Sony has begun shipping development kits of its upcoming game console, the PlayStation 4, codenamed "Orbis," to developers. The kit is described as a "normal sized PC," driven by an AMD A10 "Trinity" APU and 8 or 16 GB of memory. We've known from reports dating back to April that Sony plans to use a combination of APU and discrete GPU, similar to today's Dual Graphics setups, where the APU's graphics core works in tandem with a discrete mid-range GPU. The design goal is to be able to play games at 1920 x 1080 pixel resolution with a 60 Hz refresh rate, and with the ability to run stereoscopic 3D at 60 Hz. For storage, the system has a combination of a Blu-ray drive and a 250 GB HDD. Sony's next-generation game console is expected to be unveiled "just before E3," 2013.
Source: VG 24/7
I never claimed it had anything to do with a specific hardware configuration, but it does have something to do with real life; we're not talking about unicorns. I've been over this a bunch of times now. I know this, and it's not relevant to my point at all; I explained that all the way back at post #85 and have clarified further in recent posts.
Seriously, guys, READ before you post. :shadedshu I was unaware that they were just going to be re-releasing current-gen games on next-gen hardware with beefed-up resolution and framerate. Do you have a source suggesting that's what they're going to be doing?
My position on this, I think, is pretty clear. My speculation is the following:
-I take a 1080p60 + 3D claim from Sony to be a claim about either 60 fps or near-60 fps minimum framerates in their games.
-I assume that next-gen games are going to improve graphically over current-gen games and thus be more demanding.
-I look at the hardware needed to run current-gen games on the PC maxed out at 1080p with 60 fps minimum framerates, and you need multi-GPU to do that.
-I assume that next-gen games are going to be roughly as demanding as current PC games maxed out or near-maxed out.
-I assume it's impossible to optimize an APU (or an APU + low-to-midrange GPU) to such a degree that it can do things a 7970/680 cannot.
Could I be wrong on some or all of those counts? Sure. But they're not inherently any more unreasonable than the assumptions of someone who takes the opposite view and thinks that Sony can do this. We'll know in a couple of years. It would be awesome if they're able to do it, but I'm not a believer yet.
I've already heard that the Wii U will be sold at a loss...
I'm gonna throw out that the PS4 and Xbox 720 will be around the $399~499 price points. I can't see either hardware manufacturer asking more than that. But then again, they could add value features that could balloon the prices.
Also, about Nintendo: they had never sold hardware at a loss until the 3DS and the Wii U. Sony, on the other hand, has never sold the PlayStation at a profit; it has always been sold at a loss that was recouped from the overwhelming sales figures.
Assuming Sony and MS stick to traditional controllers, it should cut at least $50 off the launch package. I'm just assuming they'll have beefier hardware that will drive up the price. I expect a base model PS4 at about $400, which comes with just the system and a controller, and a premium package that comes with one launch title, a year of PSN+, and probably a second controller or a bigger HDD (320 GB vs. the 250 GB).
The new Xbox will probably be around the same price with similar hardware to the PS4, but I see them packaging all systems from launch with Kinect, which might bump the price up a bit.
You are forgetting that resolution and GPU capability are kind of two different things.
Resolution is more affected by cache and memory, while detail is affected by computation power.
I hardly think running 1080p is the problem; the issue here is detail.
I am studying video game design, and the biggest concern when we do modeling is polygons: the more polygons in the models, the more GPU capability it takes, so being a good modeler means getting the best-looking shape with the least number of polygons possible.
Now, the reason you don't get 4x the performance out of the hardware is that the poly count doesn't change with resolution; only the rendering output changes to 1080p.
Not to mention the code has a lot to do with it too, of course, as well as the poly count like I mentioned.
Some of the newer games have stunning graphics and still run on old cards or even today's consoles, mostly because they are well optimized for the crapload of graphics shaders available today. Older games weren't designed to run on 2000+ shaders, so when you upscale to newer generations to run the older games, they don't necessarily scale linearly. But take new games optimized for current hardware and run them on older cards, and you will see the 4x-weaker card running 4x weaker (when memory bandwidth and other specialized GPU features are also 4x weaker).
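To put that scaling argument in concrete terms, here's a back-of-the-envelope sketch of a toy frame-cost model; the cost constants and the 2M-polygon scene are made up purely for illustration, not taken from any real engine or benchmark:

```python
# Toy frame-cost model for the point above: vertex work (driven by
# polygon count) does not scale with resolution, while fragment work
# scales with the number of pixels shaded. All constants are made up.

def frame_cost(polygons, width, height,
               vertex_cost=1.0, fragment_cost=2.0):
    vertex_work = polygons * vertex_cost            # resolution-independent
    fragment_work = width * height * fragment_cost  # scales with pixels
    return vertex_work + fragment_work

c720 = frame_cost(2_000_000, 1280, 720)
c1080 = frame_cost(2_000_000, 1920, 1080)
print(f"1080p/720p cost ratio: {c1080 / c720:.2f}x")       # ~1.60x
print(f"pixel ratio: {1920 * 1080 / (1280 * 720):.2f}x")   # 2.25x
```

Because the vertex share of the frame is fixed, total cost grows more slowly than the pixel count, which is why resolution alone doesn't tell you how much faster the hardware has to be.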
With Kinect II, if it is truly integrated into the hardware, we'll see a big push for Kinect games since it's already part of the system, allowing MS to market it better. Also, I'm sure we'll get a product closer to the original Natal vision than the current Kinect.
In BF3 you'll get average and maximum framerates way above 60 FPS. The minimum will be about 30-40 FPS regardless of setup.
But yes, I agree BigMack70 does often pull info out of his butt.
So the PS4 having a discrete card paired with the APU makes sense; otherwise it wouldn't reach the performance target.
But this is just a theory.
Because it's not that simple -.-
That doesn't take into account the various hardware tweaks, lossless and lossy compression (in hardware, and in software via drivers), and a million other things. You've based this argument on something you see as simple and obvious, without actually checking it yourself.
God, this thread is really full of over-simplified arguments, just one after another after another...
Possible "kinect 2.0" tech in the next Xbox (Durango) . what will be Sony's move?
A similar question would be: why is an HD 7950 the slowest card here?
Why do all the cards perform nearly the same? And the answer is not anything complicated about hardware and software optimization, etc.
EDIT: Low-end cards, however...
The GTX 650 Ti, for example, goes from 38.5 fps to 23.7 fps, which is actually very close to the resolution difference of 1.77x: 38.5 / 1.77 ≈ 21.7.
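Making that arithmetic explicit (a quick sketch; the ~1.77x figure matches the 1080p-to-1440p pixel-count ratio, which is my assumption about which two resolutions were being compared):

```python
# Quick check of the scaling arithmetic above. The ~1.77x "resolution
# difference" matches the 1080p -> 1440p pixel-count ratio (assumed).
low = 1920 * 1080   # 2,073,600 pixels
high = 2560 * 1440  # 3,686,400 pixels
ratio = high / low  # ~1.78x

predicted = 38.5 / ratio  # fps if performance scaled purely with pixels
print(f"pixel ratio: {ratio:.2f}x")       # 1.78x
print(f"predicted fps: {predicted:.1f}")  # ~21.7
print("measured fps: 23.7")               # a bit better than pure pixel scaling
```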
Now, you guys are all talking about margins. PC graphics cards have had power to spare on nearly all fronts, lots and lots of it, for many years, and most definitely in pixel-shading capability, so it's on this front that lower-end cards have more "leeway."
* Nowadays, I mean, any game engine from the past 5 years does everything on a per-pixel basis. There's no real escape from the law: more pixels, more power required. Some games, especially on consoles, "avoid" this law by rendering some elements at lower resolutions; rendering the lighting pass(es) at 1/2 or 1/4 resolution, for example, is very common. This is just a workaround, not a way of breaking the law. If the output resolution is increased from 720p to 1080p but everything else in the fragment data is kept at the same resolution, the resolution has not really increased by as much as stated.
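As a rough illustration of that last point (the pass sizes are made up, not taken from any real engine): if the lighting pass stays at its old resolution while the output goes from 720p to 1080p, the total shaded-pixel count grows by less than the 2.25x the output resolution implies.

```python
# If output goes 720p -> 1080p but the lighting pass is kept at a fixed
# quarter-of-720p resolution, total shaded pixels grow less than 2.25x.
out_720 = 1280 * 720    # 921,600 pixels
out_1080 = 1920 * 1080  # 2,073,600 pixels
lighting = 640 * 360    # 230,400 pixels, unchanged between the two

work_720 = out_720 + lighting
work_1080 = out_1080 + lighting
print(f"output pixel ratio: {out_1080 / out_720:.2f}x")    # 2.25x
print(f"shaded-pixel ratio: {work_1080 / work_720:.2f}x")  # 2.00x
```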
(SIGH)
This thread is full of ignorant people making false and misguided statements all along; the only people here with brains are Benetanegia and BigMack70.
First, you morons need to read some facts. Here they are:
FACT 1: Not even high-end PCs can sustain 60 FPS @ 1080p in demanding games with maximum graphics settings, games like Metro 2033, ARMA 3, Crysis, Dragon Age 2, and many, many more; there will be drops below 60 FPS, and on many occasions.
FACT 2: Consoles involve an insane amount of code optimization, where every CPU/GPU cycle is utilized. Developers program much closer to the metal, with thin APIs and hand-tuned low-level code, contrary to the PC, which sits behind many compilers, abstraction layers, and higher-level languages that waste valuable cycles.
FACT 3: Even with these optimizations, consoles run with shitty graphics. They can't even maintain 30 FPS at 720p; they usually drop to 25 or 20 FPS, and they even run at sub-1280x720 resolutions, sometimes as low as 900x600!
FACT 4: Consoles cut down on graphics severely; they decrease shadow density, lighting effects, level of detail, polygon count, alpha effects, texture resolution, texture filtering, anti-aliasing, post-processing, and so many other things that I can't even remember them all.
FACT 5: A console with the rumored triple setup of:
1. AMD CPU
2. AMD APU
3. AMD GPU (low-end/6670)
will barely run today's games at 1080p @ 60 FPS with PC-level graphics; all of the code optimizations will be spent covering the cost of the resolution increase (to 1080p) and the cost of the graphics increase (to PC level), such as shadows, lighting, textures, etc.
If the specs were changed and the console came with a high-end or even a mid-range AMD GPU, then the situation would be different.
FACT 6: These consoles will have to do the usual dirty business to be able to run at 1080p: cut resolution and upscale, and decrease all the graphics elements (lighting, shadows, textures, etc.) below the future PC level. This happened to the previous generation too; the Xbox 360 and PS3 started out operating at 720p just fine, then they had to cut corners to increase graphics, otherwise the visuals would have stalled.
FACT 7: PCs will maintain higher visual quality and frame rates; consoles will have the graphics of a two-year-old PC.
End of Discussion .
I have a bad feeling about using Hybrid CrossFire on a console, though. Even if they optimize it, that's still an extra piece of hardware that could cause problems.