# Sony PlayStation 4 "Orbis" Kits Shipping to Developers, Powered by AMD A10 APU



## btarunr (Nov 2, 2012)

According to a VG247 report, Sony has begun shipping development kits for its upcoming game console, PlayStation 4, codenamed "Orbis," to developers. The kit is described as a "normal sized PC," driven by an AMD A10 "Trinity" APU and 8 or 16 GB of memory. We've known from reports dating back to April that Sony plans to use a combination of APU and discrete GPU, similar to today's Dual Graphics setups, where the APU graphics core works in tandem with a discrete mid-range GPU. The design goal is to be able to play games at 1920 x 1080 resolution with a 60 Hz refresh rate, and with the ability to run stereo 3D at 60 Hz. For storage, the system combines a Blu-ray drive with a 250 GB HDD. Sony's next-generation game console is expected to be unveiled "just before E3" 2013.





*View at TechPowerUp Main Site*


----------



## btarunr (Nov 2, 2012)

Many Thanks to NHKS for the tip.


----------



## [H]@RD5TUFF (Nov 2, 2012)

And it will still be a piece of crap because it's a $ony.


----------



## Dos101 (Nov 2, 2012)

Interesting....good news for AMD though. Just wish we had some more detailed specs. This probably means no backwards compatibility with PS3 titles then.


----------



## Ikaruga (Nov 2, 2012)

I wonder, will they do a discrete custom AMD GPU which would make it impossible to run the games on a normal PC with similar hardware (or to emulate them in realtime/at good speed)?
Or, if the hardware really is "nothing else" but an amd64-based PC, what could be the business strategy here? Are they going to team up with Valve/Steam against MS or something?


----------



## brandonwh64 (Nov 2, 2012)

LOL it will be like this soon.


----------



## BigMack70 (Nov 2, 2012)

I find it hard to believe that they're claiming 1080p 60fps possible on an APU based system.

I call BS. More probably 720p upscaled


----------



## Dos101 (Nov 2, 2012)

BigMack70 said:


> I find it hard to believe that they're claiming 1080p 60fps possible on an APU based system.
> 
> I call BS. More probably 720p upscaled



APU system with discrete graphics. Depending on what the dedicated GPU is, it's very doable.


----------



## MxPhenom 216 (Nov 2, 2012)

BigMack70 said:


> I find it hard to believe that they're claiming 1080p 60fps possible on an APU based system.
> 
> I call BS. More probably 720p upscaled





Dos101 said:


> APU system with discrete graphics. Depending on what the dedicated GPU is, it's very doable.



You guys forget that with consoles, developers only have to code for one configuration, so they're able to utilize the hardware with close to 100% efficiency. They're able to push console hardware a bit differently than on a normal desktop PC.


----------



## No_Asylum (Nov 2, 2012)

BigMack70 said:


> I find it hard to believe that they're claiming 1080p 60fps possible on an APU based system.
> 
> I call BS. More probably 720p upscaled




Actually... that isn't even impressive at all. 60fps @ 1080p is slow by today's standards.


----------



## Disparia (Nov 2, 2012)

Very possible.

I decided against an A10+6670 for my wife's new system because getting acceptable 1920x1200 performance meant lowering the quality down to low-medium settings. But in a controlled console environment with tight stacks, it shouldn't be a problem to bump up the eye candy.


----------



## MxPhenom 216 (Nov 2, 2012)

No_Asylum said:


> Actually... that isn't even impressive at all. 60fps @ 1080p is slow by today's standards.



Typical console gamers won't even know what 1080p or 60FPS is.


----------



## BigMack70 (Nov 2, 2012)

MxPhenom 216 said:


> You guys forget that with consoles, developers only have to code for one configuration, so they're able to utilize the hardware with close to 100% efficiency. They're able to push console hardware a bit differently than on a normal desktop PC.



I don't forget that developers can get more out of console hardware than the same stuff in a PC. But...



No_Asylum said:


> Actually... that isn't even impressive at all. 60fps @ 1080p is slow by today's standards.



... 60fps at 1080p is actually very demanding. You're not going to get 60fps minimum framerates even in all current titles without a multi-GPU setup. If you want a 60fps minimum framerate in BF3, for example, you are either going to be using multiple GPUs or turning settings down.

So, unless they've got some GPU secret sauce in there that they're not announcing, even given the fact that they can get more out of console hardware than PC, I find it extremely unlikely that they're going to be churning out 1080p 60fps in anything demanding. 

Next gen consoles are supposed to be the playground for Unreal Engine 4, and if you think that's going to run at 1080p 60fps on anything less than a high end graphics card, you're crazy. How high end, I don't know, but definitely something many tiers above an APU.


----------



## 1Kurgan1 (Nov 2, 2012)

[H]@RD5TUFF said:


> And it will still be a piece of crap because it's a $ony.



And that's based on what? I'm not saying PlayStations didn't break, but they sure weren't on the level of the "red ring of death". Just seems like fanboying.


----------



## RCoon (Nov 2, 2012)

I do believe the AMD execs' eyes are $.$
If the new Xbox follows the APU-with-discrete-graphics route, they'll certainly be cashing in, making the big monies. Glad to see their success with their architecture; maybe when they port the games to PC they might work and look a little better, for AMD systems of course. Sleeping Dogs will love this xD


----------



## RejZoR (Nov 2, 2012)

BigMack70 said:


> I find it hard to believe that they're claiming 1080p 60fps possible on an APU based system.
> 
> I call BS. More probably 720p upscaled



Well, I don't see it as that far fetched. Remember, games for PCs are generic as every config is different. For the PS4, even if they'll use only an AMD Fusion APU, it will be highly optimized and coded specifically for that. And that will make a MASSIVE difference. Just remember how much faster games were using dedicated libraries like 3dfx Glide or S3 Metal. I still remember UT99 running better on an S3 Savage 3D with S3 Metal than through Direct3D on a RivaTNT2. And with S3TC high-res textures. Now imagine a whole game built specifically for that platform.

It's the same reason why Gran Turismo 5 looked so damn good on hardware that was spec'd like a GeForce 7900...


----------



## Dos101 (Nov 2, 2012)

1Kurgan1 said:


> And that's based on what? I'm not saying PlayStations didn't break, but they sure weren't on the level of the "red ring of death". Just seems like fanboying.



I didn't say anything about PlayStations breaking; I was referring to the ability to port PS3 games to the PS4. Since the PS3 uses Cell for its CPU and an NVIDIA GPU, I'm sure there might be difficulties in retaining backwards compatibility with PS3 titles.


----------



## RejZoR (Nov 2, 2012)

The reason why they might be going for AMD's APU is that porting will be much, much easier. Something Microsoft wanted to achieve with the Xbox as well. And they sort of did.


----------



## BigMack70 (Nov 2, 2012)

RejZoR said:


> Well, I don't see it as that far fetched. Remember, games for PCs are generic as every config is different. For the PS4, even if they'll use only an AMD Fusion APU, it will be highly optimized and coded specifically for that. And that will make a MASSIVE difference. Just remember how much faster games were using dedicated libraries like 3dfx Glide or S3 Metal. I still remember UT99 running better on an S3 Savage 3D with S3 Metal than through Direct3D on a RivaTNT2. And with S3TC high-res textures. Now imagine a whole game built specifically for that platform.
> 
> It's the same reason why Gran Turismo 5 looked so damn good on hardware that was spec'd like a GeForce 7900...



So you think that they're going to be able to do, on current and future titles with a highly optimized and well-coded APU, what a 7970/680 cannot do even on all current titles?

Now, if they have an APU + some sort of midrange GPU in there, I'd find it more plausible, but I just have a hard time buying that they're going to be rendering things at a smooth 60fps at full 1080p when all the rumors about the hardware are that it's on the low-end side of things.

I guess we'll find out in a year or two.


----------



## EpicShweetness (Nov 2, 2012)

MxPhenom 216 said:


> You guys forget that with consoles, developers only have to code for one configuration, so they're able to utilize the hardware with close to 100% efficiency. They're able to push console hardware a bit differently than on a normal desktop PC.





MxPhenom 216 said:


> Typical console gamers won't even know what 1080p or 60FPS is.



You are right on both counts. 384 VLIW4 SIMDs is a hell of a lot more power than the ~16 threads of a DX9 GPU with an outdated architecture. I've seen a 3870K (overclocked) do 720p gaming easily; who's to say 1080p without crappy DirectX overhead isn't possible? Most console gamers are 20-26 college students who come home, get drunk, and play video games. Hey, here's a concept: if you're drunk, hell, 480p looks good.


----------



## Batou1986 (Nov 2, 2012)

This is good news for AMD.
It also explains what I heard about them being substantially cheaper, like $250~$299 instead of $500+. Using standard x86 parts would help with costs. The bad news is there still won't be any games.

Everyone I know who has a PS3 never uses it except as a BD player.


----------



## newtekie1 (Nov 2, 2012)

Jizzler said:


> Very possible.
> 
> I decided against an A10+6670 for my wife's new system because getting acceptable 1920x1200 performance meant lowering the quality down to low-medium settings. But in a controlled console environment with tight stacks, it shouldn't be a problem to bump up the eye candy.



Not to mention most console gamers rave about how "awesome" console graphics are when they are getting 720p upscaled and low detail settings. If this new console can do 1080p native with medium-ish settings, console gamers will shit over how "great" the graphics are.


----------



## Animalpak (Nov 2, 2012)

discrete mid-range GPU


----------



## rpsgc (Nov 2, 2012)

Animalpak said:


> discrete mid-range GPU



How dare you actually read past the title?


Heresy!


----------



## BigMack70 (Nov 2, 2012)

Except the source article mentions nothing about a discrete GPU...

In other words, we still don't know enough to do more than speculate. And I'm speculating that 1080p 60fps is a pipe dream unless they're going to be turning settings down considerably.


----------



## SIGSEGV (Nov 2, 2012)

BigMack70 said:


> So you think that they're going to be able to do, on current and future titles with a highly optimized and well-coded APU, what a 7970/680 cannot do even on all current titles?



yes for sure, why not?


----------



## RejZoR (Nov 2, 2012)

BigMack70 said:


> So you think that they're going to be able to do, on current and future titles with a highly optimized and well-coded APU, what a 7970/680 cannot do even on all current titles?
> 
> Now, if they have an APU + some sort of midrange GPU in there, I'd find it more plausible, but I just have a hard time buying that they're going to be rendering things at a smooth 60fps at full 1080p when all the rumors about the hardware are that it's on the low-end side of things.
> 
> I guess we'll find out in a year or two.



Well, just look at the PS3. It's basically powered by a GeForce 7900, and most games look like 2012 titles on PC. I won't say in every aspect, but to a casual gamer there won't be much difference...


----------



## BigMack70 (Nov 2, 2012)

SIGSEGV said:


> yes for sure, why not?



Well, the highly optimized and well coded nature of the Xbox 360 allows it to do roughly what a card one generation superior to it can do (an 8800 GTX).

The difference between an APU and a 7970 is a lot bigger than the difference between a modded 7900 series chip and an 8800 GTX.

The PS3 & Xbox 360 are rendering games at 720p around 30fps... you don't need much GPU power to do that on the PC. An 8800 series card will still do that just fine on anything as long as the game is compatible with dx10 or earlier (and it will do a lot more than that in most cases e.g. Skyrim, anything UE3 powered).


----------



## 3870x2 (Nov 2, 2012)

BigMack70 said:


> I don't forget that developers can get more out of console hardware than the same stuff in a PC. But...
> 
> 
> 
> ...


I have a single 7950 on an AMD 945. At stock I max everything out at 1920x1200, 60FPS.


----------



## BigMack70 (Nov 2, 2012)

3870x2 said:


> I have a single 7950 on an AMD 945. At stock I max everything out at 1920x1200, 60FPS.





No. No you don't. Not unless you're one of those people who considers turning down the AA to still be "max" settings. 

Maybe you max all your games at 60fps if you don't own some of the more demanding ones, but you're not getting 60fps minimum framerates in all the demanding games out there on a single card. BF3 multiplayer is the most obvious example. 

For example:
http://hardocp.com/article/2012/10/30/xfx_double_d_hd_7970_ghz_edition_video_card_review/6


----------



## 3870x2 (Nov 2, 2012)

BigMack70 said:


> No. No you don't. Not unless you're one of those people who considers turning down the AA to still be "max" settings.
> 
> Maybe you max all your games at 60fps if you don't own some of the more demanding ones, but you're not getting 60fps minimum framerates in all the demanding games out there on a single card. BF3 multiplayer is the most obvious example.
> 
> ...



Crysis 3 is the only exception.

Recent titles I have played are BF3 and Sleeping Dogs; both are astounding.

Don't get upset just because the $900 you spent on your CFX setup is more or less useless for the time being.

Also, those benchmarks are pre-12.11 drivers and ran at 2560x1600; mine runs at 1920x1200. I know you are new here, but do some research before posting "sources"


----------



## BigMack70 (Nov 2, 2012)

Lol w/e man... if you want to claim you've got a magic setup, go ahead. Arkham City, Sleeping Dogs, those are just two other recent examples of things that a single card doesn't do 60fps minimum framerate in (they don't even get close btw).

But hey maybe your card is a magic card that does far better than all the benchmarks say.


----------



## Fairlady-z (Nov 2, 2012)

I wonder what the selling point will be? I mean, an A10 APU won't differ too much from the PS3 to be easily noticeable to the average user. I really hope it's an A10 APU and maybe a 7770 or something along with it. If this turns out to be true, I doubt most of the high-end (PC) users will need to upgrade anytime soon.


----------



## 3870x2 (Nov 2, 2012)

BigMack70 said:


> Lol w/e man... if you want to claim you've got a magic setup, go ahead. Arkham City, Sleeping Dogs, those are just two other recent examples of things that a single card doesn't do 60fps minimum framerate in (they don't even get close btw).
> 
> But hey maybe your card is a magic card that does far better than all the benchmarks say.




You just disregard the fact that I disproved your argument using actual facts.
You are just embarrassing yourself.

Back on topic: the SNES had a 3.58 MHz processor. For a computer to emulate the SNES perfectly, it has to be over 300 MHz. This is not a very accurate example per se, because of the taxing nature of emulating hardware, but it gives us a good idea of what a small amount of dedicated power can do.
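The overhead ratio in that example can be worked out directly. A minimal sketch, using the figures quoted in the post (they are the poster's numbers, not measured values; real emulation cost varies a lot by emulator and accuracy target):

```python
# Rough emulation-overhead arithmetic from the SNES example above.
# Both clock figures are the ones quoted in the post, not measurements.
snes_clock_mhz = 3.58    # SNES CPU clock
host_clock_mhz = 300.0   # host clock claimed necessary for accurate emulation

overhead = host_clock_mhz / snes_clock_mhz
print(f"Claimed emulation overhead: ~{overhead:.0f}x the original clock")
```

The point being illustrated: emulating hardware costs an order of magnitude or two over running code natively on dedicated hardware.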


----------



## TheMailMan78 (Nov 2, 2012)

BigMack70 said:


> Lol w/e man... if you want to claim you've got a magic setup, go ahead. *Arkham City*, Sleeping Dogs, those are just two other recent examples of things that a single card doesn't do 60fps minimum framerate in (they don't even get close btw).
> 
> But hey maybe your card is a magic card that does far better than all the benchmarks say.



Bad example. He's not running PhysX. He will do 60FPS at 1920 all day in Batman: AC without that.

On topic: These are some nice specs if true. The Wii U will have some competition in a year or so.


----------



## 3870x2 (Nov 2, 2012)

BigMack70 said:


> Arkham City, Sleeping Dogs, those are just two other recent examples of things that a single card doesn't do 60fps minimum framerate in (they don't even get close btw).




http://tpucdn.com/reviews/AMD/Catalyst_12.11_Performance/images/sleepingdogs_1920_1200.gif

*¿what now?*


----------



## BigMack70 (Nov 2, 2012)

3870x2 said:


> You just disregard the fact that I disproved your argument using actual facts.
> You are just embarrassing yourself.
> 
> Back on topic: the SNES had a 3.58 MHz processor. For a computer to emulate the SNES perfectly, it has to be over 300 MHz. This is not a very accurate example per se, because of the taxing nature of emulating hardware, but it gives us a good idea of what a small amount of dedicated power can do.



What facts? The "o wait those benchmarks are wrong my card actually does way better" facts?



And PhysX doesn't matter for Batman AC - 60fps is out of reach for single GPUs.
http://hardocp.com/article/2012/10/30/xfx_double_d_hd_7970_ghz_edition_video_card_review/7

Also Guild Wars 2:
http://www.tomshardware.com/reviews/guild-wars-2-performance-benchmark,3268-6.html

Sleeping Dogs isn't even close:
http://hardocp.com/article/2012/10/02/sleeping_dogs_gameplay_performance_iq_review/5

Those are just a few examples which could be multiplied over and over. 

Now, you might have a magic card that does far better than any benchmark suggests. If your card is magic, then good for you!

*Don't forget that all my claims have always been about minimum framerate, not average.* And no AA isn't max settings...


----------



## Delta6326 (Nov 2, 2012)

Really, an A10 APU should be enough for 60FPS at 1080p if a 7900 can do 30FPS at 720/1080p.



BigMack70 said:


> Lol w/e man... if you want to claim you've got a magic setup, go ahead. Arkham City, Sleeping Dogs, those are just two other recent examples of things that a single card doesn't do 60fps minimum framerate in (they don't even get close btw).
> 
> But hey maybe your card is a magic card that does far better than all the benchmarks say.



... What rock do you live under? Though the Batman bench has no AA.


----------



## _JP_ (Nov 2, 2012)

Fairlady-z said:


> I wonder what the selling point will be? I mean A10 APU wont differ to much from the PS3 to be easily noticeable to the average user.


"New PlayStation 4! Now with more HD & 3D!"
Done.
The majority of people that buy consoles couldn't care less about its innards.


----------



## BigMack70 (Nov 2, 2012)

3870x2 said:


> http://tpucdn.com/reviews/AMD/Catalyst_12.11_Performance/images/sleepingdogs_1920_1200.gif
> 
> 
> 
> *¿what now?*



Maybe lrn2read? :shadedshu

About TPU's test of Sleeping Dogs:
*"We tested at highest settings with super-sampling disabled."*


----------



## TheMailMan78 (Nov 2, 2012)

BigMack70 said:


> What facts? The "o wait those benchmarks are wrong my card actually does way better" facts?
> 
> 
> 
> ...



PhysX doesn't matter for Batman AC? Wait what? Dude come on you got to be joking.


----------



## 3870x2 (Nov 2, 2012)

TheMailMan78 said:


> PhysX doesn't matter for Batman AC? Wait what? Dude come on you got to be joking.



I give up, there is no convincing him.

Oh look, another Arkham Asylum with a 7950 getting no less than 80 fps.

And why would you be looking at the minimum? There will always be brief low hitches that are minuscule in duration and make it a fairly irrelevant statistic.


----------



## BigMack70 (Nov 2, 2012)

TheMailMan78 said:


> PhysX doesn't matter for Batman AC? Wait what? Dude come on you got to be joking.



Did you even check my link? :shadedshu

No PhysX doesn't magically make a 79xx card hit 60fps minimum framerates. Without PhysX, at 1080p, that's a 7970 getting 28fps minimum.

Obviously PhysX increases visuals and matters, but in the context of this discussion, it's irrelevant - PhysX or no, you're not getting 60fps min fps on Batman AC with maxed out settings.


----------



## BigMack70 (Nov 2, 2012)

Yes, a 7950 will do 60fps average framerate in everything if in a few circumstances you're willing to turn down the AA options.

No, a (non-magic) 7950 will not do 60fps minimum framerate in the most demanding games with maxed settings.

Here's a hint for you all: posting benchmarks that don't have minimum fps data and which have AA options turned down (as has been done so far), does nothing other than prove my point.


----------



## 3870x2 (Nov 2, 2012)

BigMack70 said:


> Yes, a 7950 will do 60fps average framerate in everything if in a few circumstances you're willing to turn down the AA options.
> 
> No, a (non-magic) 7950 will not do 60fps minimum framerate with maxed settings.
> 
> Here's a hint for you all: posting benchmarks that don't have minimum fps data and which have AA options turned down (as has been done so far), does nothing other than prove my point.



And if you are using the minimum framerate, you are ignorant. This isn't like a minimum average; it's the absolute minimum, a one-time split-second frame drop.

Edited: didn't want to offend.
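The disagreement here is really about statistics. A quick numerical sketch (the frame times below are hypothetical, not from any benchmark) shows why the two sides talk past each other: a single split-second hitch collapses the minimum fps while barely moving the average.

```python
# Hypothetical frame times (ms): a steady ~16.7 ms (60 fps) run
# with one 40 ms hitch near the end.
frame_times_ms = [16.7] * 299 + [40.0]

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
min_fps = 1000.0 / max(frame_times_ms)  # worst single frame

print(f"average: {avg_fps:.1f} fps, minimum: {min_fps:.1f} fps")
# average stays ~59.6 fps while the minimum drops to 25 fps
```

So whether "60fps minimum" or "60fps average" is the right yardstick depends on whether you count one-frame hitches as part of the experience, which is exactly what's being argued.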


----------



## _JP_ (Nov 2, 2012)

^That. Considering a value that only is present for a maximum of a second is incredibly relevant for the overall gaming experience. 


BigMack70 said:


> Did you even check my link? :shadedshu


Do you even nVidia?
Since when does PhysX work on Radeon cards?


----------



## BigMack70 (Nov 2, 2012)

3870x2 said:


> And if you are using the minimum framerate, you are borderline retarded.



Yeah, because it's tons of fun to be playing at 60fps and suddenly find yourself chugging along at 25fps... 

If you don't care about minimum fps, why did you even start trying to argue with me in the first place? Not my fault you chose to pick a losing fight without thinking carefully.


----------



## brandonwh64 (Nov 2, 2012)

_JP_ said:


> ^That. Considering a value that only is present for a maximum of a second is incredibly relevant for the overall gaming experience.
> 
> Do you even nVidia?
> Since when does PhysX work on Radeon cards?



For ATI/AMD GPUs, PhysX runs on the CPU.


----------



## 3870x2 (Nov 2, 2012)

There is no convincing anyone.  The mods will be involved if this continues.  If there is anyone who wants to argue further, might be a good idea to PM.


----------



## Easy Rhino (Nov 2, 2012)

why are people even discussing this bullsh!t from sony. why would you even consider buying this after the crap they put us ps3 owners through!!!! boycott!!!!!!!!!!!!!


----------



## BigMack70 (Nov 2, 2012)

_JP_ said:


> ^That. Considering a value that only is present for a maximum of a second is incredibly relevant for the overall gaming experience.
> 
> Do you even nVidia?
> Since when does PhysX work on Radeon cards?



Huh? Have you read anything I've been posting? Doesn't sound like it...

I simply posted a link showing that having PhysX disabled for Batman:AC doesn't suddenly give AMD cards a 60fps min. framerate in that game.


----------



## _JP_ (Nov 2, 2012)

brandonwh64 said:


> For ATI/AMD GPUs, PhysX runs on the CPU.


I know that. But he's talking as if it didn't, and as if the impact was irrelevant.


----------



## brandonwh64 (Nov 2, 2012)

_JP_ said:


> I know that. But he's talking as if it didn't, and as if the impact was irrelevant.



Ahhh, I see. Yeah, it does impact ATI, especially with BL2.


----------



## SIGSEGV (Nov 2, 2012)

BigMack70 said:


> Did you even check my link? :shadedshu
> 
> No PhysX doesn't magically make a 79xx card hit 60fps minimum framerates. Without PhysX, at 1080p, that's a 7970 getting 28fps minimum.
> 
> Obviously PhysX increases visuals and matters, but in the context of this discussion, it's irrelevant - PhysX or no, you're not getting 60fps min fps on Batman AC with maxed out settings.



It's obvious you still think that next-gen console machines will only perform (or render) better if their detailed hardware specification is close enough to a current high-end PC.


----------



## TheMailMan78 (Nov 2, 2012)

BigMack70 said:


> Did you even check my link? :shadedshu
> 
> No PhysX doesn't magically make a 79xx card hit 60fps minimum framerates. Without PhysX, at 1080p, that's a 7970 getting 28fps minimum.
> 
> Obviously PhysX increases visuals and matters, but in the context of this discussion, it's irrelevant - PhysX or no, you're not getting 60fps min fps on Batman AC with maxed out settings.



I got 60+ FPS with a 570 with no PhysX and a 7950 is a lot faster. Your link is irrelevant.


----------



## sergionography (Nov 2, 2012)

MxPhenom 216 said:


> You guys forget that with consoles, developers only have to code for one configuration, so they're able to utilize the hardware with close to 100% efficiency. They're able to push console hardware a bit differently than on a normal desktop PC.



Not to mention the dev kit is for game companies to start developing compatible games for the upcoming hardware, so the final spec might be different, but we now know the recipe.
So even if Sony uses Steamroller instead, with a much bigger integrated GPU, the compatibility in terms of coding will be pretty much the same; it just won't take full advantage of the hardware for the next gen of games.


----------



## JNUKZ (Nov 2, 2012)

The PS4 is supposed to run games at 4K resolution. But to be honest, I think that's almost impossible.


----------



## BigMack70 (Nov 2, 2012)

_JP_ said:


> I know that. But he's talking as if it didn't, and as if the impact was irrelevant.



Nope. That's the 3rd or 4th time you've completely missed my point even when I've explained it.

Mailman claimed that my contention about a 7950 not doing 60fps minimum framerates in Batman AC was irrelevant because it has PhysX.

I then argued, and posted a link supporting, that PhysX doesn't matter on that count - you're not going to get 60fps min. framerate with or without PhysX.

The fact stands - if you want 60fps minimum framerates on every game with max settings, you need at least two GPUs.

Now, you might say "that's an irrelevant point nobody cares about minimum framerate and who cares if you have to disable some AA in one or two games". OK fine I'm not going to argue that. Some people will care and some won't. But ticking the boxes next to each of the 3 criteria: 60fps minimum fps, maxed settings, all games - that requires more power than any single GPU has to offer.

Heck, it *technically* requires more power than any TWO GPUs have to offer if you're going to count something like The Witcher 2's ubersampling mode. And with some poorly optimized games like Batman AC, I'm not even sure it's possible to get a 60fps minimum with max settings at all.


----------



## TheMailMan78 (Nov 2, 2012)

BigMack70 said:


> Nope. That's the 3rd or 4th time you've completely missed my point even when I've explained it.
> 
> Mailman claimed that my contention about a 7950 not doing 60fps minimum framerates in Batman AC was irrelevant because it has PhysX.
> 
> ...



A 7950 will get better than a MINIMUM of 60FPS without PhysX. My 570 does.


----------



## WhiteLotus (Nov 2, 2012)

PS3 has Steam right?

Think I might leave the PC gaming pretty soon. Well perhaps in a few years.


----------



## JNUKZ (Nov 2, 2012)

WhiteLotus said:


> PS3 has Steam right?
> 
> Think I might leave the PC gaming pretty soon. Well perhaps in a few years.



No it doesn't. I own a PS3 and only have PSN.


----------



## Easy Rhino (Nov 2, 2012)

none of you have a point. sony will just screw you over with ps4 like they did on ps3. stop arguing about something you wont buy and shouldn't buy!


----------



## TheMailMan78 (Nov 2, 2012)

Easy Rhino said:


> none of you have a point. sony will just screw you over with ps4 like they did on ps3. stop arguing about something you wont buy and shouldn't buy!



Sony didn't screw me over.


----------



## BigMack70 (Nov 2, 2012)

TheMailMan78 said:


> A 7950 will get better than a MINIMUM of 60FPS without PhysX. My 570 does.



Yeah that makes total sense since HardOCP gets a minimum fps in the 20s for the 7970 GHz edition maxed without PhysX. Very plausible. Definitely sounds legit. 12.11 drivers probably give a 100%+ boost compared to 12.9.


----------



## Solaris17 (Nov 2, 2012)

Easy Rhino said:


> none of you have a point. sony will just screw you over with ps4 like they did on ps3. stop arguing about something you wont buy and shouldn't buy!



They didn't screw me over, and you had a thread when you got yours. You're just mad. Didn't you just get on the Mac bandwagon? Anything that doesn't have an Apple logo on it in the coming months isn't going to be worth buying to you.


----------



## TheMailMan78 (Nov 2, 2012)

BigMack70 said:


> Yeah that makes total sense since HardOCP gets a minimum fps in the 20s for the 7970 GHz edition maxed without PhysX. Very plausible. Definitely sounds legit. 12.11 drivers probably give a 100%+ boost compared to 12.9.



HardOCP is not TPU. We know how to review and test hardware properly. Look at OUR benches.



Solaris17 said:


> They didn't screw me over, and you had a thread when you got yours. You're just mad. Didn't you just get on the Mac bandwagon? Anything that doesn't have an Apple logo on it in the coming months isn't going to be worth buying to you.



No, he's mad they got rid of Linux support. One thing I have learned about Easy Rhino is he will install Linux on a toaster oven if he can......and if he can't, that toaster oven is garbage!


----------



## BigMack70 (Nov 2, 2012)

TPU benches have no AA and don't measure minimum framerate for Batman AC and so aren't relevant to what I'm claiming one way or another.

If you don't like HardOCP, go take a look at the Batman AC chart for overclock3d as another example. See any cards with 60fps minimum framerate? Nope. (I don't know exactly what settings they're using)

I've not seen ANY benchmarks of maxed Batman AC, with or without showing PhysX, that show minimum fps above 60.


----------



## Random Murderer (Nov 2, 2012)

btarunr said:


> The design goal is to be able to play games at 1920 x 1080 resolution with a 60 Hz refresh rate, and with the ability to run stereo 3D at 60 Hz.



1080p still? Didn't Sony say they were working on a 2k+ capable system less than a year ago?


----------



## TheMailMan78 (Nov 2, 2012)

BigMack70 said:


> TPU benches have no AA and don't measure minimum framerate for Batman AC and so aren't relevant to what I'm claiming one way or another.
> 
> If you don't like HardOCP, go take a look at the Batman AC chart for overclock3d as another example. See any cards with 60fps minimum framerate? Nope. (I don't know exactly what settings they're using)
> 
> I've not seen ANY benchmarks of maxed Batman AC, with or without showing PhysX, that show minimum fps above 60.



Well, believe what you like. You have several users here that can do it. If you don't think a PS4 can do it with DEDICATED hardware that's double the specs of the current system, on a game designed for the LAST SYSTEM, who am I to argue with such intellect.


----------



## BigMack70 (Nov 2, 2012)

TheMailMan78 said:


> Well, believe what you like. You have several users here that can do it. If you don't think a PS4 can do it with DEDICATED hardware that's double the specs of the current system, on a game designed for the LAST SYSTEM, who am I to argue with such intellect.



Several users here who can do what, exactly? Use magic to achieve something benchmarks show not to be possible?

Also, just to be picky... 1080p 60fps is more than 4x as demanding as what current consoles do, which is ~720p ~30fps.


----------



## Darkleoco (Nov 2, 2012)

BigMack70 said:


> Several users here who can do what, exactly? Use magic to achieve something benchmarks show not to be possible?
> 
> Also, just to be picky... 1080p 60fps is more than 4x as demanding than what current consoles do, which is ~720p ~30fps.



Because the benchmarks you are using HAVE to be the supreme authority on what is possible because they support your viewpoint am I right?


----------



## TheMailMan78 (Nov 2, 2012)

BigMack70 said:


> Several users here who can do what, exactly? Use magic to achieve something benchmarks show not to be possible?
> 
> Also, just to be picky... 1080p 60fps is more than 4x as demanding than what current consoles do, which is ~720p ~30fps.



Of course, because logic would dictate you would need 4x more powerful hardware to reach those kinds of results. Never mind the hardware and software optimization that's happened over the past 6, almost 7, years. Your logic is infallible. Perfect even. Like a liquid sphere in space.


----------



## Easy Rhino (Nov 2, 2012)

Solaris17 said:


> they didn't screw me over, and you had a thread when you got yours. You're just mad. Didn't you just get on the Mac bandwagon? Anything that doesn't have an Apple logo on it in the coming months isn't going to be worth buying, to you.



dude, i loved my ps3 when it had all the features that i paid for. i am not on a mac bandwagon, i happen to really like macbooks. 



TheMailMan78 said:


> No hes mad they got rid of Linux support. One thing I have learned about Easy Rhino is he will install Linux on a toaster oven if he can......and if he can't that toaster oven is garbage!



yes, when you buy a product for its advertised capability and they remove that capability then i will get mad. 

anybody who purchases the PS4 deserves to get screwed.


----------



## BigMack70 (Nov 2, 2012)

My question: If my "viewpoint" is wrong, why are there no benchmarks that show that?

Also... my logic on the "4x more powerful" note is infallible...
1080p = 2.25x as many pixels as 720p
60fps = 2x as many frames as 30fps

So, to render something at 60fps in 1080p as opposed to 30fps in 720p, you need to crunch numbers more than 4x as fast, i.e. your hardware needs to be more than 4x as powerful.
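For anyone who wants to check it, the arithmetic is trivial to verify (a plain-Python sketch; nothing here is console-specific):

```python
# Raw pixel-throughput comparison: 1080p at 60fps vs 720p at 30fps.
def pixels_per_second(width, height, fps):
    """Pixels that must be rendered per second at a given resolution/framerate."""
    return width * height * fps

px_1080p60 = pixels_per_second(1920, 1080, 60)
px_720p30 = pixels_per_second(1280, 720, 30)

print(1920 * 1080 / (1280 * 720))  # 2.25 -> pixels per frame
print(px_1080p60 / px_720p30)      # 4.5  -> "more than 4x" overall
```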


----------



## TheMailMan78 (Nov 2, 2012)

Easy Rhino said:


> dude, i loved my ps3 when it had all the features that i paid for. i am not on a mac bandwagon, i happen to really like macbooks.
> 
> 
> 
> ...



If I were you I would be mad also about the Linux thing.  However with that being said you are not the average user. Most people are very happy with the PS3.



BigMack70 said:


> My question: If my "viewpoint" is wrong, why are there no benchmarks that show that?
> 
> Also... my logic on the "4x more powerful" note is infallible...
> 1080p = 2.2x as many pixels as 720p
> ...



No. It would need to work 4x more efficiently. Something that's been done over the past 7 YEARS.


----------



## BigMack70 (Nov 2, 2012)

TheMailMan78 said:


> No. It would need to work 4x more efficient.



If it were 4x more efficient, it would also be 4x more efficient at 720p. For any given hardware/software combo, it is 4x more difficult to get 1080p 60fps than it is to get 720p 30fps.

Really, man, this is basic math here. :shadedshu


----------



## Easy Rhino (Nov 2, 2012)

TheMailMan78 said:


> If I were you I would be mad also about the Linux thing.  However with that being said you are not the average user. Most people are very happy with the PS3.



oh and btw sony removed folding@home from the ps3. apparently sony likes cancer 

http://www.escapistmagazine.com/news/view/120244-Sony-Removes-Folding-home-From-PlayStation-3


----------



## _JP_ (Nov 2, 2012)

Easy Rhino said:


> none of you have a point. sony will just screw you over with ps4 like they did on ps3. stop arguing about something you wont buy and shouldn't buy!





TheMailMan78 said:


> Sony didn't screw me over.


Me neither.
The PS4 probably won't screw buyers, because Sony won't offer the same type of software package. If anything, it's going to be even more restricted from the start.


BigMack70 said:


> If it were 4x more efficient, it would also be 4x more efficient at 720p. For any given hardware/software combo, it is 4x more difficult to get 1080p 60fps than it is to get 720p 30fps.
> 
> Really, man, this is basic math here. :shadedshu


Except that performance gains/losses are not linear.


----------



## TheMailMan78 (Nov 2, 2012)

BigMack70 said:


> If it were 4x more efficient, it would also be 4x more efficient at 720p. For any given hardware/software combo, it is 4x more difficult to get 1080p 60fps than it is to get 720p 30fps.
> 
> Really, man, this is basic math here. :shadedshu


I don't think you understand how things are rendered. Hardware is vastly more efficient today than 7 years ago.



Easy Rhino said:


> oh and btw sony removed folding@home from the ps3. apparently sony likes cancer
> 
> http://www.escapistmagazine.com/news/view/120244-Sony-Removes-Folding-home-From-PlayStation-3


Cancer isn't so bad. You get to save money on haircuts.



_JP_ said:


> Except that performance gains/losses are not linear.



Shhhh, he's on a roll. Can't wait to hear how the Germans invaded Pearl Harbor.


----------



## BigMack70 (Nov 2, 2012)

TheMailMan78 said:


> I don't think you understand how things are rendered. Hardware is vastly more efficient today then 7 years ago.



If I had to pick one, I'd rather fail to understand how things get rendered than fail to understand kindergarten mathematics.


----------



## Jaffakeik (Nov 2, 2012)

No_Asylum said:


> Actually ... that isnt even impressive at all.  60fps @ 1080p is slow by todays standards.



Why is it slow? Are you claiming that at 60fps your game runs like a slideshow? Games don't run like a slideshow at 30fps, so 30fps would still be enough.


----------



## TheMailMan78 (Nov 2, 2012)

BigMack70 said:


> If I had to pick one, I'd rather fail to understand how things get rendered than fail to understand kindergarten mathematics.



Of course. Insults when logic escapes.


----------



## MxPhenom 216 (Nov 2, 2012)

Easy Rhino said:


> none of you have a point. sony will just screw you over with ps4 like they did on ps3. stop arguing about something you wont buy and shouldn't buy!



The only one here at this point that doesn't have a point is you. Sony didn't screw people over. Their PS3 console is a great platform with some of the best exclusive titles to date.


----------



## Easy Rhino (Nov 2, 2012)

MxPhenom 216 said:


> The only one here at this point that doesn't have a point is you. Sony didn't screw people over. Their PS3 console is a great platform with some of the best exclusive titles to date.



so removing advertised features is OK?


----------



## BigMack70 (Nov 2, 2012)

_JP_ said:


> Except that performance gains/losses are not linear.



True, but not directly related to my point that you have to crunch 4x more numbers to get 1080p 60fps than you do to get 720p 30fps.

The lack of linearity is due to a number of factors (how optimized the code is, how evenly spread the load is across all hardware subsystems e.g. GPU/CPU, etc), not due to the fact that you need some other amount of number crunching power to get 1080p 60fps relative to 720p 30fps.


----------



## TheMailMan78 (Nov 2, 2012)

BigMack70 said:


> True, but not directly related to my point that you have to crunch 4x more numbers to get 1080p 60fps than you do to get 720p 30fps.
> 
> The lack of linearity is due to a number of factors (how optimized the code is, how evenly spread the load is across all hardware subsystems e.g. GPU/CPU, etc).



Eureka! We have struck common sense! Now we just need to get him to realize it's been 7 YEARS since the PS3 and he might see the light.


----------



## BigMack70 (Nov 2, 2012)

TheMailMan78 said:


> Eureka! We have struck common sense! Now we just need to get him to realize its been 7 YEARS since the PS3 and he might see the light.



Which has nothing to do with my point. If the PS4 is going to run at 1080p 60fps, it needs to have 4x the overall power it would need to run at 720p 30fps. It really is just basic math.

I'm not saying it has to be 4x as powerful as the PS3. I'd argue it has to be more than 4x as powerful if it really is going to be humming along at a steady 1080p 60fps, as games are going to keep increasing in graphical complexity... the PS3 runs Unreal Engine 3-caliber games, for example. The PS4 is supposedly going to be running UE4-caliber games.


----------



## D4S4 (Nov 2, 2012)

this thread delivers.


----------



## TheMailMan78 (Nov 2, 2012)

BigMack70 said:


> Which has nothing to do with my point. If the PS4 is going to run at 1080p 60fps, it needs to have 4x the overall power as if it were to run at 720p 30fps. It really is just basic math.



Again, its gains are not linear. It's better than 4x now on hardware alone. NEVER MIND the optimizations that have been done in 7 YEARS. Hell, the hardware in the PS3 is even older than 7 years now.


----------



## EpicShweetness (Nov 2, 2012)

D4S4 said:


> this thread delivers.



For real. I mean, wtf, I went and got breakfast and came back, and a damn flame war had erupted.


----------



## sergionography (Nov 2, 2012)

Easy Rhino said:


> none of you have a point. sony will just screw you over with ps4 like they did on ps3. stop arguing about something you wont buy and shouldn't buy!



Yup, that's what they get for spending billions on R&D to build better hardware with potential they still haven't taken full advantage of years later. Instead they are slapping in an APU this time, which isn't bad, but surely not ahead of its time.


----------



## Shihab (Nov 2, 2012)

HSA in action, anyone ?


----------



## TheMailMan78 (Nov 2, 2012)

sergionography said:


> Yup, that's what they get for spending billions on R&D to build better hardware with potential they still haven't taken full advantage of years later. Instead they are slapping in an APU this time, which isn't bad, but surely not ahead of its time.



Cell was never ahead of its time. Just the hype was.


----------



## BigMack70 (Nov 2, 2012)

TheMailMan78 said:


> Again, its gains are not linear. It's better than 4x now on hardware alone. NEVER MIND the optimizations that have been done in 7 YEARS. Hell, the hardware in the PS3 is even older than 7 years now.



What in the world are you talking about and how is it at all relevant to comparing two hypothetical PS4 scenarios where one has (x) power and renders things at 720p 30fps and the other has (4x) power to render things at 1080p 60fps?

I'm not directly comparing the PS4 to PS3. I'm comparing 1080p 60fps to 720p 30fps.


----------



## TheMailMan78 (Nov 2, 2012)

BigMack70 said:


> What in the world are you talking about and how is it at all relevant to comparing two hypothetical PS4 scenarios where one has (x) power and renders things at 720p 30fps and the other has (4x) power to render things at 1080p 60fps?
> 
> I'm not directly comparing the PS4 to PS3. I'm comparing 1080p 60fps to 720p 30fps.



Which is directly comparing a PS4 to a PS3.


----------



## nickbaldwin86 (Nov 2, 2012)

grosss... spex suck.

1080p @ 60FPS... WEAK!!!

1080p @ 120hz... ok now we are talking... I don't see why they can't achieve this


----------



## BigMack70 (Nov 2, 2012)

TheMailMan78 said:


> Which is directly comparing a PS4 to a PS3.



Huh?

Those are graphical settings, not consoles.


----------



## Frick (Nov 2, 2012)

Easy Rhino said:


> so removing advertised features is OK?



Depends on the features. 



TheMailMan78 said:


> Cell was never ahead of its time. Just the hype was.



Totally. It's still kinda cool imo, but it's a bit of a flop.

Anyway, if PS4 had PS3 and PS2 compatibility I'd be in heaven and I would sell a leg to get one. Woe is me.


----------



## TheMailMan78 (Nov 2, 2012)

BigMack70 said:


> Huh?
> 
> Those are graphical settings, not consoles.



Those are the graphical settings of the consoles this thread is about. Unless you are posting off topic?


----------



## BigMack70 (Nov 2, 2012)

TheMailMan78 said:


> Those are the graphical settings of the consoles this thread is about. Unless you are posting off topic?




You originally said this:


> If you don't think a PS4 can do it with DEDICATED hardware that's double the specs of the current system, on a game designed for the LAST SYSTEM, who am I to argue with such intellect.



And I simply reminded you that 1080p 60fps is undeniably 4x as demanding as 720p 30fps. 

So, if the story on the PS4 is merely that it's twice as powerful as the PS3, as you state here, then it's not going to be doing anything at 1080p 60fps. If it's twice as powerful and twice as well optimized, thus making it effectively 4x as powerful, then sure, but that's not what you posted.

And I did warn that I was being picky...


----------



## lyndonguitar (Nov 2, 2012)

No_Asylum said:


> Actually ... that isnt even impressive at all.  60fps @ 1080p is slow by todays standards.



People today buy Xbox 360s that do 720p 30fps.

1080p 60fps is enough even for PC gaming, let alone streamlined consoles. Anything more than that is just for benchmarks.

At any resolution above 1080p you need a large HDTV (which not everyone has) and to play from farther away than normal.


----------



## TheMailMan78 (Nov 2, 2012)

BigMack70 said:


> You originally said this:
> 
> 
> And I simply reminded you that 1080p 60fps is undeniably 4x as demanding as 720p 30fps.
> ...



Specs are well beyond double with current optimizations and the architectural advances of 7+ years. Double is what they are on paper, not in power. They are well beyond double in capability.


----------



## Frick (Nov 2, 2012)

BigMack70 said:


> And I simply reminded you that 1080p 60fps is undeniably 4x as demanding as 720p 30fps.
> 
> So, if the story on the PS4 is merely that it's twice as powerful as the PS3, as you here state, then it's not going to be doing anything at 1080p 60fps. If it's twice as powerful and twice as well optimized, thus making it effectively 4x as powerful, then sure, but that's not what you posted.
> 
> And I did warn that I was being picky...



And Mailman keeps telling you it doesn't work like that, and he's right. Go play around with any modern game and you'll see it's not true. It's not that simple, not even generally.


----------



## Rei86 (Nov 2, 2012)

Easy Rhino said:


> so removing advertised features is OK?



What advertised features?

Every f'en gaming forum has one. Sony stopped advertising backwards compatibility years ago. They never pushed its ability to run Folding@Home, nor did they push the 'otherOS' function.

So what advertised features?


----------



## BigMack70 (Nov 2, 2012)

TheMailMan78 said:


> Specs are well beyond double with current optimizations and architectural advances of 7+ years. Double is all they are in number. Not power. They are well beyond double in capability.



Surely that would have been much easier to simply clarify, rather than attempting a nonsensical argument about how 1080p 60fps in fact does not require 4x the power of 720p 30fps?


----------



## Frick (Nov 2, 2012)

Rei86 said:


> What advertised features?
> 
> Every f'en gaming forum has one.  Sony stopped advertising backwards compatibility years ago.  They never pushed its ability to Folding@Home, nor did they push 'otherOS' function.
> 
> So what advertised features?



Advertised features are all features they say the thing has, which included "otherOS" and F@H, it has nothing to do with actual advertising.


----------



## TheMailMan78 (Nov 2, 2012)

BigMack70 said:


> Surely that would have been much easier to simply clarify, rather than attempting a nonsensical argument about how 1080p 60fps in fact does not require 4x the power of 720p 30fps?



It doesn't require 4 times the power with how things work today.


----------



## DarthCyclonis (Nov 2, 2012)

It would not surprise me if they run a mixed GPU setup in this machine: APU for less demanding games, discrete GPU or APU+GPU for high-end graphics. It would certainly help reduce the heat generated at idle. There are times I thought my sister's PS3 was going to melt down.


----------



## BigMack70 (Nov 2, 2012)

Frick said:


> And Mailman keep telling you it doesn't work like that, and he's right. Go play around with any modern game and you see it's not true. It's not that simple, not even generally.



Read post #85 of this thread.



TheMailMan78 said:


> It doesn't require 4 times the power with how things work today.



So you mean to say that running game [x] at 1080p 60fps is NOT 4x as demanding as running game [x] at 720p 30fps?

Huh? Again, you fail to see that my basic point was about resolutions and framerate, not a set of 2012 hardware vs a set of 2007 hardware.


----------



## 1d10t (Nov 2, 2012)

BigMack70 said:


> I find it hard to believe that they're claiming 1080p 60fps possible on an APU based system.
> I call BS. More probably 720p upscaled



So you think the APU is weak?
A 3.2GHz IBM tri-core + 500MHz Xenos can do 720p.
A 3.2GHz "7"-core Cell + 550MHz NVIDIA RSX can do 1080p.
Quality? Please show me if you see a major difference...

(screenshot comparisons: PC, Xbox, PS3, and more — images not preserved)



Easy Rhino said:


> none of you have a point. sony will just screw you over with ps4 like they did on ps3. stop arguing about something you wont buy and shouldn't buy!



sony did what on PS3?



Easy Rhino said:


> so removing advertised features is OK?



such as?



sergionography said:


> yup thats what they get for spending billions on r&d to build better hardware which has yet more potential that they havent took advantage of years after , but instead they are slapping an apu this time which isnt bad but surely not ahead of its time



But it's sufficient for such an optimized environment. Any title nowadays can do well both on PC and on 5-year-old console hardware.


----------



## MxPhenom 216 (Nov 2, 2012)

nickbaldwin86 said:


> grosss... spex suck.
> 
> 1080p @ 60FPS... WEAK!!!
> 
> 1080p @ 120hz... ok now we are talking... I don't see why they can't achieve this



120Hz is just the refresh rate of the screen. Really, any platform can do that if your screen has the capability. 1080p @ 60fps is the standard right now, and not weak at all in terms of consoles.


----------



## Rei86 (Nov 2, 2012)

Frick said:


> Advertised features are all features they say the thing has, which included "otherOS" and F@H, it has nothing to do with actual advertising.



When a company no longer advertises a feature, what do you call that?


----------



## Frick (Nov 2, 2012)

BigMack70 said:


> Read post #85 of this thread.



I did.

Anyway I kinda missed the drama here. It's all so cute.


----------



## BigMack70 (Nov 2, 2012)

1d10t said:


> so you think APU is weak?
> 3,2Ghz IBM Three Core + 500Mhz Xenos can do 720p.
> 3,2Ghz "7" core + 550Mhz nVidia RSX can do 1080p.
> Quality?please show me if you see a major difference...



#1) That's not a demanding game
#2) The Xbox/PS3 render at 720p, not the 1080p+ that the PC can do, so posting downsized screenshots is irrelevant, because it inherently shows the consoles in their best light and removes the biggest advantage of the PC.


----------



## TheMailMan78 (Nov 2, 2012)

BigMack70 said:


> I find it hard to believe that they're claiming 1080p 60fps possible on an APU based system.
> 
> I call BS. More probably 720p upscaled



This is why I was TRYING to explain to you how hardware has changed and can do what they claim VERY EASILY. It's not BS. It's engineering.


----------



## lyndonguitar (Nov 2, 2012)

@BigMack70

so what are you saying, that this supposed PS4 can't run ps3 games @ 1080p 60fps? LOL


----------



## MxPhenom 216 (Nov 2, 2012)

BigMack70 said:


> Read post #85 of this thread.
> 
> 
> 
> ...



Dude, just drop it. Hardware and software have changed since 2005-2007, when the current gen launched. Obviously it'll take more system horsepower to run at 1080p 60fps, but it won't take 4x the power. Like mailman has said, this stuff is not linear, and mailman is correct on virtually all accounts.


----------



## Dos101 (Nov 2, 2012)

1d10t said:


> sony did what on PS3?
> 
> 
> 
> such as?



He's referring to the OtherOS feature that allowed you to install certain distro(s) of Linux, iirc. It was advertised; then, after that whole Geohot hacking thing with the PS3, Sony removed it with a firmware update, so some argue you can't remove an advertised feature like that. I agree, but some are more butthurt than others.


----------



## 1d10t (Nov 2, 2012)

BigMack70 said:


> #1) That's not a demanding game
> #2) The Xbox/PS3 render at 720p, not 1080p+ that the PC can do, so posting downsized screenshots is irrelevant because it inherently shows the consoles in their best light and removes the biggest advantage of the PC.



1. OK... so what is "a demanding game"?
2. Completely wrong.

(screenshot: http://img.techpowerup.org/121102/ps3.jpg — image not preserved)

Any objection?


----------



## Atom_Anti (Nov 2, 2012)

Does that also mean GTA 5 will be optimized for the A10 Trinity, so those who have a Trinity laptop will be able to play it on PC very nicely?


----------



## MxPhenom 216 (Nov 2, 2012)

1d10t said:


> 1.oke..so what is "a demanding game" ?
> 2. completely wrong.
> http://img.techpowerup.org/121102/ps3.jpg
> 
> ...



LOL.

It says it's at 1080p, but it's upscaled. Games don't actually run at native 1080p on current-gen consoles. Sorry to break it to ya.


----------



## BigMack70 (Nov 2, 2012)

1d10t said:


> 1.oke..so what is "a demanding game" ?
> 2. completely wrong.
> http://img.techpowerup.org/121102/ps3.jpg
> 
> ...



I hate to break it to you and be the one to shatter your inner console fan, but...
http://forum.beyond3d.com/showthread.php?t=46241

Current consoles don't really render anything at 1080p. The 360 isn't even capable of true 1080p output, and the PS3, for anything even remotely demanding (99.9% of games), likewise renders at ~720p and upscales.

Also, I think the # of people who have failed to understand I've been making a picky point about comparing the computing power needed for different resolutions/settings is up to 3 or 4.

*whoooooooooooooooooooosh* go my posts over your head.

There are implications in this discussion for 1080p60 vs 720p30, but my point hasn't been about those implications.

And I stand by my speculation about an APU not being able to do 1080p60. Your speculation that it can is as good as mine that it can't - it's all speculation right now - but I just don't believe their claims.


----------



## RejZoR (Nov 2, 2012)

Atom_Anti said:


> Is that also mean GTA 5 will be optimized for A10 Trinity, so those who have Trinity laptop will be able to play on PC very nicely?



Probably not, because the OS used won't be the same...


----------



## Benetanegia (Nov 2, 2012)

I'm probably late, but doing 1080p @ 60fps *DEFINITELY* requires 4x the power that 720p @ 30fps needs. BigMack is absolutely right on that account.

A very different thing is that when running low resolutions some other part (95% of the time the CPU, or DirectX draw calls...) becomes the bottleneck, and hence you don't see 4x the performance at the lower resolution.

But magic does not happen in computing. If performance moving to a higher res is not linear, it's because graphics cards have power to spare and therefore do a better job at the higher res. At lower res or with low settings, GPU resources stay unused.
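To illustrate the bottleneck point, here is a toy frame-time model (my own sketch with made-up numbers, not measurements from any console): a frame costs the slower of a fixed CPU stage and a GPU stage that scales with pixel count, so at low resolution spare GPU capacity hides behind the CPU.

```python
# Toy frame-time model: a frame costs max(cpu_ms, gpu_ms), and only the GPU
# cost scales with pixels. All constants are illustrative assumptions.
def fps(cpu_ms, gpu_ms_per_mpix, megapixels):
    frame_ms = max(cpu_ms, gpu_ms_per_mpix * megapixels)
    return 1000.0 / frame_ms

CPU_MS = 8.0           # fixed per-frame CPU / draw-call cost (assumed)
GPU_MS_PER_MPIX = 5.0  # GPU cost per megapixel rendered (assumed)

fps_720 = fps(CPU_MS, GPU_MS_PER_MPIX, 1280 * 720 / 1e6)    # GPU ~4.6ms: CPU-bound
fps_1080 = fps(CPU_MS, GPU_MS_PER_MPIX, 1920 * 1080 / 1e6)  # GPU ~10.4ms: GPU-bound

# 2.25x the pixels, yet nowhere near 2.25x the frame time: the GPU had slack.
print(fps_720, fps_1080)
```

Shrink `CPU_MS` toward zero and the same model scales exactly linearly with pixel count, which is the "power to spare" point above.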


----------



## BigMack70 (Nov 2, 2012)

Benetanegia said:


> I'm probably late, but doing 1080p @ 60fps *DEFINITELY* requires 4x the power that 720p @ 30fps needs. BigMack is absolutely right on that account.
> 
> A very different thing is that when running low resolutions some other part (95% of the time the CPU, or DirectX draw calls...) becomes the bottleneck, and hence you don't see 4x the performance at the lower resolution.
> 
> But magic does not happen in computing. If performance moving to a higher res is not linear, it's because graphics cards have power to spare and therefore do a better job at the higher res. At lower res or with low settings, GPU resources stay unused.



Exactly. Things not performing exactly linearly doesn't really have any bearing on how much power 1080p60 requires relative to 720p30 - I already explained why back in post #85 - and this basically says the same thing.


----------



## MxPhenom 216 (Nov 2, 2012)

Benetanegia said:


> I'm probably late, but doing 1080p @ 60fps *DEFINITELY* requires 4x the power that 720p @ 30fps needs. BigMack is absolutely right on that account.
> 
> A very different thing is that when running low resolutions some other part (95% of the time the CPU, or DirectX draw calls...) becomes the bottleneck, and hence you don't see 4x the performance at the lower resolution.
> 
> But magic does not happen in computing. If performance moving to a higher res is not linear, it's because graphics cards have power to spare and therefore do a better job at the higher res. At lower res or with low settings, GPU resources stay unused.



No, it doesn't. Maybe in our desktops, but developers on consoles are able to optimize so well, squeezing every bit of power out of the system, that they can do it.


----------



## Benetanegia (Nov 2, 2012)

MxPhenom 216 said:


> No it doesn't. maybe in our desktops, but developers with consoles are able to optimize it so well, and squeeze every bit of power out of the system to be able to do it.



It still requires 4x the power to run 1080p60 vs 720p30...


----------



## BigMack70 (Nov 2, 2012)

Benetanegia said:


> It still requires 4x the power to run 1080p60 vs 720p30...



I don't think he understands such a simple point... he and mailman are making the point more complex by drawing inherent implications for a hardware comparison and so then they get confused and say it's wrong.

But the issue of 1080p60 vs 720p30 is a very simple one of basic math and nothing more.


----------



## TheMailMan78 (Nov 2, 2012)

BigMack70 said:


> I don't think he understands such a simple point... he and mailman are making the point more complex by drawing inherent implications for a hardware comparison and so then they get confused and say it's wrong.
> 
> But the issue of 1080p60 vs 720p30 is a very simple one of basic math and nothing more.



You said you didn't think the APU could do what Sony claims and went off quoting various websites about dedicated GPUs. What I said is that how things are done today is VASTLY different than 7+ years ago, and that an APU can in fact meet Sony's claim, as it's well beyond your "4x" number due to optimization and engineering.


----------



## Deleted member 67555 (Nov 2, 2012)

OK, OK...
I think we should all get the effin point by now...
Apples to apples, it requires 4x the power to do 60fps @ 1080p as it does 30fps @ 720p.
But it doesn't "actually" take 4x the power, because of architectural advancements and software development...


----------



## Benetanegia (Nov 2, 2012)

Also @ Mailman

When you compare an HD6000/7000 or GTX600 to RSX/Xenos or their desktop counterparts and say they are Xx faster, 7 years of optimizations are in there too. The GPU in Trinity is X amount faster, optimizations included, not excluded.

But ultimately I have no doubt the PS4 will use a dedicated GPU, so it's a moot point.



jmcslob said:


> But it doesn't "Actually" take 4x the power because of architectural advancements and software development...



Software has advanced for the PS3 too, though. And "power" is intangible, but as a concept it will always actually require 4x the power. Now, if we go by number of shader processors + clock frequency, for example, then yes, fewer are probably required for the same result, as efficiency went up over these years, but like I said, that's part of newer generations being more "powerful".


----------



## BigMack70 (Nov 2, 2012)

TheMailMan78 said:


> You said you didn't think the APU could do what Sony claims and went of quoting various websites about dedicated GPU's. What I said is how things are done today is VASTLY different then 7+ years ago and that an APU can in fact meet Sony's claim as its well beyond your "4x" number due to optimization and engineering.



I think you just got upset that I picked on the wording of one of your posts 

And my speculation that they can't do what they claim on an APU is just as good as yours that it is. You have conjecture about how well they might be able to optimize things in a console-specific way, and I have PC benchmarks showing it's crazy. Who's right? We'll find out in a couple years.

Console manufacturers aren't exactly known for their forthrightness about what their consoles can do (see: claims about 1080p this generation and claims about PS4 being able to do "4k" next generation...) so it's not exactly crazy to think that they're making deceptive claims here too...


----------



## 1d10t (Nov 2, 2012)

Dos101 said:


> He's referring to the OtherOS feature that allowed you to install certain distro(s) of Linux iirc. It was advertised, then after that whole Geohot hacking thing with the PS3 Sony removed it with a firmware update, so some argues you can't remove an advertised feature like that. I agree, but some are more butthurt than others.



Oh... I get it. Yes, Sony has done that since the 3.70 firmware. They were aware of security flaws and mumbo-jumbo things that could exploit its vulnerabilities. They changed their EULA so that only Sony could read it, and started banning inactive/shared PSN accounts. Just think of Sony as the new Apple, but with more girls in it.


----------



## MxPhenom 216 (Nov 2, 2012)

Benetanegia said:


> It still requires 4x the power to run 1080p60 vs 720p30...



technically yes.


----------



## TheMailMan78 (Nov 2, 2012)

Benetanegia said:


> Also @ Mailman
> 
> When you compare an HD6000/7000 or GTX600 to RSX/Xenos or they desktop counterparts and say they are Xx faster, 7 years of iptimizations are there too. The GPU in Trinity is X amount faster optimizations included, not excluded.
> 
> But ultimmately I have no doubt the PS4 will use dedicated GPU, so it's a moot point.



No, it's more that I was comparing the RSX to the APU they are using. He didn't/doesn't think the APU could do 60FPS in a console environment.

Wish you were here about 3 pages ago



BigMack70 said:


> I think you just got upset that I picked on the wording of one of your posts
> 
> And my speculation that they can't do what they claim on an APU is just as good as yours that it is. You have conjecture about how well they might be able to optimize things in a console-specific way, and I have PC benchmarks showing it's crazy. Who's right? We'll find out in a couple years.
> 
> Console manufacturers aren't exactly known for their forthrightness about what their consoles can do (see: claims about 1080p this generation and claims about PS4 being able to do "4k" next generation...) so it's not exactly crazy to think that they're making deceptive claims here too...


Um, no. I already said those benchmarks do not reflect what Sony is doing..... also, some of those benches are BS in and of themselves.


----------



## BigMack70 (Nov 2, 2012)

MxPhenom 216 said:


> technically yes.



Someone finally gets it!!!!!!! 

Didn't think such a simple and introduced-up-front-as-picky point would be so controversial...


----------



## Benetanegia (Nov 2, 2012)

TheMailMan78 said:


> No its more I was comparing the RSX to the APU they are using.  He didn't/doesn't think the APU could do 60FPS in a console environment.
> 
> Wish you were here about 3 pages ago



You're talking about different things, man. I too think the APU cannot even dream of approaching a stable minimum of 60 fps. Not even close.


----------



## BigMack70 (Nov 2, 2012)

TheMailMan78 said:


> No, it's more that I was comparing the RSX to the APU they are using. He didn't/doesn't think the APU could do 60 FPS in a console environment.
> 
> Wish you were here about 3 pages ago
> 
> Um, no. I already said those benchmarks do not reflect what Sony is doing... also, some of those benches are BS in and of themselves.



And you have benchmarks that DO reflect what Sony is doing?

Nope. 

You have speculation that they're going to be able to optimize something so well that it can do leaps and bounds better than it would do on a PC - to the extent that it may even surpass the highest end video cards currently available by a decent margin!

Like I said, neither speculation is inherently any better than the other. We won't know for a couple years which is correct.


----------



## TheMailMan78 (Nov 2, 2012)

Benetanegia said:


> You're talking about different things, man. I too think the APU cannot even dream of approaching a stable minimum of 60 fps. Not even close.



So you don't think an APU is more powerful than a 7+ year old GPU? Because that's what you are saying.



BigMack70 said:


> And you have benchmarks that DO reflect what Sony is doing?
> 
> Nope.
> 
> ...



lol, games for the PC are ports, man. NOTHING is optimized.


----------



## Benetanegia (Nov 2, 2012)

TheMailMan78 said:


> So you don't think an APU is more powerful than a 7+ year old GPU?



Honestly, it's not much more powerful (once we factor in the higher resolution), but let's not start arguing about that. Anyway, a 7+ year old GPU does NOT do a 30 fps minimum. Current consoles often dip to 15-20 fps.


----------



## BigMack70 (Nov 2, 2012)

TheMailMan78 said:


> So you don't think an APU is more powerful than a 7+ year old GPU?



That's a deceptively oversimplified question that leaves out too many variables e.g. that games are poised to become far more demanding than they are on the current console generation (see: Unreal Engine 4 etc) and that in addition to having more demanding games we're talking about making the jump from 720p30 to 1080p60.

Nobody in here is saying that an APU isn't more powerful than a 7 year old GPU. I and some others are saying that there's no way that it's so much more powerful that it will be able to make both those jumps - to substantially more demanding games and to a resolution/framerate 4x as demanding - at the same time.


----------



## erocker (Nov 2, 2012)

Benetanegia said:


> Honestly, it's not much more powerful, but let's not start arguing about that. Anyway 7+ year GPU does NOT do 30 fps minimum. Current consoles often get lows of 15-20 fps.



Not to mention that many games don't even run at 720p. If I remember correctly, GTA IV ran under 720p on the PS3 and was upscaled. 

Either way this console is interesting. I'm looking forward to seeing how they squeeze performance out of an APU and a discrete card.



TheMailMan78 said:


> So you don't think an APU is more powerful than a 7 year old GPU?




Most likely the GPU portion of the APU is about the same.


----------



## TheMailMan78 (Nov 2, 2012)

BigMack70 said:


> That's a deceptively oversimplified question that leaves out too many variables e.g. that games are poised to become far more demanding than they are on the current console generation (see: Unreal Engine 4 etc) and that in addition to having more demanding games we're talking about making the jump from 720p30 to 1080p60.
> 
> Nobody in here is saying that an APU isn't more powerful than a 7 year old GPU. I and some others are saying that there's no way that it's so much more powerful that it will be able to make both those jumps - to substantially more demanding games and to a resolution/framerate 4x as demanding - at the same time.



Games are not gonna be that much more demanding. Sony and MS both have confirmed that they don't expect a massive graphical jump this generation. Expect more of the same, just smoother.



erocker said:


> Most likely the GPU portion of the APU is about the same.


 Depends on the APU.


----------



## 1d10t (Nov 2, 2012)

MxPhenom 216 said:


> LOL.
> 
> It says its at 1080p, but its upscaled, Games don't actually run at native 1080p on current gen consoles. Sorry to break it to yea.





BigMack70 said:


> I hate to break it to you and be the one to shatter your inner console fan, but...
> http://forum.beyond3d.com/showthread.php?t=46241
> 
> Current consoles don't really render anything at 1080p. The 360 isn't even capable of true 1080p output and with the PS3, for anything even remotely demanding (99.9% of games), it likewise renders at ~720p and upscales.
> ...



Exactly, that's the point!
From a casual gamer's point of view, do they need REAL 1080p? Do they even have sufficient knowledge about this?
You can argue about scaling all you want, but casual gamers are already satisfied when their TV shows the 1080p/60Hz info.


----------



## BigMack70 (Nov 2, 2012)

TheMailMan78 said:


> Games are not gonna be that much more demanding. Sony and MS both have confirmed that they don't expect a massive graphical jump this generation.



Again, speculation. I'd say that Unreal Engine 4 poses problems for this position.


----------



## TheMailMan78 (Nov 2, 2012)

BigMack70 said:


> Again, speculation. I'd say that Unreal Engine 4 poses problems for this position.



It's not speculation. Sony and MS have already confirmed it.


----------



## Benetanegia (Nov 2, 2012)

TheMailMan78 said:


> Games are not gonna be that much more demanding. Sony and MS both have confirmed that they don't expect a massive graphical jump this generation.



If that's true, they are gonna fail hard imo. If 720p vs 1080p is the only thing they expect to change, who's really going to spend the cash on a new console? Fanboys, no one else. Most people already waited 2-3 years until they bought PS3/XB360 and the difference with the previous ones was massive...


----------



## erocker (Nov 2, 2012)

TheMailMan78 said:


> Depends on the APU.



Ya think?


----------



## BigMack70 (Nov 2, 2012)

1d10t said:


> Exactly, that's the point!
> From a casual gamer's point of view, do they need REAL 1080p? Do they even have sufficient knowledge about this?
> You can argue about scaling all you want, but casual gamers are already satisfied when their TV shows the 1080p/60Hz info.



What's your point? You got on here and posted some comparison screenshots that I pointed out were irrelevant. Then you posted a screenshot of your PS3 to supposedly argue against my point that your downsized screenshots were irrelevant.

Who here is talking about casual gamers and what they care about? 

The topic of conversation is "Can the PS4 render at 1080p60"... not "Do casual console kiddies care if their games are rendered at 1080p or rendered at 720p and upscaled?"


----------



## BigMack70 (Nov 2, 2012)

TheMailMan78 said:


> It's not speculation. Sony and MS have already confirmed it.



So we can confirm that Sony and MS have speculated that they don't expect a massive graphical leap this generation... See where I'm going with this?

If there's anything to their speculation, it's that they know that they're going to be putting out crap hardware specs that can't handle a graphical leap.


----------



## TheMailMan78 (Nov 2, 2012)

Benetanegia said:


> If that's true, they are gonna fail hard imo. If 720p vs 1080p is the only thing they expect to change, who's really going to spend the cash on a new console? Fanboys, no one else. Most people already waited 2-3 years until they bought PS3/XB360 and the difference with the previous ones was massive...


Yeah, man, they have said they (gamers) have come to expect a certain level of graphics and they (MS/Sony) do not want to mess with the formula. Kind of a "why fix what ain't broken" scenario. Does it suck? Yeah, but that's why this generation has lasted so long. People are content now.



erocker said:


> Ya think?


 I try not to.



BigMack70 said:


> So we can confirm that Sony and MS have speculated that they don't expect a massive graphical leap this generation... See where I'm going with this?
> 
> If there's anything to their speculation, it's that they know that they're going to be putting out crap hardware specs that can't handle a graphical leap.



Cliffy B left Epic. Don't know what you are expecting.


----------



## Crap Daddy (Nov 2, 2012)

Relax. By the time PS4 will be ready Sony will be bought by Samsung and AMD by Apple so, as you might imagine, the project will be aborted.


----------



## BigMack70 (Nov 2, 2012)

Benetanegia said:


> If that's true, they are gonna fail hard imo. If 720p vs 1080p is the only thing they expect to change, who's really going to spend the cash on a new console? Fanboys, no one else. Most people already waited 2-3 years until they bought PS3/XB360 and the difference with the previous ones was massive...



Agreed. It's highly unlikely that the next consoles would prioritize a resolution/framerate increase over a graphical leap in the games, simply because as stated most people don't know and don't care that their consoles are upscaling things.

No console kiddie is going to be impressed that suddenly they can play Unreal Engine 3 games at twice the resolution and framerate. They will be impressed if they can play some awesome-looking Unreal Engine 4 games at upscaled 720p and 30fps just like they do now... so long as their TV lies to them and says 1080p, of course.


----------



## BigMack70 (Nov 2, 2012)

TheMailMan78 said:


> Cliffy B left Epic. Don't know what you are expecting.



I'm not sure you understand what "speculate" means...

You and I don't know what's coming from the next generation of games graphically. Sony and M$ don't really know either. They have a better idea than you and I, but that doesn't necessarily mean much.

The demo for UE4 is out there, and it shows that there is a graphical leap coming in games. How many games make the jump, how big the jump exactly is, and if it will make it to consoles, are all different questions that we can only speculate on right now.


----------



## TheMailMan78 (Nov 2, 2012)

BigMack70 said:


> I'm not sure you understand what "speculate" means...
> 
> You and I don't know what's coming from the next generation of games graphically. Sony and M$ don't really know either. They have a better idea than you and I, but that doesn't necessarily mean much.
> 
> The demo for UE4 is out there, and it shows that there is a graphical leap coming in games. How many games make the jump, how big the jump exactly is, and if it will make it to consoles, are all different questions that we can only speculate on right now.



I'm still waiting for games to look like the UE3 demo we saw years ago.  There isn't gonna be a MEGA leap. That's a fact.


----------



## BigMack70 (Nov 2, 2012)

TheMailMan78 said:


> I'm still waiting for games to look like the UE3 demo we saw years ago.  There isn't gonna be a MEGA leap. That's a fact.



Woahhhhhhhhhh tell me where you got your time machine! I want one too!!!!!!

:shadedshu


----------



## Benetanegia (Nov 2, 2012)

TheMailMan78 said:


> I'm still waiting for games to look like the UE3 demo we saw years ago.  There isn't gonna be a MEGA leap. That's a fact.



Which demo? The one I remember, games have not only matched, they have vastly surpassed.


----------



## Lionheart (Nov 2, 2012)

[H]@RD5TUFF said:


> And it will still be a piece of crap because it's a $ony.



You're one negative annoying troll --_--


----------



## Thefumigator (Nov 2, 2012)

The human eye can notice up to 25~30 fps, so we only need to worry about low peaks below those values.

If the game is well programmed and has a good AMD 7000-series native 3D engine, the console will deliver amazing 3D quality. Also, three things come to mind:

1 - Sony will compete against other consoles that don't have much better hardware anyway. As far as price goes, once you factor it into the formula, no one can really come out with a super console without making it impossible to buy.

2 - The A10 is _today's_ top Trinity APU, but it won't be the only one. Think about an A12, A14, A16; I mean, we don't really know how AMD will refresh its line of APUs, we only know the line will stay compatible.

3 - Developers had to optimize multithreading on the PS3's very complex architecture, so adopting the A10 should make things easier, and dual GPU would be a breeze. Several games already work nicely on dual-GPU configs on PC; it's just about following the development methods learned in those cases.


----------



## BigMack70 (Nov 2, 2012)

Thefumigator said:


> Human eye can notice up to 25~30 fps, so we can worry about low peaks below those values



Didn't expect to see a "human eye can only see [x] fps" troll in this thread 

I'm not even gonna get into that nonsense...


----------



## Dent1 (Nov 2, 2012)

BigMack70 said:


> Surely that would have been much easier to simply clarify, rather than attempting a nonsensical argument about how 1080p 60fps in fact does not require 4x the power of 720p 30fps?



Surely, if TheMailMan78 was talking nonsense, why is nobody agreeing with you? 

1080p @ 60 FPS is possible. Obviously it depends on the game in question, i.e. a linear game like the Call of Duty campaign is more likely to hit 60 FPS than a completely open-world game like GTA.


----------



## lyndonguitar (Nov 2, 2012)

BigMack70 said:


> Didn't expect to see a "human eye can only see [x] fps" troll in this thread
> 
> I'm not even gonna get into that nonsense...



are you trolling?



Dent1 said:


> Surely, if TheMailMan78 was talking nonsense, why is nobody agreeing with you?
> 
> 1080p @ 60 FPS is possible. Obviously it depends on the game in question, i.e. a linear game like the Call of Duty campaign is more likely to hit 60 FPS than a completely open-world game like GTA.



Yeah, it's possible, and EASY.


----------



## BigMack70 (Nov 2, 2012)

lyndonguitar said:


> are you trolling?



I'm procrastinating. 

The argument "the human eye can only see [x] fps" is nonsense and I'm not going into it. All I'll say on that is: 
go get 3 monitors. Set one to a 30 Hz refresh rate, another to 60, and another to 120. Drag some windows around on your desktop. You tell me if you notice any differences.



Dent1 said:


> Surely, if TheMailMan78 was talking nonsense why is nobody agreeing with you?



Read the thread? Maybe posts like #127, 130, and especially 134.


----------



## TheMailMan78 (Nov 2, 2012)

Benetanegia said:


> Which demo? The one I remember games have not only matched, they have surpassed it, vastly.



Talking about the "Good Samaritan" video that came out last year. It's UE3.



BigMack70 said:


> Didn't expect to see a "human eye can only see [x] fps" troll in this thread
> 
> I'm not even gonna get into that nonsense...



This time I agree with you. lol


----------



## BigMack70 (Nov 2, 2012)

TheMailMan78 said:


> This time I agree with you. lol



See, I can be reasonable 



All I'm doing is taking a different position regarding speculation about the claims in this article and about what the next generation of console gaming might be like graphics-wise. You're not any more right or wrong than I am, because neither of us knows. Like I said, we'll find out in a year or two.


----------



## Benetanegia (Nov 2, 2012)

Thefumigator said:


> if you consider it in the formula, no one can really come out with a super console without making it impossible to buy.



The cost difference between the crappy GPU on an APU and a more than decent GPU like HD7870 is <$10 for AMD. Nvidia/AMD have been getting a lot less than that on console licenses. How much do you think it really costs Sony/M$? Not much more. And that is right now, as new processes are released it would cost them much less, just like it happened with PS3 and XB360. They are not doing it because of not being able to "competitively" release a console with better GPU...



TheMailMan78 said:


> Talking about the "Good Samaritan" video that came out last year. Its UE3.



You said several years ago. UE3 released alongside the consoles, mostly at the same time. Samaritan is not really UE3; it's technically UE3.5 and realistically UE4 sans a couple of things.


----------



## TheMailMan78 (Nov 2, 2012)

Benetanegia said:


> The cost difference between the crappy GPU on an APU and a more than decent GPU like HD7870 is <$10 for AMD. Nvidia/AMD have been getting a lot less than that on console licenses. How much do you think it really costs Sony/M$? Not much more. And that is right now, as new processes are released it would cost them much less, just like it happened with PS3 and XB360. They are not doing it because of not being able to "competitively" release a console with better GPU...
> 
> 
> 
> You said several years ago. UE3 released with the consoles, mostly at the same time. Samaritan is not really UE3. It's technically UE3.5 and realistically UE4 sans a couple things.


Yeah, I realize that. I was looking for the firefight demo but I can't seem to find it on YouTube. It looked CG even by today's standards, and they said it was "real-time".


----------



## 1d10t (Nov 2, 2012)

BigMack70 said:


> What's your point? You got on here and posted some comparison screenshots that I pointed out were irrelevant. Then you posted a screenshot of your PS3 to supposedly argue against my point that your downsized screenshots were irrelevant.
> 
> Who here is talking about casual gamers and what they care about?
> 
> The topic of conversation is "Can the PS4 render at 1080p60"... not "Do casual console kiddies care if their games are rendered at 1080p or rendered at 720p and upscaled?"



I'm not arguing about render capability, nor jumping into the PS4 render debate. You may quote any of my last posts.
I'm just showing there's no difference between real and upscaled resolution. Why did Sony choose an APU? Sony definitely knows something that we don't. Consoles are consoles, targeted at casual gamers who don't even bother about upscaling.


----------



## lyndonguitar (Nov 2, 2012)

Thing is, 1080p is shit now that they've got 4K monitors.

I think they are really planning to run REAL 1080p and upscale to 4K.


----------



## BigMack70 (Nov 2, 2012)

1d10t said:


> just showing there's no difference between real / upscale resolution.



No, you're showing that there's no difference between 720p downscaled to ~780x440 and 1080p downscaled to ~780x440, which is completely irrelevant.


----------



## lyndonguitar (Nov 2, 2012)

1d10t said:


> just showing there's no difference between real / upscale resolution.



There IS a difference. If there were no difference, then I could just render my game at 10 x 10 and upscale it to 1920 x 1080, right?


----------



## Akrian (Nov 2, 2012)

Call me a non-believer, but until I see that APU and GPU pumping UE4 at exactly the same quality it was shown at in the tech demo, at 1080p and a constant 60 fps, I call BS. UE4 required what, like two 680s to run the way it looked in the demo?


----------



## 1d10t (Nov 2, 2012)

BigMack70 said:


> No, you're showing that there's no difference between 720p downscaled to ~780x440 and 1080p downscaled to ~780x440, which is completely irrelevant.



Irrelevant to what topic? I'm not arguing about "how it renders" but "how it shows". Besides, Sony has to "upscale" to meet all the TV standards like NTSC J/M or PAL B/G.



lyndonguitar said:


> there IS a difference, if theres no difference then I would just render my game at 10 x 10 and upscale it to 1920 x 1080 right?



Read my earlier post, sir. 
You could do that? 10x10 pixels?


----------



## semantics (Nov 2, 2012)

Lol at all these people. 1080p at 60 Hz isn't terribly hard to do; it just depends on what else you want to do. Plus, with consoles there are a lot of tricks and optimizations that you get that a PC doesn't, simply because you only have to develop for one thing: you can make it as specialized to that set of hardware as you want, no need to cater to 100 different setups. That being said, AMD's APUs blow. XD And really, discrete + APU graphics? Sony must really hate developers. It shows in how you get stuff to work on the PS3; I guess they really want people to hate making stuff for their product.


----------



## Lionheart (Nov 2, 2012)

Gotta love this site, but seriously, is it me or has this site attracted so many negative trolls over the past several years? I miss the days when TPUers actually helped one another with tech issues (we still do), but everything just turns into an argument fest... seems like nearly every forum I click on is like this.

Tpuer - Shutup fools, my opinion matters more then yours!!:shadedshu

Tpuer - No it doesn't, it doesn't even make sense, my opinion is more better!

BigMack70 - Both of you idiots shutup...clearly mines more important then anyone on this site!!!!

Erocker - All you idiots stay on topic or Ima give yo ass an infraction....regards tpu moderator 

Wizzard - The review for the new AMD HD10970 16GB is up foolz...check it out 

BigMack70 - The HD10970 is shit! Can't even play Battlefield 6 on max settings......Nvidia GTX 980 ftw noobs!!!

*Lionheart* - 

^ This thread in a nutshell 

Anyways that's my retarded rant for today....

About the PS4: TBH I'm really looking forward to it, for exclusive titles but mainly so next-gen consoles can finally have DX11 support, and therefore we will start to see some better-quality PC games instead of shitty ports.


----------



## BigMack70 (Nov 2, 2012)

1d10t said:


> irrelevant to what topic?im not arguing about "how it render" but "how it show".



You posted downscaled pictures which have no relevance to anything in this topic and which definitely have no relevance to "how it renders" or "how it shows" because you're doing something completely opposite - rather than upscaling the low resolution images, you're downscaling the high resolution one (though you, for good measure, downscaled them all ). 

If you want to compare upscaled with real resolution, then you need to post two pictures - one  with dimensions 1080p which was originally rendered at 720p and then upscaled, and then that same picture with dimensions 1080p but which was natively rendered at 1080p. Alternatively, if you have a 1080p monitor, you could just look at a picture in 720p zoomed in to full screen vs that same picture at 1080p with no zooming in.

There's a big difference.

I don't think you know what you're talking about.
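The round trip described above can even be simulated. A toy sketch (mine, not from any post; tiny grids stand in for 1080p/720p frames) showing that detail rendered away at low resolution cannot be recovered by upscaling:

```python
# Toy illustration of why upscaled 720p can't match native 1080p:
# once detail is lost at the lower render resolution, no upscaler
# can bring it back.

def downscale_2x(img):
    """Average each 2x2 block (i.e. 'render at low res')."""
    h, w = len(img), len(img[0])
    return [[sum(img[2 * y + dy][2 * x + dx] for dy in (0, 1) for dx in (0, 1)) // 4
             for x in range(w // 2)] for y in range(h // 2)]

def upscale_2x(img):
    """Nearest-neighbour upscale back to 'native' size."""
    return [[img[y // 2][x // 2] for x in range(2 * len(img[0]))]
            for y in range(2 * len(img))]

# "Native" frame: the finest possible detail, a 1-pixel checkerboard.
native = [[(x + y) % 2 * 255 for x in range(8)] for y in range(8)]

upscaled = upscale_2x(downscale_2x(native))

diff = sum(native[y][x] != upscaled[y][x] for y in range(8) for x in range(8))
print(f"{diff}/64 pixels differ after the round trip")  # 64/64
```

In this worst case every single pixel comes back wrong (the checkerboard averages to flat grey), which is exactly the fine detail, text edges, and foliage shimmer that upscaling smears in real games.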


----------



## esrever (Nov 2, 2012)

*The PS3 has a 7900M...*

The A10 APU is at least 8x the rendering performance of the current PS3 hardware...
384 SPs vs the 16 in the 360 (which is comparable to the PS3) = 24x the general shading performance. I don't see how there would be a problem rendering only 4x as many pixels...


----------



## BigMack70 (Nov 2, 2012)

esrever said:


> the A10 APU is at least 8x the rendering performance of the current ps3 hardware...
> 384SPs vs the 16 in the 360 which is comparable to the ps3 = 24x the general shading performance. I don't see how there would be a problem rendering only 4x as many pixels...



We'll see before too long...


----------



## esrever (Nov 2, 2012)

BigMack70 said:


> We'll see before too long...



I am pretty sure the end resulting console will have different memory management and might even have dedicated on die ram so I wouldn't be surprised if it can do 1080p 60fps average with 4x AA.


----------



## Benetanegia (Nov 2, 2012)

esrever said:


> 384SPs vs the 16 in the 360 which is comparable to the ps3 = 24x the general shading performance. I don't see how there would be a problem rendering only 4x as many pixels...



Apples to oranges. Different ways of doing things. But in general:

- The 360 had *48* SPs, not 16.
- Those SPs had 5 ALUs each. Similar, but not 100% equal, to VLIW5.
- So while it's still kind of apples to oranges, this is FAR more accurate than what you posted:

XB360 = 48 x 5 = 240 "SP" (the HD2900/3800 had 320 SP; it actually had 64 VLIW5 SPs)
APU = 96 x 4 = 384 SP

As you can see, not 8x faster, and not much faster than 7-year-old cards.

The PS3 suffers from the same BAD way of comparing things. RSX had 24 pixel shaders but also 8 vertex shaders, so that "equals" 32 unified Nvidia SPs, but again, as with Xenos, those could do up to 5 ops/cycle vs 2 ops/cycle in current unified shaders. So it's more like 32 x 2.5 = 80 "SP", versus the 128 SPs in the 8800.
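Putting those unit counts into one place (the "ops per unit" factors are the approximations from the post above, so treat the results as ballpark figures, not benchmarks):

```python
# Rough effective-throughput comparison from the quoted unit counts.

xb360_sp = 48 * 5         # Xenos: 48 shaders x 5 ALUs ≈ 240 "SP"
apu_sp   = 96 * 4         # Trinity: 96 VLIW4 units x 4 ALUs = 384 SP
rsx_sp   = int(32 * 2.5)  # RSX: ~32 unified-equivalents x 2.5 ops ≈ 80 "SP"
g80_sp   = 128            # 8800 GTX, for the PC comparison

print(f"APU vs XB360: {apu_sp / xb360_sp:.1f}x")  # 1.6x
print(f"8800 vs RSX : {g80_sp / rsx_sp:.1f}x")    # 1.6x
```

On this (admittedly crude) accounting the APU is about 1.6x the 360's shader throughput, nowhere near the 8x or 24x figures, which is the whole point of the post.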


----------



## alwayssts (Nov 2, 2012)

Been saying this would be the case... comparable to 1080p @ 60 fps in contemporary PC titles at launch.  

Between the nexbox and the PS4, I've long felt the dev kits were something like 256 + 1024 (salvage Trinity + 7850-ish GPU), eventually moving to 384 + 896 SP (28nm fully-working Trinity + an example of what an 8770 could be). There are other possible configs, of course, but something like that.

Oh look, the final dev kit comes when the 8000 series launches... who would've thought? Wonder why?  

896 SP/16 ROPs on a 128-bit bus could run 950/6000 very efficiently, for example, and be very, very close to the average potency of a 7850, which on average is going to net you about ~45 fps at 1080p. With the APU adding perhaps ~40% more resources, you're very much at the 1080p60 level for most titles... assuming they find a way to make them operate seamlessly.
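A quick sanity check of that "APU adds ~40% more resources" guess, in single-precision GFLOPS. The clocks here are my assumptions (a 7850-class part at 950 MHz, the Trinity GPU at 800 MHz), not figures from the post:

```python
# Hypothetical FLOPS budget for the speculated 896-SP discrete GPU
# plus the A10's 384-SP graphics core.

def gflops(shaders, mhz, ops_per_clock=2):
    # 2 FLOPs per SP per clock (multiply-add) for these architectures
    return shaders * ops_per_clock * mhz / 1000

discrete = gflops(896, 950)  # assumed 8770-like discrete GPU
apu      = gflops(384, 800)  # A10 "Trinity" graphics core (assumed clock)

print(f"discrete: {discrete:.0f} GFLOPS, APU: {apu:.0f} GFLOPS")
print(f"APU adds ~{100 * apu / discrete:.0f}% on top")  # ~36%
```

Under those assumed clocks the APU contributes roughly a third extra on top of the discrete part, in the same ballpark as the ~40% figure above, ignoring the very real question of whether the two can actually share work efficiently.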


----------



## Binge (Nov 2, 2012)

The number of completely uneducated and uninformed individuals making statements other than "Wow! I wonder how this will work out" is staggering. The people I'm referring to know who they are.

The hardware designed for consoles is specialized and commissioned to suit a specific workload. They are min-maxed gaming machines, using the absolute lowest common denominator of hardware to achieve optimized results. Whatever they select may be based on an existing architecture, but it will most certainly have its own specialized CPUs and GPUs suited to the console's intended purpose. A target BASELINE performance of 1920x1080 at 60 Hz is not impressive to us PC users, but this is designed for TV play. With that limitation in mind, the system can use hardware to drive those pixels with the utmost efficiency. This actually allows less powerful hardware to achieve better results than it would in a PC environment... so for the most part this won't evolve the console market past TV-HD, and honestly it may be a short-lived bump.

The console makers are able to supply the masses with something cheap, which drives a lot of game sales; keeping the console market strong has got to be a balancing act for these industry giants. I bet the cost per console will be about $150 but they will sell for about $400, something they weren't able to achieve with the 360 or PS3. For once, making a console might net them a profit.


----------



## 1d10t (Nov 2, 2012)

BigMack70 said:


> You posted downscaled pictures which have no relevance to anything in this topic and which definitely have no relevance to "how it renders" or "how it shows" because you're doing something completely opposite - rather than upscaling the low resolution images, you're downscaling the high resolution one (though you, for good measure, downscaled them all ).
> 
> If you want to compare upscaled with real resolution, then you need to post two pictures - one  with dimensions 1080p which was originally rendered at 720p and then upscaled, and then that same picture with dimensions 1080p but which was natively rendered at 1080p. Alternatively, if you have a 1080p monitor, you could just look at a picture in 720p zoomed in to full screen vs that same picture at 1080p with no zooming in.
> 
> ...



I never posted any downscaled images. I quoted them from another site and showed the source. The next post was a camera capture showing the 1080p/60Hz info on the TV, to counter your opinion about the PS3's lack of 1080p capabilities. You suggested a proper method for comparing the two; you may quote any of my last posts: did I ever mention "rendered"?

If you don't have all three of them, PS3, Xbox and PC, why argue?


----------



## L|NK|N (Nov 2, 2012)

Hmm. What I gather from this thread is a bunch of arguing over gfx capability and or processing/horsepower. In my opinion hardware does not mean a thing if the game(s) made for that hardware suck balls. No wonder developers push games with graphics over gameplay. Makes me want to dust off the old NES, SNES, or PS1 and relish the glory days of gaming. 

On Topic: I think people will be surprised with the results of these APUs in the coming consoles.


----------



## esrever (Nov 2, 2012)

Benetanegia said:


> Apples to oranges. Different ways of doing things. But in general:
> 
> - The 360 had *48* SP not 16.
> - These SPs had 5 ALUs. Similar but not 100% equal to VLIW5.
> ...


I guess I remembered wrong, but that is not correct either. The inefficiencies in the original R600 designs made its performance extremely low. The Trinity A10 is exactly 1/4 of a 6970, and the 360 performs like a 2600XT. 
The 2600XT gets 933 in 3DMark Vantage; the 6970 gets 21k. If Trinity weren't bandwidth-constrained it would get more than 5k, which is almost a 6x performance increase over that outdated and inefficient design. 8x the performance is what Sony quotes, I think, which, given Moore's law, is very easily done.
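The arithmetic behind that post, using only the Vantage scores quoted in it (the 2600XT as a stand-in for the 360's GPU is the post's own assumption, not mine):

```python
# Checking the quoted 3DMark Vantage ratios: 2600XT ≈ 933,
# HD 6970 ≈ 21000, Trinity ≈ a quarter of a 6970.

xenos_proxy = 933          # 2600XT standing in for the 360's GPU
hd6970      = 21000
trinity_est = hd6970 / 4   # "exactly 1/4 of a 6970"

speedup = trinity_est / xenos_proxy
print(f"Trinity ≈ {speedup:.1f}x the 2600XT proxy")  # ~5.6x
```

So the numbers as quoted give roughly 5.6x, a bit short of the "more than 5k / almost 6x" framing and well short of an 8x claim, assuming synthetic scores scale like game performance at all, which they often don't.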


----------



## BigMack70 (Nov 2, 2012)

1d10t said:


> I never posted any downscaled image.I quote from another site and show the source.Next post is camera capture shows 1080p60Hz on TV info to countermeasure your opinion regarding lack of 1080p capabilities on PS3.You suggesting a proper method for comparing between these two,you may quote any of my last post,did i mention "rendered"?
> 
> If you didn't have three of them,PS3 XBox and PC,why arguing?



I already addressed your claims of the PS3 rendering at 1080p. It's nonsense. Go look at the link I posted that shows you the resolution PS3 renders various games in. 99.9% of them are upscaled 720p.

And I know that those images were from another site. They're all heavily downscaled, even on the original site. Not 1080p. Therefore irrelevant.

I already told you what you need to do if you want to compare real vs upscaled resolution. I don't see you doing it...

Like I said, you don't seem to know what you're talking about.


----------



## Frick (Nov 2, 2012)

lyndonguitar said:


> thing is, 1080p is shit, now that they got 4k Monitors,



No it isn't. Stupid argument.


----------



## BigMack70 (Nov 2, 2012)

Here 1d10t, let me do your job for you and we'll take a look at 720p upscaled vs 1080p...

Here's an example of a native 1080p screenshot:






And here's that same screenshot upscaled from 720p to 1080p:





HUGE difference.


----------



## Dent1 (Nov 2, 2012)

BigMack70 said:


> Read the thread? Maybe posts like #127, 130, and especially 134.



So 3 posts out of 189, you are still losing


----------



## Ravenas (Nov 2, 2012)

TheMailMan78 said:


> If I were you I would be mad also about the Linux thing.  However with that being said you are not the average user. Most people are very happy with the PS3.
> 
> 
> 
> No. It would need to work 4x more efficient. Something that's been done over the past 7 YEARS.



Agreed. The PS3 is a very good all around entertainment system. Nothing Sony has done has "screwed" any of their customers in respect to the PS3.


----------



## BigMack70 (Nov 2, 2012)

Dent1 said:


> So 3 posts out of 189, you are still losing



Not on that particular argument... maybe read the thread? Notice anyone still trying to argue that 1080p60 isn't 4x as demanding as 720p30? Your original post just reflects a lack of reading comprehension, as even Phenom realized that my argument was correct.

As for rather the PS4 will be doing 1080p60, that's anyone's guess. Some think no, myself among them, others think yes. No way to know who's right until the hardware is released.


----------



## TheGuruStud (Nov 2, 2012)

I vote that a mod deletes everything except the screenshots lol


----------



## 3870x2 (Nov 2, 2012)

LiNKiN said:


> Hmm. What I gather from this thread is a bunch of arguing over gfx capability and or processing/horsepower. In my opinion hardware does not mean a thing if the game(s) made for that hardware suck balls. No wonder developers push games with graphics over gameplay. Makes me want to dust off the old NES, SNES, or PS1 and relish the glory days of gaming.
> 
> On Topic: I think people will be surprised with the results of these APUs in the coming consoles.



Can't disagree with you there.


----------



## 1d10t (Nov 2, 2012)

BigMack70 said:


> I already addressed your claims of the PS3 *rendering* at 1080p. It's nonsense. Go look at the link I posted that shows you the resolution PS3 renders various games in. 99.9% of them are upscaled 720p



*sigh*
How many times do I have to say this...

Read my posts #1, #2, #3, #4. I never mentioned that word "rendering" of yours. Read carefully: I argued about how it shows on the TV... nonetheless.



> And I know that those images were from another site. They're all heavily downscaled, even on the original site. Not 1080p. Therefore irrelevant.



The images were provided to show the differences between these 3 platforms.



> I already told you what you need to do if you want to compare real vs upscaled resolution. I don't see you doing it...
> 
> Like I said, you don't seem to know what you're talking about.



Great, now I'm forced to do something. Maybe you know more, sir, but that doesn't justify your attitude towards another member.



BigMack70 said:


> Here 1d10t, let me do your job for you and we'll take a look at 720p upscaled vs 1080p...
> 
> Here's an example of a native 1080p screenshot:
> http://images.eurogamer.net/articles//a/1/3/3/2/5/9/2/1080p1_pc.jpg.jpg
> ...



Okay, what differences? Image quality? Jaggies? Colour?


----------



## alwayssts (Nov 2, 2012)

Benetanegia said:


> Apples to oranges. Different ways of doing things. But in general:
> 
> - The 360 had *48* SP not 16.
> - These SPs had 5 ALUs. Similar but not 100% equal to VLIW5.
> ...




Ah yes... Xenos. The chip with an average texture ratio of about 15 shader units : 1 TMU... more optimal than even the 16:1 or 14:1 shader-to-TMU ratios used today (a big reason Radeons get crapped on for texturing). Odd, since it has been that way since the dawn of DX10... and this preceded that. Sometimes you wonder about those 'experimental' chips like Xenos that were kind of crazy brilliant. Why give it unneeded ROPs? Why give it unneeded bandwidth? Why give it unbalanced texture ability? Then... they create R600. WTF.

When you see the ratios finely and finally perfected in Kepler: 14 shading + SFU : 1 TMU (slight overkill on texturing, but better than under, like AMD), an optimal number of smaller SFUs compared to true shaders (32:192... aka perfect; no idea if it's an overall better use of space than AMD doing it in-shader), or the total number of shader/SFU units per array corresponding with ROPs (224 total units... around 230 is the sweet spot for scaling with 4 ROPs... pretty much perfect given how an array has to be set up)... it really makes you wonder where all the mad scientists at ATI went. They used to own that turf... now all they have is shader density/flexibility, which granted helps a metric ton, but still... a carryover from years and years ago.

I remember when I used to discuss this stuff on B3D... those were the days.


----------



## BigMack70 (Nov 2, 2012)

@1d10t

Ok point taken about what the PS3 "shows". It "shows" a 1080p image that is upscaled from a 720p source (when in game).

But you do realize that the screenshots you posted were all downscaled below what "shows" on the TV, and therefore are irrelevant, right?

If you can't see a difference between the two screens I posted, you probably need glasses.


----------



## 3870x2 (Nov 2, 2012)

BigMack70 said:


> Not on that particular argument... maybe read the thread? Notice anyone still trying to argue that 1080p60 isn't 4x as demanding as 720p30? Your original post just reflects a lack of reading comprehension as even Phenom realized that my argument was correct.



You ever argue with a brick wall? We all just did. Just when we thought we were going to convince you, we realized that a brick wall is an inanimate object.


----------



## OneCool (Nov 2, 2012)

I hope Sony wises the F up and does something about its DRM.

Cinavia makes me want to smash my PS3 sometimes.


----------



## BigMack70 (Nov 2, 2012)

3870x2 said:


> You ever argue with a brick wall? we all just did.  Just when we thought we were going to convince you, we realized that a brick wall is an inanimate object.



So you argue that 720p --> 1080p (2.2x) along with 30fps --> 60fps (2x) somehow yields something other than a ~4x increase in computational difficulty?

I don't know why I keep having to state the obvious: it's basic math. In the case of this argument, I am not and never was talking about PS3 vs PS4 or 2006 vs 2012 hardware. I'm making a picky point about basic kindergarten level math.

I'm absolutely amazed so many people have failed to understand that. :shadedshu


----------



## Binge (Nov 2, 2012)

Yup... completely ignored. Why are the non-engineers/programmers arguing over specs which may not even appear in the end system? These chips aren't the same ones you can buy for PCs, and they have less to do with performance than how well the software is written for the platform...

One of the best examples of this is ICO for the PS2. The way they got that game to look the way it did on that system was genius, and so specific that the team which remade the game in HD had to re-code the engine to support more resolutions/objects displayable on screen.


----------



## repman244 (Nov 2, 2012)

This thread is just....


----------



## Binge (Nov 2, 2012)

repman244 said:


> This thread is just....
> 
> http://i46.tinypic.com/52b7k3.gif



I love that show.


----------



## ChidoriHV (Nov 2, 2012)

If the next-gen consoles could run Battlefield 3 Ultra-settings-style games @1080p at 60 FPS average and a minimum of 30 FPS, then I would be a very happy man. Albeit BF3 at Ultra settings at 30-40 FPS is pretty unplayable. About the human eye not noticing past 30 FPS... that's really bull... I can see the ghosting difference between a 2 and 5 ms screen. You don't focus on those things all the time, but when you notice them... they annoy the hell out of you.

- Actually, the worst thing on the PS3 and Xbox 360 right now is that when you run games at 720p/1080p, the more graphically intense games have a pretty noticeable stutter/lag in them.


----------



## Lionheart (Nov 2, 2012)

repman244 said:


> This thread is just....
> 
> http://i46.tinypic.com/52b7k3.gif



hahahahaha


----------



## 1nf3rn0x (Nov 2, 2012)

The Reality Synthesizer is a modified GeForce 7800. It has a core clock of 550 MHz, 24 pixel-shader pipes each capable of 27 flops/cycle, and 8 vertex pipes capable of 10 flops/cycle, for a total FLOPS performance of
550 MHz x (24x27 + 8x10) = 550 x 728 = about 400 gigaflops.

This is the gpu from a Trinity A10-5800K APU






Compare the pair. With 7 years of optimization, more efficient game engines, and most likely more video memory, RAM and processing power, in total more powerful hardware, full 1080p @ 60fps seems easily done to me.
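For what it's worth, the ~400 gigaflops figure quoted above is easy to reproduce; a quick back-of-the-envelope script (the pipe counts and flops/cycle values are the post's own estimates, not official Sony numbers):

```python
# Back-of-the-envelope peak-FLOPS estimate for the PS3's RSX,
# using the figures from the post above (not official numbers).
clock_mhz = 550
pixel_flops = 24 * 27    # 24 pixel-shader pipes, 27 flops/cycle each
vertex_flops = 8 * 10    # 8 vertex pipes, 10 flops/cycle each
total_per_cycle = pixel_flops + vertex_flops  # 728 flops per cycle
gflops = clock_mhz * total_per_cycle / 1000   # MHz * flops/cycle -> GFLOPS
print(gflops)  # 400.4
```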


----------



## v12dock (Nov 2, 2012)

So a PS4 emulator will be very doable


----------



## 1nf3rn0x (Nov 2, 2012)

v12dock said:


> So a PS4 emulator will be very doable



Emulators are made by indie developers, hence why we can only just now emulate PS2 games at full speed. The amount of coding a single person or a few people can do, plus the optimizations needed for a completely different platform than the console, make it pretty hard. No need to be sarcastic.


----------



## Binge (Nov 2, 2012)

v12dock said:


> So a PS4 emulator will be very doable



No, no it will not.  The hardware will be LIKE what we have in PCs, but in reality the specialty chips they will make will behave much differently and will be addressed differently in the system.


----------



## Rowsol (Nov 2, 2012)

Good for AMD.  If 2 gpus are going to be powering the games it should make sli/crossfire that much easier to code for when we get the ports.  I for one can't wait for the new consoles, even though I won't be buying one.  Going to be another year though... pfft


----------



## Steevo (Nov 2, 2012)

There were some people who made a "lite" version of XP a few years ago that was bootable, it made only mild differences in the performance of the games and applications they ran and tried. 


The difference here is the lack of other configurations. Optimize for one machine, with one set of known instructions. The difference is at the software level of the application, much like a patch can improve frame rates, and a GPU driver can increase performance.


----------



## Delta6326 (Nov 2, 2012)

I don't know about you people, but when I play games I spend 90% of my time with the story and gameplay and couldn't care less about visual quality. Remember the days of the NES and N64, when it was actually fun to play and you didn't just spend your whole time staring at a screen and saying "OH OH look, I found a rock that's not properly rendered!! Let me go online and thread-crap the internet over it."


----------



## Benetanegia (Nov 2, 2012)

esrever said:


> Guess I remembered wrong, but that is not correct either. The inefficiencies in the original R600 design made the performance extremely low. The Trinity A10 is exactly 1/4 of a 6970; the 360 performs like a 2600XT.
> The 2600XT gets 933 in 3DMark Vantage, the 6970 gets 21k. If Trinity wasn't bandwidth-constrained it would get more than 5k, which is almost a 6x performance increase. This is from an outdated and inefficient software system. 8x the performance is what Sony quotes, I think. Which, given Moore's law, is very easily done.



The XB360 GPU is much more capable than the 2600XT. A hell of a lot more capable, and so is RSX. The 2600XT was 1/4 of a HD2900, while Xenos was more like 2/3. Your 3DMark numbers are irrelevant and unrelated. The Trinity GPU is not much more capable than the old HD2900 and by extension the Xenos GPU. And by this I mean generationally speaking; yes, it's maybe 4x, maybe 6x faster in some regards, but nearly any dedicated GPU is 10x or 20x faster. APUs are also much more limited by memory bandwidth in their PC form (which makes them much, much slower than the HD6570, the chip that can be said to be 1/4 of a HD6970), but that's something I expect them to solve in the PS4.

All in all it's incredibly irrelevant whether it's 4x-6x or 8x faster; it won't do new games at a steady 1080p@60fps unless it's at least one order of magnitude faster. The same games as the PS3 (old games)? Yes, of course, but what's the point? If a resolution upgrade is the only thing the new consoles are going to offer, I don't think anyone's going to pay so much for it, unless the consoles sell for <$200. I mean, yeah, people are stupid, but if after 7 years of dismissing it they suddenly awaken to the fact that true 1080p is much better and decide they now need to shell out $400 just for that upgrade... I swear I'll visit them one by one and punch them in the face. No kidding.



Delta6326 said:


> I don't know about you people. But when I play games I spend 90% of my time with the story and gameplay and could care less about visual quality. Remember the days of NES and 64 when it was actually fun to play and you didn't just spend your whole time staring at a screen and saying OH OH look I found a rock that's not properly rendered!! Let me go online and thread crap the internet over it.



I remember those days. They sucked bad. I spent whole days knowing how much better my uncle's PC games were compared to the crap I was forced to play because I was a child.

Better graphics don't make games worse. There are games with bad graphics that suck too, etc. One thing does not exclude the other. Better graphics mean more immersion, and in some games that contributes and makes them 100x better. E.g. Metro 2033 with worse graphics would not have had the same atmosphere and wouldn't have been so immersive.


----------



## Thefumigator (Nov 3, 2012)

BigMack70 said:


> Didn't expect to see a "human eye can only see [x] fps" troll in this thread
> 
> I'm not even gonna get into that nonsense...



First of all, I'm not trolling. Second, what I said is correct, and it's not something to take lightly. But of course I made the mistake of applying it to gaming, and it really is another story than movies. (I am a professional video editor, not a gamer at all.)

Just to polish my argument: take a Blu-ray movie; if it's NTSC it runs at 30fps. You don't see any movie losing frames at that framerate. Even a computer-animated movie like Toy Story, unless the disc gets dirty or something and the thing begins stuttering. Of course, Toy Story was smoothly rendered prior to its final release, not rendered in real time. This makes a difference in games; talking with my colleagues about it, it may happen that motion blur and other things don't look good when rendering at 30fps, and more frames are indeed needed.

Also, in a real-time rendered scene, a colleague of mine demoed to me a 3D scene where some elements were rendered at a different pace. It may happen, depending on the developer, that some elements are rendered at faster framerates than others in the same scene. So of course, a video card that reaches 120 frames per second may be able to overcome this issue very easily.


----------



## semantics (Nov 3, 2012)

ChidoriHV said:


> If the next gen consoles could run Battlefield 3 Ultra settings style games @1080p at 60 FPS average and min 30FPS, then i would be a very happy man. Albeit BF3 Ultra setting at 30-40FPS is pretty unplayable. About human eye not noticing past 30FPS... thats really bull... i can see the ghosting difference between a 2 and 5ms screen. You don't focus on those things all the time, but when you notice them... they annoy the hell out of you.
> 
> - Accually the worst thing on PS3 and Xbox360 right now is that when you run games at 720P/1080P - The more graphically intense games have a pretty noticeable stutter/lag in them.



You do know ghosting and FPS have nothing to do with each other, and just because a monitor is rated 2 ms or 5 ms GtG doesn't mean one 2 ms panel looks the same as another 2 ms GtG panel. Ghosting has to do with the monitor's ability to change from one color to another, or from on to off; when it's slower, ghosting occurs, where you get sort of a bleeding of the previous frames. FPS in a game is independent of that.

I'd also point out that in 1 second (1000 ms), 30 frames equates to 33.3 ms per frame, and 60 fps results in 16.6 ms per frame; to get down to your 2 ms it would take 500 fps. And if you believe you can actually see the difference between 2 and 5 ms, you're delusional; that would mean you can see a hummingbird's wings flap flawlessly. The bleeding effect in ghosting varies by monitor, lasting as long as 100 ms+, and in early monitors as long as 300 ms+. 2 ms GtG is not a good measure of how the overall monitor handles a picture; it insinuates the panel is faster at turning pixels on and off and changing their color, but it's not 100% flawless.
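To put the frame-time numbers above in code form (simple arithmetic, nothing monitor-specific):

```python
# Frame time in milliseconds for a given frame rate.
# Shows why a 2 ms GtG rating sits far below per-frame timescales:
# you'd need ~500 fps before each frame lasted only 2 ms.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

print(round(frame_time_ms(30), 1))   # 33.3 ms per frame at 30 fps
print(round(frame_time_ms(60), 1))   # 16.7 ms per frame at 60 fps
print(frame_time_ms(500))            # 2.0 ms per frame at 500 fps
```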


----------



## Xzibit (Nov 3, 2012)

Am I the only one that read the 3rd paragraph of that linked article



> There are to be four versions of the dev kit, we were told. A previous version was essentially just a graphics card. The version shipping now is a “modified PC,” and the third version, appearing in January, will be close to final spec. A final version will be delivered to developers “next summer”.



Dev kit will change 2 more times.

So how many pages is the thread for the first dev kit at?

Can't wait for the next 2 dev kit threads.


----------



## BigMack70 (Nov 3, 2012)

Thefumigator said:


> First of all, I'm not trolling. Second, what I said is correct, and its not something to take as lightly. But of course I made a mistake of applying it to gaming, and it really is another story that movies. (As I am a professional video editor, and not a gamer, at all)
> 
> Just to polish my argument, take a bluray movie, if its NTSC it runs at 30fps. You don't see any movie losing frames at that framerate. Even a computer animated movie like toy story, unless the disc gets dirty or something and the thing begins stuttering. Of course, toy story was smoothly rendered prior its final release, not real time rendering. This makes difference in games, talking with my colegues about it, it may happen that motion blur and other things don't look good when rendering at 30fps and more frames are needed indeed.
> 
> Also, in a real time rendering scene, a colegue of mine demoed to me a 3D scene where some elements were rendered at a different pace. Yes it may happen, depending on the developer of course, that some elements could be rendered on faster framerates than others, on the same scene. So of course, a videocard that reaches 120frames per second may be able to overcome this issue very easily.



Oh no... the only thing worse than the "human eye can't see more than [x] fps" troll is the "human eye can't see more than [x] fps" person who seriously believes that argument.

It's bogus. It's been debunked a billion times and has no credibility at all. It's not worth going into. Like I said, if you think the human eye can't see more than 25-30fps, go get a 120 Hz monitor, set its refresh rate to 30 Hz, 60 Hz, and 120 Hz respectively, and drag some windows around the desktop. You tell me if you see a difference.

:shadedshu


----------



## EpicShweetness (Nov 3, 2012)

WOULD YOU ALL JUST SHUT THE FUCK UP! ESPECIALLY YOU BIGMACK70, HOLY SHIT YOU'RE SUCH A CHILD!! I went to go do my work for the day, and after a simple post I come back and you've argued over resolution with some kids FOR OVER 9 GOD DAMN PAGES!! LET IT GO! SHIT!!!


----------



## KainXS (Nov 3, 2012)

bigmack, . . . . . . . that name . . . . . . . sounds so familiar  . . . . . . and you act just like that guy did . . . . . . .. :shadedshu


----------



## BigMack70 (Nov 3, 2012)

EpicShweetness said:


> WOULD JUST ALL SHUT THE FUCK UP! ESPECIALLY YOU BIGMACK70 HOLY SHIT YOUR SUCH A CHILD!! I went to go do my work for the day and after a simple post I come back you've argued over resolution with some kids OVER 9 GOD DAMN PAGES!! LET IT GO SHIT!!!



I think you're taking the internet too seriously


----------



## Nihilus (Nov 3, 2012)

*Newegg selling PS4s for $438*

In Win BP655.200BL Black Steel Mini-ITX Desktop Computer Case, 200W Power Supply - $44.99

Western Digital WD Blue WD5000AAKX 500GB 7200 RPM SATA 6.0Gb/s 3.5" Internal Hard Drive (Bare Drive) - $69.99

G.SKILL Ares Series 8GB (2 x 4GB) 240-Pin DDR3 2133 (PC3 17000) Desktop Memory, Model F3-2133C9D-8GAB - $60.99

ASUS Black Blu-ray Drive SATA, Model BC-12B1ST/BLK/B/AS - $54.99

AMD A10-5800K Trinity 3.8GHz (4.2GHz Turbo) Socket FM2 100W Quad-Core APU (Radeon HD 7660D) + ASRock FM2A75M-ITX FM2 AMD A75 Mini ITX Motherboard (combo) - $207.98

Subtotal: $438.94
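And yes, the subtotal adds up; a quick tally (prices exactly as listed in the post):

```python
# Tally of the Newegg "PS4" parts list above (prices as posted).
prices = [
    44.99,   # In Win BP655 Mini-ITX case + 200W PSU
    69.99,   # WD Blue 500GB HDD
    60.99,   # G.SKILL Ares 8GB DDR3-2133
    54.99,   # ASUS Blu-ray drive
    207.98,  # A10-5800K + ASRock FM2A75M-ITX combo
]
subtotal = round(sum(prices), 2)
print(subtotal)  # 438.94
```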


----------



## BigMack70 (Nov 3, 2012)

KainXS said:


> bigmack, . . . . . . . that name . . . . . . . sounds so familiar  . . . . . . and you act just like that guy did . . . . . . .. :shadedshu



Huh?


----------



## Deadlyraver (Nov 3, 2012)

I'm just happy that the console generation, after so many years, has picked up in graphics capability.


----------



## Super XP (Nov 3, 2012)

The only issue with these new consoles is that most of our 3+ year old PCs are superior to them, so they will once again be aged at release.

But of course it's a hell of a lot better than still mucking around with the current ones, which have been dragging down gaming quality IMO.


----------



## 1d10t (Nov 3, 2012)

BigMack70 said:


> @1d10t
> 
> Ok point taken about what the PS3 "shows". It "shows" a 1080p image that is upscaled from a 720p source (when in game).
> 
> ...



Yes sir, I hope you're happy now.
I can't see the difference yet; I'm browsing TPU from a 4.3" HTC Sensation.

Last word: please enjoy any future game ported from console.

================================

A lot of haters. No, seriously.
When TPU staff write an article about AMD overclocking, Intel fanboys swarm in spreading flamebait. Someone posts news about Apple, and all the Apple haters arise. Now the best part: even news about an upcoming console gets raided by self-proclaimed experts claiming to know better about the hardware's capabilities, even though they've never owned nor played any console.
So if I use Intel, don't use Apple and have never touched a console, does that make me rich and smart?


----------



## HossHuge (Nov 3, 2012)

Everybody chill out and remember we are all here for the same reason.  We all have an interest in all things Computer/Gaming. 

Can we just agree that this will make gaming better finally for PC and console gamers?

Hot chicks always help!!






Has Sony given up on any sort of motion control device?


----------



## Dent1 (Nov 3, 2012)

BigMack70 said:


> Not on that particular argument... maybe read the thread? Notice anyone still trying to argue that 1080p60 isn't 4x as demanding as 720p30? Your original post just reflects a lack of reading comprehension as even Phenom realized that my argument was correct.



It isn't 4x as demanding. What is your point?

Either show mathematical proof or a link to a source, or sit down.

You can't claim stuff with no evidence and then say you're right.


----------



## Rei86 (Nov 3, 2012)

Deadlyraver said:


> I'm just happy that the console generation, after so many years, has picked up in graphics capability.



Wut?  

Every console generation notches up in graphics capability wtf are you on about?


----------



## LAN_deRf_HA (Nov 3, 2012)

About the emulator thing, basically aren't kits like this emulators? Don't all these companies have fully functional emulators for their consoles/handhelds? Seems like someone could just steal the software and have most of the hard work done.


----------



## sergionography (Nov 3, 2012)

LAN_deRf_HA said:


> About the emulator thing, basically aren't kits like this emulators? Don't all these companies have fully functional emulators for their consoles/handhelds? Seems like someone could just steal the software and have most of the hard work done.



Yes, I said that in a previous comment. The final version might be much beefier, with enhanced cores and graphics; this could be just for starters, so developers can get the general idea. While they design and optimize for this architecture, the finalized version might end up much more enhanced. Not to mention it will definitely not be a Trinity chip but a derivative: think of all the PCI Express lanes in Trinity, and the dedicated hardware for specialized tasks that isn't needed in consoles; all that will be removed.


----------



## Rei86 (Nov 3, 2012)

LAN_deRf_HA said:


> About the emulator thing, basically aren't kits like this emulators? Don't all these companies have fully functional emulators for their consoles/handhelds? Seems like someone could just steal the software and have most of the hard work done.



Development kits aren't emulators


----------



## Alvy Ibn Feroz (Nov 3, 2012)

HSA anyone?


----------



## btarunr (Nov 3, 2012)

BigMack70 said:


> I find it hard to believe that they're claiming 1080p 60fps possible on an APU based system.
> 
> I call BS. More probably 720p upscaled



A GPU made in 2006 is able to handle Crysis 2 (the Xbox 360 version of the game). I'd guess that with the right optimizations, an APU with an HD 7600-class discrete companion GPU should run games at 1080p like butter.


----------



## Frizz (Nov 3, 2012)

Well, I'm just hoping they won't have as many framerate issues and as much screen tearing this time around. At the end of the day it's still a console, so with new game engines coming out and graphics technology like tessellation etc., I highly doubt it will be able to do 60fps at 1080p with ALL the possible eye candy a game dev can dish out; when consoles change, so do games.


----------



## cdawall (Nov 3, 2012)

BigMack70 said:


> Not on that particular argument... maybe read the thread? Notice anyone still trying to argue that 1080p60 isn't 4x as demanding as 720p30? Your original post just reflects a lack of reading comprehension as even Phenom realized that my argument was correct.
> 
> As for rather the PS4 will be doing 1080p60, that's anyone's guess. Some think no, myself among them, others think yes. No way to know who's right until the hardware is released.



I don't understand what you are getting so worked up about. You are comparing it directly to a PC way too much. These consoles lack the bloatware PCs have to start with. On top of that, we are talking about a midrange CrossFire setup that is purpose-built to push 1080p @ 60 FPS. Purpose-built means AMD is going to be handed a TDP and clockspeed they have to deliver to Sony. On top of that, purpose-built software designed to run on the PS4 means CrossFire scaling will be near perfect.

It can already deliver 60+ FPS in certain games as it sits now, without perfect scaling and purpose-built drivers/applications.




Why do you think that when settings get knocked down to low/medium it will not be able to push 60 FPS? Remember, this won't have 16x AA and AF forced on it. This is still a console and will be doing console things. One of those things is running 1080p@60FPS, or Sony simply won't approve the game until the coding is fixed.

Your argument is a waste of time. Look at the 7900 series card in the PS3: on the PC that card could not handle the games and settings it pushes in the PS3, yet the PS3 does it without sweating. :shadedshu Move on.


----------



## Over_Lord (Nov 3, 2012)

BigMack70 said:


> Lol w/e man... if you want to claim you've got a magic setup, go ahead. Arkham City, Sleeping Dogs, those are just two other recent examples of things that a single card doesn't do 60fps minimum framerate in (they don't even get close btw).
> 
> But hey maybe your card is a magic card that does far better than all the benchmarks say.



When all console games are 30fps (the most shitty-looking ones are 60fps... oh wait, COD), why are you going on about 60 fps?

I game on an HD 5850 even now, and above 40fps games are fluid. Occasional hiccups happen, but they happen with my droid too, and I don't complain day and night about it. It's perfectly usable.

Same way, it's perfectly playable.


----------



## Capitan Harlock (Nov 3, 2012)

A console, from my perspective, is only good for certain kinds of games, like sports games, hack-and-slash and platformers.
A PC is a much more capable multimedia system, so games made for consoles suck on PC; and that's if you think a PC doesn't have the power to handle a game with low-res textures and detail at low-medium.
Now, if Sony wants to talk like they know how to run a game at 1920 x 1080 with an APU and a discrete GPU, well, they don't know anything.
They want to keep making games where, if 60 fps is stable in one game but another doesn't run smoothly, they lower the details and texture resolution. In other words: take money from people who don't know anything about graphics (which is part of the entertainment and realism) and sell it to them as super great and advanced technology, when it's fake.


----------



## NC37 (Nov 3, 2012)

60 fps in a benchmark doesn't mean much. 60 fps in actual gameplay...that means a lot more. Stuff happens during gameplay. Benchmarks are usually a controlled flyby run. Gameplay has a lot more going on and is not as predictable. That is where Sony needs to nail the 60fps.

There are a lot of people who argue against dual-GPU setups: you really don't gain the performance benefit from it for the money... etc. Ya know, I finally tried it, and overall... I say it was a nice way to extend the life of my system. I certainly see the benefit. Not all titles work perfectly with it, but there are enough that do to make it worthwhile. The problem has always seemed to involve drivers or getting the game to utilize it. But now imagine a closed-hardware console system where devs can really design for it properly.

I think a dual-GPU system is genius, personally. We're gonna see devs able to better handle complex games than ever before. The question is... what bottleneck will be presented? Every console has had a bottleneck of some kind where, years later, it is the first thing people say the dev shouldn't have cheaped out on. Most of the time it is RAM or VRAM. Last gen continued that tradition: 512MB was not terrible for 2005, but it wasn't great. Most PCs were coming with 2-3GB and games were pushing into those ranges. But the problem with both the 360 and PS3 was not the size but the allocation. The PS3 cut things 50/50 with VRAM; the 360 shared with VRAM. Had the PS3 had more than 256MB of VRAM... I think we would have seen more games pushing 1080 on it. The 360 could do it, but still, resources were being pinched. A dual-GPU system usually means VRAM on both GPUs, so 2 sets of VRAM. Orbis may be the first system to get rid of the age-old RAM bottleneck. If programmed properly, the only thing holding it back eventually would be the hardware itself, but we'd continue to see better games for a lot longer than the 360/PS3 thanks to that dualie setup.


----------



## Aquinus (Nov 3, 2012)

You're all assuming performance will be similar to that of the A10-5800k. In reality we don't know how fast it is. If you read the article it says the following:



> PS4′s APU was described today as a “derivative” of existing A10 hardware. The hardware is “based on A10 system and base platform”.



This means it is very possible that the specs of this chip are different from the FM2 chips available now, and that Sony and AMD might have a special relationship here where the final product has a CPU much different from what is out now. Just a thought.


----------



## Recus (Nov 3, 2012)

HossHuge said:


> Can we just agree that this will make gaming better finally for PC and console gamers?



For graphics maybe, for gameplay no.


----------



## Frick (Nov 3, 2012)

Benetanegia said:


> I remember those days. They sucked bad. I spent all days knowing how much better my uncle's PC games were compared to the crap I was forced to play because I was a child.
> 
> Better graphics don't make games worse. There's games with bad graphics that suck too, etc. One thing does not exclude the other. Better graphics means more immersion and in some games that contributes and makes it 100x better. i.e. Metro 2033 with worse graphics would have not had the same atmosphere and wouldn't have been so immersing.



I shouldn't reply to this as we approach the subject from vastly different angles but I can't help myself: HERP DERP.


----------



## BigMack70 (Nov 3, 2012)

Dent1 said:


> It isnt 4x demanding, what is your point?
> 
> Either show mathematical proof or a  link source or sit down.
> 
> You can't claim stuff with no evidence then say you're right.



Mathematical proof (for like the 4th time):

(1920*1080)/(1280*720) = 2.25
60/30 = 2

Put them together... 2.25*2 = 4.5

Hence, 1080p60 is ~4x as demanding as 720p30. 

That's not a point about hardware. In fact, strictly speaking my point here has nothing to do with hardware. It's a point about math which is apparently too simple for many of the elite minds in this thread to grasp. :shadedshu
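The arithmetic is trivial to verify by just multiplying out pixel count and frame rate (a statement about raw pixel throughput only, not about how real GPUs scale):

```python
# Raw pixel-throughput ratio of 1080p60 vs 720p30.
# Pure arithmetic, not a claim about actual GPU scaling.
res_ratio = (1920 * 1080) / (1280 * 720)  # 2.25x the pixels
fps_ratio = 60 / 30                       # 2x the frames
print(res_ratio * fps_ratio)              # 4.5
```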



cdawall said:


> I don't understand what you are getting so worked up about



I'm worked up about kiddos who don't understand kindergarten math and reading comprehension. (I also got worked up earlier about people making crazy claims about what a single graphics card can do on the PC.)

It doesn't get me worked up that some people think this APU will be able to do 1080p60... I've said about a hundred times that your guess is as good as mine and we won't know till we actually see what the hardware can do. I'm skeptical and don't believe it. Others here aren't as skeptical. But it's not unreasonable either way. It depends a lot on how demanding you think the next generation of games is going to be, how well you think they can optimize for console rather than PC, and whether you read Sony's claims of 60fps as an average 60fps or a minimum 60fps (and whether or not you trust Sony's word).


----------



## WhiteLotus (Nov 3, 2012)

I understand your math, but I don't think it is as linear as that. To say it is ~4 times as demanding is flawed.


----------



## Dent1 (Nov 3, 2012)

BigMack70 said:


> Mathematical proof (for like the 4th time):
> 
> (1920*1080)/(1280*720) = 2.25
> 60/30 = 2
> ...



So if you stand behind your equation, why didn't you show it in your very first post? Why wait until page #10 to provide proof? Only a complete idiot would argue for 10 pages straight and take a massive amount of heat when it could have been resolved earlier with your equation. You are a certified psychopath. 


PS. Your formula doesn't prove that the GPU needs to be 4x as powerful. The architecture of the GPU is the deciding factor. For example, a GPU could in theory render 4x faster, but may still perform worse than its predecessor with less rendering horsepower due to a lack of memory bandwidth.

If everything were linear, as you seem to suggest, the transistor count and heat output would also increase by approximately 4x, which isn't the case.




BigMack70 said:


> It doesn't get me worked up that some people think this APU will be able to do 1080p60...



How can you know that without knowing which GPU the APU will feature? For all we know the GPU on the APU could be a 7970 (unlikely due to cost and heat), but we don't know.  Also, like I said earlier, it depends on the game: a linear game like COD's campaign, yes; a big open-world game like GTA, probably not. Without knowing which games are in question we can't draw any concrete conclusion. Also, Sony did not say all games will be 1080p/60. You jumped to your own conclusion.



BigMack70 said:


> I'm worked up about kiddos who don't understand kindergarten math and reading comprehension.



If you've graduated kindergarten, find another forum. Bye.


----------



## BigMack70 (Nov 3, 2012)

Dent1 said:


> So if you stand behind your  equation why wouldn't show this in your very first post? Why wait until page #10 to provide proof? Only a complete idiot would argue for 10 pages straight and take a massive amount of heat when it could have been resolved earlier with your equation. You are a certified psychopath.
> 
> 
> PS. Your formula doesn't prove that the GPU needs to be 4x as powerful.  The architecture of the GPU is the deciding factor. For example, a GPU could in theory render 4x faster, but may still perform worse than its predecessor due to a lack of memory bandwidth.
> ...



I explained my math in post #74. Lrn2read? And even if I hadn't, it's not like I'm asking you to do calculus to figure out how many more pixels need to get crunched out in 1080p60 compared to 720p30. :shadedshu.

Issues about non-linearity are addressed in post #85. Lrn2read?

P.S. I've explained throughout that this is a picky point about math and not a point about hardware. Lrn2read?

I don't claim to know anything about what this APU can do. I say I'm skeptical of these claims and I've explained why. I'm also not the only one skeptical about it in this thread. Lrn2read?

:shadedshu :shadedshu :shadedshu :shadedshu


----------



## Dent1 (Nov 3, 2012)

BigMack70 said:


> I don't claim to know anything about what this APU can do.



Then your arguments have no basis.


----------



## BigMack70 (Nov 3, 2012)

Dent1 said:


> Then your arguments have no basis.





+1 to the number of people in this thread who don't understand what speculation is

You don't know any more than I do about what this APU can do, unless you have a time machine and have been to the future.


----------



## tacosRcool (Nov 3, 2012)

I hope they have hybrid crossfire!


----------



## Aquinus (Nov 3, 2012)

BigMack70 said:


> I explained my math in post #74. Lrn2read? And even if I hadn't, it's not like I'm asking you to do calculus to figure out how many more pixels need to get crunched out in 1080p60 compared to 720p30. :shadedshu.
> 
> Issues about non-linearity are addressed in post #85. Lrn2read?
> 
> ...





BigMack70 said:


> +1 to the number of people in this thread who don't understand what speculation is
> 
> You don't know any more than I do about what this APU can do, unless you have a time machine and have been to the future.



Resolution isn't the only thing that determines performance.

Going from 720p to 1080p does *not* require 4 times as much power. You're oversimplifying how computers work and you're spreading false information. Maybe it's time for you to get off your high horse and realize that you really don't know what you're talking about. Are you a programmer? What experience do you have in the field that proves you know what you're talking about? Because everything you said is straight-up false.

The amount of geometry and the calculations that need to be done to the "world" don't change with resolution, which is why your "equation" doesn't work. The complexity of the geometry doesn't change just because you're running at a higher resolution, so the majority of the extra work is determined by the fill rate, which is directly related to performance at different resolutions and is only one part of what the GPU does. It also explains why more CPU power is used at lower resolutions: at higher resolutions the GPU is doing more fragment shading.
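The point about resolution-independent geometry work can be sketched with a toy cost model (all constants here are invented purely for illustration; real GPU behavior is far more complex):

```python
# Toy frame-cost model: a fixed geometry/CPU cost plus a per-pixel fill cost.
# GEOMETRY_MS and FILL_MS_PER_MPIX are made-up constants for illustration.
GEOMETRY_MS = 8.0          # per-frame work that does not scale with resolution
FILL_MS_PER_MPIX = 4.0     # fill/fragment-shading cost per million pixels

def frame_time_ms(width, height):
    mpix = width * height / 1e6
    return GEOMETRY_MS + FILL_MS_PER_MPIX * mpix

t720 = frame_time_ms(1280, 720)    # 11.6864 ms
t1080 = frame_time_ms(1920, 1080)  # 16.2944 ms
# 2.25x the pixels, but well under 2.25x the frame time,
# because the geometry cost is unchanged.
print(round(t1080 / t720, 2))  # prints 1.39
```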



Dent1 said:


> Then your arguments have no basis.



At least some one knows bullshit when they see it.


----------



## Dent1 (Nov 3, 2012)

BigMack70 said:


> You don't know any more than I do about what this APU can do, unless you have a time machine and have been to the future.



+1

10 pages later and you admit it.  

We know it's all speculation and your opinion. Forums are built on this and we respect your theories. But you have to remember that until we know more about the APU and the games in question, everything you, others, and I say is an educated opinion.

I think where you annoy people is when you assume your opinions are iron truth, then insult our reading and math ability, and our intelligence in general, when we fail to agree that your opinion is truth.


----------



## BigMack70 (Nov 3, 2012)

Aquinus said:


> Resolution isn't the only thing that determines performance.
> 
> Going from 720p to 1080p does *not* require 4 times as much power. You're over simplifying how computers work and you're spreading false information.



+1 to the # of people who don't understand my point has nothing to do with hardware

You have to churn out ~4x the pixels at 1080p60 as you do at 720p30. That's basic math.

Now, I understand (and have stated from very early on in this argument) that when you actually look at how things perform in the real world, this breaks down and is not linear. But there are all sorts of reasons for that, and none of them change the fact that 1080p60 means putting out ~4x the pixels of 720p30.

Why does it take dozens of posts to explain such a stupidly over-simple point? Go read the thread - I was making a picky point because of a lack of clarity in one of mailman's posts, and you guys have taken it to a whole other level.

Wow.


----------



## BigMack70 (Nov 3, 2012)

Dent1 said:


> 10 pages later and you admit it.



You REALLY haven't read the thread, have you? Multiple times, even beginning in post #19, I've stated we will find out i.e. "know" in a year or two, when the hardware is released, what it can do.







:shadedshu


----------



## Aquinus (Nov 3, 2012)

BigMack70 said:


> +1 to the # of people who don't understand my point has nothing to do with hardware
> 
> You have to churn out ~4x the pixels at 1080p60 as you do at 720p30. That's basic math.
> 
> ...



That's because it isn't as basic as you claim it to be and what you're describing is completely false. Just stop posting before you make yourself look more like a fool than you already have. What you're talking about isn't the performance of this custom APU for Sony's new console, but rather how GPUs work.

4x as many pixels doesn't imply 4x more work. Just stop posting, because how 3D is rendered is clearly beyond you and you're not willing to do the research and change your stance based on the information provided.


BigMack70 said:


> You REALLY haven't read the thread, have you? Multiple times, even beginning in post #19, I've stated we will find out i.e. "know" in a year or two, when the hardware is released, what it can do.
> 
> http://i299.photobucket.com/albums/mm288/larryo340/Funny gifs/7c35013f.gif
> 
> :shadedshu



That isn't what you've been talking about for the majority of your posts...


----------



## EaGle1337 (Nov 3, 2012)

BigMack70 said:


> Mathematical proof (for like the 4th time):
> 
> (1920*1080)/(1280*720) = 2.25
> 60/30 = 2
> ...


My monitor's resolution is a shade under 3 times that of 1080p. With one GTX 570 I could get 60 fps in pretty much every game at 1080p, but one 570 doesn't drive the new monitor nearly as well. Do I need three 570s to get 60 fps? Nope; in fact I only need two 570s, though I get stutter due to running out of vram. By your math I'd need nearly 3x the power to run the resolution at 60 fps, so why does boosting my power by under 2x work fine??


----------



## Dent1 (Nov 3, 2012)

BigMack70 said:


> You REALLY haven't read the thread, have you? Multiple times, even beginning in post #19, I've stated we will find out i.e. "know" in a year or two, when the hardware is released, what it can do.
> 
> :shadedshu



Fair enough. Then wait 2 years and bump the thread. 

Also, if your point was complete in post #19, why are we still having this debate at post #254? 




BigMack70 said:


> +1 to the # of people who don't understand my point has nothing to do with hardware
> 
> You have to churn out ~4x the pixels at 1080p60 as you do at 720p30. That's basic math.



I'm not disputing that the increase in pixels increases processing requirements; I'm disputing that 4x the pixels translates to a 4x more powerful GPU in the real world.

For example, say GPU 1 is 4x more powerful than GPU 2. 

GPU 1 has access to 512MB of dedicated memory and 1GB of system memory. 
GPU 2 has access to 2GB of dedicated memory and 8GB of system memory. 

Despite GPU 1 having higher clock speeds, a higher shader count, and a wider memory bus, it only gets *20FPS* in Crysis @ 1080p.

However the slower GPU 2 gets *35FPS* in Crysis @ 1080p, because its larger memory pool helps it outperform the faster GPU at the higher resolution.
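That hypothetical can be sketched as a toy bottleneck model (the numbers and the swap penalty are invented purely to mirror the example above):

```python
# Toy bottleneck model: achieved fps is capped by the slowest resource.
# All values are hypothetical, only to show a "faster" GPU losing
# to a slower one once the frame's working set spills out of VRAM.
def achieved_fps(compute_fps, vram_gb, working_set_gb, swap_penalty=0.4):
    # If textures/buffers exceed dedicated VRAM, performance collapses.
    if working_set_gb > vram_gb:
        return compute_fps * swap_penalty
    return compute_fps

# GPU 1: 4x the shader horsepower, but only 0.5 GB of VRAM.
# GPU 2: slower compute, but 2 GB of VRAM.
gpu1 = achieved_fps(compute_fps=50, vram_gb=0.5, working_set_gb=1.5)  # 20.0
gpu2 = achieved_fps(compute_fps=35, vram_gb=2.0, working_set_gb=1.5)  # 35
print(gpu1, gpu2)
```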


----------



## hooj (Nov 3, 2012)

So this whole argument started because some nab doesn't think the PS4 with an APU will run all of its games at 1080p @ 60fps?


----------



## Aquinus (Nov 3, 2012)

Dent1 said:


> Despite GPU 1 having higher clock speeds, a higher shader count, and a wider memory bus, it only gets 20FPS in Crysis @ 1080p.
> 
> However the slower GPU 2 gets 35FPS in Crysis @ 1080p, because its larger memory pool helps it outperform the faster GPU at the higher resolution.



+1 (kind of): Actually the performance hit from running out of memory is much steeper than that, but it's an accurate representation.


----------



## Dent1 (Nov 3, 2012)

hooj said:


> So this whole argument started because some nab doesn't think the PS4 with an APU will run all of its games at 1080p @ 60fps?



Yes, 

And the crazy thing is Sony never said it would run *all* games at 1080p @ 60fps. 

The other crazy thing is we don't know what the specification of the GPU portion of the APU is. 

We don't even know if the games lined up on PS4 are intensive.

We know nothing. 11 pages of argument based on little data. Ahhhhh.


----------



## hooj (Nov 3, 2012)

Dent1 said:


> Yes,
> 
> And the crazy thing is Sony never said it would run *all* games at 1080p @ 60fps.
> 
> ...



Well, as long as the games that need 60fps get 60fps, it doesn't matter, because I'm still buying one!


----------



## THE_EGG (Nov 3, 2012)

DAYUM, been away from the forums for one day and this thread already has over 250 posts! 

Reading the past couple of pages, most of the posts seem to be about the hardware of the "PS4" not being able to run games @ 60fps 1080p 3D. I don't think the majority of console gamers will really care or even notice the performance increase, or the fact that not EVERY game will run @ 60fps 1080p 3D.

I'm sure that most people that are that concerned about this issue could just go out and buy a decent computer to run most of the games and be happy that way.


----------



## Aquinus (Nov 3, 2012)

Dent1 said:


> We don't even know if the games lined up on PS4 are intensive.



Sony hasn't used the term PS4 yet, so I'm reluctant to call it a PS4. It's possible that their next gaming console will not be named PlayStation. Sony might be trying to re-brand it and shift the audience for this console. It's looking more like a HTPC than a console so far, and that definitely intrigues me.


----------



## BigMack70 (Nov 3, 2012)

Aquinus said:


> That isn't what you've been talking about for the majority of your posts...



The majority of my posts have been spent explaining basic math to people like yourself who don't understand I'm not talking about real world hardware/software. You said it yourself. It's 4x more pixels. That's my point. Right there. End of story. It was never a major point until a dozen people started failing to read and/or do math.



Dent1 said:


> I'm not disputing the increase in pixels won't increase processing requirements, I'm disputing that 4x pixels translate to a 4x powerful GPU in the real world.



I am not and never was talking about the real world. How many times do I have to explain this? 50? 100? I made a nit-picky post about how many pixels have to get crunched out at 1080p60 vs 720p30, due to a lack of clarity in another post, in order to be facetious, and it apparently went about a mile over the heads of all the brainiacs in here.

Here, since you all are so good at reading, let me just take the time to say that again:

I am not and never was talking about the real world. It was a nit-picky post about how many more pixels have to get crunched out at 1080p60 vs 720p30, and it was in response to another post which could have been clearer (and which the author did later clarify). 

Maybe I'll say it a third time, just to be sure?

I am not and never was talking about a real-world performance scenario. I made a nit-picky post about how many more pixels need to be output at 1080p60 vs 720p30 in response to another poster.

Now, there are implications there for how much more powerful your hardware needs to be, and if I implied it's as simple as "your GPU needs to be 4x better", then I apologize - I know that's a horrid oversimplification. I thought that was clear from one of my early posts about how non-linearity related to the argument I was trying to make. You DO need 4x more pixel crunching power, but that doesn't necessarily have to come from a 4x more powerful GPU - you could have 2x better software optimization + 2x better GPU, etc.

Now, if YOU read my posts and implied "oh he's arguing that the PS4 needs to be 4x more powerful than the PS3" or something like that, then that's YOUR problem with reading comprehension, and I don't apologize.


----------



## Dent1 (Nov 3, 2012)

BigMack70 said:


> The majority of my posts have been spent explaining basic math to people like yourself who don't understand I'm not talking about real world hardware/software. You said it yourself. It's 4x more pixels. That's my point. Right there. End of story. It was never a major point until a dozen people started failing to read and/or do math.
> 
> 
> 
> ...



You keep talking about "reading". But I don't see anyone disputing the math, or anyone disputing that there is a 4x pixel difference between 720p30 and 1080p60. If somebody said the contrary, quote it for all to see. You are arguing a point which we are not disputing; this isn't a good use of your time.


----------



## Disparia (Nov 3, 2012)

In conclusion,

Given what is known about Orbis, Sony's goals (resolution/fps, low cost) are very possible.

/thread


----------



## hooj (Nov 3, 2012)

Aquinus said:


> Sony hasn't used the term PS4 yet, so I'm reluctant to call it a PS4. It's possible that their next gaming console will not be named PlayStation. Sony might be trying to re-brand it and shift the audience for this console. It's looking more like a HTPC than a console so far, and that definitely intrigues me.



Rebrand the PlayStation? LMAO, nah, it's PS4.


----------



## Rei86 (Nov 3, 2012)

I really don't understand why a person in this thread is arguing about this 

Consoles have always been closed platforms that are optimized to do one thing and one thing only - Play Games.

Because it's a closed platform in the hardware sense, developers can maximize optimization for it, since they are developing for one set of hardware and one set of hardware alone.  I really don't see why the PS4 can't deliver 1080p24 for movies and 1080p60 for video games.  Same for the Xbox 720 that should be pushed out around the same time.


----------



## xenocide (Nov 3, 2012)

If they are using a GPU built into a modified APU, I'm willing to bet Sony will muck it up by using really slow RAM so the GPU runs much worse than it should...


----------



## BigMack70 (Nov 3, 2012)

Dent1 said:


> You are arguing a point which we are not disputing; this isn't a good use of your time.



So why are you arguing with my posts, again? Is it a good use of your time?

I've been trying to explain for a while that people are responding like I'm making an argument that I'm not making, hence my repeated calls to actually read.

The appropriate response to my initial post, if someone had taken the time to read + think (and mailman & others at the time got this fairly quickly), would have been "OK, fine, there are 4x the pixels that need to get crunched out, but there's more than one way to get that". 

It was never a big deal until people started trying to put a straw man argument in my mouth and I had to clarify (over and over and over).


----------



## Aquinus (Nov 3, 2012)

BigMack70 said:


> The majority of my posts have been spent explaining basic math to people like yourself who don't understand I'm not talking about real world hardware/software. You said it yourself. It's 4x more pixels. That's my point. Right there. End of story. It was never a major point until a dozen people started failing to read and/or do math.
> 
> 
> 
> ...



I have a degree in Computer Science. I have a job as a System Administrator and Developer, and I contribute occasionally to open source projects in my free time, so I like to think I know what I'm talking about, you twit. As far as "this is what I actually was saying" goes, you're full of shit. You did mention hardware, and you did say it requires 4x more power. Stop changing your story and stop posting, because it's very obvious that just about everyone here is frustrated with your lack of common sense and education. You're overbearing, ignorant, and think that you know your shit when you really do not.

Let me quote you:


BigMack70 said:


> Hence, 1080p60 is ~4x as demanding as 720p30.



Stop being a fucking tool and realize that people who actually work in the field and work on software for a living are telling you that you're wrong. So unless you're in the same position I am and work with computers professionally from a management and development standpoint, then you really need to stop spewing out all this garbage that you think is correct.

Now I recommend that you drop it and stop defending your flawed position, because at this point you're just making an ass out of yourself and obviously pissing people off in this thread.


BigMack70 said:


> So why are you arguing with my posts, again? Is it a good use of your time?
> 
> I've been trying to explain for a while that people are responding like I'm making an argument that I'm not making, hence my repeated calls to actually read.
> 
> ...



It's because you're a liar and dent is doing the right thing by calling you out.


----------



## Rei86 (Nov 3, 2012)

xenocide said:


> If they are using a GPU built into a modified APU, I'm willing to be Sony will muck it up by using really slow RAM so the GPU runs much worse than it should...



Depends.  For PS1~3 development, Sony was heavy-handed in picking the parts it wanted.  This time around it sounds like AMD holds the reins on the main components, the CPU/GPU.  

Also of note, the Wii U, which is also an AMD product, comes with 2GB of shared RAM.  I'm sure they won't muck things up for the "higher end" PS4.


----------



## BigMack70 (Nov 3, 2012)

Aquinus said:


> [sooooooooooooooo ANGRYYYYYYYYYY]



#1) You're taking the internet too seriously. Chill out.

#2) I already apologized if I made statements about precisely how 4x the pixels would need to be applied to a hardware/software solution. All I meant was that somehow you need to get 4x the pixels produced. If I implied that requires a single solution (e.g. a 4x more powerful GPU), then I was wrong. If you took the time to read, maybe you wouldn't be raging so bad. I'm not trying to move the goalposts here - it's quite possible I said something that was poorly worded and wrong or which needed more clarification (I haven't gone back to read all my posts to check). 

In fact, given the extent of discussion on this, the post you quoted does need more clarification:

#3) 1080p60 IS 4x more demanding than 720p30, in terms of the requirements of pixel output. That doesn't necessarily mean you need a CPU that's 4x as powerful, or a GPU that's 4x as powerful, or anything else (and I don't think I ever claimed any of those things?). But it does mean that your hardware/software implementation needs to meet 4x the pixel output demand of 720p30. Hence, it's not wrong, per se, to say that 1080p60 is 4x as demanding as 720p30, but it does raise the question "in what respect is it 4x as demanding?".


----------



## Dent1 (Nov 3, 2012)

To be fair, over the course of this thread you've switched your argument to your own convenience when you are losing. Now you are saying you were merely highlighting the pixel-count difference; however, in post #70 you clearly state that 1080p @ 60fps is 4x as demanding as 720p @ 30fps. This is the part most of us, if not all, are disputing.

BigMack70: "Also, just to be picky... 1080p 60fps is more than _*4x as demanding *_than what current consoles do, which is ~720p ~30fps."
Post #70: http://www.techpowerup.com/forums/showpost.php?p=2765070&postcount=70



BigMack70 said:


> So why are you arguing with my posts, again? Is it a good use of your time?




No, this isn't good use of my time either. I'm man enough to admit that.




BigMack70 said:


> The appropriate response to my initial post, if someone had taken the time to read + think (and *mailman* & others at the time got this fairly quickly), would have been "OK, fine, there are 4x the pixels that need to get crunched out, but there's more than one way to get that".
> 
> It was never a big deal until people started trying to put a straw man argument in my mouth and I had to clarify (over and over and over).



Your mistake was to keep repeating the same fact and then change arguments, as pointed out above. Mailman and a few others stopped responding, yet you continued.


----------



## BigMack70 (Nov 3, 2012)

^^ See clarification #3 above. The claim 1080p60 = ~4x more demanding than 720p30 isn't necessarily wrong. I've clarified what I meant by it, and I don't think I ever implied anything beyond what I've here clarified.

Seriously, guys. Read!

Is this a good use of my time? Probably not. But I'm not going to apologize for defending my arguments against straw man attacks from people who fail to read my posts.


----------



## repman244 (Nov 3, 2012)

BigMack70 said:


> Is this a good use of my time? Probably not.








I must say I'm surprised to see that a thread about a console gets this much attention on TPU.

Carry on


----------



## flyin15sec (Nov 3, 2012)

BigMack70 said:


> ^^ See clarification #3 above. The claim 1080p60 = ~4x more demanding than 720p30 isn't necessarily wrong. I've clarified what I meant by it, and I don't think I ever implied anything beyond what I've here clarified.
> 
> Seriously, guys. Read!
> 
> Is this a good use of my time? Probably not. But I'm not going to apologize for defending my arguments against straw man attacks from people who fail to read my posts.



Nobody failed to read your posts, so stop saying that. 

I was excited to read what a "Tech" forum had to say about this news regarding an APU, but it's come down to 10 pages of "kindergarten" math.

Next time, if you are going to use Kindergarten math as your defense, go troll here instead: 

http://www.nickjr.com/home/messageboard/


----------



## Dent1 (Nov 3, 2012)

BigMack70 said:


> ^^ See clarification #3 above. The claim 1080p60 = ~4x more demanding than 720p30 *isn't necessarily wrong*. I've clarified what I meant by it, and I don't think I ever implied anything beyond what I've here clarified.



It is wrong. Here you go defending an invalid statement again. What you meant and what you said are two different things.

Your statement implies it requires 4x the horsepower in hardware, thus a 4x more powerful CPU, GPU, RAM, or subcomponent(s) of any of the three.




BigMack70 said:


> ^^
> Seriously, guys. Read!



You said it yourself, this is kindergarten, right? If you have graduated, leave the forum.




BigMack70 said:


> ^^
> But I'm not going to apologize for defending my arguments against straw man attacks from people who fail to read my posts.



Yes, but what are you defending?

The 4x pixel count?

or 

4x as demanding?

If you were defending one statement, weak as it is, I would respect your argument. But I can't respect somebody who flip-flops between two statements at their own convenience.


----------



## Steevo (Nov 3, 2012)

I happen to have two computers' worth of the same hardware, for all intents and purposes. 


I should play some games at 1080P on the 27" monitors that I ordered with them and see how they do.


----------



## sergionography (Nov 3, 2012)

Benetanegia said:


> Apples to oranges. Different ways of doing things. But in general:
> 
> - The 360 had *48* SP not 16.
> - These SPs had 5 ALUs. Similar but not 100% equal to VLIW5.
> ...



Not to mention this assumes you use the same detail level.


----------



## cdawall (Nov 3, 2012)

BigMack70 said:


> I'm worked up about kiddos who don't understand kindergarten math and reading comprehension. (I also got worked up earlier about people making crazy claims about what a single graphics card can do on the PC.)



It's not kindergarten math. Your little equation has zero to do with real life. If you don't believe me, play the exact same game with the exact same graphics settings at 720P and 1080P; you don't get 4 times as many frames at 720P.



BigMack70 said:


> It doesn't get me worked up that some people think this APU will be able to do 1080p60... I've said about a hundred times that your guess is as good as mine and we won't know till we actually see what the hardware can do. I'm skeptical and don't believe it. Others here aren't as skeptical. But it's not unreasonable either way. It depends a lot on how demanding you think the next generation of games is going to be, how well you think they can optimize for console rather than PC, and rather or not you read Sony's claims of 60fps as being an average 60fps or a minimum 60fps (and rather or not you trust Sony's word).



Why wouldn't an APU be able to play Gran Turismo at 1080P 60 FPS? A 6-year-old G70 series card can play it at 720P. If you want to get into your silly little equation, the GPU inside an A10-5800K is 4x as fast as the old G70 series card in a PS3.


----------



## BigMack70 (Nov 3, 2012)

cdawall said:


> It's not kindergarten math.



OK. First grade math.



> Your little equation has zero to do with real life.



Sure it does. You need a hardware/software solution that can produce ~4x more pixels than at 720p30. 

I never claimed it had anything to do with a specific hardware configuration, but it does have something to do with real life. We're not talking about unicorns.




> If you don't believe me play the exact same game with the exact same graphics settings at 720P and 1080P you don't get 4 times as many frames in 720P.



I've been over this a bunch of times now. I know this, and it's not relevant to my point at all, and I explained that all the way back at post #85 and have clarified further in recent posts.

Seriously guys. READ before you post. :shadedshu




> Why wouldn't an APU be able to play Gran Turismo at 1080P 60 FPS? A 6-year-old G70 series card can play it at 720P. If you want to get into your silly little equation, the GPU inside an A10-5800K is 4x as fast as the old G70 series card in a PS3.



I was unaware that they were just going to be re-releasing current gen games on next-gen hardware with beefed up resolution and framerate. Do you have a source suggesting that's what they're going to be doing?

My position on this, I think, is pretty clear. My speculation is the following:

- I take a 1080p60 + 3D claim by Sony to be a claim about either 60fps or near-60fps minimum framerates in their games.

- I assume that next-gen games are going to improve graphically on current-gen games and thus be more demanding.

- I look to the hardware needed to run current-gen games on the PC maxed out with 60fps minimum framerates at 1080p, and you need multi-GPU to do that. 

- I assume that next-gen games are going to be roughly as demanding as current PC games maxed out or near-maxed out.

- I assume it's impossible to optimize an APU (or an APU + low-midrange GPU) to such a degree that it is able to do things that a 7970/680 cannot. 


Could I be wrong on some or all of those counts? Sure. But they're not inherently any more unreasonable than someone who assumes the opposite and thinks that Sony can do this. We'll know in a couple years. It would be awesome if they're able to do it, but I'm not a believer yet.


----------



## erocker (Nov 3, 2012)

I think it is time to let others chime in with their thoughts. To the few people who have been posting over and over again... It is time to give it a rest. Take your arguments elsewhere.


----------



## WhiteLotus (Nov 3, 2012)

What do people think the price of the next generation consoles will be?

I've already heard that the WiiU will be sold at a loss...


----------



## xenocide (Nov 3, 2012)

WhiteLotus said:


> What do people think the price of the next generation consoles will be?
> 
> I've already heard that the WiiU will be sold at a loss...



I had heard Nintendo investors were pretty insistent that the Wii U be sold at either break-even or a slight profit, to the point of basically cutting as many corners as they could get away with.  As for the PS4, I know Sony has said they will not price a console as high as the PS3 was, so I'm thinking $400 and $500 models, at a slight loss.  If they skimp a little on the hardware, knock $50 off each package.


----------



## Kreij (Nov 3, 2012)

Last I read about Nintendo was this ...


> Satoru Iwata, president of Nintendo, has revealed the Wii U will be sold below cost in the company's most recent financial results briefing.
> 
> "The Wii U hardware will have a negative impact on Nintendo's profits early after the launch because rather than determining a price based on its manufacturing cost, we selected one that consumers would consider to be reasonable," he said.



As for the PS4, I have no idea as we have virtually no details on anything related to it.


----------



## Rei86 (Nov 3, 2012)

WhiteLotus said:


> What do people think the price of the next generation consoles will be?
> 
> I've already heard that the WiiU will be sold at a loss...



The Wii U 32GB MSRP is $349.99.

I'm gonna throw this out there: the PS4 and Xbox 720 will land around the $399-499 price points.  Can't see either hardware manufacturer asking more than that, but then again they could add value features that balloon the prices.

Also, about Nintendo: they had never sold hardware at a loss until the 3DS and the Wii U.  Sony, on the other hand, has never sold a PlayStation at a profit; always at a loss, recouped from overwhelming sales figures.


----------



## WhiteLotus (Nov 3, 2012)

Rei86 said:


> Wii U 32GB MSRP is 349.99



Damn, really? Usually Nintendo is the cheaper option. I'm hoping they will be a little bit cheaper.


----------



## xenocide (Nov 3, 2012)

WhiteLotus said:


> Damn really? Usually you get nintendo being the cheaper option, I'm hoping they will be a little bit cheaper.



Well, they couldn't keep using dated hardware, so they had to step up the cost, plus those tablet controllers cost quite a bit.  With the Wii, Nintendo made a fortune for the first few years because they were the only company selling their console at positive margins, and they were selling so many more of them than Sony and MS.  But once people stopped buying Wiis, they started to suffer because they had nothing driving software sales, which is where MS and Sony made the majority of their revenue in the second half of this console cycle.

Assuming Sony and MS stick to traditional controllers, that should cut at least $50 off the launch package.  I'm just assuming they'll have beefier hardware that drives up the price.  I expect a base-model PS4 at about $400, with just the system and a controller, and a premium package that comes with one launch title, a year of PSN+, and probably a second controller or a bigger HDD (320 GB vs. the 250 GB).

The new Xbox will probably be around the same price with similar hardware to the PS4, but I see them packaging all systems from launch with the Kinect which might bump the price up a bit.


----------



## Rei86 (Nov 3, 2012)

WhiteLotus said:


> Damn really? Usually you get nintendo being the cheaper option, I'm hoping they will be a little bit cheaper.



The controller itself costs something like 50% of that machine.



xenocide said:


> The new Xbox will probably be around the same price with similar hardware to the PS4, but I see them packaging all systems from launch with the Kinect which might bump the price up a bit.



The next Xbox is rumored to have Kinect 2 as part of the hardware and not as an accessory.


----------



## sergionography (Nov 3, 2012)

cdawall said:


> It's not kindergarten math. Your little equation has zero to do with real life. If you don't believe me play the exact same game with the exact same graphics settings at 720P and 1080P you don't get 4 times as many frames in 720P.
> 
> 
> 
> Why wouldn't an APU be able to play Gran Turismo at 1080P 60 FPS? A six-year-old G70 series card can play it at 720P. If you want to get into your silly little equation, the GPU inside an A10-5800K is 4x as fast as the old G70 series card in a PS3.



Actually, theoretical numbers are really important.
You're forgetting that resolution and GPU capability are kind of two different things: resolution is affected more by cache and memory, while detail is affected by computation power. I hardly think running 1080p is the problem; the issue here is detail.
I'm studying video game design, and the biggest concern when we do modeling is polygons. The more polygons in the models, the more GPU capability it takes, so being a good modeler means getting the best-looking shape with the fewest polygons possible.
Now, the reason you don't get 4x the performance out of the hardware is that the poly count doesn't change with resolution; only the render output changes to 1080p. Not to mention the code has a lot to do with it as well, of course, along with the poly count like I mentioned.
Some of the newer games have stunning graphics and still run on old cards or even today's consoles, mostly thanks to optimizing well for the crapload of graphics shaders available today. Older games weren't designed to run on 2000+ shaders, so when you scale up to newer generations to run older games, they don't necessarily scale linearly. But take new games optimized for current hardware and run them on older cards, and you will see the 4x-weaker card running 4x weaker (when memory bandwidth and the GPU's other specialized features are also 4x weaker).
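The point above about poly count being independent of resolution can be made concrete with a toy cost model; every number below (poly count, per-vertex and per-pixel costs) is invented purely for illustration:

```python
# Toy render-cost model: vertex work depends on polygon count,
# fragment work depends on output resolution. All costs are made up.

def frame_cost(polys, width, height, vertex_cost=1.0, pixel_cost=0.05):
    """Return arbitrary 'work units' for one frame."""
    vertex_work = polys * vertex_cost          # unchanged by resolution
    pixel_work = width * height * pixel_cost   # scales with resolution
    return vertex_work + pixel_work

cost_720 = frame_cost(500_000, 1280, 720)
cost_1080 = frame_cost(500_000, 1920, 1080)

# Pixels went up 2.25x, but total cost rises by much less,
# because the vertex share is resolution-independent.
print(cost_1080 / cost_720)
```

With these made-up weights the frame gets only about 11% more expensive despite 2.25x the pixels, which is the non-linear scaling being described.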


----------



## xenocide (Nov 3, 2012)

Rei86 said:


> The controller itself cost like 50% of that machine.



Well, the retail cost of the controller was placed at $100-150, so that's not completely inaccurate.



Rei86 said:


> The next Xbox is rumored to have Kinect 2 as part of the hardware and not as an accessory.



Yeah, it would make sense for them to bundle it as one system since they have been pushing Kinect so hard.  I think it will have interesting applications in five years or so, when the sensors are accurate enough to actually measure fine movements (like individual finger motions).


----------



## Rei86 (Nov 3, 2012)

xenocide said:


> Well the retail cost of the controllers was placed at $100-150, so not completely inaccurate.
> 
> 
> 
> Yea it would make sense for them to bundle it as one system since they have been pushing Kinect so hard.  I think it will have interesting applications in like 5 years when the sensors are accurate enough to actually measure fine movements (like individual finger movements).



If you remember, Natal had that ability too, but MS was still fine-tuning its software.  Even the industry insiders who saw Natal behind closed doors were impressed with it.  The Kinect we got was a cut-down version to keep costs low.

If Kinect II really is integrated into the hardware, we'll see a big push for Kinect games since it's already part of the system, letting MS market it better.  I'm also sure we'll get a product closer to Natal than to the original Kinect.


----------



## Disruptor4 (Nov 3, 2012)

BigMack70 said:


> ... 60fps 1080p is actually very demanding. You're not going to get 60fps minimum framerates in  even all current titles without a multi-GPU setup. If you want 60fps minimum framerate in BF3, for example, you are either going to be using multiple GPUs or turning settings down.



I don't understand where you got your info for that. With my setup, which is a single GPU, I get well above 60 FPS with maxed graphics...


----------



## Dent1 (Nov 3, 2012)

Disruptor4 said:


> I don't understand where you got your info for that. With my set up, which is single GPU, I get well above 60FPS with max graphics...



As much as I believe BigMack70 is a troll, to be fair, he did say *minimum* 60 FPS.

In BF3 you'll get average and maximum framerates way above 60 FPS; the minimum will be about 30-40 FPS regardless of setup.

But yes, I agree BigMack70 does often pull info out his butt.


----------



## mystikl (Nov 4, 2012)

The PS3 had a custom 7800-series GPU that was slightly slower than the regular one. Remember, it was an upper-midrange GPU released one year before the PS3, so if they were to do that again, they should be aiming for something with performance similar to a midrange GPU released this year, namely a 7850.
So the PS4 having a discrete card paired with the APU makes sense; otherwise it wouldn't reach the performance target.
But this is just a theory.


----------



## M3T4LM4N222 (Nov 4, 2012)

I've built an FM1 A6-3650 APU system and an FM2 A8-5600K system, and they both had great integrated graphics. My guess is the PS4 would be an A10 with a dedicated ATI GPU in Hybrid CrossFire mode. It'll be more than enough for console gaming, especially when you consider that the games will be 100% optimized for those specific pieces of hardware. My guess is a 65 W A10 with a passive cooler plus a Hybrid CrossFire GPU with a passive cooler. Correct me if I am wrong, but I think an A10 + 6670 would be nearly equivalent to a 7750, and that would handle a decent number of PC games at 1920 x 1080 no problem. Now the only issue is that console hardware will hold back 2560 x 1440 from becoming more mainstream. At least you can get a 2560 x 1440 IPS panel monitor for $350 online right now.


----------



## Mussels (Nov 4, 2012)

BigMack70 said:


> +1 to the # of people who don't understand my point has nothing to do with hardware
> 
> You have to churn out ~4x the pixels at 1080p60 as you do at 720p30. That's basic math.
> 
> ...




Why doesn't performance drop to 1/4 when you up the pixels by 4x?


Because it's not that simple -.-
That doesn't take into account the various hardware tweaks, lossless and lossy compression (in hardware, and in software via drivers), and a million other things. You've based this argument around something you see as simple and obvious, without actually checking it yourself.


God, this thread is really full of over-simplified arguments, just one after another after another...
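For what it's worth, the raw arithmetic behind the disputed "4x" figure is easy to reproduce, and just as easy to see what it leaves out; a quick sketch counting pure pixel throughput and nothing else:

```python
# Raw pixel throughput (pixels per second) at the two targets being argued about.
def pixels_per_second(width, height, fps):
    return width * height * fps

ratio = pixels_per_second(1920, 1080, 60) / pixels_per_second(1280, 720, 30)
print(ratio)  # 4.5, but raw throughput is only one of several possible bottlenecks
```

The 4.5x only describes one stage of the pipeline; it says nothing about CPU work, geometry, or bandwidth, which is exactly the objection above.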


----------



## lyndonguitar (Nov 4, 2012)

Possible "kinect 2.0" tech in the next Xbox (Durango) . what will be Sony's move?


----------



## Benetanegia (Nov 4, 2012)

Mussels said:


> why doesnt performance drop to 1/4, when you up the pixels by 4x?
> 
> 
> because its not that simple -.-
> ...



The answer to that is actually much simpler than that. It doesn't drop because at lower resolution the GPU is not actually performing as fast as it can. The bottleneck is elsewhere at lower resolution (CPU, triangle setup...). Any GPU built in the last decade is built with higher resolutions in mind. This is why ROPs, TMUs and pixel shaders have kept increasing, to the point where pixel shader counts are 100x higher than a decade ago, while triangle setup and raster engines are now only 2x (AMD + Nvidia mid-range) or 4x (Fermi/Kepler high end) higher.

A similar question would be why is a HD7950 the slowest card here?







Why do all cards perform nearly the same? And the answer is not anything complicated about optimization HW&SW, etc, etc.

EDIT: Low end cards however...









The GTX 650 Ti, for example, goes from 38.5 fps to 23.7, which is actually fairly close to the resolution difference of 1.78x: 38.5 / 1.78 ≈ 21.7.
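That back-of-the-envelope scaling check can be written out; the fps figure is the one quoted in the post, and the assumption is a purely fragment-limited card:

```python
# If a card is fully fragment-limited, fps should scale with the inverse of pixel count.
res_lo = 1920 * 1080          # pixels at the lower resolution
res_hi = 2560 * 1440          # pixels at the higher resolution (~1.78x more)
fps_lo = 38.5                 # GTX 650 Ti at 1920x1080, figure quoted above

predicted_hi = fps_lo * res_lo / res_hi   # pure inverse scaling
print(round(predicted_hi, 1))             # ~21.7, close to the measured 23.7
```

The small gap between the prediction and the measured 23.7 fps is consistent with the card being almost, but not entirely, fragment-limited.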


----------



## Mussels (Nov 4, 2012)

Benetanegia said:


> The answer to that is actually much simpler than that. It doesn't drop because when on lower resolution the GPU is not actually performing as fast as it can. The bottleneck is elsewhere in lower resolution (CPU, triangle setup...). Any GPU built in the last decade is built with higher resolutions in mind. This is why ROPs, TMUs and pixel shaders have always kept increasing to a point where they (pixel shaders) are 100x times higher than a decade ago while triangle setup and raster engines are now only 2x (AMD+ Nvidia mid-range) or 4x (Fermi/Kepler high end) higher.
> 
> A similar question would be why is a HD7950 the slowest card here?
> 
> ...



the answer is simple: *nothing about video game performance is simple.* (which is my problem with this 4x 'fact')


----------



## Benetanegia (Nov 4, 2012)

Mussels said:


> the answer is simple: *nothing about video game performance is simple.* (which is my problem with this 4x 'fact')



Sorry, but it is simple in this case. Pixels (whether final pixels, fragments, texels, or whichever step in the pipeline you want to talk about) need to be calculated and rendered, and this is done on processors*. 2x the pixel count means 2x the power required: either 2x the SP count or 2x the clock. It really is as simple as that. It's a fact.

Now you're all talking about margins. PC graphics cards have had power to spare on nearly all fronts for many years, lots and lots of it, and most definitely in pixel shading capability, so that's the front where lower-end cards have the most leeway.

* Nowadays, I mean: any game engine from the past 5 years does everything on a per-pixel basis. There's no real escape from the law: more pixels, more power required. Some games, especially on consoles, "avoid" it by rendering some elements at lower resolution; for example, rendering the lighting pass(es) at 1/2 or 1/4 resolution is very common. That's a workaround, not a breaking of the law: if the output resolution is increased from 720p to 1080p but everything else in the fragment data is kept at the same resolution, the resolution has not really increased by as much as stated.
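The reduced-resolution-pass workaround described above can be quantified with a toy calculation; the two-pass split and the pass weights below are invented purely for illustration:

```python
# Effective per-frame fragment work when some passes render below output resolution.
# The pass list and relative costs are invented for illustration.
def effective_pixels(width, height, passes):
    """passes: list of (fraction_of_output_area, relative_cost) per render pass."""
    return sum(width * height * frac * cost for frac, cost in passes)

full  = effective_pixels(1920, 1080, [(1.0, 1.0), (1.0, 1.0)])   # both passes at full res
mixed = effective_pixels(1920, 1080, [(1.0, 1.0), (0.25, 1.0)])  # lighting pass at 1/4 area
print(mixed / full)  # 0.625: "1080p" on the box, well under 1080p worth of shading
```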


----------



## MuhammedAbdo (Nov 4, 2012)

*Sigh*

This thread is full of ignorant people making false and misguided statements. The only people here with brains are Benetanegia and BigMack70.

First, you morons need to read some facts. Here they are:

FACT 1: Not even high-end PCs can sustain 60 FPS at 1080p in *demanding* games with maximum graphics, games like Metro 2033, ARMA 3, Crysis, Dragon Age 2 and many more. There will be drops below 60 FPS, and on many occasions.

FACT 2: Consoles involve an insane amount of code optimization, where every CPU/GPU cycle is utilized. They practically run on machine code, the lowest-level language of software programming, contrary to the PC, which goes through compilers and higher-level languages that waste valuable cycles.

FACT 3: Even with these optimizations, consoles run with shitty graphics. They can't even maintain 30 FPS at 720p; they usually drop to 25 and 20 FPS. They even run at sub-1280x720 resolutions, sometimes as low as 900x600!

FACT 4: Consoles cut down on graphics severely. They decrease shadow density, lighting effects, level of detail, polygon count, alpha effects, texture resolution, texture filtering, anti-aliasing, post-processing, and so many things that I can't even remember them all.

FACT 5: A console with the triple specs of
1- AMD CPU
2- AMD APU
3- AMD GPU (low-end/6670)

will barely run today's games at 1080p @ 60 FPS at PC graphics levels. All of the code optimization gains will be spent covering the cost of the resolution increase (to 1080p) and the cost of the graphics increase (to PC level): shadows, lighting, textures, etc.

If the specs were changed and the console came with a high-end or even a midrange AMD GPU, then the situation would be different.

FACT 6: These consoles will have to do the usual dirty business to be able to claim 1080p: cut resolution and upscale, and decrease all graphics elements (lighting, shadows, textures, etc.) below the future PC level. This happened to the previous generation too: the Xbox 360 and PS3 started out operating at 720p just fine, then had to cut corners to increase graphics, otherwise the visuals would have stalled.

FACT 7: PCs will maintain higher visual quality and framerates; consoles will have the graphics of a two-year-old PC.

End of discussion.


----------



## DaMobsta (Nov 4, 2012)

298 posts and yet no Intel fanboys bashing the AMD article. Go TPU!

I have a bad feeling about using hybrid crossfire on a console though, even if they optimize it, that's still an extra piece of hardware that could cause problems.


----------



## SIGSEGV (Nov 4, 2012)

MuhammedAbdo said:


> ...
> If the specs has been changed and the console came with a high-end or even a medium AMD GPU , then the situation will be different .



Seriously, are you kidding me?
I think you should open up your console's cover and try to fit your high-end GPU in there, along with a six-core Intel CPU, a high-end motherboard, and your 32 GB of RAM.



> FACT 7 : PCs will maintain a higher visual quality and frame rates , consoles will have the graphics of a two years old PCs .



...and many PC games are ported from console games.


----------



## lyndonguitar (Nov 4, 2012)

MuhammedAbdo said:


> (SIGH)
> 
> This thread is full of ignorant people making false and misguided statements all along , the only people here with brains are : Benetanegia and BigMack70 .
> 
> First you morons need to read some facts , here they are :\





> FACT 1 : Not even High-End PCs can sustain 60FPS @1080p in *demanding *games with maximum graphics , games like : Metro 2033 , ARMA 3 , Crysis , Dragon Age 2 and many many more , there will be drops below 60FPS and in many occasions .



*On many occasions, yes, but AVERAGE is what matters, bro. A few dips to 50 fps for a couple of milliseconds don't matter.*



> FACT 2 : Consoles include insane amount of code optimization , where every CPU/GPU cycle is utilized , they literally run on the machine code , which is the lowest language of software programming , contrary to the PC that sports many compilers and higher languages which wastes valuable cycles .



*You said it yourself: heavy optimization, which is why consoles from 2005 are able to run today's games, and the same reason a console from 2013 (the PS4) will be able to run games up to 2020 or so.*



> FACT 3 : Even with these optimizations, consoles run with shitty graphics . they cant even maintain 30 FPS at 720p , they usually drop to 25 and 20 FPS , they even run at sub 1280x720 resolutions , sometimes as low as 900x600 !



*Because they are pushing the console to its limit by trying to play a new game on a 7-year-old machine, but it still looks decent enough for CONSOLE gamers.*



> FACT 4 : Consoles cut down on graphics severely , they decrease Shadows density , Lighting Effects , Level Of Detail , Polygon Count , Alpha Effects , Texture Resolution , Texture Filtering , Anti-Aliasing , Post Processing and so many things that I can't even remember them all .



*Not severely. Almost all console games are the low-to-medium-graphics counterparts of their PC versions. High/Ultra is just additional eye candy for PC users, though most PC players mostly aim for pure performance and flexibility.*



> FACT 5 : a console with the triple specs of,
> 1-AMD CPU
> 2-AMD APU
> 3-AMD GPU (low-end/6670)
> ...



*You said it yourself again: "PC graphics level." This is a console, bro, and it's not running at PC graphics level. Factor in all the heavy optimization and the strong possibility of a midrange AMD GPU, as with the PS3's midrange GPU, and you can do 1080p @ 60 fps average.*



> FACT 6 : These consoles will have to do the usual dirty business to be able to run at 1080p , cut resolution and upscale , decrease all graphics elements (lighting and shadows and textures .. etc) below the future PC level .. This happened to the previous generation too , Xbox and PS 3 stared operating at 720p just fine , then they had to cut corners to increase graphics other wise the visuals will stall .



*Again, this is a console. Don't expect PC-god-like performance; if you want that, buy a PC. They never said the graphics would be similar to a PC's, they only said 1080p @ 60 FPS.*



> FACT 7 : PCs will maintain a higher visual quality and frame rates , consoles will have the graphics of a two years old PCs .
> 
> End of Discussion .



*Uhmm, nobody said that consoles "will maintain a higher visual quality and frame rates" than PCs. End of discussion???*

Trololololol


----------



## acerace (Nov 4, 2012)

DaMobsta said:


> 298 posts and yet no intel fanboys bashing the amd article. Go TPU!



You asked for it.

AMD sucks balls; Intel is a million times better. They should just be burned to the ground, what a waste of human resources. Don't get me started on their shite GPUs.


----------



## lyndonguitar (Nov 4, 2012)

acerace said:


> You ask for it.
> 
> AMD sucks balls, Intel is million times better.  They should just burned to the ground, what a waste of human resources. Don't get me started on their shite GPU.



says the Radeon User


----------



## xenocide (Nov 4, 2012)

@lyndonguitar

I wouldn't say average frame rate is _all_ that matters.  FPS dips affect gameplay far more than the small visual gain from nudging the LoD up at the cost of some effects.  Say you have two situations: in one, a certain game (say, Modern Warfare 4) runs at 1080p at an average of 60 fps but drops as low as 30 fps.  Compare that to the same game at the same resolution averaging 55 fps and never dropping below 50 fps, the only difference being one lighting effect removed.  On a console the second option will be more enjoyable, because you get a more consistent frame rate despite a lower average.

As for console optimization, you're blurring the picture a bit.  The most demanding and well-optimized console games render at 720p at about 29-30 fps, with the equivalent of low/medium settings and almost no anti-aliasing.  They look decent by most people's standards, but hitting that bar isn't hard.  You could match the same title on a mediocre PC (with none of that optimization), something like a C2D and an HD 4870, and probably do a decent amount better.

You have to realize that rendering at 540p or 720p gives a lot of wiggle room.  I think the biggest bottleneck in current-gen consoles is actually RAM: imagine making a game that can only access 256 MB of system memory (PS3) and see how well it runs.  We have to keep things in perspective.  The APU's GPU can run modern PC games at medium settings at 1080p with moderate AA and still post 20-30 fps.  That's really not bad, and when you throw a 6670 into the mix it only gets better.  The question will always be quality: they could render a single textured spinning cube at 1080p60 and their statements would be technically accurate, but people want a game that hits those settings and still looks good.
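The average-versus-minimum distinction above is easy to demonstrate with a hypothetical frame-time trace (all numbers invented):

```python
# Average fps can look fine while dips ruin the experience.
frame_times_ms = [16.7] * 55 + [33.3] * 5   # mostly ~60 fps, five ~30 fps hitches

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
min_fps = 1000 / max(frame_times_ms)
print(round(avg_fps, 1), round(min_fps, 1))  # average stays in the mid-50s, minimum is ~30
```

A trace like this averages about 55 fps, yet every hitch is a visible stutter, which is why the steadier 55/50 scenario feels better than the 60/30 one.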


----------



## Capitan Harlock (Nov 4, 2012)

xenocide said:


> @lyndonguitar
> 
> I wouldn't say Average frame rate is _all_ that matters.  FPS dips will affect gameplay way more than adjusting the LoD so its just a tad better at the cost of some visual affects.  Say you have 2 situations, one where a certain game--say Modern Warfare 4--can run at 1080p with an average of 60fps, but it drops down as low as 30fps.  Now compare that to the same game running at the same resultion but it average 55 fps, and never drops below 50fps, the only difference is they removed a particular lighting effect.  On a console the second option will be more enjoyable because you'll get a more constant frame rate, despite having a lower average frame rate.
> 
> ...



This seems like a marketing move: tell people "we give you 60 fps at 1920 x 1080" when most console gamers know nothing about fps and put money on something worth less than a PC.
If they come out with a system with high performance in EVERY game at high or ultra settings, good. But they shouldn't pull the same FAKE NEW TECHNOLOGY stunt as with the PS3, taking money from people with Blu-ray as the excuse.
The sad part is that we PC gamers will again get ports with poor graphics, unoptimized like GTA 4 and others.
Now we'll see.


----------



## Dent1 (Nov 4, 2012)

MuhammedAbdo said:


> (SIGH)
> 
> This thread is full of ignorant people making false and misguided statements all along , the only people here with brains are : Benetanegia and BigMack70 .



BigMack70's  second account? Hello 



MuhammedAbdo said:


> (SIGH)
> 
> This thread is full of ignorant people making false and misguided statements all along , the only people here with brains are : Benetanegia and BigMack70 .
> 
> ...



What has any of this got to do with BigMack70's claim that your hardware needs to be 4x as powerful to jump from 720p to 1080p?




DaMobsta said:


> 298 posts and yet no *Trickson *bashing the amd article. Go TPU!



FIXED!


----------



## acerace (Nov 4, 2012)

lyndonguitar said:


> says the Radeon User



In case you didn't notice, that was sarcasm. 

Well, he asked for it.


----------



## Am* (Nov 4, 2012)

I'm really not impressed, and pretty disappointed with this. 

Rumours suggest the next Xbox is going the same route with AMD APUs. That will make both consoles boring, pretty much identical PCs-in-a-box, with only the optical drives and the company logos setting them apart. 

The worst part is, they're not even going to be high-end by today's standards, which the 360 and PS3 were back in 2005/2006 when they launched, and they certainly NEED to be if they are to last anywhere near as long as the current consoles have. It's been over half a decade since the launch of the 360 and the PS3, and they already seem more dated with every release. 

The only good news I see here is that it'll use AMD GPUs. This should take a tonne of weight off the shoulders of their driver developers and make their lives a whole lot easier, with first-party developers maximizing the capabilities of their hardware (and possibly encouraging more console developers to at least release decent ports of their games). 

Other than that, the whole lineup of next-gen consoles sounds like crap. At this point, I can't see why another company besides Sony, Microsoft and Nintendo couldn't do the very same thing, but better.



BigMack70 said:


> I find it hard to believe that they're claiming 1080p 60fps possible on an APU based system.
> 
> I call BS. More probably 720p upscaled



I can't see how that seems so far-fetched to you. The 360's X1950-level GPU and the PS3's crappy 7800 GT can already do that. This APU is in a whole different league compared to those old dogs.



No_Asylum said:


> Actually ... that isnt even impressive at all.  60fps @ 1080p is slow by todays standards.



With a reasonable level of anti-aliasing/anisotropic filtering, it sounds good by today's standards. The problem is that it'll be painfully slow by tomorrow's standards (even if they release it by 2014, it will already be way outdated).


----------



## Ferrum Master (Nov 4, 2012)

Am* said:


> The worst part is, they're not even going to be high end by today's standards



At this point, guys, they're forgetting the current pace of development in the mobile market... next-gen ARM + PowerVR and Tegra will catch up with the consoles pretty soon. (I'm worried about Adreno, I mean Radeon R500... it's still stuck in the DX9 era; they poached some people from the ATI team again recently, so I guess they're in a hurry.)

Why, you ask?

It has a larger market!

A good app ecosystem (stores). The Unreal engine kit is already working fine, causing no problems for devs...

The darn thing isn't only usable for gaming, and it doesn't gather dust on the shelf while mommy won't give $$ for a new [again the same] COD.

Do you think it won't be capable of catching this so-called next gen?

Well, I'm currently playing this on my almost-two-year-old crappy Tegra 2-based phone:

Horn Game


----------



## Trovaricon (Nov 4, 2012)

Hello TPU fellows.

Today I spared a more than usual amount of time to read the whole thread, register (after years), and prepare a reply in the '90s Internet format.



BigMack70 said:


> Mathematical proof (for like the 4th time):
> (1920*1080)/(1280*720) = 2.25
> 60/30 = 2
> Put them together... 2.25*2 = 4.5
> ...


That is mathematically correct for painting pixels on a 2D surface with a "simple C algorithm" and a predefined array of 2D vector images. 
Today, rendering a 3D scene to a 2D surface is much more complex than your simple calculation.

Objects are represented as 3D meshes with additional properties tied to them (material, textures, etc.). You place these objects into 3D space by applying transformations, outputting texture coordinates (2D, to pick a color from a texture at [x, y]) and the vertex position "on screen"; usually you use not only [x, y] coordinates but also information about how deep "into the screen" the vertex is (vertex processing).
For each on-screen pixel covered by the resulting triangle (the output of three executions of the vertex position transformation), the pixel (fragment) shader is then run with interpolated values of the vertex shader's output (here you can mix, modify, skip, etc. the on-screen pixel color).

You see? Your simple calculation matches only the pixel shader computation. The procedure described is a very simplified, totally basic projection of 3D textured objects into 2D space, and our simple "resolution-based compute requirement" formula is already much more complex, isn't it? (Let's not even start with adding basic lighting, shadowing, or, God forbid, animation to the calculation.)

If you animate an object, you need to update its vertex positions within the mesh, as the vertices are not only somewhere else on screen (which happens when you turn the camera) but their relative positions change too. If this is handled by the CPU, it is often responsible for the "CPU bottleneck", as it puts constant pressure on the CPU regardless of graphics settings. You can see it in multiplayer FPS games with many players, or, as a perfect example, MMORPGs (the CPU requirements for games like Lineage 2 in seriously "mass" PvP are astronomical). If it is handled by the GPU, you again have a constant computation cost unaffected by the render resolution.

*On topic:*
Create a compiler for exactly one x86 configuration, without compromises for "universal x86" instruction selection (you can take exact memory/cache latencies and instruction latencies into account), and believe me, you will see miracles.
If the next-gen consoles contain some sort of x86-based APU, did you not consider that this will force a considerable number of software developers to adopt new thinking about utilizing APUs in general? That would be a great win for future desktop development too.
If you have an exact machine specification (HW, SW), you don't need statistics to determine how many operations you can execute in a given time on the (average) target machine with the (average) software layer (drivers, OS); you can count them exactly. 


AD *Internet 90' format* (semi-ot):
In the past, reading almost any discussion thread on sites devoted to technical stuff resulted in gaining substantial knowledge (either by users directly writing information in post, or by pointing other discussants to relevant resources). After spending hour of forum reading, you took for granted, that your knowledge base expanded (not necessarily in exactly-wanted direction). 
Today after huge Internet users expansion and with connection accessible even on toilet you need to watch out to not end up more stupid after hour of reading technical stuff related forum.
If users spent single hour of reading about how 3D rendering works (You can pick DirectX SDK samples, NeHe tutorial, some other introduction material or even a completely simple "How it works" or Wikipedia [1][2] reading) instead of smashing F5 for quickest possible response to "discussion enemies", then there would be real information sharing and knowledne gain benefit for all. Today Internet is not a medium for information and knowledge sharing (I have sometimes bad feeling that knowledge-generation process is stagnating) but one great human based random "BS" generator that can without any problems compete with random number generator run on supercomputer. 

Seriously - this thread contains enough text and graphics to cover a PhD or some other thesis, but the posts with real information value can be counted on one's fingers...
Until some genius comes up with a "BS filter", it would be interesting to emulate such a feature by having moderators or even forum users manually flag information-rich posts (something like the existing "thanks" function), with a forum filter to show only the flagged posts.

*EDIT:* I just checked the second Wikipedia link, and the statement *"Vertex shaders are run once for each vertex given to the graphics processor"* is not always true. If you utilize the Radeon HD 2000-HD 4000 tessellator, the number of vertices processed by the vertex shader is actually higher, because the fixed-function tessellator sits before the vertex shader in the rendering pipeline (see Programming for Real-Time Tessellation on GPU).


----------



## Ferrum Master (Nov 4, 2012)

Trovaricon said:


> hello TPU fellows.
> 
> today I spared more than usual ammount of time to read the whole thread, register (after years) and prepare reply in the Internet 90' format.
> 
> ...



Fully agree... It reminds me of the old times when Voodoo reigned and the sucker still didn't have hardware T&L... if anyone still remembers what that did.

Simply summing up by a coefficient isn't possible because of the large data overhead that also runs in parallel: memory bandwidth bottlenecks, latency increases from more complex scenes, and more shader-intensive tasks from light sources, the engine itself, and so on.

The coefficient hasn't been linear since the late '90s, I guess.

The next thing is something only consoles have: that's why they can evolve graphics on the same platform. Take Metal Gear: Hideo Kojima himself stated in an interview that development took so long because they had to rewrite many engine parts in assembly to achieve the needed performance on the PS3. It's a nightmare, you know, but it also bends the math about what we could expect on screen, because there's no recompiler software layer.


----------



## Super XP (Nov 4, 2012)

WhiteLotus said:


> Damn really? Usually you get nintendo being the cheaper option, I'm hoping they will be a little bit cheaper.


How can it be cheaper? They are using miniature tablets for controllers. Quite interesting, but I wouldn't buy it for more than $250.


> Quote:
> FACT 7 : PCs will maintain a higher visual quality and frame rates , consoles will have the graphics of a two years old PCs .


This is not a fact. Since when does the 7+ year old Xbox 360 have the graphics of a two-year-old PC?


DaMobsta said:


> 298 posts and yet no intel fanboys bashing the amd article. Go TPU!
> 
> I have a bad feeling about using hybrid crossfire on a console though, even if they optimize it, that's still an extra piece of hardware that could cause problems.


No - they would never release a console that wasn't running 100%. This may be a good idea, and it may benefit the PC because of these blasted console ports. I say go Hybrid CrossFire.


----------



## lyndonguitar (Nov 4, 2012)

we need more people like Trovaricon here lol


----------



## douglatins (Nov 4, 2012)

Wasn't the ps3 supposed to last 10 years?


----------



## Benetanegia (Nov 4, 2012)

lyndonguitar said:


> we need more people like Trovaricon here lol



Yes, I can agree with that, because of the effort and such, BUT the info he posted is heavily outdated and because of that almost entirely irrelevant. He's describing forward rendering and, to make matters worse, forward rendering with per-vertex shading. That hasn't been used in 95% of games for 5+ years now. Nowadays everything is calculated on a per-pixel basis and several buffers are created with the resulting pixel information. That's it: deferred rendering/shading.

In deferred rendering, what he describes only happens in the first pass (BF3 has 8+ passes): the diffuse color pass, which carries very little information. After that, everything from lighting to shading to advanced shadowing to ambient occlusion happens on a per-pixel basis. Without all of these per-pixel calculations the end result would look like a 90's 3D game. 90% of the work is based on pixel data == buffers == frames that are afterwards mixed (by the ROPs, again pixel by pixel) into the final composition.
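For anyone following along, the two-pass idea can be sketched in a few lines of Python (purely illustrative; the buffer layout and toy lighting formula are invented for this sketch, not BF3's actual passes): a geometry pass fills per-pixel G-buffers, then a lighting pass runs once per pixel over those buffers.

```python
# Minimal sketch of deferred shading on a tiny "framebuffer" (pure Python,
# illustrative only): pass 1 writes per-pixel attributes into G-buffers,
# pass 2 computes lighting per pixel, reading only those buffers.
W, H = 4, 2  # tiny framebuffer

# Pass 1: geometry pass fills G-buffers (here: albedo and a fake depth).
albedo = [[0.5 for _ in range(W)] for _ in range(H)]
depth = [[0.1 * (x + y) for x in range(W)] for y in range(H)]

# Pass 2: lighting pass runs once per pixel, independent of scene geometry.
def shade(x, y, light_intensity=2.0):
    # Toy lighting model: attenuate the light by the stored depth.
    return albedo[y][x] * light_intensity * (1.0 - depth[y][x])

frame = [[shade(x, y) for x in range(W)] for y in range(H)]
print(frame[0][0])  # 0.5 * 2.0 * (1.0 - 0.0) = 1.0
```

The point of the structure is that pass 2's cost scales with pixel count, not scene complexity, which is exactly why resolution matters so much in a deferred pipeline.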


----------



## Super XP (Nov 4, 2012)

douglatins said:


> Wasn't the ps3 supposed to last 10 years?


 Yes that is what Sony wanted people to believe 


BigMack70 said:


> Couple quick points just since people keep interacting with my arguments.
> 
> @Trovaricon - I know. See my posts #270 & 272. I clarified, or at least tried to.
> 
> ...


I fully agree 100%. Today's consoles do not do 1080p; they render at 720p, which then gets upscaled to 1080p. Consoles don't have the power to output natively at 1920 x 1080, especially when the game is too complex.
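Just to put numbers on the 720p-vs-1080p point (raw pixel counts only; real cost also depends on bandwidth, ROPs, and scene complexity, as others have noted):

```python
# Raw pixel counts for common render resolutions; per-pixel workload scales
# at least with this ratio, all else being equal.
def pixels(width, height):
    return width * height

p720 = pixels(1280, 720)     # 921,600 pixels
p1080 = pixels(1920, 1080)   # 2,073,600 pixels

print(p720, p1080, p1080 / p720)  # 2.25x the pixels going from 720p to 1080p
```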


----------



## Disparia (Nov 4, 2012)

douglatins said:


> Wasn't the ps3 supposed to last 10 years?



All systems (except maybe the Sega 32X and the Nintendo Virtual Boy) have at least one or two timeless classics that keep them alive 

Now I know what you meant and I think it'll be close.



			
September 29th 2012 said:

> Last week, the Vice President of Hardware Marketing for the PlayStation brand, John Koller, told news outlet GameSpot that the company plans to support the PlayStation 3 until 2015. While some may think this automatically means the PlayStation 4 will not see release until 2015, this is probably not the case. As Koller explains, the PlayStation 2 stayed active for years, even after the PlayStation 3 saw release. It makes sense for Sony to continue to support the PlayStation 3 since they do not alienate those Sony fans who won't upgrade ASAP to the PlayStation 4. However, chances are good that all of Sony's major franchises will make the jump to the PlayStation 4, leaving only the smaller developers and publishers to continue to support the PlayStation 3.


----------



## Trovaricon (Nov 4, 2012)

Benetanegia said:


> Yes, I can agree with that, because of the effort and such, BUT the info he posted is heavily outdated and because of that almost entirely irrelevant. He's describing forward rendering and, to make matters worse, forward rendering with per-vertex shading. That hasn't been used in 95% of games for 5+ years now. Nowadays everything is calculated on a per-pixel basis and several buffers are created with the resulting pixel information. That's it: deferred rendering/shading.
> 
> In deferred rendering, what he describes only happens in the first pass (BF3 has 8+ passes): the diffuse color pass, which carries very little information. After that, everything from lighting to shading to advanced shadowing to ambient occlusion happens on a per-pixel basis. Without all of these per-pixel calculations the end result would look like a 90's 3D game. 90% of the work is based on pixel data == buffers == frames that are afterwards mixed (by the ROPs, again pixel by pixel) into the final composition.


What I described in my previous post is definitely not forward rendering, but exactly what I stated it is:


Trovaricon said:


> basic projection of 3D objects with textures into 2D space.


And yes, such a projection looks like a 90's 3D game. EDIT: I used the term "on screen" to avoid having to explain render targets.

Both forward and deferred rendering require more outputs from the vertex shader (e.g. normals), not only texture coordinates and the vertex position in 2D space. It is hard to imagine how you could produce dynamic shadows with only a single projection (in the mentioned BF)
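To illustrate why that extra projection matters, here's a made-up 1-D "shadow map" in Python (a sketch of the principle only, nothing like a real implementation): the scene is first rendered from the light's point of view to record the closest occluder depth, then each shaded point compares its own light-space depth against that map.

```python
# Toy 1-D shadow-map test (illustrative sketch): a depth map is rendered from
# the light first, then each shaded point compares its own light-space depth
# against that map to decide whether it is in shadow.

# "Depth render from the light": closest occluder depth per light-space column.
shadow_map = {0: 5.0, 1: 2.0, 2: 5.0}  # column 1 has an occluder at depth 2.0

def in_shadow(column, depth_from_light, bias=0.01):
    # A point is shadowed if something in the map sits closer to the light.
    return depth_from_light > shadow_map[column] + bias

print(in_shadow(1, 4.0))  # True: the occluder at depth 2.0 blocks the light
print(in_shadow(0, 4.0))  # False: nothing closer than 5.0 in that column
```

The extra pass from the light's viewpoint is exactly the "second projection" a single camera-space projection can't give you.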


----------



## VIPER (Nov 4, 2012)

btarunr said:


> ...The design goal is to be able to play games 1920 x 1080 pixels resolution, *with 60 Hz refresh rate*, and with the ability to run stereo 3D at 60 Hz...


13 pages of arguing about PS4 being capable of 1080p @60fps... OK, I know I am old, and maybe ignorant, but where does it say something about 60fps?!


----------



## Aquinus (Nov 4, 2012)

VIPER said:


> 13 pages of arguing about PS4 being capable of 1080p @60fps... OK, I know I am old, and maybe ignorant, but where does it say something about 60fps?!



The 360 can now output a "60 Hz signal", but that doesn't mean it's rendering 1080p at 60 Hz.


----------



## TheoneandonlyMrK (Nov 4, 2012)

Whilst some argue this won't be enough, IMHO this is an initial dev kit based on what they project will work with the PS4.
I recently noticed AMD have changed Trinity's successor back to Piledriver cores with Radeon cores next, which to me indicates a Trinity successor with GCN2 and HSA optimisations ahead of more CPU grunt. It's this chip, with additional IP yet to be disclosed (IMHO an interposer with a 126 MB level-4 cache and further DSPs), that I believe will be the basis of a PS4 anyway - not this Trinity chip directly, that's for sure - so apples-to-apples will count for nothing.

Add in hardware optimisations for consoles and AMD's hint at a hardcoded gaming future (API reduced/removed) and you will have a console that does 60 fps.

To the naysayers: I have a CrossFire main rig with a PhysX card (hybrid) and it will do 60 fps in every game at 1080p, with few settings ever needing to be eased bar AA. So to me, the games AMD don't do well in are Nvidia-biased or straight-up PhysX games. An interesting point is that the Xbox and PS3 implementations of PhysX use SSE-like extensions and are better optimised, so Nvidia are going to have to write PhysX to work well on AMD gear, of a kind.

I'd swear some games sometimes work better just because an Nvidia card is present (though not used by the game).


----------



## camoxiong (Nov 4, 2012)

a pc with Sony PS OS


----------



## Benetanegia (Nov 4, 2012)

Trovaricon said:


> Both forward and deferred rendering require more outputs from the vertex shader (e.g. normals), not only texture coordinates and the vertex position in 2D space. It is hard to imagine how you could produce dynamic shadows with only a single projection (in the mentioned BF)



And who said a single projection?

I'm sorry, but it looks like you're learning 3D programming and still stuck on Chapter 1.

Your argument was that there's more to it than pixel shading, which is true and no one said otherwise. However, you went all out describing what is basically <5% of a modern game's render pipeline and frame time, as if it represented a big proportion of it. So it's essentially true, but as I said, irrelevant to dispute the argument that 4x the resolution requires 4x the power*. Even modern shadows are much more than a projection into shadow maps, and they depend on pixel shading.

*I said it in a previous post: this word is the biggest problem. The problem, in a way, is that people read power as == performance in reviews, and that's incredibly inaccurate. A card with 2x the SPs is undeniably 2x as powerful in that department, whether it ends up producing 2x the fps or fails to do so because it's bottlenecked elsewhere. 

Now, the problem regarding the OT is that an A10 APU is severely limited on ALL fronts - SPs, ROPs, texture units, everything - and is simply not going to do what even a high-end GPU has difficulty achieving today, no matter the optimization. And that's another focus of argument, because some people are saying we don't know whether it's going to be a custom APU with more GPU power, etc., but the article states it's an A10, and that's not a custom APU, is it? It's a commercially available APU, which is why an APU is supposedly going to be used: because it's available and cheap to produce. The days of heavily customized chips are over. Otherwise (with a custom chip) they would have continued with the PowerPC architecture and kept backwards compatibility.



> and it will do 60 fps in every game at 1080p, with few settings ever needing to be eased bar AA



First of all, that's a lie, or at least very arbitrary about what "few settings needing to be eased" truly means. 

Second, your CrossFire setup is at least 5x more powerful than an A10 APU, so even if it were true, an APU would do 12 fps under the same "few settings eased" conditions. It's a console, so let's add a MASSIVE optimization bonus for being a console, and you might or might not reach 30 fps (a 200% increase) - but 60 fps, not a chance.

Anyway, the Wii U is rumored to have a significantly more powerful GPU than an A10 APU. Is Sony truly going to release something less capable?


----------



## Aquinus (Nov 4, 2012)

Benetanegia said:


> Anyway, the Wii U is rumored to have a significantly more powerful GPU than an A10 APU. Is Sony truly going to release something less capable?



They said a custom chip based on the A10. We don't actually know what the specs are so it is very possible that it could sport a larger iGPU than what the A10-5800k has. We will have to wait and see what Sony churns out.


----------



## THE_EGG (Nov 4, 2012)

As far as the 4X performance needed for 1080p @ 60fps and all the other arguments about the next playstation, CAN WE JUST AGREE TO DISAGREE?


----------



## Rei86 (Nov 4, 2012)

Benetanegia said:


> Anyway, the Wii U is rumored to have a significantly more powerful GPU than an A10 APU. Is Sony truly going to release something less capable?



Most of the Wii U's specs are out.

2 GB of shared RAM, with 1 GB always dedicated to games, a three-core Broadway-based CPU, and a GPU on the same die based on AMD's HD 4000/5000 series.


----------



## erocker (Nov 5, 2012)

After having to clean things up once again, frankly I'm fed up with this thread. So if anyone feels like ignoring the posting guidelines, insulting others, trolling, going off topic, or ignoring a moderator's instruction, you will receive a vacation for a while. This goes for everyone in this thread. I'm making no exceptions. This is your only warning.


----------



## jamsbong (Nov 5, 2012)

Sounds to me like very average hardware specs. This is so un-PlayStation-like.
All past PlayStation hardware was ahead of its time at launch. It took PCs a couple of years to catch up to the PS3. 

PS4 looks more like a cost-cutting exercise.


----------



## Mussels (Nov 5, 2012)

why do people keep insisting that the last-gen consoles had high-end hardware at launch? they had derivatives of high-end PC hardware, but that's not the same thing at all. the power consumption of the consoles as a whole was lower than the high-end video cards of that era, which really should give that fact away.

consoles never launch with high-end PC hardware. this launch is no exception - and it's generations faster than what we have now.


----------



## happita (Nov 5, 2012)

Let's all look on the bright side. Console gamers can finally enjoy what PC gamers have been spoiled with for a long time. I can't wait to see what developers can come up with using a DirectX 11-capable GPU, when this generation of consoles only had DX9-capable cards. It will no doubt be exciting for everyone. I'm tired of half-assed ports looking like shit and having all sorts of bugs because a developer didn't care enough to spend the hours needed for a fantastic gameplay experience. There are plenty of them out there, unfortunately.


----------



## Mussels (Nov 5, 2012)

i'm just glad the console ports will be native DX11 this time around. can't we all be happy with that?


----------



## Rei86 (Nov 5, 2012)

Mussels said:


> why do people keep insisting that the last-gen consoles had high-end hardware at launch? they had derivatives of high-end PC hardware, but that's not the same thing at all. the power consumption of the consoles as a whole was lower than the high-end video cards of that era, which really should give that fact away.
> 
> consoles never launch with high-end PC hardware. this launch is no exception - and it's generations faster than what we have now.



Eh, the PlayStation 3 was pretty high-end tech gear when it launched.

It had its own HDD, a built-in Blu-ray player, multiple outputs, four USB 2.0 ports, a memory card reader, etc.
And the STI-produced Cell CPU was pretty high-end for the time.


----------



## Benetanegia (Nov 5, 2012)

Mussels said:


> why do people keep insisting that the last-gen consoles had high-end hardware at launch? they had derivatives of high-end PC hardware, but that's not the same thing at all. the power consumption of the consoles as a whole was lower than the high-end video cards of that era, which really should give that fact away.
> 
> consoles never launch with high-end PC hardware. this launch is no exception - and it's generations faster than what we have now.



Yeah, but in order to be the same it would have to be something like an underclocked GTX 660 Ti or HD 7950, not an APU that is 5x less powerful; we are talking about the other side of the spectrum altogether. And no, an APU is just 1 or 2 generations faster than what the XB360 had, for example, and it's not launching now but at the end of 2013 or 2014, which is why I'm sure it will have something like an underclocked HD 7850. And just as they did before, once a smaller process is available they will integrate it SoC-fashion.


----------



## xenocide (Nov 5, 2012)

Rei86 said:


> It had its own HDD, a built-in Blu-ray player, multiple outputs, four USB 2.0 ports, a memory card reader, etc.
> And the STI-produced Cell CPU was pretty high-end for the time.



Most of those things had already existed on PCs for years; the only cutting-edge feature was Blu-ray. The Cell CPU was really impressive on paper, but the simple fact is it underwhelmed. The best analogy I can think of is Bulldozer: people assumed that when AMD went from 4/6 Phenom II cores to an 8-core Bulldozer they'd see a huge performance gain, but because per-core/per-thread performance was down so much, it was a huge disappointment. The reason Cell seemed so impressive was that it had a ton of cores; the problem was that at the time most developers had only handled 1-4 threads at a time, so suddenly developing software for 7 was a nightmare. Coupled with the fact that the per-core performance is pretty low, you see exactly why Sony isn't reusing it. They use Cell technology in TVs now, btw.



Benetanegia said:


> Yeah, but in order to be the same it would have to be something like an underclocked GTX 660 Ti or HD 7950, not an APU that is 5x less powerful; we are talking about the other side of the spectrum altogether. And no, an APU is just 1 or 2 generations faster than what the XB360 had, for example, and it's not launching now but at the end of 2013 or 2014, which is why I'm sure it will have something like an underclocked HD 7850. And just as they did before, once a smaller process is available they will integrate it SoC-fashion.



This is true. This generation, consoles launched with one-generation-old GPUs that had features of their successors. The Xenos was basically a streamlined X1950 XT with some of the features that appeared in the 2900 series, and similarly the PS3 GPU was 7800-based with some features from the 8800 series. The saving grace for the Xenos was its incredible use of memory and its implementation of eDRAM. For this upcoming generation of consoles to impress, it would have to use, as he said, a 7800-series or 660-class GPU, which is nearly impossible.


----------



## Rei86 (Nov 5, 2012)

xenocide said:


> Most of those things had already existed on PCs for years; the only cutting-edge feature was Blu-ray. The Cell CPU was really impressive on paper, but the simple fact is it underwhelmed. The best analogy I can think of is Bulldozer: people assumed that when AMD went from 4/6 Phenom II cores to an 8-core Bulldozer they'd see a huge performance gain, but because per-core/per-thread performance was down so much, it was a huge disappointment. The reason Cell seemed so impressive was that it had a ton of cores; the problem was that at the time most developers had only handled 1-4 threads at a time, so suddenly developing software for 7 was a nightmare. Coupled with the fact that the per-core performance is pretty low, you see exactly why Sony isn't reusing it. They use Cell technology in TVs now, btw.



For a product that was AIO I say it was pretty damn impressive at its time.


----------



## happita (Nov 5, 2012)

xenocide said:


> The reason Cell seemed so impressive was that it had a ton of cores; the problem was that at the time most developers had only handled 1-4 threads at a time, so suddenly developing software for 7 was a nightmare. Coupled with the fact that the per-core performance is pretty low, you see exactly why Sony isn't reusing it. They use Cell technology in TVs now, btw.



They all learned from experience: Sony, Toshiba, and IBM. A CPU designed with strong leanings toward supercomputing doesn't exactly translate into killer real-world game performance, as Sony saw. So I really don't blame them for not wanting to bring back the Cell architecture for a second round. But I have to say, they did a pretty damn good job for what they paid for. The developers who make exclusives for the PS3 really push it to its limits - for example God of War 3, Uncharted 2 & 3, etc.


----------



## Rei86 (Nov 5, 2012)

happita said:


> They all learned from experience: Sony, Toshiba, and IBM. A CPU designed with strong leanings toward supercomputing doesn't exactly translate into killer real-world game performance, as Sony saw. So I really don't blame them for not wanting to bring back the Cell architecture for a second round. But I have to say, they did a pretty damn good job for what they paid for. The developers who make exclusives for the PS3 really push it to its limits - for example God of War 3, Uncharted 2 & 3, etc.



Just as Nintendo keeps a hold on gamers' hearts with nostalgic names as the reason to purchase a Nintendo product, Sony has been on the offensive when it comes to first-party offerings.

I mean, really: Uncharted 1/2/3, God of War 3, Killzone 2/3, GT5, and many of the smaller titles on PSN like Journey have made owning a PS3 worth it.


----------



## happita (Nov 5, 2012)

Rei86 said:


> I mean, really: Uncharted 1/2/3, God of War 3, Killzone 2/3, GT5, and many of the smaller titles on PSN like Journey have made owning a PS3 worth it.



That, and the fact that I don't have to pay $50 a year just to play my games online. Sony did a good job when it decided not to charge its customers to play online, which is great and is another reason why I own a PS3 and not an Xbox.


----------



## Steevo (Nov 5, 2012)

erocker said:


> After having to clean things up once again, frankly I'm fed up with this thread. So if anyone feels like ignoring the posting guidelines, insulting others, trolling, going off topic, or ignoring a moderator's instruction, you will receive a vacation for a while. This goes for everyone in this thread. I'm making no exceptions. This is your only warning.



Big meanie!!!!!!

My first A8-5600 is together, and based on the tiny cooler they included and its very cool running temps, I am almost scared of what it is actually going to be able to do.

Benchmarks to follow shortly.


To avoid a double post.

3DMark 11 results (PC Health Check reported "Your PC is performing properly" on every run):

Stock, memory @ 800 MHz 9-9-9-24: P1189 3DMarks (Graphics 1056, Physics 3932, Combined 1087)

Overclocked, GPU at 1000 MHz, core speed bumped to 4.1 GHz, no change to memory speed as the memory doesn't like it: P1371 3DMarks (Graphics 1229, Physics 4163, Combined 1206)

4.2 GHz CPU, 1000 MHz GPU: P1370 3DMarks (Graphics 1227, Physics 4239, Combined 1208)


----------



## Binge (Nov 5, 2012)

Yes this is the thread that never ends... yes it goes on and on my friends... some people started trolling it not knowing a freaking thing, and they'll continue trolling it because.. I guess it's an e-penis win?  Yes this is the thread that never...

I'd like to thank the folks who have dabbled in game development for lending their comments despite the overwhelming wave of blatherskite.  That is all.


----------



## cdawall (Nov 5, 2012)

Binge said:


> Yes this is the thread that never ends... yes it goes on and on my friends... some people started trolling it not knowing a freaking thing, and they'll continue trolling it because.. I guess it's an e-penis win?  Yes this is the thread that never...
> 
> I'd like to thank the folks who have dabbled in game development for lending their comments despite the overwhelming wave of blatherskite.  That is all.



It works better to just unsubscribe from the thread.

Works rather well, as the silliness stops.


----------



## lyndonguitar (Nov 5, 2012)

Either way if the PS4 sucks or whatever, I'm still pretty excited for this coming new "gen", that means more GAMEZ!!!


----------



## Cataclysm_ZA (Nov 5, 2012)

Hello everybody! This is my first time posting on TPU; I've lurked as a guest for years. Today I decided I'd rather start adding my bit to the community, and this seems like the perfect place to start.



Thefumigator said:


> 2 - A10 is _today's_ top trinity apu, but won't be the only one. Think about A12, A14, A16, I mean, we don't really know how AMD will refresh its line of apus, we only know the line will be compatible.



AMD won't refresh the entire lineup until the generation after Steamroller hits, possibly unifying the whole family under one socket, all with GPU and ARM components. But for now, let's assume that Sony handing AMD a large pile of cash and telling it what is needed drives AMD to accelerate production of Piledriver cores on a 22 nm process with VLIW4 GPUs on a 32 nm process. They've arguably had enough time to get something like that taped out, because the Sony deal has been in the works for years. But since it doesn't look like Steamroller will be on 22 nm, it wouldn't be a train smash if the components were stuck on 32 nm and 40 nm respectively.

I don't think they'll change from the existing lineup of A4; A6; A8 and A10 cpus because that would introduce more complexity. But that's on the desktop which, for this thread, doesn't fit into the equation. What will be in the PS4 is going to be a different beast to what we're used to and it could be based off the A10-5700 and a HD6670 GPU. So lets leave the desktop out of this for now because, as everyone in this thread is eager to point out, they're not directly comparable. Similar in terms of hardware, yes, but with software they are very different performers.



Thefumigator said:


> 3 - Developers had to optimize multithreading on the PS3 very complex architecture. So adopting the A10 should make things easier, and dual GPU would be a breeze.



I also think that coding for an x86-64 architecture and a more modern instruction set will make developers' lives easier, but I'm not sold on the dual-GPU portion just yet (even on the desktop I'm hesitant to recommend SLI or CrossFire to anyone). I haven't seen results from an APU-and-GPU combo showing frame rates over time, and from what I've seen with SLI and CrossFire, the stuttering issues are enough to put some people off dual GPUs completely. However, I do know that with a strict hardware configuration the CrossFire rendering could be tweaked so that instead of rendering alternating frames the devs could choose to divide up the frames between the two. Unfortunately, only developers with access to these machines can answer our questions, and until then, everything else is just assumption.



1d10t said:


> Why did Sony choose an APU? Definitely Sony knew something we don't. Consoles are consoles, targeted at casual gamers who don't even bother about upscaling.



TDP requirements and  less complexity, mostly (along with cost, which is going to be a big factor). You can build a APU setup into a thin client-like/ITX  chassis without having to worry too much about cooling and your GPU requirements are mostly catered for already. When I was working at a computer repair shop a year ago I received three PS3s to diagnose and fix. When I could finally open two up for myself, one a launch version and the other a Slim I was stumped at how the launch versions could have survived the heat generation. You had these massive, (relatively) power-sucking chips that needed a good amount of cooling to stay functional for as long as the warranty remained valid and I often found with other units that the cooling wasn't always up to scratch. I had to re-flow the boards, clean out the cooling systems and lap the heatsinks so that the older units wouldn't overheat. 

With desktop-class APUs, the stock cooler is perfectly fine. In fact, the cooling requirement for the A10-5700 peaks at only a 65 W TDP, which means there's much less work required in designing the console's cooling system. I think we might even see a launch version as slim as the PS3 Slim (not the recent swanky one which, IMO, looks fugly), and in future it could become as small as the PS2 Slim. Considering that the APU in question might even be the mobile A10-4600M, it's plausible. 



1d10t said:


> The next post is a camera capture showing "1080p 60Hz" in the TV info, to counter your opinion regarding the lack of 1080p capabilities on the PS3. If you're suggesting a proper method for comparing these two, you may quote any of my last posts - did I mention "rendered"?



No one's ever said that the PS3 can't run games at 1080p. In fact, there's a handful that can, Prince of Persia being the only recent one I can remember. It's down to the developers to figure out what they want to sacrifice most: visual fidelity and potentially higher performance, or more stuff on the screen but potentially lower performance. 

To those of you who say that a game running at 30 fps at 1080p is crap, I'd have to agree with you initially. If I notice it, it becomes a problem until I play the game enough times to stop noticing it, and then it's smooth sailing from there on. Even Forza Horizon, which I got to play recently, runs at 720p and a minimum of 30 fps. Technically the developers could run the game at 1080p and get similar performance, but they'd have to sacrifice some visual fidelity and the beautiful world the game is rendered in. Personally, I don't have a problem with the speed at which the game is rendered, only that it looks good and doesn't suffer hiccups. 

Likewise, you can't directly compare today's consoles with desktops. Well, at least not the PS3, because the RSX GPU lacks some components and instruction sets that would make it comparable to a desktop-class GPU. The Xbox 360 is closer to a proper desktop setup but again can't be compared directly, because it can't render anything in DX10. A good deal of games today include a DX10/DX11 render path, which makes the comparison even more moot. 720p30 DX9 and 720p30 DX11 at their highest settings will look different and behave differently due to the rendering mode. With the PS4 and the Xbox 720 being based on modern hardware, at least we'll have consoles and computers on the same footing again in terms of graphical ability, if not performance. 

And I'm sad that the shift to an x86-64 architecture means my existing PS3 library won't be compatible with the new system, but I guess it's only fair that six years on a new standard is introduced. The PS3 has had an incredibly long run and it's time for something new.


----------



## T4C Fantasy (Nov 9, 2012)

http://www.techpowerup.com/gpudb/1682/NVIDIA_RSX_'Reality_Synthesizer'.html

Let's get nerdy - I've provided a spec sheet and extensive calculations in the comments. Enjoy.

Edit: http://www.ted.com/talks/melissa_marshall_talk_nerdy_to_me.html


----------



## 1d10t (Nov 9, 2012)

Cataclysm_ZA said:


> When I was working at a computer repair shop a year ago I received three PS3s to diagnose and fix. When I could finally open two up for myself, one a launch version and the other a Slim I was stumped at how the launch versions could have survived the heat generation. You had these massive, (relatively) power-sucking chips that needed a good amount of cooling to stay functional for as long as the warranty remained valid and I often found with other units that the cooling wasn't always up to scratch



Yes, I remember. Around January 2009 I bought a 40 GB PS3 CECHJ NTSC-J, which had been enhanced to a 65 nm process from the previous 90 nm. It still produced a large amount of heat from its rear blower fan; after 30 minutes of playing it sounded like a jet ready to take off 



> Likewise, you can't directly compare today's consoles with desktops. And I'm sad that the shift to an x86-64 architecture means my existing PS3 library won't be compatible with the new system, but I guess it's only fair that, six years on, a new standard is introduced. The PS3 has had an incredibly long run and it's time for something new.



And yet fellow TPU members here say they need such-and-such a GPU to do 1080p, mumble about so-called rendering, or give mumbo-jumbo calculations fit for a kindergarten exam.
Have they forgotten? This is a console, a piece of consumer electronics, a set-top box attached to a TV set. Sony had to make their console compatible with a wide range of TVs, so they attached a legacy A/V RCA output for analog sets while sporting HDMI for SD/HD/Full HD TVs. They didn't have to do 60 fps, because doing so would violate the NTSC standard (29.97 fps), the PAL standard (25 fps), or the less common SECAM.

The current PS3 uses a Linux 2.4-based kernel; I bet Sony will use Linux 3.0 with some HSA optimizations.
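
As a footnote on the frame-rate figures above (a quick sketch of the standards' arithmetic, nothing console-specific): the analog standards don't use round numbers, which is why spec sheets quote 59.94 Hz rather than 60 Hz for NTSC-derived output modes.

```python
from fractions import Fraction

# Exact frame rates behind the analog TV standards mentioned above.
ntsc_frame_rate = Fraction(30000, 1001)  # color NTSC, ~29.97 fps
pal_frame_rate = Fraction(25, 1)         # PAL, exactly 25 fps

# Interlaced NTSC delivers two fields per frame, which is why "60 Hz"
# console modes are really ~59.94 Hz on NTSC-derived displays.
ntsc_field_rate = 2 * ntsc_frame_rate

print(float(ntsc_frame_rate))  # ~29.97002997
print(float(ntsc_field_rate))  # ~59.94005994
```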


----------



## KainXS (Nov 9, 2012)

I am wondering how much confirmed information on the Wii U's GPU is available; most of what I have seen are rumors. Does anyone know any actual facts? All we have are some pictures showing that the GPU and CPU are on the same package, plus unconfirmed sources, just like when the Wii launched and it took years to verify the specs after release.


----------



## newtekie1 (Nov 9, 2012)

Mussels said:


> why do people keep insisting that the last gen consoles had high-end hardware at launch? they had derivatives of high-end PC hardware, but that's not the same thing at all. the power consumption of the consoles as a whole was lower than the high-end video cards of that era, which really should give that fact away.
> 
> consoles never launch with high-end PC hardware. this launch is no exception, and it's generations faster than what we have now.



Agreed completely, and it only takes a little bit of research to confirm this.

Let's look at the PS3. It launched in Nov 2006. It used basically the core from a 7800 GTX, but with lower memory clocks and fewer ROPs; for the sake of argument, let's just call it a 7800 GTX. At almost exactly the same time, within days actually, NVIDIA released the 8800 GTX, which was a huge leap ahead of what the 7800 GTX was capable of. Not only did it provide 50-100% more performance in DX9 (depending on the game), it also introduced DX10.

The Xbox 360 wasn't much different. It used, basically, an X1800 XT GPU (again with lower clocks and memory speeds, but we'll just say X1800 XT). And while it was a little more up to date when it was released, thanks to coming out a year earlier than the PS3, it was still behind, as ATI had just released the X1900 series as the Xbox was coming out.


----------



## Benetanegia (Nov 9, 2012)

newtekie1 said:


> Agreed completely, and it only takes a little bit of research to confirm this.
> 
> Let's look at the PS3. It launched in Nov 2006. It used basically the core from a 7800 GTX, but with lower memory clocks and fewer ROPs; for the sake of argument, let's just call it a 7800 GTX. At almost exactly the same time, within days actually, NVIDIA released the 8800 GTX, which was a huge leap ahead of what the 7800 GTX was capable of. Not only did it provide 50-100% more performance in DX9 (depending on the game), it also introduced DX10.
> 
> The Xbox 360 wasn't much different. It used, basically, an X1800 XT GPU (again with lower clocks and memory speeds, but we'll just say X1800 XT). And while it was a little more up to date when it was released, thanks to coming out a year earlier than the PS3, it was still behind, as ATI had just released the X1900 series as the Xbox was coming out.



And again, the PS4 will not release until 2014, or very late 2013. By that time the HD 8900 refresh will be out, if not the HD 9900. Following the logic above, the PS4 should use at the very least an HD 7950 (lower clocks, maybe a 256-bit bus), or simply an HD 7870 or whatever, OR, as said, an HD 8950, because the HD 9900 is around the corner.

They are talking about an APU, in the best-case scenario maybe paired up with an HD 6770 or something like that. It's as if the PS3 had shipped with a GeForce 6600 instead of a 7800. Plus, in 2004/2005, when the PS3 was spec'd, the 7800 was the fastest card.

So yeah, let's stop with that argument. No one's saying the PS3 had a high-end GPU when it launched, but it definitely had a (slower) variant of the high-end GPUs of the era in which it was designed. The PS4, by the time it launches, will have a low-end GPU three generations behind.


----------



## T4C Fantasy (Nov 10, 2012)

It's like this every year; some people get sick of it, but some people love to see others talk about things they don't know about. 

Here are the facts (I'm not wrong, and it's been said before, even in this thread): game consoles only need a fraction of the power a PC needs for gaming, because there is no heavy multitasking, no OS throttling, and devs get a dev kit that PROMISES 100% of the hardware's performance, so they have time to work with what little they have without worrying about component upgrades. Even today, I bet none of you can play GTA IV with 256 MB of RAM, even with an HD 7970, on Windows XP and up, with frames as good as a PS3 gets... lol, and even with a Core i7-3770K, I doubt GTA IV would even open; it would just say fuck you, I'm busy.


----------



## Benetanegia (Nov 10, 2012)

T4C Fantasy said:


> its like this every year, some people get sick of it, but some people love to see others talk about things they don't know about.



I'm actually a low-profile game developer and I know that to be BS. Not 100% bullshit, but it's completely overblown. A console being a closed platform with a very lightweight OS helps increase performance, but not to the point where a console becomes 2x faster on the same hardware, and let's not even start talking about an 8x hardware difference.

As a developer I also know that console games are heavily watered down (in many more ways than the average guy can appreciate, e.g. in the bit precision of internal buffers), so any comparison of performance is rather pointless, since you are never comparing apples to apples. GTA IV on PC is not the same as the console versions. Plus it's horribly optimized, if at all.

I also know that they have to work almost *miracles* to "optimize" console games to the point where they run at 30 fps, whereas they are only allowed to spend a fraction of that time optimizing for PC. If they spent the same time and used the same tricks, the performance would be really close to the consoles'. In fact, early games like Gears of War looked better and ran better on PC way back in 2006, despite the fact that the hardware difference was not as pronounced back then. Why that was possible is simple: the PC version was "optimized" (read: watered down + real optimization) almost as much as the console version.


----------



## THE_EGG (Nov 10, 2012)

Benetanegia said:


> I'm actually a low-profile game developer and I know that to be BS. Not 100% bullshit, but it's completely overblown. A console being a closed platform with a very lightweight OS helps increase performance, but not to the point where a console becomes 2x faster on the same hardware, and let's not even start talking about an 8x hardware difference.
> 
> As a developer I also know that console games are heavily watered down (in many more ways than the average guy can appreciate, e.g. in the bit precision of internal buffers), so any comparison of performance is rather pointless, since you are never comparing apples to apples. GTA IV on PC is not the same as the console versions. Plus it's horribly optimized, if at all.
> 
> I also know that they have to work almost *miracles* to "optimize" console games to the point where they run at 30 fps, whereas they are only allowed to spend a fraction of that time optimizing for PC. If they spent the same time and used the same tricks, the performance would be really close to the consoles'. In fact, early games like Gears of War looked better and ran better on PC way back in 2006, despite the fact that the hardware difference was not as pronounced back then. Why that was possible is simple: the PC version was "optimized" (read: watered down + real optimization) almost as much as the console version.



Thanks for the insight. I suppose it makes sense that older games ran better on PC vs. console compared to newer ones, because, as you said, the performance gap wasn't as large back then. I think the worst port I've played on PC was 'Wheelman', where even the controls were left as X360 prompts (such as the 'A' button, 'RT', etc.), and that happened without any controller plugged in as well D:


----------



## T4C Fantasy (Nov 10, 2012)

Benetanegia said:


> I'm actually a low-profile game developer and I know that to be BS. Not 100% bullshit, but it's completely overblown. A console being a closed platform with a very lightweight OS helps increase performance, but not to the point where a console becomes 2x faster on the same hardware, and let's not even start talking about an 8x hardware difference.
> 
> As a developer I also know that console games are heavily watered down (in many more ways than the average guy can appreciate, e.g. in the bit precision of internal buffers), so any comparison of performance is rather pointless, since you are never comparing apples to apples. GTA IV on PC is not the same as the console versions. Plus it's horribly optimized, if at all.
> 
> I also know that they have to work almost *miracles* to "optimize" console games to the point where they run at 30 fps, whereas they are only allowed to spend a fraction of that time optimizing for PC. If they spent the same time and used the same tricks, the performance would be really close to the consoles'. In fact, early games like Gears of War looked better and ran better on PC way back in 2006, despite the fact that the hardware difference was not as pronounced back then. Why that was possible is simple: the PC version was "optimized" (read: watered down + real optimization) almost as much as the console version.



I didn't mention anything about graphics, only fps. Of course the PC version is better in every way, except that it requires more demanding hardware, and for very good reason: the multitasking and throttling. And consoles bring out the best in devs; there still isn't a PC racing game with better graphics than GT5. Maybe better simulation, but not graphics, and gameplay is a matter of opinion. In rare cases devs put more work into graphics for consoles than for PCs and achieve success in doing so.


----------



## Filiprino (Nov 10, 2012)

Benetanegia said:


> I'm actually a low-profile game developer and I know that to be BS. Not 100% bullshit, but it's completely overblown. A console being a closed platform with a very lightweight OS helps increase performance, but not to the point where a console becomes 2x faster on the same hardware, and let's not even start talking about an 8x hardware difference.
> 
> As a developer I also know that console games are heavily watered down (in many more ways than the average guy can appreciate, e.g. in the bit precision of internal buffers), so any comparison of performance is rather pointless, since you are never comparing apples to apples. GTA IV on PC is not the same as the console versions. Plus it's horribly optimized, if at all.
> 
> I also know that they have to work almost *miracles* to "optimize" console games to the point where they run at 30 fps, whereas they are only allowed to spend a fraction of that time optimizing for PC. If they spent the same time and used the same tricks, the performance would be really close to the consoles'. In fact, early games like Gears of War looked better and ran better on PC way back in 2006, despite the fact that the hardware difference was not as pronounced back then. Why that was possible is simple: the PC version was "optimized" (read: watered down + real optimization) almost as much as the console version.



That's what I always try to explain.


----------

