# Next gen consoles will ALL use AMD GPUs



## twilyth (Jul 8, 2011)

This is big news for AMD if it's accurate.  Nintendo, Sony and Microsoft will all use AMD GPUs in their next gen consoles.



> The Big GPU News
> 
> What looks to be a "done deal" at this point is that AMD will be the GPU choice on all three next generation consoles. Yes, all the big guns in the console world, Nintendo, Microsoft, and Sony, are looking very much to be part of Team AMD for GPU. That is correct, NVIDIA, "NO SOUP FOR YOU!" But NVIDIA already knew this, now you do too.
> 
> There are going to be game spaces that NVIDIA does succeed in beyond add in cards and that will likely be in the handheld device realm but we do not see much NVIDIA green under our TV sets. NVIDIA was planning to have very much underwritten its GPU business with Tegra and Tegra 2 revenues by now, but that is moving much slower than the upper brass at NVIDIA wishes. Tegra 2 penetration has been sluggish to say the least.


----------



## Frick (Jul 8, 2011)

Oh oh oh my oh my. How much would that deal be worth to AMD? Not that it matters really, it's still good news.


----------



## HossHuge (Jul 8, 2011)

I wonder if they showed Nintendo, Microsoft, and Sony this chart as a part of their sales pitch?  I'm guessing it would really help.


----------



## erixx (Jul 8, 2011)

Consoles just have to be and stay crapware. I can hardly imagine the Catalyst hotfix nightmare all those households will have...


----------



## Batou1986 (Jul 8, 2011)

Now we will see all the consoles fail because there will be no e-peen fights over who's got the better gfx; it will be all up to the developers.

Kinda how the PS3 has the best gfx if you compare PGR3 to GT5, but compare GT5 to Forza 3 or Blur and they're about the same; what the Xbox lacks in power they make up for in engine design.


----------



## dir_d (Jul 8, 2011)

erixx said:


> Consoles just have to be and stay crapware. I can hardly imagine the Catalyst hotfix nightmare all those households will have...


You do realize that AMD has been in the top consoles for some time now, and their desktop driver software has nothing to do with their console hardware.



Batou1986 said:


> Now we will see all the consoles fail because there will be no e-peen fights over who's got the better gfx; it will be all up to the developers.
> 
> Kinda how the PS3 has the best gfx if you compare PGR3 to GT5, but compare GT5 to Forza 3 or Blur and they're about the same; what the Xbox lacks in power they make up for in engine design.



The article said they will all be running AMD GPUs; it did not say they would all be running the SAME AMD GPU.


----------



## Nesters (Jul 8, 2011)

erixx said:


> Consoles just have to be and stay crapware. I can hardly imagine the Catalyst hotfix nightmare all those households will have...



You do realise that those "hot-fix" releases are more like updates than actual bug/problem fixes... right?

Don't judge things by their labels.


----------



## FordGT90Concept (Jul 8, 2011)

That's what NVIDIA gets for focusing on CUDA/Tesla.  I think the PS3 is the only console running an NVIDIA GPU right now.

Sony is in a world of hurt (read: either really expensive hardware or software emulation, which won't be easy given the PS3 architecture) in terms of backwards compatibility on the PS4.  That puts the Xbox 360 and Wii in a better position.  Add to it the fact that the PS4 is unlikely to debut a new hardware medium (where the PS was CD, the PS2 was DVD, and the PS3 was Blu-ray), and the PS4 is lining up to be... not a failure, but not wildly successful either.  Oh, and can't forget it's going to attempt to clone Kinect--another strike.  The only thing the PS4 might have going for it is exclusive titles, but that can be said of all consoles.


----------



## antuk15 (Jul 8, 2011)

FordGT90Concept said:


> That's what NVIDIA gets for focusing on CUDA/Tesla.  I think the PS3 is the only console running an NVIDIA GPU right now.
> 
> Sony is in a world of hurt (read: either really expensive hardware or software emulation, which won't be easy given the PS3 architecture) in terms of backwards compatibility on the PS4.  That puts the Xbox 360 and Wii in a better position.  Add to it the fact that the PS4 is unlikely to debut a new hardware medium (where the PS was CD, the PS2 was DVD, and the PS3 was Blu-ray), and the PS4 is lining up to be... not a failure, but not wildly successful either.  Oh, and can't forget it's going to attempt to clone Kinect--another strike.  The only thing the PS4 might have going for it is exclusive titles, but that can be said of all consoles.



PS3 emulation will be easy; RSX is just a bog-standard NVIDIA PC GPU with no special alterations. Porting RSX code over to the PS4 should be a piece of cake.

Getting Cell code to run will be the tricky part, unless Sony uses one of the newer generations of Cell for the PS4.

Personally I hate consoles. They should all die.


----------



## FordGT90Concept (Jul 8, 2011)

It's the Cell CPU that is going to be difficult to emulate.  Most likely they'll have to use AMD Stream to off-load the SPE work to the GPU.

The only tricky part with RSX is memory handling, because it shared RAM with the Cell CPU and connected via Cell's I/O bus.  That would be very difficult to implement if they went with hardware emulation without doing what the PS3 did (a hardware PS2 processor onboard). Really, software emulation is the only option for PS3 support, and that means they have to spend more, especially on the CPU.

Any way you shake it, it is likely to be either very buggy with backwards compatibility or very expensive.


----------



## TheoneandonlyMrK (Jul 8, 2011)

This news will not be good for NVIDIA or Intel, as some or all of the consoles may end up with APUs over CPUs. It would be very good for the PC world, as AMD have always supported open standards, so porting would become a doddle.

And this news, if true, would certainly explain a recent comment made by an ATI rep that in future dev companies may go back to bare-metal coding to get the best out of gfx hardware.
Good news for us PC gamers; I just hope it's true.

I do like NVIDIA and wish them no trouble fiscally, but I don't like their attitude to open standards or their Apple-like corporate strategy sometimes (PhysX is bollox).

How many games would have better physics if the code was unified into a standard? Something that might help the whole IT universe, including many areas beyond games.


----------



## FordGT90Concept (Jul 8, 2011)

All three are using IBM CPUs (as the current gen consoles do), so there's zero chance of all-in-one processors being used.


----------



## TheoneandonlyMrK (Jul 8, 2011)

FordGT90Concept said:


> All three are using IBM CPUs (as the current gen consoles do), so there's zero chance of all-in-one processors being used.



Rumoured to maybe use IBM Cell processors. I read elsewhere that the Xbox and possibly the PS4 MAY (also a rumour) use an AMD APU that is as yet unseen.

Even this news is a rumour, nothing more, and I'm not saying you're wrong; I'm saying we may both be.


----------



## TheMailMan78 (Jul 8, 2011)

Batou1986 said:


> Now we will see all the consoles fail because there will be no epeen fights over who's got the better gfx it will be all up to the developer.
> 
> Kinda how PS3 has the best gfx if you compare PGR3 to GT5 but compare  GT5 to Forza 3 or Blur and there about the same what the xbox lacks in power they make up for in engine design.


 Since when did the PS3 have a better GPU? They are both damn near identical in power.



FordGT90Concept said:


> All three are using IBM CPUs (as the current gen consoles do), so there's zero chance of all-in-one processors being used.



Oh? Why? Because they use IBM chips now they can't use an AMD APU in future? Why exactly, when most console games are ported over to PC? A good APU would make the process even easier. It's not like they make the fucking games on an IBM PowerPC platform.

APU is the future.


----------



## FordGT90Concept (Jul 8, 2011)

PS4 = IBM POWER7
Wii U = IBM "Power-based"

If the PS4 or the next Xbox use an x86 CPU, you can kiss backwards compatibility goodbye.  Microsoft learned that lesson when they went from the Xbox to the Xbox 360. Sony isn't likely to make that mistake either.  POWER processors are the easiest way forward for all of them.  IBM does large-scale custom orders, where Intel would rather just sell them an underclocked Core 2 processor in volume.  AMD would likely respond the same, and they don't even own foundries anymore, where Intel and IBM do.


----------



## TheMailMan78 (Jul 8, 2011)

FordGT90Concept said:


> PS4 = IBM POWER7
> Wii U = IBM "Power-based"
> 
> If the PS4 or the next Xbox use an x86 CPU, you can kiss backwards compatibility goodbye.  Microsoft learned that lesson when they went from the Xbox to the Xbox 360. Sony isn't likely to make that mistake either.  POWER processors are the easiest way forward for all of them.  IBM does large-scale custom orders, where Intel would rather just sell them an underclocked Core 2 processor in volume.



Any of the AMD APUs should be able to emulate any of the current platforms perfectly fine. Hell, I can play Dolphin just fine on my laptop right now.


----------



## antuk15 (Jul 8, 2011)

FordGT90Concept said:


> It's the Cell CPU that is going to be difficult to emulate.  Most likely they'll have to use AMD Stream to off-load the SPE work to the GPU.
> 
> The only tricky part with RSX is memory handling, because it shared RAM with the Cell CPU and connected via Cell's I/O bus.  That would be very difficult to implement if they went with hardware emulation without doing what the PS3 did (a hardware PS2 processor onboard). Really, software emulation is the only option for PS3 support, and that means they have to spend more, especially on the CPU.
> 
> Any way you shake it, it is likely to be either very buggy with backwards compatibility or very expensive.



No it wouldn't. RSX spills into system RAM because it doesn't have enough VRAM to keep everything in local memory.

I expect the PS4's GPU to have more than enough VRAM to keep everything in its local memory pool, so they would have enough VRAM in the PS4 just to dump all of RSX's data in there.

And the PS4 might offer the same memory configuration anyway.


----------



## TheoneandonlyMrK (Jul 8, 2011)

TheMailMan78 said:


> Since when did the PS3 have a better GPU? They are both damn near identical in power.



True, they are both poo; the PS3 is saved from being Xbox-level shit by the Cell processor (I've heard only 5 of the 6 SPEs work, mind) running gfx-assisting threads, AFAIK.

FordGT90Concept said:


> PS4 = IBM POWER7
> Wii U = IBM "Power-based"
> 
> If the PS4 or the next Xbox use an x86 CPU, you can kiss backwards compatibility goodbye. Microsoft learned that lesson when they went from the Xbox to the Xbox 360. Sony isn't likely to make that mistake either. POWER processors are the easiest way forward for all of them. IBM does large-scale custom orders, where Intel would rather just sell them an underclocked Core 2 processor in volume. AMD would likely respond the same, and they don't even own foundries anymore, where Intel and IBM do.

That's balls. AMD are a chip-architecture design house; they would be perfectly happy to do a special for each, and any foundry will make what it's goddamn told given a multi-million chip order.


----------



## antuk15 (Jul 8, 2011)

TheMailMan78 said:


> Since when did the PS3 have a better GPU? They are both damn near identical in power.
> 
> 
> 
> ...



They ain't going to go with an APU just to make ports easier :shadedshu

They're going to go with whatever gives the best power-to-price ratio. They don't do things to make PC life easier; why would they? They would be crippling their own platform for the sake of a competing one.



theoneandonlymrk said:


> True, they are both poo; the PS3 is saved from being Xbox-level shit by the Cell processor (I've heard only 5 of the 6 SPEs work, mind) running gfx-assisting threads, AFAIK.



RSX is slightly faster than Xenos at shader work, but Xenos completely slaps RSX in the triangle department, which is why most developers use Cell for culling.

Cell inside the PS3 has 7 working SPUs. The 7th is partly shared with the OS, and the other 6 are fully available for whatever developers want to use them for.


----------



## TheMailMan78 (Jul 8, 2011)

All I know is this generation may still be IBM, but soon they will all have to go APU. It will become way too expensive not to. Dedicated GPUs and uber-powerful desktops are going to be dinosaurs in 15 years. Production will flip, and APUs will be the only logical path.


----------



## FordGT90Concept (Jul 8, 2011)

TheMailMan78 said:


> Any of the AMD APUs should be able to emulate any of the current platforms perfectly fine. Hell, I can play Dolphin just fine on my laptop right now.


The processor in the Wii is a piddly 700 MHz single-core, and your computer is?  2+ GHz and more than one core.  You've got a lot of processing headroom where consoles don't.  Not to mention, there's been over 4 years of work on the emulator and not all titles play flawlessly.

This is what happened when Microsoft changed from Intel to IBM.  They've been trying to patch it ever since.

x86 and POWER don't play together nicely.




TheMailMan78 said:


> All I know is this generation may still be IBM, but soon they will all have to go APU. It will become way too expensive not to. Dedicated GPUs and uber-powerful desktops are going to be dinosaurs in 15 years. Production will flip, and APUs will be the only logical path.


I think more likely, IBM and AMD would pair up to design a POWER + Radeon all-in-one processor on a single die, or at least an MCM.  Hell, an MCM is possible in the upcoming generation but, considering they are usually passively cooled, they may still keep them separate.  It's easier to fight two relatively cool hot spots than one really hot hotspot.


----------



## TheMailMan78 (Jul 8, 2011)

FordGT90Concept said:


> The processor in the Wii is a piddly 700 MHz single-core, and your computer is?  2+ GHz and more than one core.  You've got a lot of processing headroom where consoles don't.  Not to mention, there's been over 4 years of work on the emulator and not all titles play flawlessly.
> 
> This is what happened when Microsoft changed from Intel to IBM.  They've been trying to patch it ever since.
> 
> x86 and POWER don't play together nicely.



Yeah, because it's a little open-source project. If Nintendo backed it, then it would be fine.



antuk15 said:


> They ain't going to go with an APU just to make ports easier :shadedshu
> 
> They're going to go with whatever gives the best power-to-price ratio. They don't do things to make PC life easier; why would they? They would be crippling their own platform for the sake of a competing one.



Do you think they make those games on a console? Maybe in the dashboard there is a "game maker" app?


----------



## TheoneandonlyMrK (Jul 8, 2011)

Yet Apple went from IBM to Intel just recently.

Fair dos, I've no PS3; I thought it was 6 total with one disabled, but same shit, different number: 8 with one disabled, wtf.


----------



## TheMailMan78 (Jul 8, 2011)

FordGT90Concept said:


> I think more likely, IBM and AMD would pair up to design a POWER + Radeon all-in-one processor on a single die, or at least an MCM.  Hell, an MCM is possible in the upcoming generation but, considering they are usually passively cooled, they may still keep them separate.  It's easier to fight two relatively cool hot spots than one really hot hotspot.



How hot do you think an APU designed strictly for a console is gonna be? If they can be cooled in a laptop then a console is cake.


----------



## antuk15 (Jul 8, 2011)

TheMailMan78 said:


> Do you think they make those games on a console? Maybe in the dashboard there is a "game maker" app?



You mean like how they code Cell on a PC? A CPU that has an architecture that's fuck-all like anything on a PC. Seriously, get a clue. 

They're not going to pick hardware to make porting easier to other machines.


----------



## TheMailMan78 (Jul 8, 2011)

antuk15 said:


> You mean like how they code Cell on a PC? A CPU that has an architecture that's fuck-all like anything on a PC. Seriously, get a clue.
> 
> They're not going to pick hardware to make porting easier to other machines.



Now the cell owns everything on the PC?


----------



## antuk15 (Jul 8, 2011)

theoneandonlymrk said:


> Yet Apple went from IBM to Intel just recently.
> 
> Fair dos, I've no PS3; I thought it was 6 total with one disabled, but same shit, different number: 8 with one disabled, wtf.



A fully active Cell has 8 SPEs and a PowerPC core.

The PS3's Cell has one SPE disabled to help with yields, which leaves 7 SPEs and one PowerPC core.


----------



## antuk15 (Jul 8, 2011)

TheMailMan78 said:


> Now the cell owns everything on the PC?



You're an idiot and have no idea what you're talking about.


----------



## FordGT90Concept (Jul 8, 2011)

I suggest you (anyone thinking it's good to switch from POWER to x86) read this:
Analysis: x86 Vs PPC




TheMailMan78 said:


> How hot do you think an APU designed strictly for a console is gonna be? If they can be cooled in a laptop then a console is cake.


Very. Imagine a 2+ GHz quad-core CPU and an HD 5670 or HD 5770 crammed into the same space.  The PS4 is likely to have even more than that.


----------



## antuk15 (Jul 8, 2011)

FordGT90Concept said:


> Very. Imagine a 2+ GHz quad-core CPU and an HD 5670 or HD 5770 crammed into the same space.  The PS4 is likely to have even more than that.



You really think they'll just stick a 45nm GPU inside a new console when the 28nm process will have matured by then?


----------



## TheMailMan78 (Jul 8, 2011)

antuk15 said:


> You're an idiot and have no idea what you're talking about.



Actually I do know what I am talking about. The only thing good about the Cell is the DSP. Everything else pretty much sucks.



FordGT90Concept said:


> I suggest you (anyone thinking it's good to switch from POWER to x86) read this:
> Analysis: x86 Vs PPC
> 
> 
> ...



True. Very true. However, the first generation of APUs is already running DX11 with very good frame rates. Things will get cooler and quicker. APUs are heading for laptops and tablets soon. Once they do, the heat will be so small it would be dumb NOT to go APU.


----------



## antuk15 (Jul 8, 2011)

TheMailMan78 said:


> Actually I do know what I am talking about. The only thing good about the Cell is the DSP. Everything else pretty much sucks.



But back in the day it was a very, very powerful CPU. In fact, when it comes to running graphical tasks it can hold its own with the best PC CPUs.

IBM and Sony have gone through quite a few Cell revisions since the PS3 launched and added better performance on top.


----------



## FordGT90Concept (Jul 8, 2011)

antuk15 said:


> You really think they'll just stick a 45nm GPU inside a new console when the 28nm process will have matured by then?


Consoles are always behind on manufacturing process because they need massive production runs for launch.

Not to mention, requirements change.  The PS3/Xbox 360 struggle at 1080i, never mind 1080p, and that is going to be expected of the next generation.  A 45nm processor (what they are using now) is simply inadequate for the constantly evolving needs.  Smaller processes help, but it's still going to run hot.


----------



## TheoneandonlyMrK (Jul 8, 2011)

antuk15 said:


> A fully active Cell has 8 SPEs and a PowerPC core.
> 
> The PS3's Cell has one SPE disabled to help with yields, which leaves 7 SPEs and one PowerPC core.



Thanks for telling me that again; I had failed to read it stated earlier. I do not have a PS3 or any console, but y'all could be wrong: Apple used IBM PowerPC chips and now uses Sandy Bridge Intel, and there is no issue that stays an issue if millions of pounds are involved. You're so confident about Cell use when it's an effin' rumour, nugget; I'm only sayin' nowt's set in stone yet.

LMAO, the PS3 has a Blu-ray player, and it doesn't struggle with 1080i or 1080p; it IS slower, but ANY gfx card or console would be given the higher workload.


----------



## antuk15 (Jul 8, 2011)

theoneandonlymrk said:


> Thanks for telling me that again; I had failed to read it stated earlier. I do not have a PS3 or any console, but y'all could be wrong: Apple used IBM PowerPC chips and now uses Sandy Bridge Intel, and there is no issue that stays an issue if millions of pounds are involved. You're so confident about Cell use when it's an effin' rumour, nugget; I'm only sayin' nowt's set in stone yet.



I hate consoles; I've not owned one since the PS2 was king...

Jaggy, shimmery console image quality... Ewwww.


----------



## TheMailMan78 (Jul 8, 2011)

antuk15 said:


> But back in the day it was a very very powerful CPU, Infact when it comes to running graphicical tasks it can hold it's own with the best PC CPU's.
> 
> IBM and Sony have gone quite a few Cell revisions since PS3 launched and added better performance on top.



5 years ago... Used a Sandy yet?

Dude, the Cell was an over-hyped piece of shit. Still is. It's fanboy fodder.


----------



## antuk15 (Jul 8, 2011)

FordGT90Concept said:


> Consoles are always behind on manufacturing process because they need massive production runs for launch.
> 
> Not to mention, requirements change.  The PS3/Xbox 360 struggle at 1080i, never mind 1080p, and that is going to be expected of the next generation.  A 45nm processor (what they are using now) is simply inadequate for the constantly evolving needs.  Smaller processes help, but it's still going to run hot.



When they launched, 1080p TVs were either not widespread or too expensive.

Heck, in the UK 1080p was still for the mega rich at the time of the PS3's release.

IIRC Sony have their own fabs and could easily handle fabbing 32nm parts.


----------



## FordGT90Concept (Jul 8, 2011)

Few engines use 4 SPEs, never mind 7.  SPEs are a PITA to code for, and Sony is likely to return to the POWER architecture, which is easier to handle (it's more intelligent in delegating tasks).


----------



## TheoneandonlyMrK (Jul 8, 2011)

antuk15 said:


> Heck, in the UK 1080p was still for the mega rich at the time of the PS3's release.



A PS3 was for the mega rich when they came out.


----------



## TheMailMan78 (Jul 8, 2011)

antuk15 said:


> When they launched, 1080p TVs were either not widespread or too expensive.
> 
> Heck, in the UK 1080p was still for the mega rich at the time of the PS3's release.
> 
> IIRC Sony have their own fabs and could easily handle fabbing 32nm parts.



It upscales to 1080i now. It can't even do that natively.


----------



## antuk15 (Jul 8, 2011)

TheMailMan78 said:


> 5 years ago... Used a Sandy yet?
> 
> Dude, the Cell was an over-hyped piece of shit. Still is. It's fanboy fodder.



Dude, SB is an integer monster, but most graphical tasks are floating-point based, and Cell is much better suited to them than SB.

Maybe you should try reading up on what developers are actually running on Cell.

A few of the newer games are doing full MLAA (morphological anti-aliasing) on the SPEs, along with full culling, physics, lighting calculations and other very intensive lighting simulations.

Most PC CPUs would not be able to handle all of that.

As a normal CPU Cell is old hat, but pull your head out of the PC's arse and give credit where credit's due.


----------



## antuk15 (Jul 8, 2011)

TheMailMan78 said:


> It upscales to 1080i now. It can't even do that natively.



Because it doesn't have a true scaler chip like the 360 does.

But it still has more native 1080p games than the 360 does.


----------



## MilkyWay (Jul 8, 2011)

Well, TBH the Xbox 360 S has a sort of APU, with the CPU and GPU on one die.
I could see the PS4 going NVIDIA to keep some sort of backwards compatibility, but my guess is it's more to do with what CPU they use, which I think will be a modified or upgraded Cell.

An APU works on a system like this since you don't upgrade; on a PC I want the ability to change whatever I want.


----------



## TheMailMan78 (Jul 8, 2011)

MilkyWay said:


> Well, TBH the Xbox 360 S has a sort of APU, with the CPU and GPU on one die.
> I could see the PS4 going NVIDIA to keep some sort of backwards compatibility, but my guess is it's more to do with what CPU they use, which I think will be a modified or upgraded Cell.
> 
> An APU works on a system like this since you don't upgrade; on a PC I want the ability to change whatever I want.



Exactly. It's only a matter of time before they merge.


----------



## FordGT90Concept (Jul 8, 2011)

antuk15 said:


> When they launched, 1080p TVs were either not widespread or too expensive.
> 
> Heck, in the UK 1080p was still for the mega rich at the time of the PS3's release.
> 
> IIRC Sony have their own fabs and could easily handle fabbing 32nm parts.


Even if 1080p were widespread, they wouldn't support it because of the costs in terms of GPU and memory (which translate to heat, money, and power consumption).  The PS3 was already costing Sony hundreds of dollars more per console to manufacture than people were willing to pay.  It would have been even worse.  Not to mention, the Xbox 360 was plagued with heat issues without the more potent hardware.


An MCM is not an all-in-one processor.  It's a discrete GPU and CPU on the same package, not the same die.


----------



## TheoneandonlyMrK (Jul 8, 2011)

antuk15 said:


> Dude, SB is an integer monster, but most graphical tasks are floating-point based, and Cell is much better suited to them than SB.
> 
> Maybe you should try reading up on what developers are actually running on Cell.
> 
> ...



You're describing why Bulldozer (Trinity) would be the perfect console processor there, aren't you:
a 256-bit or 2x 128-bit FPU (per module), plus 400-800 (maybe more) shaders, with single- and dual-core gfx, and possibly GCN instead of VLIW4 or VLIW5 shaders.

Now that you mention it, ALL 3 WILL BE Trinity-based or -derived APUs, IMHO. Well, more like a ridiculous guess.


----------



## antuk15 (Jul 8, 2011)

FordGT90Concept said:


> Even if 1080p were widespread, they wouldn't support it because of the costs in terms of GPU and memory (which translate to heat, money, and power consumption).  The PS3 was already costing Sony hundreds of dollars more per console to manufacture than people were willing to pay.  It would have been even worse.  Not to mention, the Xbox 360 was plagued with heat issues without the more potent hardware.



You can support 1080p on any hardware; it just depends on whether you think the decrease in game resources is worth it.


----------



## antuk15 (Jul 8, 2011)

FordGT90Concept said:


> Few engines use 4 SPEs, never mind 7.  SPEs are a PITA to code for, and Sony is likely to return to the POWER architecture, which is easier to handle (it's more intelligent in delegating tasks).



As a member of Beyond3D, a forum where quite a few registered console developers give tech talks on game engines, I can say with 100% certainty that all of the above is a load of shit.


----------



## MilkyWay (Jul 8, 2011)

Well, I remember a few years ago they said they were keeping Cell for the PS4, which I can believe; I mean, they can modify or upgrade that to get more performance. But it really depends on how much RAM they use and what GPU core they go with. I say RAM because at the start of the PS3's and 360's lives the RAM was fine, but it has really limited developers later on, like how they said they had to take out features in Crysis 2 on consoles due to RAM limitations.


----------



## Frick (Jul 8, 2011)

antuk15 said:


> As a member of Beyond3D, a forum where quite a few registered console developers give tech talks on game engines, I can say with 100% certainty that all of the above is a load of shit.



Links and links please.


----------



## TheMailMan78 (Jul 8, 2011)

Cell has 218 GFLOPS of floating point. A good Sandy setup can hit almost 500 GFLOPS.

Also, the 360 has a higher floating point count than the Cell, from what I remember. So does it have better graphics?
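
The peak figures being thrown around in this thread are easy to sanity-check with the standard formula: peak GFLOPS = cores x FLOPs-per-cycle x clock. A minimal sketch, assuming the commonly cited per-cycle figures (each 3.2 GHz SPE doing a 4-wide single-precision fused multiply-add, i.e. 8 FLOPs/cycle, and a quad-core Sandy Bridge doing 16 single-precision FLOPs/cycle with AVX); these are theoretical peaks, not sustained numbers:

```python
def peak_gflops(cores: int, flops_per_cycle: int, ghz: float) -> float:
    """Theoretical single-precision peak: cores x FLOPs/cycle x clock (GHz)."""
    return cores * flops_per_cycle * ghz

# Cell in the PS3: 7 usable SPEs, 4-wide SP FMA = 8 FLOPs/cycle, at 3.2 GHz
cell_spes = peak_gflops(7, 8, 3.2)   # 179.2 GFLOPS from the SPEs alone

# Sandy Bridge quad-core: AVX 8-wide add + 8-wide mul = 16 FLOPs/cycle, at 3.4 GHz
sandy = peak_gflops(4, 16, 3.4)      # 217.6 GFLOPS theoretical peak

print(cell_spes, sandy)
```

Sustained benchmark results (Linpack-style runs) always land well under these theoretical peaks, which is why quoted numbers for the same chip vary so wildly.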


----------



## antuk15 (Jul 8, 2011)

TheMailMan78 said:


> Cell has 218 GFLOPS of floating point. A good Sandy setup can hit almost 500 GFLOPS.
> 
> Also, the 360 has a higher floating point count than the Cell, from what I remember.



What? You keep talking complete nonsense, so I'm done with you.

When you understand a bit more and lose some of that fanboyness, then we'll talk :shadedshu

And for the record, the most I've ever seen Sandy Bridge do is 130 GFLOPS, in Intel Burn Test clocked at 5 GHz with AVX instructions enabled.


----------



## TheMailMan78 (Jul 8, 2011)

antuk15 said:


> What? You keep talking complete nonsense, so I'm done with you.
> 
> When you understand a bit more and lose some of that fanboyness, then we'll talk :shadedshu



Hmmmm, I guess the facts hurt, huh? Also, what am I a fanboy of? Please enlighten me.


----------



## MilkyWay (Jul 8, 2011)

TheMailMan78 said:


> Cell has 218 GFLOPS of floating point. A good Sandy setup can hit almost 500 GFLOPS.



I still doubt they will go with anything other than IBM. I wonder what the Watson-style CPU in the Wii U is like? That's IBM.


----------



## antuk15 (Jul 8, 2011)

Frick said:


> Links and links please.



It's called register and read.


----------



## TheoneandonlyMrK (Jul 8, 2011)

TheMailMan78 said:


> Also, what am I a fanboy of? Please enlighten me.





Trollin


----------



## sneekypeet (Jul 8, 2011)

Sand in panties warning...play nice fellas!


----------



## btarunr (Jul 8, 2011)

Frick said:


> Oh oh oh my oh my. How much would that deal be worth to AMD? Not that it matters really, it's still good news.



They'll make more money selling console GPUs than mainstream-thru-high-end PC GPUs.


----------



## FordGT90Concept (Jul 8, 2011)

antuk15 said:


> You can support 1080p on any hardware; it just depends on whether you think the decrease in game resources is worth it.


2D, yes; not 3D.  We're talking about game consoles here.  Increasing the resolution balloons the memory requirements (1080p pushes 2.25x the pixels of 720p, and that multiplies every buffer in the frame), and these consoles don't even have 2 GiB, *total*.  Basically, it would mean they'd have to cut back big somewhere else, and that would likely be textures.  What use is a high resolution display when you're using low resolution images?  
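
For the framebuffer specifically, the scaling is easy to put numbers on. A back-of-the-envelope sketch (assuming 4 bytes per pixel for a single render target, and ignoring anti-aliasing, which multiplies everything again):

```python
def buffer_bytes(width: int, height: int, bytes_per_pixel: int = 4) -> int:
    """Memory for one render target at the given resolution."""
    return width * height * bytes_per_pixel

b720 = buffer_bytes(1280, 720)     # 3,686,400 bytes (~3.5 MiB)
b1080 = buffer_bytes(1920, 1080)   # 8,294,400 bytes (~7.9 MiB)

# Going 720p -> 1080p is 2.25x the pixels -- and that factor applies to
# every colour, depth, and post-processing buffer the engine keeps around,
# which hurts on a console with only a few hundred MB total.
print(b1080 / b720)  # 2.25
```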




antuk15 said:


> As a member of Beyond3D, a forum where quite a few registered console developers give tech talks on game engines, I can say with 100% certainty that all of the above is a load of shit.


Just because my computer has a quad-core doesn't mean games use 100% of the CPU (most don't use 50% total).  Cell's major fault is that it has little thread-balancing intelligence.  Instead, developers assign a task to an SPE no matter how few resources it uses, just because it is easy to do so (no different than a Windows developer pinning tasks to cores).  The fact is, fewer, faster cores are often superior to many slow cores, not only from an architecture perspective but also a software perspective.  Cell may have higher FP performance, but I can guarantee you Xenon has higher logic performance.  AMD Stream and NVIDIA CUDA lessen the importance of CPU FP performance (I'm sure all three consoles will utilize Stream to some degree).
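
The static-assignment complaint can be illustrated with a toy scheduler. A hypothetical sketch (made-up task lengths, nothing Cell-specific): pinning tasks to cores round-robin, the way jobs tend to get hard-wired to particular SPEs, versus handing each task to whichever core is least loaded:

```python
def static_round_robin(tasks, cores):
    """Pin task i to core i % cores, regardless of load."""
    loads = [0] * cores
    for i, t in enumerate(tasks):
        loads[i % cores] += t
    return max(loads)  # makespan: the busiest core gates the frame

def dynamic_least_loaded(tasks, cores):
    """Give each task to whichever core currently has the least work."""
    loads = [0] * cores
    for t in tasks:
        loads[loads.index(min(loads))] += t
    return max(loads)

jobs = [7, 1, 1, 1, 6, 2, 2, 4]  # made-up, uneven workload
print(static_round_robin(jobs, 4))    # 13 -- one core ends up with both big jobs
print(dynamic_least_loaded(jobs, 4))  # 7  -- balancing hides the imbalance
```

With uneven work, static pinning leaves some cores idle while one grinds; dynamic balancing is what a smarter scheduler buys you.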


----------



## MilkyWay (Jul 8, 2011)

btarunr said:


> They'll make more money selling console GPUs than mainstream-thru-high-end PC GPUs.



They make their money in sheer volume; it's like how Intel dominated the OEM sector and made its money there. They still dominate that market.


----------



## antuk15 (Jul 8, 2011)

FordGT90Concept said:


> Just because my computer has a quad-core doesn't mean games use 100% of the CPU (most don't use 50% total).



Cell is in the PS3, a closed-box system, so I can guarantee Cell gets utilised very, very hard.

And EDGE pretty much took the guesswork out of Cell programming and gave developers access to power that was previously hard to get at.


----------



## TheoneandonlyMrK (Jul 8, 2011)

btarunr said:


> They'll make more money selling console GPUs than mainstream-thru-high-end PC GPUs.



That's why they'd be happy to oblige all 3 in any way possible.

The PS4 and Xbox WILL use 8-core Trinity APUs with dual-core, 800-shader graphics; I'd throw a pound on that (only at the bookies, though). Anyone know what AMD's share price has been doing lately?


----------



## TheMailMan78 (Jul 8, 2011)

antuk15 said:


> Cell is in the PS3, a closed-box system, so I can guarantee Cell gets utilised very, very hard.
> 
> And EDGE pretty much took the guesswork out of Cell programming and gave developers access to power that was previously hard to get access to.



Do you have a Sony tattoo?


----------



## antuk15 (Jul 8, 2011)

I reckon for the next PlayStation you'll be looking at 2GB total RAM,
a newer, revised Cell,
and maybe a 5850-level GPU.

And everything on a 32nm die shrink.


----------



## TheoneandonlyMrK (Jul 8, 2011)

TheMailMan78 said:


> Do you have a Sony tattoo?





I'd say 4GB for definite,
an 8-10 core Bulldozer,
and a 79xx+ GPU.

Sounds nice, that, and all in one cheap chip. It just makes sense; if I were AMD I'd make it happen.


----------



## antuk15 (Jul 8, 2011)

TheMailMan78 said:


> Do you have a Sony tattoo?



As I said, I actually hate consoles and haven't owned one since my old PS2.

I'm just not biased like people in this thread, and I happen to know what I'm talking about.


----------



## MilkyWay (Jul 8, 2011)

antuk15 said:


> As I said, I actually hate consoles and haven't owned one since my old PS2.
> 
> I'm just not biased like people in this thread and I happen to know what I'm talking about



What you talking 'bout?
EDIT: Try to play nice, ladies.


----------



## TheMailMan78 (Jul 8, 2011)

MilkyWay said:


> What you talking bout'?
> EDIT: Try play nice ladies.



He doesn't know. Humor him.


----------



## sneekypeet (Jul 8, 2011)

antuk15 said:


> As I said, I actually hate consoles and haven't owned one since my old PS2.
> 
> I'm just not biased like people in this thread and I happen to know what I'm talking about



please stop feeding the troll!


----------



## Frick (Jul 8, 2011)

antuk15 said:


> It's called register and read.


----------



## FordGT90Concept (Jul 8, 2011)

antuk15 said:


> Cell is in the PS3, a closed-box system, so I can guarantee Cell gets utilised very, very hard.
> 
> And EDGE pretty much took the guesswork out of Cell programming and gave developers access to power that was previously hard to get access to.


The RAM bottoms out before the Cell CPU does.

EDGE or not, it's still compartmentalized, and in relatively weak compartments at that.  I wouldn't want to program for it.  I'd rather have fewer fully capable cores than many half-baked calculators.

If the rumors are correct that they will be using POWER7, that means no more Cell.  Cell may be excellent in a supercomputer, but it's not suitable for gaming.




sneekypeet said:


> please stop feeding the troll!


But, but, but trolls gotta eat too!


----------



## TheoneandonlyMrK (Jul 8, 2011)

antuk15 said:


> And everything on a 32nm die shrink.



No way, dude; it's gonna be 28nm by the time they're taping out.


----------



## antuk15 (Jul 8, 2011)

FordGT90Concept said:


> EDGE or not, it's still compartmentalized, and in relatively weak compartments at that.  I wouldn't want to program for it.  I'd rather have fewer fully capable cores than many half-baked calculators.



Whatever your opinion of it, Cell seems to be worth its weight in gold at the moment, as developers are really starting to use it to do some pretty special stuff. So it can't all be that bad.

Naughty Dog are actually using Cell to do dynamic radiosity and FXAA in U3, which is very cool.


----------



## FordGT90Concept (Jul 8, 2011)

antuk15 said:


> Whatever your opinion of it, Cell seems to be worth its weight in gold at the moment, as developers are really starting to use it to do some pretty special stuff. So it can't all be that bad.
> 
> Naughty Dog are actually using Cell to do dynamic radiosity and FXAA in U3, which is very cool.


Nothing that hasn't already been done a dozen times on PC.  You don't need any anti-aliasing when running 1920x1200 with high-resolution textures. 

They have to start milking it for all it's worth because their only alternative is to not do it at all.  Hardware is holding the developers back, not pushing them forward.


----------



## TheoneandonlyMrK (Jul 8, 2011)

Amazing: one game company has managed what many can already do via API on PC, at last. And that's the point; they're not going to let other devs in on that one, are they? < old way

An APU with open, known standards < the new way, where any game dev can make it sing.


----------



## antuk15 (Jul 8, 2011)

FordGT90Concept said:


> Nothing that hasn't already been done a dozen times on PC.
> 
> They have to start milking it for all its worth because their only alternative is to not do it at all.  Hardware is holding the developers back, not pushing them forward.
> 
> You don't need any anti-aliasing when running 1920x1200 with high resolution textures.  Doesn't matter how you shake it, the consoles that are out now are crippled due to their age.



Oh, of course, now they're starting to reach a dead end, although I still think PS3 has a bit more puff left in it than the 360 does.

But people tend to forget just what hardware they're running. The PS3 is running a damn bandwidth-crippled 7800GTX, for crying out loud, and yet look what it's producing.

And blasphemy: you *ALWAYS* need anti-aliasing.


----------



## TheoneandonlyMrK (Jul 8, 2011)

FordGT90Concept said:


> Nothing that hasn't already been done a dozen times on PC. You don't need any anti-aliasing when running 1920x1200 with high resolution textures.



I'm sure you were pro-Cell and pro-IBM at the start of this thread.


----------



## FordGT90Concept (Jul 8, 2011)

IBM POWER is damn impressive.  Just do the math: 4 chips per MCM, 8 cores per chip, 4 threads per core.  Multiply that up and you've got 128 threads simultaneously executing on a single processor.  Not to mention, that's on a 45nm process.  It's just a matter of time before they double that to at least 256 threads on a 28nm or smaller process.  We also can't forget those chips are far more efficient than x86 per clock, and they are pulling 4+ GHz.

I was never a fan of Cell, but POWER has had my attention since at least POWER6, when they were pushing 5 GHz.


----------



## antuk15 (Jul 8, 2011)

FordGT90Concept said:


> IBM POWER is damn impressive.  Just do the math: *4 chips per MCM, 8 cores per chip, 4 threads per core.  Multiply that up and you've got 128 threads simultaneously executing on a single processor.  Not to mention, that's on a 45nm process.  It's just a matter of time before they double that to at least 256 threads on a 28nm or smaller process.*  We also can't forget those chips are far more efficient than x86 per clock, and they are pulling 4+ GHz.
> 
> I was never a fan of Cell, but POWER has had my attention since at least POWER6, when they were pushing 5 GHz.







FordGT90Concept said:


> The fact is, fewer, faster cores are often superior to many slow cores, not only from an architecture perspective but also a software perspective.


----------



## FordGT90Concept (Jul 8, 2011)

I don't doubt every one of those cores could go toe to toe with Sandy Bridge cores.  Those are cores--they're not SPEs.

Edit: If my maths are correct, what I described above would be 1059.84 GFLOPs--more than double Sandy Bridge and quadruple PS3.
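Those figures work as back-of-envelope arithmetic. A quick sketch; note the clock and per-cycle FLOP count are my own assumptions, since 4.14 GHz and 8 FLOPs per core per cycle are simply the combination that reproduces the 1059.84 figure, and neither number is stated in the post:

```python
# Sanity-check the POWER7 thread count and GFLOPs figures quoted above.
chips_per_mcm = 4      # 4 chips per MCM package
cores_per_chip = 8
threads_per_core = 4   # SMT4

threads = chips_per_mcm * cores_per_chip * threads_per_core
print(threads)  # 128 simultaneous hardware threads

cores = chips_per_mcm * cores_per_chip        # 32 cores total
clock_ghz = 4.14                              # assumed clock
flops_per_cycle = 8                           # assumed per-core throughput
gflops = cores * clock_ghz * flops_per_cycle
print(round(gflops, 2))  # 1059.84
```

So the "128 threads" and "1059.84 GFLOPs" claims are internally consistent with each other under those assumptions.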


----------



## TheoneandonlyMrK (Jul 8, 2011)

FordGT90Concept said:


> I was never a fan of Cell, but POWER has had my attention since at least POWER6, when they were pushing 5 GHz.



Me too, but I'm watching them all.


----------



## antuk15 (Jul 8, 2011)

FordGT90Concept said:


> I don't doubt every one of those cores could go toe to toe with Sandy Bridge cores.  Those are cores--they're not SPEs.



That's the point with Cell: in general-purpose stuff it would get laid to rest by an Athlon II X2, let alone Sandy Bridge, but it suits what it's being used for in the PS3 very well, and I doubt putting Sandy Bridge in place of Cell would be any better.

Personally, I can see Sandy Bridge not offering as much help to RSX in terms of offloading GPU-type tasks as Cell does currently.

Cell is really the only reason the PS3 has even been able to compete with the 360. Can you just imagine if developers couldn't do all the things they're doing on Cell? Graphically, the PS3 would have been in real trouble years ago. Cell has arguably brought it beyond what the 360 can handle.


----------



## Steevo (Jul 8, 2011)

erixx said:


> Consoles just have to be and stay crapware. I hardly imagine the Catalyst-hot-fix nightmare all the households will have.....



As opposed to beta Nvidia drivers that kill their consoles? 

But hey.... at least with Nvidia they can have lower performance per dollar, and less integration. APU anyone?


----------



## FordGT90Concept (Jul 8, 2011)

antuk15 said:


> That's the point with Cell: in general-purpose stuff it would get laid to rest by an Athlon II X2, let alone Sandy Bridge, but it suits what it's being used for in the PS3 very well, and I doubt putting Sandy Bridge in place of Cell would be any better.
> 
> Personally, I can see Sandy Bridge not offering as much help to RSX in terms of offloading GPU-type tasks as Cell does currently.
> 
> Cell is really the only reason the PS3 has even been able to compete with the 360. Can you just imagine if developers couldn't do all the things they're doing on Cell? Graphically, the PS3 would have been in real trouble years ago. Cell has arguably brought it beyond what the 360 can handle.



Um, Cell was the reason why the PS3 couldn't compete with the Xbox360.  The PS3 had very few launch titles, and none of them drove any excitement for the system due, in large part, to the fact it is difficult to code for compared to the Xbox360 and Wii.  It wasn't until much later, when MGS4 launched and the PS3 became a cheap Blu-ray player, that sales picked up.

We also can't forget that Sony blamed Cell for the launch delays because they weren't getting the yields they wanted and ultimately had to settle on launching PS3s with an SPE disabled in order to meet demand.  And the cost--Cell was not cheap to research, and it's not exactly a cheap processor to manufacture because of the increased silicon real estate.

All said, the PS3 would have been better off with a Xenon.


----------



## TheoneandonlyMrK (Jul 8, 2011)

antuk15 said:


> Cell is really the only reason the PS3 has even been able to compete with the 360. Can you just imagine if developers couldn't do all the things they're doing on Cell? Graphically, the PS3 would have been in real trouble years ago. Cell has arguably brought it beyond what the 360 can handle.



Earlier I lied a little: I did recently buy a fat PS3, just to play GT5. I forgot that fact because the whole experience was un-epic and miserable. I've got DiRT 3 now and I'm feeling better, though the PS3 got sold two months after buying it, mostly due to the shit graphics. I got F1 2010 on PC at the same time, and who'd think GT5 looks good next to that graphically?


----------



## Steevo (Jul 8, 2011)

IBM and AMD are buddies on manufacturing anyway, so you have two teams that are already used to working together, with ATI tech (I still call GPUs made by AMD "ATI" and always will) thrown into the mix a few years ago... well. Not unstoppable, but a hard team to beat, and there are constant rumors of IBM buying AMD.


----------



## FordGT90Concept (Jul 8, 2011)

IBM won't buy AMD unless AMD declares bankruptcy.  Even then, I can't see them really wanting AMD.  It would put them in direct competition with Intel in the x86 market, and that's not exactly something they would want (they've had no love for x86 since the 80s, back when it wasn't even called x86).  The GPU market doesn't interest IBM much, if at all.


----------



## antuk15 (Jul 8, 2011)

FordGT90Concept said:


> Um, Cell was the reason why PS3 couldn't compete with Xbox360. All said, the PS3 would have been better off with a Xenon.



For the last 12 months the PS3 has been kicking the 360's ass in the graphics department, and Xenon would cripple the PS3.

Bye bye FXAA, bye bye polygon culling, bye bye light rendering, bye bye post-processing.

Killzone 2 and 3's post-processing effects? All done on Cell. Take Cell out and replace it with Xenon? Bye bye Killzone's effects...


----------



## Damn_Smooth (Jul 8, 2011)

Awesome news for AMD if true. There is a lot of revenue that comes from consoles. Buying ATI is the smartest thing I have seen a corporation do in a long time.


----------



## FordGT90Concept (Jul 8, 2011)

antuk15 said:


> For the last 12 months the PS3 has been kicking the 360's ass in the graphics department, and Xenon would cripple the PS3.
> 
> Bye bye FXAA, bye bye polygon culling, bye bye light rendering, bye bye post-processing.
> 
> Killzone 2 and 3's post-processing effects? All done on Cell. Take Cell out and replace it with Xenon? Bye bye Killzone's effects...


PS3 and Xbox360 look more or less the same.

Xbox360 does 4xAA standard.  Polygon culling is a rendering technique not exclusive to the PS3.  The same goes for light rendering and post processing.  Xenon does all the above, yes.

Killzone is not available on Xbox360 and not for hardware reasons.  It is developed by Guerrilla Games which is a wholly owned subsidiary of Sony; thus, PlayStation exclusive title.  Try comparing apples to apples.


----------



## KainXS (Jul 8, 2011)

The PS3's Cell was more of a simple development test by IBM. In terms of power, though, the 360's GPU is better and has better support since it's DX-based.

But I would like to know more. It's rumored the next Wii has an R700, so what will the other two have?


----------



## antuk15 (Jul 8, 2011)

FordGT90Concept said:


> PS3 and Xbox360 look more or less the same.
> 
> Xbox360 does 4xAA standard.  Polygon culling is a rendering technique not exclusive to the PS3.  The same goes for light rendering and post processing.
> 
> ...



The 360 does not do 4xMSAA standard. The option is there, but because a 720p frame buffer with 4xMSAA requires about 30MB of memory, developers don't use it: you can't fit it straight into the 360's 10MB of EDRAM, so you have to tile the rendering, which is not a viable option because:

1. It requires an engine that's been built with tiling in mind, and because most engines are multi-platform it's easier for developers to skip tiling and make porting easier.

2. Tiling has downfalls of its own.

3. The EDRAM is the reason why most 360 games run at less than 720p. It's also the reason why Halo 3 only runs at 640p: a native 720p buffer would not fit into the 10MB EDRAM. Developers run a smaller-resolution buffer, which then allows them to run anti-aliasing while still staying within the 10MB EDRAM limitation.

Polygon culling is not PS3-exclusive, but RSX is not particularly good at it. It also consumes RSX's time, and Cell is better at it and has enough spare cycles to do it, so why do it on RSX and waste performance and time?

Again, the same with light rendering: RSX can only do so much. Any work which Cell can handle gets shifted over to Cell, leaving RSX free to handle the heavier tasks. Why do you think PS3 games have got a lot better over the last 12 months? Because developers are having Cell do all the donkey work. Why have RSX doing morphological anti-aliasing and other compute-heavy workloads when Cell can do it faster, thus freeing up RSX to do something else?

And no, Xenon can do fuck all in terms of rendering. You never hear any developer saying they're calculating MLAA, global illumination, culling or anything else for that matter on Xenon. You know why? Because it just doesn't have the spare power to do it. If Xenon had enough spare resources to take some of the workload off Xenos, leaving it free to do other things, then developers would take advantage of that, but they don't, because Xenon can't do it.

And developers have stated multiple times that the PS3 has the edge when it comes to CPU power, so it's a good job Xenon isn't inside the PS3, because it would slow the whole system down.

And for the record, they don't look the same. PS3 exclusives look a hell of a lot better than 360 exclusives do now.
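The memory claim above is easy to sanity-check. A rough sketch, assuming the usual 32-bit color plus 32-bit depth/stencil per sample; the exact byte layout isn't stated in the post, but this common assumption lands near the quoted "30MB":

```python
# Framebuffer size for 720p with 4xMSAA vs the 360's 10MB of eDRAM.
width, height = 1280, 720          # 720p
bytes_per_sample = 4 + 4           # 32-bit color + 32-bit depth/stencil
samples = 4                        # 4xMSAA

mb = width * height * bytes_per_sample * samples / (1024 * 1024)
print(mb)  # 28.125 -- nearly triple the 10MB of eDRAM, hence tiling
```

The same arithmetic at 1152x640 with no MSAA gives about 5.6MB, which is why dropping resolution or AA lets the buffer fit in one pass.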


----------



## antuk15 (Jul 8, 2011)

KainXS said:


> the PS3's Cell was more of just simple development testing by IBM, in terms of power though the 360's gpu is better and has better support since its dx based
> 
> but I would like to know more, its rumored the wii has a r700 so what will the next 2 have.



Sony, Toshiba and IBM developed Cell together, and it wasn't just development testing.


----------



## FordGT90Concept (Jul 8, 2011)

antuk15 said:


> The 360 does not do 4xMSAA standard. The option is there, but because a 720p frame buffer with 4xMSAA requires about 30MB of memory, developers don't use it: you can't fit it straight into the 360's 10MB of EDRAM, so you have to tile the rendering, which is not a viable option.


All titles up to 2009 were required to support 720p with 2xMSAA (vertical or horizontal) or 4xMSAA for both.  In 2009, Microsoft removed the MSAA requirement and allowed developers to focus on resolution.

Source: http://www.develop-online.net/blog/44/Microsofts-new-resolution


----------



## antuk15 (Jul 8, 2011)

FordGT90Concept said:


> All titles up to 2009 required 2xMSAA vertical or horizontal or 4xMSAA for both and 720p support.  In 2009, Microsoft removed the MSAA requirement and allowed developers to focus on resolution.
> 
> Source: http://www.develop-online.net/blog/44/Microsofts-new-resolution



So the 360 doesn't do 4xMSAA standard then, does it? :shadedshu

And yes, they removed that requirement because the 10MB EDRAM made it difficult to implement anti-aliasing while running at 720p.


----------



## FordGT90Concept (Jul 8, 2011)

It did until recently when Microsoft decided 720p (and higher) was more important than multisampling.




antuk15 said:


> 3. The EDRAM is the reason why most 360 games run at less than 720p. It's also the reason why Halo 3 only runs at 640p: a native 720p buffer would not fit into the 10MB EDRAM. Developers run a smaller-resolution buffer, which then allows them to run anti-aliasing while still staying within the 10MB EDRAM limitation.


Only a handful of titles on Xbox360 got the 720p requirement waived (most resulting in 1152x640 resolution).  Halo was one of them.





antuk15 said:


> Polygon culling is not PS3-exclusive, but RSX is not particularly good at it. It also consumes RSX's time, and Cell is better at it and has enough spare cycles to do it, so why do it on RSX and waste performance and time?
> 
> Again, the same with light rendering: RSX can only do so much. Any work which Cell can handle gets shifted over to Cell, leaving RSX free to handle the heavier tasks. Why do you think PS3 games have got a lot better over the last 12 months? Because developers are having Cell do all the donkey work. Why have RSX doing morphological anti-aliasing and other compute-heavy workloads when Cell can do it faster, thus freeing up RSX to do something else?


All of this, coupled with the fact that the PS3 went way over budget and unit costs were ridiculous, is testament to Xenon + Xenos being the better business decision.  Virtually the same performance for substantially less cost.




antuk15 said:


> And no, Xenon can do fuck all in terms of rendering. You never hear any developer saying they're calculating MLAA, global illumination, culling or anything else for that matter on Xenon. You know why? Because it just doesn't have the spare power to do it. If Xenon had enough spare resources to take some of the workload off Xenos, leaving it free to do other things, then developers would take advantage of that, but they don't, because Xenon can't do it.


Just because they don't talk about it doesn't mean they aren't doing it.  Considering they look virtually the same, it doesn't matter.  Memory is the limiting factor on both consoles and the Xbox360 has 10 MiB more.


----------



## antuk15 (Jul 8, 2011)

FordGT90Concept said:


> It did until recently.
> 
> 
> 
> Only a handful of titles on Xbox360 got the 720p requirement waived (most resulting in 1152x640 resolution).  Halo was one of them.



2009 is not recent, my friend. And even then it was nowhere near as mandatory as that article makes out; I don't remember ever seeing 4xMSAA and 720p in all 360 games up until it was waived.

And with MLAA now very easy to achieve on Cell, more and more PS3 games will have much better anti-aliasing quality than the 360.

The Saboteur, as an example: MLAA on PS3 gave it the same quality as 4xMSAA, while the 360 version of the game had no anti-aliasing at all.

IIRC, DICE are running MLAA in Bad Company 3 on PS3 as well.

Naughty Dog have moved to FXAA, and so have the God of War guys.


----------



## LAN_deRf_HA (Jul 8, 2011)

antuk15 said:


> For the last 12 months PS3 has been kicking 360's ass in the graphics department, And Xemon would Cripple PS3.
> 
> Bye bye FXAA, Bye bye Polygon cully, Bye bye Light rendering, Bye bye post processing.
> 
> Killzones 2 and 3's post processing effects? All done on Cell, Take Cell out and replace it with Xenon? Bye bye Killzones effects...



What post-processing? Killzone used canned lighting to try not to disappoint after Sony ran around pretending CGI trailers were actual in-game graphics. They had to cut from the AI too, as most fluff PS3 games do, in exchange for that minor graphical edge. The best PS3 games look no different than 360 games because they're from decent developers that didn't cut corners on things like AI. While the 360 and PS3 cores suck at AI, Cell is notably worse, requiring more cycles to do the same job, negating its increased graphical potential. The cores are just too specialized towards graphics. This new gen, they need to realize the GPU should be doing more of the work. Perhaps even a Tegra approach would be best: dedicated cores for physics, graphics, AI. That would be much nicer than trying to force graphics chips to do processing they weren't designed for.


----------



## antuk15 (Jul 8, 2011)

LAN_deRf_HA said:


> What post-processing? Killzone used canned lighting to try not to disappoint after Sony ran around pretending CGI trailers were actual in-game graphics. They had to cut from the AI too, as most fluff PS3 games do, in exchange for that minor graphical edge. The best PS3 games look no different than 360 games because they're from decent developers that didn't cut corners on things like AI. While the 360 and PS3 cores suck at AI, Cell is notably worse, requiring more cycles to do the same job, negating its increased graphical potential. The cores are just too specialized towards graphics. This new gen, they need to realize the GPU should be doing more of the work. Perhaps even a Tegra approach would be best: dedicated cores for physics, graphics, AI. That would be much nicer than trying to force graphics chips to do processing they weren't designed for.



I suggest you hop onto YouTube and watch the Killzone development videos. Killzone 3, off the top of my head, uses MLAA, which is a post-processing effect. There are too many to list, and post-processing is not just lighting.

And Killzone did not use canned lighting; a very good chunk of it is fully dynamic.

Uncharted 3 and Killzone 3 look much better than anything I've seen the 360 spit out.


----------



## Hayder_Master (Jul 8, 2011)

AMD GPUs win because of one thing: heat.
The big consoles' only problem is heat, because it's hard to cool down a small box that has many chipsets and processors. The hottest chip is usually the GPU, and high heat breaks efficiency for the whole system.


----------



## FordGT90Concept (Jul 8, 2011)

antuk15 said:


> 2009 is not recent, my friend. And even then it was nowhere near as mandatory as that article makes out; I don't remember ever seeing 4xMSAA and 720p in all 360 games up until it was waived.
> 
> And with MLAA now very easy to achieve on Cell, more and more PS3 games will have much better anti-aliasing quality than the 360.
> 
> ...


TCR means it has to pass in order to get published.  That's very mandatory.  The only way around it was if you got an explicit waiver from Microsoft.

You don't "see" it because TCRs are under an NDA.  One developer happened to leak those requirements in an article (although not the full details of said requirements--Microsoft is very thorough in this stuff).

As for MLAA:


> http://www.afterdawn.com/news/article.cfm/2011/06/30/xbox_360_to_get_boost_in_graphics
> 
> The MLAA used with the PS3 requires 3-4ms of rendering time spread across five SPUs.
> 
> "On the Xbox 360 we run at 2.47ms, with still a lot of possible optimisations to try," Jimenez told GamesIndustry.biz.


It could be the 10 MiB eDRAM showing its skillz, or the Xenon, or the Xenos.  Anyway, nothing special about the PS3.


FXAA is an NVIDIA technology but it sounds like Xbox360 can use it:
http://timothylottes.blogspot.com/2011/03/nvidia-fxaa.html


----------



## antuk15 (Jul 8, 2011)

FordGT90Concept said:


> TCR means it has to pass in order to get published.  That's very mandatory.  The only way around it was if you got an explicit waiver from Microsoft.
> 
> You don't "see" it because TCRs are under an NDA.  One developer happened to leak those requirements in an article (although not the full details of said requirements--Microsoft is very thorough in this stuff).
> 
> ...



I suggest you look HERE and HERE and then show me all these native 720p 360 games with 4xMSAA :shadedshu

And there will be the odd occasion of that, but MLAA doesn't cost RSX anything.


----------



## FordGT90Concept (Jul 8, 2011)

Most are 720p and at least half have at least 2xAA.  

All AA costs time.  It is post-processing after all.


----------



## LAN_deRf_HA (Jul 8, 2011)

antuk15 said:


> I suggest you hop onto YouTube and watch the Killzone development videos. Killzone 3, off the top of my head, uses MLAA, which is a post-processing effect. There are too many to list, and post-processing is not just lighting.
> 
> And Killzone did not use canned lighting; a very good chunk of it is fully dynamic.
> 
> Uncharted 3 and Killzone 3 look much better than anything I've seen the 360 spit out.



None of this makes sense to me. Not only was the canned lighting in Killzone a well-known issue, but why are you sitting through PS3 triple-A title developer vids? Only eagerly awaiting fans watch those. Your "I hate consoles" thing just seems a little iffy given your statements thus far.


----------



## antuk15 (Jul 8, 2011)

LAN_deRf_HA said:


> None of this makes sense to me. Not only was the canned lighting in killzone a well known issue, but why are you sitting through ps3 triple A title developer vids? Only eagerly awaiting fans watch those. Your I hate consoles thing just seems a little iffy given your statements thus far.



I'm just obsessed with tech. It's interesting just what ideas devs come up with on these old dogs. It also makes you realise just how under-utilised PC hardware is.


----------



## TheoneandonlyMrK (Jul 8, 2011)

antuk15 said:


> I suggest you hop onto YouTube and watch the Killzone development videos. Killzone 3, off the top of my head, uses MLAA, which is a post-processing effect. There are too many to list, and post-processing is not just lighting.
> 
> And Killzone did not use canned lighting; a very good chunk of it is fully dynamic.
> 
> Uncharted 3 and Killzone 3 look much better than anything I've seen the 360 spit out.



Probably the only games you can name. Any more that you do will be Sony-dev made, as the Xbox is what devs make for anyway, so a lot of games are no better, or not much better, than the Xbox version on PS3 and PC. Simples. And that's the reason I'd most appreciate any update to the ancient console empire.


----------



## antuk15 (Jul 8, 2011)

theoneandonlymrk said:


> Probably the only games you can name. Any more that you do will be Sony-dev made, as the Xbox is what devs make for anyway, so a lot of games are no better, or not much better, than the Xbox version on PS3 and PC. Simples. And that's the reason I'd most appreciate any update to the ancient console empire.



I can't wait for the next gen consoles.

Then DX9 can finally die and we can say a big hello to true DX11


----------



## LAN_deRf_HA (Jul 8, 2011)

I think I've been radically misquoted haha


----------



## FordGT90Concept (Jul 8, 2011)

I just hope there's more symbiosis between Xbox and Windows.  Ideally, a GFWL-certified game would run on Xbox and an Xbox game would run on Windows.  That won't happen with the continued use of POWER processors. 

Microsoft would probably object to the limited DRM support in the media, too.


----------



## TheoneandonlyMrK (Jul 8, 2011)

LAN_deRf_HA said:


> I think I've been radically misquoted haha




Sorry, bad quote-grabbing there. I'm not so sure AMD needs or wants IBM's help; the simple economics of it mean that:

if AMD did make an APU for the big two, they might well use an enhanced Bulldozer (Trinity) with a graphics core, possibly dual, on board, but could also add a discrete GPU to that, all kept low on volts and watts. With three-way-scaling GPUs in a very cheap package, and considering the APUs would be mid-bin range, it would be win-win; their scaling these days is bob-on. That would make a cheap, doable package with high-end offshoot chips, plus plenty of low-end die capital.

And it's all about die economics: every chip needs to be sold in some cut-down form or not at all.


----------



## Lionheart (Jul 8, 2011)

antuk15 said:


> As I said, I actually hate consoles and haven't owned one since my old PS2.
> 
> I'm just not biased like people in this thread and I happen to know what I'm talking about



So you hate consoles, but you bitch and whine in a console thread?

Awesome news for AMD. Really looking forward to the next-gen consoles.


----------



## crazyeyesreaper (Jul 8, 2011)

what i find funny is all the Killzone 3 killzone 3, have you actually played the damn game,

the graphical popin and texture pop in is so fucking terrible its actually worse to play then games that came out years ago i mean good god Killzone on PS2 had less pop in and issues lol PS3 is ancient cell is fucking ancient the 360 is ancient its all ancient hardware

most of the tech is based on architectures based in 2004-2005

the PS3 is actually using a revised 6800 Ultra, its performance is someware between 6800gt and 7800gt performance

which is old as dirt its nearly half the performance of one of AMD's current APUs in the graphics department even more so when looking at pure DX9 or Open GL performance

the Cell yes it had 8 spes 1 disabled 1 for os leaving 6 of them functional only sony exclusives truly use the cell to the best of its ability but then MGS4 was exclusive, but now MGS Rising is multi platform and looks better and it looks the same on 360 and PS3 so the cell and all its goodies didnt really do jack in this senario, any dev can leech a bit more performance out of the hardware but theres always costs.

ive noticed PS3 games might have a bit better graphics these days but they also have alot more texture and LOD pop in, so it looks better standing still looks terrible in motion,

i can say this as i have a PS3 myself almost all my friends own a PS3 AND Xbox 360, i can tell you right now in 99% of games graphics are nearly identicle i can see lighting difference some shadowing differences etc but its about the same difference as you can see from Nvidia vs AMD / ATi on the PC front do to differing hardware. In essence the console tech is 7 years old effectively cell was good then its shit now, and heres why, when a 100w cpu + apu offers 4 x86 cores that spank the shit out of the cell and offers a GPU around 8800gt performance which is 2x the PS3 or Xbox 360s performance and thats looking it at it from a consoles greater then 80% efficiency vs PC at less then 50%, essentially a $100 chip that uses half the power and offers far more performance. Cell is just plain old

and IBM has scaled the Cell to massive # of cores which is great for as Ford pointed out, super computers, guess what, no developer making games is gonna spend a fortune trying to tailor a multiplatform game engine to use 64-128 cores ./ threads, if they cant even be asked to get quad core supported properly after 5 years i doubt they can utilize a cell chip in terms of whats available now, and thats where cell gets its power massive number of cores... oh wait dont we get that already with todays GPUs and Direct compute . Open CL, CUDA etc? yea, 

The Cell kept the PS3 in the game, so to speak; it offered another alternative down the road. The problem is it took developers 5 years to master the Cell, and that's just to utilize 6 SPEs. Let's see them do the same with 30, 60, 100+ cores. It won't happen; devs don't have the time, and here's why: most license their game engines from third parties, or they reuse ancient game engines.

Example: Call of Duty is based on the Quake 3 engine, which is nearly 12 years old now, and they're still utilizing a lot of the base source code, just tweaked over time with things bolted on. Almost all developers do this; the above is just one example.
Sony will have to drop Cell for a different CPU architecture, simply because developers want the easiest platform to code for with the greatest return on investment.

The Xenon CPU offered that: its 3 cores were easier to load-balance and utilize when programming a game engine, and its entire platform was closer to that of the PC itself,

meaning a developer got a 2-for-1 with the Xbox and its Xenon + Xenos.

Whereas most PS3-exclusive developers have their own game engines, let's see how many of them actually ported to a different platform. Not many, because the process of doing so was a royal pain.

And developers, again, want return on investment, so all consoles having AMD GPUs of varying types and performance levels, and all having IBM CPUs, would probably be a good thing, because at that point games are much less likely to have glaring platform-specific bugs that cripple some ports.


----------



## RejZoR (Jul 9, 2011)

I think the reason they'll be going with AMD is AMD's expertise in Fusion cores, because ultimately that's what consoles are in a nutshell: an all-in-one thing, and Fusion is dead on for that.
They'll also probably fuse a much more powerful "CPU and GPU" together for that purpose, I assume... I used quotes because it's not a CPU and a GPU on their own anymore...

If it's true, however, it will suck badly for NVIDIA...


----------



## CDdude55 (Jul 9, 2011)

And yet this is all irrelevant if the games aren't good.


----------



## hellrazor (Jul 10, 2011)

To nVidia: Who's the bitch now?


----------



## KainXS (Jul 10, 2011)

crazyeyesreaper said:


> the PS3 is actually using a revised 6800 Ultra, its performance is somewhere between 6800 GT and 7800 GT performance



Where did you read that it has a 6800?


----------



## KainXS (Jul 10, 2011)

antuk15 said:


> I can't wait for the next gen consoles.
> 
> Then DX9 can finally die and we can say a big hello to true DX11



I have been waiting so long for that... please let them die.

I love my PS3 and 360, but they are really holding back the development of GL and DX.




----------



## crazyeyesreaper (Jul 10, 2011)

The PS3 was designed before the 7800 was ready to be used.

Sony themselves quoted the PS3 as having a GPU faster than a 6800 Ultra; time proved this to be false, but it's in some of the original launch previews and interviews from ages ago.

But it's essentially much like the Xenos GPU: we all know that's based on the older X1800 but uses a shader design somewhat similar to the X1900/X1950.

The PS3's GPU is essentially an improved 6800 series, but it lacks the full-on grunt of the 7800 series.

Overall, design- and performance-wise, the PS3's GPU has more in common with the 6000 series than it does with the 7000 series.


----------



## ShiBDiB (Jul 10, 2011)

antuk15 said:


> You mean like how they code Cell on PC? A CPU that has an architecture that's fuck all like anything on PC. Seriously, get a clue.
> 
> They're not going to pick hardware to make porting easier to other machines.



Huh?

He's saying they don't design games on consoles; they design them on PCs and port them to consoles. That being said, they design them on PCs to perform better with the console components, thus making them run much better on the console than on the PC.


----------



## devguy (Jul 11, 2011)

FordGT90Concept said:


> MCM is not an all-in-one processor.  It's a discrete GPU and CPU on the same chip, not die.



The original 360S was an MCM design; however, they have recently started shipping 360s with the GPU/CPU/Mem/IO controller on the same die.  source


----------



## yogurt_21 (Jul 11, 2011)

crazyeyesreaper said:


> The PS3 was designed before the 7800 was ready to be used.
> 
> Sony themselves quoted the PS3 as having a GPU faster than a 6800 Ultra; time proved this to be false, but it's in some of the original launch previews and interviews from ages ago.
> 
> ...



There really isn't any difference between the X1800 and X1900 except for shader count.

The X1800 had 16 ROPs and 16 shaders.
The X1900 had 16 ROPs and 48 shaders.

Same goes for the 6800 vs the 7800, where the 6800 had 16 ROPs against the 7800's 24. Other than that, they were the same basic GPU.

But really, looking at the RSX on the PS3, it looks merely like a downclocked (especially the memory) 7800 GTX.
http://en.wikipedia.org/wiki/RSX_'Reality_Synthesizer'
The main performance drop from the 6800 Ultra is more than likely CPU-based, combined with the downclocked memory.

This coincides with the Xenos GPU, which looks exactly like an X1900 XT, just downclocked.
http://en.wikipedia.org/wiki/Xenos_(graphics_chip)

Again, the drop in performance is more likely due to clock speeds and a weak CPU.

In any event, architecturally they look far more like the X1900 XT and 7800 GTX than they do the X1800 XT and 6800 Ultra, respectively.

And for all out there who say the 360 is built on a modified X1800... the X1900 *IS* a modified X1800. lol


----------



## FordGT90Concept (Jul 11, 2011)

We don't know how many components of the RSX and Xenos are disabled, though.  You'd think ATI/NVIDIA wouldn't produce millions of chips they could sell for $200+ a pop for use in a console where they probably get $25-75 each.  They most likely engineer the chips to take up as little die space as possible for cost savings so they can still turn a profit on them.

The clock speeds are reduced over power consumption and heat concerns.

All told, I think RSX and Xenos are custom-built GPUs for the console market.  People insist on comparing them to computer parts, but really, it's like comparing ARM to x86.  Like ARM being purpose-built, console GPUs can achieve more with less.  They are no doubt based on the 7800 GTX and X1900 XT in order to save on research costs, but you couldn't pop a Xenos/RSX chip off the console, set it on a card, and expect it to work.  For example, no GeForce chip supports XDR but RSX does, and no Radeon chip supports eDRAM but Xenos does.




devguy said:


> The original 360S was an MCM design; however, they have recently started shipping 360s with the GPU/CPU/Mem/IO controller on the same die.  source


It doesn't say who is actually manufacturing it, and this is pretty laughable:
"Microsoft has guaranteed there are no processor incongruity between the old and the new Xbox 360 consoles, as it will implement a latency and bandwidth that will impersonate the old system which worked with separate CPU and GPU processors, the Think Digit website reported."

They basically gained nothing from it except making the Xbox 360 as a whole more compact.

It's also important to note that all the technology put into that SoC is old.  The three major parties (ATI, IBM, and Microsoft) won't be as defensive about old IP/technology as they would be about new technology.  Most SoCs are built entirely by one company (ARM, Intel, and AMD mostly); heads are likely to butt.  At the same time, having to hand your IP over to someone else (like IBM, who has fabs) in order to manufacture a SoC would probably be written into the contract.


----------



## crazyeyesreaper (Jul 11, 2011)

yogurt_21 said:


> There really isn't any difference between the X1800 and X1900 except for shader count.
> 
> The X1800 had 16 ROPs and 16 shaders.
> The X1900 had 16 ROPs and 48 shaders.
> ...



One major problem, though: the Xenos was in development before the X1000 line was fully taped out, and long before the X1900 series. In essence the two are the same but different, as Ford pointed out. That said, the Xenos has more in common with the X1800 XT in terms of its abilities; the X1900 and X1950 were significantly changed to offer better shader performance over the old GPUs, similar to the X800 vs the X1800 having roughly similar design aspects while significantly boosting shader performance.

That said, it's also as Ford pointed out: they can't be directly compared to their PC counterparts, due to the fact that both support far different technologies, and the setup used just for the AA on the 360 shows this.

Supporting and using those different RAM standards (eDRAM, XDR, etc.) can impact performance as well.

Simply put, no PC part uses these RAM standards, and clock speeds aren't the be-all and end-all. A good example would be the 5850 vs the 5870, where the 5870, being ROP-starved, could be matched and overtaken by a 5850, simply because pushing the clock speeds didn't really do much for it after it hit that wall. But it's all relative, and at this point we're talking about technology so old that the new AMD APUs offer double the GPU performance.


----------



## TheoneandonlyMrK (Jul 11, 2011)

crazyeyesreaper said:


> Simply put, no PC part uses these RAM standards, and clock speeds aren't the be-all and end-all. A good example would be the 5850 vs the 5870, where the 5870, being ROP-starved, could be matched and overtaken by a 5850, simply because pushing the clock speeds didn't really do much for it after it hit that wall. But it's all relative, and at this point we're talking about technology so old that the new AMD APUs offer double the GPU performance.


:shadedshu


Eh, don't be callin' my card out; it's not starved lol.

No, seriously, are you serious? Because I'll get a 5850 from somewhere on payday, Crossfire and watercool it, OC it to 1000+, and grin massively for a bit (the folding to be done tempts me alone), while hopefully playing Crysis 2 maxed at 60+ fps or a bit closer, plus then Eyefinity's on.

On topic: the now-ancient GPUs in the big two surprise me. I'd forgotten their ROP and shader counts; I'm less impressed by my electronic sledgehammer of a PC (comparatively).

I still doubt we'll see an end to DX9-style graphics when they come, as devs like their old toys, so your typical Gears of War or FIFA etc. will still use their same old engines until 3 years deep.


----------



## crazyeyesreaper (Jul 11, 2011)

Yes, I'm serious.

The 5850 and 5870 were the same GPU, just binned.

Reference 5850s overclocked to the same wall as the 5870s. The 5870 usually had a 1-2% performance boost, but after hitting around 1000 MHz core, the 5870's performance was equal to the 5850, not better. It came down to RAM, motherboard, CPU, etc., depending on the mix and match of parts; the card was ROP-starved.

Pay attention to the green bar; that's the 5850 and 5870 at equal speeds.

As you can see, RAM speeds, timings, even CPU clock speed, hell, even motherboard choice, can cause as much as a 5-6% performance difference, which would make both cards fall into the same performance bracket. It just takes a reference 5850 with voltage tuning to get the clocks, and all I can say is good luck.


----------



## TheoneandonlyMrK (Jul 11, 2011)

I know full well about binning, and that's why I'm surprised a cut-down GPU can equal mine at 1000+ on the GPU. I never looked into 5850 performance, to be honest, as I got the 5870 the minute it was at Microdirect, and AFAIK they came out first or at the same time; I'd waited on a 5870 like a school kid does candy.

Not doubting ya, though, just taken aback, and eager for payday now too.


----------



## crazyeyesreaper (Jul 11, 2011)

No worries, I was honestly surprised too, as I got 2x 5850s free when my 4870X2 died, thanks to Newegg being awesome and giving me a full refund.

So after having MSI Afterburner not work, someone on TPU sent me a USB drive with some unlocked BIOSes and I flashed my cards, and presto, after that point Afterburner worked just fine. I hit 1050 core on both cards; granted, it took a lot of volts to get there (1.35 V), but it ran and stayed under 80°C, so it was pretty damn good for what it was.

Now back to our regularly scheduled on-topic posts.


----------

