# Orbis Implements Multi-GPU, Too



## btarunr (Apr 4, 2012)

Sony's next-generation PlayStation, reportedly codenamed "Orbis", is already known from an earlier report to be powered by an AMD x86-64 CPU with graphics based on the company's Southern Islands architecture. We're now hearing that Sony may implement a multi-GPU solution of its own. According to an IGN.com report, the CPU in question will be a custom version of AMD's A8-3850 quad-core APU. This indicates that the processor cores will be based on AMD's K10 "Stars" architecture, rather than the newer Bulldozer/Piledriver (Family 15h) design. 

The GPU, on the other hand, will be based on the "Southern Islands" architecture, and the IGN.com report pinpoints it as resembling the Radeon HD 7670. The HD 7670 is a re-branded HD 6670, which is based on the 40 nm "Turks" GPU. Turks uses neither Graphics Core Next nor VLIW4, but the older VLIW5 number-crunching machinery. The most interesting piece of information here is talk of a multi-GPU configuration between this Turks-based GPU and the GPU embedded in the "Llano" APU. We know that the graphics core embedded in the AMD A8-3850, the Radeon HD 6550D, can work in tandem with a Radeon HD 6670 to yield an AMD Hybrid CrossFireX configuration branded "Radeon HD 6690D2". This could end up being Sony's graphics weapon of choice. 
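For a rough sense of scale, VLIW5 peak throughput works out to stream processors × 2 (one multiply-add per cycle) × clock. A back-of-the-envelope sketch using the desktop parts' published specs; an actual console variant would likely use different clocks:

```python
# Back-of-envelope peak-FLOPS estimate for the rumored hybrid pair.
# VLIW5 parts: peak GFLOPS = stream processors * 2 (multiply-add) * clock (GHz).
# Desktop-part specs are used here; console clocks would likely differ.

def peak_gflops(stream_processors: int, clock_ghz: float) -> float:
    """Theoretical single-precision peak for a VLIW5 GPU."""
    return stream_processors * 2 * clock_ghz

hd6550d = peak_gflops(400, 0.600)   # Llano A8-3850's integrated GPU
hd6670  = peak_gflops(480, 0.800)   # Turks-based discrete card

print(f"HD 6550D: {hd6550d:.0f} GFLOPS")                 # 480
print(f"HD 6670:  {hd6670:.0f} GFLOPS")                  # 768
# Ideal combined ceiling (never reached in practice):
print(f"Hybrid ceiling: {hd6550d + hd6670:.0f} GFLOPS")  # 1248
```

Even the ideal sum is an upper bound; Hybrid CrossFireX scaling in practice lands well below it.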






Speaking of choices, we see that AMD is falling back on established technologies, be it K10 "Stars" CPU cores or VLIW5-based GPUs, and avoiding the Bulldozer CPU. While its new Graphics Core Next architecture has proven fairly efficient in performance GPUs such as the HD 7700 series, its incompatibility with Llano's graphics could have been a deal-breaker. Pictured above is fan-made concept art.

*View at TechPowerUp Main Site*


----------



## btarunr (Apr 4, 2012)

Many Thanks to Dos101 for the tip.


----------



## v12dock (Apr 4, 2012)

I just want to buy a copy of the PS4 OS and make my own PS4


----------



## seronx (Apr 4, 2012)

Super late april fools!


----------



## phanbuey (Apr 4, 2012)

weaksauce.


----------



## Fermi (Apr 4, 2012)

Weak. More crappy console ports for us PC gamers.


----------



## NC37 (Apr 4, 2012)

Lame...seriously, a Llano? Come on, if you are gonna cheap out at least use one from Trinity. 

Bah, next console gen is looking real shitty. Xbox using midrange GPU and now PS4. These consoles better be dirt cheap on launch.


----------



## Dos101 (Apr 4, 2012)

NC37 said:


> Bah, next console gen is looking real shitty. Xbox using midrange GPU and now PS4. These consoles better be dirt cheap on launch.



I doubt the next generation of consoles will be as expensive as this generation's were at launch, so I think that's what MS and Sony are going for. Neither one of them wants to sell their systems at a loss anymore so they're going for reasonable specs, not high end. Plus if Sony (and MS) want to release a new console in 2013, the specs would have been generally decided a while ago, it's just how the process is.


----------



## Damn_Smooth (Apr 4, 2012)

Fermi said:


> Weak. More crappy console ports for us PC gamers.



If they cap it at 720p and 30fps, then just concentrate on graphics, we could have some good looking games.


----------



## FreedomEclipse (Apr 4, 2012)

I dunno what you guys think, but that control pad looks really really really ugly.... It's like they tried to copy the Dreamcast idea of having a little display on the controller (which was the memory card if you slotted it in) but in a more modern way, like simply adding a PSP to a PS control pad.


----------



## dvelez1985 (Apr 4, 2012)

*Honestly*

These companies should give up on the set-top gaming consoles and focus more on portable gaming devices. From the looks of the 3DS and Vita, they have. From a personal standpoint, I can get the ultimate gaming, movie, and media device from my rig, more than I can get from those machines and then some. However, I was tired of playing crappy portable games when I wasn't near a PC. If Sony, Nintendo and Microsoft can corner that market and perfect it, I think it would bring big business to them.


----------



## Kärlekstrollet (Apr 4, 2012)

Now you can get microstuttering on your TV as well!


----------



## dj-electric (Apr 4, 2012)

Again, i feel the need to use this meme:







The HD 7750 chip is perfect. Why? Why not the HD 7750, for the love of everything that has a brain?!
Instead we get this hybrid pile of junk...


----------



## Lionheart (Apr 4, 2012)

Hmmm I see what Microsoft & Sony are doing, getting low end hardware and selling it at double the price..


----------



## [H]@RD5TUFF (Apr 5, 2012)

And sony makes it so no one cares.


----------



## FreedomEclipse (Apr 5, 2012)

dvelez1985 said:


> These companies should give up on the set top gaming consoles and focus more on the portable gaming devices. From the looks of it with 3DS and Vita they have. From a personal stand point I can get the ultimate Gaming, Movie, Media device from my rig than I can get from those machines and then some. However I was tired of playing crappy portable games when I wasn't near a PC. If Sony, Nintendo and Microsoft can corner that market and perfect it I think it would bring big business to them.



your post is all over the place. One moment you say you're tired of portable gaming, then the next minute you're saying they should drop set-top gaming consoles and focus more on portable gaming devices???

nailing *BOTH* sides of the market will bring them big business, since the console market has long been well established. Giving up set-top consoles to work on portable gaming instead would be a very stupid and costly move, as they would hand market share to competitors, who would then make more money.

Portable gaming has its advantages, but there is only so much power you can squeeze inside something as small as a PSP, and as good as the PSP might be, it can't compete directly with set-top consoles.


----------



## wickerman (Apr 5, 2012)

honestly the real answer is just to do upgradable graphics in consoles. The Alienware X51 proves you can get a system the size of an Xbox 360 housing a higher-end chip and graphics. That system can be configured with an i7-2600 and an Nvidia GTX 555 (288 shaders), and the efficiency of Ivy Bridge and the GeForce 600/HD 7000 series will only improve on that.

Every few years Microsoft or Sony could certify one or two GPUs as upgrades or have ATI/Nvidia create custom versions of their desktop/mobile GPUs that will only work in the consoles. 

Developers would like it as it would give them more resources to work with, gamers would love it because it would mean they could actually take advantage of their 1080p TVs (unlike modern console games that are limited to 720p or LOWER), and naturally the console makers would like it as they get to extend the life of their consoles rather than having to dump resources into developing new ones every few years. 

It really wouldn't be that complex, Microsoft trusted people to upgrade hard drives in the 360, a gpu could be made just as modular. 


But...I'm all for consoles supporting Eyefinity...would be epic fun doing 6 panel eyefinity across cheap LCD tvs. I mean honestly, how many walls do you really need, why cant one of them be full of screens!


----------



## AsRock (Apr 5, 2012)

Ooh, another console that will have crappy cooling. I wish they would make the box bigger so that much better cooling could be applied.


----------



## Kaleid (Apr 5, 2012)

Microstuttering now for consoles?


----------



## Mussels (Apr 5, 2012)

my laptop is A6 based, so i know how this tech works.


its a great idea because it lowers the power consumption a large amount at idle, and when doing low load tasks like movie playback.


edit: and i bet AMD's 'steady video' tech had something to do with their choices.


----------



## Fairlady-z (Apr 5, 2012)

I have a feeling my GTX 680 SLI setup with a 2600K will have very very very very long legs lol... this is nuts if they do it; I think they're trying to buy time until OnLive-style streaming is feasible for them.


----------



## Easy Rhino (Apr 5, 2012)

who cares. sony treats their customers like sh!t.


----------



## Mike_b (Apr 5, 2012)

In all honesty, if this is what Sony has planned for the next gen, I think I'll pass. (And I'm probably one of Sony's biggest fans.) I expected them to at least use a Trinity-based APU paired with a customized version of the HD 7750; it's not only a better-performing chip, it's also amazingly efficient. I'm very disappointed if this is true. Sony, you can do better than this.


----------



## bear jesus (Apr 5, 2012)

So far the rumored hardware makes me think it would be relatively cheap for valve to make a more powerful "console", i hope they do.


----------



## FordGT90Concept (Apr 5, 2012)

Have they not learned anything from SLI and Crossfire?

1 + 1 < 2

In English: two GPUs never produce 200% output. One chip with double the power can.


The only reason I can think of for them doing this is that they're going for something really slim. It's easier to cool two small GPUs than one big GPU. At the same time, you'd think they would rule that out quickly, because two GPUs cost more to manufacture and install than one big GPU. I think Microsoft and Sony got hit with a stupid stick, and a big one at that. Sense, this makes none.
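The "1 + 1 < 2" point can be sketched with Amdahl's law: if only a fraction p of the frame work splits cleanly across GPUs, the serial remainder caps the gain from a second chip. The p values below are illustrative, not measurements:

```python
# Minimal Amdahl's-law model of multi-GPU scaling: the fraction of
# frame work that does NOT parallelize limits the achievable speedup.

def multi_gpu_speedup(p: float, n_gpus: int) -> float:
    """Amdahl's law: speedup with n_gpus when fraction p parallelizes."""
    return 1.0 / ((1.0 - p) + p / n_gpus)

for p in (0.80, 0.90, 0.95):
    print(f"p={p:.2f}: 2 GPUs -> {multi_gpu_speedup(p, 2):.2f}x")
# Even at 95% parallel work, two GPUs land near 1.90x, not 2.00x,
# while a single GPU with double the units avoids the split entirely.
```

The same model also shows why a perfectly partitioned workload (p = 1) is the only case where two chips truly equal one chip of double the power.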


----------



## TheLaughingMan (Apr 5, 2012)

Why not just go with the big GPU anyway? Granted, the console will technically be more powerful than a PC built with the same specs because of the lack of overhead; but why not go bigger? If they went with a 7870 or 7950, they could build the system to use the on-die 6550D equivalent and only turn on the big GPU when a game is loaded. It would use less than 1 W on standby, so WTF not? It would also eliminate potential CrossFire issues and would have more power overall. Hell, they could use the on-die GPU as a physics processor while gaming, using DirectCompute or AMD APP or whatever.

I think both Sony and AMD are missing an opportunity here or there is more to come.


----------



## hhumas (Apr 5, 2012)

what's the big deal then ... technology is growing faster and now upcoming games need Powerful gpu


----------



## FreedomEclipse (Apr 5, 2012)

hhumas said:


> what's the big deal then ... technology is growing faster and now upcoming games need Powerful gpu



correct, but the 6670 is a rather low/mid-range GPU -- that's the problem


----------



## Shoggoth (Apr 5, 2012)

This is hilarious. That console will be worse than a year old budget gaming build. But us PC gamers are going to see good things out of this because there will be more development for the x86_64 platform and developers will start trying to make sure their games work well on dual-GPU setups.


----------



## Mistral (Apr 5, 2012)

I get that to-the-metal code will run blazing fast on a 6670 compared to the current chips in half-decade-old consoles, but wasn't Sony planning to push 4K displays with the PS4? 

That makes this piece of "news" suspect, as it breaks Sony's modus operandi. They've always used their consoles to trojan their other hardware into people's homes.


----------



## Mussels (Apr 5, 2012)

ford: being a console, they can do something that PC games can't: optimise for dual GPU from the very first steps, with no driver bugs or CrossFire/SLI profiles needed.


----------



## Steevo (Apr 5, 2012)

VLIW is extremely efficient when code is written and compiled for it. An integrated setup engine could work to help along dependencies and branch misses.


----------



## FordGT90Concept (Apr 5, 2012)

Mussels said:


> ford: being a console, they can do something that PC games cant: optimise them for dual GPU from the very first steps, with no driver bugs or crossfire/SLI profiles needed.


But you're still splitting a task meant for one GPU to two.  Just like multithreading on CPUs, that creates wasteful overhead.


----------



## Rowsol (Apr 5, 2012)

My 'crappy' 5770 is more powerful than what they are going to put in here?


----------



## Mussels (Apr 5, 2012)

FordGT90Concept said:


> But you're still splitting a task meant for one GPU to two.  Just like multithreading on CPUs, that creates wasteful overhead.



not if it's designed for it. no waste or overhead if they do it properly.

e.g., one can render the game while the other adds post-processing, or the second runs GPGPU/DirectCompute tasks exclusively.


there are many ways this could be used properly; i do not believe multi-GPU issues from the PC will translate across to consoles.
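The render/post-process split suggested above is essentially two-stage pipelining; a toy model with made-up stage timings (not real console figures) shows the throughput gain and its limit:

```python
# Sketch of the pipelining idea: GPU 1 renders frame N while GPU 2
# post-processes frame N-1. Steady-state throughput is bounded by the
# slower stage, at the cost of one extra frame of latency.
# Timings are hypothetical illustrative numbers.

RENDER_MS, POST_MS = 25.0, 8.0  # made-up per-frame stage costs

def serial_frame_time() -> float:
    """ms/frame when one GPU does both stages back to back."""
    return RENDER_MS + POST_MS

def pipelined_frame_time(n_frames: int) -> float:
    """Average ms/frame when render and post-process overlap."""
    # The first frame pays both stages; every later frame is bounded
    # by whichever stage is slower.
    total = RENDER_MS + POST_MS + (n_frames - 1) * max(RENDER_MS, POST_MS)
    return total / n_frames

print(f"serial:    {serial_frame_time():.1f} ms/frame")         # 33.0
print(f"pipelined: {pipelined_frame_time(1000):.2f} ms/frame")  # ~25.01
```

Note the model also concedes the counterpoint: the second GPU is only busy for 8 ms of every 25 ms here, so it is far from fully utilized.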


----------



## FordGT90Concept (Apr 5, 2012)

Can't post process until the render is done--second GPU is idle until it is given a task.  Likewise GPGPU tasks will not accelerate the graphics nor will it put the second GPU to full use.  In short, you're providing double the GPU power but only using, at most, 1.5 of it.  A single GPU with the power of two could do all the above with few drawbacks.

Everything in computers translates to consoles.  Look at the CELL as a prime example.  The issues plaguing computer developers with multithreading are amplified on PS3 because they not only have two cores to code, but also 7 SPEs.  Developers simply don't use the SPEs unless what they are trying to achieve requires it.  CELL was abandoned for that reason.

Console developers have the same push for profits as PC developers.  If they can achieve their goals without using the intricacies of hardware, they're not going to go there.

Likewise, the issue of limited RAM in consoles translates back to computers.


----------



## Steevo (Apr 5, 2012)

Not with x64


----------



## FordGT90Concept (Apr 5, 2012)

Very few games go over 2 GiB of RAM usage.  Consoles have ~1 GiB.  They're developed to stay inside the console threshold.  They only go north of that 1 GiB when you turn up the settings.


----------



## Mussels (Apr 5, 2012)

FordGT90Concept said:


> Very few games go over 2 GiB of RAM usage.  Consoles have ~1 GiB.  They're developed to stay inside the console threshold.  They only go north of that 1 GiB when you turn up the settings.



that comes down to opinion; many would say turning the settings up to max should fall within that 2GB limit (due to apps often crashing without LAA support)


sure, at console-level graphics, they don't go over it.



meh, this arguing is pointless. these consoles are still a MASSIVE improvement over current gen, and most people are happy with current-gen console graphics. they'll be faster and more power-efficient, with more features.


----------



## Isenstaedt (Apr 5, 2012)

btarunr said:


> The GPU, on the other hand, will be based on the "Southern Islands" architecture, and the IGN.com report pin points it to resemble Radeon HD 7670. The HD 7670 is a re-branded HD 6670, which is based on the 40 nm "Turks" GPU.


This is wrong btarunr. Turks is Northern Islands.


----------



## KainXS (Apr 5, 2012)

more rumors more laughs


----------



## ompak5 (Apr 5, 2012)

lets wait for that console...  of course this would be not exactly what you think.. i think its more powerfull.. heheheehe....


----------



## shamus087 (Apr 5, 2012)

FordGT90Concept said:


> Have they not learned anything from SLI and Crossfire?
> 
> 1 + 1 < 2
> 
> ...



Not trying to sound like an @ss, but I watch so many benchmark sites and I'd say 95% of them show pretty much full scaling from adding a second GPU, provided they are benching GPUs newer than the HD 5000 series. I did a random search for current GPUs and got this http://www.guru3d.com/article/radeon-hd-6850-6870-crossfirex-review/7 which compares a 6850 in CrossFire vs. a 6870 in CrossFire; the scaling they got was really close to two GPUs working perfectly in tandem. 

I know this only concerns titles where the devs put the effort into making the scaling work. But it still shows the GPUs are very capable of doing it. 

Forgive me if I misunderstood what you were saying. This is how I see it, though: current and even last-gen GPUs are VERY capable of scaling well together.


----------



## FordGT90Concept (Apr 5, 2012)

Mussels said:


> that comes down to opinion, many would say turning up the settings to max should fall within that 2GB limit (due to the apps often crashing with no LAA support)


By "turn up settings" I meant "to levels that exceed consoles."  For example, a cross-platform game can be expected to use ~1 GiB at the same resolution it is on consoles (720p or lower usually).  Double the pixels (1920x1200) and you'll see that jump to the 2 GiB I mentioned before.  If consoles were removed from the picture, we'd have games using 4+ GiB now at 1920x1200 and more than ~2 cores.




Mussels said:


> meh, this arguing is pointless. these consoles are still a MASSIVE improvement over current gen, and most people are happy with current gen console graphics. they'll be faster, more power efficient, and more features.


Agreed.  The only logical explanation I can come up with for opting to use two GPUs instead of one superior GPU is that they're buying surplus GPUs for cheap.  The illogical explanation would be that hardware developers haven't figured out why multi-GPU hasn't been wildly successful on computer hardware, and that they'll learn from this mistake in the next generation.


----------



## Easy Rhino (Apr 5, 2012)

it's sony. nothing they do makes sense.


----------



## FordGT90Concept (Apr 5, 2012)

Case in point: CELL -- great for IBM servers, bad for consoles.


----------



## wickerman (Apr 5, 2012)

graphics aside, does anyone else think the best upgrade for the next-gen consoles would be some form of SSD or hybrid storage? Perhaps something like Seagate's Momentus hybrid or a cheaper version of OCZ's RevoDrive? Obviously a straight flash SSD would be epically fast, but next-gen games are only going to get bigger and bigger as these consoles get more PC-like. Hell, PC games are getting absurdly large: SWTOR is 35 GB, Shogun 2 is 30 GB, RAGE was 22 GB.


----------



## hardcore_gamer (Apr 5, 2012)

So the next-generation consoles are going to be underpowered compared to the PCs of their time by a huge margin. Something that hasn't happened since the PS2.


----------



## Jurassic1024 (Apr 5, 2012)

I'll sh*t bricks if the PS4 has the same controller as the PS, PS2 and PS3.


----------



## Jurassic1024 (Apr 5, 2012)

fordgt90concept said:


> the illogical explaination would be that hardware developers haven't figured out why multi-gpu hasn't been wildly successful on computer hardware...



lmao


----------



## Jurassic1024 (Apr 5, 2012)

hardcore_gamer said:


> So the next generation consoles are going to be underpowered compared to the PCs at that time by a huge margin:shadedshu. Something that didn't  happen since the PS2.



Underpowered compared to a PC at launch?  What about the 5+ year lifespan of a console.  Even if a console was equal to the latest PC hardware at launch, it certainly wouldn't be one or two or five or ten years later.


----------



## rpsgc (Apr 5, 2012)

A8 + HD 6670 Hybrid Crossfire can be as fast as an HD 6770 (HD 5770).

http://www.xbitlabs.com/articles/cpu/display/amd-a8-3870k_7.html#sect0


----------



## Jurassic1024 (Apr 5, 2012)

Kaleid said:


> Microstuttering now for consoles?



lol  Comments like these are way too funny.


----------



## dieterd (Apr 5, 2012)

right now my old HD 6850 can run every newest console port maxed out @1080p. I thought I could skip a GPU generation (one, maybe two) until the next-gen consoles came out, because then there would be a major graphics performance leap: any old GPU (even a GTX 680) would be too weak to run the newest games maxed out, and there would be new monitors at resolutions way beyond 1080p, because the new monster consoles would need them... Now I understand that my HD 6850 will be good for a long time and we won't see any reasonably priced monitors at higher resolutions. Even my i5-760 at stock speed (and turbo turned off - lol) will chew through each newest game without any problem. I was so eager to upgrade and spend my money on good new shiny things - if only I had some reason to do so


----------



## Thefumigator (Apr 5, 2012)

A Llano in a PS4 is actually enough, considering you have direct access to the hardware without the penalty of passing through DirectX, a Windows OS, and/or Windows-compliant drivers. And you won't have paging files either, or a massive Windows registry, and so on.

I bet games will look 5x better, and the thing will perform much better than a PC with the same specs. An Xbox 360 shows quite impressive video quality with such an outdated (ATI X1950-based) video chip.


----------



## insane 360 (Apr 5, 2012)

dieterd said:


> right now my old HD 6850 can run every newest console port maxed out @1080p. I thought I could skip a GPU generation (one, maybe two) until the next-gen consoles came out, because then there would be a major graphics performance leap: any old GPU (even a GTX 680) would be too weak to run the newest games maxed out, and there would be new monitors at resolutions way beyond 1080p, because the new monster consoles would need them... Now I understand that my HD 6850 will be good for a long time and we won't see any reasonably priced monitors at higher resolutions. Even my i5-760 at stock speed (and turbo turned off - lol) will chew through each newest game without any problem. I was so eager to upgrade and spend my money on good new shiny things - if only I had some reason to do so



send me money and i'll send you shiny things...promise!

on a serious note, Mussels is right: these consoles will be worlds better than the current gen and use less power, and the components are closer than ever to PC parts, which should make ports better. this will be good


----------



## hardcore_gamer (Apr 5, 2012)

Jurassic1024 said:


> Underpowered compared to a PC at launch?  What about the 5+ year lifespan of a console.  Even if a console was equal to the latest PC hardware at launch, it certainly wouldn't be one or two or five or ten years later.



That happens every time. But being underpowered at launch hasn't happened since the PS2.


----------



## Mussels (Apr 5, 2012)

wickerman said:


> graphics aside, does anyone else think the best upgrade for the next gen consoles would be some form of SSD or hybrid storage? Perhaps something like Seagates Momentus hybrid or a cheaper version of OCZ's RevoDrive? Obviously straight flash SSD would be epicly fast but next gen games are only going to get bigger and bigger as these consoles get more PC-like. Hell PC games are getting absurdly large. SWTOR is 35gb, Shogun 2 is 30gb, RAGE was 22gb.



failure rates are too high on flash media. they cant rely on it.


----------



## HossHuge (Apr 5, 2012)

Jurassic1024 said:


> I'll sh*t bricks if the PS4 has the same controller as the PS, PS2 and PS3.



From the looks of that picture you are going to need this.


----------



## rpsgc (Apr 5, 2012)

FreedomEclipse said:


> I dunno what you guys think but that control pad looks really really really ugly.... Its like they tried to copy the Dreamcast idea of having a little display on the controller (which was the memory card if you slotted it in) but in a more modern way like simply adding a PSP to a PS control pad.





HossHuge said:


> From the looks of that picture you are going to need this.
> 
> http://img.techpowerup.org/120405/toilet-paper.jpg





If only people would RT(F)A... 



> Pictured above, is *fan-made concept art*.


----------



## vega22 (Apr 5, 2012)

wewt!!!

now the world of console gamers can get micro stutter too


----------



## Boljack (Apr 5, 2012)

*Holy macaroni!*

This will be faster than my HP laptop dv6 6620G that I paid 600$ for it and this prolly can be overclock whoot whoot!! ...

http://www.youtube.com/watch?v=30zTM11hg7A


----------



## HossHuge (Apr 5, 2012)

rpsgc said:


> If only people would RT(F)A...



Obviously somebody (for example, the fan who created the picture) thinks that the controller isn't going to change. Now, can't I comment on that without you crying about me supposedly not reading the article?


----------



## ompak5 (Apr 5, 2012)

Thefumigator said:


> A llano in a PS4 actually is enough, considering you have direct access to hardware without the penalty of passing throu direct x, windows OS, and/or windows compliant drivers. And you won't have paging files neither, a massive windows registry, and so on.
> 
> I bet Games will look 5x better and the thing will perform much better than on a PC with same specs. An Xbox 360 shows quite an impressive video quality with such an outdated (ATI 1950 based) video chip.




i remember a while ago... AMD said games should not render under a DirectX environment, so the GPU can run according to the program without the OS limiting it? I think it's time for AMD to shine here.. if this proves correct on the PS4, then do games not need DirectX on Windows???


----------



## THE_EGG (Apr 5, 2012)

hang on a second. Wasn't the 6670 a re-branded 5670? So really the 7670 is a re-branded 5670. :/


----------



## Thefumigator (Apr 5, 2012)

ompak5 said:


> i remember a while ago... AMD said all games should not render under DIRECT X environment so the GPU can do according to their program with out OS limiter? I think its time for AMD to shine here.. if this proved to be correct here on PS4 then all games dont need DIRECT X on windows???



Under Windows you don't need DirectX to render 3D; you can use OpenGL if you want, or your own technology if it's mature enough (though not all video cards will support it). DirectX and OpenGL have proven to be the most widely used. Why? Because gaming video cards support them. 

So this is the reason console hardware works better: developers know the video chip will be the same in every PS4. 

You just can't make the same assumption with the PC. Even with DirectX in the middle, as a developer you may want to test the game on every video card possible, say at least from the ATI 2000 series all the way to the 7000 series, and the Nvidia 8000 series all the way to the GTX 600 series, to make sure it runs without problems.

There have been times when a console changed chips from version to version, like the older vs. newer Xbox 360s, but they just moved to a better revision of the same chip: the same capabilities, with improved power consumption and other factors. "Digitally" it was the same thing, so it made no difference to the user or the developer.


----------



## Prima.Vera (Apr 5, 2012)

If this is true, with those crappy GPUs I think it's a waste to even create a new PS4 that's only marginally faster than the PS3... WTH?!?!


----------



## erek (Apr 5, 2012)

looks like a rip off of the new Apple TV


----------



## xenocide (Apr 5, 2012)

ompak5 said:


> i remember a while ago... AMD said all games should not render under DIRECT X environment so the GPU can do according to their program with out OS limiter? I think its time for AMD to shine here.. if this proved to be correct here on PS4 then all games dont need DIRECT X on windows???



Developers were pretty quick to refute that claim.  I remember Crytek being one of those who said programming "to the metal" wasn't an ideal solution for anything other than consoles, and even for consoles it wasn't great.  DirectX is so common these days because it works so well.  As mentioned, there are alternatives, but there is no point in using most of them because they are either not well supported by the hardware or just not on par with the current solutions.

I don't think some kind of Hybrid Crossfire solution is the best idea for consoles.  Just look at how often people on these forums have issues with Crossfire, do you really want that to carry over to your console?  I know I don't.  I see no reason why they couldn't just use a standard Quad-Core CPU with something akin to a 6850 or 7770.

I hope half these rumors about this upcoming generation of consoles are lies, because if not it seems like a huge amount of disappointment is inbound.


----------



## Thefumigator (Apr 5, 2012)

xenocide said:


> I don't think some kind of Hybrid Crossfire solution is the best idea for consoles.  Just look at how often people on these forums have issues with Crossfire, do you really want that to carry over to your console?  I know I don't.  I see no reason why they couldn't just use a standard Quad-Core CPU with something akin to a 6850 or 7770.



A standard quad-core CPU + a video card would be more expensive. Llano is perfect: everything is integrated into one single chip.

I believe in CrossFire; it's a good way of improving performance. In a console it's even better: developers know that if something doesn't work well, they'll have to fix it before release, and that fix will work on all PS4 consoles, because they're all the same internally.

Now, if the PS4 changes specs from stepping to stepping and/or is upgradeable, get ready for problems, because upgradability and compatibility with different hardware configs have plagued the PC for ages.


----------



## Isenstaedt (Apr 5, 2012)

THE_EGG said:


> hang on a second. Wasn't the 6670 a re-branded 5670? So really the 7670 is a re-branded 5670. :/


No. The 7670 is a rebranded 6670 because it is based on the same GPU, Turks XT. The 5670 is based on Redwood XT.

Still, the 6670 is about 10% faster than the 5670, and it manages that while consuming less power.


----------



## Frick (Apr 5, 2012)

A question for all the people saying the new consoles are weak: Have the consoles ever been more powerful than a home computer? I mean I don't know and would like to know if the whining is justified or not.


----------



## Thefumigator (Apr 5, 2012)

Frick said:


> A question for all the people saying the new consoles are weak: Have the consoles ever been more powerful than a home computer? I mean I don't know and would like to know if the whining is justified or not.



That's exactly my point. Home computers have always been much more powerful than consoles.

PCs weren't made specifically for gaming, though technologies like DirectX/OpenGL enable it when paired with a good video card. 

However, a good, powerful video card on a PC beats everything available in console gaming except for title availability. The difference with consoles is longevity: PCs are always updating. Each year a new video card model is released; each year a new processor is released.


----------



## FordGT90Concept (Apr 5, 2012)

All graphics hardware out there today is designed to DirectX specifications.  Some developers might use OpenGL, OpenCL, etc. because the DirectX libraries aren't available on any platform except Windows but those open libraries benefit from the hardware advancements DirectX requires.  Even what gets thrown into consoles benefits from those hardware advancements because there is no profit in rewriting what works when porting hardware to a console.  Yeah, of course they don't use the DirectX libraries (except Xbox) but they're using the same techniques and technologies DirectX pioneered.


----------



## Dent1 (Apr 5, 2012)

xenocide said:


> I don't think some kind of Hybrid Crossfire solution is the best idea for consoles.  Just look at how often people on these forums have issues with Crossfire, do you really want that to carry over to your console?



The reason people have issues with CrossFire is that PCs have an almost infinite number of variables that can cause conflicts: different models/brands/types of OS, configurations, driver revisions, memory, motherboards, CPUs, and GPUs. Then on top of that you have incompetent users.

With consoles, developers are working with a system whose specifications aren't changing, so typical PC issues (like CF issues) shouldn't occur.


----------



## THE_EGG (Apr 6, 2012)

Isenstaedt said:


> No. The 7670 is a rebranded 6670 because it is based on the same GPU, Turks XT. The 5670 is based on Redwood XT.
> 
> Still, the 6670 is about 10% faster than the 5670. It does it consuming less power, though.



Ah cheers, my mistake.  I thought it was like how the 5770 got rebranded and such.


----------



## xenocide (Apr 6, 2012)

Thefumigator said:


> A standard quad-core CPU + a video card will be more expensive. Llano is perfect; it's everything integrated in one single chip.
> 
> I believe in Crossfire; it's a good way of improving performance. In a console it's even better: developers know that if something doesn't work well, they will have to fix it before release, and this fix will work on all (PS4) consoles, because they are all the same internally.



Llano in Hybrid Crossfire is most certainly not integrated into one chip.  You also have to consider that the GPUs built into Llano are not very powerful, and even in Hybrid Crossfire are nothing spectacular compared to even an entry-level discrete GPU.  As someone said earlier in here, the best Hybrid Crossfire setup you can do is only about as good as an HD 6670, which doesn't sound majorly impressive to me.  The PS3 and 360 used rather high-end, advanced GPUs at the time, and if these companies are really pushing for longer console life cycles, they need to do at least that.

I also think it's more likely they would release games with any Crossfire-induced bugs and try to patch around them, similar to how AMD does with drivers (hopefully more successfully than AMD does with drivers, but I digress).  It's adding more of a burden to developers than necessary, which is the EXACT same problem developers had with CELL.


----------



## ompak5 (Apr 6, 2012)

xenocide said:


> Llano in Hybrid Crossfire is most certainly not integrated into one chip.  You also have to consider that the GPUs built into Llano are not very powerful, and even in Hybrid Crossfire are nothing spectacular compared to even an entry-level discrete GPU.  As someone said earlier in here, the best Hybrid Crossfire setup you can do is only about as good as an HD 6670, which doesn't sound majorly impressive to me.  The PS3 and 360 used rather high-end, advanced GPUs at the time, and if these companies are really pushing for longer console life cycles, they need to do at least that.
> 
> I also think it's more likely they would release games with any Crossfire-induced bugs and try to patch around them, similar to how AMD does with drivers (hopefully more successfully than AMD does with drivers, but I digress).  It's adding more of a burden to developers than necessary, which is the EXACT same problem developers had with CELL.



I think the APU used in this console will be more powerful than the current desktop APUs. It will be a redesigned APU with a powerful GPU, and there will be fewer bugs because there's only one hardware configuration.


----------



## Dent1 (Apr 6, 2012)

xenocide said:


> I also think it's more likely they would release the games with any Crossfire-induced bugs and try and patch around it similar to how AMD does with drivers (hopefully more successfully than AMD does with drivers but I digress).  It's adding more of a burden to developers than necessary, which is the EXACT same problem developers had with CELL.



Crossfire induced bugs? Please go back to page 3, post #75. I already addressed that issue.


----------



## xenocide (Apr 6, 2012)

ompak5 said:


> I think the APU used in this console will be more powerful than the current desktop APUs. It will be a redesigned APU with a powerful GPU, and there will be fewer bugs because there's only one hardware configuration.



It said in the OP that it would be Stars-based (a Llano APU), and those have been readily available for a year.  I doubt they would waste production capacity on a special design just for the PS4.  I don't think GloFo or TSMC have the spare capacity to do so without charging a markup (further increasing the cost of the system).

They cannot just throw a more powerful GPU onto an older CPU design, by the way.  They have to consider power consumption and heat generation, as well as available die space.  If none of these things were very real concerns, we'd see Trinity launching with the likes of HD 7860D's on it, or something of that nature.



Dent1 said:


> Crossfire induced bugs? Please go back to page 3, post #75. I already addressed that issue.



Which is what I was responding to.  I think you'd sooner see developers push that kind of stuff off than spend time fixing it right away.  If AMD has issues getting Crossfire to work so frequently, I doubt regular developers would be immune to them.  I still think a single more powerful GPU would be an infinitely better solution.


----------



## Irish_PXzyan (Apr 6, 2012)

Should I be excited about the PS4 or not?????


----------



## THE_EGG (Apr 6, 2012)

Irish_PXzyan said:


> Should I be excited about the PS4 or not?????



I suppose that's for you to decide.


----------



## Irish_PXzyan (Apr 6, 2012)

Usually I get all hyped up about the latest and greatest PlayStation, but these rumors aren't getting me excited at all!
I don't want to get the latest-gen console if it's still going to be 720p and look rubbish on any 1080p HD TV :/ 
It doesn't really sound next-gen at all to me. Not impressed.


----------



## Dent1 (Apr 6, 2012)

xenocide said:


> If AMD has issues getting Crossfire to work so frequently I doubt regular developers would be immune from the issues.



What issues specifically? If you are talking about performance issues, then Crossfire almost always increases performance, even in badly supported titles; granted the boost isn't always 100%, but it is typically present. 
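To sketch that with some illustrative (entirely made-up) numbers — `crossfire_fps` and the efficiency figures below are hypothetical, not benchmarks:

```python
# Hypothetical sketch: a second GPU in CrossFire rarely doubles performance,
# but even poor per-title scaling is usually still a net gain.

def crossfire_fps(single_gpu_fps, scaling_efficiency):
    """Effective FPS with two identical GPUs, where the second one
    contributes only a fraction (0.0-1.0) of its theoretical throughput."""
    return single_gpu_fps * (1.0 + scaling_efficiency)

base = 30.0                          # single-GPU frame rate
print(crossfire_fps(base, 0.9))      # well-supported title  -> 57.0 FPS
print(crossfire_fps(base, 0.3))      # badly supported title -> 39.0 FPS, still a boost
```

Even at 30% scaling the second GPU helps; it just helps less than the marketing implies.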

If you are talking about bluescreens, crashes and lock-ups, those issues are due to variables pertaining to driver conflicts, configurations and software/hardware variations, and are not a major factor on a console due to the standardisation of equipment.




xenocide said:


> I still think a single more powerful GPU would be an infinitely better solution.



I guess that's one thing we're in agreement about. I would rather see a single powerful GPU too (but for different reasons).


----------



## TheoneandonlyMrK (Apr 6, 2012)

Dent1 said:


> If you are talking about bluescreens, crashes and lock-ups, those issues are due to variables pertaining to driver conflicts, configurations and software/hardware variations, and are not a major factor on a console due to the standardisation of equipment.



Quite right. If set up well, CrossFire (like SLI) is faultless; mine has worked well this last year with no issues.

Firstly, hopefully they will go with Trinity, as they wouldn't want the next 4/6-core Xbox pipping them in performance, so I'm optimistically declaring this news BS.

Either way, not long ago AMD were talking about games coded bare-metal style (like on consoles) being a future possibility on PC, and given that PCs under-utilise GPUs as it is, two low-end GPUs can be made to pack a punch. Plus, Sony are said to be adding in chips for functionality/performance; without knowing the full details of these, and how they might enhance performance, it's a bit early to write Sony or the PS4 off, IMHO.


----------



## Morgoth (Apr 8, 2012)

Pc ftw


----------



## vagxtr (Apr 26, 2012)

btarunr said:


> Sony's NGN PS is known to be powered by an AMD x86-64 CPU with graphics based on its Southern Islands architecture, from the older report. We're now hearing that Sony may implement a multi-GPU solution of its own.
> custom-version of AMD's A8-3850 4core APU. (based on AMD's K10 Stars architecture, rather than K15 Bulldozer/Piledriver.)
> 
> and HD 7670 (HD6670 rebranding based on the 40 nm GPU "Turks")
> Turks uses neither GCN nor VLIW4, but the older VLIW5 number-crunching machinery. The most interesting piece of information here is talk of a multi-GPU configuration between this Turks-based GPU, and the GPU that's embedded into the "Llano" APU. We know that the graphics core embedded into AMD A8-3850, the Radeon HD 6550D, can work in tandem with Radeon HD 6670 to yield an AMD Hybrid CrossFireX configuration called "Radeon HD 6690D2". This could end up being Sony's graphics weapon of choice.



So does this mean that next-gen consoles are going to have microstuttering as a fully advertised feature?

CFX is a nice feature, especially if and when the hardware is already available so there's no waste of resources. This time the end-user experience during gameplay isn't of any crucial significance; it's rather about how to build readily available hardware on the cheapest process (40 nm?) and offer something newer than the PS3, which is already 5 years old (by Q1 2013 it will be a 6-year-obsolete product).

I think it would be interesting to see where Sony will produce DAMN's IP-licensed chips, and at what price. TSMC's 28 nm is highly overpriced, and UMC, who should have it in H2 2012, could offer an alternative. Or it will be GloFo. It would be a shame if they produced and optimised it for the already obsolete 40 nm process, which is 2-3 times cheaper, but then it would suck 200 W more than DAMN's APU on the 32 nm SOI that is in production now. It's crapptastic for customers, who already don't know what to waste their money on.

But then DAMN's proposal is to respin consoles from a 6-7 year cycle to 3-4 years, and that could be part of this agenda.

I don't think the obsolete VLIW5 engine would be in the next-gen PS4 because it performs best, but just for its compatibility with the CFX setup you mentioned. And we could hail the microstuttering feature finally arriving in the console world.
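The microstuttering complaint comes down to frame pacing: with alternate-frame rendering, two GPUs can hit a high *average* FPS while frames arrive in uneven pairs. A small sketch with hypothetical frame times (`avg_fps` and `perceived_fps` are made-up helper names, and the pessimistic worst-frame metric is just one way to illustrate the effect):

```python
# Hypothetical sketch: why a stuttery "60 FPS" average can feel worse
# than a steadily paced frame rate.

def avg_fps(frame_times_ms):
    """Average FPS over a list of per-frame render times (ms)."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def perceived_fps(frame_times_ms):
    """Pessimistic estimate: smoothness is limited by the slowest frame,
    so treat the worst frame time as the effective rate."""
    return 1000.0 / max(frame_times_ms)

stuttery = [8.0, 25.0] * 30   # two GPUs, badly paced frame pairs
steady   = [16.5] * 60        # one GPU, evenly paced

print(avg_fps(stuttery), avg_fps(steady))              # identical averages (~60.6)
print(perceived_fps(stuttery), perceived_fps(steady))  # 40.0 vs ~60.6
```

Both runs report the same average FPS, but the badly paced one never delivers better than 40 FPS worth of smoothness.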

Would that APU+GPU combination really mean there will be a separate GPU in the PS4, rather than the single-chip composite (quasi-SoC) of before?



Dos101 said:


> I doubt the next generation of consoles will be as expensive as this generation's were at launch, so I think that's what MS and Sony are going for. Neither one of them wants to sell their systems at a loss anymore so they're going for reasonable specs, not high end. Plus if Sony (and MS) want to release a new console in 2013, the specs would have been generally decided a while ago, it's just how the process is.



I think that's more like DAMN's agenda to fertilize their investment in hardware that the Sony-MS-DAMN consortium agreed upon, rather than us seeing a satisfying experience from superbly optimized apps/games running on that two-generations-obsolete hardware when it finally arrives on the market.



Kaleid said:


> Microstuttering now for consoles?



Yep. Finally, I might add.


----------



## vagxtr (Apr 26, 2012)

Mike_b said:


> In all honesty, if this is what Sony has planned for the next gen, I think I'll pass. (And I'm probably one of Sony's biggest fans.) I expected them to at least use a Trinity-based APU paired with a customized version of the HD 7750; it's not only a better-performing chip, it's also amazingly efficient. I'm very disappointed if this is true. Sony, you can do better than this.



It's impossible, because even the Piledriver APU supposedly has only a VLIW4-based engine, not GCN, and different-generation GPUs can't mix; hell, even same-generation GPUs can't mix in DAMN's drivers, only identical chips with the same features enabled at different clocks (like in the HD 4800 series). It's poor practice, but when there are BLIND SUPPORTERS for it, why the hell not abuse them and milk money out of their pockets?

I'm not disappointed with Sony or DAMN.

Why oh why oh why does nobody ever complain about the same shabby GigaIntel practice of needing to sell their little-or-no-upgrade chipsets every time they release their shiny tick-tock chip??? It's a shabby practice Intel has followed for 25 years. And now Sony-MS-DAMN is THE ONLY evil cartel?


----------



## NdMk2o1o (Apr 26, 2012)

All 4 pages TL;DR 

Are these specs confirmed or not?


----------

