Monday, December 22nd 2014

AMD to Power Next-Generation NES

Nintendo is working on a next-generation gaming console to succeed even the fairly recent Wii U. The company is reacting to the plummeting competitiveness of its current console against the likes of the PlayStation 4 and the Xbox One. Reports suggest that Nintendo could course-correct the direction it took its game console business with the Wii, and come up with a system that's focused on serious gaming as much as it retains the original "fun" quotient. In that sense, the console could be more NES-like than Wii-like.

Nintendo could ring up AMD for the chip that will drive its next console. It's not clear whether AMD will supply a fully-integrated SoC that combines its own x86 CPU cores with its GCN graphics architecture, or simply supply the GPU component for an SoC that combines components from various other manufacturers. The Wii U uses IBM CPU cores with an AMD GPU, combined onto a single package. There's no word on when Nintendo plans to announce the new console, but one can expect a lot more news in 2015-16.
Source: Expreview

62 Comments on AMD to Power Next-Generation NES

#26
cadaveca
My name is Dave
Dj-ElectriC: You wouldn't define the ability to get a 2C/4T Haswell + iGPU at under 5W? I would. And no, NVIDIA isn't far, far away; one far is enough.
AMD has a decent offering, and "decent" has brought enough potato-like trouble to both current-gen consoles, with an array of devs unable to deliver either a 1080p image and/or 60 FPS in most games.
No, no thanks. Hopefully Nintendo will use more powerful tech in the 20nm gen. Maybe not.
Yeah, but DJ, look at the cost of that chip. Like, don't get me wrong, I have a Surface Pro 3, and its 11W CPU is pretty good.

But yes, 1080p development is far more important, IMHO, and getting there requires a rather beefy chip, which the other consoles do not use. On the GPU side, looking at power efficiency and maintaining a <250W power envelope, 50W for the CPU and <175W for the GPU isn't that exciting. A Haswell 4C/8T and a GTX 970 might work, though. AMD cannot compete within a specified power envelope... or can they? :p
Posted on Reply
#27
HisDivineOrder
On the one hand, it's a slap in the face of Wii U gamers if it's not completely compatible.

On the other hand, it's probably the best move they have available to them. Wii U is never going to take off the way Nintendo wants it to. It just won't happen. Third party publishers have abandoned the platform specifically because it's a 360-level console in a PS4-level world of gaming, especially going forward. Developers don't want to continue to make games built to 360-level specifications (in terms of processor power, in terms of GPU power, and most importantly in terms of lacking memory) when pushing games onto the supposedly "next gen" consoles is already hard enough. They can't even get games out at 1080p on PS4 as often as they'd like and now you want to try and scale down to 360-esque hardware for years to come to support Wii U?

No. Something has to give. When Nintendo's Wii U faltered out the gate, publishers saw this as their chance to finally show Nintendo what they thought of their strategy of lagging the entire industry back a generation. When the Wii was huge, they went to the trouble of having different developers produce a game with the same name and often different gameplay built around the Wii's Gamecube-class hardware. But with the Wii U lagging out the gate, third party publishers took advantage of the situation to trim costs by cutting those ports altogether.

And the Wii U's gimmicky focus was on something people didn't want: tablet-style gaming on a tablet of lower quality, in both screen quality and touchscreen accuracy/speed, than the iPad they already had. And that was years ago; tablets have progressed since then, too. Nintendo was counting on $500 tablets dominating the space, but then they released the Wii U in time to face a $200 Google Nexus 7, and 2013 brought an even better Nexus 7 that trounced the Wii U's tablet. Whoops. So you had the option to buy more games for your 360 and get a Nexus 7 (2013) for less money, or buy a Wii U to get a tablet tied eternally to your Wii U. That's a no-brainer.

Suddenly, their gimmick was nothing. Games didn't use it. Not even Nintendo's own published ones. Yet they refuse to get rid of it and trim costs.

So now they're looking to catch up to the rest of the industry with an SoC not unlike the Xbox One's or PS4's, I'd imagine. It's a smart move. A smarter move by Nintendo would be to quit the hardware business altogether and become a third-party publisher, even if they only do it on Steam, iOS, and Android. That way, they don't have to join up with their eternal enemies, but they still get access to bajillions of hungry gamers.

Imagine poor Microsoft and Sony facing an onslaught of Nintendo games on SteamOS. Cue the weeping.
Posted on Reply
#28
alwayssts
Steevo: 5X faster than the Xbone isn't realistic at all; 3X faster isn't realistic.

2X faster is realistic but would cost too much, and frankly it would be so power-hungry and expensive as to be cost-prohibitive.
Do you know what's essentially 5x faster (I would actually say they need 4.5x plus the scaling-inefficiency difference) than an Xbox One? A 980 (or an overclocked 970). Pretty much exactly (I bet that's TOTALLY a coincidence on NVIDIA's part). I simply quoted the AMD equivalent.

That (GM204) is on TSMC 28nm. GF 20nm is almost 3x as dense. 14nm *should* be around 20%+ faster than the bottom end of TSMC 28nm (somewhere around 925 MHz, if you go by salvage 28nm AMD SKUs) at low voltage/high yield. It's surely not impossible, especially when you figure a typical console GPU is around 200 mm² on a cost-efficient process.
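As a rough sanity check on that die-size argument (ballpark figures only; the GM204 area and the density claims above are approximations, not confirmed specs), the arithmetic works out like this:

```python
# Back-of-envelope check: would a GM204-class GPU fit a console die budget after a shrink?
# Figures are approximate public numbers / the claims above, not confirmed specs.
GM204_AREA_MM2 = 398.0          # GTX 970/980 die on TSMC 28 nm, approx.
DENSITY_GAIN_20NM = 3.0         # "GF 20nm is almost 3x as dense" (claim above)
CONSOLE_GPU_BUDGET_MM2 = 200.0  # "typical console GPU is around 200 mm²"

shrunk = GM204_AREA_MM2 / DENSITY_GAIN_20NM
print(f"GM204-class GPU on a ~3x denser node: ~{shrunk:.0f} mm^2")
print(f"Fits a {CONSOLE_GPU_BUDGET_MM2:.0f} mm^2 console budget: {shrunk <= CONSOLE_GPU_BUDGET_MM2}")
```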

If you were to say "but Nintendo puts out tech that's 3 years old at release", I would argue that by the time this console launches, that tech would be roughly three years old anyway.

The main difference essentially has to be that instead of using what was low-end three years before launch, they use what was relatively high-end. That way, even a couple of GPU generations later, it will still be relevant across the board. I would argue there is a good chance a 980-like product will still exist in the GPU stacks of both companies in 2016-2017.
Posted on Reply
#29
Xzibit
cadaveca: Yeah, but DJ, look at the cost of that chip. Like, don't get me wrong, I have a Surface Pro 3, and its 11W CPU is pretty good.

But yes, 1080p development is far more important, IMHO, and getting there requires a rather beefy chip, which the other consoles do not use. On the GPU side, looking at power efficiency and maintaining a <250W power envelope, 50W for the CPU and <175W for the GPU isn't that exciting. A Haswell 4C/8T and a GTX 970 might work, though. AMD cannot compete within a specified power envelope... or can they? :p
You're confusing average power use with power needed. A GTX 970 can draw up to 290 watts by itself during its boost cycles.



Edit: comparison graph to older-gen consoles (image)

Posted on Reply
#30
cadaveca
My name is Dave
Xzibit: You're confusing average power use with power needed. A GTX 970 can draw up to 290 watts by itself during its boost cycles.

Right, but boost on some cards is like 1300 MHz or so. Limit that to 1100 MHz and I think the power draw could be handled. However, point taken. I am eager to see what AMD can cook up on the GPU front, but maybe this idea is why we have the GTX 970/980 and no response from AMD.


Those are rather interesting numbers, too; thanks very much for that graph. Too bad that hardware is absolute crap on the GPU side. I haven't bought this generation of consoles, since I knew 1080p was not going to happen properly.
Posted on Reply
#31
Steevo
alwayssts: Do you know what's essentially 5x faster (I would actually say they need 4.5x plus the scaling-inefficiency difference) than an Xbox One? A 980 (or an overclocked 970). Pretty much exactly (I bet that's TOTALLY a coincidence on NVIDIA's part). I simply quoted the AMD equivalent.

That's on TSMC 28nm. GF 20nm is almost 3x as dense. 14nm *should* be around 20%+ faster than the bottom end of TSMC 28nm (somewhere around 925 MHz, if you go by salvage 28nm AMD SKUs) at low voltage/high yield. It's surely not impossible, especially when you figure a typical console GPU is around 200 mm² on a cost-efficient process.

If you were to say "but Nintendo puts out tech that's 3 years old at release", I would argue that by the time this console launches, that tech would be roughly three years old anyway.

The main difference essentially has to be that instead of using what was low-end three years before launch, they use what was relatively high-end. That way, even a couple of GPU generations later, it will still be relevant across the board. I would argue there is a good chance a 980-like product will still exist in the GPU stacks of both companies in 2016-2017.
Wow, so all we have to do is strap a 980 on it and look at it go!!!

Never mind the thermals, never mind the power consumption, never mind the cost, never mind that Intel already makes CPUs on tiny nodes and they are only about 20-40% more efficient. We are coming to the end of the silicon era and can no longer squeeze performance out of it; let's just say what we wish were true.


The AMD chip in the PS4, for example, is still an off-the-shelf design plus a few extra parts, with very good voltage control; if a die doesn't meet the criteria for the PS4 it may for the Xbone, and vice versa, and it's on a very, very mature process with yields as high as can be expected. It is not, however, equal to the top-of-the-line GPU from 3 years ago (it's essentially a 7850 at a lower clock speed, sharing its memory with the CPU: www.gamespot.com/forums/system-wars-314159282/gpu-specs-comparison-wiiu-vs-xbox-one-vs-ps4-30976500/ and www.techpowerup.com/reviews/AMD/HD_7850_HD_7870/26.html, placing it around the performance of a 5870, which is 6 years old), let alone 5 years ago. So I am not sure what you are saying, other than generalizing what you WISH would happen.
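For reference, a quick theoretical-throughput comparison (launch-spec shader counts and clocks; treat these as ballpark public figures) shows why the console GPUs land in HD 7850 territory:

```python
# Theoretical FP32 throughput = shaders x clock x 2 FLOPs/cycle (FMA); ballpark launch specs.
def tflops(shaders, mhz):
    return shaders * mhz * 2 / 1e6

for name, shaders, mhz in [
    ("PS4 GPU (1152 shaders @ 800 MHz)", 1152, 800),
    ("Xbox One GPU (768 shaders @ 853 MHz)", 768, 853),
    ("Radeon HD 7850 (1024 shaders @ 860 MHz)", 1024, 860),
]:
    print(f"{name}: ~{tflops(shaders, mhz):.2f} TFLOPS")
```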

Essentially we are at the bend of the knee: once past 20nm, power consumption and performance stop improving exponentially, because the traces on the motherboard, I/O and other items begin to need high signal voltages to remain stable, negating the smaller process size and actually causing issues in the chip. That's why Intel had to move to on-die voltage control, so they could effectively terminate the voltage without damaging the more fragile transistors.
Posted on Reply
#32
TheGuruStud
Dj-ElectriC: You wouldn't define the ability to get a 2C/4T Haswell + iGPU at under 5W? I would. And no, NVIDIA isn't far, far away; one far is enough.

AMD has a decent offering, and "decent" has brought enough potato-like trouble to both current-gen consoles, with an array of devs unable to deliver either a 1080p image and/or 60 FPS in most games.
No, no thanks. Hopefully Nintendo will use more powerful tech in the 20nm gen. Maybe not.
Terrible performance isn't AMD's fault. The GPU could have been 3x faster and the CPU clocked over 2 GHz, but Sony and (especially) M$ are cheapskates. AMD made what they asked for.
Posted on Reply
#33
Steevo
TheGuruStud: Terrible performance isn't AMD's fault. The GPU could have been 3x faster and the CPU clocked over 2 GHz, but Sony and (especially) M$ are cheapskates. AMD made what they asked for.
They also could have been liquid-cooled with a 1200W power supply, but in the interest of being reasonable and logical, that wouldn't work any more than a 3X faster chip would, due to power consumption and thermal issues.
Posted on Reply
#34
TheGuruStud
Steevo: They also could have been liquid-cooled with a 1200W power supply, but in the interest of being reasonable and logical, that wouldn't work any more than a 3X faster chip would, due to power consumption and thermal issues.
275W isn't ridiculous (my estimate).
Just doubling the performance would be easy without doubling power consumption.

I have CPUs that consume that and more alone and they're EASY to cool.

Both camps saw how hard a sell the PS3 was at $600 (which was below cost, IIRC) and didn't want to lose sales early on.
Posted on Reply
#35
Steevo
TheGuruStud: 275W isn't ridiculous (my estimate).
Just doubling the performance would be easy without doubling power consumption.

I have CPUs that consume that and more alone and they're EASY to cool.
My laptop power brick puts out 56W under full load when it's running F@H on a shiny Intel i7-4702, and it gets hot to the touch. We are at the point (support.xbox.com/en-US/xbox-one/system/about-power-supply) where consoles require fans in their power supply bricks, and you want to double that (the Xbone's is 112W) and then some?
Posted on Reply
#36
TheGuruStud
Steevo: My laptop power brick puts out 56W under full load when it's running F@H on a shiny Intel i7-4702, and it gets hot to the touch. We are at the point (support.xbox.com/en-US/xbox-one/system/about-power-supply) where consoles require fans in their power supply bricks, and you want to double that (the Xbone's is 112W) and then some?
Stop using bricks. PCs don't use bricks unless they're under 25 watts.

There are also solutions such as the case itself being a passive heatsink.

Consoles are a total joke, obsolete before they're even released. These stupid boxes are brand new and are already stuck at 30 fps. What life do they have? None.
Posted on Reply
#37
TheMailMan78
Big Member
All I had to read was "Next-Generation NES" and my wallet put its teeth on the curb and spread its butt cheeks.
Posted on Reply
#38
NC37
Going away from RISC would break compatibility with past titles unless they use an emulation layer. However, it may pay off, given that the console would be easier to port to. Wii U customers are just going to be very pissed.

This is one of those things where Nintendo should have used common sense and seen the problem coming ahead of time. If you consider when the Wii was popular and when it declined, you'll realize this is literally a generation of kids who got hooked and then grew out of it. The Wii was a fad; that's all it was. Fads die. Nintendo should have seen that and not tried to repeat it with the Wii U.

Whichever way they do it, they can't convince me to buy Nintendo. They constantly ride their old IP and barely release anything new. Mario, Zelda, Metroid... don't care. Been there, done that; they've yet to show anything new. They are the Disneyland of gaming, but it's like their entire park is just a few types of rides. At least Disney caters to multiple age groups. Nintendo doesn't.

They can stay kid-focused, but I'd just like Nintendo to step up and invent something new instead of riding the same formula titles over and over.
Posted on Reply
#39
natr0n
Nintendo should get more serious and less kiddie-friendly, IMO.
Posted on Reply
#40
Steevo
TheGuruStud: Stop using bricks. PCs don't use bricks unless they're under 25 watts.

There are also solutions such as the case itself being a passive heatsink.

Consoles are a total joke, obsolete before they're even released. These stupid boxes are brand new and are already stuck at 30 fps. What life do they have? None.
So essentially you are suggesting building a PC. Steam tried that; look how well it's gone for them. Also, feel free to go lick a 150W halogen bulb and report back how it feels.


What will most likely happen is an evolution toward server/client in-home processing. Much like the Steam in-home streaming client works on almost all hardware, developers will wise up, sell a PC or console-like hardware, and eventually license games to run on X number of devices, either processed remotely at mid-level graphics over high-speed internet or on a local server, with TVs, tablets, and other devices on the receiving end. MS has already shown this to work with remote processing on Xbone hardware.
Posted on Reply
#41
swaaye
Maybe they will just re-use the original NES hardware. ;)
Posted on Reply
#42
Xzibit
cadaveca: Right, but boost on some cards is like 1300 MHz or so. Limit that to 1100 MHz and I think the power draw could be handled. However, point taken. I am eager to see what AMD can cook up on the GPU front, but maybe this idea is why we have the GTX 970/980 and no response from AMD.


Those are rather interesting numbers, too; thanks very much for that graph. Too bad that hardware is absolute crap on the GPU side. I haven't bought this generation of consoles, since I knew 1080p was not going to happen properly.
Tom's Hardware: Nvidia's GPU Boost Accelerates Maxwell
Everything makes sense in theory, but we still want to know how Maxwell achieves better efficiency at this magnitude. Kepler already adjusted the GPU’s voltage quickly and exactly depending on its load and temperature, and AMD’s PowerTune did the same thing as well. It turns out that Maxwell refines the formula further. With its shaders fully utilized, the new architecture's advantage over Kepler practically vanishes. So, Maxwell depends on its superior ability to adjust to changing loads, and, consequently, it’s able to tailor the power consumption even better to the needs of the application in question. The more variance there is, the better Maxwell fares.
To illustrate, let’s take a look at how Maxwell behaves in the space of just 1 ms. Its power consumption jumps up and down repeatedly within this time frame, hitting a minimum of 100 W and a maximum of 290 W. Even though the average power consumption is only 176 W, the GPU draws almost 300 W when it's necessary. Above that, the GPU slows down.
Gaming Power Consumption
These findings further illustrate what we said on the previous page about Maxwell and its ability to regulate GPU voltage faster and more precisely. The gaming-based power consumption numbers show just how much efficiency can be increased if the graphics card matches how much power is drawn to the actual load needed. The version of the GeForce GTX 980 that comes overclocked straight from the factory manages to use significantly less power than the reference version, while offering six percent more performance at the same time.
Stress Test Power Consumption
If the load is held constant, then the lower power consumption measurements vanish immediately. There’s nothing for GPU Boost to adjust, since the highest possible voltage is needed continuously. Nvidia's stated TDP becomes a distant dream. In fact, if you compare the GeForce GTX 980’s power consumption to an overclocked GeForce GTX Titan Black, there really aren’t any differences between them. This is further evidence supporting our assertion that the new graphics card’s increased efficiency is largely attributable to better load adjustment and matching.
All AMD would have to do is implement a better load-matching algorithm.
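As a toy illustration of the average-versus-peak behaviour in that excerpt (invented load values; this is not the real GPU Boost algorithm), a governor can allow brief spikes while holding the running average near a budget:

```python
# Toy boost governor: brief spikes up to the peak limit are allowed, but the draw is pulled
# back whenever the recent average exceeds the sustained budget. All numbers are invented.
import random

PEAK_LIMIT_W = 290
SUSTAINED_BUDGET_W = 180

samples, avg = [], 0.0
for _ in range(1000):                       # ~1 second at 1 ms granularity
    draw = min(random.choice([100, 140, 180, 260, 290]), PEAK_LIMIT_W)
    if avg > SUSTAINED_BUDGET_W:            # throttle: drop clocks/voltage for this sample
        draw = min(draw, SUSTAINED_BUDGET_W)
    avg = 0.99 * avg + 0.01 * draw          # exponential moving average of recent draw
    samples.append(draw)

print(f"peak sample: {max(samples)} W, average: {sum(samples) / len(samples):.0f} W")
```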

Another thing to consider is the multi-use nature of these consoles. Most TVs now are 8-bit or better, even economical ones. NVIDIA still limits GeForce to 8-bit output; Radeons run native 10-bit. When screen sizes vary, you'll want to minimize issues such as banding, which varies greatly with screen size, viewing distance, and one's eyesight. Anything 4K is supposed to be 10-bit 4:4:4, and most 4K video streams announced so far are 10-bit 4:2:2.
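The banding point comes down to how many tonal steps each channel gets; a quick calculation (display processing and dithering change the real-world result):

```python
# Tonal steps per channel and total colors at 8-bit vs 10-bit output.
for bits in (8, 10):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} levels per channel, {levels ** 3:,} total colors")
```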
Posted on Reply
#43
cadaveca
My name is Dave
4K is unimportant to me, and is akin to 720p, but I hear what you are saying.
Posted on Reply
#44
Thefumigator
Dj-ElectriC: You wouldn't define the ability to get a 2C/4T Haswell + iGPU at under 5W? I would. And no, NVIDIA isn't far, far away; one far is enough.
Not when it comes to price. And I don't believe a 2C/4T Haswell at 5 watts is powerful enough for anything other than a tablet running Office. Still, even if it were enough, the NVIDIA GPU isn't stuck together with that Haswell, so it's like having two useless pieces of hardware. Unless you want an expensive console made of both NVIDIA + Intel (it could happen if it were commercially viable).
Dj-ElectriC: AMD has a decent offering, and "decent" has brought enough potato-like trouble to both current-gen consoles, with an array of devs unable to deliver either a 1080p image and/or 60 FPS in most games.
No, no thanks. Hopefully Nintendo will use more powerful tech in the 20nm gen. Maybe not.
As far as I can see, people who own an Xbox or PS4 just adore their console, so I don't even see where those missing FPS are, especially if you don't have a 120Hz TV. What I see are poor developers in a rush, with limited budgets, making bad games. The real engineers were the ones programming a Z80 for the Nintendo Game Boy. Once these consoles mature enough, games will improve. It's a question of time.
Posted on Reply
#45
TRWOV
HisDivineOrder: A smarter move by Nintendo would be to quit the hardware business altogether and become a third-party publisher,
I've read that countless times but have yet to comprehend what Nintendo would gain from that move. The way they do things now is kinda like what Apple does: they control the development environment from top to bottom. Yes, third-party support is almost non-existent, but it was like that with the N64 and (to a lesser extent) the GameCube, and they did just fine.

What has aggravated the Wii U's situation is the long drought between big releases. In the last two generations there was a slow but steady influx of Nintendo games, but now that isn't the case. Of course, third-party games would help pad the holes in the lineup, but Nintendo has demonstrated that they can support a console all by themselves if they keep their shit together. Their problem is that they haven't been able to keep a consistent release schedule; how would going third party solve that? If anything, it would exacerbate it. :confused:
Posted on Reply
#46
Melvis
Bring back cartridges, I say. Use a plug-and-play SSD or something like that instead of CD/DVD/Blu-ray; super-fast loading, just like the old days xD
Posted on Reply
#47
Xzibit
Melvis: Bring back cartridges, I say. Use a plug-and-play SSD or something like that instead of CD/DVD/Blu-ray; super-fast loading, just like the old days xD
DiY Tech Support (embedded media)



I always wondered why there was never a push to go to USB sticks, given how cheap they've gotten.
Posted on Reply
#48
Hilux SSRG
Sadly, at the end of the day Nintendo will stay first-party-centric and lose more market share and gamers.
Posted on Reply
#49
FordGT90Concept
"I go fast!1!11!1!"
Xzibit: I always wondered why there was never a push to go to USB sticks, given how cheap they've gotten.
Because it is still more expensive than discs, and for substantially less capacity too.

I'm wondering why they haven't abandoned discs entirely and gone with an internet/subscription-based model. Physical media are so 20th century.
Posted on Reply
#50
Xzibit
FordGT90Concept: Because it is still more expensive than discs, and for substantially less capacity too.

I'm wondering why they haven't abandoned discs entirely and gone with an internet/subscription-based model. Physical media are so 20th century.
I'm totally against that, but it might be more controllable with Nintendo; I wouldn't hold my breath. Look what it's done to the PC gaming industry and the consoles. Developers become so damn lazy, it becomes a race to yearly profit cycles, and we the consumers end up with half-assed, unfinished games requiring substantial patch downloads, spanning weeks to months, just to play the game as intended.

Now companies are becoming sneakier and greedier than ever before to turn a profit. I fully expect game and patch EULAs to come with a no-sue clause upon installation and use going forward.

Don't trust the cloud; it will piss on you, science says.
Posted on Reply