# Complete Hardware Specs Sheet of Xbox Series X Revealed



## btarunr (Mar 16, 2020)

Microsoft just put out the complete hardware specs sheet of its next-generation Xbox Series X entertainment system. The list of hardware can go toe to toe with any modern gaming desktop, and even at its production scale, we're not sure Microsoft can break even at around $500; it's possibly counting on game and DLC sales to recover some of the costs and turn a profit. To begin with the semi-custom SoC at the heart of the beast: Microsoft partnered with AMD to deploy its current-generation "Zen 2" x86-64 CPU cores. Microsoft confirmed that the SoC will be built on the 7 nm "enhanced" process (very likely TSMC N7P). Its die size is 360.45 mm².

The chip packs 8 "Zen 2" cores, with SMT enabling 16 logical processors, a humongous step up from the 8-core "Jaguar enhanced" CPU driving the Xbox One X. CPU clock speeds are somewhat vague: Microsoft lists 3.80 GHz nominal and 3.66 GHz with SMT enabled. Perhaps the console can toggle SMT somehow (possibly depending on whether a game requests it). There's no word on the CPU's cache sizes.



 

 




The graphics processor is another key component of the SoC, given the lofty design goal of gaming at 4K UHD with real-time ray tracing. This GPU is based on AMD's upcoming RDNA2 graphics architecture, a step up from "Navi" (RDNA) in featuring real-time ray-tracing hardware optimized for DXR 1.1 and support for variable-rate shading (VRS). The GPU features 52 compute units (3,328 stream processors, provided each CU has 64 stream processors in RDNA2). The GPU ticks at an engine clock speed of up to 1825 MHz, and has a peak compute throughput of 12 TFLOPS (not counting the CPU). The display engine supports resolutions of up to 8K, even though the console's own performance target is 4K at 60 frames per second, with up to 120 FPS. Variable refresh rate is supported.
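For reference, the headline 12 TFLOPS figure falls straight out of the CU count and engine clock. A quick sketch; the 64 stream processors per CU and the 2 FLOPs per clock (one fused multiply-add) are assumptions carried over from RDNA, not figures Microsoft itemized:

```python
# Peak FP32 compute from the published Xbox Series X GPU specs.
# Assumptions (not officially itemized): 64 stream processors per CU as in
# RDNA, and 2 FLOPs per clock per stream processor (one fused multiply-add).
cus = 52
sp_per_cu = 64
flops_per_clock = 2
clock_hz = 1825e6            # 1825 MHz engine clock

stream_processors = cus * sp_per_cu
tflops = stream_processors * flops_per_clock * clock_hz / 1e12
print(stream_processors, round(tflops, 3))   # 3328 12.147
```

Which lands right on the 12 TFLOPS Microsoft quotes (12.147 TFLOPS, to be exact).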

The memory subsystem is similar to what we reported earlier today: a 320-bit GDDR6 memory interface holding 16 GB of memory (mixed chip densities). It's becoming clear that Microsoft isn't implementing a hUMA common memory pool approach. 10 GB of the 16 GB runs at 560 GB/s bandwidth, while 6 GB of it runs at 336 GB/s. Storage is another area receiving big hardware uplifts: the Xbox Series X features a 1 TB NVMe SSD with a 2,400 MB/s peak sequential transfer rate, and an option for an additional 1 TB of NVMe storage through an expansion module. External storage devices are supported, too, over 10 Gbps USB 3.2 Gen 2. The console is confirmed to feature a Blu-ray drive that supports 4K UHD Blu-ray playback. All these hardware specs combine toward what Microsoft calls the "Xbox Velocity Architecture." Microsoft is also working toward improving the input latency of its game controllers.
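Both bandwidth figures are consistent with a single per-pin data rate across the whole bus. A quick sketch; the 14 Gbps GDDR6 data rate is our inference rather than an official figure, but it fits both pools:

```python
# Back-of-the-envelope GDDR6 bandwidth check. The 14 Gbps per-pin data rate
# is an assumption: Microsoft did not state it, but it matches both figures.
def gddr6_bandwidth_gb_s(bus_width_bits: int, pin_rate_gbps: float = 14) -> float:
    """Peak bandwidth in GB/s for a GDDR6 interface of the given width."""
    return bus_width_bits * pin_rate_gbps / 8

print(gddr6_bandwidth_gb_s(320))   # full 320-bit bus -> 560.0 GB/s (10 GB pool)
print(gddr6_bandwidth_gb_s(192))   # 192-bit slice    -> 336.0 GB/s (6 GB pool)
```

In other words, the slower pool behaves like a 192-bit slice of the same 320-bit interface, not a separate, slower memory type.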

*View at TechPowerUp Main Site*


----------



## dicktracy (Mar 16, 2020)

Much better than the 5700 XT, which doesn't have hardware RT. I'll laugh if people still buy that paperweight right now.


----------



## TheLostSwede (Mar 16, 2020)

All it needs now is wide keyboard/mouse support and this could be a decent gaming PC...
I guess there will be a lot of third-party custom SSD upgrades for this thing as well, since it'll most likely require some special kind of housing for the SSD to be easy to slot into the system.


----------



## The Quim Reaper (Mar 16, 2020)

10Gb of fast RAM, 6Gb of slow RAM...

The Nvidia 970 designers are smiling.


----------



## Kissamies (Mar 16, 2020)

The Quim Reaper said:


> 10Gb of fast RAM, 6Gb of slow RAM...
> 
> The Nvidia 970 designers are smiling.


That slower part still has plenty of bandwidth. Unlike the 970's slow 512 MB, which was on a 32-bit bus.


----------



## phanbuey (Mar 16, 2020)

Looks like a beast.


----------



## ratirt (Mar 16, 2020)

Never seen console specs like this before. It really matches or surpasses the ordinary household computer. At least most of them. Nice


----------



## JAB Creations (Mar 16, 2020)

> I’ll laugh if people will still buy that paper weight right now.



 Please wake me when humanity has purged the fragile mentality of egotism.

Anyway, I like the specs thus far. The extra NVMe expansion module is a great way to prevent massive games from sucking up all the space, especially if developers are going to release 4K games. Allowing a second drive and using 6 GB for the system are obviously ways to contend with the issue of cost, and I like that approach.


----------



## Easo (Mar 16, 2020)

More powerful than most gaming PCs out there. I can now cut the PC Master Race jokes for a while.


----------



## ZoneDymo (Mar 16, 2020)

dicktracy said:


> Much better than the 5700 XT, which doesn't have hardware RT. I'll laugh if people still buy that paperweight right now.



Dude, slow down with the unintelligent comments, I can't keep up


----------



## notb (Mar 16, 2020)

ratirt said:


> Never seen console specs like this before. It really matches or surpasses the ordinary household computer. At least most of them. Nice


Still too expensive. The specs of the cheaper model are what will actually matter to the majority of buyers.


----------



## ratirt (Mar 16, 2020)

JAB Creations said:


> Please wake me when humanity has purged the fragile mentality of egotism.
> 
> Any way I like the specs thus far. The extra NVMe expansion module is a great way to prevent massive games from sucking up all the space, especially if developers are going to release 4K games. Allowing a second drive and using 6GB for the system are obviously to contend with the issue of cost and I like that approach.


True that. I'm curious what the power requirements are for this stuff and whether any extreme cooling is necessary. Probably not, but when you look at this monster, damn.


notb said:


> Still too expensive. The specs of the cheaper model are what will actually matter to the majority of buyers.


If it gets you 4K 60 FPS, I think the $500 price is OK. Try to get 4K on a PC at the same price and you'll be blown away by how much you would need.


----------



## HwGeek (Mar 16, 2020)

I hope MS will let us use Win10 on it, so we could play or use it as a PC ("Desktop Console").
I will buy it on day one if so.


----------



## ratirt (Mar 16, 2020)

HwGeek said:


> I hope MS will let us use Win10 on it, so we could play or use it as a PC ("Desktop Console").
> I will buy it on day one if so.


I wonder what Sony has in store with the PS5. Will it be faster than this Xbox? Either way, not bad.


----------



## dicktracy (Mar 16, 2020)

ZoneDymo said:


> Dude, slow down with the unintelligent comments, I can't keep up


Next-gen consoles have ray tracing. The 5700 XT's obsolete tech is built for last-gen console games instead. Try to keep up with reality!


----------



## oxrufiioxo (Mar 16, 2020)

ratirt said:


> I wonder what Sony has in store with the PS5. Will it be faster than this Xbox? Either way, not bad.



This is what I'm wondering as well. Rumors are all over the place, but it's kind of funny that the Xbox actually ended up on the higher end of what the rumors were saying. Hopefully the PS5 is just as impressive. My wife likes all the Sony exclusives, so I have no choice but to buy one either way.


----------



## kings (Mar 16, 2020)

With these specs, it will probably cost somewhere between $599 and $649.

I'm not seeing MS selling this console for the initial price of Xbox One X, unless they lose money.


----------



## ratirt (Mar 16, 2020)

oxrufiioxo said:


> This is what I'm wondering as well. Rumors are all over the place, but it's kind of funny that the Xbox actually ended up on the higher end of what the rumors were saying. Hopefully the PS5 is just as impressive. My wife likes all the Sony exclusives, so I have no choice but to buy one either way.


I was always PS5, but seeing what Xbox has in store I might get this one. Hope Sony won't disappoint


kings said:


> With these specs, it will probably cost somewhere between $599 and $649.
> 
> I'm not seeing MS selling this console for the initial price of Xbox One X, unless they lose money.


If it is $600 or $650 it is still not bad. Consider how much money you would have to spend to get this on a PC; just the graphics card would cost $1200. Anyway, time will tell


----------



## r.h.p (Mar 16, 2020)

Just thinking: a console with this much onboard CPU and GPU power must produce a lot of heat, hence a decent heatsink and fan system must be needed, right?
My PS3 has always needed constant fan intake and exhaust vacuuming to keep the fan at normal speed (run run run...)


----------



## oxrufiioxo (Mar 16, 2020)

ratirt said:


> I was always PS5, but seeing what Xbox has in store I might get this one. Hope Sony won't disappoint
> 
> If it is $600 or $650 it is still not bad. Consider how much money you would have to spend to get this on a PC; just the graphics card would cost $1200. Anyway, time will tell



It seems close to a 2080 at this point, maybe a little faster, going by how they demonstrated Gears 5 running at higher than PC ultra settings and matching it in performance without proper optimization... impressive stuff. I'm guessing AMD will have a GPU with a similar (maybe 56) CU count at around $499 when this launches, still making the console a pretty awesome deal even if it comes in at $100 more.

I'm actually just as surprised by the CPU: 3.6 GHz across 16 threads on a console is just crazy to me. Three years ago on the desktop, people were still buying quads without hyperthreading.


----------



## notb (Mar 16, 2020)

ratirt said:


> If it gets you 4K 60 FPS, I think the $500 price is OK. Try to get 4K on a PC at the same price and you'll be blown away by how much you would need.


Most buyers don't look at it as a cheaper alternative to a gaming desktop, because they don't consider a gaming desktop at all. 
There's no "value". There's no "how many fps you can get for your money" philosophy.
It's a console - a box that makes gaming possible. Almost like a household appliance. But it suddenly got more expensive.

I bet you wouldn't be happy if washing machines got 20% more expensive with next generation - even if I tried to convince you they're still way more efficient than hand washing.

Also, I can't believe MS would accept the risk of selling this for breakeven price. People willing to pay more for the top product, including the group that games in 4K on PCs, will accept $600+. It's still cheap compared to their flagship smartphones.
Bulk of clients will get the new budget model or hold on to / buy an Xbox One X - there's a big chance it will support future games.


----------



## TheLostSwede (Mar 16, 2020)

dicktracy said:


> Next-gen consoles have ray tracing. The 5700 XT's obsolete tech is built for last-gen console games instead. Try to keep up with reality!


Having watched that, the "expansion SSD" really does look like a proprietary standard.
Thanks, M$, for creating yet another memory standard that will be sold at premium prices...


----------



## milewski1015 (Mar 16, 2020)

dicktracy said:


> Much better than the 5700 XT, which doesn't have hardware RT. I'll laugh if people still buy that paperweight right now.



It's stated that the GPU is based on the upcoming RDNA2 architecture - of course it's going to be an improvement over the current RDNA architecture. 



dicktracy said:


> Next-gen consoles have ray tracing. The 5700 XT's obsolete tech is built for last-gen console games instead. Try to keep up with reality!



So because something doesn't support RT - a feature only supported by a small list of games that severely decreases performance for what is, in my opinion, a barely noticeable visual difference - that makes it a paperweight?


----------



## notb (Mar 16, 2020)

HwGeek said:


> I hope MS will let us use Win10 on it, so we could play or use it as a PC ("Desktop Console").
> I will buy it on day one if so.


Xbox is running the Win10 kernel already. It even uses Hyper-V: games and apps are run in containers.

You mean the full Windows 10? With full access to software? Unlikely.
If mandatory, this would ruin the console experience (clean, easy to use with a controller).
If optional, it would seriously affect the PC market. I don't think MS partners would accept that.

Xbox One should also be able to run full Windows and it didn't happen.
As of today there isn't even an RDP client for Xbox.

That said, it seems we'll see a lot more software for MS ecosystem - including (finally) integration with Office Online.


----------



## ppn (Mar 16, 2020)

560 GB/s for the 10 GB of GPU-optimal memory; the only problem is the CPU will eat a lot of that. 10 GB - 5.5 = 4.5 GB, 560 - 336 = 224, so the GPU is left with 4.5 GB at 224 GB/s.

How about NO: give us 32 GB of HBM2E at 1 TB/s, a 5 nm GPU-only die maxed out to 420 mm², and a separate 60 mm² 8-core Zen 3 with its own 24 GB of DDR5, or forget about it.


----------



## FordGT90Concept (Mar 16, 2020)

The Quim Reaper said:


> 10Gb of fast RAM, 6Gb of slow RAM...
> 
> The Nvidia 970 designers are smiling.


The PlayStation 4 had a similar setup, where the CPU got significantly less bandwidth from the GDDR chips than the GPU did.

5700 XT = 9.754 TFLOPS 

This chip is 12 TFLOP...






This thing is more than double the Xbox One X.


Also, 8K support was baked into Navi. It's likely there for 8K web streams or maybe Blu-rays with a firmware update. Maybe ATSC 3.0 could handle it someday too. Point is, the GPU is ready and able to decode it.



Hype rising for RDNA2.  If they can manage this kind of performance with semi-custom then imagine what it can do on an AIB.  I'm glad I waited to upgrade.


----------



## T4C Fantasy (Mar 16, 2020)

To put it extremely simply:
Memory config:
12288 MB + 4096 MB
10240 MB @ 560 GB/s, 6144 MB @ 336 GB/s
320-bit / 192-bit
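That split can be sanity-checked with a small sketch. The chip layout assumed here (six 2 GB chips plus four 1 GB chips, each on a 32-bit channel) is an inference consistent with those numbers, not something Microsoft has spelled out:

```python
# Sanity check of the two-pool memory split. Assumed layout: ten 32-bit
# GDDR6 chips, six of 2 GB and four of 1 GB (inferred, not official).
chips_mb = [2048] * 6 + [1024] * 4

# Fast pool: the first 1 GB of every chip, striped across all ten (320-bit).
fast_pool_mb = 1024 * len(chips_mb)
# Slow pool: the remainder, present only on the six larger chips (192-bit).
slow_pool_mb = sum(size - 1024 for size in chips_mb)
slow_bus_bits = 32 * sum(1 for size in chips_mb if size > 1024)

print(fast_pool_mb, slow_pool_mb, slow_bus_bits)   # 10240 6144 192
```

The numbers line up: 10 GB striped across the full 320-bit bus, 6 GB confined to a 192-bit slice.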


----------



## FordGT90Concept (Mar 16, 2020)

Ohhhhh, I think I see it: the 192-bit chips share bandwidth with the CPU where the 128-bit chips are dedicated.

GPU is effectively 10 GiB at 128-bit and CPU is effectively 6 GiB at 64-bit.


----------



## xkm1948 (Mar 16, 2020)

Meh, if I were ever to game on consoles it would be PlayStation. Xbox just doesn't have enough good exclusive titles.


----------



## T4C Fantasy (Mar 16, 2020)

FordGT90Concept said:


> Ohhhhh, I think I see it: the 192-bit chips share bandwidth with the CPU where the 128-bit chips are dedicated.
> 
> GPU is effectively 10 GiB at 128-bit and CPU is effectively 6 GiB at 64-bit.


I fixed my message, it was incorrect; 2 separate memory configs:
12288 MB + 4096 MB
10240 MB @ 560 GB/s, 6144 MB @ 336 GB/s
320-bit / 192-bit


----------



## Valantar (Mar 16, 2020)

Have to say this thing looks _really_ good. I'll be buying both new consoles regardless, but hot damn, this is (so far) the one that truly tickles my fancy. Love the design too.


----------



## rvalencia (Mar 16, 2020)

FordGT90Concept said:


> The PlayStation 4 had a similar setup, where the CPU got significantly less bandwidth from the GDDR chips than the GPU did.
> 
> 5700 XT = 9.754 TFLOPS
> 
> ...


1825 MHz with 52 CUs yields about 12.147 TFLOPS FP32. According to DF, the RT hardware has a ~13 TFLOPS equivalent.

XSX GPU has about 25 TF.


----------



## FordGT90Concept (Mar 16, 2020)

T4C Fantasy said:


> I fixed my message, it was incorrect; 2 separate memory configs:
> 12288 MB + 4096 MB
> 10240 MB @ 560 GB/s, 6144 MB @ 336 GB/s
> 320-bit / 192-bit


So apparently the slower bus is the one that's tied to the CPU and the GPU is aware of it so it can start pulling from that pool too if necessary.


----------



## ShurikN (Mar 16, 2020)

ppn said:


> 560 GB/s for the 10 GB of GPU-optimal memory; the only problem is the CPU will eat a lot of that. 10 GB - 5.5 = 4.5 GB, 560 - 336 = 224, so the GPU is left with 4.5 GB at 224 GB/s.


The way I understood it, the fastest memory, the 10 GB pool, is reserved for the GPU. The remaining (slower) 6 GB is split between the OS and CPU (and maybe the GPU if it needs more). Even if the CPU eats into the GPU's memory pool, it won't be by the amount you mentioned.
I mean, upcoming titles in 4K with RT will definitely not run with 4.5 GB of VRAM.


----------



## notb (Mar 16, 2020)

xkm1948 said:


> Meh, if I were ever to game on consoles it would be PlayStation. Xbox just doesn't have enough good exclusive titles.


I guess it depends what kind of games you play.





Category:PlayStation 4-only games - Wikipedia (en.wikipedia.org)
				



There are maybe 3 games on this list that I'd want to play at all: Spider-Man, Gran Turismo and God of War (maybe).
As for Xbox exclusives, I'd only miss Forza Horizon (which is my most-played console game so far).

Vast majority of popular games are available on both platforms. I doubt exclusives are a serious factor for many.

For me Xbox wins because of the ecosystem.
Gaming-wise: Game Pass and Live are both good, Play Anywhere is nice as well and controller support in Windows is excellent. Frankly, I also prefer the controller itself.
But there's much more to it. I love that there's an app for OneDrive (and Dropbox as well). I occasionally use Skype too.
There's also very good integration with Assistants (not just Cortana) and automation frameworks (I use IFTTT extensively).
And there's a chance they'll go further with this philosophy.
If they do - I may even buy this stupid next-gen flowerpot.

PlayStation is very focused on gaming, which may be attractive for some, but for me it just doesn't beat the flexibility of Xbox.


----------



## dirtyferret (Mar 16, 2020)

A $500 Xbox console with the PS5 coming in at $450+; it will take two years before there is any significant market penetration.



notb said:


> Most buyers don't look at it as a cheaper alternative to a gaming desktop, because they don't consider a gaming desktop at all.
> There's no "value". There's no "how many fps you can get for your money" philosophy.
> It's a console - a box that makes gaming possible. Almost like a household appliance. But it suddenly got more expensive.
> 
> ...


+1


----------



## Valantar (Mar 16, 2020)

rvalencia said:


> 1825 MHz with 52 CUs yields about 12.147 TFLOPS FP32. According to DF, the RT hardware has a ~13 TFLOPS equivalent.
> 
> XSX GPU has about 25 TF.


That's a misunderstanding. It has the _equivalent_ of 25TF _if the RTRT was done purely in shaders_. The RTRT hardware can't do regular shader workloads, and thus does not translate back into FP32 TFLOPS.



notb said:


> Most buyers don't look at it as a cheaper alternative to a gaming desktop, because they don't consider a gaming desktop at all.
> There's no "value". There's no "how many fps you can get for your money" philosophy.
> It's a console - a box that makes gaming possible. Almost like a household appliance. But it suddenly got more expensive.
> 
> I bet you wouldn't be happy if washing machines got 20% more expensive with next generation - even if I tried to convince you they're still way more efficient than hand washing.


Agreed. Though the difference between $400 and $500 isn't massive, it's still there, and $500 has scared people off before - hello PS3 and Xbox One! - though those were both inferior in performance as well, making it a double whammy of inferiority that this very likely won't be.



notb said:


> Also, I can't believe MS would accept the risk of selling this for breakeven price. People willing to pay more for the top product, including the group that games in 4K on PCs, will accept $600+. It's still cheap compared to their flagship smartphones.
> Bulk of clients will get the new budget model or hold on to / buy an Xbox One X - there's a big chance it will support future games.


Here, though, you're off the rails. There's no risk in selling at break-even, as game licensing easily makes up for any lost profits there. $10 per title sold, plus a portion of all in-game purchases, plus Xbox Live and Game Pass - that all goes a long way quickly. They have near-zero risk in selling this at break-even or even at a slight loss. PCs are quite different here, as it's an open system: no game licences, no subscriptions, no cut of in-game purchases, etc., so all profits must be made ahead of time.


----------



## HD64G (Mar 16, 2020)

A great gaming machine with a cut-down semi-big Navi gen 2 that will allow 60 FPS at 4K and 120 FPS at 1440p. What's not to like for $600? Its GPU alone could be valued at more than that, compared to the cards on sale today.


----------



## dirtyferret (Mar 16, 2020)

Valantar said:


> Agreed. Though the difference between $400 and $500 isn't massive, it's still there, and $500 has scared people off before - hello PS3 and Xbox One! - though those were both inferior in performance as well, making it a double whammy of inferiority that this very likely won't be.



just to back this up





Why the PlayStation 4 Triumphed Over the Xbox One | ExtremeTech (www.extremetech.com)

Sony has decisively won this round of the console wars. Why? ...
				






HD64G said:


> A great gaming machine with a cut-down semi-big Navi gen 2 that will allow 60 FPS at 4K and 120 FPS at 1440p. What's not to like for $600? Its GPU alone could be valued at more than that, compared to the cards on sale today.



Those are meaningless numbers to console gamers; they only care if it will play the latest Madden, FIFA or CoD game on their 4K TV with shinier graphics than the previous generation.


----------



## Tartaros (Mar 16, 2020)

The Quim Reaper said:


> 10Gb of fast RAM, 6Gb of slow RAM...
> 
> The Nvidia 970 designers are smiling.



Memory with different speeds has always been used in computing since its dawn, and everything is tailored for specific needs. Having different memories with different speeds is not the problem; hiding it for commercial purposes is.


----------



## HD64G (Mar 16, 2020)

dirtyferret said:


> just to back this up
> 
> 
> 
> ...


Smart marketing can change the perspective of what a console can become with the next-gen ones. And then, the $600 monster will seem VERY cheap for what it will offer.


----------



## MxPhenom 216 (Mar 16, 2020)

Anyone else question how long the system can sit at those clocks without heat concerns? Even if it's 7 nm, that's still a fairly big chip to be cooled. Though they are putting a huge heatsink on the thing.


----------



## FordGT90Concept (Mar 16, 2020)

Like a second.  Nominal clockspeed is probably much lower.

...unless n7+ is really that good...


----------



## notb (Mar 16, 2020)

MxPhenom 216 said:


> Anyone else question how long the system can sit at those clocks without heat concerns? Even if it's 7 nm, that's still a fairly big chip to be cooled. Though they are putting a huge heatsink on the thing.


The beauty of gaming on a console is: you don't have to give a f... In fact: you shouldn't!
If the first thing you think about is "but what about overheating?", you've literally wasted a big chunk of the premium you're paying to Microsoft/Sony/Nintendo.
What you should think about is: "will it look good in my living room?" and "is the controller comfortable?"

It's a black box. It was designed and (hopefully: well) tested. We should assume it can do its part.
And if it can't - there's very little you can do to help.
And there are only 2 makers to choose from. And if you already got a console but you don't like it, it's hard to move to the competition, because you can't take your games with you.


----------



## MxPhenom 216 (Mar 16, 2020)

notb said:


> The beauty of gaming on a console is: you don't have to give a f... In fact: you shouldn't!
> If the first thing you think about is "but what about overheating?", you've literally wasted a big chunk of the premium you're paying to Microsoft/Sony/Nintendo.
> What you should think about is: "will it look good in my living room?" and "is the controller comfortable?"
> 
> ...



Well, I'm an engineer, so I do consider it. Even if you don't have to, I can't help my brain.


----------



## Super XP (Mar 16, 2020)

notb said:


> Still too expensive. The specs of the cheaper model are what will actually matter to the majority of buyers.


What price is the Xbox Series X going to sell for? I don't see a price associated with it at the moment. And you cannot compare a console-based system with an actual computer in terms of cost, as consoles cost a lot less to make.


----------



## MxPhenom 216 (Mar 16, 2020)

Super XP said:


> What price is the Xbox Series X going to sell for? I don't see a price associated with it at the moment. And you cannot compare a console-based system with an actual computer in terms of cost, as consoles cost a lot less to make.



It'll probably be close to $600, I'm thinking.


----------



## TheLostSwede (Mar 16, 2020)

Seagate's Xbox Series X storage card has 1TB of space, but no price | Engadget (www.engadget.com)

Microsoft only just revealed that the Xbox Series X will support "expansion cards" that allow for faster storage, but it already has an accessory maker lined up.


----------



## Valantar (Mar 16, 2020)

MxPhenom 216 said:


> Anyone else question how long the system can sit at those clocks without heat concerns? Even if it's 7 nm, that's still a fairly big chip to be cooled. Though they are putting a huge heatsink on the thing.





FordGT90Concept said:


> Like a second.  Nominal clockspeed is probably much lower.
> 
> ...unless n7+ is really that good...


DF reported that MS were very explicit in saying (and reiterating and underscoring) that clocks are entirely fixed. No boost, no throttling. Period. Reported clocks are 24/7 clocks. The thinking is that this is a console, so it should have X performance no matter what, as performance shouldn't be down to user configurations or tweaks like in the PC space. A good call if you ask me.


----------



## ARF (Mar 16, 2020)

Valantar said:


> DF reported that MS were very explicit in saying (and reiterating and underscoring) that clocks are entirely fixed. No boost, no throttling. Period. Reported clocks are 24/7 clocks. The thinking is that this is a console, so it should have X performance no matter what, as performance shouldn't be down to user configurations or tweaks like in the PC space. A good call if you ask me.



Expect this to be a hot (pun intended) box, and noisy if it has some fans to chill it down a bit.

At N7 node, that performance needs wattage.


----------



## TheLostSwede (Mar 16, 2020)

ARF said:


> Expect this to be a hot (pun intended) box, and noisy if it has some fans to chill it down a bit.
> 
> At N7 node, that performance needs wattage.


It has one fan. I take it the pictures didn't load for you?
There are some helpful animations if you click on the link below.








Xbox Series X | Xbox (www.xbox.com)

Discover the fastest, most powerful Xbox ever with the Xbox Series X.


----------



## Valantar (Mar 16, 2020)

HD64G said:


> Smart marketing can change the perspective of what a console can become with the next-gen ones. And then, the $600 monster will seem VERY cheap for what it will offer.


Nah, I don't think so. As was mentioned above, a console is a gaming appliance. Appliances for mass markets have rather strict entry points. And $600 with no games included has previously proven to be too steep for mass-market adoption. And people's disposable income hasn't increased much since then. I would say $300 is the sweet spot for consoles, with $400 being a good launch price and $500 being acceptable if (and only if) it's a batshit-crazy aspirational purchase with some real X-factor. $600 will be effectively DOA, as you'd never get the volume off the ground to bring prices down or reach the critical mass needed to make the related services really good.


ARF said:


> Expect this to be a hot (pun intended) box, and noisy if it has some fans to chill it down a bit.
> 
> At N7 node, that performance needs wattage.


As @TheLostSwede said above, a single large (and thick, looks like 35-40mm) fan on top pulling air out. Almost an ideal cooling layout, should perform well. And that heatsink is huge too.


----------



## TheLostSwede (Mar 16, 2020)

Valantar said:


> As @TheLostSwede said above, a single large (and thick, looks like 35-40mm) fan on top pulling air out. Almost an ideal cooling layout, should perform well. And that heatsink is huge too.


It's a 130mm fan, but yeah, it looks really thick too.


----------



## Valantar (Mar 16, 2020)

TheLostSwede said:


> It's a 130mm fan, but yeah, it looks really thick too.


Hm, 130mm is interesting. Though my first thought when I saw the layout was "here come the Noctua fan swaps!" I have no doubt they will be attempted no matter the size.


----------



## Od1sseas (Mar 16, 2020)

milewski1015 said:


> It's stated that the GPU is based on the upcoming RDNA2 architecture - of course it's going to be an improvement over the current RDNA architecture.
> 
> 
> 
> So because something doesn't support RT - a feature only supported by a small list of games that severely decreases performance for what is, in my opinion, a barely noticeable visual difference - that makes it a paperweight?



"Barely makes a difference". Lmfao. People said same shit like that when tessellation first appeared and now look how important it is for games to look realistic. Educate yourself. Ray Tracing is the future and the difference is noticable.


----------



## ARF (Mar 16, 2020)

Od1sseas said:


> "Barely makes a difference". Lmfao. People said same shit like that when tessellation first appeared and now look how important it is for games to look realistic. Educate yourself. Ray Tracing is the future and the difference is noticable.



Games are not designed to look photo-realistic in the first place. This alone cancels out the whole idea of using ray-tracing in gaming development. Another reason is the exponential need for hardware resources for effects which can be simulated and similar effects achieved using the old rasterization rendering.


----------



## dirtyferret (Mar 16, 2020)

HD64G said:


> Smart marketing can change the perspective of what a console can become with the next-gen ones. And then, the $600 monster will seem VERY cheap for what it will offer.



Marketing might help choosing an xbox over a ps5 or vice versa but it won't be putting $600 into the wallet of people who can't afford it.


----------



## Od1sseas (Mar 16, 2020)

ARF said:


> Games are not designed to look photo-realistic in the first place. This alone cancels out the whole idea of using ray-tracing in gaming development. Another reason is the exponential need for hardware resources for effects which can be simulated and similar effects achieved using the old rasterization rendering.


Screen-space effects are not even close to ray tracing; there is no way they can be simulated using rasterization. Reflections are one example.
Games are designed to be photorealistic, otherwise we would have been stuck with 1990 graphics with 2 polygons max.


----------



## ARF (Mar 16, 2020)

Od1sseas said:


> Screen-space effects are not even close to ray tracing; there is no way they can be simulated using rasterization. Reflections are one example.
> Games are designed to be photorealistic, otherwise we would have been stuck with 1990 graphics with 2 polygons max.



You don't need ray-tracing. See following NFS images:





https://twitter.com/i/web/status/1002013665963393024

https://twitter.com/i/web/status/860150800605212672


----------



## X828 (Mar 16, 2020)

Thought AMD had the contract for both Xbox and PlayStation... did that change? If not, all of you saying you "look forward to seeing what the PS5 has to offer" need look no further than the specs in this article. Same specs, different shell.


----------



## iO (Mar 16, 2020)

ARF said:


> You don't need ray-tracing. See following NFS images:
> 
> 
> 
> ...



Horrible example of "RT is useless" if SSR can't draw any reflections of the car's underbody...


----------



## HD64G (Mar 16, 2020)

dirtyferret said:


> Marketing might help someone choose an Xbox over a PS5 or vice versa, but it won't put $600 into the wallets of people who can't afford it.


Who is talking about forcing people to bleed their wallets just for fun? We are talking about people who have enough money to buy a $1500 PC or would prefer a $600 console that performs at least equally.

Something techy after watching videos about the Xbox's APU specs: if that 52-CU GPU holds a constant 1.8 GHz at less than 150 W (a simple assumption, since with another 50 W the APU would reach 200 W in total, which is a sensible limit) while possibly matching 2080 Super performance, RDNA2 will be a great breakthrough in compute efficiency in general. My 5c.
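The 12 TFLOPS figure and an efficiency estimate under this assumption can be worked out directly. A minimal sketch: the 52 CUs and 1.825 GHz clock are from the spec sheet, 64 shaders per CU with 2 FLOPs per cycle are standard RDNA figures, and the 150 W GPU-only budget is just the assumption from the post above, not an official number.

```python
# Back-of-the-envelope FP32 throughput and efficiency for the XSX GPU.
# 64 shaders/CU and 2 FLOPs per shader per cycle (one FMA) are standard
# RDNA figures; the 150 W GPU-only budget is an assumption from the post
# above, not an official spec.

CUS = 52
SHADERS_PER_CU = 64
FLOPS_PER_CYCLE = 2      # fused multiply-add = 2 FLOPs per shader per clock
CLOCK_GHZ = 1.825
GPU_POWER_W = 150        # assumed GPU-only power budget

tflops = CUS * SHADERS_PER_CU * FLOPS_PER_CYCLE * CLOCK_GHZ / 1000
gflops_per_watt = tflops * 1000 / GPU_POWER_W

print(f"{tflops:.2f} TFLOPS")             # 12.15 TFLOPS, matching the spec sheet
print(f"{gflops_per_watt:.0f} GFLOPS/W")  # ~81 GFLOPS/W under the 150 W assumption
```

If the GPU really stays under 150 W at that clock, ~81 GFLOPS/W would indeed be a sizable step up over current RDNA desktop cards.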


----------



## MxPhenom 216 (Mar 16, 2020)

ARF said:


> You don't need ray-tracing. See following NFS images:
> 
> 
> 
> ...



Ray tracing is so much more than just reflections, you nitwits. Saying we don't need this tech (something that could significantly advance graphics in the future) is advocating for mediocrity. Ray tracing is the closest we are to simulating the behavior of light in an environment. Screen-space effects aren't doing that at all.

Hell, just the shadow improvements alone in the new CoD are enough to keep me interested in where this could go, and I hope hardware ray tracing becomes a standard.


----------



## Od1sseas (Mar 16, 2020)

ARF said:


> You don't need ray-tracing. See following NFS images:
> 
> 
> 
> ...


Are you trolling me right now? Do you even know how Screen Space Reflections work?


----------



## Darmok N Jalad (Mar 16, 2020)

FordGT90Concept said:


> Ohhhhh, I think I see it: the 192-bit chips share bandwidth with the CPU where the 128-bit chips are dedicated.
> 
> GPU is effectively 10 GiB at 128-bit and CPU is effectively 6 GiB at 64-bit.


You know, this is what I was thinking. There's no reason to believe the memory is all shared--maybe they found it easier to design it this way, or it gives better performance. Maybe one memory controller can't rule them all in this case.
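FordGT90Concept's split quoted above can actually be checked with simple bus-width arithmetic. A quick sketch, assuming 14 Gbps GDDR6 chips (a per-pin data rate inferred from the published 560 GB/s figure, not something Microsoft stated outright):

```python
# Peak bandwidth of the XSX's asymmetric GDDR6 pools from bus width.
# The 14 Gbps per-pin data rate is inferred from the published numbers,
# not officially confirmed.

DATA_RATE_GBPS = 14  # Gbps per pin

def bandwidth_gbs(bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s for a GDDR6 interface of the given width."""
    return bus_width_bits * DATA_RATE_GBPS / 8

fast_pool = bandwidth_gbs(320)  # all ten 32-bit chips -> the 10 GB pool
slow_pool = bandwidth_gbs(192)  # only six of the chips -> the extra 6 GB pool

print(fast_pool, slow_pool)  # 560.0 336.0 -- matching the article's figures
```

The results line up exactly with the article's 560 GB/s and 336 GB/s, which is why the ten-chips/six-chips interpretation is plausible.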


----------



## ARF (Mar 16, 2020)

MxPhenom 216 said:


> Ray tracing is so much more than just reflections, you nitwits. Saying we don't need this tech (something that could significantly advance graphics in the future) is advocating for mediocrity. Ray tracing is the closest we are to simulating the behavior of light in an environment. Screen-space effects aren't doing that at all.
> 
> Hell, just the shadow improvements alone in the new CoD are enough to keep me interested in where this could go, and I hope hardware ray tracing becomes a standard.



The problem is that you need a hell of a lot of transistors, which you currently can't get because you're stuck on the N7 process, with further nodes not promising much advancement.

You need quantum computing for ray-tracing to have real value in reality.


----------



## gamefoo21 (Mar 16, 2020)

X828 said:


> Thought AMD had the contract for both Xbox and PlayStation... did that change? If not, all of you saying you "look forward to seeing what the PS5 has to offer" - look no further than the specs in this article. Same specs, different shell.



AMD does, but they basically build the CPU, GPU and all that to MS's and Sony's specifications.

Look at the PS4 Pro vs One X. One vs PS4.


----------



## Kissamies (Mar 16, 2020)

Personally, I can't even tell from two gameplay videos whether RT is on or not. Not the best invention since sliced bread, IMO. I see more difference between high and ultra settings in a game.


----------



## TheGuruStud (Mar 17, 2020)

HwGeek said:


> I hope MS will let us use Win10 on it, so we could play on it or use it as a PC ["Desktop Console"].
> I will buy it on day one if so.



How's that going to work? You would NEVER be able to install anything that isn't from their store...so basically nothing useful. Windows is too broken to be exposed.

You could run a copy virtualized, but then you're just adding complexity for nincompoops. It'll be infected by the end of day 1 and lead to angry customers who wouldn't know how to press a reset button if you stapled it to their forehead. And it'll eat storage space, plus a hundred dumb things only the general population could conjure.

K.I.S.S.


----------



## Darmok N Jalad (Mar 17, 2020)

TheGuruStud said:


> How's that going to work? You would NEVER be able to install anything that isn't from their store...so basically nothing useful. Windows is too broken to be exposed.
> 
> You could run a copy virtualized, but then you're just adding complexity for nincompoops. It'll be infected by the end of day 1 and lead to angry customers who wouldn't know how to press a reset button if you stapled it to their forehead. And it'll eat storage space, plus a hundred dumb things only the general population could conjure.
> 
> K.I.S.S.


It might be fun if they could take the same hardware, put it in a different case, throw in native controller support, have it just run Windows 10, and call it a Surface Desktop or something. Of course, if they did that, they might diminish people's desire to just go buy an Xbox, but it would make for a pretty powerful PC.


----------



## Makaveli (Mar 17, 2020)

ppn said:


> 560GBs 10GB gpu optimal memory, only problem is the cpu will eat alot of that 10GB-5,5=4,5GB, 560-336=224, so the GPU is left with 4,5GB at 224GBs.
> 
> How about NO, and gives us HBM2E 32GB 1TBs, 5nm maxed to 420mm2 GPU only, and separate 60mm2 8 core ZEN3 with separate DDR5 24GB, or forget about it.



Lmao and give me a million dollars.

The console you want would cost $1000-1500.

At least make the request reasonable.


----------



## notb (Mar 17, 2020)

dirtyferret said:


> $500 xbox console with PS5 coming in at $450+, it will take two years before there is any significant market penetration


Maybe it's a totally new strategy? Building up the lineup instead of replacing it?
I mean: if new games work on both Xbox One and Series, there's really no need to replace what they have. They're offering a new model above the existing ones - with prices shifting over time to make room for another gen.
It's almost exactly what Sony does in cameras (unlike most of the competition). They launch a top model but keep making the earlier ones for a few years, so eventually they have a full lineup.
We could see more frequent launches as well (a single top model every 2-3 years).


HD64G said:


> Smart marketing can change the perspective of what a console can become with the next-gen ones. And then, the $600 monster will seem VERY cheap for what it will offer.


Sure, but that would move consoles from cheap "gaming for everybody" to a more high-end market. Less volume, probably fewer games. Higher margins.

Maybe smartphones are eating into casual, mass-market gaming - like they did with PCs and cameras. I don't know. But in those markets we've seen the (potentially) exact same reaction: a focus on high-end, high-paying clients.


MxPhenom 216 said:


> Well, I'm an engineer so I do consider it. Even if you don't have it, I can't help my brain.


You're an engineer. Is that a pick-up line? Hilarious.


Super XP said:


> What price is the Xbox Series X going to sell for? I don't see a price associated with it at the moment. And you cannot compare a console-based system with an actual computer in terms of cost, as consoles cost a lot less to make.


I don't understand the comparing part. The fact that these consoles are expensive to make leaked a long time ago. It's not official, but it's pretty much as certain as the specs at this point.


----------



## Super XP (Mar 17, 2020)

X828 said:


> Thought AMD had the contract for both Xbox and PlayStation... did that change? If not, all of you saying you "look forward to seeing what the PS5 has to offer" - look no further than the specs in this article. Same specs, different shell.


Different specs, different shell, both powered by AMD.


----------



## notb (Mar 17, 2020)

ARF said:


> You don't need ray-tracing. See following NFS images:


These renders are purely atrocious. Is this really 2018? 

BMW: a shadow under the car, when the body illumination suggests that most light comes from the side.
3 cars: the headlight reflections don't line up with the lamps. This likely means the cars are modeled as boxes for lighting interaction with the environment. That's miles away from ray tracing.

But you're right: we don't need RTRT.
And I understand you completely because when 3D games started to become popular I was just as unconvinced as you are now. I kept playing 2D/isometric games. The first time I had fun in 3D was in 2003 when I tried Morrowind.
Frankly, sometime around 2007 I decided I don't need games at all. 


TheGuruStud said:


> How's that going to work? You would NEVER be able to install anything that isn't from their store...so basically nothing useful. Windows is too broken to be exposed.


Well, actually Xbox - like Windows 10 Pro and up - runs on Hyper-V. And the Windows Store provides you with Linux images. So if Microsoft opened access to the hypervisor, that would open up some serious possibilities.

Sure, if this was made public, people would start wrecking their Xboxes.
But what if MS created an "Xbox Store" with productivity apps run in containers?
Your Xbox could become a NAS, a database server, a general VM engine.
Think about Synology Store and their VM Manager: https://www.synology.com/en-global/dsm/feature/virtual_machine_manager
There's absolutely no technical reason why Xbox couldn't offer that. And it's perfectly stable and robust.

But generally speaking - your Xbox could really become a computing backend for multiple scenarios.
This could be nicely packed in the Windows ecosystem and cloud/edge/distributed computing idea that's already taking over.
We already get "Accelerate with cloud" buttons. We could just as well get some "Accelerate using Xbox" ones.


----------



## ratirt (Mar 17, 2020)

Ray tracing again? Guys, full ray tracing, with everything it really should do, is not happening for a few years at least. If you wanted to fully ray trace games like Metro Exodus you would need three 2080 Tis tied together in a perfectly working environment to get 60 FPS in 4K. Even though we have some of it, we are not there yet. RT is a fool's errand at this point.

Anyway.
Is there any information about when the PS5 specs are going to be out? I'd like to compare the two and decide which one will be mine.


----------



## Tsukiyomi91 (Mar 17, 2020)

This is the first time consoles have actually closed the gap with the PC. It's not like consoles are gonna decimate the PC market anytime soon, or destroy Nvidia's RTX series of GPUs just because next-gen consoles have RT baked into the SoC. I see this as something to look forward to rather than something to reject/hate outright.


----------



## notb (Mar 17, 2020)

ratirt said:


> Ray tracing again? Guys, full ray tracing, with everything it really should do, is not happening for a few years at least.


I don't understand why you're so hung up on "full ray tracing".
Most 3D games aren't fully 3D. Doesn't that bother you?


----------



## ratirt (Mar 17, 2020)

notb said:


> I don't understand why you're so hung up on "full ray tracing".
> Most 3D games aren't fully 3D. Doesn't that bother you?


Because there is more than just shadows or lighting etc.; there's the entire environment. If you want (as others claim) full realism, you can't focus on one thing.
Yeah, well, 3D is not full 3D just as RT is not full RT.


----------



## notb (Mar 17, 2020)

ratirt said:


> Because there is more than just shadows or lighting etc.; there's the entire environment. If you want (as others claim) full realism, you can't focus on one thing.
> Yeah, well, 3D is not full 3D just as RT is not full RT.


A full 4K RT render of a complex model takes a lot of time... on rendering clusters. You may not live long enough to play a game rendered the same way movies are rendered in 2020.
More importantly, you'd never benefit. Gaming is built on illusion. You'd never notice whether a distant tree is rendered with RT or not.

Also, I'm not sure why I'd want full realism. We're talking about games. You're like those people who say "but if someone falling from a building was caught by Superman right above the ground, he would die anyway". 
Clearly it hasn't happened yet, but at some point you'll notice that some things in games are unrealistic and should stay that way. For example: laser weapons, all kinds of energy fields, etc. You can't ray trace them.

The significant gain is that already today you can have properly rendered shadows for the objects you focus on: your character/vehicle, the faces of NPCs, objects you interact with, etc.
I get that this may not be important to you, but guess what: it's gaming. It's not that important in general.

Anyway, this topic is not about RTRT, so let's leave it here. Plus, I'm really bored with these discussions by now. I'm glad some have graduated from "RT will never work" to "but this is not full RT".
At the same time I'm rather shocked that some still have no clue how ray tracing works - it's such a simple, intuitive idea (compared to pixel shaders etc.).


----------



## ratirt (Mar 17, 2020)

notb said:


> A full 4K RT render of a complex model takes a lot of time... on rendering clusters. You may not live long enough to play a game rendered the same way movies are rendered in 2020.
> More importantly, you'd never benefit. Gaming is built on illusion. You'd never notice whether a distant tree is rendered with RT or not.
> 
> Also, I'm not sure why I'd want full realism. We're talking about games. You're like those people who say "but if someone falling from a building was caught by Superman right above the ground, he would die anyway".
> ...


To summarize, I don't need RT that much as of now. Thanks
I am bored too, so let's leave it as is.


----------



## Rahnak (Mar 17, 2020)

Tsukiyomi91 said:


> This is the first time consoles have actually closed the gap with the PC. It's not like consoles are gonna decimate the PC market anytime soon, or destroy Nvidia's RTX series of GPUs just because next-gen consoles have RT baked into the SoC. I see this as something to look forward to rather than something to reject/hate outright.


I just watched the DF video and I'm very, very impressed by the XSX. Especially the part where they showed a two-week port of Gears 5 running at the equivalent of PC 4K Ultra settings, which by DF's testing was pretty close to the performance you would get out of a 3.6 GHz locked 3700X and an RTX 2080. For a system that's going to cost 500-600 €/$, that's shaping up to be incredible value.


----------



## Tsukiyomi91 (Mar 17, 2020)

@Rahnak that's very impressive. =O My only gripe here is that the majority of games on XSX might also ship on Windows, unless there are some upcoming titles exclusive to the console.


----------



## ppn (Mar 17, 2020)

Rahnak said:


> I just watched the DF video and I'm very, very impressed by the XSX. Especially the part where they showed a two-week port of Gears 5 running at the equivalent of PC 4K Ultra settings, which by DF's testing was pretty close to the performance you would get out of a 3.6 GHz locked 3700X and an RTX 2080. For a system that's going to cost 500-600 €/$, that's shaping up to be incredible value.



and 16GB of RAM... It should have 20GB at least: 10 fast and 10 slow. I would say 24GB of 384-bit memory and 3840 cores, but that's obviously too expensive. Not having at least another 4GB could be a problem in the future. Also, the non-M.2 NVMe means you can't expand to 4TB on the cheap in 2 years.


----------



## Rahnak (Mar 17, 2020)

@Tsukiyomi91 Microsoft is done with exclusives in the traditional sense. They see xbox as a platform/service rather than a console. So yeah, this isn't for those of us that already have gaming PCs, we already have xbox in Win10. This is for people that have a smaller budget, or just want a couch gaming experience in a more compact/cheaper form factor.

@ppn You can't think of the memory in the XSX like you do for your PC or even the current consoles. The 10GB of fast memory is for the GPU, 2.5GB of the slower pool is reserved for the OS, and the remainder is for audio and other stuff (you can watch DF's video on that). And there will be a lot more loading from disk than before, because the drives are so much faster now. So yeah, it's going to be a whole new way of doing things. I wonder if the PC is going to be holding games back now rather than consoles, since not everyone has NVMe SSDs.

Oh and Sony just announced they're releasing PS5 system details tomorrow. Can't wait to see how they compare.

__ https://twitter.com/i/web/status/1239885256888590344


----------



## Valantar (Mar 17, 2020)

ppn said:


> and 16GB of RAM... It should have 20GB at least: 10 fast and 10 slow. I would say 24GB of 384-bit memory and 3840 cores, but that's obviously too expensive. Not having at least another 4GB could be a problem in the future. Also, the non-M.2 NVMe means you can't expand to 4TB on the cheap in 2 years.


16GB is in no way a limitation for a console like this. Period. Even with 2.5GB reserved for the OS there is _plenty_ of memory there, especially when you add their new texture streaming methods that cut down on loading of unused textures (the DF video explains this; apparently only about 1/3 of loaded texture data is ever actually displayed) and the fact that the entire NVMe drive is directly accessible by both CPU and GPU (slower than RAM, obviously, but much faster than SSD->RAM). There's no reason whatsoever to expect this console to run out of memory in its lifespan.



As for all of you people arguing about RT: please stop. Yes, there are very few games making any use of RT whatsoever currently. No, it's not currently possible (nor will it be in the foreseeable future) to do full RTRT of even somewhat photorealistic games. That doesn't change the fact that RT makes realistic lighting, shadows, reflections, etc. _much_ easier to implement than the two-plus decades of jerry-rigged hacks currently in use, which bog down game development pipelines massively while looking okay at best and terrible at worst - and next-gen consoles will be entirely capable of this.

As for adoption, this new generation of consoles will ensure that RT lighting and reflections will be pretty much everywhere in the next couple of years - and due to RT being _easier_ to implement than a stack of hacks and tricks in a rasterized lighting scheme, it will likely spread into smaller games rather quickly once there's a significant install base. Is there a performance penalty? Absolutely. Is it _necessary_? Of course not. Neither is ambient occlusion, bloom, god rays, water transparency, volumetric lighting, anti-aliasing, etc., etc. It's just that all these things make games look better - and better-looking games are often (though obviously dependent on the quality of other aspects of the game) a boon to immersion and the other experiential parts of a game. Playing Rocket League, Overwatch or LoL in greyscale with simplified world and character designs would obviously make those games worse, regardless of whether gameplay was otherwise unchanged.

Now, was RT a must-have for the first year of RTX market availability? No. The second year? No. The third? Not likely, but that depends on how long you keep your hardware for. I've kept my current GPU for going on five years now, and haven't truly decided to replace it until this year, which obviously means that my next GPU (which I want to last as long) really, really ought not to lack a rather crucial feature like this. Similarly, consoles last for 5-7 years minimum, so as such _not_ having RT at this point is going to be an issue if the gaming world otherwise adopts it.

So let's stop bickering over this silly nonsense, please. For non-RT games we don't lose anything in terms of performance, and what has been demonstrated is that this console does a bang-up job in automatically improving legacy titles whether they are XBone, 360 or OG Xbox titles. Higher frame rates, higher resolution, increased fidelity, etc. - it's all there, and it doesn't suffer from there also being an option to have RT lighting and reflections in upcoming games.


----------



## dirtyferret (Mar 17, 2020)

notb said:


> Maybe it's a totally new strategy? Building up the lineup instead of replacing it?
> I mean: if new games work on both Xbox One and Series, there's really no need to replace what they have. They're offering a new model above the existing ones - with prices shifting over time to make room for another gen.



Maybe; only time will tell if that strategy works.



Tsukiyomi91 said:


> This is the first time consoles have actually closed the gap with the PC. It's not like consoles are gonna decimate the PC market anytime soon, or destroy Nvidia's RTX series of GPUs just because next-gen consoles have RT baked into the SoC. I see this as something to look forward to rather than something to reject/hate outright.



The original PlayStation was as good as or better than anything 3dfx had at the time for consumers, so this is not the first time consoles have closed the gap.


----------



## milewski1015 (Mar 17, 2020)

Od1sseas said:


> "Barely makes a difference". Lmfao. People said same shit like that when tessellation first appeared and now look how important it is for games to look realistic. Educate yourself. Ray Tracing is the future and the difference is noticable.


I'm not arguing that RT isn't the way of the future - as the technology and hardware improves it will no doubt eventually become standard in just about every game. As for whether the difference is noticeable, personally I have a hard time differentiating the more subtle implementations unless there's a side by side comparison. Maybe that's because I mainly play fast-paced competitive games and would happily turn down graphics settings for increased frame rates. Similarly, I likely wouldn't take a significant performance hit for a touch more photo-realism. As @Valantar said, at this point in time, the list of games supporting RT is small, and it isn't currently a necessary feature. Point being, I didn't base my GPU decision on RTX. 

My gripe was with @dicktracy calling the 5700 XT obsolete tech. Just because the RTX cards were released doesn't mean people aren't still buying used 1080s/1080 Tis. What's to say AMD won't bring RT support to the 5700 series like Nvidia did with Pascal? The 5700 XT is far from a paperweight - even if it doesn't get RT support, the performance is solid and it will continue to deliver for at least a few years. I can live without RT until my next upgrade.


----------



## Darmok N Jalad (Mar 17, 2020)

Rahnak said:


> @Tsukiyomi91 Microsoft is done with exclusives in the traditional sense. They see xbox as a platform/service rather than a console. So yeah, this isn't for those of us that already have gaming PCs, we already have xbox in Win10. This is for people that have a smaller budget, or just want a couch gaming experience in a more compact/cheaper form factor.
> 
> @ppn You can't think of the memory in the XSX like you do for your PC or even the current consoles. The 10GB of fast memory is for the GPU, 2.5GB of the slower pool is reserved for the OS, and the remainder is for audio and other stuff (you can watch DF's video on that). And there will be a lot more loading from disk than before, because the drives are so much faster now. So yeah, it's going to be a whole new way of doing things. I wonder if the PC is going to be holding games back now rather than consoles, since not everyone has NVMe SSDs.
> 
> ...


I suspect PS5 will be very similar to XSX. I’m mainly curious if Sony will keep the storage upgrade option open to the customer.


----------



## Valantar (Mar 17, 2020)

Darmok N Jalad said:


> I suspect PS5 will be very similar to XSX. I’m mainly curious if Sony will keep the storage upgrade option open to the customer.


While I would like that, I can only imagine the horror of the average console user trying to install (even a simplified) M.2 drive. Hello bent pins, trashed sockets and torn-off SMDs!


----------



## gamefoo21 (Mar 17, 2020)

Super XP said:


> Different specs, different shell, both powered by AMD.



Seems people don't understand that AMD CPUs and GPUs already drive both the Xbox One and the PS4.


----------



## kapone32 (Mar 17, 2020)

gamefoo21 said:


> Seems people don't understand that AMD CPUs and GPUs already drive both the Xbox One and the PS4.



Yes but they have different specs.


----------



## Darmok N Jalad (Mar 17, 2020)

Valantar said:


> While I would like that, I can only imagine the horror of the average console user trying to install (even a simplified) M.2 drive. Hello bent pins, trashed sockets and torn-off SMDs!


Maybe they will have a drive sled like they did with PS3 and PS4. Sounds like MS will just sell upgrade modules, kinda like in the 360 days.


----------



## Valantar (Mar 17, 2020)

Darmok N Jalad said:


> Maybe they will have a drive sled like they did with PS3 and PS4. Sounds like MS will just sell upgrade modules, kinda like in the 360 days.


You can also use USB storage, though only for XBone/360/OGXB games or "cold storage" of XSX games (they need to be loaded back onto the SSD, from what I understand).

Still, a sled is... not really suitable for M.2. 2.5" drives are (largely for HDDs, entirely for SSDs) encased in a protective shell and have screw points for easy mounting in a sled, while an M.2 drive is an entirely exposed piece of hardware whose only form of retention is the socket plus a single screw. They could always make a more advanced sled with a socket adapter to something more slot-in friendly, but you'd still need people to mount the drive into the carrier, with the same risk of breakage of both the socket and the drive. Or sell drives pre-mounted into sleds, which is essentially what MS is doing. I just really hope the MS standard is open and not only Seagate gets to make drives. That would _suck_.


----------



## gamefoo21 (Mar 17, 2020)

kapone32 said:


> Yes but they have different specs.



Isn't that the point...

If PS4 and Xbone are different...

Why does anyone think PS5 and the Series X will be the same?


----------



## rvalencia (Mar 18, 2020)

Valantar said:


> That's a misunderstanding. It has the _equivalent_ of 25TF _if the RTRT was done purely in shaders_. The RTRT hardware can't do regular shader workloads, and thus does not translate back into FP32 TFLOPS.


RT cores can be used in RT-related audio, collision physics, graphics, and BVH search-tree/intersection-test (fancy branching) applications.


----------



## Valantar (Mar 18, 2020)

rvalencia said:


> RT cores can be used in RT-related audio, collision physics, graphics, and BVH search-tree/intersection-test (fancy branching) applications.


Yes? None of that is a general GPU shader workload (even if they can all be performed, much more slowly, on shader cores). Just because a video decode block can decode video as fast as (or even faster than) a high-end CPU doesn't mean that power translates back to general CPU tasks - quite the opposite. All you're saying is "RT hardware can perform RT workloads", which... well, one would certainly hope so. Saying the XSX has 25TF of compute power is _flat out false_. Saying it has _the equivalent_ of 25TF of compute power _if RT workloads are counted as if they were done on shader cores_ is true. Those two statements are _very clearly not the same_.


----------



## Jism (Mar 18, 2020)

It's clever how they've addressed the console's memory: using 6GB for the system at slower speed (i.e., fewer chips contributing) and 10GB for the GPU at higher speed (more chips).

Apparently they can use GDDR6 in a mixed CPU/GPU memory architecture. How does it cope compared with traditional DDR4?

As for the chip: it does look like a 2700X or so; 16 threads aren't even needed in most games, where 6 or so would be the most ideal situation.
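The DDR4 question can be put in rough numbers with the same peak-bandwidth formula that applies to both memory types. A sketch; the dual-channel DDR4-3200 configuration below is just a representative desktop setup picked for illustration, and peak bandwidth says nothing about latency, where DDR4 generally fares better.

```python
# Rough peak-bandwidth comparison: the XSX's GDDR6 pools vs a typical
# desktop dual-channel DDR4-3200 setup (chosen purely for illustration).

def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = bus width (bits) x per-pin rate (Gbps) / 8."""
    return bus_width_bits * data_rate_gbps / 8

ddr4 = peak_bandwidth_gbs(128, 3.2)       # two 64-bit channels at 3200 MT/s
gddr6_fast = peak_bandwidth_gbs(320, 14)  # XSX 10 GB pool
gddr6_slow = peak_bandwidth_gbs(192, 14)  # XSX 6 GB pool

print(f"DDR4-3200 dual channel: {ddr4:.1f} GB/s")  # 51.2 GB/s
print(f"XSX fast pool: {gddr6_fast:.0f} GB/s")     # 560 GB/s
print(f"XSX slow pool: {gddr6_slow:.0f} GB/s")     # 336 GB/s
```

Even the slow pool offers roughly 6.5x the bandwidth of dual-channel DDR4-3200; the trade-off GDDR6 makes for that bandwidth is higher access latency.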


----------



## Darmok N Jalad (Mar 18, 2020)

gamefoo21 said:


> Isn't that the point...
> 
> If PS4 and Xbone are different...
> 
> Why does anyone think PS5 and the Series X will be the same?


Hardware specs are one thing, interface, first party exclusives, and development kits are another. AMD is still the primary provider of both consoles, so the range of what they can offer can only go so far. Will specs be identical? Not likely, but I don’t suspect it will be much more different than PS4 Pro vs XboxOne X was. As long as they are close, the differences won’t matter. I would almost suspect that the big name developers have a fair amount of influence on the hardware specs, as so many titles are cross platform. They probably don’t want the differences to be so great that it causes them more work to go cross-platform.

I guess I’ll be really surprised if PS5 blows XSX away. Do we really think AMD has that much more to give? These already have bigger GPUs than what you can get from AMD for your PC. I wonder if the weird memory layout was AMD's idea in the first place.


----------



## gamefoo21 (Mar 18, 2020)

Darmok N Jalad said:


> Hardware specs are one thing, interface, first party exclusives, and development kits are another. AMD is still the primary provider of both consoles, so the range of what they can offer can only go so far. Will specs be identical? Not likely, but I don’t suspect it will be much more different than PS4 Pro vs XboxOne X was. As long as they are close, the differences won’t matter. I would almost suspect that the big name developers have a fair amount of influence on the hardware specs, as so many titles are cross platform. They probably don’t want the differences to be so great that it causes them more work to go cross-platform.
> 
> I guess I’ll be really surprised if PS5 blows XSX away. Do we really think AMD has that much more to give? These already have bigger GPUs than what you can get from AMD for your PC. I wonder if the weird memory layout was AMDs idea in the first place.



The specs are out, and they show differences in design and the likely impact on performance.

The PS5 is likely going to be slower, but it's going to have the faster NVMe drive, and it'll be more energy efficient.

That's ironically not exactly true, but not wrong either, about AMD and its history. The One X GPU was definitely bigger and badder than anything Polaris, but smaller than Vega. Even the new GPU is definitely bigger and badder than Navi, but what's interesting is that it's not just a tuned-up Vega 44 like last time; it's very likely RDNA 2, which is smaller than Arcturus, and which will likely get pulled into the consumer market like Vega 10 and 20. Soo...

Back on topic...

PS5 GPU: 36 CUs, up to 2.23 GHz, up to 10.28 TFLOPS, 256-bit GDDR6 memory

XSX GPU: 52 CUs, 1.825 GHz, 12.15 TFLOPS, 320-bit GDDR6 memory

That's quite a bit of variability for a single manufacturer.
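Both quoted figures are consistent with the standard RDNA throughput formula (CUs x 64 shaders x 2 FLOPs per cycle x clock). A quick cross-check, using Sony's stated 2.23 GHz maximum boost clock for the PS5:

```python
# Cross-checking the quoted TFLOPS figures with the standard RDNA
# formula: CUs x 64 shaders x 2 FLOPs/cycle (FMA) x clock.

def rdna_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

ps5 = rdna_tflops(36, 2.23)   # Sony's stated maximum boost clock
xsx = rdna_tflops(52, 1.825)  # fixed clock per Microsoft

print(f"PS5: {ps5:.2f} TFLOPS")  # 10.28
print(f"XSX: {xsx:.2f} TFLOPS")  # 12.15
print(f"XSX peak-vs-peak advantage: {(xsx / ps5 - 1):.0%}")  # ~18%
```

Note the PS5 figure assumes it actually sustains its boost clock; the gap widens if it can't.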


----------



## Valantar (Mar 18, 2020)

gamefoo21 said:


> The specs are out, and they show differences in design and the likely impact on performance.
> 
> The PS5 is likely going to be slower, but it's going to have the faster NVMe drive, and it'll be more energy efficient.
> 
> ...


Considering those crazy clocks, it's not going to be more energy efficient, even if it consumes less power in total. 2.23 GHz has to be well past the sweet spot of the DVFS curve...

On this note, am I the only one with the impression that the PS5 engineering team spent the past 48 hours furiously overclocking the APU to see how small a disadvantage they could come away looking like they have, with marketing breathing down their necks the whole time? A faster SSD does little to compensate for the competition being 15% faster _in your best-case scenario_. The wording also makes me quite sure the PS5 will run slower than this for the vast majority of games. I have no doubt this will still be a good console, but that is a significant disadvantage for sure.

As for "quite a bit of variability for a single manufacturer" - how? They're semi-custom chips, so there would be two pieces of silicon no matter what. And AMD's architectures are built to be modular and can be scaled up and down as wanted/needed. No surprise whatsoever that this is possible.


----------



## gamefoo21 (Mar 19, 2020)

Valantar said:


> Considering those crazy clocks it's not going to be more energy efficient, even if it consumes less power in total. 2.23 GHz has to be well past the sweet spot of the DVFS curve ...
> 
> On this note, am I the only one with the impression that the PS5 engineering team spent the past 48 hours furiously overclocking the APU to see how little of a disadvantage they might come off looking like they have, with marketing breathing down their necks the whole time?  A faster SSD does little to compensate for the competition being 15% faster _in your best case scenario_. The wording also makes me quite sure the PS5 will run slower than this for the vast majority of games. I have no doubt this will still be a good console, but that is a significant disadvantage for sure.
> 
> As for "quite a bit of variability for a single manufacturer" - how? They're semi-custom chips, so there would be two pieces of silicon no matter what. And AMD's architectures are built to be modular and can be scaled up and down as wanted/needed. No surprise whatsoever that this is possible.



I think they are banking on how well the PS4 Pro has done vs the One X.

It's decently slower and can't play 4K Blu-rays.

If anything, the Sony engineers are probably trying to overclock the memory. I'm also a little surprised that they aren't going for 3.6 GHz on the 8 cores; instead they are giving it a 3.5 GHz max boost... not set, but 'variable'...

Water cooling your console for stable performance... 

The lack of memory bandwidth is going to hammer the PS5 at 4K. I suspect it'll be a 1440p-with-'image-enhancements' console.

Then there's the storage system... It's going to drink power and it's going to be hot.

I'm really not thrilled that both consoles have killed user replacement/upgrades on the flash, and that stuff wears out.

The PS5 is definitely shaping up to be a cheaper console to build, so it's likely going to undercut the Series X by at least $100 USD. IMHO


----------



## notb (Mar 19, 2020)

Valantar said:


> I have no doubt this will still be a good console, but that is a significant disadvantage for sure.


Well, maybe at least they'll keep a familiar, TV-table friendly form factor. There's always time to launch a locked mITX desktop.


----------



## Rahnak (Mar 19, 2020)

gamefoo21 said:


> I'm really not thrilled that both consoles have killed user replacement/upgrades on the flash, and that stuff wears out.


You can expand storage on both consoles. And it wears out? SSD durability easily outlasts a console generation. Or two. Or three.


----------



## Valantar (Mar 19, 2020)

notb said:


> Well, maybe at least they'll keep a familiar, TV-table friendly form factor. There's always time to launch a locked mITX desktop.


Sorry, a "locked mITX desktop"? As in a "build-your-own console"? I don't quite see what you're saying here, but if I'm right in that being what you meant, that is never, ever going to happen.


gamefoo21 said:


> I think they are banking on how well the PS4 Pro has done vs the One X.
> 
> It's decently slower and can't play 4K Blu-rays.


While MS sadly stopped publishing sales numbers long before the XOX, it seems the sales deficit for it compared to the PS4 hasn't been even remotely as big as the OG XB1 vs. PS4 (they have hinted at near parity IIRC) - which is a significant achievement for a mid-generation spec bump with a 2-3:1 platform adoption deficit. This launch will be very interesting for sure - Sony obviously still has the mindshare advantage, but that might not be enough. Marketing and presentation (plus pricing, obviously) will be key going forward.


----------



## notb (Mar 19, 2020)

Valantar said:


> Sorry, a "locked mITX desktop"? As in a "build-your-own console"? I don't quite see what you're saying here, but if I'm right in that being what you meant, that is never, ever going to happen.


As in the Xbox Series X, which is a custom SFF PC with a locked OS.
So we should compare it to custom SFF PCs. And guess what: the ASRock DeskMini GTX/RX is less than half the size. 

Don't get me wrong. The Xbox looks very promising. If they called it "Surface Desktop" and shipped it with Windows, this might have been the first time since Diablo 2 that I preordered anything.
And I may still buy it for gaming when prices go down.


----------



## Vayra86 (Mar 19, 2020)

I've got my popcorn out and I'm eagerly awaiting what the new content will look like. Specs are nice, but content is where it's at, and it's not Microsoft's strong suit. Same goes for RT.

But hardware-wise... this is fantastic, and it means a serious boost is on its way for PC graphics too. A new mainstream norm is what they're obviously shooting for, and that norm has 4K in it. I'm not complaining. The baby steps are finally over; it's about god damn time. After all, on the PC, resolution is just one of the many choices to spend resources on.

RDNA2 though. Shit. 2.23 GHz on the PS5, and here we have a wide GPU doing 1.8. That is good, and it means these things finally boost and clock properly, capable of a wide range. It will be very interesting what Nvidia is going to pull out of the hat now; it's clear they need to make a big dent, even despite the Turing head start. I mean, I'm still not really counting the ridiculous product called the 2080 Ti as a viable thing, and considering that, they've got work to do. Very glad to see AMD return to the proper high end, and not trailing a gen or two.



gamefoo21 said:


> I think they are banking on how well the PS4 Pro has done vs the One X.
> 
> It's decently slower and can't play 4K Blu-rays.
> 
> ...



A lot of random thoughts... it's all going to hammer and do this and that, but you really don't know and can't say. And neither do we 
The spec war, however, is just not interesting. When in doubt, watch the relevant South Park episode. What really matters is what the majority will do, and that focuses exclusively on the content the majority can play.


----------



## notb (Mar 19, 2020)

Vayra86 said:


> It will be very interesting what Nvidia is going to pull out of the hat now; it's clear they need to make a big dent, even despite the Turing head start.


Well, obviously, a 7nm GPU, so the efficiency and performance crown will stay with them. AMD will go back to being the "value" option.

Also, these consoles will hopefully provoke an explosion of RTRT games - something Nvidia is already prepared for, while AMD and Intel have merely mentioned working on it.

So yeah... not much changes in the balance of power.
But we're likely looking at a huge jump in requirements for games ported from consoles - including the possibility of games that won't run (or won't be playable) without RTRT acceleration...

As for the Xbox - I'm really interested if the non-gaming features will be developed further. If yes, this could be an easy buy.


----------



## gamefoo21 (Mar 19, 2020)

Vayra86 said:


> I've got my popcorn out and I'm eagerly awaiting what the new content will look like. Specs are nice but content is where its at, but its not Microsoft's strong suit. Same goes for RT.
> 
> But hardware wise... this is fantastic and it will mean a serious boost is on its way for PC graphics too. A new mainstream norm is what they're obviously shooting for, and that norm has 4K in it. I'm not complaining. The baby steps are finally over, its about god damn time. After all on the PC resolution is just one of the many choices to spend resources on.
> 
> ...



That's true, but the PS4 Pro can't do real 4K.

The PS5 GPU is shaping up to be a lot like the 5700XT.

Memory bandwidth is very important at 4K60.

It'll be interesting either way to see what shakes out between these two.



Rahnak said:


> You can expand storage on both consoles. And it wears out? SSD durability easily outlasts a console generation. Or two. Or three.



NAND reliability drops like a rock if it's too hot. Not to mention the type of NAND; there are also components related to the storage that can very easily die.

If the drive on the board dies, will Sony let the PS5 boot off the USB drive? MS might let you boot off the second drive but I really doubt it.

I guess I am just not a fan of an Apple approach to hardware. If the SSD fails, new MacBook for you! Or at least a motherboard.

Forgive me while I still have my Xbox, 360, PS2, SNES, etc... Sorry if I insult your sensibilities when I buy a console not expecting it to puke its guts out in 5 to 7 years.

Consumerism is a shitty excuse to drive corporate profits. 

I really hope right to repair sneaks past the big money trying to stop it.


----------



## Rahnak (Mar 19, 2020)

gamefoo21 said:


> NAND reliability drops like a rock if it's too hot. Not to mention the type of NAND; there are also components related to the storage that can very easily die.
> 
> If the drive on the board dies, will Sony let the PS5 boot off the USB drive? MS might let you boot off the second drive but I really doubt it.
> 
> ...


If something dies on any of those consoles you mentioned, they're just as dead. Newer ones just have more points of failure, as happens with any newer technology. I'm sure they've taken all your reliability concerns into account for you. Sony wants their consoles to last more than you do, I can guarantee you. And if they fail at it, it's a repeat of the 360; it wouldn't be anything new, and Sony would pay the price for it.


----------



## Super XP (Mar 20, 2020)

gamefoo21 said:


> Seems people don't understand that AMD CPUs and GPUs drive both the Xbox One and PS4 already.


I thought of that too, maybe they really don't know that.



gamefoo21 said:


> That's true, but the PS4 Pro can't do real 4K.
> 
> The PS5 GPU is shaping up to be a lot like the 5700XT.


Speculation aside, the PS5 might end up being faster than the Xbox Series X, far more efficient, and much faster than the 5700 XT.


----------



## Rahnak (Mar 20, 2020)

Super XP said:


> Speculation aside, the PS5 might end up being faster than the Xbox Series X, far more efficient, and much faster than the 5700 XT.


My guess (and I could be way off) is that XSX will have the advantage in pretty much all multi-platform games, because it does have more raw power and the upgrades seem more straightforward from current gen. That said, I do think the PS5's top exclusives, like GoW2 and Naughty Dog's next IP, will be better than anything on XSX because they just go the extra mile on getting that last drop of performance.


----------



## Super XP (Mar 20, 2020)

Rahnak said:


> My guess (and I could be way off) is that XSX will have the advantage in pretty much all multi-platform games, because it does have more raw power and the upgrades seem more straightforward from current gen. That said, I do think the PS5's top exclusives, like GoW2 and Naughty Dog's next IP, will be better than anything on XSX because they just go the extra mile on getting that last drop of performance.


Well, that makes sense. Microsoft plays up the performance advantage while Sony has the top exclusives to help drive unit sales. I'm not really into consoles myself, but realistically I'm hoping both sell very well and split the market evenly, all for the sake of competition. Wishful thinking though, lol


----------



## mechtech (Mar 20, 2020)

HwGeek said:


> I hope MS will let us use Win10 on it, so we could play on it or use it as a PC ["Desktop Console"].
> I will buy it on day one if so.



lol, if it ends up that you can, I think PC sales would plummet.


----------



## AnarchoPrimitiv (Mar 20, 2020)

notb said:


> Well, obviously, a 7nm GPU, so the efficiency and performance crown will stay with them. AMD will go back to being the "value" option.
> 
> Also, these consoles will hopefully provoke an explosion of RTRT games - something Nvidia is already prepared for, while AMD and Intel merely mentioned working on.
> 
> ...




Your allegiance is obvious


----------



## notb (Mar 20, 2020)

AnarchoPrimitiv said:


> Your allegiance is obvious


Liberal democrat?


----------



## Valantar (Mar 20, 2020)

notb said:


> Well, obviously, a 7nm GPU, so the efficiency and performance crown will stay with them. AMD will go back to being the "value" option.
> 
> Also, these consoles will hopefully provoke an explosion of RTRT games - something Nvidia is already prepared for, while AMD and Intel merely mentioned working on.
> 
> ...


You're right that a 7nm Nvidia GPU is likely to have significant efficiency gains over their current 12nm ones. However, AMD is nearly on par in efficiency with the current gen (total, not architectural, and depending on how far the chip is pushed), and they're promising a significant increase with the upcoming cards despite no node change - something corroborated by the upcoming consoles. More than 250W of power consumption is highly unlikely for a console, even one as large as the XSX, and the CPU, SSD and so on consume at least some of that - let's say 50W, which would make that a crazy efficient 8c16t CPU - so the GPU must be below 200W while delivering ~35% higher FLOPS than the 5700 XT _and_ RT. I still think Nvidia is likely to have the upper hand if their upcoming GPUs are on TSMC 7nm (there are rumors of consumer GPUs being on Samsung 10nm or 8nm, which would be a smaller change from TSMC 12nm), but it won't be huge. 

MS also showed the Gears 5 in-game benchmark in a build "updated" to run on the XSX by a single engineer over two weeks (i.e. not optimized whatsoever) matching the performance of the same game with the same settings (PC Ultra, IIRC) on a PC with a 2080 - which is a 215-225W GPU. The Gears 5 PC port is also generally regarded as a very good port that performs well and scales well with powerful PC hardware.

Beyond that, saying AMD "merely mentioned working on" RT - days after two consoles based on their hardware were announced with full RT support performant enough to run an early/unoptimized build of fully path-traced Minecraft - is a serious stretch, no matter that their PC GPUs have yet to be announced. While the details of AMD's implementation are scarce, we know it is tied to shader count and will thus scale up with bigger GPUs too.

I still expect Nvidia to have a small performance advantage with Ampere, but it doesn't look likely to be anything more than small, and given the proliferation of new features in consoles based on AMD hardware, it's not all that likely that Nvidia will maintain an advantage in the use or performance of those features, simply because the dominant implementation and development model will be AMD-based (even if this ultimately falls back to open APIs like DXR). 

As for non-gaming uses, I don't think that will happen. Consoles are not meant for general-purpose use, and part of why MS lost the previous generation so badly was that they focused too much on other features and not enough on gaming. They're obviously not making that mistake this time around.


----------



## John Naylor (Mar 20, 2020)

The Quim Reaper said:


> 10 GB of fast RAM, 6 GB of slow RAM...
> 
> The Nvidia 970 designers are smiling.



They never stopped... that single card outsold all 26 AMD cards of its generation combined by a factor of more than 2.

We see this "is gonna" discussion with every new CPU release and every new GPU release... and when you go back to read the "is gonna" predictions, they never quite live up to the early billing. Save the enthusiasm for post-release testing.


----------



## agentnathan009 (Mar 21, 2020)

ppn said:


> 560 GB/s on the 10 GB of GPU-optimal memory; the only problem is the CPU will eat a lot of that: 10 GB − 5.5 GB = 4.5 GB, 560 − 336 = 224, so the GPU is left with 4.5 GB at 224 GB/s.
> 
> How about NO, and give us 32 GB of HBM2E at 1 TB/s, a 5nm GPU-only die maxed out at 420 mm², and a separate 60 mm² 8-core Zen 3 with its own 24 GB of DDR5, or forget about it.



They let just any fanboy in here, don't they... Clearly you have no clue how much all of that premium hardware would cost. Furthermore, if you used your brain, you would have grasped the elusive concept that consoles, PlayStation included, don't use full PC operating systems, so they don't have all the other tasks running in the background and therefore don't need as much memory to function as a PC does. The Series X has 16 GB of memory, possibly subdivided so the GPU and CPU each get a dedicated amount to play with. 8-10 GB is more than enough for a GPU with ray tracing on a console. 6 GB is enough for everything else the system processes, such as audio, AI characters, physics, etc.


----------



## Valantar (Mar 21, 2020)

ppn said:


> 560 GB/s on the 10 GB of GPU-optimal memory; the only problem is the CPU will eat a lot of that: 10 GB − 5.5 GB = 4.5 GB, 560 − 336 = 224, so the GPU is left with 4.5 GB at 224 GB/s.
> 
> How about NO, and give us 32 GB of HBM2E at 1 TB/s, a 5nm GPU-only die maxed out at 420 mm², and a separate 60 mm² 8-core Zen 3 with its own 24 GB of DDR5, or forget about it.


Did you miss the part where they explained that the CPU has memory separate from the 10 GB GPU pool? 16 GB total: 2.5 GB reserved for the OS etc., 10 GB prioritizing the GPU, with the rest for the CPU's gaming needs. While the CPU and GPU might share some RAM, the GPU will be the main consumer of memory in high-resolution gaming, so there's little risk of the GPU being bandwidth starved. Most game data not related to graphics is rather space efficient, after all.
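The bandwidth split also falls out of the bus widths - a rough sketch, assuming 14 Gbps GDDR6 (the pin rate isn't stated in this thread, so treat it as an assumption):

```python
# Rough derivation of the quoted bandwidth figures.
# Assumes 14 Gbps GDDR6; 10 chips on a 320-bit bus, of which only the
# six 2 GB chips back the upper 6 GB (an effective 192-bit path).
def gddr6_bandwidth_gb_s(bus_width_bits: int, pin_rate_gbps: float = 14.0) -> float:
    return bus_width_bits * pin_rate_gbps / 8  # bits/s -> bytes/s

print(gddr6_bandwidth_gb_s(320))  # 10 GB "GPU-optimal" pool: 560.0 GB/s
print(gddr6_bandwidth_gb_s(192))  # remaining 6 GB: 336.0 GB/s
```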


----------



## rvalencia (Mar 22, 2020)

Valantar said:


> Yes? None of that is a general GPU shader workload (even if they all can be performed (much slower) on shader cores). Just like a video decode block can decode video as fast as (or even faster than) a high-end CPU doesn't mean that power is translatable back to general CPU tasks - quite the opposite. All you're saying is "RT hardware can perform RT workloads", which ... well, one would certainly hope so. Saying the XSX has 25TF of compute power is _flat out false_. Saying it has _the equivalent _of 25 TF of compute power if _RT workloads is counted as if they were done on shader cores_ is true. Those two statements are _very clearly_ _not the same_.


RT cores have assimilated certain workloads that were done on shaders, e.g. Volta's DXR. 

An RT core is a specialized compute unit optimized for certain workload types, and "shaders" are specialized compute units optimized for raster graphics workloads.
RDNA 2 and Turing effectively return to DX9-style non-unified shader compute units.



gamefoo21 said:


> The specs are out and they show differences in design and the likely impact of performance.
> 
> The PS5 is likely going to be slower, but it's going to have the faster NVME drive, but it'll be more energy efficient.
> 
> ...


In 2013, AMD provided multiple SKU levels up to high-end GPUs for the PC market while also providing GPUs for MS and Sony. Bulldozer-era financial problems and a focus on TFLOPS-biased server GPUs without scaling the raster hardware caused a brain drain at AMD's RTG. Reminder for RTG: GPUs are not DSPs.



Super XP said:


> I thought of that too, maybe they really don't know that.
> 
> Speculation aside, the PS5 might end up being faster than the Xbox Series X, far more efficient and much faster over the 5700XT.


An RDNA 2 36 CU part at 2230 MHz vs. an RDNA 2 52 CU part at 1825 MHz is like comparing an RTX 2070 (a 36 CU equivalent) OC'd to 2230 MHz (10.28 TFLOPS) with 448 GB/s of bandwidth against an MSI RTX 2080 Super Gaming X Trio with 12.15 TFLOPS and 496 GB/s of bandwidth.

An RTX 2070 at 2230 MHz wouldn't beat the MSI RTX 2080 Super Gaming X Trio. 

RDNA 2 is not GCN; RDNA (aka Navi) is designed with scalability. Read AMD's roadmap.


----------



## Valantar (Mar 22, 2020)

rvalencia said:


> RT cores have assimilated certain workloads that were done on shaders e.g. Volta's DXR.
> 
> RT core is a specialized compute unit optimized for certain workload types and "shaders" are specialized compute units optimized for raster graphics workloads.
> RDNA 2 and Turing effectively return to DX9-style non-unified shader compute units.


Again: yes, and? None of this means that an RT core (or AMD's equivalent, which seems to be rather different from Nvidia's and more integrated into the regular CU) can perform regular FP32 operations, which would be a requirement for saying the GPU has 25TFlops. So, saying that is wrong. Period. Again: it has _the equivalent of_ 25TFlops _if RT operations are counted as if they were executed on a non-RT GPU_. If you aren't doing that specific thing, it has 12TFlops.

As for saying GPU shaders are specialized compute units ... well, sure, they're specialized FP32 (and formerly FP64, lately also FP16 and INT8/INT4) units, but FP32 operations are a quite general class of computation with uses far beyond graphics. RT operations are definitely more specialized than that. As such it's entirely accurate to call one a form of general-purpose compute and the other specialized hardware.


----------



## rvalencia (Mar 23, 2020)

Valantar said:


> Again: yes, and? None of this means that an RT core (or AMD's equivalent, which seems to be rather different from Nvidia's and more integrated into the regular CU) can perform regular FP32 operations, which would be a requirement for saying the GPU has 25TFlops. So, saying that is wrong. Period. Again: it has _the equivalent of_ 25TFlops _if RT operations are counted as if they were executed on a non-RT GPU_. If you aren't doing that specific thing, it has 12TFlops.
> 
> As for saying GPU shaders are specialized compute units ... well, sure, they're specialized for FP32 (and formerly FP64, lately also FP16 and INT8/INT4) units, but FP32 operations are a quite general class of computation with uses far beyond graphics. RT operations are definitely more specialized than this. As such it's entirely accurate to call one a form of general purpose compute and one specialized hardware.


FYI, NVIDIA's RT unit sits inside the SM, which is equivalent to AMD's CU. 

The RT core is less specialized than T&L/TFU (texture filter unit)/ROP hardware, since RT can accelerate non-graphics workloads such as audio and physics collision logic. BVH search-tree and collision hardware have wider application than T&L hardware. 

Modern ROPs have re-order layers via the ROV (Rasterizer Ordered Views) feature instead of wasting compute shader resources.


----------



## Super XP (Mar 23, 2020)

rvalencia said:


> RT cores have assimilated certain workloads that were done on shaders e.g. Volta's DXR.
> 
> RT core is a specialized compute unit optimized for certain workload types and "shaders" are specialized compute units optimized for raster graphics workloads.
> RDNA 2 and Turing effectively return to DX9-style non-unified shader compute units.
> ...


I've already read AMD's info on all their products. RDNA 2 is a new uArch.

Both console chips are highly customized: customized RDNA 2 and Zen 2, plus everything else.


----------



## Valantar (Mar 23, 2020)

rvalencia said:


> FYI, NVIDIA's RT unit sits inside the SM, which is equivalent to AMD's CU.
> 
> The RT core is less specialized than T&L/TFU (texture filter unit)/ROP hardware, since RT can accelerate non-graphics workloads such as audio and physics collision logic. BVH search-tree and collision hardware have wider application than T&L hardware.
> 
> Modern ROPs have re-order layers via the ROV (Rasterizer Ordered Views) feature instead of wasting compute shader resources.


...still can't do general FP32 compute. Please stop splitting hairs over this. That RT performance can be translated into an equivalent FP32 figure if RT is done purely in shaders does not mean the conversion can be reversed and RT performance added to the total TFLOPS.


----------



## Valantar (Mar 25, 2020)

rvalencia said:


> View attachment 149202
> Meanwhile at NVIDIA camp... with Turing. NVIDIA PR just road-killed your argument.
> 
> You're an AMD fanboy. eat it.


Lolwut? Here I am arguing against blindly adding together various converted numbers into a meaningless total that won't be comparable to anything, and which overstates the general compute performance of a part, and that makes me an AMD fanboy? Put more simply, you say it has 25TF, I say no, it has 12TF but can be seen as having the equivalent of 25TF if calculated a specific way, and that makes me an AMD fanboy? Seriously? Saying it has 25TF is far more positive for AMD, ffs. Which is what I am saying is a stupid thing to do.

Conversions like this are like saying an F1 car is 10 times the car a Honda Civic is because it's 10 times faster, which ignores that the Civic can do a lot more than go fast - it can seat several people, take you grocery shopping, etc. FP32 is general-purpose compute. RT cores do not do general-purpose compute. Nor do tensor cores or any other specialized hardware. If Nvidia is copying a particularly stupid and easily misunderstood marketing point from AMD, that does not in any way make it less stupid or easily misunderstood.

Also, reported. Thanks for keeping the discussion civil, dude.


----------



## rvalencia (Mar 25, 2020)

Valantar said:


> Lolwut? Here I am arguing against blindly adding together various converted numbers into a meaningless total that won't be comparable to anything, and which overstates the general compute performance of a part, and that makes me an AMD fanboy? Put more simply, you say it has 25TF, I say no, it has 12TF but can be seen as having the equivalent of 25TF if calculated a specific way, and that makes me an AMD fanboy? Seriously? Saying it has 25TF is far more positive for AMD, ffs. Which is what I am saying is a stupid thing to do.
> 
> Conversions like this are like saying an F1 car is 10 times the car a Honda Civic is because it's 10 times faster, which ignores that the Civic can do a lot more than go fast - it can seat several people, take you grocery shopping, etc. FP32 is general-purpose compute. RT cores do not do general-purpose compute. Nor do tensor cores or any other specialized hardware. If Nvidia is copying a particularly stupid and easily misunderstood marketing point from AMD, that does not in any way make it less stupid or easily misunderstood.
> 
> Also, reported. Thanks for keeping the discussion civil, dude.


You can't handle the truth; you censor a debate when you can't win.





Meanwhile, NVIDIA PR throws the RT cores' TFLOPS into its marketing.

Expect AMD PR to weaponize RT-core TFLOPS when "Big Navi" arrives.

Why debate FP32 general-purpose shader compute (not general-purpose like SSE, mind) when future game titles will have significant RT workloads?
Current shaders accelerate Z-buffer-based structures while RT cores accelerate BVH-based structures.


----------



## Valantar (Mar 25, 2020)

rvalencia said:


> You can't handle the truth when you censor a debate when you can't win.
> 
> View attachment 149236
> Meanwhile, NVIDIA PR throws in RT cores' TFLOPS into marketing.
> ...


Lol, censoring the debate? It's not my fault you're not able to keep a civil tone in a discussion or keep yourself from personal attacks. That's your own responsibility, not mine. You need to calm down and stop projecting your own missteps onto me.

And again, as addressed in my previous post: Nvidia adopting a bad marketing practice does not in any way make it a good marketing practice. You apparently need to be spoon-fed, so let's go through this point by point.

-TFLOPS in GPU performance metrics is generally accepted to mean FP32 TFLOPS, as that is the "baseline" industry-standard operation (single-precision compute) as opposed to higher or lower precisions (FP64, FP16, INT8, INT4, etc.).

-In GPUs these operations are performed by shader cores, which are fundamentally FP32 compute cores (though sometimes with various degrees of FP64 support either through dedicated hardware or the ability to combine two FP32 cores), which can also perform lower precision workloads either natively at the same speed or faster by combining several operations in one core.

-FP32 compute is a very broad category of general compute operations. Some of these operations can be done by various forms of specialized hardware, or can be done in lower precisions at higher speed (through methods like rapid packed math) without sacrificing the quality of the end result.

-Due to FP32 being a broad category a lot of FP32 operations can also be performed more efficiently by making specialized hardware for a subset of operations. This hardware, by virtue of being specialized for a specific subcategory of operations, _*is not capable of performing general FP32 compute operations*_*.*

-As the operations done on the specialized hardware can also be done on FP32 hardware, you can give an approximation of the equivalent FP32 performance necessary to match the performance of the specialized hardware. I.e. you can say things like "to match the performance of our RT cores you would need X number of FP32 FLOPS". These calculations are then dependent on - among other things - how efficient your implementation of said operation through general FP32 compute is. Two different solutions will very likely perform differently, and will thus result in different numbers for the same hardware.

-This is roughly equivalent to how fixed-function video encode/decode blocks can do this specialized subset of work faster and more efficiently than the same work performed on a CPU or GPU. That doesn't mean you can run your OS or games off a video encode/decode block, as this block is only capable of a small set of operations.

-*These comparisons can't be expanded to other tasks, as the specialized hardware is not capable of general FP32 compute. *FP32 hardware _can_ do RT; RT hardware _can't_ do FP32. I.e. you _cannot_ say that "our RT cores are capable of X FP32 FLOPS" - because that statement is fundamentally untrue - your RT hardware is capable of _zero_ FP32 FLOPS. That your F1 car (specialized hardware) can do some of the things your Civic (general hardware) can do - driving on a flat surface - and is "X times better" at that (i.e. faster around a track) does not mean that this can be transferred to the other things the general hardware can do - your F1 car has nowhere to put your groceries and would get stuck on the first speed bump you encountered, so it is fundamentally incapable of grocery shopping. It would also be fundamentally incapable of driving your friends around, or letting you listen to the radio while commuting. Just because specialized hardware can be compared to general hardware _in the task the specialized hardware can do_ does not mean this comparison can be expanded into the other tasks that general hardware can do - _because the specialized hardware is fundamentally incapable of doing these things_.

-So, to sum up: AMD made a claim in marketing that, while technically true, _needs to be understood in a very specific way to be true_, and is very easy to misunderstand and thus misrepresent the capabilities of the hardware in question. The Xbox Series X is capable of 12.1 TFLOPS of FP32 compute. When performing combined rasterization and RT graphics workloads, it is capable of performing an amount of RT compute that would require 13 TFLOPS of FP32 compute to achieve if said workload was run on pure FP32 hardware (which it isn't, it's run on RT hardware). It is not, and will never be, capable of 25 TFLOPS of FP32 compute. Nvidia copying this does not in any way make it less problematic - I would say it makes it a lot _more_ problematic, as there's no way of knowing if the two companies' ways of performing RT workloads on FP32 cores is equally performant, and unless they are, any comparisons are entirely invalid. Especially problematic is the fact that conversions like this make worse performance look better: if your RT-through-FP32 implementation is _worse_ than the competition, you can claim that your RT hardware is equivalent to _more_ FP32 hardware than theirs is. This tells us _nothing_ of actual performance, only performance relative to something unknown and unknowable.
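The points above can be reduced to a few lines of arithmetic - a hypothetical sketch using the figures quoted in this thread (only 12.1 TFLOPS is a real FP32 spec; the 13 TFLOPS RT-equivalent term is exactly the marketing conversion being criticized):

```python
# The "equivalent TFLOPS" marketing math, spelled out.
# Only 12.1 is a real FP32 spec; 13.0 is the claimed RT-equivalent term,
# which is valid for RT workloads only (figures as quoted in this thread).
fp32_tflops = 12.1      # what the XSX shader cores can actually compute
rt_equiv_tflops = 13.0  # FP32 you'd need to match the RT hardware at RT work

marketing_total = round(fp32_tflops + rt_equiv_tflops, 2)  # the "~25 TFLOPS" headline

# The RT hardware contributes zero general-purpose FP32 FLOPS,
# so the actual FP32 capability is unchanged:
general_fp32 = fp32_tflops  # still 12.1
```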


This just boils down to a very clear demonstration of how utterly useless FP32 FLOPS are as a metric of GPU performance. Not only is the translation from FP32 compute (TFLOPS) into gaming performance not 1:1 - it depends on drivers, hardware utilization, and architectural features - but this now adds another stack of abstraction layers, meaning that any numbers made this way are _completely and utterly incomparable_. Comparing FLOPS from pure shader hardware across AMD and Nvidia was already comparing apples and oranges, but now it's more like comparing apples and ... hedgehogs. Or something.

Btw, I would sincerely like to see you point out what of the above (or my previous posts on this) makes me an AMD fanboy. The ball's in your court on that one.


----------



## gamefoo21 (Mar 25, 2020)

FP64 for life!

*Runs away*


----------



## Valantar (Mar 25, 2020)

gamefoo21 said:


> FP64 for life!
> 
> *Runs away*


I definitely prefer my games in FP64. I'd also like all the CPU load for my games to run on the CPU's video encode/decode block only


----------

