# Sony Reveals PS5 Hardware: RDNA2 Raytracing, 16 GB GDDR6, 6 GB/s SSD, 2304 GPU Cores



## btarunr (Mar 18, 2020)

In a YouTube keynote delivered by PlayStation 5 lead system architect Mark Cerny, Sony detailed the upcoming entertainment system's hardware. There are three key areas where the company has invested heavily to drive the platform forward by "balancing revolutionary and evolutionary" technologies. A key design focus with PlayStation 5 is storage. Cerny elaborated on how past generations of the PlayStation guided game developers' art direction: the low bandwidth of optical discs and HDDs, and the crippling latencies arising from mechanical seeks, resulted in effective transfer rates far below what the media is capable of in the best case (reading a contiguous block of data from the outermost sectors). An SSD was the #1 most requested hardware feature by game developers during the development of the PS5, and Sony responded with something special.

Each PlayStation 5 ships with a PCI-Express 4.0 x4 SSD built around a flash controller designed in-house by Sony. The controller features 12 flash channels and is capable of at least 5.5 GB/s raw transfer speeds. When you factor in the dramatic gains in access time, Sony expects the SSD to provide a 100x boost in effective storage sub-system performance, resulting in practically no load times.
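A quick back-of-the-envelope sketch of why load times nearly vanish: filling the console's entire 16 GB of memory at the SSD's raw rate takes about three seconds, versus minutes from a spinning disk (the 100 MB/s HDD figure below is an illustrative assumption, not a number from Sony's talk):

```python
ram_gb = 16        # total system memory to fill, in GB
ssd_rate = 5.5     # PS5 raw SSD throughput in GB/s (Sony's figure)
hdd_rate = 0.1     # ~100 MB/s, an optimistic assumption for a console HDD

ssd_time = ram_gb / ssd_rate   # seconds to stream all 16 GB from the SSD
hdd_time = ram_gb / hdd_rate   # seconds to do the same from an HDD

print(f"SSD: {ssd_time:.1f} s, HDD: {hdd_time:.0f} s, speedup: {hdd_time / ssd_time:.0f}x")
```

Raw throughput alone gives roughly a 55x gain; Sony's 100x claim additionally counts the access-time and seek overheads that the flat arithmetic above ignores.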

The secret sauce here is that Sony is using its own protocol instead of NVMe, supporting six data priority tiers versus two on NVMe. Each PlayStation 5 ships with an 825 GB SSD, which is expandable using external HDDs over USB, or via a selection of third-party M.2 NVMe SSDs certified by Sony. PlayStation 4 games can run directly off your external HDD, but PlayStation 5 games have to be transferred from the HDD to the console's main SSD. Past generations of PlayStation implemented zlib data compression on Blu-ray and HDD media. PlayStation 5 implements Kraken, with hardware-accelerated decompression via fixed-function logic built directly into the main SoC.
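Hardware decompression effectively multiplies the raw link rate by the compression ratio of the data being read. A minimal sketch, assuming typical Kraken-style ratios for game assets (the ratios here are illustrative assumptions; Sony quoted only the 5.5 GB/s raw figure):

```python
raw_rate = 5.5                    # GB/s delivered by the flash controller
for ratio in (1.5, 1.64):         # assumed Kraken compression ratios
    effective = raw_rate * ratio  # GB/s of decompressed data reaching the game
    print(f"ratio {ratio:.2f} -> {effective:.2f} GB/s effective")
```

At those ratios the effective rate lands in the 8-9 GB/s range, which is where most coverage of the talk pegged the compressed throughput.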

The SoC is where Cerny sounded restrained in what he wanted to disclose. It is a semi-custom chip designed by Sony and AMD, possibly on a 7 nm-class silicon fabrication process. Sony won't specify whether it is a monolithic die or an MCM, but there are three building blocks to it: the CPU, the GPU, and the I/O complex. The CPU is based on AMD's "Zen 2" x86-64 microarchitecture, and the GPU is based on the company's upcoming RDNA2 graphics architecture.

There are eight "Zen 2" CPU cores, although the company didn't mention whether SMT is enabled. The maximum CPU clock speed is 3.50 GHz. The GPU is a whole different story from the one in the Xbox Series X semi-custom chip. Sony decided to go with 36 RDNA2 compute units ticking at up to 2.23 GHz engine clock, compared to 52 compute units running at up to 1.825 GHz on the upcoming Xbox. Sony's GPU ends up with up to 10.3 TFLOPs of peak compute throughput, compared to Microsoft's 12 TFLOPs.
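The TFLOPs numbers follow directly from the CU counts and clocks: each RDNA compute unit carries 64 stream processors, and each can retire one fused multiply-add (2 FLOPs) per cycle:

```python
def tflops(cus, clock_ghz, shaders_per_cu=64, flops_per_clock=2):
    """Peak FP32 throughput: CUs x shaders per CU x FLOPs per clock x clock."""
    return cus * shaders_per_cu * flops_per_clock * clock_ghz / 1000

print(f"PS5: {tflops(36, 2.23):.2f} TFLOPs")    # ~10.28
print(f"XSX: {tflops(52, 1.825):.2f} TFLOPs")   # ~12.15
```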

Sony also shed some "light" on the hardware-accelerated real-time ray-tracing approach AMD is taking with RDNA2. Apparently, each compute unit features a hardware component called "Intersection Engine," with roughly the same function as an RT core on NVIDIA "Turing," which is to calculate the intersection of rays with geometry (such as triangles or polygons) in a scene. This combines with a fairly standardized bounding volume hierarchy (BVH) model to achieve a hybrid of ray-traced elements in an otherwise conventional rasterized 3D scene (pretty much where NVIDIA is right now with RTX). On PlayStation 5, RDNA2's ray-tracing hardware is leveraged for positional audio, global illumination, shadows, reflections, and full ray-tracing. 

The third key component of the SoC is the I/O complex. This handles all of the chip's I/O: not just peripherals and video output, but also storage and memory. There are dedicated I/O co-processors on-silicon designed to offload the various I/O processing stacks from the CPU cores and reduce latencies at various stages. There's also a certain amount of SRAM that caches transfers between the various components of the I/O complex. The custom chip leverages AMD SmartShift for power management.

PlayStation 5 uses 16 GB of GDDR6 memory. Sony did not mention the memory clock, bandwidth, or even the memory bus width. It did drop some hints about memory management. It appears that PlayStation 5 does not partition memory the way Xbox Series X does, and possibly sticks to the hUMA model of the PlayStation 4 (a common pool of physical memory for system and video memory).
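For what it's worth, if the 256-bit bus reported elsewhere is accurate (Sony has not confirmed it, and the 14 Gbps per-pin speed below is likewise an assumption), the GDDR6 bandwidth arithmetic is straightforward:

```python
bus_width_bits = 256    # rumored bus width, not confirmed by Sony
pin_rate_gbps = 14      # assumed GDDR6 per-pin data rate

bandwidth_gbs = bus_width_bits / 8 * pin_rate_gbps  # bytes per transfer x rate
print(f"{bandwidth_gbs:.0f} GB/s")
```

That works out to 448 GB/s, the figure circulating in early reports.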

Lastly, a large chunk of Sony's presentation focused on the next frontier for hardware innovation: positional audio. Sony is investing heavily in positional audio that takes into account the gamer's individual HRTF (head-related transfer function). The company is leveraging the CPU power gained from the upgrade to "Zen 2" to achieve this.

We still don't know what a PlayStation 5 console will look like.

*View at TechPowerUp Main Site*


----------



## MxPhenom 216 (Mar 18, 2020)

Based on other reports, the memory is tied to a 256-bit bus.


----------



## ShurikN (Mar 18, 2020)

Well, now we definitely know which console is going to be cheaper.


----------



## R0H1T (Mar 18, 2020)

btarunr said:


> Sony decided to go with 36 RDNA2 compute units ticking at up to *2.23 GHz engine clock*, compared to 52 compute units running at up to 1.825 GHz on the upcoming Xbox. Sony's GPU ends up with up to 10.3 TFLOPs max compute throughput, compared to Microsoft's 12 TFLOPs.


Well that should settle the debate about the upcoming "big Navi" cause it's gonna be a beast w/better IPC, clocks, more CU, hardware RT & who knows 3d audio?


----------



## ppn (Mar 18, 2020)

Like a PlayStation 4 Pro on steroids: same core count, updated uArch, 3x faster.


----------



## Rahnak (Mar 18, 2020)

MxPhenom 216 said:


> Based on other reports. The memory is tied to a 256-bit bus.


And memory bandwidth is 448 GB/s.



R0H1T said:


> Well that should settle the debate about the upcoming "big Navi" cause it's gonna be a beast w/better IPC, clocks, more CU, hardware RT & who knows 3d audio?


I think the 3D audio is custom. There's a lot of custom hardware in the PS5, just like in the older ones.


----------



## P4-630 (Mar 18, 2020)

btarunr said:


> Each PlayStation 5 ships with an 825 GB SSD



Which reaches speeds of 5.5 GB/s without compression, and up to 8 to 9 GB/s compressed.
To expand storage you can use regular M.2 NVMe SSDs.

Also there's "3D Audio" support with the "Tempest Engine", which also works with a stereo setup.
(It's similar to the Synergistic Processor Unit Sony used in the PlayStation 3.)









> **PlayStation 5 gets a GPU with 36 CUs at 2.3 GHz and an 825 GB SSD that reaches 5.5 GB/s**
>
> Sony is equipping its PlayStation 5 with a GPU with 36 compute units. That's fewer than the 52 CUs of the Xbox Series X, but the GPU in the PS5 is clocked higher. Compute performance comes to 10.3 TFLOPs. Sony uses Zen 2 cores that reach up to 3.5 GHz.
>
> (translated from tweakers.net)


----------



## Ferrum Master (Mar 18, 2020)

R0H1T said:


> Well that should settle the debate about the upcoming "big Navi" cause it's gonna be a beast w/better IPC, clocks, more CU, hardware RT & who knows 3d audio?



You can scratch 3D audio. On Windows it is a pain in the arse. I predicted that HW audio would come back because of VR and low-latency needs, but the current state of the Windows kernel and PC architecture doesn't allow it. It really needs a custom design and tailored code, not a jack of all trades that runs on anything. Tough topic tbh.

Looks like the PS5 has more tailored things, especially the GPU cache coherency managers. And it will end up much cooler too imho.


----------



## R0H1T (Mar 18, 2020)

Rahnak said:


> I think the 3D audio is custom. There's a lot of custom hardware in the PS5, just like in the older ones.


I know I'm just hoping they bring something like *TrueAudio* back.



Ferrum Master said:


> It really *needs a custom design* and tailored code not jack of all trades that runs of anything. Tough topic tbh.


Yes a dedicated DSP would be nice, just like their previous solution.


----------



## TechLurker (Mar 18, 2020)

All in all, both the PS5 and Series X seem to be within ~10% of each other.

Xbox has more brute force power, PS5 has faster base speeds and throughput.

The Xbox will eke out a few more FPS in games that demand pure FPS, possibly at higher resolutions. Sony instead offers seamless transitions and more dynamic movement, going by how much they've hyped their storage setup and their earlier Spider-Man demo (and previous discussion of how character movement was limited by the read speeds of SSDs/HDDs).

The caveat though is that the Xbox seems redundant now that they're sharing releases with PC and thus is going for more of an all-in-one media center with their new Xbox (still a gaming powerhouse, but nevertheless overshadowed by parallel/near-parallel releases with PC), whereas Sony still gets some exclusives and still presents a gaming-first mindset. The ideal combo is looking to be PC + PS5. PC for all the new games releasing alongside Series X on top of an existing Steam/GoG/etc library, and PS5 for exclusives + media center functionality.

Now if only Sony would just release a PSP/Vita successor using AMD's APUs or embedded architecture and offer that seamless transition between on-the-go and at-home-gaming (start a game on one, pick up where you left off on the other, etc).


----------



## kings (Mar 18, 2020)

I highly doubt that this 2.23 GHz maximum boost on the GPU can be sustained for long, given the inherent cooling limitations that consoles have.

It gives the impression that it was stretched at the last minute, just to reach the double-digit 10 TF and not look much weaker than the Xbox Series X.


----------



## Rahnak (Mar 18, 2020)

kings said:


> I highly doubt that these 2.23Ghz maximum boost on the GPU will last for a long time, with the inherent cooling limitations that consoles have.
> 
> It gives the impression that it was stretched at the last minute, just to reach the double digit 10TF and not feel much weaker than the Xbox Series X.


Cerny said the variable frequency is based on workload, not thermal headroom. Cooling will be unveiled when they do the teardown.

Also, this presentation was recorded before the XSX announcement (Eurogamer, for example, saw it a couple of days ago), so a last-minute bump in frequency is very unlikely. He did say "we" would be happy with their cooling solution, for whatever that's worth.


----------



## shk021051 (Mar 18, 2020)

series x is my choice


----------



## TheLostSwede (Mar 18, 2020)

Same, yet so different. It's strange that Sony has put so much importance on storage and the SSD, while having what appears to be a much weaker GPU compared to Microsoft. I guess it's possible that Sony's API (Vulkan?) is more efficient than DirectX, so they don't need as much raw GPU power?
Not a console gamer, but the hardware is intriguing nonetheless. We've gone from fairly fixed-usage devices to what can only be described as one high-end-ish PC running Windows and another running Linux.


----------



## sepheronx (Mar 18, 2020)

I am very curious how much this will be priced at.

With how much power is being proclaimed by Microsoft and Sony regarding their GPUs and CPUs, it's hard to imagine they can release these at even a reasonable price. But if RDNA2 is that good, then I am glad I am hanging onto my GTX 1070 till those GPUs come out.


----------



## R0H1T (Mar 18, 2020)

TheLostSwede said:


> I guess it's possible that Sony's API (Vulcan?) is more efficient than DirectX, so they don't need as much raw GPU power?


Linux itself is much better with multiple cores than Windows; if MS hasn't heavily tweaked the Xbox Series X scheduler then they'll be wasting a lot of Zen 2 power.


----------



## Turmania (Mar 18, 2020)

Good times for console industry. I never bought a PS but seriously considering my stand on it for PS5.


----------



## Rahnak (Mar 18, 2020)

TheLostSwede said:


> Same, yet so different. It's strange that Sony has put so much importance on storage and the SSD, while having what appears to be a much weaker GPU, compared to Microsoft. I guess it's possible that Sony's API (Vulcan?) is more efficient than DirectX, so they don't need as much raw GPU power?
> Not a console gamer, but the hardware is intriguing none the less. We've gone from fairly fixed usage devices to what can only be described as one high-end-ish PC running Windows and another running Linux.



Cerny gave a simple explanation on why they went with fewer CUs at higher frequency as opposed to more CUs at a lower frequency. Quoting Eurogamer, quoting Cerny:



> Not wishing to draw comparisons with any existing hardware past, present or future, Cerny presents an intriguing hypothetical scenario - a 36 CU graphics core running at 1GHz up against a notional 48 CU part running at 750MHz. Both deliver 4.6TF of compute performance, but Cerny says that the gaming experience would not be the same.
> 
> "Performance is noticeably different, because 'teraflops' is defined as the computational capability of the vector ALU. That's just one part of the GPU, there are a lot of other units - and those other units all run faster when the GPU frequency is higher. At 33 per cent higher frequency, rasterisation goes 33 per cent faster, processing the command buffer goes that much faster, the L1 and L2 caches have that much higher bandwidth, and so on," Cerny explains in his presentation.
> 
> ...



Price might've been a factor as well, dunno.
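Cerny's hypothetical is easy to verify with rough numbers: both configurations have identical peak compute, yet everything tied to the clock (rasterizers, command processor, cache bandwidth) runs a third faster on the narrow, fast part (the 64 shaders per CU and 2 FLOPs per clock are standard RDNA assumptions):

```python
def tf(cus, ghz):
    # peak FP32: CUs x 64 shaders x 2 FLOPs (FMA) per clock
    return cus * 64 * 2 * ghz / 1000

narrow_fast = tf(36, 1.0)                # 4.608 TF
wide_slow = tf(48, 0.75)                 # 4.608 TF, identical compute
clock_gain_pct = (1.0 / 0.75 - 1) * 100  # fixed-function speedup from clock alone

print(f"{narrow_fast:.3f} TF vs {wide_slow:.3f} TF, fixed-function +{clock_gain_pct:.0f}%")
```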


----------



## ppn (Mar 18, 2020)

I'm waiting for the Pro version of it. Having the main I/O chip on 12 nm, and a separate GPU and CPU, means that whenever a 5 nm shrink is available Sony will integrate it very quickly. The XSX can't do that, it is stuck on 7 nm.


----------



## Rahnak (Mar 18, 2020)

ppn said:


> Im waiting for the Pro version of it. having this main IO chip on 12nm, and separate GPU CPU, means that whenever 5nm shrink of is available sony will integrate it very quickly. XSX can't do that, it is stuck on 7nm.


I sincerely hope there are no other versions this time around. The Pro version's sole reason for existing was the mass adoption of 4K TVs. I don't see 8K becoming mainstream in the next 5-7 years. Hopefully.

Not counting the usual "slim" versions, of course.


----------



## kings (Mar 18, 2020)

Sony may be playing the price card as it did with the PS4. If they wanted something more powerful, they would have it; they have access to the same AMD capabilities that Microsoft has.

Anyway, it is far from being a weak console: a 10 TF GPU, together with a CPU that makes Jaguar look like it's from the last century, plus a super-fast SSD, will give a fantastic experience.

With these specs, I'm starting to think they are probably aiming for $399. And at this price, I have no doubt that it will be a success. I don't see the Xbox Series X being sold for less than $500-$550.


----------



## Rahnak (Mar 18, 2020)

399 again would be sweet. But there was that rumor a while ago saying they were having trouble keeping costs under $450, I think it was. Given the custom hardware they're putting on it, I wouldn't be surprised if that were true. Then again, Sony has sold consoles at a loss before.


----------



## Prince Valiant (Mar 18, 2020)

R0H1T said:


> I know I'm just hoping they bring something like *TrueAudio* back.
> 
> Yes a dedicated DSP would be nice, just like their previous solution.


I'd like to see that too. The general attitude toward audio in games would need to change. When TA was announced there were a number of articles along the lines of "do we need this".


----------



## matar (Mar 18, 2020)

Xbox Series X is the clear winner... 10.3 TFLOPs max compute throughput for Sony, compared to Microsoft's 12 TFLOPs. Also, Xbox software tends to use the hardware better than Sony's; we saw it with the old 360, where Sony had better hardware yet the 360 ran games smoother. Now the new Xbox has better hardware, and combined with better software = super smooth gameplay.


----------



## oxrufiioxo (Mar 18, 2020)

TheLostSwede said:


> Same, yet so different. It's strange that Sony has put so much importance on storage and the SSD, while having what appears to be a much weaker GPU, compared to Microsoft. I guess it's possible that Sony's API (Vulcan?) is more efficient than DirectX, so they don't need as much raw GPU power?
> Not a console gamer, but the hardware is intriguing none the less. We've gone from fairly fixed usage devices to what can only be described as one high-end-ish PC running Windows and another running Linux.




Slightly disappointed after the X reveal, but considering Sony is more likely to target this hardware as a baseline, I still think Sony first-party games may be more impressive... It will be interesting to see what general consumers do now that basically all multiplayer games are cross-play; if the X hits higher framerates or higher resolutions it may sway people towards it.


----------



## ARF (Mar 18, 2020)

matar said:


> Xbox series X is the clear Winner... 10.3 TFLOPs max compute throughput, compared to Microsoft's 12 TFLOPs , Also Xbox software tends to use the headwear better then sony we have seen in in old 360 where sony had better hardware but yet 360 ran games smother and now we have both on the NEW Xbox better hardware and combine it with better software = super smooth game play.



You can't know this. The Xbox has 10 GB of VRAM + 6 GB of system RAM, while normal PCs **today** go with 16 GB system RAM and 8 GB VRAM.
It will be particularly interesting to see these consoles in 3-4 years, when games become more demanding on hardware resources.

12 vs 10.3 is not much of a difference, especially when you have FPS locked at 60 or so.


----------



## Vya Domus (Mar 18, 2020)

> The GPU is a whole different story from the one on the Xbox Series X Velocity Engine semi-custom chip. Sony decided to go with 36 RDNA2 compute units ticking at up to 2.23 GHz engine clock



Given that the X Series GPU also runs at a pretty high frequency, this leads me to believe RDNA2 is seriously more power efficient than anything else right now.


----------



## ARF (Mar 18, 2020)

Vya Domus said:


> Given that the X series GPU also runs at a pretty high frequency, this leaves me to believe RDNA2 is seriously more power efficient than anything else right know.



Not just that. It's 50% more power efficient; what's striking is that the process allows an AMD architecture to go that high in frequency. Normally, AMD has always lagged behind NVIDIA in the frequency department.


----------



## Space Lynx (Mar 18, 2020)

Doesn't matter if the Xbox Series X is more powerful; Sony always has better exclusives and more of them, plus there are a lot of Sony exclusives in my backlog.


----------



## oxrufiioxo (Mar 18, 2020)

R0H1T said:


> Linux itself is much better with multiple cores than Windows, if MS hasn't heavily tweaked the XB series X scheduler then they'll be wasting a lot of zen2 power.



Considering what they've been able to get out of the One X and its garbage Jaguar cores, I'm guessing it's much better than Windows.


ARF said:


> You can't know this. Xbox has 10 GB VRAM + 6 GB system RAM, while normal PCs go ****today**** with 16 GB system RAM and 8 GB VRAM.
> It will be particularly interesting to see these consoles in 3-4 years when the games will become more demanding for hardware resources.
> 
> 12 vs 10.3 is not much of a difference, especially when you have a locked FPS at 60 or so.




The major difference is that 12 is the minimum with the Xbox and 10.3 is the maximum with the PS5, due to Sony in their infinite wisdom going with a variable boost clock (which for a console doesn't make sense)... likely to make the GPU look closer in spec. Also, the Xbox's 10 GB of VRAM seems to run significantly faster than Sony's setup, by around 100 GB/s.

Out of these two systems I will be buying the PS5, but I would feel a lot better about it if the specs were reversed. I personally don't care about the cost.


----------






## TheGuruStud (Mar 18, 2020)

What I want to know is: what is the IPC gain on Zen 2 with this RAM? Zen is fairly memory bound...


----------



## Steevo (Mar 18, 2020)

R0H1T said:


> Well that should settle the debate about the upcoming "big Navi" cause it's gonna be a beast w/better IPC, clocks, more CU, hardware RT & who knows 3d audio?


Since it's also a CPU die, it probably gets the benefits of SOI production vs. standard bulk silicon production. I wonder if they are moving their GPUs to the same production method.


----------



## IceShroom (Mar 18, 2020)

A 12-channel SSD controller in a console!!


----------



## HD64G (Mar 18, 2020)

RDNA2 is much more efficient, but also clocks faster than any GPU to date. Great signs for Big Navi indeed.


----------



## ARF (Mar 18, 2020)

IceShroom said:


> 12 Channel SSD controler on console!!



Likely it will use the SSD space as some type of lower-level memory, loading textures directly from the SSD during gameplay?

The NVMe standard pales in comparison; as always, PC users get less.


----------



## Valantar (Mar 18, 2020)

Rahnak said:


> Cerny gave a simple explanation on why they went with fewer CUs at higher frequency as opposed to more CUs at a lower frequency. Quoting Eurogamer, quoting Cerny:
> [snip]
> Price might've been a factor as well, dunno.


That explanation sure has the sound of someone really wanting their inferior solution to look better than it is. Sure, caches and other things will run faster, but the amounts, bandwidth etc. will also be lower. Adding more CUs also adds more cache, and pretty much every "advantage" he extols here is countered by building a wider GPU, at least one that's also backed by solid caches and VRAM.

Also, compared to the XSX's fixed clocks, variable clocks (even if they're using AMD's clever SmartShift to allocate power between the CPU and GPU) are a poor fit for a console. And it's not like it will matter much when the maximum boost numbers for both parts are lower than the competition's; all that tells us is that even in a best-case scenario for the PS5, it's still slower across the board in _both_ CPU and GPU power.

Even if the presentation was filmed a few days ago, I am still tempted to think the PS5 engineering team spent the past 48 hours furiously overclocking their APU with the marketing team breathing down their necks.

That SSD looks like a beast though; very interested in how good a job it does compensating for the VRAM bandwidth deficit. Also very interested in how they will certify M.2 drives for add-on storage, considering no M.2 drive on the market comes close to those numbers in sustained performance.



ARF said:


> Likely it will use the SSD space as some type of lower level memory, loading textures directly from the SSD during gameplay?
> 
> NVMe standard pales in comparison, as always the PC users get less


I agree that this drive is bonkers, but "as always PC users get less"? What on earth are you on about? The last time consoles were faster/better than PCs was in the mid-to-late 90s...


----------



## ARF (Mar 18, 2020)

Valantar said:


> Even if the presentation was filmed a few days ago I am still tempted to think the PS5 engineering team spent the past 48 hours furiously overclocking their APU with the marketing team breathing down their necks



This is funny.

I meant "The secret sauce here is that Sony is using its own protocol instead of NVMe, *in supporting 6 data priority tiers versus 2 on NVMe*."


----------



## oxrufiioxo (Mar 18, 2020)

Valantar said:


> I agree that this drive is bonkers, but "as always PC users get less"? What on earth are you on about? The last time consoles were faster/better than PCs were in the mid-to-late 90s ...




Technically the PS3 and Xbox 360 had features that didn't exist on PC yet... the PS3 with the CPU and the Xbox with the GPU.


----------



## ARF (Mar 18, 2020)

oxrufiioxo said:


> Technically the PS3 and Xbox 360 had features that didn't exist on pc yet.... PS3 with the CPU and Xbox with the GPU.



On PC, a SATA SSD is faster in gaming than a PCIe NVMe SSD. Anyone care to explain why, and why the console gets virtually no loading times right now?

PCs also don't get other things, but those are software related, like upscaling for instance.

Another question:
Variable Rate Shading: how much performance will it give on its own?


----------



## Tomgang (Mar 18, 2020)

Looks like some pretty beefy specs for a console. But given how the world situation is right now with the virus and the economic impact, and probably for some time to come, I don't think I will be a PS5 owner within the next year or so.


----------



## oxrufiioxo (Mar 18, 2020)

ARF said:


> On PC, a SATA SSD is faster in gaming than a PCIe NVMe SSD. Anyone care to explain as to why and why the console gets virtually no loading times right now?
> 
> PCs also don't get other things, but they are software related like upscaling for instance.
> 
> ...




I think you quoted the wrong person... but my guess is the software isn't coded to take advantage of the extra speed, sorta like games currently being fine at around 12 to 16 threads and not benefiting from 24 or 36 to the same extent, at least when it comes to 1% lows.

I personally still think Sony put way too much money into the storage system and should have focused on sustained clocks for the CPU/GPU, or a wider GPU.


----------



## ARF (Mar 18, 2020)

Valantar said:


> I agree that this drive is bonkers, but "as always PC users get less"? What on earth are you on about? The last time consoles were faster/better than PCs were in the mid-to-late 90s ...



Well, look at this gameplay comparison between a PS4 and a Ryzen 5 1600 / GTX 1060 / 16 GB RAM PC.
There is no difference in quality, while we can assume that the PC is (much) faster, no?

oxrufiioxo said:


> I think you quoted the wrong person...... but my guess is software isn't coded to take advantage of the extra speed sorta like games currently being fine at around 12 to 16 threads and not benefiting from 24 or 36 to the same extent at least when it comes to 1% lows.
> 
> I personally still think Sony put way too much money into the storage system and should have focused on sustained clocks for the CPU/GPU or a wider GPU.



Well, I only wanted to agree with what you've just said and continue the discussion by adding to it.
You are not the wrong person.


----------



## Rahnak (Mar 18, 2020)

Valantar said:


> Even if the presentation was filmed a few days ago I am still tempted to think the PS5 engineering team spent the past 48 hours furiously overclocking their APU with the marketing team breathing down their necks


But they filmed it before that.

Concerning the variable clocks, I don't mind them because (a) they're not based on thermals, so everyone gets the same performance regardless, and (b) they keep cooling noise acceptable.
One example he gave for lowering the CPU clock is when it's executing a lot of 256-bit instructions, because maintaining the clocks there would require a bigger power supply and fan.


----------



## Vya Domus (Mar 18, 2020)

Valantar said:


> That explanation sure has the sound of someone really wanting their inferior solution to look better than it is. Sure, caches and other things will run faster, but the amounts, bandwidth etc. will also be lower. Adding more CUs also adds more cache etc., and pretty much every "advantage" he extolls here is countered by building a wider GPU, at least one that's also backed by solid caches and VRAM.



He's not wrong; adding more CUs adds more cache, however it's not that simple. In a GPU, each CU has some cache dedicated to it; if you add more CUs, each one still has access to the same amount of cache, so you'll still encounter the same limitations in memory-bound situations.

A GPU with fewer CUs and higher clocks will perform better in memory-bound situations. And it's not uncommon for some parts of a shader to contain scalar code, which will also run better.

Not that it matters much in this case, because the PS5's GPU isn't equivalent TFLOP-wise to the one in the X.



ARF said:


> Well, look at this gameplay comparison between PS4 and Ryzen 5 1600/ GTX 1060/ 16 GB RAM.



I can pull up hundreds of examples where games look noticeably better on such a PC; just browse Digital Foundry's YT channel and you'll find plenty of them. And we both know RDR2 was an unusually poor PC port.


----------



## oxrufiioxo (Mar 18, 2020)

Rahnak said:


> But they filmed it before that.
> 
> Concerning the variable clocks, I don't mind it because a) it's not based on thermals, so everyone gets the same performance regardless and b) it's to keep cooling noise acceptable.
> One example he gave for lowering the cpu clock is when it's executing a lot of 256-bit instructions because to maintain the clocks they would need to increase the size of the power supply and fan.




I'm guessing they've known for quite a while that the Xbox was more powerful... I'm pretty sure this has more to do with them trying to keep a normal form factor. I still think it doesn't make a whole lot of sense in a console, as it forces developers to target a lower level of performance to make sure all PS5 games perform the same in all environments.

Apparently Microsoft tested their system in the desert to make sure it would sustain its clocks.

At the same time, these people are much smarter than me, so I'm sure their reasons are valid.


----------



## dirtyferret (Mar 18, 2020)

This changes everything!!!


----------



## Rahnak (Mar 18, 2020)

oxrufiioxo said:


> I'm guessing they've know for a quite a while the Xbox was more powerful.... I'm pretty sure this has more to do with them trying to keep a normal form factor. I still think in a console it doesn't make a whole lot of sense as it forces developers to target a lower level of performance to make sure all PS5 games perform the same in all environments.


Yeah, I'm sure they all have their inside information and knew all about each other months in advance. I'm not sure what you mean by "it forces developers to target a lower level of performance to make sure all PS5 games perform the same in all environments"?



oxrufiioxo said:


> Apparently Microsoft tested their system in the desert to make sure it would sustain its clocks.


Well, the design does seem more efficient at cooling, and the fan is much larger than what it used to be in past consoles. I just hope it maintains acceptable noise levels. I keep my console on my desk.



oxrufiioxo said:


> At the same time these people are much smarter than me so I'm sure their reason are valid.


Yeah, I'm sure both have done a ton of research that led to their decisions.

I don't really care much either way, since if I were to buy any console, it would be this one, because of the exclusives.


----------



## mechtech (Mar 18, 2020)

Rahnak said:


> 399 again would be sweet. But there was that rumor a while ago saying they were having trouble keeping costs under $450, I think it was. Given the custom hardware they're putting on it, I wouldn't be surprised if that were true. Then again, Sony has sold consoles at a loss before.



16 GB of GDDR6 RAM, what's the current market price on that? GPU, CPU, PSU, controllers, SSD size? etc.

Random guess: $550 US at least.


----------



## oxrufiioxo (Mar 18, 2020)

Rahnak said:


> I don't really care much either way, since if I were to buy any console, it would be this one, because of the exclusives.




Same here, PS5 regardless for me... I personally just like the route Microsoft took at a hardware level. At the end of the day only one of them plays PlayStation exclusives, so that will be my choice.


Sony going with variable clocks, at least from my point of view, means developers can't depend on those max clocks in all scenarios, whether that be power draw or someone sitting in a 30°C room vs a 20°C room.

It adds a variable to development that doesn't exist on the other console.


----------



## ARF (Mar 18, 2020)

mechtech said:


> 16GB GDDR6 ram, whats the current market price on that?  GPU, CPU, psu, controllers, SSD size? etc.
> 
> Random guess $550 US$ at least




GPU and CPU are one chip, it's an APU. No separate system RAM, only the GDDR6 chips and a large SSD.

But I agree, Sony needs healthy profit margins because it's not sustainable to sell at a loss all the time. They have other struggling divisions, too.

PS4 was much inferior technologically, so yes, it was quite normal to be cheaper.


----------



## rutra80 (Mar 18, 2020)

So, Xbox is high throughput and PS is low latency...


----------



## Fluffmeister (Mar 18, 2020)

It's hard to get too excited about consoles that won't hit the market until the end of the year, but good to see both ray tracing and VRS are a thang in all our futures.


----------



## Assimilator (Mar 18, 2020)

Anyone know when Nintendo will complete the trifecta of trash that will hobble games for the next decade?



ARF said:


> On PC, a SATA SSD is faster in gaming than a PCIe NVMe SSD.



wat


----------



## ARF (Mar 18, 2020)

Assimilator said:


> wat


----------



## LAN_deRf_HA (Mar 18, 2020)

P4-630 said:


> To expand storage you can use regular m2-nvme-ssd's.



It's about the farthest thing from a regular M.2. You'll need some currently non-existent Gen 4 SSD that's faster than the internal drive, and it will have to sustain that performance. Gen 4 drives have been terrible at sustained performance so far. I doubt the new wave of Gen 4 controllers will be any better at that.


----------



## Totally (Mar 18, 2020)

ARF said:


> View attachment 148468



When the split is a couple of seconds (5%), does it matter in the context of gaming? Those seconds saved don't really add up in a meaningful way over time.


----------



## seronx (Mar 18, 2020)

But, will the PS5 have Knack 3?  All this hardware and no Knack 3?!


----------



## Flanker (Mar 19, 2020)

TheLostSwede said:


> Same, yet so different. It's strange that Sony has put so much importance on storage and the SSD, while having what appears to be a much weaker GPU, compared to Microsoft. I guess it's possible that Sony's API (Vulcan?) is more efficient than DirectX, so they don't need as much raw GPU power?


DirectX 12 and Vulkan are practically identical; PlayStations have used their own GNM, which is equally low-level. I'm guessing Sony considers latency and loading times an important part of the gaming experience, at least more than Microsoft does.


----------



## Rahnak (Mar 19, 2020)

oxrufiioxo said:


> Sony going with variable clocks at least from my point of view means developers can't depend on those max clocks in all scenarios whether that be power draw or someone sitting in a 30c room vs a 20c room.
> 
> it adds a variable to development that doesn't exist on the other console.


Oh, I see. Cerny said temperature won't be a factor in the variable frequency, only workload, so different ambient temperatures will get the same performance. And he didn't expect clocks to drop by much, in any case.



			
Eurogamer said:

> It's really important to clarify the PlayStation 5's use of variable frequencies. It's called 'boost' but it should not be compared with similarly named technologies found in smartphones, or even PC components like CPUs and GPUs. There, peak performance is tied directly to thermal headroom, so in higher temperature environments, gaming frame-rates can be lower - sometimes a lot lower. This is entirely at odds with expectations from a console, where we expect all machines to deliver the exact same performance. To be abundantly clear from the outset, PlayStation 5 is _not_ boosting clocks in this way. According to Sony, _all PS5 consoles process the same workloads with the same performance level in any environment, no matter what the ambient temperature may be_.
> 
> So how does boost work in this case? Put simply, the PlayStation 5 is given a set power budget tied to the thermal limits of the cooling assembly. "It's a completely different paradigm," says Cerny. "Rather than running at constant frequency and letting the power vary based on the workload, we run at essentially constant power and let the frequency vary based on the workload."
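That "constant power, variable frequency" paradigm can be sketched as a toy governor. To be clear, the cubic power model and every number below are my own illustrative assumptions, not Sony's actual algorithm:

```python
def boost_clock(est_power_at_fmax, power_budget, f_max):
    """Pick the highest clock whose estimated power fits a fixed budget.

    Assumes dynamic power scales roughly with f^3 (P ~ f * V^2, with
    voltage scaling alongside frequency). Real silicon would use
    calibrated activity counters, not this simplification.
    """
    if est_power_at_fmax <= power_budget:
        return f_max  # light workload: full clocks fit within the budget
    # P ~ f^3, so the fitting clock is f_max * (budget / P_max) ** (1/3)
    return f_max * (power_budget / est_power_at_fmax) ** (1 / 3)

# A workload drawing 10% over budget only costs ~3% of clock speed,
# in line with the "small frequency drop, big power saving" framing.
print(boost_clock(220.0, 200.0, 2.23))  # ~2.16 (GHz)
```

The cubic relationship is the whole trick: because power falls off much faster than frequency, the governor can stay near peak clocks almost all the time and shed lots of power with tiny frequency dips.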


----------



## eidairaman1 (Mar 19, 2020)

Well Here comes RDNA RT to PC...


----------



## gamefoo21 (Mar 19, 2020)

Bah!

Bah!

Apple style SSDs are evil douche moves.


----------



## dicktracy (Mar 19, 2020)

RTX 2080 is now the lowest level of performance @ cheap prices from here on out. I wouldn't buy any graphics card today that doesn't exceed that performance with RT capability. Facts.


----------



## Athena (Mar 19, 2020)

Funny thing is, AMD had 3D audio a LONG time ago... kicker is, hardly ANYBODY EVER USED IT!

It was called AMD TrueAudio. From back in 2013!

Then the newer one was called TrueAudio Next.

A new version of TrueAudio, TrueAudio Next, was released with the AMD Radeon 400 series GPUs. TrueAudio Next utilizes the GPU to simulate audio physics.


----------



## Space Lynx (Mar 19, 2020)

oxrufiioxo said:


> Same here PS5 regardless for me... I personally just like the route at a hardware level Microsoft took. At the end of the day only one of them plays Playstation exclusives so that will be my choice.
> 
> 
> Sony going with variable clocks at least from my point of view means developers can't depend on those max clocks in all scenarios whether that be power draw or someone sitting in a 30c room vs a 20c room.
> ...



The variable clock also means we are open to stuttering and screen tearing. One benefit consoles have over PC is that everything is a smooth, solid experience, until now anyway. I'm leaning towards not buying any console and just getting an RTX 3080 Ti and going balls to the wall PC.


----------



## R0H1T (Mar 19, 2020)

mechtech said:


> 16GB GDDR6 ram, whats the current market price on that? GPU, CPU, psu, controllers, SSD size? etc.
> 
> Random guess $550 US$ at least


Sony can't sell this at anything below $500 (or just a cent under that) unless they're willing to take massive losses on the BOM. The SSD itself is top of the line & will cost a pretty penny.


----------



## k3wld00d1 (Mar 19, 2020)

The big difference is 36 CUs on the PS5 vs 52 CUs on the Series X.


----------



## lexluthermiester (Mar 19, 2020)

ShurikN said:


> Well, now we definitely know which console is going to be cheaper.


And which one is going to kick the other in the goolies. PS5 looks like the winner on paper.


----------



## k3wld00d1 (Mar 19, 2020)

ARF said:


> You can't know this. Xbox has 10 GB VRAM + 6 GB system RAM, while normal PCs go ****today**** with 16 GB system RAM and 8 GB VRAM.
> It will be particularly interesting to see these consoles in 3-4 years when the games will become more demanding for hardware resources.
> 
> 12 vs 10.3 is not much of a difference, especially when you have a locked FPS at 60 or so.


Do you really believe the PS5 GPU will run at a 2.23 GHz core clock? I don't. The PS5 is more like 9 TF in practice. I'm buying an XSX.


----------



## notb (Mar 19, 2020)

R0H1T said:


> Sony can't sell this at anything below $500 (or just a cent under that) unless they're willing to take massive losses on the BOM. The SSD itself is top of the line & will cost a pretty penny,


Since it's a "custom SSD", Sony can always replace it with something cheaper. As long as it offers the same interface and performance - no one is going to complain.
And it's still almost a year until these consoles hit the shelves.

The expensive 7nm CPU/GPU is what really pushes the price up - and the impact is and will remain larger in the Xbox.


----------



## lexluthermiester (Mar 19, 2020)

k3wld00d1 said:


> I'm buying an XSX.


Ok cool dude, have fun with that...


notb said:


> The expensive 7nm CPU/GPU is what really pushes the price up - and the impact is and will remain larger in the Xbox.


That remains to be seen on both points..


----------



## ratirt (Mar 19, 2020)

So Xbox is faster. Maybe Sony is making room for PS5 Pro at some point? Never underestimate the marketing


----------



## ppn (Mar 19, 2020)

It's a certainty. Expect 5 nm and 3 nm shrinks every 2-3 years, PS5 Pro and Slim models, with up to 4608 of the 5120 total shaders enabled. PS6 in 2027.


----------



## Rahnak (Mar 19, 2020)

k3wld00d1 said:


> do you really believe the PS5 gpu running at 2.23ghz core clock. I don't believe it. PS5 actual 9TF. I'm buying an XSX.


Yes, because lying to everyone and then being found out during early reviews would be a great idea. /s


----------



## Jism (Mar 19, 2020)

I wonder why Sony chose an 825 GB SSD while MS has a 1 TB model. Is it perhaps due to overprovisioning, with Sony wanting the SSD to have a longer life than Microsoft does?

Many of the tech details are just AMD IP. A Zen+ chip with a RDNA2 feature set GPU. Nothing special.

But it's good for AMD as well; it's bound to sell millions of consoles with their hardware inside. The whole gaming ecosystem will be based on AMD hardware.


----------



## Rahnak (Mar 19, 2020)

Jism said:


> I wonder why sony choosen for a 825GB model SSD while MS has a 1TB model. Is it perhaps due to overprovisioning and sony wanting to have the SSD a longer life then Microsoft wants?


It was just a better fit for their custom 12 channel controller without being too expensive.


----------



## Valantar (Mar 19, 2020)

ARF said:


> Well, look at this gameplay comparison between PS4 and Ryzen 5 1600/ GTX 1060/ 16 GB RAM.
> There is no difference in the quality. While we can assume that the PC is (much) faster, no?


All that shows is that Rockstar's PC ports are still garbage. There are plenty of comparisons like this showing vast differences.



Rahnak said:


> But they filmed it before that.
> 
> Concerning the variable clocks, I don't mind it because a) it's not based on thermals, so everyone gets the same performance regardless and b) it's to keep cooling noise acceptable.
> One example he gave for lowering the cpu clock is when it's executing a lot of 256-bit instructions because to maintain the clocks they would need to increase the size of the power supply and fan.


Congratulations on missing the most obvious joke ever, I guess? I mean, what you're saying is right there in what you quoted from me.


Valantar said:


> Even if the presentation was filmed a few days ago


Doesn't make that image any less funny IMO.

Beyond that, that it doesn't thermal throttle is nice, but "it's to keep cooling noise acceptable" is BS - if that was the case, they wouldn't be pushing GPU clocks to 2.25GHz in the first place. Designing for a fixed power target instead of a fixed clock target is perfectly viable, but it is an approach that sacrifices some performance.

I also find it rather telling that they're using SmartShift - while it's a brilliant little piece of tech, it's first and foremost designed for thermally constrained laptops (where the heat dissipation capability of the cooling system is lower than what the hardware can produce if left to run free), so using it here clearly indicates that they expect there to be a need for balancing power within a fixed total budget. In other words, the hardware could do more if it just had better cooling and power delivery.



Vya Domus said:


> He's not wrong, adding more CUs adds more cache however it's not that simple. In a GPU each CU has some cache dedicated to it, if you add more CUs they still have access to the same amount of cache. You'll still encounter the same limitations in memory bound situations.
> 
> A GPU with less CUs and higher clocks will perform better in memory bound situations. And it's not uncommon that some parts of a shader contain scalar code which will also run better.
> 
> Not that it matters much in this case because PS5's GPU isn't equivalent TFLOP wise with the one in X.


Well, sure, I didn't mean to say there were no advantages, just that they are mostly tiny and more than counteracted by having a wider GPU with more total resources. Not to mention that the 256-bit PS5 is far more likely to be memory-bound than the 320-bit XSX.


Rahnak said:


> Yeah, I'm sure they all have their inside information and knew all about the other months in advance. I'm not sure what you mean by "it forces developers to target a lower level of performance to make sure all PS5 games perform the same in all environments"?


The PS5 has less GPU horsepower, and that power is dependent on avoiding power spikes, so developers will (at least until they become intimately familiar with the system) need to cut down their targets a bit to maintain acceptable performance.

I would think the XOX vs. PS4 Pro is a reasonable analogy (even if the difference between them is bigger than the difference here): the PS4 Pro consistently runs cross-platform games at lower resolutions and detail levels, and often _still_ struggles to match the frame rates of the XOX. Just check out pretty much any of Digital Foundry's excellent comparison videos (Jedi: Fallen Order springs to mind as a stand-out that I watched).


ARF said:


> GPU and CPU is one chip, it's an APU. No RAM, only the GDDR6 chips and large SSD.


... GDDR6 is still RAM ...


ARF said:


> But I agree, Sony needs healthy profit margins because it's not sustainable to sell at a loss all the time. They have other struggling divisions, too.
> 
> PS4 was much inferior technologically, so yes, it was quite normal to be cheaper.


Consoles are generally sold at break-even or at a loss, with profits made on game licencing. Any game sold for PS4 or Xbox One comes with a $10 licence fee to the platform owner (though I believe this is slightly lower for cheap indies etc.). That's why console games are more expensive and the hardware is much cheaper when compared to PC.

As for the PS4 being technologically inferior ... to what? It soundly beat the XBone. PS4 Pro vs. Xbox One X is another story entirely, but you didn't say Pro.


Assimilator said:


> Anyone know when Nintendo will complete the trifecta of trash that will hobble games for the next decade?


No reason to expect consoles with loads of fast memory, fast 8c16t CPUs, native flash storage and very powerful RT-enabled GPUs to hobble anything for quite a while. Jaguar was crap even back in 2012-13, Zen2 is state of the art - and they haven't even cut clocks much! These consoles are _far_ superior to the average gaming PC in pretty much every respect (remember, the average PC has a 4c8t CPU and a GTX 1060) and will allow for massive growth in the quality of games moving forward, including sorely missed improvements in CPU-bound tasks, audio, physics, AI, etc.


Athena said:


> Funny thing is, AMD had 3D audio a LONG time ago... kicker is, hardly NOBODY EVER USED IT!
> 
> It was called AMD TrueAudio. From back in 2013!
> 
> ...


Yeah, I'm really looking forward to an increased focus on audio on both consoles. And given that the processing is done on AMD hardware it wouldn't be too big of a stretch of the imagination to see it implemented on the PC through GPU accelerated audio either. True positional and spatial audio will be a massive boon to immersion for sure.


lynx29 said:


> the variable clock also means we are open to stuttering and screen tearing. one benefit consoles have over PC is that everything is a smooth solid experience, until now anyway. I'm leaning towards not buying any console and just getting a rtx 3080 ti and going balls to the wall PC


Yeah, developers need to work hard to avoid power draw spikes to keep clocks consistent and thus avoid this. Hopefully there's at least some leeway for short-term power spikes in the management system.


lexluthermiester said:


> And which one is going to kick the other in the goolies. PS5 looks like the winner on paper.


Care to expand on that? Not that I don't think it will be good, but it will definitely need to be cheaper (even if Sony has a massive mindshare advantage).


R0H1T said:


> Sony can't sell this at anything below $500 (or just a cent under that) unless they're willing to take massive losses on the BOM. The SSD itself is top of the line & will cost a pretty penny,


Flash isn't cheap, the controller likely isn't either, but it won't be that much more expensive than the competition (which has more flash).


notb said:


> Since it's a "custom SSD", Sony can always replace it with something cheaper. As long as it offers the same interface and performance - no one is going to complain.
> And it's still almost a year until these consoles hit the shelves.
> 
> The expensive 7nm CPU/GPU is what really pushes the price up - and the impact is and will remain larger in the Xbox.


Flash is flash, it costs what it costs. Prices will come down in time, but slowly. Only way to make the controller cheaper without losing performance is moving to a smaller node, which takes time unless you want to pay a premium (which obviously negates any savings).


k3wld00d1 said:


> do you really believe the PS5 gpu running at 2.23ghz core clock. I don't believe it. PS5 actual 9TF. I'm buying an XSX.


Judging by what they said, they have pushed clocks pretty much as far as they can go - a 10% drop in power for a 2-3% drop in performance was mentioned - so clocks will likely still stay quite high even when power limited. 2.1 GHz / 9.6 TFLOPS is likely entirely sustainable.
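For reference, the TFLOPS figures being argued about fall straight out of the standard shader-count arithmetic (clocks as quoted in this thread; 64 shaders per CU is the usual RDNA figure, assumed here):

```python
def tflops(cus, clock_ghz, shaders_per_cu=64):
    # Peak FP32 = shader count * 2 ops/cycle (FMA) * clock in GHz
    return cus * shaders_per_cu * 2 * clock_ghz / 1000.0

print(tflops(36, 2.23))   # PS5 at peak clock: ~10.28
print(tflops(36, 2.10))   # PS5 if sustained near 2.1 GHz: ~9.68
print(tflops(52, 1.825))  # XSX at its fixed clock: ~12.15
```

So even the pessimistic "sustained" case lands near 9.7 TF rather than 9 TF flat, while the XSX's fixed-clock advantage is roughly 18-25% on paper.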


Jism said:


> I wonder why sony choosen for a 825GB model SSD while MS has a 1TB model. Is it perhaps due to overprovisioning and sony wanting to have the SSD a longer life then Microsoft wants?
> 
> Many of the tech details are just AMD IP. A Zen+ chip with a RDNA2 feature set GPU. Nothing special.
> 
> But it's good for AMD in this as well; its bound to sell millions of consoles with their hardware inside of it. The whole gaming ecosystem will be based upon AMD hardware.


The odd size is due to the SSD controller having 12 flash channels instead of the "normal" 8 (or 4 for lower end drives). With flash chip capacities being the same you'll then end up with strange total capacities.
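The arithmetic works out if you assume 64 GiB of flash behind each of the 12 channels (the per-channel capacity is my assumption; Sony hasn't detailed the die layout):

```python
GiB = 2**30  # flash dies come in binary capacities
channels = 12
per_channel_gib = 64  # assumed flash capacity behind each channel

total_bytes = channels * per_channel_gib * GiB
print(total_bytes / 1e9)  # ~824.6 decimal GB, marketed as "825 GB"
```

The odd number is just binary-capacity dies reported in decimal gigabytes: 12 x 64 GiB = 768 GiB, which is about 824.6 GB once converted.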

The whole gaming ecosystem has been based on AMD hardware since the current console generation launched in 2013, but the difference is that it's now high-end hardware rather than low-end CPUs and mid-range GPUs. As such it'll push games further and promote propagation of advanced tech like spatial audio and RTRT. This is rather exciting, even if the underlying tech is for the most part known.


I have to say the clock rates of this chip makes me rather excited for upcoming AMD GPUs, though. If a console can hit 2.23GHz, PC GPUs should be able to exceed that, at least when OC'd. If the upcoming RDNA 2 GPUs run at 2-2.1GHz stock without being terribly inefficient, that's a big improvement for sure.


----------



## Jism (Mar 19, 2020)

I'm not worried about RDNA2. Navi was a perfect example of what the engineers at AMD could do on a first attempt; the second generation will only be better.

And I don't think the PS5 will be too weak. Some PS3s and PS4s could be overclocked by a simple firmware push; they did it with some games, raising the clocks. Yes, it generated more heat and thus more noise, but I'm sure there's plenty of headroom left in those things.


----------



## Space Lynx (Mar 19, 2020)

I tried playing Uncharted games on PS4, aiming with a controller is just an absolutely horrible experience after you have used mouse and keyboard for so long, another reason I will probably just stick with PC only.


----------



## ratirt (Mar 19, 2020)

lynx29 said:


> I tried playing Uncharted games on PS4, aiming with a controller is just an absolutely horrible experience after you have used mouse and keyboard for so long, another reason I will probably just stick with PC only.


Agree. I've played one game, a shooter; I don't remember what it's called, but I played for 5 minutes, and without a mouse and keyboard it's unplayable for me. If I were to get a PS5 or Xbox, I would still need a PC to play first-person shooters. If I could connect a mouse and keyboard to either the PS5 or Xbox, I would be OK.


----------



## Rahnak (Mar 19, 2020)

Valantar said:


> Congratulations on missing the most obvious joke ever, I guess? I mean, what you're saying is right there in what you quoted from me.


It seemed more like doubling down than a joke, my bad.



Valantar said:


> Beyond that, that it doesnt thermal throttle is nice, but "it's to keep cooling noise acceptable" is BS - if that was the case, they wouldn't be pushing GPU clocks to 2.25GHz in the first place. Designing for a fixed power target instead of a fixed clock target is perfectly viable, but it is an approach that sacrifices some performance.
> 
> I also find it rather telling that they're using SmartShift - while it's a brilliant little piece of tech, it's first and foremost designed for thermally constrained laptops (where the heat dissipation capability of the cooling system is lower than what the hardware can produce if left to run free), so using it here clearly indicates that they expect there to be a need for balancing power within a fixed total budget. In other words, the hardware could do more if it just had better cooling and power delivery.



But the whole point of making power constant and frequency variable was to keep cooling noise down; in other words, to keep the fan from spinning up as much in demanding games as it did in the previous gen (the worst offender in his graphic being GoW on the PS4 Pro). And the reason they went with 2.23 GHz on the GPU is that they came up with a cooling solution that can take it (or so he said; we haven't seen it yet), and from his speech it also seemed like something he really wanted for the new console.
The XSX is sporting a super beefy cooling solution, but because of the fixed clocks, its fan will probably have to ramp up and down accordingly (shouldn't be too loud with a 130 mm fan though).


----------



## Valantar (Mar 19, 2020)

ratirt said:


> Agree. I've played one game a shooter. I don't remember what it's called but I played for 5 min and without a mouse and a keyboard it's unplayable for me. If I were to get PS5 or XBOX, i would still need a pc to play First person shooter games. If you could connect a mouse and a keyboard to either PS5 or XBOX I would have been OK.


You can. Xbox One has kb+m support in any title where the developer has enabled and implemented it. AFAIK most competitive shooters avoid it though, as it would give too much of an advantage to kb+m users.


----------



## Vya Domus (Mar 19, 2020)

Rahnak said:


> And the reason they went with 2.23 Ghz on GPU is because they came up with a cooling solution that can take it (or so he said, we haven't seen it yet) and from his speech, it also seemed like something he really wanted for the new console.



It's not because of that; there are so many ways you can arrange a fan and a heatsink. They use a different way of scaling frequency: the 2.23 GHz for the GPU will likely be available only when the CPU is under a specific amount of load. Basically, either the CPU or the GPU can reach its highest clocks, but not both at once.

Nvidia had something somewhat similar in laptops, called platform boost or something like that; it's basically a way of managing clocks by taking into account the total power output of both CPU and GPU rather than each independently. For instance, in a situation where the CPU isn't doing much but the GPU is pegged, the total power output will be low, so it makes sense to give the GPU more power headroom.
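A minimal sketch of that shared-budget idea, SmartShift-style. The floors, demands, and the proportional split rule below are all illustrative assumptions on my part, not AMD's or Sony's actual policy:

```python
def split_budget(total_w, cpu_demand_w, gpu_demand_w,
                 cpu_floor_w=15.0, gpu_floor_w=60.0):
    """Split a fixed SoC power budget between CPU and GPU by demand.

    Each block is guaranteed its floor; the remaining watts are shared
    in proportion to how much each block wants above its floor.
    """
    spare = total_w - cpu_floor_w - gpu_floor_w
    cpu_extra = max(cpu_demand_w - cpu_floor_w, 0.0)
    gpu_extra = max(gpu_demand_w - gpu_floor_w, 0.0)
    want = cpu_extra + gpu_extra
    if want <= spare:                 # both fit: nothing throttles
        return cpu_floor_w + cpu_extra, gpu_floor_w + gpu_extra
    scale = spare / want              # over budget: scale extras down
    return cpu_floor_w + cpu_extra * scale, gpu_floor_w + gpu_extra * scale

# GPU-heavy frame: the idle-ish CPU's unused headroom flows to the GPU.
print(split_budget(200.0, 20.0, 160.0))
```

When combined demand exceeds the cap, both blocks get scaled back proportionally, so the total draw never exceeds the cooling assembly's limit.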

All in all PS5's custom chip seems much more advanced.


----------



## notb (Mar 19, 2020)

lynx29 said:


> I tried playing Uncharted games on PS4, aiming with a controller is just an absolutely horrible experience after you have used mouse and keyboard for so long, another reason I will probably just stick with PC only.


That's why console games offer aiming aids.


Valantar said:


> You can. Xbox One has kbm suport in any title where the developer has enabled and implemented it. AFAIK most competitive shooters avoid it though, as it would give too much of an advantage to kbm users.


Wouldn't it be better to just add "allow mouse and keyboard" setting in session setup?


----------



## Valantar (Mar 19, 2020)

notb said:


> That's why console games offer aiming aids.
> 
> Wouldn't it be better to just add "allow mouse and keyboard" setting in session setup?


It might be that some have the option, I don't play competitive shooters (and definitely not on console) so I have no idea.


Vya Domus said:


> It's not because of that, there is so many ways you can put a fan and a heatsink. They use a different way of scaling frequency, the 2.23 Ghz for the GPU will likely be available when the CPU is under a specific amount of load. Basically you can either have the CPU reach it's highest clocks or the GPU but not as much both.
> 
> Nvidia had something somewhat similar in laptops, it's called platform boost or something like that, it's basically a way of managing clocks by taking into account the total power output from both CPU and GPU and not independently. Because you can have for instance a situation when the CPU isn't doing much but the GPU is pegged, the total power output will be low so it makes sense to give the GPU more power headroom.
> 
> All in all PS5's custom chip seems much more advanced.


SmartShift is essentially what you describe just offloaded to a dedicated power controller in the Infinity Fabric which can do on-the-fly power balancing. Dell's upcoming G5 Special Edition has it too, though I bet it'll see more active use there than in the PS5 (as the PS5 is hopefully much less limited than a laptop).



lynx29 said:


> I tried playing Uncharted games on PS4, aiming with a controller is just an absolutely horrible experience after you have used mouse and keyboard for so long, another reason I will probably just stick with PC only.





ratirt said:


> Agree. I've played one game a shooter. I don't remember what it's called but I played for 5 min and without a mouse and a keyboard it's unplayable for me. If I were to get PS5 or XBOX, i would still need a pc to play First person shooter games. If you could connect a mouse and a keyboard to either PS5 or XBOX I would have been OK.


I have to disagree - kbm movement in 3rd person games looks so jarring and stupid I avoid it at all costs. Analogue control of movement far outstrips the value of precise aiming in titles like that. For first person shooters my answer would be different, but then Uncharted definitely isn't that. And these games are adventure games, not shooters, so the need for precise aiming is kept reasonably low. After buying the current gen consoles I definitely had a learning period getting used to using a controller for everything on them, but it didn't take that long for it to be perfectly fine (with the added benefit of the ability to use the controller for suitable games on my PC too). Some games are nearly unplayable with one or the other, but the vast majority are okay with both TBH.


----------



## ratirt (Mar 19, 2020)

Valantar said:


> I have to disagree - kbm movement in 3rd person games looks so jarring and stupid I avoid it at all costs. Analogue control of movement far outstrips the value of precise aiming in titles like that. For first person shooters my answer would be different, but then Uncharted definitely isn't that. And these games are adventure games, not shooters, so the need for precise aiming is kept reasonably low. After buying the current gen consoles I definitely had a learning period getting used to using a controller for everything on them, but it didn't take that long for it to be perfectly fine (with the added benefit of the ability to use the controller for suitable games on my PC too). Some games are nearly unplayable with one or the other, but the vast majority are okay with both TBH.


You are kidding, right? It is not about the movement but how you aim. I wasted my entire damn clip without landing one shot, basically shooting all around the opponent instead of at him. Aiming with an analog stick is impossible for me. I'm only talking about shooters; I know you can play Tomb Raider or Uncharted, I played those two games myself and it was OK. I understand that others can play, but I'm talking about myself. Get a guy to play CS:GO on a PlayStation (if that's possible) with a pad and no aim assist and you would know what I'm talking about.


----------



## Valantar (Mar 19, 2020)

I just finished watching the presentation, and it was ... weird. I mean, let's pass by Cerny's charmingly uncharismatic personality (though those oddly-timed smiles of his were rather unsettling); the setting itself looked like a parody TV skit (cheap backdrop and four silhouetted "audience members") and the presentation was bone dry. Interesting, but definitely not the kind of stuff that drums up user interest and gamer excitement. The 3D audio stuff sounds exciting, I'm cautiously optimistic about the effects of the SSD, but _so much_ of that GPU presentation came off as saying _"please please please please don't buy an Xbox just because it's more powerful, our stuff is still good!"_ Nothing wrong, nothing really misleading, but a lot of conspicuously angled, defensively presented arguments that, put together, shouted "please ignore our performance deficit!"

I still think this will be a good console, and as last time around I'll be getting both, but the XSX will likely be where I play most console games unless the 3D audio stuff Sony is doing is radically better than MS' competing initiative (it also has dedicated 3D audio hardware, after all).



ratirt said:


> You are kidding right? It is not about the movement but how to aim. I wasted my entire damn clip without landing one shot. Basically shoot the opponent all around instead him. Aiming for me is impossible with an analog pad. I'm only talking about shooters. I know you can play Tomb Raider or uncharted. Played these two games myself and it was OK. I understand that other can play but I'm talking about myself. Get a guy to play CS:GO on a whatever play station ( if possible) playing with a pad with no aim aids and you would know what I'm talking about.


I wasn't talking about combat movement, but general in-game movement - in third-person titles specifically. And the example brought up by the post you responded to was specifically Uncharted, after all. Also, is this when I'm supposed to say "git gud"? Or does that just apply to hardcore PC games? Practice does make perfect, after all, and as I said, after an initial acclimation period I got perfectly adequately used to aiming with a gamepad in the games that required it. But then again, as I said, I don't play competitive shooters, and I definitely don't play them on consoles.


----------



## ratirt (Mar 19, 2020)

Valantar said:


> I just finished watching the presentation, and it was ... weird. I mean, let's pass by Cerny's charmingly uncharismatic personality (though those oddly-timed smiles of his were rather unsettling); the setting itself looked like a parody TV skit (cheap backdrop and four silhouetted "audience members") and the presentation was bone dry. Interesting, but definitely not the kind of stuff that drums up user interest and gamer excitement. The 3D audio stuff sounds exciting, I'm cautiously optimistic about the effects of the SSD, but _so much_ of that GPU presentation came off as saying _"please please please please _don't buy an Xbox just because it's more powerful, our stuff is still good!" Nothing wrong, nothing really misleading, but a lot of conspicuously angled and bold-faced defensively presented arguments that when put together shouted "please ignore our performance deficit!"
> 
> I still think this will be a good console, and as last time around I'll be getting both, but the XSX will likely be where I play most console games unless the 3D audio stuff Sony is doing is radically better than MS' competing initiative (it also has dedicated 3D audio hardware, after all).
> 
> ...


That's the point. I was talking about first person shooter combat aiming and yet you still had to disagree even though you are not talking about the same thing. git gud? No idea what that means.


----------



## Rahnak (Mar 19, 2020)

Valantar said:


> I just finished watching the presentation, and it was ... weird. I mean, let's pass by Cerny's charmingly uncharismatic personality (though those oddly-timed smiles of his were rather unsettling); the setting itself looked like a parody TV skit (cheap backdrop and four silhouetted "audience members") and the presentation was bone dry. Interesting, but definitely not the kind of stuff that drums up user interest and gamer excitement. The 3D audio stuff sounds exciting, I'm cautiously optimistic about the effects of the SSD, but _so much_ of that GPU presentation came off as saying _"please please please please don't buy an Xbox just because it's more powerful, our stuff is still good!"_ Nothing wrong, nothing really misleading, but a lot of conspicuously angled and bold-faced defensively presented arguments that when put together shouted "please ignore our performance deficit!"


I think the presentation needed some real-world examples showcasing the new tech and why we should care and be excited about it. Like MS did. I think their event was more successful: they showed their SSD tech and then showed a few examples of what advantages it would bring to the players.



Valantar said:


> I still think this will be a good console, and as last time around I'll be getting both, but the XSX will likely be where I play most console games unless the 3D audio stuff Sony is doing is radically better than MS' competing initiative (it also has dedicated 3D audio hardware, after all).


Why not PC for multi-platforms?


----------



## ppn (Mar 19, 2020)

Loading times are CPU limited most of the time, so you only get half the loading times. Everything else is meaningless; they should have gone 384-bit, 3840 cores and been set for the next 7 years of the platform's life. Now they are set for 2 years ahead, and then you buy the upgraded version on 5 nm again, which would still be bottlenecked by the 256-bit bus; only the core count may change. XSX is another tragedy of its own: 6 GB system memory. Even if the game engine loads everything into GPU memory and doesn't keep a copy of what would be the system-memory portion of it, I still don't believe it. Nothing guarantees it will work and be optimised that way. Show me how it works. 7 years of torture.


----------



## Vayra86 (Mar 19, 2020)

R0H1T said:


> Well that should settle the debate about the upcoming "big Navi" cause it's gonna be a beast w/better IPC, clocks, more CU, hardware RT & who knows 3d audio?



Yeah, holy shit. The clocks that is. The rest will have to be seen.


----------



## ARF (Mar 19, 2020)

Valantar said:


> I just finished watching the presentation, and it was ... weird. I mean, let's pass by Cerny's charmingly uncharismatic personality (though those oddly-timed smiles of his were rather unsettling); the setting itself looked like a parody TV skit (cheap backdrop and four silhouetted "audience members") and the presentation was bone dry. Interesting, but definitely not the kind of stuff that drums up user interest and gamer excitement. The 3D audio stuff sounds exciting, I'm cautiously optimistic about the effects of the SSD, but _so much_ of that GPU presentation came off as saying _"please please please please don't buy an Xbox just because it's more powerful, our stuff is still good!"_ Nothing wrong, nothing really misleading, but a lot of conspicuously angled and bold-faced defensively presented arguments that when put together shouted "please ignore our performance deficit!"
> 
> I still think this will be a good console, and as last time around I'll be getting both, but the XSX will likely be where I play most console games unless the 3D audio stuff Sony is doing is radically better than MS' competing initiative (it also has dedicated 3D audio hardware, after all).



If they are not excited, we should be even less so. And to be honest, I do understand them. They are stuck with hardware that was probably never their first choice to begin with; there is absolutely nothing ground-breaking in AMD's x86-64 implementation, just a fix for the mediocre Jaguar derivative that was even worse.



ppn said:


> Loading times are CPU limited most of the time, so you only get half the loading times. Everything else is meaningless; they should have gone 384-bit, 3840 cores and been set for the next 7 years of the platform's life. Now they are set for 2 years ahead, and then you buy the upgraded version on 5 nm again, which would still be bottlenecked by the 256-bit bus; only the core count may change. XSX is another tragedy of its own: 6 GB system memory. Even if the game engine loads everything into GPU memory and doesn't keep a copy of what would be the system-memory portion of it, I still don't believe it. Nothing guarantees it will work and be optimised that way. Show me how it works. 7 years of torture.



Yup, at least it's somewhat better than Jaguar.
All will depend on the content available at launch. They have to push the hardware to its limits and offer the most exciting visual experience ever.
If not... 7 years of disappointing sales will follow.


----------



## Parn (Mar 19, 2020)

Haven't touched a console since PS2 time. But my boy has been asking for a console for quite some time now. So if we're going to get him one for next Christmas, I'd go for PS5 simply because I love some of the Sony exclusives and would like to take advantage of the console when he isn't playing.


----------



## dirtyferret (Mar 19, 2020)

Hey everyone, I'm building a Ryzen 3700X & Nvidia RTX 2080 in an ITX case so I can max out these console games.  Do you think my SFX 200 W power supply is good enough or should I upgrade to a 225 W version?  I mean it's not even 80 Plus but I don't think that CPU and GPU will pull more than 175 W.  I also plan to use passive heatsinks on both the CPU and GPU but it's all good as my ITX case comes with one case fan.


----------



## ppn (Mar 19, 2020)

dirtyferret said:


> Hey everyone, I'm building a Ryzen 3700X & Nvidia RTX 2080 in an ITX case so I can max out these console games.  Do you think my SFX 200 W power supply is good enough or should I upgrade to a 225 W version?  I mean it's not even 80 Plus but I don't think that CPU and GPU will pull more than 175 W.  I also plan to use passive heatsinks on both the CPU and GPU but it's all good as my ITX case comes with one case fan.



Realistically, by the time the PS5 is out, you have to compare it to a Ryzen 4700 @ 3.5 GHz and the RDNA2 successor of the RX 5700 (2304 cores @ 2.2 GHz), undervolted as much as possible. And then overprovision the PSU for 0 dB operation. Not impossible, considering the prices of the 3700X and 5700X will be much lower by then.


----------



## fynxer (Mar 19, 2020)

*XBOX FTW*: it can hold 12 TFLOPS continuously while the PS5 boosts up to 10.3, which means most of the time you will get well under 10 TFLOPS from the PS5

*XBOX will have approx. 20-25% faster GFX*

Also, XBOX can hold 3.8 GHz continuously while the PS5 boosts up to 3.5 GHz, which means you get maybe 3.2 GHz on average from the PS5

*XBOX will have approx. 15-20% faster CPU*

Keep in mind what *HIDDEN POWER XBOX holds*: if they decide to *unlock boost*, then you would *see up to a total of 40% faster GFX and 40% faster CPU than PS5*

The PS5 saved money on silicon and the cooling solution; sure, maybe it will be a little cheaper, but who cares about a 50 or 100 bucks difference when you're going to have that console for years to come.

*This is EZZZZZ, I am going with XBOX*


----------



## Valantar (Mar 19, 2020)

ratirt said:


> That's the point. I was talking about first person shooter combat aiming and yet you still had to disagree even though you are not talking about the same thing. git gud? No idea what that means.


No, you were agreeing with someone talking about a third-person action-adventure game, which it should be rather obvious that the bulk of my response was directed at. If you actually read what I wrote, it ought to also be rather obvious that I wasn't so much contradicting you as saying that a wholesale dismissal of controllers based on


ratirt said:


> play[ing] one game a shooter [...] for 5 min


is rather silly. That's rather brash, no? My entire point was that the kbm vs. controller debate depends on both the game in question as well as the skills, experiences and preferences of the player. If that was your entryway into controller-based gaming you essentially gave yourself a worst-case scenario: an FPS game for someone with little/no controller aiming experience. On the other hand there are quite a few professional FPS gamers on consoles, and they don't generally use kbm. So again, it comes down to a lot of personal preference even if there are absolutely use cases where one is intrinsically superior to the other - FPS and other games requiring quick, pin-point precise movement for mice, and racing games, third-person games, and anything else requiring nuanced controls but not immediacy for controllers. 

As for "git gud", that's what PCMR douche-bros tell people when they complain something is too difficult or otherwise ought to be tweaked to be more accessible, i.e. telling them that the fault lies not in the game but their lack of skill - it was meant as a joke based on your brash dismissal of controllers seeming to stem from rather little experience with them, and in a worst-case scenario at that.



Rahnak said:


> I think the presentation needed some real-world examples showcasing the new tech and why we should care and be excited about it. Like MS did. I think their event was more successful: they showed their SSD tech and then showed a few examples of what advantages it would bring to the players.
> 
> 
> Why not PC for multi-platforms?


I obviously have that too, but I work researching games, so having access to all relevant current platforms is crucial to being able to do my work properly (including understanding the ecosystems these games exist within). 

I agree that real-world examples would be helpful, though a lot of the stuff they're talking about is hard to demonstrate - the 3D audio won't work as a demo since it needs the dedicated hardware and tuning to work, and the SSD... well, they showed that Spider-Man demo a long time back, which I guess demonstrates it, but also feels kind of unnecessary. I doubt there'll be much perceptible difference between "instantaneous" fast travel/load times/etc. on a PS5 and the couple of seconds the same would take on an XSX with half the SSD speed.


ARF said:


> If they are not excited, we should be even less. And to be honest, I do understand them. They are stuck with hardware which probably has never been their first choice to begin with, there is absolutely nothing ground-breaking in AMD's x86-64 implementation, just a fix to the mediocre Jaguar-derivative that was even worse.
> 
> Yup, at least somewhat better than Jaguar.
> All will depend on the available contents upon launch. They have to push the hardware to its limits and offer the most exciting visual experience ever.
> If not.... 7 years of not expected sales will follow.


Uh, what? Zen has no relation to Jaguar beyond being an x86 CPU. As for "nothing ground-breaking in AMD's x86-64 implementation", you know AMD literally invented x86-64, right? There's nothing ground-breaking in any other implementations of it either, but it works pretty damn well. As for the hardware not being their first choice - why not? Sure, Nvidia currently has faster GPUs, but they don't make CPUs (unless you want low-power, low-performance ARM), and making a console with two different hardware vendors would be monumentally more complicated at today's development complexity levels in terms of fine-tuning the OS, firmware, APIs and ultimately software.

As for being excited, the exciting thing here is the baseline spec for multi-platform games (which includes >99% of AAA games) moving from 8 dog-slow single-thread ~2GHz CPU cores to 8 fast dual-threaded CPU cores at 170-180% frequency, with GPU performance jumping 2-5x (depending on whether you measure from base consoles or Pro/X) allowing for massively improved visual fidelity for the next decade of game development. Adding RT and other modern, advanced rendering techniques will also allow for radically more advanced games in the coming years. You're of course right that games will make these platforms, but I don't see that as an issue - current consoles are already struggling heavily with current titles (try playing Jedi: Fallen Order on a base PS4 or XB1 - it's terrible), so giving developers more to play with will inevitably result in great-looking and -feeling titles. There's a lot that can be done with these consoles that can't possibly be done with current ones, so the future of cross-platform game development is suddenly looking dramatically better than it has done for the past year or more.


----------



## lexluthermiester (Mar 19, 2020)

Valantar said:


> Care to expand on that? Not that I don't think it will be good, but it will definitely need to be cheaper (even if Sony has a massive mindshare advantage).


I'm not considering price as that is unknown ATM. Otherwise, my statement is self-explanatory.


----------



## ratirt (Mar 19, 2020)

Valantar said:


> No, you were agreeing with someone talking about a third-person action-adventure game, which it should be rather obvious that the bulk of my response was directed at. If you actually read what I wrote it ought to also be rather obvious that I wasn't so much contradicting you than saying a wholesale dismissal of controllers based on


About aiming, which is horrible with a PS controller for me. So yes, I know what I agreed with, but what I said still stands: aiming with an analog PS controller is horrible even if it is Uncharted. Switching on aiming aids is no fun either.


Valantar said:


> is rather silly. That's rather brash, no? My entire point was that the kbm vs. controller debate depends on both the game in question as well as the skills, experiences and preferences of the player. If that was your entryway into controller-based gaming you essentially gave yourself a worst-case scenario: an FPS game for someone with little/no controller aiming experience. On the other hand there are quite a few professional FPS gamers on consoles, and they don't generally use kbm. So again, it comes down to a lot of personal preference even if there are absolutely use cases where one is intrinsically superior to the other - FPS and other games requiring quick, pin-point precise movement for mice, and racing games, third-person games, and anything else requiring nuanced controls but not immediacy for controllers.


I'm glad you have your point but you don't get mine. I couldn't play the game because aiming with a PS controller was not in any way fun nor accurate for me. Glad you have your points, but I'd rather talk about mine, if you don't mind? That's why I joined the conversation. There are a lot of games on a PC where you don't use an analog pad, so what's your point?
Do I just need to use analog because that's the way it is? Well, as a matter of fact, I can choose what I want, and I will stay with a PC for FPS games. I didn't bash anyone; I said I prefer KBM instead of a controller because aiming for me is horrible, and you can't compare analog to KBM because accuracy and response are way better on KBM. Now you talk about preference, but when I mentioned KBM for me, you got somehow offended?


Valantar said:


> As for "git gud", that's what PCMR douche-bros tell people when they complain something is too difficult or otherwise ought to be tweaked to be more accessible, i.e. telling them that the fault lies not in the game but their lack of skill - it was meant as a joke based on your brash dismissal of controllers seeming to stem from rather little experience with them, and in a worst-case scenario at that.


I didn't know what "git gud" means or who uses it and why, so thanks for the clarification. Anyway, it is not fair to call somebody a douche-bro when that person is not agreeing with you.
It's not about difficulty but a matter of preference. If you find the PS analog controller better for aiming, then fine. I don't. Mouse and keyboard please, since for me it's way more accurate.


----------



## dirtyferret (Mar 19, 2020)

fynxer said:


> *XBOX FTW*: it can hold 12 TFLOPS continuously while the PS5 boosts up to 10.3, which means most of the time you will get well under 10 TFLOPS from the PS5
> 
> *XBOX will have approx. 20-25% faster GFX*
> 
> ...


it appears something is wrong with your keyboard; it randomly makes words bold


----------



## notb (Mar 19, 2020)

ratirt said:


> About aiming, which is horrible with a PS controller for me. So yes, I know what I agreed with, but what I said still stands: aiming with an analog PS controller is horrible even if it is Uncharted. Switching on aiming aids is no fun either.


That depends on what kind of fun you're after. Most console gamers are perfectly fine with aids - casual gaming is about the story/experience/relaxing rather than agility training and stress. 

In fact, CS:GO AFAIR wasn't a big hit on PS3 and wasn't ported to PS4. Yet it remains one of the most popular games on PC.


----------



## Vya Domus (Mar 19, 2020)

ppn said:


> Loading times are CPU limited most of the time.



Loading times are almost never CPU limited; I don't know where you got this idea from. When they are CPU limited, it's because you're decompressing data, which you had to do in order to save space.
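
As a toy illustration of that trade-off (plain zlib here, not the Kraken codec the PS5 actually offloads to fixed-function hardware): compressed assets mean fewer bytes read off the drive, but the data is only usable after a decompression pass, which on PCs and current consoles falls on the CPU.

```python
import zlib

# Toy stand-in for a game asset: repetitive data compresses well,
# as textures and geometry often do.
asset = b"mesh-vertex-data-" * 4096

compressed = zlib.compress(asset, level=9)

# Smaller on disk means fewer bytes to read from the drive...
ratio = len(asset) / len(compressed)

# ...but the bytes are only usable after a CPU-side decompression pass.
restored = zlib.decompress(compressed)

assert restored == asset
print(f"original: {len(asset)} B, compressed: {len(compressed)} B, ratio ~{ratio:.0f}x")
```

With a slow HDD the read dominates and the CPU idles; put a fast enough SSD behind it and decompression becomes the bottleneck, which is exactly why both new consoles moved it into dedicated hardware.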


----------



## ironcerealbox (Mar 19, 2020)

Rahnak said:


> Cerny gave a simple explanation on why they went with fewer CUs at higher frequency as opposed to more CUs at a lower frequency. Quoting Eurogamer, quoting Cerny:
> 
> 
> 
> Price might've been a factor as well, dunno.



Yeah, I watched that part of the presentation twice to make sure I heard him correctly. I thought he was very knowledgeable and demonstrated Sony's understanding of what they have for hardware and how far they can optimize software for the hardware.

It, to me, feels a lot like the Japanese efficiency paradigm in full effect here (cars, maximization of limited capability in other sectors, etc.), whereas Microsoft seems to be using a brute-force method (Americana - "there's no replacement for displacement"). Both methods get the job done. However, I think Sony learned more from their one generation of traditional PC hardware: they seem to have gotten more out of their short experience with it in the PS4 than Microsoft has out of more than one generation of using it.


----------



## Valantar (Mar 19, 2020)

ratirt said:


> About aiming which is horrible with a PS controller for me. So yes I know what i have agreed but still what I said stands. Aiming with an analog PS controller is horrible even if it is Uncharted. Switching on aiming aids is no fun either.
> 
> I'm glad you have your point but you don't get mine. I couldn't play the game because aiming for me with a PS controller was not any way fun nor accurate. Glad you have your points but I'd rather talk about mine if you don't mind? That's why I have joined the conversation. There are a lot of games on a PC and you don't use analog pad so what's your point?
> I just need to use analog because that's the way it is? Well, as a matter of fact I can choose what I want and will stay with a PC with FPS games. I didn't bash anyone I said i prefer KBM instead controller because aiming for me is horrible and you can't compare analog to KBM because accuracy and response is way better on KBM. Now you talk about preference but when I mentioned KBM for me you got somehow offended?
> ...


Dude, _read what I'm writing_, FFS. I very explicitly did not call _you_ a douche-bro; I said that "git gud" is something PCMR douche-bros say to other people when they (the others, that is) complain. The joke (which obviously fell completely flat) was that if _I_ were a PCMR douche-bro, the response rather than talking to you would be to say "git gud" and nothing else. Your reading comprehension needs serious work if you thought that was directed at you.

As for the rest, I never said you needed to use a controller at all, I just said that your outright dismissal of controllers based on what you yourself call one five-minute experience with one is a poor decision; I wouldn't try to convince you to use a controller in a competitive FPS, but there are plenty of other games where it's a far superior solution to kbm.

So let's discuss your point:


ratirt said:


> Agree. I've played one game a shooter. I don't remember what it's called but I played for 5 min and without a mouse and a keyboard it's unplayable for me.


Okay, you had a bad experience. Not much time at all to try to get used to something (I bet it took you a lot more than five minutes to figure out mouse aim the first time!), but sure.


ratirt said:


> If I were to get PS5 or XBOX, i would still need a pc to play First person shooter games. If you could connect a mouse and a keyboard to either PS5 or XBOX I would have been OK.


Again: controllers are not ideal for this type of game. Not by a long shot. But that apparently doesn't get in the way of the rather large group of professional console gamers in games like CoD, so apparently they aren't _horrible_ either - it just depends on what you're used to. I would never accept anything other than kbm being _better_, but dismissing it outright after five minutes is premature at best. I also entirely agree with having a PC for mouse-dominant games (I could never play most RPGs on a console, ugh), but as I also clarified above: my main response was directed towards the person you said you agreed with, who was saying aiming in Uncharted sucked and therefore controllers sucked, to which I said movement in the game (a third-person action-adventure) would suck _far more _if you were using a keyboard. I guess it was dumb of me to group the two of you together like that seeing how you were saying different things, but that was also part of the point: that you were "agreeing" with each other while talking about entirely different things.


----------



## lexluthermiester (Mar 19, 2020)

Valantar said:


> Dude, _read what I'm writing _FFS. I did very explicitly not call _you _a douche-bro, I said that "git gud" is something PCMR douche-bros say to other people when they complain.


Let's lose the name-calling altogether and we don't have to worry who it's aimed at.


----------



## Valantar (Mar 19, 2020)

lexluthermiester said:


> I'm not considering price as that is unknown ATM. Otherwise, my statement is self-explanatory.


Based on ... lower GPU specs, lower CPU specs, a faster SSD, and slightly fancier 3D audio than the competition? I'm not discounting the value of the latter (audio is extremely important for immersion and worldbuilding), but the faster SSD has questionable value at best unless MS has borked their implementation completely. Eliminating load times is cool, but whether fast travel is "immediate" (I take that to mean some sort of fade out-fade in animation with no pause) or takes a few seconds is not really important IMO. A few seconds is still a world away from current loading times.



lexluthermiester said:


> Let's lose the name-calling altogether and we don't have to worry who it's aimed at.


It wasn't aimed at anyone at all; though I guess if you need it to be it was aimed at myself as a joke. I guess I should report myself?


----------



## notb (Mar 19, 2020)

ppn said:


> Loading times are CPU limited most of the time.


Maybe on (some) PCs.

On consoles it's all about the drives. The HDDs in the PS4 and Xbox One are pretty slow in general. And it doesn't help that games got twice as big between 2015 and today.

I have the One S. Loading times for most games are atrocious.
Forza Motorsport 6 is the worst. It takes a few minutes to launch the game and a few more to start a race.
So if I just want to play a single race, I spend 10 minutes playing and 5-6 waiting.

So yeah, for me loading times are the biggest issue with consoles. I'm fine with current quality, and even with 1080p. RTRT is the only other change I care about.

Does that mean all this "awesome custom SSD" nonsense is needed? No. A "normal SSD" in the Xbox will be perfectly fine. But Sony probably didn't have other advantages to talk so much about...


----------



## lexluthermiester (Mar 19, 2020)

Valantar said:


> Based on ... lower GPU specs, lower CPU specs, a faster SSD, and slightly fancier 3D audio than the competition?


Maybe I read the Xbox specs wrong then. The PS5 specs seem higher... Maybe I should go reread.

EDIT: Xbox seems spec'd higher. My bad. Sometimes speed-reading has its downsides.



Valantar said:


> It wasn't aimed at anyone at all; though I guess if you need it to be it was aimed at myself as a joke. I guess I should report myself?


It's all good.


----------



## T4C Fantasy (Mar 19, 2020)

ratirt said:


> Agree. I've played one game, a shooter. I don't remember what it's called, but I played for 5 min and without a mouse and a keyboard it's unplayable for me. If I were to get a PS5 or XBOX, I would still need a PC to play first-person shooter games. If you could connect a mouse and a keyboard to either the PS5 or XBOX, I would have been OK.











How to Connect a Keyboard and Mouse to a PS4 - Make Tech Easier

Sony's PS4 can be used with a keyboard and mouse without any adapter. Here is how you can connect a keyboard and mouse to PS4.
				




There's no reason to believe you can't on the new consoles


----------



## lexluthermiester (Mar 19, 2020)

T4C Fantasy said:


> How to Connect a Keyboard and Mouse to a PS4 - Make Tech Easier
> 
> 
> Sony's PS4 can be used with a keyboard and mouse without any adapter. Here is how you can connect a keyboard and mouse to PS4.
> ...


For me, it's going to be a deal breaker if not. There's no reason for a lack of KB&M support.


----------



## T4C Fantasy (Mar 19, 2020)

lexluthermiester said:


> For me, It's going to be a deal breaker if not. There's no reason for a lack of KB&M support.


For FF14 I use a controller on my PC. I have a G910 keyboard and a G900 mouse, but I still use a controller for that game. Everyone plays things differently; at least in this respect a console and a PC can use the same stuff.


----------



## lexluthermiester (Mar 19, 2020)

T4C Fantasy said:


> For FF14 I use a controller on my PC


And I agree with you for those kinds of games. FPS, RTS, and other types of games that need KB&M for optimal control are what I'm talking about. For ARPGs, RPGs, adventure games, platformers, and many other types, a controller is preferred!


----------



## MxPhenom 216 (Mar 19, 2020)

ARF said:


> GPU and CPU is one chip, it's an APU. No RAM, only the GDDR6 chips and large SSD.
> 
> But I agree, Sony needs healthy profit margins because it's not sustainable to sell at a loss all the time. They have other struggling divisions, too.
> 
> PS4 was much inferior technologically, so yes, it was quite normal to be cheaper.



The PS4, until the Xbox One X came out, was significantly stronger than the original Xbox One, which was bandwidth starved with that DDR3 (NOT GDDR3) memory and a super weak GPU.

Overall PS4 sales since the launch of both consoles are still higher.

Both Microsoft and Sony usually sell their systems at a loss initially. They make it up with accessories, game licensing, subscriptions to services.



Jism said:


> I wonder why Sony chose an 825 GB SSD while MS has a 1 TB model. Is it perhaps due to overprovisioning, with Sony wanting the SSD to have a longer life than Microsoft does?
> 
> Many of the tech details are just AMD IP. A Zen+ chip with a RDNA2 feature set GPU. Nothing special.
> 
> But it's good for AMD in this as well; its bound to sell millions of consoles with their hardware inside of it. The whole gaming ecosystem will be based upon AMD hardware.



Xbox users won't have full access to that 1 TB though.



fynxer said:


> *XBOX FTW*: it can hold 12 TFLOPS continuously while the PS5 boosts up to 10.3, which means most of the time you will get well under 10 TFLOPS from the PS5
> 
> *XBOX will have approx. 20-25% faster GFX*
> 
> ...



LOL, do you know how TFLOPS are calculated? Do you also know they only take into consideration one operation of a compute chip? They don't tell the whole story. It's also primarily marketing for consoles, since the average console player doesn't know what the hell it is. It's rarely used as an actual way of marketing a chip when it comes to PCs, etc.
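
For reference, the peak-FP32 figure really is just arithmetic: shader count × 2 ops per clock (one fused multiply-add) × clock speed. A quick sketch using the publicly quoted shader counts and clocks for both consoles:

```python
# Peak FP32 throughput = shaders × 2 ops/clock (one FMA) × clock (GHz), in TFLOPS.
def tflops(shaders: int, ghz: float) -> float:
    return shaders * 2 * ghz / 1000.0

ps5 = tflops(2304, 2.23)   # 36 CUs × 64 shaders, up to 2.23 GHz (variable boost)
xsx = tflops(3328, 1.825)  # 52 CUs × 64 shaders, fixed 1.825 GHz

print(f"PS5: {ps5:.2f} TFLOPS, XSX: {xsx:.2f} TFLOPS")
print(f"XSX peak advantage: {(xsx / ps5 - 1) * 100:.0f}%")
```

That works out to roughly 10.28 vs. 12.15 TFLOPS, an ~18% gap at peak, before anything real-world (sustained clocks, memory bandwidth, API overhead) enters the picture.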

In overall system performance between the two systems, the Xbox is really only about 10-15% more powerful. There's no "unlocking" boost on the Xbox; they are already pushing thermals at this point.

Also, making parts of your posts bold doesn't make you sound any smarter.


----------



## vega22 (Mar 19, 2020)

Reading the specs, it seems Sony is chasing low latency, probably for VR reasons.

Time will tell


----------



## Master Tom (Mar 19, 2020)

For comparison: my Radeon Vega 64 Liquid has 13.7 TFLOPS.
But Navi is more efficient at calculating game graphics and will be much faster.


----------



## ratirt (Mar 19, 2020)

T4C Fantasy said:


> How to Connect a Keyboard and Mouse to a PS4 - Make Tech Easier
> 
> 
> Sony's PS 4 can be used with a keyboard and mouse without any adpator. Here is how you can connect a keyboard and mouse to PS4.
> ...


I thought you could get banned from the server if you used KBM in PS FPS games when playing online.


Valantar said:


> Dude, _read what I'm writing _FFS. I did very explicitly not call _you _a douche-bro, I said that "git gud" is something PCMR douche-bros say to other people when they (the others, that is) complain. The joke (which obviously fell completely flat) was that if _I_ were a PCMR douche-bro the response rather than talking to you would be to say "git gud" and nothing else. Your reading comprehension needs serious work if you thought that was directed at you.
> 
> As for the rest, I never said you needed to use a controller at all, I just said that your outright dismissal of controllers at all based on what you yourself call one five-minute experience with one is a poor decision; I wouldn't try to convince you to use a controller in a competitive FPS, but there are plenty of other games where it's a far superior solution to kbm.
> 
> ...


That's the point: I don't care what you said. I shared MY preference with someone who has the same experience with shooting and aiming with a PS controller. Not everybody's, MINE. Yet you write essays about how others like it differently and how I'm inexperienced because I didn't try harder. Great. You can use whatever you want to play games, I don't care. Use a joystick if you prefer that; I'm not gonna judge. I'm sharing my preference and what I like.



notb said:


> That depends what kind of fun you're after. Most console gamers are perfectly fine with aids - casual gaming is about the story/experience/relax rather than agility training and stress.
> 
> In fact, CS:GO afair wasn't a big hit on PS3 and wasn't ported to PS4. Yet, it remains one of most popular games on PCs.


I understand that, but that was not the point. I just said I PREFER KBM rather than a PS controller for FPS, and yet our colleague @Valantar tells me how others like it differently. I don't care; I just shared what I like with someone who likes it just like I do. That's it, nothing more to add. BTW: I play on a PS3, which I have, and it is better to use a PS controller for a variety of games than KBM. But for FPS it's KBM only for me. If someone likes it differently, that is fine with me.


----------



## T4C Fantasy (Mar 19, 2020)

ratirt said:


> I thought you could get banned from the server if you use KBM in PS FPS games when you play online.
> 
> That's the point. I don't care what you said. I shared MY preference with someone who has the same experience with shooting and aiming with a PS controller. Not everybody's, MINE. Yet you write essays about how others like it differently and how I'm inexperienced because I didn't try harder. Great. You can use whatever you want to play games, I don't care. Use a joystick if you prefer that. I'm not going to judge. I'm just sharing my preference and what I like.



There is no ban on KB/M on consoles; I believe even the PS3/360 can use them too.


----------



## ratirt (Mar 19, 2020)

T4C Fantasy said:


> There is no ban on KB/M on consoles; I believe even the PS3/360 can use them too.


I know they can, I used one a few times. I heard you shouldn't be using it because you can get banned, since it is unfair to others playing with a controller. I might be wrong, it's just what I've heard. I don't play much on the PS3 nowadays anyway.


----------



## T4C Fantasy (Mar 19, 2020)

ratirt said:


> I know they can, I used one a few times. I heard you shouldn't be using it because you can get banned, since it is unfair to others playing with a controller. I might be wrong, it's just what I've heard. I don't play much on the PS3 nowadays anyway.


According to the Terms of Use (or at least what people read from it), it's not bannable. I think it's just fear mongering, people saying it's cheating and that you'll get banned; in practice I don't think anyone was banned for it unless they were actually cheating, not in an advantage way but in a hacking way.


----------



## MxPhenom 216 (Mar 19, 2020)

T4C Fantasy said:


> According to the Terms of Use (or at least what people read from it), it's not bannable. I think it's just fear mongering, people saying it's cheating and that you'll get banned; in practice I don't think anyone was banned for it unless they were actually cheating, not in an advantage way but in a hacking way.



It depends on the game, I'm pretty sure. For Overwatch on consoles, for example, I think you can be kicked or banned if you use mouse and keyboard.


----------



## TheLostSwede (Mar 19, 2020)

__ https://twitter.com/i/web/status/1240328728716214273


----------



## DeathReborn (Mar 20, 2020)

Master Tom said:


> For comparison reasons. My Radeon 64 Liquid has 13.66 TFLOPS.
> But Navi is more efficient at calculating game graphics and will be much faster.



Yup, to further illustrate the point: Nvidia historically has had lower TFLOPS but higher FPS. The GTX 1080 Ti, for example, has 11.34 TFLOPS, and the (comparable to Vega 64) GTX 1080 has just 8.873 TFLOPS (both theoretical).


----------



## ratirt (Mar 20, 2020)

MxPhenom 216 said:


> It depends on the game, I'm pretty sure. For Overwatch on consoles, for example, I think you can be kicked or banned if you use mouse and keyboard.


I thought that was the deal, but I wasn't sure. When I heard about it I simply stopped using KBM on the PS.


----------



## Assimilator (Mar 20, 2020)

DeathReborn said:


> Yup, to further illustrate the point, Nvidia historically has worse TFLOPS but more FPS, GTX 1080 Ti for example has 11.34 TFLOPS and the (comparable to Vega 64) GTX 1080 has just 8.873 TFLOPS (both theoretical).



It just goes to show, once again, that TFLOPS are a totally useless way of measuring anything. You might as well use clock speed, or the number of air vents...


----------



## R-T-B (Mar 20, 2020)

ARF said:


> This is funny
> 
> I meant "The secret sauce here is that Sony is using its own protocol instead of NVMe, *in supporting 6 data priority tiers versus 2 on NVMe*."



So it has better QoS tiers?

I mean, it's clear you're a Sony fanboy, but the thing isn't even out yet to properly compare to NVMe.  I wouldn't bet on it beating a proper MLC drive, personally.


----------



## ARF (Mar 20, 2020)

R-T-B said:


> I mean, it's clear you're a Sony fanboy





I have got nothing that is Sony, excuse me!
And I am not planning to get anything Sony soon!


----------



## R-T-B (Mar 20, 2020)

ARF said:


> I have got nothing that is Sony, excuse me!
> And I am not planning to get anything Sony soon!



You are in love with Cell and all the tech slides.  Ownership is not a requirement to being a fan.

Sorry, "fanboy" may have been a bit harsh.


----------



## ARF (Mar 20, 2020)

R-T-B said:


> You are in love with Cell and all the tech slides.  Ownership is not a requirement to being a fan.
> 
> Sorry, "fanboy" may have been a bit harsh.



Everyone praised the Cell in 2005-2006. All the tech media, newspapers wrote very positive articles about it.
I just respect the uber state-of-the-art technology that is Cell and its high potential.


----------



## R-T-B (Mar 20, 2020)

ARF said:


> Everyone praised the Cell in 2005-2006. All the tech media, newspapers wrote very positive articles about it.
> I just respect the uber state-of-the-art technology that is Cell and its high potential.



In 2005-2006 it was a great way to eke the most out of a limited silicon process.  Don't get me wrong, Cell was marvelous back then, if you could code for it.


----------



## Vya Domus (Mar 20, 2020)

R-T-B said:


> I mean, it's clear you're a Sony fanboy, but the thing isn't even out yet to properly compare to NVMe.  I wouldn't bet on it beating a proper MLC drive, personally.



If you're suggesting that it's not going to be the fastest thing in the world, then yes, you'd be right, but it will be pretty close, and for a ~$500 console that's pretty damn good. Sony's solution goes beyond just the actual storage; there's a lot of custom hardware in their chip that isn't present in normal PCs.


----------



## Valantar (Mar 20, 2020)

I don't think the QoS tiers of the Sony SSD have a very major effect, but the dedicated decompression hardware and DMA processor, on the other hand, sound like major performance enhancements. The less these data-juggling processes bog down the CPU, the faster the system will be overall - especially if/when said hardware is faster than using a CPU core for the same workload.
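To put rough numbers on why that offload matters: general-purpose decompression on a single CPU core typically delivers somewhere between a few hundred MB/s and a couple of GB/s, well short of a 5.5 GB/s link, so a software path would either bottleneck the SSD or eat CPU cores. A minimal sketch of that measurement, using Python's zlib as a stand-in for Kraken (which is proprietary):

```python
import time
import zlib

# Repetitive, text-like payload standing in for compressible game assets
# (~9.7 MB uncompressed).
data = (b"vertex normal texture index material " * 4096) * 64
compressed = zlib.compress(data, level=6)

start = time.perf_counter()
restored = zlib.decompress(compressed)
elapsed = time.perf_counter() - start

assert restored == data
# On a typical desktop core this lands in the hundreds of MB/s to low GB/s,
# far below the 5500 MB/s the PS5's raw SSD link is rated for.
print(f"software decompression: {len(data) / elapsed / 1e6:.0f} MB/s of output")
```

Dedicated decompression silicon exists precisely so this work never touches the CPU cores, and the DMA engine similarly moves the resulting data without CPU copies.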


----------



## MxPhenom 216 (Mar 20, 2020)

R-T-B said:


> You are in love with Cell and all the tech slides.  Ownership is not a requirement to being a fan.
> 
> Sorry, "fanboy" may have been a bit harsh.



So? Cell is not a Sony technology....


----------



## Vya Domus (Mar 20, 2020)

MxPhenom 216 said:


> So? Cell is not a Sony technology....



It is a Sony technology, as much as it is an IBM one and a Toshiba one.


----------



## Super XP (Mar 21, 2020)

I am sure SMT will be standard for the PS5, perhaps with an option to turn it off or keep it on, just like the Xbox Series X.
Another gaming console looking at a price of about $499 max, just like the Xbox Series X. Unless of course they plan on selling none by going with a higher price tag.



fynxer said:


> *XBOX FTW*, can hold 12 TFLOPS continuously while PS5 boosts up to 10.3, which means most of the time you will get well under 10 TFLOPS from PS5
> 
> *XBOX will have aprox 20-25% faster GFX*
> 
> ...


Xbox Series X looks like the better performer, and I agree. But with consoles, developers try to utilize the entire system as a whole as efficiently as possible. Every single component will be efficiently utilized in both of these gaming consoles.



Vya Domus said:


> If you're suggesting that it's not going to be the fastest thing in the world, then yes, you'd be right, but it will be pretty close, and for a ~$500 console that's pretty damn good. Sony's solution goes beyond just the actual storage, *there's a lot of custom hardware in their chip that isn't present in normal PCs*.


And that may be the differentiating factor between the PS5 & the Xbox Series X, excluding game exclusives and such.


----------



## agentnathan009 (Mar 21, 2020)

ppn said:


> I'm waiting for the Pro version of it. Having this main IO chip on 12nm, and a separate GPU and CPU, means that whenever a 5nm shrink is available Sony will integrate it very quickly. XSX can't do that, it is stuck on 7nm.



According to TSMC, the 7nm node can be shrunk to 6nm without changing anything and gain 10-15% performance for the same power. Stop being a PS fanboy and go read about technology that you fail to understand...









TSMC: Most 7nm Clients Will Transition to 6nm (www.anandtech.com)


----------



## Super XP (Mar 21, 2020)

ppn said:


> I'm waiting for the Pro version of it. Having this main IO chip on 12nm, and a separate GPU and CPU, means that whenever a 5nm shrink is available Sony will integrate it very quickly. XSX can't do that, it is stuck on 7nm.


Next-generation gaming consoles are coming Christmas 2020. If Sony chooses to launch a Pro version on an unannounced 5nm node, you are looking at a 2023-2024 release. 7nm is more than enough; buy either and have fun.


----------



## Valantar (Mar 21, 2020)

ppn said:


> I'm waiting for the Pro version of it. Having this main IO chip on 12nm, and a separate GPU and CPU, means that whenever a 5nm shrink is available Sony will integrate it very quickly. XSX can't do that, it is stuck on 7nm.


There are no separate chips; both consoles are powered by monolithic APUs. Don't let block diagrams showing different components within a monolithic chip fool you into thinking they are separate pieces of silicon. Unless Sony plans on making a bunch of different versions (as in tens of versions or more), going chiplet-based rather than monolithic is _much_ more expensive. And, well, this is a console, so there will be a single version, with a possible Pro down the line, but if so it will be a new monolithic design. The reason chiplets are cheaper for AMD in the PC space is that they can use them across a wide range of designs - from desktop to HEDT to server - saving the cost of designing many different pieces of silicon. This is not the case for a console.
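The amortization argument can be made concrete with a toy cost model (all figures below are illustrative assumptions, not real AMD or Sony numbers):

```python
# Toy model: chiplets trade higher per-unit packaging cost for the ability
# to amortize one die's design cost (NRE) across many SKUs.
NRE = 300e6        # assumed design + mask cost for one die, in dollars
EXTRA_PKG = 15.0   # assumed extra per-unit cost of multi-die packaging

def monolithic_cost(units: float) -> float:
    """Per-unit share of NRE for a bespoke monolithic die (one SKU)."""
    return NRE / units

def chiplet_cost(units_per_sku: float, skus_sharing_die: int) -> float:
    """Per-unit NRE share when one die serves many SKUs, plus packaging."""
    return NRE / (units_per_sku * skus_sharing_die) + EXTRA_PKG

# Console: a single SKU with a huge production run -> monolithic wins.
print(monolithic_cost(50e6))   # $6 of design cost per unit
print(chiplet_cost(50e6, 1))   # $21 per unit: packaging overhead dominates

# PC: smaller runs per SKU, but one compute die reused across ten designs.
print(monolithic_cost(5e6))    # $60 per unit for each bespoke design
print(chiplet_cost(5e6, 10))   # $21 per unit: reuse pays for the packaging
```

The crossover is exactly the point made above: with one SKU and tens of millions of units, the bespoke monolithic die is cheap per unit, while chiplet packaging is pure overhead.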


----------



## rvalencia (Mar 22, 2020)

Vya Domus said:


> It is a Sony technology, as much as it is an IBM one and a Toshiba one.


Sony couldn't continue with CELL evolution without R&D resources from IBM.


----------



## Vayra86 (Mar 22, 2020)

ARF said:


> Everyone praised the Cell in 2005-2006. All the tech media, newspapers wrote very positive articles about it.
> I just respect the uber state-of-the-art technology that is Cell and its high potential.



That's all good, but it's a 15-year-old chip by now.


----------



## MicroUnC (Mar 24, 2020)

MxPhenom 216 said:


> So? Cell is not a Sony technology....



by IBM


----------



## rvalencia (Mar 25, 2020)

MxPhenom 216 said:


> PS4 until the Xbox One X came out was significantly stronger than the original Xbox One. Original Xbox One was bandwidth starved with that DDR3 (NOT GDDR3) memory and a super weak GPU.
> 
> Overall PS4 sales since launch of both consoles is still higher.
> 
> ...


RDNA CU includes ALU, TMU, TFU, SRAM and RT cores.  XSX's GPU TFLOPS increase and additional PC CPU node were matched by memory bandwidth increase.

Sony loaded PS5's 448 GB/s bandwidth (the same as 5700/5700 XT memory bandwidth) with additional PC CPU node and slightly higher GPU TFLOPS.

Try again.



DeathReborn said:


> Yup, to further illustrate the point, Nvidia historically has worse TFLOPS but more FPS, GTX 1080 Ti for example has 11.34 TFLOPS and the (comparable to Vega 64) GTX 1080 has just 8.873 TFLOPS (both theoretical).











NVIDIA GeForce GTX 1080 Ti Founders Edition 11 GB Review (www.techpowerup.com)

Today, NVIDIA released the GTX 1080 Ti, which is the company's fastest graphics card ever. It conclusively beats the much more expensive GTX Titan X in our testing. While the NVIDIA reference cooler looks amazing, its cooling potential could be improved, as our review shows.
				



GTX 1080 Ti FE has a 1,777 MHz average clock speed, which yields about 12.74 TFLOPS.
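That figure falls straight out of the standard peak-FP32 formula: each shader retires one FMA (two floating-point ops) per clock, so TFLOPS = 2 × shaders × clock. A quick check against the numbers in this thread:

```python
def tflops(shaders: int, clock_mhz: float) -> float:
    """Theoretical peak FP32 TFLOPS: 2 ops (one FMA) per shader per clock."""
    return 2 * shaders * clock_mhz * 1e6 / 1e12

# Published shader counts; 1582 MHz is the 1080 Ti's rated boost clock.
print(f"{tflops(3584, 1582):.2f}")  # 11.34 (GTX 1080 Ti at rated boost)
print(f"{tflops(3584, 1777):.2f}")  # 12.74 (GTX 1080 Ti at observed average)
print(f"{tflops(2560, 1733):.2f}")  # 8.87  (GTX 1080 at rated boost)
```

Which is why "theoretical" TFLOPS spreads are so sensitive to which clock you plug in: spec-sheet boost and real sustained clocks can differ by over 10%.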



R-T-B said:


> In 2005-2006 it was a great way to eke the most out of a limited silicon process.  Don't get me wrong, Cell was marvelous back then, if you could code for it.






GeForce 8800 is better.

CELL CPU = half-assed, in-order, Atom-like CPU.
CELL SPU = DSP, half-assed wannabe GPU.

CELL = master of none.

I'm game for another round of PS3 vs Core 2 + GeForce 8800 debates.


----------



## R-T-B (Mar 25, 2020)

rvalencia said:


> GeForce 8800 is better.



Which was also pretty good around the turn of the millennium... you are missing the point. My point was that old hardware is old.



rvalencia said:


> I'm game for another round of PS3 vs Core 2 + GeForce 8800 debates



I'll raise you something modern and kick all your butts.


----------



## MxPhenom 216 (Mar 25, 2020)

rvalencia said:


> RDNA CU includes ALU, TMU, TFU, SRAM and RT cores.  XSX's GPU TFLOPS increase and additional PC CPU node were matched by memory bandwidth increase.
> 
> Sony loaded PS5's 448 GB/s bandwidth (the same as 5700/5700 XT memory bandwidth) with additional PC CPU node and slightly higher GPU TFLOPS.
> 
> ...



Try again?


----------



## Vayra86 (Mar 26, 2020)

I think a new definition of spreadsheet heroes is slowly taking shape here.


----------



## rvalencia (Mar 26, 2020)

MxPhenom 216 said:


> Try again?


It's in response to your following statement:

_LOL do you know how TFLOPS are calculated? Do you also know it only takes into consideration one operation by a compute chip? It doesnt tell the whole story. Its also primarily marketing for consoles since average console player doesnt know what the hell it is. Its rarely used as an actual way of marketting a chip when it comes to PCs, etc. _

Your argument mirrors Mark Cerny's PS5 defense argument, but an increase in CU count also increases TMUs, TFUs, SRAM and RT cores.

XSX is already scaling into RTX 2080-level results with a two-week-old raw Gears of War 5 port's built-in benchmark at PC Ultra settings, hence TFLOPS is scaling.


----------



## Super XP (Mar 26, 2020)

rvalencia said:


> It's for your following statement
> 
> _LOL do you know how TFLOPS are calculated? Do you also know it only takes into consideration one operation by a compute chip? It doesnt tell the whole story. Its also primarily marketing for consoles since average console player doesnt know what the hell it is. Its rarely used as an actual way of marketting a chip when it comes to PCs, etc. _
> 
> ...


Based on what has been revealed by both Microsoft & Sony, the Xbox Series X is the faster console versus the PS5. Will that make a difference in actual gaming performance and visual quality? To me, it seems Microsoft took the brute-force method of achieving high-end performance without image-quality loss, whereas Sony took the non-brute-force method, which may require some clever development techniques to somehow utilize the entire platform to gain high-end performance without image-quality loss.

Not saying that they won't cleverly design games to fully utilize the Xbox Series X too, but at the end of all this, the XSX is the stronger console. Microsoft went all out, and seems a lot more serious than Sony this time around. And I don't blame them for such a decision; they need to gain as much market share as possible and get back to Xbox 360 sales figures or beyond.

The only thing that will determine the success of the XSX is price. If they get that wrong, the PS5 is going to clobber it.


----------



## Valantar (Mar 26, 2020)

Super XP said:


> Based on what has been revealed by both Microsoft & Sony, the Xbox Series X is the faster console versus the PS5. Will that make a difference in actual gaming performance and visual quality? To me, it seems Microsoft took the brute-force method of achieving high-end performance without image-quality loss, whereas Sony took the non-brute-force method, which may require some clever development techniques to somehow utilize the entire platform to gain high-end performance without image-quality loss.
> 
> Not saying that they won't cleverly design games to fully utilize the Xbox Series X too, but at the end of all this, the XSX is the stronger console. Microsoft went all out, and seems a lot more serious than Sony this time around. And I don't blame them for such a decision; they need to gain as much market share as possible and get back to Xbox 360 sales figures or beyond.
> 
> The only thing that will determine the success of the XSX is price. If they get that wrong, the PS5 is going to clobber it.


I kind of disagree - IMO pushing clocks for a limited size die is more of a brute force approach than sizing the die sensibly for the workload at hand. Regardless of Sony's fancy talk of balancing power draw etc., that they have such a high peak clock tells us quite clearly that they at some point in development (beyond the point of no return for hardware designs) realized they were significantly behind in power and decided to boost clocks to compensate.

As for Cerny's argument that a smaller, higher-clocked GPU is "more nimble" - that's nonsense, plain and simple. If that were indeed the case, overclocking PC GPUs would yield higher-than-linear results, and lower-tier OC cards would outperform stock-clocked higher-tier cards. What actually happens is that gains from OCs are almost universally far lower than the clock increase would indicate, with typical recent examples being 2-3% average performance increases from 8-10% clock increases and 20-30% power increases. Sony is trying to put a positive spin on putting their money on a weaker chip.

Now, the difference between the consoles in compute power isn't massive by any means, and the PS5 will no doubt have excellent-looking games, but cross-platform games will look better and/or sustain frame rates better on the XSX, just like on the XOX vs. PS4 Pro.

The effects of Sony's other hardware investments (SSD and 3D Audio) will be very interesting to see, but I sincerely doubt they'll do anything to alleviate the performance bottleneck - slightly shorter loading times is good, but unless MS has really botched their implementation the difference won't be huge, and while I'm _really_ looking forward to games implementing a good, realistic 3D audio system, it won't be what makes or breaks a game. Of course a 15-20% performance advantage isn't likely to either, but it's more immediately noticeable. Then again Sony has such a mindshare advantage that they'll really need to botch this to not still come out ahead in terms of sales.
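For reference, the compute gap being discussed here follows directly from the published shader configurations - 36 CUs at up to 2.23 GHz for the PS5 versus 52 CUs at a fixed 1.825 GHz for the XSX, with 64 shaders per RDNA CU:

```python
def console_tflops(cus: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPS: 2 ops (one FMA) x 64 shaders/CU x CUs x clock."""
    return 2 * 64 * cus * clock_ghz / 1000  # /1000 converts GFLOPS -> TFLOPS

ps5 = console_tflops(36, 2.23)    # ~10.28 TFLOPS, and only at peak boost
xsx = console_tflops(52, 1.825)   # ~12.15 TFLOPS, sustained
print(f"XSX advantage at PS5 peak clock: {xsx / ps5 - 1:.0%}")  # 18%
```

Note that the ~18% figure assumes the PS5 holds its peak boost; any downclocking under load widens the gap, which is where the 15-20% range above comes from.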


----------



## Super XP (Mar 26, 2020)

Valantar said:


> I kind of disagree - IMO pushing clocks for a limited size die is more of a brute force approach than sizing the die sensibly for the workload at hand. Regardless of Sony's fancy talk of balancing power draw etc., that they have such a high peak clock tells us quite clearly that they at some point in development (beyond the point of no return for hardware designs) realized they were significantly behind in power and decided to boost clocks to compensate.
> 
> As for Cerny's argument that a smaller, higher clocked GPU is "more nimble" - that's nonsense, plain and simple. If that was indeed the case, overclocking PC GPUs would yield higher-than-linear results, and lower-tier OC cards would outperform stock-clocked higher tier cards. What actually happens is that gains from OC's are almost universally far lower than the clock increase would indicate, with the typical recent examples being 2-3% average performance increases from 8-10% clock increases and 20-30% power increases. Sony is trying to put a positive spin on putting their money on a weaker chip.
> 
> ...


*Interesting.* So you actually think Sony's originally planned clock speeds were not that high? Because they saw the Xbox Series X specifications and went "WOW, we have a problem on our hands, people! We completely underestimated Microsoft's XSX specs. And since it's too late to re-design our special PS5 console, let's jack up the clocks and hope that by the time it's released, we gain an attractive performance/energy-consumption balance."

*I AGREE. *


----------



## Valantar (Mar 26, 2020)

Super XP said:


> *Interesting.* So you actually think Sony's originally planned clock speeds were not that high? Because they saw the Xbox Series X specifications and went "WOW, we have a problem on our hands, people! We completely underestimated Microsoft's XSX specs. And since it's too late to re-design our special PS5 console, let's jack up the clocks and hope that by the time it's released, we gain an attractive performance/energy-consumption balance."
> 
> *I AGREE. *


Pretty much, yes. They obviously knew well before the announcement (or at least had a solid idea of ballpark performance), as they otherwise wouldn't have had time to actually test power/cooling/etc. for an announcement (disregarding the fact that it was recorded days if not weeks before, demonstrated by some media outlets being allowed to see it early). But yes, I do believe they initially ordered what they saw as a powerful chip at an acceptable price for selling the console (especially when accounting for the additional cost of a bespoke 12-channel SSD controller and several accompanying bespoke silicon designs going into the APU), got word that MS' chip was ~33% faster (12TF v. 9TF), and kind of went "Oh shi-".


----------



## MxPhenom 216 (Mar 26, 2020)

Valantar said:


> Pretty much, yes. They obviously knew well before the announcement (or at least had a solid idea of ballpark performance), as they otherwise wouldn't have had time to actually test power/cooling/etc. for an announcement (disregarding the fact that it was recorded days if not weeks before, demonstrated by some media outlets being allowed to see it early). But yes, I do believe they initially ordered what they saw as a powerful chip at an acceptable price for selling the console (especially when accounting for the additional cost of a bespoke 12-channel SSD controller and several accompanying bespoke silicon designs going into the APU), got word that MS' chip was ~33% faster (12TF v. 9TF), and kind of went "Oh shi-".



If Sony's game library is better like it was with PS4, performance won't even matter. People with a gaming PC have almost zero reason to have an Xbox since all those games will be playable on PC. All of them.


----------



## Super XP (Mar 26, 2020)

MxPhenom 216 said:


> If Sony's game library is better like it was with PS4, performance won't even matter. People with a gaming PC have almost zero reason to have an Xbox since all those games will be playable on PC. All of them.


Not necessarily true. And I'm not sure why people continue comparing gaming PCs with consoles. They serve completely different markets.
So far the PS5 will only be backwards compatible with PS4 games, with no added visual enhancements.
The XSX will be backwards compatible with all previous Xbox games regardless of system, and will also have image-quality enhancements and resolution-upscaling capabilities.

Sure, Sony has a large library of exclusives, but Microsoft has been busy buying up game companies and has been steadily increasing its exclusive library.

And once again, stop comparing PCs with consoles; as soon as one does this, the argument becomes moot.


----------



## MxPhenom 216 (Mar 26, 2020)

Super XP said:


> Not necessarily true. And not sure why people continue comparing gaming PCs with consoles. They both serve completely different markets.
> So far the PS5 will only be backwards compatible with PS4 games. With no added visual enhancements.
> The XSX will be backwards compatible with all previous Xbox games regardless of system and will also have image quality enhancements and resolution upscaling capabilities.
> 
> ...



Just because they have been buying up game companies doesn't mean they will make games people actually want to play.

If you buy a console for games, and you have a PC for gaming, where is the invalidity of that comparison? Especially when console tech is much closer now to what you'll get in a PC than in previous generations.


----------



## Super XP (Mar 27, 2020)

MxPhenom 216 said:


> Just because they have been buying up game companies doesn't mean they will make games people actually want to play.
> 
> If you buy a console for games, and you have a PC for gaming, where is the invalidity of that comparison? Especially when console tech is much closer now to what you'll get in a PC than in previous generations.


Microsoft is crossing its T's and dotting its I's this time around; they know how badly they messed up with the Xbox One, and they will not repeat the same mistakes. Buying up game companies and asking them to design games based on market perception and feedback is a positive move IMO.
* Consoles this time around will be powerful, with a Zen 2 processor & RDNA2 graphics. Consoles will always remain behind PC gaming no matter how powerful anybody tries to make them; that's a common fact that is indisputable. But that's alright: this generation will allow even better graphics, pushing game-design boundaries for the whole PC & console gaming industry.

The difference is playing side by side with friends and family in the living room, sitting on a comfortable couch in front of a big-screen 4K HDTV. You can't do that with a gaming PC, which is why a gaming PC is a lot more personal. They shouldn't be compared to one another, as I've stated, because both serve different market segments.

A select few have managed to set up a gaming PC in the living room, and they most likely live alone or have no kids.

Forgot to mention: the one brilliant piece of the Xbox Series X puzzle is compatibility & upscaling. It is compatible with all past Xbox games and will clean up image quality and upscale. Not sure if they can achieve this, but it would be a game changer for many if they can.


----------



## Xuper (Mar 30, 2020)

PS5 is not based on RDNA2, just RDNA with ray-tracing tech.


----------



## Valantar (Mar 30, 2020)

Xuper said:


> PS5 is not based on RDNA2, just RDNA with ray-tracing tech.


Source? Digital Foundry disagrees with you at least. And they do tend to have good sources and accurate reporting.


----------



## Xuper (Mar 30, 2020)

Valantar said:


> Source? Digital Foundry disagrees with you at least. And they do tend to have good sources and accurate reporting.











Next-Generation Console Named "PlayStation 5", Launching in the 2020 Holiday Season (www.sie.com)

Sony Interactive Entertainment (SIE) announces that its next-generation console has been named "PlayStation 5" (PS5) and will launch in the 2020 holiday season.


----------



## Valantar (Mar 30, 2020)

Xuper said:


> Next-Generation Console Named "PlayStation 5", Launching in the 2020 Holiday Season
> 
> Sony Interactive Entertainment (SIE) announces that its next-generation console has been named "PlayStation 5" (PS5) and will launch in the 2020 holiday season.
> ...


That might be a typo, a copywriting error, a mistranslation, or a host of other errors (or just someone assuming that RDNA 2 is accurately described by "RDNA-based"). Until proven otherwise I trust what they tell the press more than a single spec sheet. And the press were told RDNA 2 (unless every single press outlet in attendance heard "RDNA-based" and wrote RDNA 2). You might obviously be right, but for now that's the less reasonable assumption.


----------



## Xuper (Mar 30, 2020)

Valantar said:


> That might be a typo, copywriting error, mistranslation, or a host of other errors (or just someone assuming that RDNA 2 is accurately described by "RDNA-based"). Until proven otherwise I trust what they tell the press more than a single spec sheet. And the press were told RDNA 2 (unless every single press outlet in attendance heard "RDNA-based" and wrote RDNA 2). You might obviously be right, but for now that's the least reasonable assumption.



Sony hasn't confirmed it yet.









Playstation 5 [PS5] [Release November 12 2020] (forum.beyond3d.com)

the PS5 hardware seems very great...except for that bandwidth...if they can just get up to high 400's that would be fine. But to have the same amount of resources as the equivalent RDNA 1 GPU split between the CPU and GPU both with 448...it feels artificially limiting in the same way Pro's...
				




Start from post 906. Does PS5 have VRS? VRS is an RDNA2 feature. If PS5 doesn't have it, then it's just RDNA 1 with some features from RDNA2 / some tech for ray-tracing.

Why does AMD mention MS and not Sony (LINK)? The lack of info from Sony could be translated into bad news. If you look at PS5's numbers vs XBSX's, PS5 falls short of XBSX in many respects.



__ https://twitter.com/i/web/status/1244617320875638785

__ https://twitter.com/i/web/status/1216473659281375239


Not my twitter


----------



## Super XP (Mar 31, 2020)

Xuper said:


> PS5 is not based on RDNA2, just RDNA with ray-tracing tech.


Umm, no it's not; it's based on RDNA2. Both consoles are based on a customized RDNA2.
Anybody claiming it's based on RDNA1 doesn't know what they are talking about.

The problem is that people and companies are referring to RDNA2 simply as RDNA, or as next-gen RDNA.


----------



## Rahnak (Apr 2, 2020)

PlayStation 5 uncovered: the Mark Cerny tech deep dive (www.eurogamer.net)

On March 18th, Sony finally broke cover with in-depth information on the technical make-up of PlayStation 5. Expanding …
				




Deep dive on Sony's technical presentation by DF. Interesting read.


----------



## Chomiq (Apr 2, 2020)

Xuper said:


> Sony hasn't confirmed it yet.
> 
> 
> 
> ...


From the above deep dive:


> Both Sony and AMD have confirmed that PlayStation 5 uses a custom RDNA 2-based graphics core, but the recent DirectX 12 Ultimate reveal saw AMD confirm features that Sony has not, including variable rate shading


----------



## Valantar (Apr 2, 2020)

Chomiq said:


> From the above deep dive:


Yep, as I mentioned above, I would be _very_ surprised if DF was wrong about this.

It's also worth mentioning that just because Sony hasn't mentioned something (yet) doesn't mean they don't have it. Their PR department seems to have had a collective aneurysm, at least judging by what's coming from that side lately.


----------



## Rahnak (Apr 2, 2020)

Yeah, Sony's been too secretive about their system, given that Microsoft has been showering the public with information about theirs for a good while now. It might be detrimental in the long run, as Microsoft is taking in most of the hype.


----------



## Chomiq (Apr 2, 2020)

MS can use all them fancy DXR and VRS because they're the ones implementing it in DX12.


----------



## Rahnak (Apr 2, 2020)

Is VRS a DX feature? I looked it up but didn't find any concrete information. There's considerable Microsoft documentation about it that would make me think so, but Nvidia's website says it's compatible with DX11, DX12, OpenGL and Vulkan.


----------



## Xuper (Apr 2, 2020)

Chomiq said:


> From the above deep dive:



Still too many rumours. I'll wait.



Rahnak said:


> Is VRS a DX feature? I looked it up but didn't find any concrete information. There's considerable Microsoft documentation about it that would make me think so, but nvidia's website says it's compatible with DX11, DX12, OpenGL and Vulkan.











Variable-rate shading (VRS) - Win32 apps (docs.microsoft.com)

Variable-rate shading—or coarse pixel shading—is a mechanism that lets you allocate rendering performance/power at rates that vary across your rendered image.


----------



## Valantar (Apr 2, 2020)

Rahnak said:


> Is VRS a DX feature? I looked it up but didn't find any concrete information. There's considerable Microsoft documentation about it that would make me think so, but nvidia's website says it's compatible with DX11, DX12, OpenGL and Vulkan.


There are different implementations of the concept; one is through DX12_2 (Ultimate). Nothing is stopping others from making their own solution, but MS has it ready to go.
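To illustrate what VRS buys, here is a toy estimate (my own construction, not any real API) of the pixel-shading work saved when tiles outside a central region drop to a 2x2 coarse rate, i.e. one shader invocation per four pixels:

```python
# 4K screen divided into 16x16-pixel tiles; tiles outside the central
# region are shaded at a 2x2 coarse rate (1 invocation per 4 pixels).
WIDTH, HEIGHT, TILE = 3840, 2160, 16

def shading_invocations(center_fraction: float) -> int:
    tiles_x, tiles_y = WIDTH // TILE, HEIGHT // TILE
    total = 0
    for ty in range(tiles_y):
        for tx in range(tiles_x):
            # Normalized distance from screen center: 0 = center, 1 = edge.
            cx = abs((tx + 0.5) / tiles_x - 0.5) * 2
            cy = abs((ty + 0.5) / tiles_y - 0.5) * 2
            pixels_per_invocation = 1 if max(cx, cy) < center_fraction else 4
            total += TILE * TILE // pixels_per_invocation
    return total

full_rate = WIDTH * HEIGHT
coarse = shading_invocations(0.5)  # full rate only in the central region
print(f"shading work vs full rate: {coarse / full_rate:.0%}")  # 44%
```

Real implementations pick per-tile rates from content (motion, contrast, foveation) rather than a fixed central region, but the arithmetic of the savings works the same way.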


----------



## Xuper (Apr 5, 2020)

AMD Big Navi and RDNA 2 GPUs: Everything We Know (www.tomshardware.com):

> The AMD Big Navi / RDNA 2 architecture powers the latest consoles and high-end graphics cards.
				






> What do the console specifications mean for Big Navi / Navi 2x and RDNA 2 desktop GPUs? Obviously times change, but we definitely know a few things. *First, AMD is fully capable of building an RDNA 2 / Big Navi GPU with at least 52 CUs, and very likely can go higher*. AMD is also using two completely different GPU configurations for the Xbox Series X and PlayStation 5, though that doesn't mean either configuration will actually end up in a PC graphics card. Sony's Mark Cerny was quick to point out that there's some undisclosed 'special sauce' in the PS5 processor, for example. Basically, the upcoming consoles give us a minimum baseline for what AMD can do with Big Navi.



That "special sauce" buzzword again.


----------



## rvalencia (Apr 7, 2020)

Xuper said:


> AMD Big Navi and RDNA 2 GPUs: Everything We Know
> 
> 
> The AMD Big Navi / RDNA 2 architecture powers the latest consoles and high-end graphics cards.
> ...


XSX has 52 CUs enabled out of 56 physical, with four disabled for yield. An XT variant would be the full 56 CUs.


----------



## Valantar (Apr 7, 2020)

rvalencia said:


> XSX has 52 CU with four disabled CUs for yield issues. XT variant would be 56 CU.


Why? The XSX is a semi-custom APU, not a dGPU. There's no reason to assume AMD will release the same configuration as a dGPU. In fact it's quite unlikely, as designing a new chip from scratch would be easier than separating out the GPU part from that APU. The architecture is modular and scalable, so they can configure it to have as many CUs as they think suitable. There was never a dGPU version of the XOX GPU.


----------



## rvalencia (Apr 8, 2020)

Valantar said:


> Why? The XSX is a semi custom APU, not a dGPU. There's no reason to assume AMD will release the same configuration as a dGPU. In fact it's quite unlikely as designing a new chip from scratch would easier than separating out the GPU part from that APU. The architecture is modular and scalable, so they can configure it to have as many CUs as they think suitable. There was never a dGPU version of the XOX GPU.


HD 7790 was close to XBO's GPU.

X1X dev kits have 44 CUs enabled.


----------



## Valantar (Apr 8, 2020)

rvalencia said:


> HD 7790 was close to XBO's GPU.
> 
> X1X dev kits have 44 CUs enabled.


...so? Just because there exist many different hardware configurations across consoles and PC GPUs doesn't _whatsoever_ indicate that there will be a PC GPU with the same layout or CU count. As you say, the HD 7790 was _close to_ the XBO GPU - but not the same! And as CU counts grow, the likelihood of matching layouts drops. Back when 14 CUs was relevant, it made sense that both the XBO and a dGPU had that many. Now, with the XSX at 52 (56, 4 disabled) and AMD removing their previous hard architectural limit of 64 CUs when launching RDNA, the closest PC SKU might have 40, 50, 60, whatever. Of course it _could_ end up with 56 or 52, but that depends on how performance scales and how AMD wants to segment their product stack.

The point is that you are assuming a causal relation here that does not exist. The XSX GPU design is entirely separate from any related and similarly sized RDNA 2 PC GPU design; it was made by a different division and customized according to Microsoft's wishes. Expecting a similarly sized PC GPU makes sense - there's room for it in their product stack - but assuming a direct relation like saying "The XSX has 52 CUs, an XT version might have 56" as if they were the same silicon design makes _no_ sense whatsoever.


----------



## Dmu (Apr 8, 2020)

What do you guys think of the new controller for the PS5?

It really looks Xbox-like.


----------



## Valantar (Apr 8, 2020)

Dmu said:


> What do you guys think of the new controller for the PS5 ?
> 
> IT's really looking Xbox alike.


IMO the visual design is way too "look how futuristic I am" - looks like a prop from some mid-budget sci-fi TV show. If this is any indication of the console design, I'm kind of pessimistic (frankly it makes me expect the console to look like a squished stormtrooper helmet). Doesn't work for me, and sadly the colours and lighting in the photos serve to hide the physical design of the controller, making it difficult to tell how it's sculpted. At least it looks somewhat like it was designed for actual human hands, unlike previous Playstation controllers, so it ought to be an improvement no matter what. I still think it won't match the ergonomics of the XB1 controller (curious how the smaller size of the XSX controller will affect that impression), but it will undoubtedly be better.


----------



## Super XP (Apr 8, 2020)

Anything like a screen built into the game controller is most likely borrowed from the highly innovative Sega Dreamcast, which was years ahead of its time. Infighting between Sega of Japan and Sega of America led to poor console decisions, though.


https://www.digitalgamemuseum.org/wp-content/uploads/2012/09/Exhibit020.jpg
		

Not sure if this version was ever official. 




----------



## Valantar (Apr 8, 2020)

Super XP said:


> Any digitization like a screen built into the game controllers are most likely borrowed by the highly innovative Sega Dreamcast, which was years ahead of its time. And fighting between Sega of Japan vs. Sega of America caused poor console decisions etc.,
> 
> 
> Google Image Result for https://www.digitalgamemuseum.org/wp-content/uploads/2012/09/Exhibit020.jpg
> ...


There's no screen. Just haptic triggers (kind of fun to see PS fans crowing over this "new" and "exciting" feature that Xbox users have had since 2013) and some vaguely improved haptic feedback (akin to Nintendo's HD Rumble maybe?). And the "share" button is now a "create" button, which... is the same? I mean, you need to create something (a screenshot, video, etc.) to share anyhow. We'll see how the software pans out. Also a moved light bar, integrated microphone, and overall rounded design.


----------



## Super XP (Apr 8, 2020)

Valantar said:


> There's no screen. Just haptic triggers (kind of fun to see PS fans crowing over this "new" and "exciting" feature that Xbox users have had since 2013) and some vaguely improved haptic feedback (akin to Nintendo's HD Rumble maybe?). And the "share" button is now a "create" button, which... is the same? I mean, you need to create something (a screenshot, video, etc.) to share anyhow. We'll see how the software pans out. Also a moved light bar, integrated microphone, and overall rounded design.


I know; I was referring to a few rumoured models online, which claim Sony might release a controller with an LED screen.
Sony patent:








Playstation 5's new Dualshock design leaked? » E-Sports Continental (esportscontinental.com):

> In this game news article, we discuss the leaked design sketches of the rumoured PS5 Dualshock 5 controller set to be released in Holiday 2020.


----------



## rvalencia (Apr 16, 2020)

Valantar said:


> ...so? Just because there exist many different hardware configurations across consoles and PC GPUs doesn't _whatsoever _indicate that there will be a PC GPU with the same layout or CU count. As you say, the HD 7790 was _close to_ the XBO GPU - but not the same! And as CU counts grow, the likelihood of matching layouts drops. Back when 14 CUs was relevant, it made sense that both the XBO and a dGPU had that many. Now, with the XSX at 52 (56, 4 disabled) and AMD removing their previous hard architectural limit of 64 CUs when launching RDNA, the closest PC SKU might have 40, 50, 60, whatever. Of course it _could_ end up with 56 or 52, but that depends on how performance scales and AMD wants to segment their product stack. The point is that you are assuming a causal relation here that does not exist. The XSX GPU design is entirely separate from any related and similarly sized RDNA 2 PC GPU design; it was made by a different division and customized according to Microsoft's wishes. Expecting a similarly sized PC GPU makes sense - there's room for it in their product stack - but assuming a direct relation like saying "The XSX has 52 CUs, an XT version might have 56" as if they were the same silicon design makes _no _sense whatsoever.


Don't expect miracles over the PC counterpart.


----------



## Super XP (Apr 17, 2020)

Valantar said:


> ...so? Just because there exist many different hardware configurations across consoles and PC GPUs doesn't _whatsoever _indicate that there will be a PC GPU with the same layout or CU count. As you say, the HD 7790 was _close to_ the XBO GPU - but not the same! And as CU counts grow, the likelihood of matching layouts drops. Back when 14 CUs was relevant, it made sense that both the XBO and a dGPU had that many. Now, with the XSX at 52 (56, 4 disabled) and AMD removing their previous hard architectural limit of 64 CUs when launching RDNA, the closest PC SKU might have 40, 50, 60, whatever. Of course it _could_ end up with 56 or 52, but that depends on how performance scales and AMD wants to segment their product stack. The point is that you are assuming a causal relation here that does not exist. The XSX GPU design is entirely separate from any related and similarly sized RDNA 2 PC GPU design; it was made by a different division and customized according to Microsoft's wishes. Expecting a similarly sized PC GPU makes sense - there's room for it in their product stack - but assuming a direct relation like saying "The XSX has 52 CUs, an XT version might have 56" as if they were the same silicon design makes _no _sense whatsoever.


Agreed. The Xbox Series X is a customized combined RDNA2 and Zen 2 solution (so is the PS5: pure customization to suit the vendor buying these chips).
Nobody knows how it differs from just buying an equivalent Zen 2 CPU and RDNA2 discrete GPU. They won't be the same, in other words.


----------

