# Microsoft Confirms Xbox Series X Specs - 12 TFLOPs, Custom APU With Zen 2, RDNA 2, H/W Accelerated Raytracing



## Raevenlord (Feb 24, 2020)

Microsoft has confirmed the official specs for the Xbox Series X games console, due Holiday 2020 (think November). The specs announcement confirms what a powerhouse of a console this will be, with its peak 12 TFLOPs of compute being eight times that of the original Xbox One, and twice that of the Xbox One X, which was already quite capable of powering true 4K experiences. This 12 TFLOPs figure is a mighty impressive one - just consider that AMD's current highest-performance graphics card, the Radeon VII, features a peak 13.4 TFLOPs of computing power - and that's a graphics card that launched just a year ago.

The confirmation also mentions support for hardware-accelerated raytracing, which all but confirms the feature is built into AMD's RDNA 2 microarchitecture (of which we are expecting news any time now). This, alongside Variable Rate Shading (VRS) support, brings AMD to feature parity with NVIDIA's Turing, and should allow developers to optimize their performance and graphical targets without any discernible quality loss.



 



Another very important metric, however, is still unknown: pricing. But with the new system featuring so many pieces of top-of-the-line technology, including SSD storage, this won't be a cheap endeavor. With Sony also keeping its cards close to its chest in regards to PS5 pricing (and the company even canceling their appearances at PAX East and GDC), we'll have to wait and see how interesting these systems really are from a value standpoint.

However, I have to throw my two cents in here: Microsoft's reported approach of releasing a top-tier (and top-priced) console in the Xbox Series X alongside a cheaper, more nimble system paints the next-gen scenario as more favorable for Microsoft than for Sony, should the Japanese company choose to release a single, premium system (whose hardware parts cost is reportedly set at around $450).

*Features of XBOX Series X:*

*Next Generation Custom Processor:* Xbox Series X is our most powerful console ever powered by our custom designed processor leveraging AMD's latest Zen 2 and RDNA 2 architectures. Delivering four times the processing power of an Xbox One and enabling developers to leverage 12 TFLOPS of GPU (Graphics Processing Unit) performance - twice that of an Xbox One X and more than eight times the original Xbox One. Xbox Series X delivers a true generational leap in processing and graphics power with cutting edge techniques resulting in higher framerates, larger, more sophisticated game worlds, and an immersive experience unlike anything seen in console gaming.
*Variable Rate Shading (VRS):* Our patented form of VRS empowers developers to more efficiently utilize the full power of the Xbox Series X. Rather than spending GPU cycles uniformly to every single pixel on the screen, they can prioritize individual effects on specific game characters or important environmental objects. This technique results in more stable frame rates and higher resolution, with no impact on the final image quality.
*Hardware-accelerated DirectX Raytracing:* You can expect more dynamic and realistic environments powered by hardware-accelerated DirectX Raytracing - a first for console gaming. This means true-to-life lighting, accurate reflections and realistic acoustics in real time as you explore the game world.
*SSD Storage:* With our next-generation SSD, nearly every aspect of playing games is improved. Game worlds are larger, more dynamic and load in a flash and fast travel is just that - fast.
*Quick Resume:* The new Quick Resume feature lets you continue multiple games from a suspended state almost instantly, returning you to where you were and what you were doing, without waiting through long loading screens.
*Dynamic Latency Input (DLI):* We're optimizing latency in the player-to-console pipeline starting with our Xbox Wireless Controller, which leverages our high bandwidth, proprietary wireless communication protocol when connected to the console. With Dynamic Latency Input (DLI), a new feature which synchronizes input immediately with what is displayed, controls are even more precise and responsive.
*HDMI 2.1 Innovation:* We've partnered with the HDMI Forum and TV manufacturers to enable the best gaming experience through features such as Auto Low Latency Mode (ALLM) and Variable Refresh Rate (VRR). ALLM allows Xbox One and Xbox Series X to automatically set the connected display to its lowest latency mode. VRR synchronizes the display's refresh rate to the game's frame rate, maintaining smooth visuals without tearing and ensuring minimal lag and the most responsive gaming experience.
*120 fps Support:* With support for up to 120 fps, Xbox Series X allows developers to exceed standard 60 fps output in favor of heightened realism or fast-paced action.

*View at TechPowerUp Main Site*


----------



## Cheeseball (Feb 24, 2020)

Hmm... 12 TFLOPS eh? For comparison, the RTX 2080 Ti pushes up to 11 TFLOPS (at FP32) non-boost with the Titan RTX at 12 TFLOPS. This is not bad from a gaming standpoint.


----------



## cucker tarlson (Feb 24, 2020)

Niiiiiceeeee



Cheeseball said:


> Hmm... 12 TFLOPS eh? For comparison, the RTX 2080 Ti pushes up to 11 TFLOPS (at FP32)


13.5/16.3


----------



## Cheeseball (Feb 24, 2020)

cucker tarlson said:


> 13.5/16.3





Cheeseball said:


> ... non-boost ...


----------



## dj-electric (Feb 24, 2020)

TFLOPS became de-facto marketing term by now. Hollow and meaningless as they come.


----------



## cucker tarlson (Feb 24, 2020)

Cheeseball said:


>


I don't know what is the reason for looking at base clock but ok


anyway

5700 XT = 2070
9.75 TFLOPs = 7.5 TFLOPs

to get 12 TFLOPs of RDNA1 power, Turing would need 9.2 TFLOPs, which is around a 2070 Super FE (9.1)

2080 Super is 11 TFLOPs for comparison

consoles getting a 2070 Super - this is really good, this card pretty much rips through 1440p ultra ATM. If the price is $500 it's really good for current gen. Dunno about Ampere, with 7nm you may get current 2070 Super performance at $350.
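A quick sketch of that conversion, assuming the poster's pairing of the reference RX 5700 XT (9.75 TFLOPs) with the RTX 2070 (7.5 TFLOPs) as roughly equal in games - the ratio is the poster's estimate, not an official figure:

```python
# Back-of-the-envelope: how many Turing TFLOPs match a given RDNA1 TFLOP
# figure, using the RX 5700 XT (9.75 TFLOPs) ~= RTX 2070 (7.5 TFLOPs) pairing.
RDNA1_TFLOPS = 9.75   # RX 5700 XT peak FP32
TURING_TFLOPS = 7.5   # RTX 2070 peak FP32

def equivalent_turing_tflops(rdna1_tflops: float) -> float:
    """Scale an RDNA1 TFLOP figure by the assumed perf-per-TFLOP ratio."""
    return rdna1_tflops * (TURING_TFLOPS / RDNA1_TFLOPS)

# 12 RDNA1 TFLOPs lands near a 2070 Super FE (9.1 TFLOPs)
print(round(equivalent_turing_tflops(12.0), 1))  # 9.2
```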


----------



## Joss (Feb 24, 2020)

> due Holiday 2020 (think November)


That's if the Coronavirus situation is contained by then.


----------



## shk021051 (Feb 24, 2020)

monster


----------



## Cheeseball (Feb 24, 2020)

cucker tarlson said:


> I don't know what is the reason for looking at base clock but ok



Well, you're not wrong at least.  

120 FPS at console level is welcome (it's not stated whether that figure is at 1080p or 4K, but I doubt the latter). Now they just need to make mainstream HDTVs (e.g. less than $600) that have real 120+ Hz panels. The alternative is to hook the console up to a monitor and route the audio to external speakers.


----------



## Xuper (Feb 24, 2020)

MS is using TF to avoid any speculation on RDNA2. It's simple.


----------



## Durvelle27 (Feb 24, 2020)

I'm definitely getting one on day one


----------



## erek (Feb 24, 2020)

dj-electric said:


> TFLOPS became de-facto marketing term by now. Hollow and meaningless as they come.



remember the RV770?  Was pretty sweet on the TFlop


----------



## efikkan (Feb 24, 2020)

Let's hope this is 12 Tflop fp32, not fp16 (or something else). RX 5700 XT does 9.6 Tflop fp32 at 225W.


----------



## HD64G (Feb 24, 2020)

12 TFs means easy 120FPS@1440p and 60FPS@4K with high-very high detail for modern and upcoming titles. And it leads us to assume that the RDNA2 GPU in it has 52 or 56 CUs clocked around 1.4-1.5GHz (close to the optimum point of its efficiency curve for better thermals-noise). Monstrous console, and having full backwards compatibility makes it appealing indeed if priced between $600-700, and a bargain for less.
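As a sanity check on the CU/clock guess above: peak FP32 throughput for RDNA is CUs × 64 shaders × 2 FLOPs per cycle × clock. The snippet below (an illustration, not a leak) solves for the clock a 52- or 56-CU part would need to hit 12 TFLOPs:

```python
# Clock (GHz) an RDNA-style GPU needs to reach a target TFLOPs figure:
# TFLOPs = CUs * 64 shaders * 2 FLOPs/cycle * clock_GHz / 1000
def clock_for_tflops(cus: int, target_tflops: float) -> float:
    return target_tflops * 1000.0 / (cus * 64 * 2)

for cus in (52, 56):
    print(f"{cus} CUs -> {clock_for_tflops(cus, 12.0):.2f} GHz")
```

With these numbers, 52 CUs would need about 1.80 GHz and 56 CUs about 1.67 GHz to reach 12 TFLOPs, somewhat above the 1.4-1.5 GHz estimate.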


----------



## Rowsol (Feb 24, 2020)

This is a massive jump in performance and I can't wait to see what developers are able to do with it.


----------



## THANATOS (Feb 24, 2020)

cucker tarlson said:


> I don't know what is the reason for looking at base clock but ok
> 
> 
> anyway
> ...


You should look at the real average clockspeed.
RTX 2070 - 1862Mhz -> 8.6 TFLOPs (100%) https://www.techpowerup.com/review/nvidia-geforce-rtx-2070-founders-edition/37.html
RTX 2070 Super - 1879Mhz -> 9.6 TFLOPs (111.6%) https://www.techpowerup.com/review/nvidia-geforce-rtx-2070-super/33.html
RX 5700 XT - 1887Mhz -> 9.7 TFLOPs (112.8%) https://www.techpowerup.com/review/amd-radeon-rx-5700-xt/34.html

Another thing is that this is based on RDNA2, and we don't know how much better it is compared to the first RDNA architecture.
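The figures above follow from shader count × 2 FLOPs per cycle (FMA) × clock; a small sketch using the public shader counts (2304 for the 2070, 2560 for the 2070 Super and the 5700 XT) and the average clocks quoted above:

```python
# Peak FP32 TFLOPs = shaders * 2 FLOPs/cycle (FMA) * clock (GHz) / 1000
cards = {
    "RTX 2070":       (2304, 1.862),
    "RTX 2070 Super": (2560, 1.879),
    "RX 5700 XT":     (2560, 1.887),
}

def tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000.0

for name, (shaders, clk) in cards.items():
    print(f"{name}: {tflops(shaders, clk):.1f} TFLOPs")
# RTX 2070: 8.6, RTX 2070 Super: 9.6, RX 5700 XT: 9.7
```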


----------



## BArms (Feb 24, 2020)

TFLOP figure includes both CPU and GPU figures right?  Presumably the GPU might only be 10TFLOP + 2 from the CPU, not that you would use the CPU for floating point that much but it's good enough for marketing.


----------



## cucker tarlson (Feb 24, 2020)

BArms said:


> TFLOP figure includes both CPU and GPU figures right?  Presumably the GPU might only be 10TFLOP + 2 from the CPU, not that you would use the CPU for floating point that much but it's good enough for marketing.


3700x is 0.5tflop


----------



## medi01 (Feb 24, 2020)

Cheeseball said:


> Hmm... 12 TFLOPS eh? For comparison, the RTX 2080 Ti pushes up to 11 TFLOPS (at FP32) non-boost with the Titan RTX at 12 TFLOPS. This is not bad from a gaming standpoint.


Cross platform TF comparison is highly misleading, when talking about gaming perf.

5700XT is RDNA and 9.7Tflops.
RDNA2 chip at 12T should be *about 20%+ faster, so around 2080.*

Next gen consoles are going to be insane value.

PS
It is also curious that AMD is sitting on a 2080 perf level 320-330mm2 chip and not releasing it.


----------



## Fluffmeister (Feb 24, 2020)

Nice specs for sure, shame it's still at least 9 months away apparently. It will be good when games start taking advantage of tech like VRS and DXR.

Turing users are already set; otherwise it's a case of getting a discrete RDNA2 card or a shiny new Ampere card.


----------



## THANATOS (Feb 24, 2020)

cucker tarlson said:


> 3700x is 0.5tflop


From what I found it's actually twice as much. 
8 cores * 4.4GHz * 32 FLOPs/cycle -> 1.13 TFLOPs
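The 32 FLOPs/cycle figure comes from Zen 2's two 256-bit FMA pipes per core (8 FP32 lanes × 2 FLOPs per FMA × 2 units); as a quick check of the arithmetic:

```python
# Peak FP32 throughput of a CPU: cores * clock * FLOPs retired per cycle.
# Zen 2: two 256-bit FMA units/core -> 8 lanes * 2 FLOPs * 2 units = 32.
def cpu_tflops(cores: int, clock_ghz: float, flops_per_cycle: int = 32) -> float:
    return cores * clock_ghz * flops_per_cycle / 1000.0

print(round(cpu_tflops(8, 4.4), 2))  # 3700X at 4.4 GHz boost: 1.13
```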


----------



## Chomiq (Feb 24, 2020)

I'm most happy about the fact that backwards compatibility is one of the main selling factors. Wish that Sony would follow in their footsteps.


----------



## Lindatje (Feb 24, 2020)

Now Ray Tracing will really get off the ground, hope Nvidia can still keep up with Ray Tracing as the games are going to be optimized for AMD.


----------



## Fluffmeister (Feb 24, 2020)

Lindatje said:


> Now Ray Tracing will really get off the ground, hope Nvidia can still keep up with Ray Tracing as the games are going to be optimized for AMD.



Keep up? They are waiting for AMD to support it too, this is good all round.

Besides the myth that the current gen consoles would help AMD storm ahead didn't come to shit either.


----------



## Chomiq (Feb 24, 2020)

Fluffmeister said:


> Keep up? They are waiting for AMD to support it too, this is good all round.
> 
> Besides the myth that the current gen consoles would help AMD storm ahead didn't come to shit either.


2018:

Nvidia: These 11 Titles First to Support RTX Ray Tracing
Nvidia's RTX 2000-series cards offer promising Ray Tracing and AI-based feature upgrades. But beyond a short list of games, there's plenty we don't know yet.
www.tomshardware.com

Late 2019:

https://www.digitaltrends.com/computing/games-support-nvidia-ray-tracing/

So they went from 11 to 10 titles that support SOME of the RTX features. Remind me please, what's Nvidia's GPU market share? Sure, they're waiting for AMD to keep up...


----------



## TheDeeGee (Feb 24, 2020)

Joss said:


> That's if the Coronavirus situation is contained by then.



Doubt it at this rate, no cure being tested on people until late April.


----------



## Fluffmeister (Feb 24, 2020)

Chomiq said:


> 2018:
> 
> 
> 
> ...



Last time I checked the only hardware to support either VRS or DXR was Turing, but you are all correct Nvidia are once again doomed and playing catch-up.


----------



## Assimilator (Feb 24, 2020)

Remember when people bought consoles to play games on? Imagine if the marketing focused on the games that this console will let you play. Crazy suggestion, I know!



Chomiq said:


> 2018:
> 
> 
> 
> ...



If RTX is so irrelevant then why is AMD supporting it?


----------



## ARF (Feb 24, 2020)

medi01 said:


> Cross platform TF comparison is highly misleading, when talking about gaming perf.
> 
> 5700XT is RDNA and 9.7Tflops.
> RDNA2 chip at 12T should be *about 20%+ faster, so around 2080.*
> ...



Around RTX 2080 Super.
That's a custom chip made for Microsoft; they can't release it as a graphics card for gaming.
That's the job of Navi 21, at 505 mm2.



TheDeeGee said:


> Doubt it at this rate, no cure being tested on people until late April.



China has just reported that they have a vaccine.



Fluffmeister said:


> Keep up? They are waiting for AMD to support it too, this is good all round.
> 
> Besides the myth that the current gen consoles would help AMD storm ahead didn't come to shit either.



They do help but Nvidia uses other things to uplift its graphics performance - it's like AMD uses High quality settings natively in the Radeon Settings, while Nvidia uses Performance settings in the Control panel.



Assimilator said:


> If RTX is so irrelevant then why is AMD supporting it?



Because it's a joint effort of everyone in the industry, including Microsoft and AMD.
I don't know why Nvidia hurried so much, when the hardware is not ready and the RTX performance is quite low.


----------



## medi01 (Feb 24, 2020)

"Big Navi" "leak" by a known "you could as well trust things randomly typed by monkeys" site:

[UPDATE] Alleged AMD Next-Gen Flagship Navi 'Radeon RX' GPU Specifications Leaked - 5120 Cores, 24 GB HBM2e Memory, 2 TB/s Bandwidth
The alleged specifications of AMD's next-gen 'Big Navi' GPU powerhouse that powers the enthusiast gaming Radeon RX cards have leaked.
wccftech.com

Assimilator said:


> If RTX is so irrelevant then why is AMD supporting it?


I remember VR that many rushed to support; Sony even rolled out the PS4 Pro.
Where is all that now? ))

Consoles are made for years to come; they didn't want to miss the train, understandably.

AMD supports DXR, not "RTX", mkay?
DXR is not RTX.
RT is not RTX either.

30-40 million consoles with AMD's chip will be rolled out in 2021 (and at roughly the same pace for about 5 years to come).
Whatever tech they support will become a de facto standard.
It could be, and in fact, based on AMD's patents it likely is, quite different from NV's approach.



ARF said:


> Around RTX 2080 Super.


How so?
I used ref 5700XT as a base.


----------



## Vayra86 (Feb 24, 2020)

dj-electric said:


> TFLOPS became de-facto marketing term by now. Hollow and meaningless as they come.



This. I don't even bother with that number, tbh.

But... it's not some shitty last-year's midranger, that is for sure. Man, I really hope AMD is going to wow us with a new GPU that does RT well and smashes performance charts. Let's pray that the consoles are going to be good at least for that.


----------



## ARF (Feb 24, 2020)

medi01 said:


> How so?
> I used ref 5700XT as a base.



12 : 9.7 and then compared to:







medi01 said:


> "Big Navi" "leak" by a known "you could as well trust things randomly typed by monkeys" site:
> 
> 
> 
> ...



“…Radeon product management lead, Mithun Chandrasekhar, hinted at the red team’s plan for 4K domination…”

Like Ryzen, AMD’s Big Navi is “going to similarly disrupt 4K” gaming
"All of us need a thriving Radeon GPU ecosystem. So, are we going after 4K, and going to similarly disrupt 4K? Absolutely."
www.pcgamesn.com

----------



## Vayra86 (Feb 24, 2020)

Assimilator said:


> Remember when people bought consoles to play games on? Imagine if the marketing focused on the games that this console will let you play. Crazy suggestion, I know!
> 
> 
> 
> If RTX is so irrelevant then why is AMD supporting it?



Good games take effort and actual talent man, and you can't just outsource a bunch of goons to do it.



medi01 said:


> It could, and in fact, based on AMD's patent it likely is, quite different from NV approach.



The block diagrams we got to see were showing something rather similar really. Probably some tweaks here and there.



ARF said:


> “…Radeon product management lead, Mithun Chandrasekhar, hinted at the red team’s plan for 4K domination…”
> 
> Like Ryzen, AMD’s Big Navi is “going to similarly disrupt 4K” gaming
> 
> ...



Plans, yes. Disruption, yes. Products, pls


----------



## cucker tarlson (Feb 24, 2020)

All in all, from what little I gathered about what RT cores do, it is not really that computationally intensive; they're just an in-between calculation step for normal shading. 
I mean, what is it about the AMD approach that's really that different?


----------



## Lindatje (Feb 24, 2020)

ARF said:


> China has just reported that they have a vaccine.


Source?


----------



## ARF (Feb 24, 2020)

Lindatje said:


> Source?


Animal testing begins for first novel coronavirus vaccines in China: https://www.pharmaceutical-technology.com/news/animal-testing-novel-coronavirus-vaccine/

There is communication between Chinese authorities and Russia, and I read it.


----------



## renz496 (Feb 24, 2020)

Lindatje said:


> Now Ray Tracing will really get off the ground, hope Nvidia can still keep up with Ray Tracing as the games are going to be optimized for AMD.



When the 8th-gen consoles were dominated by AMD, did tessellation suddenly become much faster on AMD hardware on PC?


----------



## sepheronx (Feb 24, 2020)

I just may end up with one of these.


----------



## kings (Feb 24, 2020)

CD Projekt Red:



> Gamers should never be forced to purchase the same game twice or pay for upgrades. Owners of #Cyberpunk2077 for Xbox One will receive the Xbox Series X upgrade for free when available.




https://twitter.com/i/web/status/1231961469669068800


----------



## dinmaster (Feb 24, 2020)

Just want to point out that the new hardware the consoles are going to use isn't out yet in the PC market. When it does come out, PC cards (3080, 6000 series) will be the same or better than the console, and then, like before, the console will slip into oblivion as future hardware comes out for PC. The GPU and M.2 storage are the driving force of these new consoles from what I get. If I could change them then I might be interested... but that hasn't happened GPU-wise yet.


----------



## ARF (Feb 25, 2020)

medi01 said:


> "Big Navi" "leak" by a known "you could as well trust things randomly typed by monkeys" site:
> 
> 
> 
> ...


----------



## Chomiq (Feb 25, 2020)

Assimilator said:


> If RTX is so irrelevant then why is AMD supporting it?


I never said it's irrelevant. I said that over 2 years we haven't seen that much of support for a feature that was (and to this day is) the key selling point for their cards. Some Nvidia guy even admitted recently that their initial prognosis was too optimistic and that RTX implementation turned out to be problematic for "some" of the devs. 
Also don't confuse RTX with DXR.


----------



## HwGeek (Feb 25, 2020)

Did some calc based on avg clocks from TPU reviews:
RTX 2070 FE: 8.5 TFLOPs @ 1.85GHz avg clock
Nitro RX 5700 XT: 10.2 TFLOPs @ 2GHz avg clock
The RTX 2070 has 84% of the TFLOPs of the 5700 XT while delivering 93% of its performance.

IMO NV's TFLOPs values are misleading because their listed boost clock is always around 200MHz below the actual avg clock in games (1620MHz listed vs 1850MHz actual).
So in reality, performance/TFLOPs on RDNA is closer to RTX than the official numbers suggest; RDNA2 will close the gap IMO.
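The gap described above can be expressed as performance per TFLOP; a small sketch using the poster's own numbers (93% relative performance, 8.5 vs 10.2 TFLOPs at average clocks):

```python
# Perf-per-TFLOP comparison, using the poster's avg-clock figures:
turing = {"tflops": 8.5,  "perf": 93.0}   # RTX 2070 FE
rdna   = {"tflops": 10.2, "perf": 100.0}  # Nitro RX 5700 XT (baseline)

def perf_per_tflop(card: dict) -> float:
    return card["perf"] / card["tflops"]

ratio = perf_per_tflop(turing) / perf_per_tflop(rdna)
print(f"Turing perf/TFLOP vs RDNA at avg clocks: {ratio:.2f}x")  # 1.12x
```

So at real clocks Turing only extracts about 12% more per TFLOP than RDNA here, much less than the official boost-clock numbers would suggest.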


----------



## ZoneDymo (Feb 25, 2020)

dj-electric said:


> TFLOPS became de-facto marketing term by now. Hollow and meaningless as they come.



But how many megapixels does it have?



Assimilator said:


> If RTX is so irrelevant then why is AMD supporting it?



Same reason PhysX went nowhere it was supposed to go.
As long as it's one company pushing it, it's not going to get the support it needs to be relevant.


----------



## cucker tarlson (Feb 25, 2020)

HwGeek said:


> Did some calc based on avg clocks from TPU reviews:
> RTX 2070 FE: 8.5 Tflops@1.85Ghz avg clock
> Nitro RX 5700XT: 10.2 Tflops @2Ghz avg clock
> RTX 2070 has 84% of Tflops vs 5700XT while performance is 93% vs 5700XT.
> ...


2070 and 5700xt are the same


----------



## HwGeek (Feb 25, 2020)

I used most recent RX 5700XT Nitro review for calc - not the reference.


----------



## The Quim Reaper (Feb 25, 2020)

All very impressive, but the fact remains that with no exclusive games, as a PC gamer with a 9900K & 2080, I have absolutely no incentive whatsoever to want one or need one.

..as powerful as it is, I'm just one GPU upgrade away from stomping all over it and seeing the PC versions of their games run even better than the Series X can.

The PS5 on the other hand with all its exclusive 1st party content is a different matter entirely, even if it is (slightly) less powerful than the X.


----------



## ARF (Feb 25, 2020)

cucker tarlson said:


> 2070 and 5700xt are the same



The RX 5700 XT has problems at 2160p, it has bottlenecks somewhere in its configuration.
Better show 1080p where it is equal to RTX 2070 Super.


----------



## cucker tarlson (Feb 25, 2020)

HwGeek said:


> I used most recent RX 5700XT Nitro review for calc - not the reference.


And a stock 2070.
You're beating a dead horse, the reviews have been out for months.
AIB vs AIB, the 5700 XT is the same as a 2070; AIB OC, the same as a 2060 Super.


----------



## HwGeek (Feb 25, 2020)

You don't get it, do you? The TFLOPs I calculated are based on the avg clocks of the cards, compared to their performance in the RX 5700 XT Nitro review.
Using a different RTX 2070 won't change the performance/TFLOPs ratio (I am comparing the TFLOPs/performance ratio between RDNA and Turing).


----------



## cucker tarlson (Feb 25, 2020)

ARF said:


> The RX 5700 XT has problems at 2160p, it has bottlenecks somewhere in its configuration.
> Better show 1080p where it is equal to RTX 2070 Super.


1080p (CPU bound), comparing reference vs the fastest AIB, nice try.
At 4K (an actual GPU benchmark), AIB vs AIB, the 2070S is 10-15% faster; OC vs OC it's almost 20%.


----------



## Calmmo (Feb 25, 2020)

That sounds promising; it should be faster than any current GPU on PC (until NVIDIA's new cards come out, and maybe an even bigger RDNA2 chip for PC) given the lower-level access vs going through generic APIs on PC. Like it should be when game boxes first come out.


----------



## ARF (Feb 25, 2020)

cucker tarlson said:


> 1080p (CPU bound) comparing reference vs fastest aib,nice try.
> 4k (actual GPU benchmark) aib vs aib 2070s is 10-15% faster,OC vs OC its almost 20%.



I look at all the reviews available from different sources:

AORUS Radeon RX 5700 XT 8G review
Gigabyte has released a new Radeon RX 5700 XT in its premium brand AORUS, which we review. It is one of the faster 5700 XT cards we've had our hands on. Fitted with a thick three fan cooling solution... DX11: Codemasters Formula 1
www.guru3d.com

----------



## cucker tarlson (Feb 25, 2020)

ARF said:


> I look at all the reviews available from different sources:
> 
> 
> 
> ...


Mhm, cherry-picked games at cherry-picked resolutions.
You can find the 2060S beating the 5700 XT more often than the 5700 XT matching the 2070S.
Full testing suite, 4K, AIB vs AIB - the only proper way of looking at it.


----------



## notb (Feb 25, 2020)

medi01 said:


> Next gen consoles are going to be insane value.


Actually, they won't.
The general feeling is that these consoles will be too expensive and hard to sell.
Which means a cheaper "on-premise" model is likely to appear soon after the flagship.

Early rumors suggested we would get one "on-premise" console and one suitable just for streaming (like Nvidia Shield).


----------



## ARF (Feb 25, 2020)

notb said:


> Actually, they won't.
> The general feeling is that these consoles will be too expensive and hard to sell.
> Which means a cheaper "on-premise" model is likely to appear soon after the flagship.



There will be some premium charge but of course its capabilities are unrivalled, so it's totally worth it.
It's designed to be immediately shrunk to 5nm and then to 3nm once the processes are available for mass production.
And then its chips will become much smaller and cheaper.


----------



## cucker tarlson (Feb 25, 2020)

ARF said:


> There will be some premium charge but of course its capabilities are unrivalled, so it's totally worth it.
> It's designed to be immediately shrunk to 5nm and then to 3nm once the processes are available for mass production.
> And then its chips will become much smaller and cheaper.


Well, unrivalled for a console.
If this is 2080 stock indeed, then the 3060 may match it as soon as it arrives.


----------



## ARF (Feb 25, 2020)

cucker tarlson said:


> Well unrivalled for a console
> If this is 2080 stock indeed then 3060 may match it as soon as it arrives



But comparisons don't work like that. On the console there is a closer to the metal API and its performance will be higher than on PCs.
On the console, you will be easily able to get 4K@120, while the PC will struggle at 4K@60.

Hope is that you get the point.


----------



## cucker tarlson (Feb 25, 2020)

ARF said:


> But comparisons don't work like that. On the console there is a closer to the metal API and its performance will be higher than on PCs.
> On the console, you will be easily able to get 4K@120, while the PC will struggle at 4K@60.
> 
> Hope is that you get the point.


I forgot to mention that a PC is a PC.
Does Polaris in the PS4 Pro perform better than an RX 580?
No, it's worse quality with worse performance in many cases.


----------



## AnarchoPrimitiv (Feb 25, 2020)

cucker tarlson said:


> I don't know what is the reason for looking at base clock but ok
> 
> 
> anyway
> ...



You're making your "assessment" based on RDNA1....this is RDNA2, and with VRS, AMD won't have to brute force as much, I guarantee this thing is more powerful than a 2070 Super


----------



## cucker tarlson (Feb 25, 2020)

AnarchoPrimitiv said:


> You're making your "assessment" based on RDNA1....this is RDNA2, and with VRS, AMD won't have to brute force as much, I guarantee this thing is more powerful than a 2070 Super


True, but I can't guarantee anything.
RDNA1 to 2 is confirmed RT and VRS, plus DUV to EUV.
There's nothing certain about RDNA2, and hyping up AMD products apparently never failed miserably before.
The IPC improvement may be single digit.
And my assessment was wrong, but not because of that.
After corrections it points to 2080 FE performance, like @medi01 said, not 2070S.


----------



## ARF (Feb 25, 2020)

No, if EUV N7+ is limited to 429 mm^2, then Navi 21 is N7 DUV with 505 mm^2.

RDNA 1.0 is hybrid between RDNA 2.0 and GCN 5.0.
RDNA 2.0 will be the true new micro-architecture.


----------



## Durvelle27 (Feb 25, 2020)

I'm not sure how this thread went from being about the Xbox Series X to 5700 XT vs RTX 2070. So please guys stay on topic. If you want to debate start another thread in the appropriate sub forum.


----------



## notb (Feb 25, 2020)

ARF said:


> There will be some premium charge but of course its capabilities are unrivalled, so it's totally worth it.


But that's not how console market works. 

You can't buy/build a cheaper alternative. And you can't buy an older/used model to play new games on low settings. Basically, the MSRP becomes the cost of entry if you want to try modern titles.
$600 (maybe more) for next gen will be unacceptable for many consumers who are used to the levels we've seen earlier ($300-500).

The argument about price/performance ratio may only be relevant for gamers who are willing to choose between console and PC.

There have been suggestions that games made for Xbox Series X will run on Xbox One ("forward compatible"), but AFAIK there was no official statement if that includes One S or just One X.
Either way, if true, that means Xbox side already has a ~$250-300 budget model.

A lot less is known about Sony's plans.


----------



## ARF (Feb 25, 2020)

notb said:


> But that's not how console market works.
> 
> You can't buy/build a cheaper alternative. And you can't buy an older/used model to play new games on low settings. Basically, the MSRP becomes the cost of entry if you want to try modern titles.
> $600 (maybe more) for next gen will be unacceptable for many consumers who are used to the levels we've seen earlier ($300-500).
> ...



You will learn it the difficult way.
Get ready for prices to rise.

The Apple iPhone 11 Pro 512 GB costs up to $1,349.

A much more powerful, expensive and advanced console should cost a lot, too.

$300 is a joke of an asking price.


----------



## Chomiq (Feb 25, 2020)

ARF said:


> You will learn it the difficult way.
> Get ready for prices rise.
> 
> Apple iPhone 11 Pro 512 GB costs up to $1349.
> ...


Except Sony and Microsoft can dump consoles on the market priced below manufacturing cost, knowing they'll get extra revenue from PS Now / Xbox Live, profits from game sales, etc. They'll play a game of chicken when it comes to price announcements. If the Series X costs $499, Sony will price the PS5 below that; that's why they're staying quiet.


----------



## Prince Valiant (Feb 25, 2020)

ARF said:


> But comparisons don't work like that. On the console there is a closer to the metal API and its performance will be higher than on PCs.
> On the console, you will be easily able to get 4K@120, while the PC will struggle at 4K@60.
> 
> Hope is that you get the point.


Being able to target specific hardware isn't going to magically give developers a 100% performance boost. Why do people still believe that nonsense?


----------



## ARF (Feb 25, 2020)

Chomiq said:


> Except Sony and Microsoft can dump consoles on market with price below cost of manufacturing, knowing they'll get extra revenue from PS Now / Xbox Live, sales profits from the games, etc. They'll play the game of chicken when it comes to price announcements. If series x will cost $499 Sony will dump their PS5 below this, that's why they're staying quiet.



AMD needs the money. It's unfair to AMD, who supply the chips and always stay in the red because of such criminal practices.

I wouldn't buy a next-gen console for $300 because it's dirt cheap.



Prince Valiant said:


> Being able to target specific hardware isn't going to magically give developers a 100% performance boost. Why do people still believe that nonsense?



Why does RDR2 run only at 1080p60 max on RTX 2080 Ti?


----------



## Prince Valiant (Feb 25, 2020)

ARF said:


> AMD needs the money. It's unfair for AMD who supply and always stay in the red because of such criminal practices.
> I wouldn't buy a next-gen console for $300 because it's dirty cheap.
> 
> Why does RDR2 run only at 1080p60 max on RTX 2080 Ti?


Incorrect.




----------



## notb (Feb 25, 2020)

ARF said:


> You will learn it the difficult way.
> Get ready for prices rise.
> 
> Apple iPhone 11 Pro 512 GB costs up to $1349.
> ...


As I said: there is no cheaper alternative to consoles (other than used stuff). That's the whole point.
Your elitist opinions won't change how the world works. 

Apple can ask $1300 for a smartphone because they aren't interested in the low-end.
There are smartphones priced between $50 and $1500 (likely even wider).

MS and Sony are interested in the low-end. If they ask $600 for the cheapest console, a lot of people will move to PCs. So this will not happen. New games will either work on older consoles (which would remain in production) or we'll see new entry-level models.


----------



## cucker tarlson (Feb 25, 2020)

notb said:


> As I said: there is no cheaper alternative to consoles (other than used stuff). That's the whole point.
> Your elitist opinions won't change how the world works.
> 
> Apple can ask $1300 for a smartphone because they aren't interested in the low-end.
> ...


main reason for the people to go with a console - pc components too expensive.

high end console is not a thing.


----------



## R0H1T (Feb 25, 2020)

notb said:


> *Actually, they won't.*
> The general feeling is that these consoles will be too expensive and hard to sell.
> Which means a cheaper "on-premise" model is likely to appear soon after the flagship.
> 
> Early rumors suggested we would get one "on-premise" console and one suitable just for streaming (like Nvidia Shield).


Actually it totally depends on the cost they debut at. Rumors are Sony & probably even MS(?) could sell them at or near cost, or even with negative (overall) margins, like they did previously. So it will be a much better perf/$ proposition than any similarly specced PC today.


----------



## cucker tarlson (Feb 25, 2020)

well,if the asking price is $500-600,the console performs like 2080,and ampere 3060 can match it at $350,then those consoles are dead in the water.

jensen already made references to console gpus (2080maxq matching new consoles),though I don't think the maxq can really do that.desktop 2080fe - we'll see,but most likely.


----------



## Chomiq (Feb 25, 2020)

cucker tarlson said:


> main reason for the people to go with a console - pc components too expensive.
> 
> high end console is not a thing.


Ever heard of PS4 pro or XOX?


----------



## cucker tarlson (Feb 25, 2020)

Chomiq said:


> Ever heard of PS4 pro or XOX?


they're high end  ? 1800p at 30 fps medium  
they're same as rx580/590 and at their price they were already too expensive for many.
now ask 2200pln for a console and see ppl line up when 1600pln buys them a new 3060 and 1400 a used 1080Ti


----------



## Chomiq (Feb 25, 2020)

cucker tarlson said:


> they're high end  ? 1800p at 30 fps medium
> they're same as rx580/590 and at their price they were already too expensive for many.
> now ask 2200pln for a console and see ppl line up when 1600pln buys them a new 3060 and 1400 a used 1080Ti


Relative high end, relative. There was no base/enhanced option in previous generations. Now it's a thing.


----------



## medi01 (Feb 25, 2020)

cucker tarlson said:


> main reason for the people to go with a console - pc components too expensive.


Haha, what?

Consoles = games you cannot get on PC.
Consoles = plays nicely in living room on your large TV
Consoles = comfy sofa gaming

PC has its uses; it doesn't need to be either/or.


----------



## cucker tarlson (Feb 25, 2020)

Chomiq said:


> Relative high end, relative. There was no base/enhanced option in previous generations. Now it's a thing.


relative high end.
aka mid range gpus sold as high end consoles.




medi01 said:


> Haha, what?
> 
> Consoles = games you cannot get on PC.
> Consoles = plays nicely in living room on your large TV
> ...


no,just no.
main thing is cost.


----------



## medi01 (Feb 25, 2020)

cucker tarlson said:


> f the asking price is $500-600,the console performs like 2080,


No, consoles always perform way better than PC counterpart, because games are tailored for them, while PC gets abstract "scaling slider" that does "some things".



cucker tarlson said:


> main thing is cost.


No, in fact, I don't know a single person who'd buy consoles because "PC is too expensive".
In fact, I don't know anyone who owns a console, but doesn't have a PC.

Consoles = a lot of comfort. PC is much more fiddling.


----------



## cucker tarlson (Feb 25, 2020)

medi01 said:


> No, consoles always perform way better than PC counterpart, because games are tailored for them, while PC gets abstract "scaling slider" that does "some things".


no,again.
watch some yt comparisons.
"better cause tailored" is a myth.
1060 can outperform ps4pro a lot of times.

they're running 30 fps at ~1800p ~low-medium quality,I guess you did not know that.


----------



## Chomiq (Feb 25, 2020)

cucker tarlson said:


> relative high end.
> aka mid range gpus sold as high end consoles.


Good enough for console pleb.


----------



## cucker tarlson (Feb 25, 2020)

medi01 said:


> No, consoles always perform way better than PC counterpart, because games are tailored for them, while PC gets abstract "scaling slider" that does "some things".
> 
> 
> No, in fact, I don't know a single person who'd buy consoles because "PC is too expensive".
> In fact, I don't know anyone who owns a console, but doesn't have a PC.


yes,cause "who medi01 knows" was always a point of reference


----------



## medi01 (Feb 25, 2020)

cucker tarlson said:


> yes,cause "who medi01 knows" was always a point of reference


Oh, your opinion based on thin air is of course much more important. Figures.



cucker tarlson said:


> "better cause tailored" is a myth.



Get real, check how Horizon and GoW look on the guts of a (non-pro) PS4, and tell me how you'd expect that to be possible on a 7850.

I won't mention common sense as it doesn't work with strong bias applied, chuckle.


----------



## cucker tarlson (Feb 25, 2020)

medi01 said:


> Oh, your opinion based on thin air is of course much more important. Figures.
> 
> 
> 
> ...


of course consoles are supposed to be wallet friendly,what are you talking about.
have consoles suddenly become workstations everyone has been waiting to drop $500+ on cause your beloved amd says they're 12 tflops + rt ?

It makes me chuckle how in 2018 pc enthusiasts weren't ready for RT cards but now console users are ready to drop that much


----------



## medi01 (Feb 25, 2020)

cucker tarlson said:


> amd says they're 12 tflops + rt


Microsoft.

Jeez, are you hurt?



cucker tarlson said:


> of course consoles are supposed to be wallet friendly,what are you talking about.


There is a difference between the vague "wallet friendliness" of consoles and most console buyers buying one because they are poor.

Consoles are cheaper for a number of reasons:
1) Uh, hold on, the average PC of a typical gamer runs a 1060/580 or worse. Yeah, let's get that straight first
2) Economies of scale, of course, and better contracts on parts
3) The normal business model is to make money on games, not consoles (Nintendo is a notable exception)

It is curious that the XSeX, which is unlikely to cost beyond $550, will have a GPU faster than a $700-800 GPU from NV - but that's not the usual situation.



cucker tarlson said:


> It makes me chuckle how in 2018 pc enthusiasts weren't ready for RT cards but now console users are ready to drop that much


I can't make any sense of this statement.
I assume there is some virtual war of yours, in which you personally have achieved an impressive victory and inflicted a devastating blow to your enemies after Sony or Microsoft or AMD or Intel or NV did something; I just can't figure out exactly which of those wars it was, or exactly whom you are trying to address in a thread about XSeX specs, chuckle.


----------



## cucker tarlson (Feb 25, 2020)

faster than $800 ? what are they 2080Ti's now ?  
lol.
this thread


----------



## medi01 (Feb 25, 2020)

cucker tarlson said:


> faster than $800 ? what are they 2080Ti's now ?


I'm pretty sure there are no 2080Tis sold for 800 Euros in Poland and you are just having a weird meltdown.


----------



## cucker tarlson (Feb 25, 2020)

please someone notify me when medi01 starts making sense.


----------



## efikkan (Feb 25, 2020)

The Quim Reaper said:


> All very impressive but the fact remains that with no exclusive games, as a PC gamer with a 9900K & 2080, I have absolutely no incentive whatsoever to want one or need one.
> 
> ..as powerful as it is, I'm just one GPU upgrade away from stomping all over it and seeing the PC version of their games run even better than the Series X can.
> 
> The PS5 on the other hand with all its exclusive 1st party content is a different matter entirely, even if it is (slightly) less powerful than the X.


PC and console sales are driven by different motives; practically no one chooses a console over a PC because it's "better value" (even if it were). Console sales are driven by specific games, so it doesn't really matter how many Tflops the next Xbox has vs. the PlayStation or vs. a "comparable" PC.



medi01 said:


> No, consoles always *perform* way better than PC counterpart, because games are tailored for them, while PC gets abstract "scaling slider" that does "some things".


This is complete nonsense.
The only thing "tailored" in console games is that the assets (model details, texture details) are tuned to a desired frame rate, and even that usually only applies to top games.


----------



## medi01 (Feb 25, 2020)

efikkan said:


> The only thing "tailored" in console games is that the assets (model details, texture details) are tuned to a desired frame rate.


So model geometry and textures being tailored for consoles prove... tailoring for consoles is "a myth" and "complete nonsense".

*(embedded clip from "Dumb and Dumber")*

Amazing.

Oh, and look at the assessed effects of such "non-tailoring tailoring":


https://twitter.com/i/web/status/436012673243693056
Chuckle.





Asset tailoring would be more than enough, but sometimes they don't stop there. Blizzard has mentioned they optimized for 6 cores when doing a console port, because, well, remember, Jaguar isn't really a monster CPU.


----------



## efikkan (Feb 25, 2020)

medi01 said:


> efikkan said:
> 
> 
> > medi01 said:
> ...


I even highlighted the main point in the quote from you, and yet you somehow missed it and topped it off by adding a clip from "dumb and dumber", the irony…

There is no magical fairy dust inside consoles. If you run the same code on comparable hardware, it will perform comparably; that's a fact. The only way to make the code faster on consoles is if they have unique and faster API features, which is becoming rarer and rarer. The majority of console games use off-the-shelf engines and have nothing close to optimal code at all, and most launch titles are rushed, so console games are not necessarily more polished than games in general.

Console games today are developed on PCs, then tested and if needed debugged on devkits towards the end. Early titles are often developed largely without the help of devkits at all. Devkits running hardware comparable to the final product are only available a few months ahead of release of the console. Let's end this BS about console games performing better right now.


----------



## kings (Feb 25, 2020)

$600 consoles are a niche; most people just want something cheap to play FIFA, COD, Fortnite and others. As strange as it may seem to some, 1st party exclusives are a very small part of sales in the general universe of game sales.

That's why much is speculated about a significantly cheaper "Series S". If Microsoft launches just one console in the $600 zone, it easily loses the generation if Sony surprises again with a $399 price, even with inferior hardware.

Sony has already tried to have a single console at $600 on the market and did extremely badly; only after a few years, with many price cuts along the way, did it start to recover in sales.


----------



## ARF (Feb 25, 2020)

kings said:


> $600 consoles are a niche, most people just want something cheap to play FIFA, COD, Fortnite and others. As strange as it may seem to some, 1st party exclusives are a very small part of sales in the general universe of game sales.
> 
> That's why much is speculated about a significantly cheaper "Series S". If Microsoft launches just one console in the $600 zone, it easily loses the generation, if Sony surprises again with the $399, even with inferior hardware.
> 
> Sony has already tried to have a single console at $600 on the market and has done extremely badly, only after a few years with many price cuts along the way, it started to recover in sales.



There is no way Sony can justify a $399 price tag because this would cover less than 50% of the manufacturing/R&D cost of the components inside:
1. Processor;
2. Graphics;
3. SSD;
4. Main board;
5. Case;
6. Power supply and delivery circuit;
7. Other related costs.


----------



## kings (Feb 25, 2020)

We don't know what hardware the PS5 will have; the only thing we know is that it will be based on Zen 2 and Navi, nothing more.

If Microsoft is going to be able to sell a 12TF console at $600 or slightly less, why would Sony not be able to sell a 9TF console, for example, at $400?

Even if it's $450, it's a significant difference from $600; most people would not pay that premium.
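The price/performance trade-off in the post above can be sketched in a few lines. Note these are the post's speculative figures (12 TF at $600, 9 TF at $400), not confirmed specs or prices:

```python
# Hypothetical perf-per-dollar comparison using the speculative numbers
# from the post above - not confirmed console specs or prices.

def tflops_per_dollar(tflops: float, price: float) -> float:
    """Raw compute per dollar of retail price."""
    return tflops / price

series_x = tflops_per_dollar(12, 600)   # 0.0200 TF per dollar
ps5_guess = tflops_per_dollar(9, 400)   # 0.0225 TF per dollar

# On these made-up numbers, the cheaper console wins on compute per dollar.
print(f"{series_x:.4f} vs {ps5_guess:.4f}")
```

Which illustrates the point: a weaker console at a sufficiently lower price can still be the better raw value.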


----------



## ARF (Feb 25, 2020)

kings said:


> We don't know what hardware the PS5 will have; the only thing we know is that it will be based on Zen 2 and Navi, nothing more.
> 
> If Microsoft is going to be able to sell a 12TF console at $600 or slightly less, why would Sony not be able to sell a 9TF console, for example, at $400?
> 
> Even if it's $450, it's a significant difference from $600; most people would not pay that premium.



But we do know the current market prices and what $399 can buy - if you are lucky, a Ryzen 3 or Athlon with 8 GB RAM, for office use at most.

I expect $799 for Microsoft's top offer and not less than $550 for the corresponding Sony offer.


----------



## medi01 (Feb 25, 2020)

efikkan said:


> I even highlighted the main point in the quote from you





efikkan said:


> There are no magical fairy dust inside consoles.


And who said there was? Oh, I see, nobody. Ah, but you can highlight a single word in this, rather clear, sentence:

"No, consoles always *perform* way better than PC counterpart, *because games are tailored for them*, while PC gets abstract "scaling slider" that does "some things".

But feel free to ignore part marked red, and defeat the strawman. 
The irony.


----------



## Prima.Vera (Feb 26, 2020)

Does this mean the PS5 will have similar specs and features, since both consoles share the same CPU+GPU?


----------



## Valantar (Feb 26, 2020)

ARF said:


> AMD needs the money. It's unfair to AMD, who supplies the chips and always stays in the red because of such criminal practices.
> 
> I wouldn't buy a next-gen console for $300 because it's dirt cheap.


"Criminal"? How? It's not like it's anticompetitive; it is how the console market has always operated, everyone does it and it's a perfectly viable business model. Hardware is sold as a loss leader to allow for profits through software licencing (FIY, this is also done in a lot of other industries, from movie tickets being sold at a loss with profits made on candy and drinks, to TVs sold at a loss with profits made on accessories, cables, insurance, etc. Don't get me started on capsule coffee makers.). For most current console games there's a ~$10 licencing fee per game (which is why console games have used to be $60 instead of $50 on PC). It would be very different if someone did this in a market where the software licencing part didn't exist or they had a dominant market position and the money to force competitors out through doing this, but that's not the case here. And besides, AMD's margins aren't affected here - if MS and Sony sell their consoles at a loss doesn't mean AMD are selling their chips at a loss, it just means that AMD's margins are a part of the BOM cost of the console. AMD isn't likely to have fat margins on a part like this (ordering 10-50 million chips is likely to give you a decent amount of leverage in price negotiations), but they're nowhere near losing money on this. Semi-custom has been one of AMD's most profitable departments in the previous console generation, no reason to expect that to change.


medi01 said:


> No, consoles always perform way better than PC counterpart, because games are tailored for them, while PC gets abstract "scaling slider" that does "some things".


Sure, there are some titles that are optimized in amazing ways and use a lot of clever tricks and smart utilization of the specific console hardware to look far better than anything comparable on PC, but for the majority of cross-platform games the console ports just run at a lower detail level (normally somewhere roughly equal to medium or medium-low on PC). Just watch a few Digital Foundry analyses and you'll see this clearly. Low-level hardware access is mostly a thing of the past, as modern consoles use PC-equivalent APIs (or even just PC APIs; Xbone uses DX12) for ease of development. They do demonstrate very well how much can be done by abandoning the rather silly "Ultra or nothing" mantra of many PC gamers, but that's about it.


ARF said:


> There is no way Sony can justify $399 price tag because this would cover less than 50% of the manufacturing/R&D cost of the components inside:
> 1. Processor;
> 2. Graphics;
> 3. SSD;
> ...


1: Very expensive, yes, both in R&D and production.
2: Same silicon as 1, no additional cost. Same R&D.
3: Flash is expensive, otherwise this is cheap, likely using off-the-shelf parts with semi-custom firmware.
4: Medium cost, but cheap in the long run due to mass production. Highly optimized for cost with few PCB layers and likely a single-sided board. _Much_ cheaper than the cheapest PC mobo, and produced in >100-1000x the quantities for much lower R&D costs per board.
5: Cheap AF. Initial tooling is expensive (likely hundreds of thousands of dollars, possibly millions), and the design work and certifications aren't free by any means (though I guess the latter goes under point 7), but amortized over >10 000 000 consoles all of this amounts to a few dollars per console at most, and production costs are very low thanks to a simple stamped steel frame with injection-molded plastic for aesthetics.
6: Very cheap, likely requires near zero design, just tweaking of specs and layout/form factor from an existing OEM solution. MS/Sony will just go to Delta/MeanWell/whoever and say "we need an internal PSU in roughly this form factor, at this efficiency level, rated for this temperature, at this level of output noise and ripple, with x Amps on the 12V rail and a 5VSB rail", and the OEM likely has a suitable solution already that just needs some layout/form factor tweaks.
7: FCC testing and other certifications do cost quite a lot, but again, amortized over millions of consoles it's next to nothing per unit.

That being said, $399 is unlikely due to the size and performance level of the SoC - as that's _by far_ the most expensive component of the build.


As for people saying that consoles don't sell due to price: don't be silly. A basic off-the-shelf gaming PC is ~$1000, with deals down to ~$700 if you're lucky and know where to look. They can be built cheaper, but that requires knowledge that the average user isn't even close to. So cost is definitely one of the main reasons for consoles being popular.

Simplicity is another - they're mostly plug-and-play, don't require any real skill to configure, and need no assembly. Using them is also dead simple, and the software is relatively easy to learn. A third reason is that you can buy any game for the platform and expect it to work (at least in theory ... these days, yeah ...), unlike on PC where you have to know if your PC is powerful enough and/or start tweaking settings for it to run properly. A fourth reason is the couch experience - they fit smoothly into a contemporary TV-centric living room. PCs can too, but most don't whatsoever, and the UX is poor.

So let's stop trying to point at _one_ reason why consoles are popular - as with anything, the answer is complex and consists of many discrete parts. Break any one of them, though, and it becomes a lot less attractive - raise the price too much, complicate the software, make game compatibility complicated, or mess up the UX; any of that can turn users off a console.


----------



## Whitestar (Feb 26, 2020)

Just wondering: Aside from RDNA2, what features does the Series X bring that doesn't exist in the PC GPU market today?
Or is RDNA2 the big thing here?


----------



## INSTG8R (Feb 26, 2020)

Whitestar said:


> Just wondering: Aside from RDNA2, what features does the Series X bring that doesn't exist in the PC GPU market today?
> Or is RDNA2 the big thing here?


Well, RayTracing and VRS will now be a "mainstream" feature for the "masses".


----------



## Bytales (Feb 26, 2020)

And still no support for mouse and keyboard.


----------



## Valantar (Feb 26, 2020)

Whitestar said:


> Just wondering: Aside from RDNA2, what features does the Series X bring that doesn't exist in the PC GPU market today?
> Or is RDNA2 the big thing here?


Well, yes. Consoles are never feature leaders when compared to PC hardware (at least not in the past couple of decades). It'll mainly be a _major_ performance increase from current gen consoles while adding high-end features like VRS and RTRT. Also, things like variable refresh rate should be better implemented this time around (Xbone X/S have it, but poorly implemented). The CPU performance increase is also a major change to baseline game designs for cross-platform games, going from 7 Jaguar cores (the 8th is reserved for the OS/system) at low clock speeds to (likely) 7 Zen 2 cores at higher clock speeds - the IPC change alone is something like 300%, with the clock speed bump likely adding another 30-40% on top. Which in sum will allow for higher refresh rates, much better graphics, and more CPU-driven functionality in games (like improved NPC/enemy AI, physics, whatever else you can throw a fast CPU thread at), as well as the potential for very realistic lighting, reflections, etc.
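The back-of-the-envelope CPU math in that paragraph can be sketched as follows. These are the post's rough figures, not measured data; "300% IPC change" is read here as roughly 3x per-clock throughput vs. Jaguar:

```python
# Rough combined per-core uplift from the post's figures (illustrative only):
# ~3x IPC over Jaguar, multiplied by a 30-40% clock speed bump.
ipc_gain = 3.0
clock_gain_low, clock_gain_high = 1.30, 1.40

low = ipc_gain * clock_gain_low    # ~3.9x
high = ipc_gain * clock_gain_high  # ~4.2x
print(f"~{low:.1f}x to ~{high:.1f}x per-core throughput vs. Jaguar")
```

The gains multiply rather than add, which is why a modest clock bump on top of a large IPC jump ends up around 4x per core.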



Bytales said:


> And still no support for mouse and keyboard.


Xbox One has (limited) mouse and keyboard support. It's up to game developers to allow it in games though.


----------



## ARF (Feb 26, 2020)

Valantar said:


> "Criminal"? How? It's not like it's anticompetitive; it is how the console market has always operated, everyone does it and it's a perfectly viable business model. Hardware is sold as a loss leader to allow for profits through software licencing (FIY, this is also done in a lot of other industries, from movie tickets being sold at a loss with profits made on candy and drinks, to TVs sold at a loss with profits made on accessories, cables, insurance, etc. Don't get me started on capsule coffee makers.).



It is anti-competitive because it forces AMD to bleed money. You know, this type of practice exists in the automotive industry, where suppliers are blackmailed by the big players into supplying at unrealistically low prices.

It is not a perfectly viable business model because AMD is under threat of going under.

You remember how Intel was fined $1.3B for damaging AMD's sales and profits?!



Why do we need these console specs 8-9 MONTHS before the actual launch, while we don't have any confirmation of real RDNA 2.0 video cards, which are about to debut much, much earlier?!

At least some benchmarks with performance results from the Navi 21?!

NO?!


----------



## Valantar (Feb 26, 2020)

ARF said:


> It is anti-competitive because it forces AMD to bleed money. You know, this type of practice exists in the automotive industry, where suppliers are blackmailed by the big players into supplying at unrealistically low prices.
> 
> It is not a perfectly viable business model because AMD is under threat of going under.
> 
> You remember how Intel was fined $1.3B for damaging AMD's sales and profits?!


This isn't relevant here whatsoever. Yes, obviously console makers will negotiate with AMD on price, but _AMD is under no circumstances selling console SoCs at a loss_. Period. _Consoles_ are sold at a loss - to MS or Sony, but those losses are on _their_ bottom lines, not their parts suppliers. How does this work? Let's make up some numbers for an example $500 retail price console:
- $250 SoC
- $100 SSD
- $30 BD drive
- $20 PSU
- $40 motherboard (including chipset, network controllers, WiFi, etc.)
- $20 case
- $10 assembly
- $?? software development (ongoing cost, so hard to include, but it's definitely there)
- $?? long-term support, RMA handling, etc.
- $?? marketing
- $?? developer support

So, here we have a theoretical console at $470 BOM cost, plus significant support and development costs, yet it's sold at $500 - a figure that also needs to include distributor and retailer profit margins. In other words, the console maker is _not_ getting $500 for this; the number is likely below $450. This is the console maker's loss alone.

That theoretical $250 SoC? The price _obviously_ includes _all_ costs to the SoC maker _as well as their profit margins_. That $250 price likely includes 20-30% profit margins for AMD, even after amortized R&D costs. Selling a console at a loss does not in any way imply that the parts suppliers are selling their parts at a loss - why would they? Wouldn't they then rather refuse the contract? Your thinking here makes no sense at all. The same obviously applies to the third-party OEMs making the flash and controller for the SSD, the PSU, the BD drive, etc. - they are all making a profit. Only the console maker, the one assembling all the parts and selling them as a finished product, is losing anything.

This is a calculated risk on their part - they might go bust by doing this, after all - but they aim to make back any losses (and then some!) through the licensing fees for selling games for their platform, which are paid by game distributors per copy of any licensed game sold. Game license fees are typically $10, so at a $470 BOM and $450 price to distributors, MS can break even at two games sold per console (not counting what is required to finance development, marketing and support costs), with anything more being profit to them. But the important part is that regardless of whether the console maker makes or loses money, all their parts suppliers are getting paid, including profit margins - barring, of course, the console maker defaulting on their debt as part of going bankrupt. Which isn't likely to happen with any of the current ones.
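The break-even arithmetic above fits in a few lines. Every number here is from the made-up example in the post, not real BOM data:

```python
# Toy loss-leader break-even model using the post's made-up figures.
import math

# SoC, SSD, BD drive, PSU, motherboard, case, assembly
bom = 250 + 100 + 30 + 20 + 40 + 20 + 10
net_revenue = 450    # what the console maker gets after the distributor/retailer cut
license_fee = 10     # typical per-copy licensing fee

loss_per_console = bom - net_revenue                      # hardware loss per unit
games_to_break_even = math.ceil(loss_per_console / license_fee)
print(loss_per_console, games_to_break_even)              # 20 2
```

A $20 hardware loss is recouped after two licensed games per console; everything beyond that is margin, before development and support costs.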

The Semi-custom department at AMD, which is responsible for console SoCs, has been one of their most profitable departments over the past decade, and a significant reason for their survival through times when both CPU and GPU departments have had significant losses.

Tl;dr: your point here is complete nonsense and doesn't apply to this situation whatsoever. A console maker selling a console at a loss does not mean that parts suppliers are selling parts at a loss.


ARF said:


> Why do we need these console specs 8-9 MONTHS before the actual launch, while we don't have any confirmation of real RDNA 2.0 video cards, which are about to debut much, much earlier?!
> 
> At least some benchmarks with performance results from the Navi 21?!
> 
> NO?!


These products operate in entirely different markets with very different competitive landscapes. AMD is the only major high-performance player in consoles, as such they don't have to worry about competitors one-upping them by revealing features and broad specifications early. This also lets them give console makers some freedom in talking about specs for upcoming consoles to drum up interest for an upcoming generation - also a necessity given the long lifespans of consoles (5+ years) compared to PC GPUs (2-3 years) and the general unwillingness for most console gamers to upgrade (compared to the frenzied hardware fetishization in the PC gaming space).

On the other hand, in the PC space AMD is the perennial underdog, competing against a much larger and wealthier competitor that also has a performance and feature advantage currently. This means that tipping their hand too early is just feeding information to the competition, allowing them to adjust their product stack to compete at time of launch, minimizing any advantage AMD might gain from a new launch. Why would they give Nvidia early information so that they can adjust, rather than launch as "unexpectedly" as possible? AMD has _zero_ to gain from leaking specs for upcoming GPUs early - they risk hurting sales of their current products while also giving Nvidia time to either adjust pricing of current products or adjust specs/pricing/marketing of their upcoming products to better compete with what AMD is launching. The current timing would be especially terrible for this considering that Nvidia is expected to present consumer Ampere within a month - why not wait _at least_ until Nvidia makes a public presentation, so that AMD is then the one who can adjust?


----------



## ARF (Feb 26, 2020)

AMD might not be selling at a loss, but they are definitely not selling at a profit either. Their quarterly results, which are quite bad on the profit side, indicate that.
You also didn't add the RAM cost, which, if it's HBM, will be pretty high as well.

And I don't believe that Nvidia has no information about AMD's future products. If anyone knows best, it is exactly Nvidia.
They don't do anything because it's in their interest to sell the RTX lines at overinflated prices.

And how does Navi 21 interfere with the much lower-specced, lower-performance Navi 10?
Those are cards for completely different market segments.

Navi 10 is bad, while Navi 21 is expected to fix its problems.



Valantar said:


> The current timing would be especially terrible for this considering that Nvidia is expected to present consumer Ampere within a month - why not wait _at least_ until Nvidia makes a public presentation so that AMD is then the one who can adjust?



There is no Ampere launch within a month. The chips were cancelled and new ones haven't been taped out yet.
Ampere is a 2021 thing, if it's ever released.


----------



## INSTG8R (Feb 26, 2020)

ARF said:


> Navi 10 is bad, while Navi 21 is expected to fix its problems.


Sorry what’s “bad” about Navi 10?


----------



## ARF (Feb 26, 2020)

INSTG8R said:


> Sorry what’s “bad” about Navi 10?



Performance per watt (7nm vs Nvidia's 16/12nm, and still behind), price not corresponding to the performance level, heat, driver problems, lack of any new features like ray-tracing, etc.
Should I list even more?


----------



## INSTG8R (Feb 26, 2020)

ARF said:


> Performance per watt (7nm vs 16/12nm Nvidia and still behind), price not corresponding to the performance level, heat, driver problems, lack of any new features like ray-tracing, etc.
> Should I list even more?


Please do. I'd like to know what's wrong with my recent purchase that's an upgrade from Vega in performance and power consumption.


----------



## kapone32 (Feb 26, 2020)

Bytales said:


> And still no Support for mouse nad Keyboard.



Why would you want to use mouse and keyboard on a console?


----------



## ARF (Feb 26, 2020)

INSTG8R said:


> Please do. I'd like to know what's wrong with my recent purchase that's an upgrade from Vega in performance and power consumption.




Cons from TechPowerUp's *AMD Radeon RX 5700 XT Review* (www.techpowerup.com):
Noisy in gaming
Overclocking is complicated
Driver bugs
High temperatures
No support for raytracing acceleration
No idle fan stop
High multi-monitor power draw
CrossFire no longer supported





Cons from TechPowerUp's *XFX Radeon RX 5700 XT THICC III Ultra Review* (www.techpowerup.com):
Large increase in power consumption, power efficiency lost
Memory overclocking limited by adjustment range
Memory not overclocked
No hardware-accelerated raytracing
				





High price
Large increase in power consumption, power efficiency lost
No hardware-accelerated raytracing









*Sapphire Radeon RX 5700 XT Nitro+ Special Edition Review* (www.techpowerup.com)
Sapphire's RX 5700 XT Nitro+ Special Edition is the fastest air-cooled Radeon RX 5700 XT on the market. It has what no other card has: overclocked memory. Noise levels and thermals of the adjustable RGB fans are impressive. Does it beat the PowerColor Red Devil?





Price seems a bit high
Large increase in power consumption, power efficiency lost
Memory overclocking limited by adjustment range
Memory not overclocked
No hardware-accelerated raytracing









*ASRock Radeon RX 5700 XT Taichi OC+ Review* (www.techpowerup.com)
The ASRock RX 5700 XT Taichi OC+ is the fastest factory overclocked Navi card we've tested so far. We measured it to run almost 2 GHz real clock on average in our real-life, mixed gaming load. Temperatures are good, too, and the cooler includes idle-fan-stop.




*Asus blames AMD for overheating ROG Strix RX 5700 Series cards*








In an unusual move, Asus shifted blame for its overheating ROG STRIX 5700 series graphics cards to AMD, claiming the GPU maker's mounting pressure guidelines delivered inadequate mounting force. (www.notebookcheck.net)




*Gamers are ditching Radeon graphics cards over driver issues*








As I planned and was beginning to run our next big graphics card benchmark test, I felt I had to shift gears to discuss AMD's driver woes... (www.techspot.com)




*AMD is Investigating Black Screen Driver Issues on Radeon Cards*





AMD is aware of problems with black screens and Radeon GPU driver crashes and is currently investigating the problem. ... (www.extremetech.com)


----------



## kapone32 (Feb 26, 2020)

ARF said:


> Performance per watt (7nm vs 16/12nm Nvidia and still behind), price not corresponding to the performance level, heat, driver problems, lack of any new features like ray-tracing, etc.
> Should I list even more?



Lack of new features like what? PCIe 4.0, 7nm, faster than the Vega 64 with less power draw, and cheaper than the 2070. Oh, but no ray tracing; then again, I was talking about Navi. When you talk about price to performance, I could buy three 5700 XTs for the price of one 2080 Ti, or two 5700 XTs for the price of one 2080 Super, and those are the only cards faster than the 5700 XT. Nvidia is much bigger and richer than AMD and doesn't have to split its resources between CPUs and GPUs, so Navi to me is great, and ray tracing may be nice but isn't mainstream yet.


----------



## Valantar (Feb 26, 2020)

ARF said:


> AMD might not be selling at a loss but they are definitely not selling with a profit either. Their quarter results which are quite bad in the profit section indicate that.


...those results are a) most likely from before mass production of SoCs for the XSX and PS5 started, and b) from the tail end of a console generation, when sales typically drop significantly. How many people do you think are buying the PS4 or XOX/XOS today? Not many. Thus MS and Sony aren't ordering many chips from AMD _for the current generation_, and AMD doesn't have large profits in that sector _currently_. This will obviously change dramatically within the next year as the next generation ramps up.


ARF said:


> You also didn't add the RAM cost which if HBM will be pretty high as well.


Well, sorry if my completely made-up numbers, meant to illustrate a broad point about cost distribution and third-party component supply, skipped a component. I don't see how it makes any difference to what I was arguing - I could have just said "let's assume a $500 retail console has a $250 SoC and a total BoM of $470" and left it at that.
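As a quick sanity check on those numbers (all figures are the made-up illustrative ones above, not a real BoM):

```python
# Back-of-the-envelope console cost sketch using the made-up numbers
# from the paragraph above ($500 retail, $250 SoC, $470 total BoM).
# Purely illustrative assumptions, not real figures.

def hardware_margin(retail_price: float, bom_cost: float) -> float:
    """Margin (or loss) on the hardware alone, before retail cut, shipping, etc."""
    return retail_price - bom_cost

retail, soc, bom = 500.0, 250.0, 470.0

print(f"SoC share of BoM: {soc / bom:.0%}")                      # 53%
print(f"Hardware margin:  ${hardware_margin(retail, bom):.0f}")  # $30
```

Even with a positive hardware margin on paper, the point stands: the SoC dominates the bill of materials, so its price is what Sony and Microsoft squeeze hardest.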

That being said, there is _no chance in hell_ that next-gen consoles are getting HBM. Zero. Zilch. Nada. Never going to happen. HBM is way too expensive for consoles, and two of its main advantages (space savings and power savings) don't matter much in that scenario. A 250W console can afford to spend 20W more than "necessary" on memory, and has a large motherboard capable of fitting heaps of GDDR6 channels.


ARF said:


> And I don't believe that Nvidia has no information about AMD's future products. If anyone knows best, it is exactly Nvidia.
> They don't do anything because it's in their interest to sell the RTX lines at overinflated prices.


Who cares? Believing that Nvidia likely has well-placed sources (even if this, taken far enough, crosses into industrial espionage, which is rather illegal) is not in any way an argument that AMD (or whoever is competing with whoever else, really) should go around divulging information about products that aren't ready for market. Period. Seriously, what do they stand to gain from this? Nothing at all.


ARF said:


> And how does Navi 21 interfere with the much lower specced and lower performance Navi 10!
> Those are cards for completely different market segments.


Not necessarily. Do you think AMD will only be launching a single RDNA2 GPU this year? I think there will be a fuller stack this year than last year. And a lot of people considering buying an RX 5700 XT might hold off and save up another $100-200 to get the next tier up if they know it's coming, which is a loss to AMD _today_ with only the potential (not a certainty in any way) of future profits to show for it. Your logic here is very poor.


ARF said:


> Navi 10 is bad, while Navi 21 is expected to fix its problems.


...no. Navi 10 is a significant improvement on GCN in any guise. It's far more efficient (even beyond the node change - a 210W 5700 XT matches a 295W Radeon VII on the same 7nm node, after all, and with 20 fewer CUs to boot), and it has driven GPU prices down (I'd like more, but we're moving in the right direction at least). They haven't caught up to Nvidia, true, but expecting that to happen in a single generation (particularly when Nvidia has many times AMD's R&D budget) is entirely unrealistic. RDNA (1) is a very good architecture, and hopefully RDNA2 will improve on it again.
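Taking the post's wattages at face value and treating the two cards' performance as equal (an assumption for illustration), the efficiency gap is simple division:

```python
# Rough perf-per-watt comparison from the figures quoted above:
# a 5700 XT at 210 W roughly matching a Radeon VII at 295 W on the
# same 7nm node. Equal performance is assumed for the sake of the sketch.

def perf_per_watt(relative_perf: float, watts: float) -> float:
    return relative_perf / watts

navi = perf_per_watt(1.0, 210)    # 5700 XT, performance normalized to 1.0
vega20 = perf_per_watt(1.0, 295)  # Radeon VII at assumed-equal performance

print(f"Efficiency gain: {navi / vega20:.2f}x")  # 1.40x at equal performance
```

A ~40% perf-per-watt gain at equal performance on the same node is why the "no improvement beyond 7nm" complaint doesn't hold up.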


ARF said:


> There is no Ampere launch within a month. The chips were cancelled and new ones haven't been taped out yet,
> Ampere is a 2021 thing if released ever.


Source?


----------



## kapone32 (Feb 26, 2020)

ARF said:


> Noisy in gaming
> Overclocking is complicated
> Driver bugs
> High temperatures
> ...



Where do you get the large increase in power consumption? Versus what?


----------



## ARF (Feb 26, 2020)

Valantar said:


> Source?



*Igors Lab: No, NVIDIA’s Ampere GPUs Are Not Launching In March*








We have been covering leaks and rumors about Ampere GPUs pretty much non-stop for the past few weeks so it's only fair we cover a dissenting voice as well. The venerable Igor of Igor's Lab (a very reliable source I might add) has stated that he does not expect NVIDIA to launch/announce Ampere... (wccftech.com)







*NVidia Ampere GPUs from RTX 3050 to RTX 3090 Ti* (Personal View Talks)


----------



## INSTG8R (Feb 26, 2020)

kapone32 said:


> Lack of new features like what? PCIe 4.0, 7nm, faster than the Vega 64 with less power draw, and cheaper than the 2070. Oh, but no ray tracing; then again, I was talking about Navi. When you talk about price to performance, I could buy three 5700 XTs for the price of one 2080 Ti, or two 5700 XTs for the price of one 2080 Super, and those are the only cards faster than the 5700 XT. Nvidia is much bigger and richer than AMD and doesn't have to split its resources between CPUs and GPUs, so Navi to me is great, and ray tracing may be nice but isn't mainstream yet.


Thanks kapone, I think you summed it up nicely. His negatives are ones I either A: don’t care about (RT) or B: haven’t experienced or don’t see as negative. I’m impressed though that he did make a pretty thorough list. My Nitro+ has been nothing but a great upgrade over my V64 Nitro+ in every metric: performance, power consumption, temperatures, all improvements over Vega. I just can’t comment on the driver issues.


----------



## kapone32 (Feb 26, 2020)

INSTG8R said:


> Thanks kapone, I think you summed it up nicely. His negatives are ones I either A: don’t care about (RT) or B: haven’t experienced or don’t see as negative. I’m impressed though that he did make a pretty thorough list. My Nitro+ has been nothing but a great upgrade over my V64 Nitro+ in every metric: performance, power consumption, temperatures, all improvements over Vega. I just can’t comment on the driver issues.



I have wanted to pull the trigger on Navi but I can't seem to pull myself away from my Watercooled Vega 64s. I am patiently waiting for big Navi.


----------



## Valantar (Feb 26, 2020)

ARF said:


> Noisy in gaming
> Overclocking is complicated
> Driver bugs
> High temperatures
> ...


So, let me see...

You're complaining that cards that AMD's partners (not AMD themselves) have overclocked are less efficient than stock. I shouldn't have to tell you, this is how overclocking works, and it's _exactly_ the same for Nvidia cards. If this bothers you, don't buy an overclocked card.
You're complaining that a bog-standard blower design is hot and loud. Okay, sure, the reference design is less than ideal (even if the reasoning behind using a blower is acceptable it's less and less relevant to real-world uses). Get a better ventilated open air card with better thermals then (and not an OC version if that bothers you).
"High temperatures" is not something that can be leveled as a general complaint of a GPU. Get a card with a good cooling solution then. Same goes for noise. Turing blower designs are also hot and loud.
Driver bugs - largely overstated, and _a lot_ of what was bad at launch was fixed within a few weeks of launch. Still some stuff there, true, and some persistent issues that should have been dealt with long ago, but it's nowhere near as bad as the media would like it to be. Media want sensations to drive traffic, and love to overstate problems as this is a major source of said traffic.
You're complaining that a blower card doesn't have idle fan stop. No blower cards do, as they lack the access to open air to allow for passive heat dissipation.
No support for ray tracing ... well, there's something to that, though one could respond by saying the competition at a similar (or even higher) price isn't delivering useable RT either (at least not without DLSS upscaling). It's a missing feature, but that doesn't detract from the card's overall performance or efficiency, and the games that might have made use of said feature number in the low, low double digits. Anyhow, catching up on a new and revolutionary feature in one generation is rather decent, no?
Nvidia has a similar high multi monitor power draw bug.
CrossFire no longer supported - so what? _Games_ don't support CF or SLI any more, as there are no people using setups like that, and the technology is inherently problematic. Nvidia only supports SLI on its highest tier SKUs.
There's nothing _wrong_ with Navi 10. Yes, overclocking and general tuning in Radeon Software needs serious improvement, including fan controls. And no, AMD didn't catch up with Nvidia entirely, but it's a damn sight better than what they had before, and they're making major architectural improvements. You don't seem to have a very good grasp on reality if you're expecting more than that.


ARF said:


> *Igors Lab: No, NVIDIA’s Ampere GPUs Are Not Launching In March*
> 
> 
> 
> ...


Check your sources. The Igor's lab article WCCFTech "bases" their wild speculations on says _nothing_ of Ampere being a 2021 product, but rather speculates whether Gamescom (late August) would be a more logical expectation. It even explicitly states that "even if AMD takes the crown with Big Navi [at Computex], it might only last for two months".



INSTG8R said:


> Thanks kapone, I think you summed it up nicely. His negatives are ones I either A: don’t care about (RT) or B: haven’t experienced or don’t see as negative. I’m impressed though that he did make a pretty thorough list. My Nitro+ has been nothing but a great upgrade over my V64 Nitro+ in every metric: performance, power consumption, temperatures, all improvements over Vega. I just can’t comment on the driver issues.





kapone32 said:


> I have wanted to pull the trigger on Navi but I can't seem to pull myself away from my Watercooled Vega 64s. I am patiently waiting for big Navi.


Same here - I considered replacing my Fury X with a 5700XT, but I've held off as I want my next GPU to last as long as this one (nearly five years now!), so I want a higher performance tier. For that kind of longevity, I also want RT - I don't see it as a necessity today or even next year, but when the new consoles arrive with RT, it's going to start being a necessary feature, and waiting until 2025 for my PC to keep up feature-wise with my consoles isn't an option. So I've held off, but Big Navi is 99% sure going into my PC this year. Hopefully in a relatively short form factor card (hoping for HBM!) and obviously with a water block. Can't go back once I went custom loop - never seeing a 275W GPU exceed 55C is rather amazing.


----------



## kapone32 (Feb 26, 2020)

Valantar said:


> So, let me see...
> 
> You're complaining that cards that AMD's partners (not AMD themselves) have overclocked are less efficient than stock. I shouldn't have to tell you, this is how overclocking works, and it's _exactly_ the same for Nvidia cards. If this bothers you, don't buy an overclocked card.
> You're complaining that a bog-standard blower design is hot and loud. Okay, sure, the reference design is less than ideal (even if the reasoning behind using a blower is acceptable it's less and less relevant to real-world uses). Get a better ventilated open air card with better thermals then (and not an OC version if that bothers you).
> ...



Everything you just said has me excited!  Once you Watercool a GPU there is no turning back. I love seeing my Vegas idle at 2 or 3 degrees above room temps and going into the 60s while gaming but back to 25 C 10 minutes after a session.


----------



## Valantar (Feb 26, 2020)

kapone32 said:


> Everything you just said has me excited!  Once you Watercool a GPU there is no turning back. I love seeing my Vegas idle at 2 or 3 degrees above room temps and going into the 60s while gaming but back to 25 C 10 minutes after a session.


Yeah, it's rather amazing. My loop lives in a rather airflow-starved NZXT H200i, and is arguably undersized for the thermal load (95W 1600X + 275W Fury X across 120mm + 240mm ~30mm thick rads), but it works extremely well. While gaming the CPU runs barely above 60C and the GPU normally around 50-52, with idle (with _very_ low fan speeds on the front 240mm and the rear 120mm fan stopped) temps around 35-40 CPU and 30 GPU. If given just a tad more airflow I know this would drop even more. Still, I'm more than happy with it (it's so quiet too!), and I'm very much looking forward to getting a new GPU in there. I'm probably more eager for Big Navi to arrive than @ARF seems to be (and for it to kick butt), but thankfully I know how to temper my expectations with a dose of realism.


----------



## kapone32 (Feb 26, 2020)

Valantar said:


> Yeah, it's rather amazing. My loop lives in a rather airflow-starved NZXT H200i, and is arguably undersized for the thermal load (95W 1600X + 275W Fury X across 120mm + 240mm ~30mm thick rads), but it works extremely well. While gaming the CPU runs barely above 60C and the GPU normally around 50-52, with idle (with _very_ low fan speeds on the front 240mm and the rear 120mm fan stopped) temps around 35-40 CPU and 30 GPU. If given just a tad more airflow I know this would drop even more. Still, I'm more than happy with it (it's so quiet too!), and I'm very much looking forward to getting a new GPU in there. I'm probably more eager for Big Navi to arrive than @ARF seems to be (and for it to kick butt), but thankfully I know how to temper my expectations with a dose of realism.



That sounds very good. I was never a fan of custom water cooling, but I was always into AIOs. When I got my first Eisbaer, it started my journey into watercooling everything.


----------



## INSTG8R (Feb 26, 2020)

kapone32 said:


> I have wanted to pull the trigger on Navi but I can't seem to pull myself away from my Watercooled Vega 64s. I am patiently waiting for big Navi.


I decided that for 1440p this is more than enough card, and the uplift over Vega is worth it. We saw the “rumour mill card”; I don’t need that, but perhaps I’ll change my mind when it gets here.


----------



## Valantar (Feb 26, 2020)

INSTG8R said:


> I decided that for 1440p this is more than enough card, and the uplift over Vega is worth it. We saw the “rumour mill card”; I don’t need that, but perhaps I’ll change my mind when it gets here.


Yeah, the RX 5700 XT is an excellent 1440p card for now and the coming years. And given that 4k gaming is still rather silly, the only real reason for a faster card is 1440p120 or longevity, both of which play into my choice of holding off. But the 5700 XT is still _very_ compelling.


----------



## INSTG8R (Feb 26, 2020)

Valantar said:


> Yeah, the RX 5700 XT is an excellent 1440p card for now and the coming years. And given that 4k gaming is still rather silly, the only real reason for a faster card is 1440p120 or longevity, both of which play into my choice of holding off. But the 5700 XT is still _very_ compelling.


Gets me where I wanna be. I have a FreeSync 2 HDR 144 Hz monitor to pair it with, so I'm very pleased with the combo.


----------



## ARF (Feb 26, 2020)

Valantar said:


> Yeah, the RX 5700 XT is an excellent 1440p card for now and the coming years. And given that 4k gaming is still rather silly, the only real reason for a faster card is 1440p120 or longevity, both of which play into my choice of holding off. But the 5700 XT is still _very_ compelling.



You think 2160p gaming is silly. Why?


----------



## INSTG8R (Feb 26, 2020)

ARF said:


> You think 2160p gaming is silly. Why?


For me, I don’t see that the GFX horsepower needed for a decent 4K experience is worth it. 1440 is the “sweet spot”; I’ll enjoy 4K on my TV.


----------



## ARF (Feb 26, 2020)

INSTG8R said:


> For me, I don’t see that the GFX horsepower needed for a decent 4K experience is worth it. 1440 is the “sweet spot”; I’ll enjoy 4K on my TV.



AMD offers Radeon Boost - movement based dynamic resolution; Radeon Image Sharpening - enhances visual detail.

And now with the upcoming Variable Rate Shading, I think 4K experiences will be the main headline.

After all, with Nvidia SLI, users do game at 8K in Battlefield V.


----------



## INSTG8R (Feb 26, 2020)

ARF said:


> AMD offers Radeon Boost - movement based dynamic resolution; Radeon Image Sharpening - enhances visual detail.
> 
> And now with the upcoming Variable Rate Shading, I think 4K experiences will be the main headline.
> 
> After all, with Nvidia SLI, users do game at 8K in Battlefield V.


That’s fine, I just don’t see the “wow factor” that goes along with the price/performance ratio. My monitor can do 4K over HDMI, but again it doesn’t interest me in the slightest. I don’t see the big deal for such diminishing returns, just like SLI.


----------



## milewski1015 (Feb 26, 2020)

kapone32 said:


> Everything you just said has me excited!  Once you Watercool a GPU there is no turning back. I love seeing my Vegas idle at 2 or 3 degrees above room temps and going into the 60s while gaming but back to 25 C 10 minutes after a session.



This has me excited. I love my Nitro+ 5700 XT, but having recently purchased open-back headphones for gaming (Philips SHP9500 + V-Moda Boom Pro), I would still like it to be quieter under load. Thinking about saving up for a waterblock and making the plunge into watercooling. Have so much research to do...


----------



## Valantar (Feb 26, 2020)

ARF said:


> You think 2160p gaming is silly. Why?


4k is 8294400 pixels per frame. 1440p is 3686400 pixels per frame. In other words, for the shader resources needed to run 4k60 you can run 1440p135 instead (and that's ignoring other performance deficits like VRAM needs increasing etc.). Of course real world scaling isn't that linear (due to both GPU and CPU limitations), but 1440p is still _dramatically_ easier to run than 4k. I don't play strategy games or anything else where extreme levels of detail actually make a difference, so I would _far_ prefer the increased smoothness to the increased resolution. Also, with anything where the majority of the screen is showing movement, the perceptible difference between 1440p and 4k on a ~27" display at desktop viewing distances (or a TV sized display at TV viewing distances) is near zero. This of course depends on the display and other factors, but the difference in perceived sharpness is tiny compared to the difference in perceived smoothness of increasing the framerate even by 1.5x. Now, I don't subscribe to the "everything must be ULTRA!!!!!1!1!1!" mode of thinking for graphics settings (a lot of games look _great_ even at medium, and high is often imperceptible from ultra), but even 4k60 high is difficult to run on anything but a top-end GPU still for a lot of games. Now imagine two years into the future.
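The pixel arithmetic above is easy to verify; a minimal sketch (the "135 fps" figure assumes perfectly linear scaling with pixel count, which, as noted, real games don't achieve):

```python
# Pixel-count arithmetic behind the 4k-vs-1440p comparison above.
def pixels(width: int, height: int) -> int:
    return width * height

uhd = pixels(3840, 2160)    # 4k:    8,294,400
qhd = pixels(2560, 1440)    # 1440p: 3,686,400
fhd = pixels(1920, 1080)    # 1080p: 2,073,600

print(f"4k / 1440p pixel ratio: {uhd / qhd}")               # 2.25
print(f"Ideal fps at 4k60's shader load: {60 * uhd / qhd:.0f}")  # 135
print(f"1440p / 1080p pixel ratio: {qhd / fhd:.2f}")        # 1.78
```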



ARF said:


> AMD offers Radeon Boost - movement based dynamic resolution; Radeon Image Sharpening - enhances visual detail.
> 
> And now with the upcoming Variable Rate Shading, I think 4K experiences will be the main headline.
> 
> After all, with Nvidia SLI, users do game at 8K Battlefield V:


Dynamic resolution output at 4k is not gaming at 4k, even if you're sharpening the output. Upscaling literally means the game is being rendered at a lower resolution. Now, 4k panels do have the advantage of (usually, though dependent on screen size and viewing distance) looking good even at resolutions below their native one, which is something most LCDs suck at, but even then, upscaled, sharpened 1440p output at 4k on a 4k display is not going to look any better than 1440p on a native 1440p display of the same quality. Having the choice can be nice, but that would also mean getting a monitor capable of high frame rates when you want that instead of high resolutions, which immediately sends monitor prices into the stratosphere. Which means that buying a high refresh rate 1440p monitor and a GPU to match is a _much_ smarter choice than going for 4k.



milewski1015 said:


> This has me excited. I love my Nitro+ 5700 XT, but having recently purchased open-back headphones for gaming (Philips SHP9500 + V-Moda Boom Pro), I would still like it to be quieter under load. Thinking about saving up for a waterblock and making the plunge into watercooling. Have so much research to do...


I use my system with my open-backed Sennheiser HD 599s, and it is awesome. I can sometimes notice a vague background fan hiss, but that is it. Even the best air cooled GPU would be very audible in the same conditions.


----------



## milewski1015 (Feb 26, 2020)

Valantar said:


> I use my system with my open-backed Sennheiser HD 599s, and it is awesome. I can sometimes notice a vague background fan hiss, but that is it. Even the best air cooled GPU would be very audible in the same conditions.



That's what I'm hoping for! Again, don't get me wrong, my triple fan 5700 XT is quieter than my old Nitro+ 580, and it's quieter than the GPUs my friends use, but I'd still like it even quieter. You can see in my system specs, but I've got a Meshify C with 4 bequiet SW3s for case fans and a Dark Rock 4 cooling my 2600. I love that when just web browsing/watching Youtube, etc. there's only a slight hum coming from my rig. Honestly, the buzz of my UPS is louder than my system at low/no load (which unfortunately I don't think I can solve as that's electrical noise and can't be fixed with a fan replacement or something similar, as far as I know anyway). Now I just want to tackle the GPU noise under load annoyance


----------



## ARF (Feb 26, 2020)

milewski1015 said:


> That's what I'm hoping for! Again, don't get me wrong, my triple fan 5700 XT is quieter than my old Nitro+ 580, and it's quieter than the GPUs my friends use, but I'd still like it even quieter. You can see in my system specs, but I've got a Meshify C with 4 bequiet SW3s for case fans and a Dark Rock 4 cooling my 2600. I love that when just web browsing/watching Youtube, etc. there's only a slight hum coming from my rig. Honestly, the buzz of my UPS is louder than my system at low/no load (which unfortunately I don't think I can solve as that's electrical noise and can't be fixed with a fan replacement or something similar, as far as I know anyway). Now I just want to tackle the GPU noise under load annoyance



Undervolt and underclock the card, and find an appropriate noise-dampened case; they do exist.

Adjust the fan curve as well. Once you decrease the clocks and voltages, allow the fans to ramp up later on the curve.
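A fan curve of that shape (keep fans slow until a higher temperature threshold, then ramp) can be sketched as a piecewise-linear lookup; the breakpoints below are purely illustrative, not recommendations for any specific card:

```python
# Hypothetical piecewise-linear GPU fan curve: (temp_C, duty_%) breakpoints.
# "Ramping up later" just means pushing the steep segment to higher temps.

def fan_speed(temp_c: float,
              curve=((40, 20), (65, 30), (75, 55), (85, 100))) -> float:
    """Return fan duty cycle (%) for a temperature via linear interpolation."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]  # clamp above the last breakpoint

print(fan_speed(50))  # 24.0 - stays quiet at mid temps
print(fan_speed(80))  # 77.5 - ramps hard once the card is actually hot
```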



Valantar said:


> 4k is 8294400 pixels per frame. 1440p is 3686400 pixels per frame. In other words, for the shader resources needed to run 4k60 you can run 1440p135 instead (and that's ignoring other performance deficits like VRAM needs increasing etc.). Of course real world scaling isn't that linear (due to both GPU and CPU limitations), but 1440p is still _dramatically_ easier to run than 4k. I don't play strategy games or anything else where extreme levels of detail actually make a difference, so I would _far_ prefer the increased smoothness to the increased resolution. Also, with anything where the majority of the screen is showing movement, the perceptible difference between 1440p and 4k on a ~27" display at desktop viewing distances (or a TV sized display at TV viewing distances) is near zero. This of course depends on the display and other factors, but the difference in perceived sharpness is tiny compared to the difference in perceived smoothness of increasing the framerate even by 1.5x. Now, I don't subscribe to the "everything must be ULTRA!!!!!1!1!1!" mode of thinking for graphics settings (a lot of games look _great_ even at medium, and high is often imperceptible from ultra), but even 4k60 high is difficult to run on anything but a top-end GPU still for a lot of games. Now imagine two years into the future.
> 
> 
> Dynamic resolution output at 4k is not gaming at 4k, even if you're sharpening the output. Upscaling literally means the game is being rendered at a lower resolution. Now, 4k panels do have the advantage of (usually, though dependent on screen size and viewing distance) looking good even at resolutions below their native one, which is something most LCDs suck at, but even then, upscaled, sharpened 1440p output at 4k on a 4k display is not going to look any better than 1440p on a native 1440p display of the same quality. Having the choice can be nice, but that would also mean getting a monitor capable of high frame rates when you want that instead of high resolutions, which immediately sends monitor prices into the stratosphere. Which means that buying a high refresh rate 1440p monitor and a GPU to match is a _much_ smarter choice than going for 4k.
> ...



1440p is just Full HD+. It's not even double the number of pixels. I am not interested, at all.

I have tried F1 2018 at 2160p and to be honest, it's a new game compared to 1080p.


----------



## Valantar (Feb 26, 2020)

ARF said:


> Undervolt and underclock the card, and find an appropriate noise-dampened case; they do exist.


The former is a good idea, the latter is giving away performance that shouldn't be necessary (though water cooling is a very expensive way of avoiding this). As for noise dampened cases, they restrict airflow to the extent that their thermals generally force you to run faster fans, largely negating their noise dampening (unless, again, you accept giving away performance as the GPU boosts lower due to thermals). For a PC build without any uncontrollable sources of noise (HDDs, coil whine, etc.) a case with good airflow and well laid out low-rpm fans is generally as quiet as - if not quieter than - any dampened case with the same hardware at the same temperatures and performance level. Of course, having a GPU with poor cooling fits badly into this, but that isn't @milewski1015's problem here, just that a quiet air cooler is still not quite there for them. Which is understandable, as three thin ~90mm fans are still likely to make a bit of noise.

A budget option is obviously to remove the fans and shroud from the GPU and strap some big, quiet fans like Noctuas or BeQuiets to the heatsink. This can work wonders if you're not concerned about aesthetics, and if you get the right adapter cable you can usually get the GPU to control their speed too.



ARF said:


> 1440p is just Full HD+. It's not even double the number of pixels. I am not interested, at all.


Well, I'm sorry that a 78% increase in pixels isn't enough for you. Have you actually tried gaming on a good 1440p monitor? It's a massive difference from 1080p. And as I said, the perceptible difference between 1440p and 4k in most games is near zero at most average size and viewing distance combinations.


ARF said:


> I have tried F1 2018 at 2160p and to be honest, it's a new game compared to 1080p.


I don't doubt that - 1080p is rather low. Racing games might be one of the edge cases where higher resolution makes a bigger difference though, as you have the semi-static view of the car + the relatively static sky box visible at all times and filling a large portion of the frame. Still, I would think the difference between 1440p and 4k is smaller than you think it is, and it would allow you to run these latency/response time-intensive games at much higher frame rates, which is a definite advantage. But to each their own, of course - if you prefer resolution over refresh rate to such a large degree, rather than wanting a balance with the best of both, that's obviously your right. Just don't expect it to apply to everyone.

Bringing this a bit back to the topic, I'm very much looking forward to next-gen consoles bringing both higher resolution and higher refresh rate with them. Given that the Xbox One X supports 1440p I would expect the Series X to do the same, which should be perfect for some higher refresh rate play in suitable titles. I'd love to see what Gears of War or Halo looks like at that kind of performance level.


----------



## efikkan (Feb 26, 2020)

ARF said:


> It is anti-competitive because it forces AMD to bleed money. You know there is such a practice in the automotive industry, where suppliers are blackmailed by the big players into supplying at unrealistically low prices.
> 
> It is not a perfectly viable business model because AMD is under threat of going under.


AMD is participating in the console market voluntarily. Sony and Microsoft have financed some of the development costs for the custom processors, which trickle down to their PC lineup. While AMD don't make huge profits per unit sold, the custom console chips are "safe" money.

Selling consoles at a loss has been the business model for many years, but it's Sony, MS and Nintendo taking this loss (not AMD!), and making it back through overpriced games, subscription fees and accessories. This business model (including the billions invested in these custom machines) will ultimately cause consoles to "merge" with the PC market at some point, but that's another topic.



ARF said:


> You remember how Intel was fined $1.3B for damaging AMD's sales and profits?!


Governments do stupid things.
While there are exceptions, most of these "anti-trust" cases are just government overreach.


----------



## Valantar (Feb 26, 2020)

efikkan said:


> Governments do stupid things.
> While there are exceptions, most of these "anti-trust" cases are just government overreach.


  
Current antitrust laws in both the US and EU are ridiculously weak, and are still barely enforced at all. And when they are, and someone is found guilty, the repercussions for breaking even the most fundamentally obvious laws (such as "paying your customers to not buy from your competitor is contrary to a free and open market") are usually so small as to be entirely meaningless. Calling the ridiculously few applications of them overreach is... well, rather silly. It takes a serious love of monopolies to argue against antitrust.

But this discussion has already veered quite far off topic, let's not wade into politics too, eh?


----------



## efikkan (Feb 26, 2020)

Valantar said:


> Current antitrust laws in both the US and EU are ridiculously weak, and are still barely enforced at all. And when they are, and someone is found guilty, the repercussions for breaking even the most fundamentally obvious laws (such as "paying your customers to not buy from your competitor is contrary to a free and open market") are usually so small as to be entirely meaningless. Calling the ridiculously few applications of them overreach is... well, rather silly. It takes a serious love of monopolies to argue against antitrust.
> 
> But this discussion has already veered quite far off topic, let's not wade into politics too, eh?


Enforcement of anti-trust laws is a good thing in principle. In practice, though, regulators generally do nothing about the real cases and instead pursue a few symbolic, ridiculous ones.
But yeah, let's end it there.


----------



## milewski1015 (Feb 26, 2020)

ARF said:


> Undervolt and underclock the card, and find an appropriate noise dampened case. There are such.
> 
> Adjust the fans curve as well. Once you decrease the clocks and voltages, allow the fans to ramp up later on the curve.



I have undervolted the card - I was able to bring the voltage down without lowering clocks at all, so there's no loss in performance. I've tried adjusting the fan curve as well. Unfortunately, it seems that's where AMD's drivers get me - I've tried increasing the point at which the fans ramp up, and also manually disabling fan-stop mode in an attempt to run the fans at a slow speed at low load to keep temps down. The issue is that the card doesn't seem to respond to my set curves. I'll have the first point set at 20% fan speed at 25°C and the card still operates in fan-stop mode until about 60°C and then ramps up hard.
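For reference, what I'd expect a custom curve to actually do is simple piecewise-linear interpolation between the set points. A quick sketch (the points here are made-up examples, not any driver's real API):

```python
# Piecewise-linear fan curve: maps GPU temperature (°C) to fan duty (%).
# Points are hypothetical, similar to what driver fan-tuning UIs expose.
CURVE = [(25, 20), (50, 35), (70, 60), (85, 100)]  # (temp_c, duty_pct)

def fan_duty(temp_c: float) -> float:
    """Interpolate fan duty between curve points; clamp outside the range."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(25))  # 20 - fans should already be spinning, not stopped
print(fan_duty(60))  # 47.5 - halfway between the 50°C and 70°C points
```

With a first point at 25°C/20%, the fans should never be in fan-stop at idle temps - which is exactly what the card isn't honouring.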

As for the noise-dampened case, as @Valantar mentions, the noise-dampening is negated by higher fan speeds due to higher thermals. I'm trying to avoid having to buy a new case if possible as I like the aesthetics and footprint of the Meshify C. 



Valantar said:


> The former is a good idea, the latter is giving away performance that shouldn't be necessary (though water cooling is a very expensive way of avoiding this). As for noise dampened cases, they restrict airflow to the extent that their thermals generally force you to run faster fans, largely negating their noise dampening (unless, again, you accept giving away performance as the GPU boosts lower due to thermals). For a PC build without any uncontrollable sources of noise (HDDs, coil whine, etc.) a case with good airflow and well laid out low-rpm fans is generally as quiet as - if not quieter than - any dampened case with the same hardware at the same temperatures and performance level. Of course, having a GPU with poor cooling fits badly into this, but that isn't @milewski1015's problem here, just that a quiet air cooler is still not quite there for them. Which is understandable, as three thin ~90mm fans are still likely to make a bit of noise.
> 
> A budget option is obviously to remove the fans and shroud from the GPU and strap some big, quiet fans like Noctuas or BeQuiets to the heatsink. This can work wonders if you're not concerned about aesthetics, and if you get the right adapter cable you can usually get the GPU to control their speed too.



Yeah, I agree. Noise is the main motivation for wanting to watercool the card, but PC hardware has become a hobby and I'd happily throw some more money at it. Aesthetics are a concern, but I have thought about strapping some good fans onto the card. Maybe I'll give that a try first, as I'll have to end up buying fans for a rad anyway if I do go the watercooling route. I was thinking of splitting them to a single motherboard header and setting their fan curve in the BIOS based on GPU temperature.

Of course, I could also just live with the noise - after doing some preliminary research, it looks like watercooling the card would be quite an undertaking. I'd have to figure out where to put the res/pump (I'd like to avoid having to drill holes in the PSU shroud), and working in just the lower confines of the main chamber (so as to avoid the Dark Rock 4 I have cooling the CPU) might get cramped. I considered scrapping the Dark Rock 4 and just adding a CPU/monoblock, but it looks like there aren't any monoblocks for the B450 Gaming Pro Carbon, and all the CPU blocks I've seen for AM4 appear to have clearance issues due to the large heatsinks. So I'd likely have to get a new mobo to add a CPU block, which I would also like to avoid.

The only way I can think of fitting everything without mods is going with a 280 rad in front, removing the cover at the front end of the PSU shroud, putting a pump/res combo down in the basement and then doing a run up to the GPU. Idk, I'm completely new to all this and still have a lot of research to do, so I could be overthinking it/getting discouraged for no reason. I just think chucking a waterblock on the GPU would be a fun (albeit expensive) foray into watercooling.


----------



## Valantar (Feb 27, 2020)

milewski1015 said:


> Yeah, I agree. Noise is the main motivation for wanting to watercool the card, but PC hardware has become a hobby and I'd happily throw some more money at it. Aesthetics are a concern, but I have thought about strapping some good fans onto the card. Maybe I'll give that a try first, as I'll have to end up buying fans for a rad if I do go the watercooling route anyway. Was thinking of splitting them to a single motherboard header and setting their fan curve in the BIOS based on GPU temperature. Of course, I could also just live with the noise - after doing some preliminary research, it looks like watercooling the card would be quite an undertaking. I'd have to figure out where to put the res/pump (would like to avoid having to drill holes in the PSU shroud), and working in just the lower confines of the main chamber (as to avoid the Dark Rock 4 I have cooling the CPU) might get cramped. I considered scrapping the Dark Rock 4 and just adding a CPU/monoblock, but it looks like there aren't any monoblocks for the B450 Gaming Pro Carbon, and all the CPU blocks I've seen for AM4 appear to have clearance issues due to the large heatsinks. So I'd likely have to get a new mobo to add a CPU block, which I would also like to avoid. Only possible way I can think of fitting everything without mods is going with a 280 rad in front, removing the cover at the front end of the PSU shroud and doing a pump/res combo down in the basement and then doing a run up to the GPU. Idk, I'm completely new to all this and still have a lot of research to do so I could be overthinking it/getting discouraged for no reason. Just think chucking a waterblock on the GPU would be a fun (albeit expensive) foray into watercooling.


What waterblocks have you been looking at? Judging by MSI's gallery pics like this there should be plenty of clearance around the socket for pretty much any regular water block there. Something like the EK Supremacy Classic barely extends past the socket and mounting holes at all, and is very clearly within the keep-out zones for AM4, let alone anywhere near interfering with surrounding components on an ATX board. I've attached a mock-up of how that waterblock would fit (I tried my best at aligning its screws with the socket mounting screws, but perspective distortion in the photos means they don't align perfectly and thus look like they take up slightly more space than in reality). Still, there's heaps of space there. Heck, I have mine (well, from before they were called "Classic", but same design) on an ITX board and it fits perfectly. Water blocks generally conform very well to socket keep-out zones unless you look at the silly oversized ones from Bykski and the likes.


But perhaps we should stop this massively OT discussion now?


----------



## kapone32 (Feb 27, 2020)

milewski1015 said:


> I have undervolted the card - I was able to bring the voltage down without lowering clocks at all, so there's no loss in performance. I've tried adjusting the fan curve as well. Unfortunately, it seems that's where AMD's drivers get me - I've tried increasing the point at which the fans ramp up, and also manually disabling fan-stop mode in an attempt to run the fans at a slow speed at low load to keep temps down. The issue is that the card doesn't seem to respond to my set curves. I'll have the first point set at 20% fan speed at 25°C and the card still operates in fan-stop mode until about 60°C and then ramps up hard.
> 
> As for the noise-dampened case, as @Valantar mentions, the noise-dampening is negated by higher fan speeds due to higher thermals. I'm trying to avoid having to buy a new case if possible as I like the aesthetics and footprint of the Meshify C.
> 
> ...



Though noise is a good reason to watercool a GPU, the other big benefit is performance. Usually a GPU that is properly watercooled will not go past 65°C under full load (we all know that heat affects performance). The added benefit of undervolting a watercooled GPU is that increasing the power limit will allow for better performance. Of course, all of this is based on the quality of the GPU in question - not by brand or tier, but, just like with CPUs, the silicon lottery. If you just want to walk into watercooling I would go with the Alphacool Eisbaer. It is a pre-built loop disguised as an AIO. The thing about it is the res has a fill port and it is expandable with G1/4 fittings, so anything from GPU blocks to distro plates will be available. The other thing is price/performance: it is always super competitive with other AIOs in terms of price, but most of those have aluminium rads, while both the block and the rad are copper on the Eisbaer. The newest iteration actually has Be Quiet! fans or ARGB PWM fans, and the rads come in sizes from 240 mm to 420 mm.









Alphacool Eisbaer Aurora 280 CPU - Digital RGB

The Alphacool Eisbaer Aurora AIO CPU water cooler is an evolution of the popular and well-known Eisbaer cooler. Alphacool has improved many details while keeping some standards. Above all, the...

www.alphacool.com


----------



## milewski1015 (Feb 27, 2020)

Valantar said:


> What waterblocks have you been looking at? Judging by MSI's gallery pics like this there should be plenty of clearance around the socket for pretty much any regular water block there. Something like the EK Supremacy Classic barely extends past the socket and mounting holes at all, and is very clearly within the keep-out-zones for AM4, let alone anywhere near interfering with surrounding components on an ATX board. I've attached a mock-up of how that waterblock would fit (tried my best at aligning its screws with the socket mounting screws, but perspective distortion in the photos mean they don't align perfectly and thus look like they take up slightly more space than in reality). Still, there's heaps of space there. Heck, I have mine (well, from before they were called "classic", but same design) on an ITX board and it fits perfectly. Water blocks generally conform very well to socket keep-out zones unless you look at the silly oversized ones from Bykski and the likes.
> View attachment 146060
> But perhaps we should stop this massively OT discussion now?



I was just poking around on EK's site and noticed that a lot of the AM4 blocks specifically excluded the B450 GPC in regards to compatibility. I saw a reddit thread about a guy having to sand down his Bykski block to fit. I guess I just didn't look hard enough - again, all very new to me. Thanks for the mockup, the recommendation and all the helpful info. Yeah, we probably should table this for another thread.



kapone32 said:


> Though noise is a good reason to watercool a GPU, the other big benefit is performance. Usually a GPU that is properly watercooled will not go past 65°C under full load (we all know that heat affects performance). The added benefit of undervolting a watercooled GPU is that increasing the power limit will allow for better performance. Of course, all of this is based on the quality of the GPU in question - not by brand or tier, but, just like with CPUs, the silicon lottery. If you just want to walk into watercooling I would go with the Alphacool Eisbaer. It is a pre-built loop disguised as an AIO. The thing about it is the res has a fill port and it is expandable with G1/4 fittings, so anything from GPU blocks to distro plates will be available. The other thing is price/performance: it is always super competitive with other AIOs in terms of price, but most of those have aluminium rads, while both the block and the rad are copper on the Eisbaer. The newest iteration actually has Be Quiet! fans or ARGB PWM fans, and the rads come in sizes from 240 mm to 420 mm.
> 
> 
> 
> ...



Yeah, I thought the potential for increased performance went without saying. I'd happily try to squeeze some more performance out of my 5700 XT. I'll have to look into the Eisbaer. Thanks for putting it on my radar.


----------



## kapone32 (Feb 27, 2020)

milewski1015 said:


> I was just poking around on EK's site and noticed that a lot of the AM4 blocks specifically excluded the B450 GPC in regards to compatibility. I saw a reddit thread about a guy having to sand down his Bykski block to fit. I guess I just didn't look hard enough - again, all very new to me. Thanks for the mockup, the recommendation and all the helpful info. Yeah, we probably should table this for another thread.
> 
> 
> 
> Yeah, I thought the potential for increased performance went without saying. I'd happily try to squeeze some more performance out of my 5700 XT. I'll have to look into the Eisbaer. Thanks for putting it on my radar.



No problem. If you are looking for reviews, KitGuru has one on YouTube, and they usually publish a written review on their website as well.


----------

