I replied to each valid point, but either you're blind or Google Translate isn't working great for you. And yes, any dumb claim of yours will be met with a laugh from me, since even the sources you cite disagree with you. LOL.
I see your answers have become narrower and the inventions fewer. You understand the reason: whenever you venture suppositions, you get them wrong.
You also understand why: you don't know what you are talking about, and when you try, all that shows is which way you lean.
That is the only consistency on display, and it shows very clearly, partly because nothing else does...
No, it's more like 5 FPS for the PS5 in BG3.
The 3600X on a PC gets 9000 FPS as well.
Only you see a maximum of 5 fps, but now we understand why…
Does the translation at least work on numbers, or not even then? Anyway, here's the truth:
Same point, 10 fps more than the 22, which squares with the difference seen in the other test. A big difference, coming from a low-end 6-core from 5 years ago.
I wouldn't really care if the PS5 were as good as the 3600, but demanding a more up-to-date 8-core is not an option.
I don't even care if I have to buy it, because I would never recommend a new PC with a Ryzen 3600, or even a 5600. That wasn't the point.
It's not my problem that you can't write proper English and that half of what you write looks like a bad Google Translate job.
Your problem is that you try to take advantage of the fact that I am not a native English speaker.
We are not talking about theoretical philosophy, so that many subtleties are not needed.
Better if I don't point out the pettiness of such behavior...
I'm certainly not going to start writing walls of text in a language other than my own, in which I am very fast.
You are lucky for that, otherwise you would have cleared off a long time ago...
If you keep arguing that a $200 7600X3D is the better pick when the whole budget is $700, I'll just LOL and move on.
You keep pretending not to understand the context in which the 7600X3D came up.
It was about core count and cache, the usual topic your mind keeps hiding from you...
Okay, then the DualSense I use on my PC must be a special version.
The PS5 pad works poorly even in compatible games; it needs the cable. Did you even know that?
The PS5 layout is different from the Xbox one, which has been supported on PC for almost 20 years.
The on-screen button prompts are different, and so is the sensitivity, which MS sets uniformly via XInput.
Do you really play on PC? Do you only want to argue stupid and senseless things?
And that is just the details, because I pick my own pad, and if I want I use only the keyboard.
Demanding the purchase of the expensive and (on PC) uncomfortable PS5 pad is ridiculous, as is counting Wi-Fi even if you don't use it.
It bears repeating, just as it bears repeating where these insane demands come from.
If the PS5 doesn't let you choose your pad, that's not a PC problem.
If they don't sell it without Wi-Fi, that's their problem and the problem of those who buy it...
That is what a non-unilateral comparison means. If you don't understand it, that's not my problem.
If you don't like a level playing field, go to a console forum and open the topic: How to inflate the price of the PC, so the PS5 Pro always comes out cheaper.
Subtitle: Any attempt to minimize the processor weakness of the PS5 Pro will be appreciated. Remember, it runs at 120 fps where the Switch does 60...
LOL, at least watch a video properly when you cite it; they clearly mention that PC games aren't optimized for GDDR6 as CPU memory.
The PS5 CPU works with PS5 games, not PC games. Take your performance measurements and draw your conclusions on a PS5.
Bravo, you've discovered the obvious: the PC tests are not optimized for PS5.
It was enough to see the preview; there was no need to hear the obvious spelled out.
Does this mean the ratios would be different on PS5? NO. That's just what you hope for.
Maybe it means the PS5 quadruples the cache and halves the GDDR latency? Not at all! Maybe in your dreams...
It was simply necessary to consider the impact of the cache, the high latency of GDDR (it's called graphics DDR for a reason), and the higher IPC of recent architectures.
Do you understand why the Ryzen Pro 4750G is in there?
It has 8 cores, 8 MB of cache, a similar frequency to the XSX SoC, and the same architecture.
Result:
First you see the loss due to the poor responsiveness of GDDR, in the comparison with the Ryzen Pro 4750G.
Then you see the impact of the cache, with the Ryzen 3600.
Then you see the importance of IPC, with the Ryzen 7600.
Damn, you're on TPU; haven't you ever seen a CPU with fewer cores outperform a CPU with more cores in gaming? Cyberpunk 2077 does make good use of 8 cores.
How much can the number of cores be worth when you start from the worst base ever?
Not much, between GDDR latency, cache amount, and architecture: the 7600 blows past that poor processor paired with memory not suited to it.
And it does so despite that processor having the bandwidth all to itself, which at the start you wanted to pass off as an advantage, while on PS5 it has to be carved out of that unfortunate mix.
And yet, before that, the bandwidth needed by the processor alone was supposedly very little, not worth mentioning.
That was because you wanted to inflate the raw bandwidth available to the GPU on PS5.
But when it turned out that that badly assembled mix doesn't yield much, you preferred to say the tests are not optimized for PS5.
Wow, you must be a smart, impartial, and above all consistent guy.
Last but not least, you are also trying to claim that the sources say something different from what I say.
Where did this need come from? From the fact that you have always avoided, by every means, talking about cache, IPC, and GDDR latency.
That's because, despite your ignorance, you at least understood that the argument does not favor the PS5.
The PS5 usually loses in comparison with the crippled XSX SoC and the Ryzen 3600. The rest is just talk.
The 6700 that DF pairs with it is downclocked, yet it runs better, so the PS5 is closer to the 6600 XT.
It was clear right away, but you like to deny the evidence when it doesn't suit the PS5.
You started with a video you believed to be in your favor, when it actually contained the premise of different game settings, the statement that on average the PS5 runs worse (than a downclocked 6700, remember), the presence of the 3600 (and the tests with the crippled XSX SoC), and talk of the 6800 and 3070 Ti as well.
After that precedent, you go looking for contradictions of mine that aren't there?
You have to admit you have courage, even if it is badly directed...
DF is your source; you were the first to mention them, even though they don't even suit your case.
They lean more in favor of consoles, but they can't say completely absurd things.
I don't need DF to understand that a processor with GDDR and a quarter of the cache is lame.
I wasn't even looking for that; I found it by chance while looking at the details of the Frankenstein configuration.
If you know what you're talking about and tell the truth, there's no need to change your point of view; coherence follows automatically.
Digital Foundry, which you cite as a source, said that the PS5 Pro's GPU performance will be close to an RTX 4070.
RTX 4070 = ~7800 XT.
Aside from the fact that a downclocked 4070 means little (even a downclocked 4080 can come down to a 4070, right?), the fact is that in reality they always talk about the 6800 or the 3070 Ti.
An offhand remark in a reply given to whoever asked for it counts for very little when, along the way, the examples they give use lesser GPUs.
They have already run tests with the 3070 Ti; tell them to be consistent and I would agree with you...
They could also have mentioned an overclocked 4060 Ti. Sure, but you wouldn't have appreciated that, eh...
First, I said close to 2x, not 2x; don't invent things on your own.
Second, reducing the image to ~75% on each axis produces an image with close to half the pixels; it's not rocket science, but it may be hard for you.
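A quick back-of-the-envelope check, just to show the arithmetic (Python, using the ~75% per-axis figure from my own post, nothing official):

```python
# Render-scale arithmetic: scaling each axis to ~75% of native
# leaves 0.75 * 0.75 = 0.5625 of the pixels, i.e. the image is
# roughly 1.8x smaller -- "close to 2x", not exactly 2x.
axis_scale = 0.75
pixel_ratio = axis_scale ** 2        # fraction of native pixels actually rendered
reduction_factor = 1 / pixel_ratio   # how many times fewer pixels that is

print(f"pixels rendered: {pixel_ratio:.4f} of native")  # 0.5625
print(f"reduction: ~{reduction_factor:.2f}x fewer")     # ~1.78x
```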
You took the PS5 Pro's overhead out of the equation regardless.
You talk about 2x, or nearly, without considering it at all, so in your hypotheses you exclude it a priori.
There's no need to spell it out; it's called implicit versus explicit logic.
If that additional cost reaches 20%, the PS5 Pro might not even keep up with the 3070.
I insist on the fact that SONY's (AMD's) upscaler does not perform like DLSS, so the estimate reasonably drops further still.
Let's say that if it goes badly it could end up similar to a 3060 Ti?
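To make the overhead point concrete, here's a rough sketch of the reasoning (illustrative numbers only: the ~1.78x pixel reduction from above and a hypothetical flat per-frame upscaling cost; none of this is a measured PS5 Pro figure):

```python
# Rough model: rendering fewer pixels makes the frame cheaper, but the
# upscale pass adds its own cost to every frame, eating into the gain.
def effective_speedup(pixel_reduction: float, overhead_fraction: float) -> float:
    # pixel_reduction: how many times fewer pixels are rendered (e.g. 1.78)
    # overhead_fraction: upscaler cost relative to the reduced frame time
    return pixel_reduction / (1.0 + overhead_fraction)

print(effective_speedup(1.78, 0.00))  # ~1.78x if the upscaler were free
print(effective_speedup(1.78, 0.20))  # ~1.48x if the upscale pass costs 20%
```

With a 20% cost, the "close to 2x" shrinks to roughly 1.5x, which is exactly why the overhead can't simply be ignored.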
When have you ever even considered any of the above? NEVER!
You only ever tried to aim higher, and why is still clear.
Sorry for the repetitiveness. Repetita iuvant, repetition helps (in theory...).
You said that the RTX 3060 is worse than the PS5. You didn't even mention DLSS.
At least try to understand the context of the answer; otherwise you only have your own poor grasp of context and rhetoric to blame...
The basics of semantics are the same in all languages, my friend.
All of this is only remotely for your benefit.
It's for the others, and while I'm at it I'll save it, so the next time I run into the usual ignorant fanboy I'll be quicker...