Tuesday, September 8th 2020

Microsoft Unveils the Xbox Series S: The Smallest Xbox Ever

Microsoft today surprised us by announcing the Xbox Series S, which offers "next gen performance" in the "smallest Xbox ever." When it goes on sale, it will cost just $299 (ERP). The announcement teaser included a fairly clear image of the finished product, revealing it to be barely more than two controllers in volume. A large fan intake makes up one of its side panels, and it retains the general design of the larger Xbox Series X. Microsoft says it will share more details closer to launch.
Source: Microsoft Xbox (Twitter)

113 Comments on Microsoft Unveils the Xbox Series S: The Smallest Xbox Ever

#76
HD64G
Not a bad price at all for a true 1080P console imho.
#77
R0H1T
Vya DomusI find it funny that the one who thinks I believe people who buy consoles are stupid thinks himself that they are stupid enough to not want things that are objectively better.
What is better though, is it (just) about visual fidelity? Is gaming at 1000 fps definitely better than at 360 fps, then? You're turning a subjective "experience" into something which can be objectively measured. In that sense, no, 4k is not better just because it's 4x the number of pixels, nor is 60 fps inferior just because you think it's objectively better to have 120 fps! Gaming is a personalized, subjective experience; next you're going to tell me people in the '80s playing Pac-Man were doing it wrong :nutkick:
#78
Vya Domus
R0H1TWhat is better though, is it (just) about visual fidelity?
Like I said above, I reckon 480p is enough too. Can't you see how stupid this sounds?

If this console really is just a 4 TFLOP machine, that would make it one of the weakest "next-gen" consoles ever. It's probably hardly any faster than the Xbox One X, GPU-wise. None of that stuff is subjective.
#79
matar
i am buying this.
#80
R0H1T
Well, people are buying it for gaming, aren't they? Do gamers buy GPUs because they offer 30 TFLOPS of performance, or because they can game with them? No one's buying a console only to look at their shiny new toy's specs (alright, some might, but they'd be a minor statistical blip I'd imagine) vs actually buying them for the games they play.
#81
Vya Domus
R0H1TDo gamers buy GPUs because they offer 30 TFLOPS of performance, or because they can game with them?
They'd buy the 30 TFLOPS GPU because it's probably also faster. It's quite simple, I don't know why you don't get it.
R0H1TNo one's buying a console only to look at their shiny new toy's specs
So tell me again, why is MS releasing two consoles with vastly different GPU specs if that doesn't matter? Why not just sell the Series S and be done with it? The focus is clearly still on the Series X.

You say console buyers aren't stupid, but you sure seem to think they're blind.
#82
Valantar
Vya DomusYou know very well that's not what I am arguing. In terms of GPUs/CPUs from all three vendors, probably fewer than ~30 of their products account for 95% of the stuff people actually want to buy. Other than that, RAM, motherboard, storage, etc, those are all standardized and are more or less inconsequential. Just because there are a million brands making the exact same damn thing doesn't mean you actually have that much choice, that's really dumb.

In my opinion, 100 is much closer in magnitude to 2 than to the "millions" you argue.
I'll gladly admit that millions is a bit of an exaggeration, but you're going way too far the other way. I'm obviously not talking about different "brands making the exact same damn thing" - if so, I would have said hundreds of GPUs per generation, not a handful to a dozen like I said. For reference, the past generation from Nvidia consisted of... let's count: 1650, 1650S, 1650 Ti, 1660, 1660S, 2060, 2060S, 2070, 2070S, 2080, 2080S, 2080 Ti. That's a clean dozen SKUs from one vendor in one generation, with a compute performance span of 2.98TF to 13.45TF - 4.5x. In one generation. And there are arguably two more generations that are relevant, though only high end GPUs from the older of those (980, 980 Ti) are an option for current AAA games, and only at lower resolutions. But the point still stands: you'll easily reach fifty relevant GPU variations alone for an average PC game. Then there's CPUs, where, while there's little practical difference between a 4c4t Sandy Bridge or Haswell CPU, there's definitely a meaningful difference between those and a Zen2 6c12t, Whatever Lake 10c20t, or upcoming Zen3 16c32t. Where do you place your minimum? How do you make things scale? Where do you focus your optimizations? And sure, RAM is mostly a checkbox, though heaps of gamers still have 8GB, which is starting to become a limitation. The same goes for storage, which with the upcoming consoles is going to become a bottleneck, and NVMe SSDs will soon be a requirement for a lot of games. And so on and so forth. Even if you (over-)simplify the CPU performance nightmare down to, let's say, low-end (4c8t and below), midrange (6c6t to 8c16t) and high end (10c20t and upwards), and then exclude edge case combinations of CPU and GPU performance, you're still likely looking at 3^30 or more combinations.
And while that is indeed closer to one than a million in pure numbers, complexity doesn't scale linearly - optimizing for two configurations is a good deal more complex than one, but optimizing for ten configurations is far more than 5x more complex than that again. And optimizing for a hundred is essentially impossible in any practical sense, forcing shortcuts with inevitable shortcomings. The same is not true for two configurations.
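As a rough illustration of how quickly that configuration space grows, here's a quick sketch; the component tiers below are illustrative placeholders, not real SKU lists:

```python
from itertools import product

# Hypothetical, heavily simplified hardware matrix: four components,
# each collapsed into just three coarse performance tiers.
gpus = ["entry", "midrange", "high-end"]
cpus = ["4c8t", "6c12t", "10c20t"]
ram = ["8GB", "16GB", "32GB"]
storage = ["HDD", "SATA SSD", "NVMe SSD"]

# Every combination a developer might, in principle, have to consider.
configs = list(product(gpus, cpus, ram, storage))
print(len(configs))  # 3^4 = 81 combinations, even at this coarse granularity
```

Even with this drastic simplification, a PC developer faces dozens of combinations to test against, versus exactly two fixed configurations on the new Xbox line.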
Vya DomusConsoles used to be sold on the premise that there is going to be just one hardware configuration which will last for years; 2 > 1, so that principle is already broken. Xbox One games look like mud (720p and even less) compared to their Xbox One X counterparts. Xbox Series S games will look the same compared to whatever higher end console they'll make (which you know they will, eventually) or maybe even to the Series X right now.

You view it as a compromise, I view it as MS selling underpowered crap and creating a bunch of platforms delivering inconsistent quality.
Sure, you can argue that that promise is broken already. But you could also argue that the promise will be less broken this go around than the last one - last time around you had a high end GPU console with a shitty CPU that made it highly unpredictable, while this time around you have a solid base of high CPU performance with your choice of an entry or high end GPU, alongside low or high resolution. That IMO is a level of complexity that is essentially required by today's gaming landscape, given how display resolutions have taken off in the past decade and how the gaming audience has diversified massively. (Though, to be fair, the "I'll just buy a Playstation to play Fifa" crowd has been there for a long, long time.) Aiming for a one size fits all configuration is utopian in today's landscape, as you would either alienate a huge customer base by pricing them out, or you'd alienate those who want their games to look really good but don't want the hassle of PC gaming - which is also a huge group. Two configurations lets you get both. That is well worth the compromise for me. And that elitist "MS is selling underpowered crap" stance is something I'm truly happy I don't share. Not everyone can afford a $500 console or a $1500 PC, so providing access to upcoming games to these audiences is of great benefit to the gaming community as a whole.
Vya DomusThey don't have a "million" options, they have very few, it's rare that I see a game with more than 5 options say, maximum about 10. It doesn't matter because the same idea applies, out of those 10 options, maybe 3-4 have a noticeable impact in quality/performance. But you don't even need to touch those, there are presets anyway.
I would love to see you show me a relatively recent game with 5 graphics options. Pretty please? For reference, here's Rocket League, a relatively simple and not particularly fancy-looking esports game. I count 14 relevant settings.
external-preview.redd.it/7_gCiLe4LpqC1HpeKHEeQYmXxsTRJoJpUH1CTjiWIpE.jpg
Overwatch?
www.prosettings.com/site/wp-content/uploads/taimou-graphic-settings.jpg
A similar amount, though we can't see the whole panel.
CoD Warzone? Had to look up an article, but at least 17.

That's not quite enough for a representative overview, but the only game I found with as few as 5 was LoL.
Vya DomusOne, two, ten, who cares. It was just an example; large 1080p displays look horrid in comparison with 4K ones, that's just the reality. No matter how far away you get, the pixel grid remains visible; a 65" 1080p TV has 33 PPI, for Christ's sake. 33

That's dismal.
DPI itself is meaningless without accounting for viewing distance, so the number you should be looking at is some version of distance over DPI. Recommendations for print DPI vs. viewing distance are reasonably applicable here. And while 65" 1080p is definitely on the high end for that resolution - I doubt I'd go above 55" for that - there have been plenty of blind tests done with various degrees of scientific accuracy on this, and the vast majority seem to conclude that at normal viewing distances, most people can't really tell the difference between 4k and 1080p.
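To put rough numbers on that, here's a back-of-the-envelope sketch; the ~60 pixels-per-degree figure often cited for normal visual acuity is an approximation, and the 9-foot viewing distance is just an assumed typical living-room setup:

```python
import math

def ppi(diagonal_in, w_px=1920, h_px=1080):
    """Pixels per inch for a given diagonal (inches) and resolution."""
    return math.hypot(w_px, h_px) / diagonal_in

def pixels_per_degree(diagonal_in, distance_in, w_px=1920, h_px=1080):
    """Angular pixel density as seen from a given viewing distance."""
    pixel_in = 1 / ppi(diagonal_in, w_px, h_px)          # size of one pixel
    deg_per_pixel = math.degrees(2 * math.atan(pixel_in / (2 * distance_in)))
    return 1 / deg_per_pixel

# 65" TV viewed from 9 feet (108 inches):
print(round(ppi(65)))                                 # ~34 PPI for 1080p
print(round(pixels_per_degree(65, 108)))              # ~64 px/deg at 1080p
print(round(pixels_per_degree(65, 108, 3840, 2160)))  # ~128 px/deg at 4k
```

Note that even 1080p on a 65" panel lands around the commonly cited acuity threshold at couch distance, which is why the blind-test results mentioned above come out the way they do.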
Vya DomusIf you are talking about scaling, that's a non-issue; you can always do the scaling on the GPU and avoid anything the TV does to the image. 1080p on a 4K display never looks worse than on a 1080p one. I've seen really cheap 4K TVs, like under $250 cheap, and I wouldn't trade one for the most expensive 1080p TV in the world. It just doesn't compare, the PPI is simply not there.
...but I'm not talking about scaling, am I? I'm talking about image quality FFS. Cheap 4k TVs generally have terrible image quality, and it really doesn't matter if your DPI is high if your contrast is shit and your color reproduction is terrible. And especially not if your response times are slow on top of that. All of which is true for most cheap TVs. Which is why a good 1080p TV will look better than a cheap 4k TV for anything except rendering text.
#83
R0H1T
Vya DomusLike I said above, I reckon 480p is enough too. Can't you see how stupid this sounds?

If this console really is just a 4 TFLOP machine, that would make it one of the weakest "next-gen" consoles ever. It's probably hardly any faster than the Xbox One X, GPU-wise. None of that stuff is subjective.
Not sure why you're going with the same circular reasoning over & over again? Why can't there be two tiers of (console) gamers?
The casual ones, or the cash strapped ones, can get this one while the hardcore &/or better off ones will get the more expensive one. Do I really need to remind you of the Trillions of dollars, yes with a capital T, the global economy's shed just this year, with more pain yet to come! Even some of the hardcore fans will have to think twice about their purchase decisions. Going with something cheap isn't denigrating (nor "inferior" for gaming), though that's the message I see being repeated in many of your posts.
#84
Valantar
Vya DomusLike I said above, I reckon 480p is enough too. Can't you see how stupid this sounds?

If this console really is just a 4 TFLOP machine, that would make it one of the weakest "next-gen" consoles ever. It's probably hardly any faster than the Xbox One X, GPU-wise. None of that stuff is subjective.
The problem with your line of argumentation is you seem to think that perceived quality increases are linear, which they obviously aren't. Is even 720p vastly better than 480p? Of course! Is 1080p much better than 720p, pretty much regardless of viewing distance and display size? Sure! Is the same true for 4k vs. 1080p? Not really, unless your TV or monitor is really huge or you're looking at it from very, very close. We've reached a point of rapidly diminishing returns in terms of resolution increases. That, like a lot of what you are saying, is also a fact, and isn't by any means subjective.
#85
Vya Domus
ValantarI would love to see you show me a relatively recent game with 5 graphics options. Pretty please? For reference, here's Rocket League, a relatively simple and not particularly fancy-looking esports game. I count 14 relevant settings.
external-preview.redd.it/7_gCiLe4LpqC1HpeKHEeQYmXxsTRJoJpUH1CTjiWIpE.jpg
Overwatch?
www.prosettings.com/site/wp-content/uploads/taimou-graphic-settings.jpg
A similar amount, though we can't see the whole panel.
CoD Warzone? Had to look up an article, but at least 17.
That's not quite enough for a representative overview, but the only game I found with as few as 5 was LoL.
I said it myself that there are games with more than 5; I know very well some have a lot of options. Here's another one: MS Flight Simulator. It doesn't matter how many there are, you don't ever have to worry about them if you don't want to.
ValantarI'll gladly admit that millions is a bit of an exaggeration, but you're going way too far the other way. I'm obviously not talking about different "brands making the exact same damn thing" - if so, I would have said hundreds of GPUs per generation, not a handful to a dozen like I said. For reference, the past generation from Nvidia consisted of... let's count: 1650, 1650S, 1650 Ti, 1660, 1660S, 2060, 2060S, 2070, 2070S, 2080, 2080S, 2080 Ti. That's a clean dozen SKUs from one vendor in one generation, with a compute performance span of 2.98TF to 13.45TF - 4.5x. In one generation. And there are arguably two more generations that are relevant, though only high end GPUs from the older of those (980, 980 Ti) are an option for current AAA games, and only at lower resolutions. But the point still stands: you'll easily reach fifty relevant GPU variations alone for an average PC game. Then there's CPUs, where, while there's little practical difference between a 4c4t Sandy Bridge or Haswell CPU, there's definitely a meaningful difference between those and a Zen2 6c12t, Whatever Lake 10c20t, or upcoming Zen3 16c32t. Where do you place your minimum? How do you make things scale? Where do you focus your optimizations? And sure, RAM is mostly a checkbox, though heaps of gamers still have 8GB, which is starting to become a limitation. The same goes for storage, which with the upcoming consoles is going to become a bottleneck, and NVMe SSDs will soon be a requirement for a lot of games. And so on and so forth. Even if you (over-)simplify the CPU performance nightmare down to, let's say, low-end (4c8t and below), midrange (6c6t to 8c16t) and high end (10c20t and upwards), and then exclude edge case combinations of CPU and GPU performance, you're still likely looking at 3^30 or more combinations.
And while that is indeed closer to one than a million in pure numbers, complexity doesn't scale linearly - optimizing for two configurations is a good deal more complex than one, but optimizing for ten configurations is far more than 5x more complex than that again. And optimizing for a hundred is essentially impossible in any practical sense, forcing shortcuts with inevitable shortcomings. The same is not true for two configurations.
You are still having a very hard time differentiating between the sheer number of components available and the choices that are actually important. For example there may be 1650, 1650S, 1650 Ti, 1660, 1660S, 2060, 2060S, 2070, 2070S, 2080, 2080S, 2080 Ti, but:

1650, 1650S, 1650 Ti, 1660, 1660S
2060, 2060S, 2070
2070S, 2080, 2080S, 2080 Ti

They are all kind of grouped in performance tiers, and within each group the differences aren't that big. So in reality the choice gets simplified to just three groups; after that, you are facing differences that can be argued to not be that important for the overall experience.

If people really had to choose from 3^30 combinations, no one would get anything done. Clearly things are much simpler in reality.
ValantarDPI itself is meaningless without accounting for viewing distance, so the number you should be looking at is some version of distance over DPI. Recommendations for print DPI vs. viewing distance are reasonably applicable here. And while 65" 1080p is definitely on the high end for that resolution - I doubt I'd go above 55" for that - there have been plenty of blind tests done with various degrees of scientific accuracy on this, and the vast majority seem to conclude that at normal viewing distances, most people can't really tell the difference between 4k and 1080p.
How do you do a blind test with displays? Sorry... I couldn't help myself...

You still don't quite get what I am saying. I'm not talking about the resolution itself as much as I am about the pixel structure, which is physically larger in a 1080p display of the same size. I am talking about the screen door effect, more precisely: I can see the grid of pixels on practically every single 1080p display larger than 40 inches, no matter how many meters away (well, any reasonable distance).
Valantar...but I'm not talking about scaling, am I? I'm talking about image quality FFS. Cheap 4k TVs generally have terrible image quality, and it really doesn't matter if your DPI is high if your contrast is shit and your color reproduction is terrible.
I guess you have to see a recent cheap 4K TV, then, to convince yourself they are superior. Again, I would trade worse contrast for not being able to see that pixel grid any day of the week.
ValantarThe problem with your line of argumentation is you seem to think that perceived quality increases are linear, which they obviously aren't. Is even 720p vastly better than 480p? Of course! Is 1080p much better than 720p, pretty much regardless of viewing distance and display size? Sure! Is the same true for 4k vs. 1080p? Not really, unless your TV or monitor is really huge or you're looking at it from very, very close. We've reached a point of rapidly diminishing returns in terms of resolution increases. That, like a lot of what you are saying, is also a fact, and isn't by any means subjective.
Objectively, yes, they are linear; 4K means four times the information of a 1080p image. Subjectively, it may not be linear, but it still exists.
R0H1TNot sure why you're going with the same circular reasoning over & over again? Why can't there be two tiers of (console) gamers?
The casual ones, or the cash strapped ones, can get this one while the hardcore &/or better off ones will get the more expensive one. Do I really need to remind you of the Trillions of dollars, yes with a capital T, the global economy's shed just this year, with more pain yet to come! Even some of the hardcore fans will have to think twice about their purchase decisions. Going with something cheap isn't denigrating (nor "inferior" for gaming), though that's the message I see being repeated in many of your posts.
We went from "people don't care about 4K" to the economy and how I am apparently shaming people that chose cheaper hardware. You're not just moving the goal post, you are moving the entire plot of land along with it.
#86
R0H1T
Let's see the chronology of events ~
Vya DomusYeah, yeah, I know the drill. Eyes can't see 60 fps, higher resolutions don't matter, etc. I find it funny that the one who thinks I believe people who buy consoles are stupid thinks himself that they are stupid enough to not want things that are objectively better.
What's the thing you're referring to? Gaming isn't a thing & I've said repeatedly it's not objectively measurable, at least coming from my PoV.
Vya DomusLike I said above, I reckon 480p is enough too. Can't you see how stupid this sounds?
Can you tell me why it is stupid? If someone can only afford 480p gaming, say on a phone or handheld console, why is that not better for a person who clearly enjoys whatever he's doing on that 480p screen?
Vya DomusSo tell me again, why is MS releasing two consoles with vastly different GPU specs if that doesn't matter? Why not just sell the Series S and be done with it? The focus is clearly still on the Series X.
Price & covering (a) more diverse set of gamers.
Vya DomusWe went from "people don't care about 4K" to the economy and how I am apparently shaming people that chose cheaper hardware. You're not just moving the goal post, you are moving the entire plot of land along with it.
Right, I'm the one throwing shade at you ~
Yeah, yeah, I know the drill. Eyes can't see 60 fps, higher resolutions don't matter, etc.

Can't you see how stupid this sounds?

You say console buyers aren't stupid, but you sure seem to think they're blind.
Newsflash: 90% of the people who game, on any device, just lower the settings to get a more palatable gaming experience. Not everyone throws their mid-range phone, Switch, PS4, or 2080 Ti in the bin just so they can enjoy gaming at the highest settings! I'd argue 95%, but then it'd go into the same pissing contest we've seen on this page.
#87
Vya Domus
R0H1TI've said repeatedly it's not objectively measurable
So 8294400 pixels > 2073600 pixels, for example, would be a subjective statement, in other words?
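(For the record, those counts are just 3840x2160 and 1920x1080 multiplied out; a quick check:)

```python
uhd = 3840 * 2160  # 4k UHD pixel count
fhd = 1920 * 1080  # 1080p (Full HD) pixel count

print(uhd)         # 8294400
print(fhd)         # 2073600
print(uhd // fhd)  # 4: a 4k frame carries 4x the pixels of a 1080p frame
```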
R0H1Tif someone can only afford 480p gaming, say on a phone or handheld console, why is it not better for a person who clearly enjoys whatever he's doing on that 480p screen?
What the hell are you even trying to say? If they can only afford 480p gaming then they're going to game in 480p, and that's the end of it, who cares. You can't then go on and say that there is no such thing as having a better experience than one where you are playing games at 480p. Given the choice, they would surely choose to go for more than that.
R0H1TPrice & covering (a) more diverse set of gamers.
But you've just said that there is no such thing as an "objective measure" when it comes down to this kind of stuff. Someone who pays more for a Series X for instance, what is he getting exactly in your view ?
R0H1TNewsflash 90% of the people who game, on any device, just lower the settings to get a more palatable gaming experience.
Now we have "palatable gaming experience". Great. You know, we shouldn't even plug in our PCs and consoles, let's just look at them sit in their box. That's probably the ultimate palatable gaming experience.
#88
Caring1
LionheartHow the hell is $299 stupid? Dafuq do you smoke.
That was in response to a post mentioning $400 +
#89
R0H1T
Let's see if I can get through to you one last time o_O
Vya DomusSo 8294400 pixels > 2073600 pixels, for example, would be a subjective statement, in other words?
No, numbers aren't subjective, otherwise why have maths? Gaming, or the gaming experience, is though. That's easy enough for you to follow?
Vya DomusWhat the hell are you even trying to say? If they can only afford 480p gaming then they're going to game in 480p, and that's the end of it, who cares. You can't then go on and say that there is no such thing as having a better experience than one where you are playing games at 480p. Given the choice, they would surely choose to go for more than that.
Hey, you're the one who brought up the inane point about a consistent (console) gaming experience! And then you try to discredit this cheaper version of the next-gen console by saying it only has 4 TFLOPS of computing power, it sucks, & no way should anyone buy it :slap:

Why is it so damn hard for you to understand that not everyone wants to spend $499 or $599 upfront for gaming hardware, or play at 4k? Is that you just being you? I know you're probably one of the people on TPU most persistently hard to nudge off their views ~ kinda like me :shadedshu:
Vya DomusBut you've just said that there is no such thing as an "objective measure" when it comes down to this kind of stuff. Someone who pays more for a Series X for instance, what is he getting exactly in your view ?
The ability to play at (native) 4k, an additional option with the extra computing power. More drive space, perhaps CPU cores & definitely GPU cores ~ that's objectively more (>) but is it also better? Well I'll let you play scrabble with that one.
Vya DomusNow we have "palatable gaming experience". Great. You know, we shouldn't even plug in our PCs and consoles, let's just look at them sit in their box. That's probably the ultimate palatable gaming experience.
You know what, that's a great idea! Saves you lots on electricity for sure, possibly ISP fees, the cost of the game itself, & then time.
#90
Space Lynx
Astronaut
ChomiqPricing for Series X - $499 (this is based on Windows Central, which also dropped the $299 price tag before the MS announcement).

Launches November 10th.

Xbox all access financing:
Series S - $25 a month
Series X - $35 a month

Looks like they will be pushing for large install base through the financing options.
I have to admit the financing options are a genius move by them. Sony really doesn't need to do this though, since they will instantly sell out anyway, just 'cause Sony has a monopoly on the exclusives card. And I admit there are many Sony exclusives I want to play but was never able to. I tried PlayStation Now and it was a hot mess of lag.
#91
saki630
These console cash-grabbing manufacturers are almost as bad as EA. You have to pay up for last gen hardware, and pay a subscription to play online. While the poor PCMasterRace spends $2k to play current gen maxed out settings on console ports, because original PC-only High Quality and High Enjoyment games don't exist anymore.

For real, what can a 3070 let you do that an XboxSex cant already do? I'm just sad we have no good games to look forward to when Nvidia does another paper launch this month.
#92
Tomorrow
saki630These console cash-grabbing manufacturers are almost as bad as EA. You have to pay up for last gen hardware, and pay a subscription to play online. While the poor PCMasterRace spends $2k to play current gen maxed out settings on console ports, because original PC-only High Quality and High Enjoyment games don't exist anymore.

For real, what can a 3070 let you do that an XboxSex cant already do? I'm just sad we have no good games to look forward to when Nvidia does another paper launch this month.
No good games? Cyberpunk 2077 is what?

PC has the advantage of backward compatibility, whereas with consoles you mostly have to keep your old box to play your older games.
#93
Vya Domus
R0H1T~ that's objectively more (>) but is it also better?
No, it has to be worse I imagine. You are straight up out of your mind.
#94
Chomiq
saki630these console cash grabbing manufacturers are almost as bad as EA. You have to pay up for last gen hardware, and pay a subscription to play online. While the poor PCmasterRace spends $2k to play current gen maxed out settings on Console Ports because original PC only High Quality and High Enjoyment games dont exist anymore.

For real, what can a 3070 let you do that an XboxSex cant already do? I'm just sad we have no good games to look forward to when Nvidia does another paper launch this month.
How the hell is this "last gen hardware" if it's got Zen 2, RDNA2, and NVMe in it?
Can you game with just your 3070? Because there are more pricey components needed to build a PC, and it comes in at well above $299.
#95
Valantar
Vya DomusI said it myself that there are games with more than 5; I know very well some have a lot of options. Here's another one: MS Flight Simulator. It doesn't matter how many there are, you don't ever have to worry about them if you don't want to.
Please stop moving the goal posts. You said, and I quote:
Vya Domusit's rare that I see a game with more than 5 options say, maximum about 10
A statement I easily disproved by showing examples of games nearing 2x your stated maximum amount of settings. See the issue here? If you say "5, maybe up to 10" you can't then say "I said more than 5", because that's a plain-faced misrepresentation of what you said.
Vya DomusYou are still having a very hard time differentiating between the sheer number of components available and the choices that are actually important. For example there may be 1650, 1650S, 1650 Ti, 1660, 1660S, 2060, 2060S, 2070, 2070S, 2080, 2080S, 2080 Ti, but:

1650, 1650S, 1650 Ti, 1660, 1660S
2060, 2060S, 2070
2070S, 2080, 2080S, 2080 Ti

They are all kind of grouped in performance tiers, and within each group the differences aren't that big. So in reality the choice gets simplified to just three groups; after that, you are facing differences that can be argued to not be that important for the overall experience.
Can you please make up your mind whether you are talking about game experience or development difficulty? The "one size fits all" console paradigm does of course address both of those (users know what they get; developers have a single target to aim for), but you can't make an argument pointing to one of these and then start pointing to the other when you are contradicted. While those three groups might be somewhat applicable when it comes to user experience (though resolution throws a serious wrench into that, as there are people in each tier playing at vastly different resolutions), they don't apply to development and optimization at all - the 1660S is so close to the 2060 that it is much better catered for by optimizing for that than for the 1650, for example. In other words, a three-tier system is insufficient for that. And besides, you're still just working with one of the at least three relevant generations from just one of two GPU vendors. I used that generation to exemplify the complexity of tuning a game for the selection of GPUs out there; you can't then take that limited example and present it as if it is (or was meant as) an overview of the totality of the situation.
Vya DomusIf people really had to choose from 3^30 combinations, no one would get anything done. Clearly things are much simpler in reality.
No, they aren't. But developers take a lot of shortcuts and consciously simplify things simply because testing for all the different relevant combinations is entirely impossible. But this also inevitably leads to problems - something clearly demonstrated by the fact that the most popular GPUs always get the most optimizations.
Vya DomusYou still don't quite get what I am saying. I'm not talking about the resolution itself as much as I am about the pixel structure, which is physically larger in a 1080p display of the same size. I am talking about the screen door effect, more precisely: I can see the grid of pixels on practically every single 1080p display larger than 40 inches, no matter how many meters away (well, any reasonable distance).
That still comes down to the panel, and pixel pitch and pixel size. I've also seen 1080p panels with tiny pixels and huge gaps between them, but they are relatively rare. I have to get really close (far less than 1m) to my current 40" 1080p TV (8 years old and not particularly high end at the time) to see any type of grid. At 65" that distance would of course be longer, but as I said, 65" is on the far end of what 1080p can handle in a normal living room IMO.
Vya DomusI guess you have to see a recent cheap 4K TV, then, to convince yourself they are superior. Again, I would trade worse contrast for not being able to see that pixel grid any day of the week.
We're going to have to disagree on that. Mostly because, as I said above, I can't say that I've ever seen a relevant example of a grid/screen door effect on a 1080p TV.
Vya DomusObjectively, yes, they are linear; 4K means four times the information of a 1080p image. Subjectively, it may not be linear, but it still exists.
Could you please reread that? And then please make an effort to abandon your naivistic belief in absolute resolution? I was talking about perceived resolution, which while partially subjective, is also determined by factors beyond absolute display resolution such as viewing distance, viewing angle, brightness, contrast/dynamic range, color gamut, and a few others, with viewing distance being the most important of these by far. While people spouting things like "the human eye can only see 1080p" are presenting an argument just as naive as this (again, they're ignoring exactly the same things!), your statement makes a claim to objectivity in a way that strips any relevant context from the data, rendering it meaningless. It's like determining which is the best commuter car by measuring top speed - while the metric might be objectively true in a vacuum, its objectivity and truth are both irrelevant in an in-use context. Remember, you - or any other human! - have never actually seen "1080p resolution" or "4k". You have seen "1080p at X distance on a display of size Y". Perception is always contextual, and decontextualized numbers, while "true", are only one piece of the puzzle, and a piece that can never be perceived by itself. (I mean, you can always get really close to a display and count all its pixels, but that doesn't really qualify as perceiving its absolute resolution either, as you'd then only be perceiving part of it at a time.)

So, for future reference in this debate, can we please stop throwing absolute resolution around as if it is a meaningful metric by itself? Effective resolution, or DPI vs. viewing distance, is the bare minimum of what is useful. And effective resolution drops as viewing distance increases. Always, and indisputably.

And before you object with "perceived resolution is subjective" - no it isn't. There are obviously limits to human visual acuity. They also obviously can't be measured or described in the same way display resolution can - for example, the human eye is much better at distinguishing diagonal lines than pixel grids - but they nonetheless exist and are just as objective as the resolution of a display, even if there is a range with a vaguely defined ceiling rather than a fixed number. This is just as objective as the absolute resolution of a display, and again, unlike absolute resolution it is actually relevant.

The kind of funny thing here is that your argument about screen-door effects on 1080p TVs is exactly this kind of contextual argument (in this case dependent on pixel size and pitch) yet you are trying to present it as proof of absolute resolution being relevant regardless of viewing distance and display size. Which, again, simply isn't true.
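The viewing-distance argument above boils down to angular resolution. A rough Python sketch (the 65" panel size and ~2.5 m viewing distance are example values, not figures from the thread) illustrates how pixels per degree of visual angle - the quantity "effective resolution" actually describes - depends on both panel resolution and distance:

```python
import math

def pixels_per_degree(diag_in, h_pixels, distance_in, aspect=(16, 9)):
    """Approximate pixels per degree of visual angle at the screen centre."""
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)      # physical panel width
    fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return h_pixels / fov_deg

# 65" TV viewed from roughly 2.5 m (98.4")
ppd_1080 = pixels_per_degree(65, 1920, 98.4)
ppd_4k   = pixels_per_degree(65, 3840, 98.4)
print(f"1080p: {ppd_1080:.1f} PPD, 4K: {ppd_4k:.1f} PPD")
# 20/20 vision resolves roughly 60 PPD (about 1 arcminute per pixel),
# so a 65" 1080p panel at this distance sits near the acuity limit
```

Doubling the viewing distance roughly doubles the pixels per degree, which is why the perceptible benefit of 4K over 1080p shrinks as the couch moves back.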
ChomiqHow the hell is this "last gen hardware" if its got Zen 2 rdna2 and nvme in it?
Can you game with just your 3070? Because there are more pricey components needed to build a PC, and it comes to well above $299.
After reading that post twice, I think it was meant to be sarcastic. Always hard to tell in writing.
TomorrowPC has the advantage of backward compatibility. Whereas with consoles you mostly have to keep your old box to play your older games.
"Mostly", but going out of style fast. The XSX (and XOne series too) can play most X360 and OG Xbox games. That's better than most PCs - you'd really struggle to get a game from the early 2000s to run properly on a modern PC. The XSX even adds in nice-to-have features like resolution boosts, upscaling, and even automatic HDR(!). No PC comes even close to that. The selection of older games on PCs is still much, much larger, and the freedom of choice in how they are played is much larger, but overall, we're looking at a radically changed console landscape compared to ten years ago.

Of course, the PS5's backwards compatibility is limited to "most" PS4 games, whatever that means, and a small selection of earlier games through PS Now streaming (which is rather terrible still).
Posted on Reply
#96
ThrashZone
Hi,
Hope they offer larger storage; 500 GB is nothing nowadays

Posted on Reply
#97
Valantar
ThrashZoneHi,
Hope they offer larger storage; 500 GB is nothing nowadays

For $299 anything above 500GB NVMe sounds highly unlikely to me - but there is always the expansion slot with those Seagate SSD cards. Hopefully the premium over a regular SSD won't be outrageous.
Posted on Reply
#98
Vya Domus
ValantarA statement I easily disproved by showing examples of games nearing 2x your stated maximum amount of settings. See the issue here? If you say "5, maybe up to 10" you can't then say "I said more than 5", because that's a plain-faced misrepresentation of what you said.
I said "about", I always chose my words very well and leave some wiggle room, don't get mad because of that.
ValantarNo, they aren't. But developers take a lot of shortcuts and consciously simplify things simply because testing for all the different relevant combinations is entirely impossible. But this also inevitably leads to problems - something clearly demonstrated by the fact that the most popular GPUs always get the most optimizations.
Per-GPU optimizations do not really exist; things can only be optimized per architecture. When was the last time you saw a driver update or game patch explicitly mention that one GPU in particular has received an optimization?
ValantarThe "one size fits all" console paradigm does of course address both those (users know what they get; developers have a single target to aim for), but you can't make an argument pointing to one of these and then start pointing to the other when you are contradicted.
Where exactly was I contradicted?
ValantarCould you please reread that? And then please make an effort to abandon your naive belief in absolute resolution? I was talking about perceived resolution, which while partially subjective, is also determined by factors beyond absolute display resolution such as viewing distance, viewing angle, brightness, contrast/dynamic range, color gamut, and a few others, with viewing distance being the most important of these by far. While people spouting things like "the human eye can only see 1080p" are presenting an argument just as naive as this (again, they're ignoring exactly the same things!), your statement makes a claim to objectivity in a way that strips any relevant context from the data, rendering it meaningless.
No, sorry, no matter how hard you try I won't adhere to your strange idea that somehow there isn't a direct link between absolute resolution and perceived resolution. If one increases, so does the other.
ValantarIt's like determining which is the best commuter car by measuring top speed - while the metric might be objectively true in a vacuum, its objectivity and truth are both irrelevant in an in-use context.
We are not debating which is the best all-around machine here like you make it out to be. I always spoke strictly about resolution/visuals. If I drive two cars and one reaches a higher top speed, then that's the faster car. Period. I don't know which is the better commuter; that's something you came up with.
ValantarThe kind of funny thing here is that your argument about screen-door effects on 1080p TVs is exactly this kind of contextual argument (in this case dependent on pixel size and pitch) yet you are trying to present it as proof of absolute resolution being relevant regardless of viewing distance and display size. Which, again, simply isn't true.
There is nothing contextual about that: no matter the pixel size and pitch, when viewed from the same distance the pixel grid is going to be more apparent on a 1080p panel. You simply can't disprove that, though I am sure you'll try.
Posted on Reply
#99
Space Lynx
Astronaut
ValantarFor $299 anything above 500GB NVMe sounds highly unlikely to me - but there is always the expansion slot with those Seagate SSD cards. Hopefully the premium over a regular SSD won't be outrageous.
They probably don't mind selling the console at a loss, since they know they will make up for it when millions of people get hooked on Xbox Game Pass (which is such an amazing value and a great business model). I expect Xbox Game Pass for PC will go from $5 a month to $10 a month next year or in 2022. Microsoft is playing the long game. I mean, I'm subscribed to Xbox PC Game Pass, because for $5, why not? It's insane value for what you get.
Posted on Reply
#100
Colddecked
Vya DomusThat's exactly what I am saying, and as they become more and more PC-like the reasons for them being slowly disappear.



I don't know but if they feel insulted they should get a life.
LMK when you can build a $500 or $300 mITX system, with a $70 controller bundled in mind you, with these specs. Get past the specs: the consoles still offer people a much simpler interface to get to their content, which the majority of consumers prefer. Honestly you sound like a crusty old man.
Posted on Reply