Tuesday, September 8th 2020

Microsoft Unveils the Xbox Series S: The Smallest Xbox Ever

Microsoft today surprised us with the announcement of the Xbox Series S. The Xbox Series S offers "next gen performance" and is the "smallest Xbox ever." The company has promised to share more details about the new console soon, but it has already confirmed that when it goes on sale, it will cost just USD $299 (ERP). The announcement teaser includes a fairly clear image of the finished product, revealing it to be barely more than two controllers in volume. A large fan intake makes up one of its side panels, and it retains the general design of the larger Xbox Series X.
Source: Microsoft Xbox (Twitter)

113 Comments on Microsoft Unveils the Xbox Series S: The Smallest Xbox Ever

#101
Valantar
lynx29they probably don't mind selling the console at a loss, since they know they will make up for it when millions of people get hooked on xbox game pass (which is such an amazing value and great business model), i expect xbox game pass for PC will go from 5 a month to 10 a month next year or in 2022. microsoft is playing the long game. i mean im subscribed to xbox pc game pass, cause for 5 bucks why not, its insane value for what you get.
Oh, I absolutely know that they will in all likelihood be selling at a loss, the question is just how much of a loss they can stomach (and yes, subscriptions are exactly how they make a profit). Given the pricing, the XSS is likely to outsell the XSX - maybe even by a significant margin - and given that businesses mostly operate on percentage margins rather than dollar values, swallowing a $50 loss per XSS is going to hurt more than the same loss on an XSX.

A 1TB SSD would eat up a lot of the BOM on a $299 console - flash and controller prices are relatively fixed, after all, being commodity products where little volume pricing is possible, so it's not like MS is getting a mind-blowing deal on flash or controllers from someone (though their use of PCIe 4.0 x2 controllers likely saves them a few bucks per unit, not to mention motherboard costs etc.). The cheapest 1TB NVMe SSDs on the market today are just below $100, which makes it reasonable to expect part and production costs to be somewhere around $60-70. You can cut that nearly in half for a half-capacity drive, so that's a BOM savings of ~$30. Definitely not nothing - it's ~10% of the cost of the console, so it's definitely noticeable.

While I don't believe MS ultimately cares about making anything near a profit on hardware, they might care about not losing an "unnecessary" $30 million per million consoles sold. That could finance a pretty big first-party game production, after all, or outfit a decent handful of server racks to improve the attractiveness of their streaming solution - both of which could attract more subscribers. So, while it's not impossible that they'd splurge on a 1TB drive, they've already invested significantly (both in research and PR) in Smart Delivery to reduce download sizes and avoid downloading unnecessary assets, so my guess is that they're taking a calculated risk on a 500GB drive, lowering the loss per unit, while having the backing of smaller download sizes for the XSS and external SSD cards as an option for space-starved users. Who knows, maybe they'll launch a $350 1TB version?
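If anyone wants to poke at those numbers, here's the napkin math as a quick Python sketch - the per-GB flash cost and the fixed controller/board figure are purely my own assumptions, not confirmed BOM data:

# Rough napkin math for the SSD cost argument above.
# Every figure here is an assumption, not a confirmed BOM number.
CONSOLE_PRICE = 299        # USD, XSS retail price
COST_PER_GB = 0.055        # assumed NAND + overhead cost per GB at OEM volume
CONTROLLER_AND_BOARD = 10  # assumed fixed cost: controller, PCB, assembly

def ssd_cost(capacity_gb):
    return capacity_gb * COST_PER_GB + CONTROLLER_AND_BOARD

cost_1tb = ssd_cost(1024)      # ~66 USD
cost_512 = ssd_cost(512)       # ~38 USD
savings = cost_1tb - cost_512  # ~28 USD per console

print(f"1 TB ~${cost_1tb:.0f}, 512 GB ~${cost_512:.0f}, savings ~${savings:.0f} per console")
print(f"Share of retail price: {savings / CONSOLE_PRICE:.0%}")
print(f"Per million consoles sold: ~${savings:.0f} million")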
#103
Vya Domus
ColddeckedHonestly you sound like a crusty old man.
Like I said, they should get a life.
#104
Colddecked
Vya DomusLike I said, they should get a life.
That's sage advice. Does trolling count as "getting a life"?
#105
Valantar
Vya DomusI said "about", I always chose my words very well and leave some wiggle room, don't get mad because of that.
... so 17 is "about 10"? For someone arguing strict adherence to the value of absolute numbers, that's a bit of a self-contradiction. My point still stands. Oh, and please don't come dragging the tired-ass memelord "U mad?" rhetoric, as if that adds anything to the debate. It's a used-up and utterly transparent attempt at making your opposition look bad and indirectly belittling them as irrational and emotional rather than having to present actual arguments, so if you're interested in having anything resembling a reasonable debate, please stop. It certainly isn't helping your cause.
Vya DomusPer GPU optimizations do not really exist, things can only be optimized per architecture. When was the last time you saw a driver update or game patch explicitly mention that one GPU in particular has received an optimization ?
That's because those optimizations are always bundled together - game optimized drivers don't go out piecemeal, they go out in one blob, containing various tweaks for various GPUs. Besides, I was talking about game development, not GPU drivers, which is typically a black box, with tweaks under the hood happening continuously with very little information given out.
Vya DomusWhere exactly was I contradicted ?
Hmm, maybe here? It would be nice if you actually read stuff, you know, rather than just throwing out come-backs like that. That kind of bad-faith arguing only serves to make you come off as immature and uninterested in an actual informed debate.
Vya DomusNo, sorry, no matter how hard you will try I wont adhere to your strange idea that somehow there isn't a direct link between absolute resolution and perceived resolution. If one increases so does the other.
Uhhh.... did I say that? Please show me where I said that. I'll give you $100 if you can. Seriously.
Let me quote myself and add some enhancements to spell it out to you:
ValantarI was talking about perceived resolution, which while partially subjective, is also determined by factors beyond absolute display resolution such as viewing distance
See that? That means that it's not only determined by resolution, which you have been arguing for two pages of posts here, but by that and other factors. Given that absolute resolution is one of the factors in effective resolution it would take some seriously flawed logic to think that changing the absolute resolution has no effect on the effective resolution, right?
Vya DomusWe are not debating which is the best all around machine here like you make it out to be. I always spoke strictly about resolution/visuals, if I drive two cars and one reaches a higher top speed then that's the car which is faster. Period. I don't know which is the better commuter, that's something you came up with.
No, we were debating your claim that with two console tiers you can no longer expect to
Vya Domusget a consistent experience no matter which consoles you cho[o]se
And ever since, you have been arguing that a drop in resolution is fundamentally incompatible with having a "consistent experience", while the rest of us have been arguing that absolute resolution is a quite small part of what constitutes the gaming experience, and arguably one of the most flexible parts of that whole, with other metrics like frame rate/smoothness and the ability to play the same games at a similar level of perceived quality mattering more than absolute resolution.
Vya DomusThere is nothing contextual about that, no matter the size of the pixels and pitch when viewed from the same distance the pixel gird is going to be more apparent on a 1080p panel. You simply can't disprove that, I am sure you'll try though.
Of course there is! The physical characteristics of your specific TV are part of the viewing context, as they are only valid for that model of TV and any others using the same panel. Pixel sizes, pitches and so on vary quite a lot between displays of the same resolution and size. And besides that, beyond a certain distance (depending on the size of the display, obviously), unless the 1080p panel is a particularly bad one with small pixels and huge gaps in between, the grid will be entirely invisible. At that distance there will likely still be a decently clear advantage in sharpness and clarity for a 4k display of the same size and overall image quality, but that advantage also drops off quickly as the distance increases. You're making it sound like the only 1080p displays you have looked at were downright terrible. Which might of course be the case - I have no idea. Or maybe you've only sat very close to them? I know for sure that I've used several 1080p displays and TVs on which I've never noticed any kind of screen door effect at normal viewing distances. I can see it clearly on my TV from 1m or less, but ... I don't watch TV from 1m or less. A 40" display at that distance is quite uncomfortable, after all, especially when viewing full-screen media.
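To put some rough numbers on the viewing distance argument, here's a small sketch in Python - the 40" 16:9 panel, the two distances and the ~1 arcminute acuity limit are just example assumptions, not measurements of any specific TV:

import math

ACUITY_LIMIT_ARCMIN = 1.0  # ~1 arcminute: rough limit of normal (20/20) visual acuity

def pixel_size_arcmin(diagonal_in, horizontal_px, distance_m, aspect=(16, 9)):
    # Angular size of a single pixel, in arcminutes, for a viewer at distance_m.
    aw, ah = aspect
    width_m = diagonal_in * aw / math.hypot(aw, ah) * 0.0254
    pixel_m = width_m / horizontal_px
    return math.degrees(math.atan2(pixel_m, distance_m)) * 60

# Example: 40" TV, 1080p vs 4k, at 1 m and at a more typical ~2.5 m couch distance
for distance in (1.0, 2.5):
    p1080 = pixel_size_arcmin(40, 1920, distance)
    p2160 = pixel_size_arcmin(40, 3840, distance)
    print(f"{distance} m: 1080p pixel ~{p1080:.2f}', 4k pixel ~{p2160:.2f}'")

# At 1 m a 1080p pixel (~1.6') is above the acuity limit, so the pixel structure can
# be visible; at ~2.5 m it drops to ~0.6', below the limit, and the grid disappears.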
#106
Colddecked
ValantarSee that? That means that it's not only determined by resolution, which you have been arguing for two pages of posts here, but by that and other factors. Given that absolute resolution is one of the factors in effective resolution it would take some seriously flawed logic to think that changing the absolute resolution has no effect on the effective resolution, right?

No, we were debating your claim that with two console tier you can no longer expect to

And ever since, you have been arguing that a drop in resolution is fundamentally incompatible with having a "consistent experience", while we others have been arguing that absolute resolution is a quite small part of what constitutes the gaming experience, and arguably one of the most flexible parts of that whole, with other metrics like frame rate/smoothness and the ability to play the same games at a similar level of perceived quality matter more than absolute resolution.
EXACTLY! The CPU running the actual game logic will be the same in the Series S, and so will the bandwidths (the SSD's, at least), so the experience will be extremely consistent. Look at it this way: I can have a 2070 Super or a 2080 Ti with my 9900K. I can play at ultra settings at 1440p at >60 fps with the 2070S, while with the 2080 Ti I can play at the same frame rates but at 4K. Of course it's better to play at 4K, but to say the experience is not consistent is super disingenuous.
#107
Vya Domus
ValantarFor someone arguing strict adherence to the value of absolute numbers
No, I am talking about absolute numbers only when it's the case, like for instance when I am talking about resolutions. To be honest the only meme lord here is you, I said most games don't have more than about 10 graphics options. You then went like "hEreS oNe wItH mOrE tHaN tHaT". Good on you man.
ValantarThat's because those optimizations are always bundled together - game optimized drivers don't go out piecemeal, they go out in one blob, containing various tweaks for various GPUs. Besides, I was talking about game development, not GPU drivers, which is typically a black box, with tweaks under the hood happening continuously with very little information given out.
What a strange concept, so there are per GPU optimizations but then they are bundled together ? How does that work ? Why would a shader, for instance, behave differently on GPUs of the same architecture ? They all share the same capabilities.
ValantarHmm, maybe here? It would be nice if you actually read stuff, you know, rather than just throwing out come-backs like that.
Genuinely can't tell what it is that you mean.
ValantarUhhh.... did I say that? Please show me where I said that. I'll give you $100 if you can. Seriously.
Let me quote myself and add some enhancements to spell it out to you:

See that? That means that it's not only determined by resolution, which you have been arguing for two pages of posts here, but by that and other factors. Given that absolute resolution is one of the factors in effective resolution it would take some seriously flawed logic to think that changing the absolute resolution has no effect on the effective resolution, right?
And what is it that you are doing exactly ? You keep arguing for the past couple of pages that higher resolutions somehow don't always mean better visuals, that it's about perceived and effective resolution, it's about this and the other and that it's all subjective.

Just stop already, if you think that you are going to convince me that more pixels don't translate to better image quality you are wasting your time.
ValantarOf course there is! The physical characteristics of your specific TV are part of the viewing context, as they are only valid for that model of TV and any others using the same panel. Pixel sizes, pitches and so on vary quite a lot within displays of the same resolution and size. And besides that, beyond a certain distance (depending on the size of the display, obviously), unless the 1080p panel is a particularly bad one with small pixels and huge gaps in between, the grid will be entirely invisible. At this distance there will likely still be a decently clear advantage in sharpness and clarity for a 4k display of the same size and overall image quality, but that advantage also drops off quickly once the distance increases. You're making it sound like the only 1080p displays you have looked at were downright terrible. Which might of course be the case - I have no idea. Or maybe you've only sat very close to them? I know for sure that I've used several 1080p displays and TVs that I've never noticed any type of screen door effect on at normal viewing distances. I can see it clearly on my TV from 1m or less, but ... I don't watch TV from 1m or less. A 40" display at that distance is quite uncomfortable, after all, especially when viewing full-screen media.
1080, 4K, they are all the same because you can just make the pixels bigger, got it. Who would have thought it's that easy ? What can I say, me and a couple million other people must be a bunch of blind idiots.
#108
hat
Enthusiast
I think this is good. Like PC, console gamers now have a choice to spend more on the best or spend less on an okay machine, to a degree. That said, I find the comments that tiered consoles are "harder to dev for" laughable. Imagine how hard it must be to dev a PC game with thousands of different possible configurations?
#109
Valantar
Vya DomusNo, I am talking about absolute numbers only when it's the case, like for instance when I am talking about resolutions. To be honest the only meme lord here is you, I said most games don't have more than about 10 graphics options. You then went like "hEreS oNe wItH mOrE tHaN tHaT". Good on you man.
Okay, so not conducting a study large enough to produce representative numbers is somehow a worse offense than tossing out baseless off-topic ad hominems trying to undermine the credibility of the person you are arguing against? Got you.

Oh, and since you apparently need more, here's the count from the first ten games I have installed in my Steam library. Again, not representative (though I would say it covers a broad spectrum of games, only really missing very new AAA games), but nonetheless it provides far more relevant insight than your entirely anecdotal "mostly less than five and maximum about 10":
Deus Ex: Human Revolution: 11
Divinity: Original Sin: 13
Life is Strange: 9
Rocket League: 14
Among the Sleep: 7, one of which is an otherwise unlabeled "quality" toggle
Rage 2: 19
No Man's Sky: 16
Prey: 11
Everything: 1 (resolution)
Giana Sisters: Twisted Dreams: 7

From my quick napkin math, that places most graphically intensive games (No Man's Sky, Rage, Prey, Deus Ex) at a minimum/baseline of "about 10" (your "maximum", for reference), with indies, platformers and the like fluctuating below, but even broad-appeal esports games like Rocket League can exceed 10 settings easily. One game had less than five settings. You're of course welcome to contest this, but the very least you have to do at that point is to provide some examples of comparable games with far fewer options. Even a non-representative selection of games is more data than a purely anecdotal statement, after all.
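Summarizing those counts (again, just the ten games I happen to have installed, so not a representative sample):

from statistics import mean, median

# The graphics-settings counts listed above
counts = {
    "Deus Ex: Human Revolution": 11,
    "Divinity: Original Sin": 13,
    "Life is Strange": 9,
    "Rocket League": 14,
    "Among the Sleep": 7,
    "Rage 2": 19,
    "No Man's Sky": 16,
    "Prey": 11,
    "Everything": 1,
    "Giana Sisters: Twisted Dreams": 7,
}

values = list(counts.values())
print(f"mean {mean(values):.1f}, median {median(values)}")                        # mean 10.8, median 11
print(f"more than 10 settings: {sum(v > 10 for v in values)} of {len(values)}")   # 6 of 10
print(f"fewer than 5 settings: {sum(v < 5 for v in values)} of {len(values)}")    # 1 of 10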
Vya DomusWhat a strange concept, so there are per GPU optimizations but then they are bundle together ? How does that work ? Why would a shader for instance behave differently on GPUs of the same architecture, they all share the same capabilities.
Ah, yes, because the only optimizations possible are to shader performance or other things that are generally applicable to an architecture. Of course. How about tuning the VRAM usage so that it fits within the frame buffer of the most popular GPUs at each resolution? How about tuning the auto-settings algorithm so that it sets the best geometry/texture/shader settings for the particular balance of features your specific GPU has? All of this is very clearly done, yet never documented, and of course it is tweaked over time.
Vya DomusGenuinely can't tell what is that you mean.
Well, that's too bad. I would suggest rereading, but seeing how I've already done that, I'm at a loss. My points are quite clear.
Vya DomusAnd what is it that you are doing exactly ? You keep arguing for the past couple of pages that higher resolutions somehow don't always mean better visuals
Yes!
Vya Domusthat it's about perceived and effective resolution
Yes!
Vya Domusit's about this and the other
Depends on what you mean, but sure.
Vya Domusand that it's all subjective.
.... no. Not whatsoever. Again: I have very specifically been arguing about non-subjective factors. Viewing distance and display size are not subjective factors, they are inherent factors that affect all viewing situations the same way, and as such they are intrinsic to the perception of resolution. There is nothing at all subjective about this, as there is no way to perceive resolution independently of distance or display size.
Vya DomusJust stop already, if you think that you are going convince me that more pixels don't translate to a batter image quality you are wasting your time.
Well, that's your loss. If you're not willing to accept that more pixels don't translate to better image quality when you can't tell that there are more pixels in the first place, then you have apparently hit some sort of mental block. I have never claimed that this isn't true when viewed at distances where one can actually tell the difference, yet you keep arguing that lower resolution will always result in a worse play experience. This is, put simply, just not true. Even with effective resolution there are significantly diminishing returns as you go higher on the scale. And there always will be.
Vya Domus1080, 4K, they are all the same because you can just make the pixels bigger, got it. Who would have thought it's that easy ? What can I say, me and a couple of other million people must a bunch of blind idiots.
... and here we go again - rather than actually arguing a point in a reasonable and polite way, you're presenting a ridiculous caricature to yell at. 4k resolution clearly and obviously has its value in situations where display size and viewing distance make the increase in sharpness and detail perceptible over lower resolutions. It is of course especially valuable at distances where one might experience a screen door effect at lower resolutions, though as I've said I've never come across that in an actually realistic usage scenario. You're welcome to provide an example of a realistic usage scenario to contradict that, but so far you haven't. The fact remains that at normal TV sizes and normal TV viewing distances, the perceived quality difference between a 1080p panel and a 4k panel of the same quality (colors, contrast, etc.) is so close to zero as to barely matter. Is there a perceptible difference? Sure, and if you're attuned to such differences, you're more likely to notice them (the 1080p panel might for example feel vaguely less sharp), but in common usage for people who aren't explicitly looking for these things, and are instead for example focusing on playing a game, the difference is so small as to be irrelevant. This of course changes if your TV grows or your viewing distance shrinks, but that is clear from how effective resolution works, and in no way contradictory to anything I've said.

The fact of the matter is, the XSS will likely be an excellent next-gen console option for the millions of people out there who don't (yet) have 4k TVs, and will serve them well (with a perceptible step up in quality due to 1440p rendering and upscaling) if they do upgrade later. And it will just as likely provide an excellent gaming experience that is entirely consistent with that of the XSX, despite the lower resolution.
#110
Vya Domus
ValantarYou're of course welcome to contest this, but the very least you have to do at that point is to provide some examples of comparable games with far fewer options.
Of course I can if I want to keep dragging this dumb debate forever like you seem to be dead set on for some reason. I could for instance include the plethora of indie games that have few or no visual settings at all outside resolution and then the typical figure would plummet well below 10 or maybe even 5. I didn't think of those, my figure was biased towards AAA games.

I haven't looked up every game that has ever existed to count how many graphics options exist on average, and neither did you. I simply gave a ballpark figure, it's going to be less or more than that in reality, who the hell cares, will you ever give up ?
ValantarAh, yes, because the only optimizations possible are to shader performance or other things that are generally applicable to an architecture. Of course. How about tuning the VRAM usage so that it fits wihtin the frame buffer of the most popular GPUs at each resolution? How about tuning the auto-settings algorithm so that it sets the best geometry/texture/shader settings for the particular balance of features your specific GPU has? All of this is very clearly done, yet never documented, and of course it is tweaked over time.
The most popular GPUs are Intel's integrated graphics, which have dynamic memory allocation. So you might want to rethink that.

I have some insight into the game development world and I can tell you with certainty that no one is ever optimizing anything with regard to algorithms or whatever to scale up based on each individual GPU, you can take my word for it or not. That'd be insane, not feasible and really dumb.

The closest thing to a "per GPU optimization" is when they leave the compilation of the shaders "online", meaning they get compiled sometime at run time of the application, which means the driver component that does the translation can use the latest optimizations for the GPU currently in use. You can see some games doing that when you boot them up for the first time, but it's exceedingly rare. Why ? Because they just don't bother with it, per GPU optimizations are a waste of time.

The reason it's not documented is that no one is doing it.
ValantarIf you're not willing to accept that more pixels don't translate to better image quality
You bet I won't ever be able to accept that, it's an assault on my common sense, let alone any other objective measure out there.
Valantar... and here we go again - rather than actually arguing a point in a reasonable and polite way, you're presenting a ridiculous caricature to yell at.
You are trying to convince me that bigger pixels and viewing distance can somehow make up for the colossal difference in density, which is straight up nonsense, the pixel grid doesn't scale like you think it does. If you make the pixels bigger then the individual sub-pixels become more apparent as well and you get nowhere, you just can't get away from the lower density. It's always going to be inferior.

No one buys a 1080p TV over a 4K one thinking to themselves : "Man, you know what ? I'll just sit twice as close to the TV and it will just be the same thing right ?".

So yes, caricatures are all I have left.
#111
Valantar
Vya DomusOf course I can if I want to keep dragging this dumb debate forever like you seem to be dead set on for some reason. I could for instance include the plethora of indie games that have few or no visual settings at all outside resolution and then the typical figure would plummet well below 10 or maybe even 5. I didn't think of those, my figure was biased towards AAA games.
Ah, okay, so when you're arguing that the XSS will provide an inferior gaming experience to the XSX due to its lower resolution you were specifically talking about non-graphically intensive indie games? That's a good one. I mean, if that's where you're aiming, that console can likely run most if not all of those at native 4k if MS allows it.
Vya DomusI haven't looked up every game that has ever existed to count how many graphics option exist on average and neither did you. I simply gave a ball park figure, it's going to less or more than that in reality, who the hell care, will you ever give up ?
Not until you accept that your ballpark figure was wildly inaccurate.
Vya DomusMost popular GPUs are Intel's integrated graphics which have dynamic memory allocation. So you might want to rethink that.
Ah, yes, that's the most popular GPU for playing even mildly graphically intensive games. Sure. I am entirely aware that those iGPUs outnumber all other GPUs by something like 10:1, but that doesn't change the fact that I said GPUs, plural. Which can of course be interpreted in various ways, but in this context the obviously most relevant interpretation is "the most popular GPUs for at least mildly graphically intensive gaming", which then means something like the following: the GTX 1060 for Nvidia, the RX 480/580 for AMD, the 960/970 before that for Nvidia, and to a lesser degree newer models like the GTX 1660 series. And in this case, optimizing a game for these GPUs during development would mean things like ensuring that the game plays well on these GPUs at popular resolutions - you know, providing a consistent experience. And while that doesn't typically happen on a per-GPU level - it's done by aiming for what looks good and allowing for some upwards and downwards scaling - the absolute performance of these popular (specific) GPUs obviously influences where those goal posts are placed, and often also just how much upwards or downwards scaling is allowed.

You keep trying to shift the goal posts, but it's not getting you anywhere. Remember, this is about discussing a next-generation console. People generally don't buy those for playing games that are playable on their laptop iGPU. Most of the point of next-gen consoles is next-gen graphics.
Vya DomusI have some insight in the game development world and I can tell you with certainty that no one is ever optimizing anything with regards to algorithms or whatever to scale up based on each individual GPU, you can take my word or not for it. That'd be insane, not feasible and really dumb.
Okay, so the auto settings systems in games just make wild guesses, not at all based on the hardware in your system? Got it.
Vya DomusYou bet I wont ever be able to accept that, it's an assault on my common sense, let alone any other objective measure out there.
And the way you keep quoting me out of context continues to be a plain-faced bad-faith arguing style that just demonstrates that you aren't even remotely looking for a reasonable debate. There really isn't much common sense in that. Here's the full quote, since you apparently need it:
ValantarIf you're not willing to accept that more pixels don't translate to better image quality when you can't tell that there are more pixels in the first place
See? I never said that "more pixels don't translate to better image quality" without any caveats, like you're trying to make it out as if I did. And you still haven't provided even a single argument or data point against the fact that at any fixed display size, the effect of absolute resolution on perceived resolution drops as viewing distance increases. So again, please stop making bad-faith misrepresentations of what I'm saying to derail the debate and make me look bad - it's backfiring, badly.
Vya DomusYou are trying to convince me that bigger pixels and viewing distance can somehow make up for the colossal difference in density, which is straight up nonsense, the pixel gird doesn't scale like you think it does. If you make the pixels bigger then the individual sub-pixels become more apparent also and you get nowhere, you just can't get away from the lower density. It's always going to be inferior.
What? I never said that. I said that outside of apparent edge cases like you seem to be describing, at normal TV distances and panel sizes the difference in quality between a good 1080p panel and a good 4k panel is quite small, typically to such a degree that it doesn't matter, and that as distance increases, the difference ultimately disappears entirely.
Vya DomusNo one buys a 1080p TV in the detriment of a 4K one thinking to themselves : "Man, you know what ? I'll just sit twice as close to the TV and it will just be the same thing right ?".
... well, no, nobody does that, as that is exactly the opposite of how this works. If you're sitting at half the distance, you'll notice absolute resolution to a much higher degree.

Besides, people's purchase decisions are rarely that rational.
Vya DomusSo yes, caricatures are all I have left.
And that's the problem. I would recommend you take a step back and calm down a tad. Maybe try approaching this as, you know, a reasonable debate? Leave the bad-faith arguments, derailing techniques and (seemingly) purposeful misreadings at the door, and we might get somewhere.
#112
Vya Domus
ValantarNot until you accept that your ballpark figure was wildly inaccurate.
I am going to admit that my ballpark figure was probably mildly inaccurate. Unless you do a survey of thousands of games from all categories you won't be able to claim a very accurate figure either, no matter how hard you try. Sorry.
ValantarAh, okay, so when you're arguing that the XSS will provide an inferior gaming experience to the XSX due to its lower resolution you were specifically talking about non-graphically intensive indie games?
Are you sure it's me who is derailing these things ? I am pretty sure we were talking about PC games' graphics options, not the XSX.
ValantarOkay, so the auto settings systems in games just make wild guesses, not at all based on the hardware in your system? Got it.
The "auto-settings" thingy just sets the graphical options to some specific preset, usually based on some basic criteria like available memory. They don't auto tune algorithms or whatever else you think they do, so it has absolutely nothing to do with applying per GPU optimizations of any kind.
ValantarAnd the way you keep quoting me out of context continues to be a plain-faced bad-faith arguing style that just demonstrates that you aren't even remotely looking for a reasonable debate. There really isn't much common sense in that. Here's the full quote, since you apparently need it:
I only quote what I feel is important, and again, I kind of stopped arguing some time ago on some matters because I can't go on forever repeating the same things.
#113
Valantar
Vya DomusI am going to admit that may ballpark figure was probably mildly inaccurate. Unless you do a survey of thousands of games from all categories you wont be able to claim a very accurate figure either no matter how hard you try. Sorry.
Well, at least that's some movement. Great! The first sign of improvement so far.
Vya DomusAre you sure it's me who is derailing these things ? I am pretty sure we were talking about PC games graphics option not XSX.
Uhhh... we got into PC graphics settings through talking about the necessity of optimizing for two rather than one performance tier for consoles. This is getting the discussion back on track, returning to the original point after a digression, not derailing it. This topic is about the XSS, after all.
Vya DomusThe "auto-settings" thingy just sets the graphical options to some specific preset, usually based on some basic criteria like available memory. They don't auto tune algorithms or whatever else you think they do, so it has absolutely nothing to do with applying per GPU optimizations of any kind.
Available memory? Do you mean VRAM? I certainly hope so, 'cause there's no game out there that adjusts graphics settings based on system RAM alone. And you seem to be missing the point here entirely: applying optimized settings for a certain GPU - a system that is definitely not tested for all available GPUs, but definitely is tested for the most popular ones for any game with a QC budget - is optimizing for that GPU, especially when one factors in the more low-level tweaks done to hit baseline performance targets that are themselves determined by - again! - the most popular GPUs. There are tons of games out there where developers aim for specific performance levels on specific GPUs - such as 1080p60 high on a GTX 1060, or 4k60 Ultra with RTX On on a 2080 Ti - from relatively early on in development, and while they obviously don't always hit those targets, there are nonetheless many specific optimizations made to tune for these specific performance levels, which are - again - determined by specific GPUs.
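To illustrate what I mean - and to be clear, this is a purely hypothetical sketch, not any real engine's actual logic; the GPU names and thresholds are made up for illustration - an auto-settings routine necessarily keys off the detected hardware:

# Hypothetical auto-settings sketch. Explicit entries cover the popular cards a QA
# team would actually test on; everything else falls back to a coarse VRAM heuristic.
PRESET_BY_GPU = {
    "GeForce GTX 1060": "High (1080p)",
    "Radeon RX 580": "High (1080p)",
    "GeForce RTX 2080 Ti": "Ultra (4K)",
}

def pick_preset(gpu_name, vram_gb):
    if gpu_name in PRESET_BY_GPU:
        return PRESET_BY_GPU[gpu_name]
    if vram_gb >= 8:
        return "High (1440p)"
    if vram_gb >= 4:
        return "Medium (1080p)"
    return "Low (720p)"

print(pick_preset("GeForce GTX 1060", 6))        # High (1080p)
print(pick_preset("Intel UHD Graphics 620", 1))  # Low (720p)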
Vya DomusI only quote what feel is important, again, I kind of stopped arguing some time ago on some matters because I can't go on forever repeating the same things.
"What you feel is important." Ah. That's ... rather telling. What is important to you is, apparently, to not read an argument in full, but instead dismiss it before finishing reading the sentence, and then portray a misguided caricature of it that you can yell at. Again: you're really not coming off as someone applying much common sense here. I mean, the core of your "absolute resolution is always important" argument must be a fundamental denial of the existence of visual perspective, otherwise it is a logically impossible stance. Things further away look smaller, and smaller things are more difficult to make out details in. Thus, if two displays of the same size and overall image quality but different resolutions are moving gradually away from someone looking at them, the perceptible difference in resolution will gradually diminish until it is entirely imperceptible, at which point the perceived resolution of both is equal. This is what you are arguing against, and as this debate has shown that I really need to repeat myself: by arguing against this, you are arguing against the existence of visual perspective.