I feel that might be way overblown; linear subpixel layouts like those on IPS panels are also not perfectly square and suffer from the same problem on anything that's not a straight horizontal/vertical line. That gets mitigated with antialiasing and higher PPI.
Higher PPI can alleviate that, but antialiasing can't - unless the pixel density is there, all AA does is make things blurry rather than jagged. And jagged text is often more readable than blurry text - but this seems to deliver both.
My phone, like most phones today, uses a 1080p pentile OLED. I certainly don't spend as much time reading from it as I do from a computer monitor, but I still see no difference from my previous phone with a 1080p IPS (the differences I do see are the amazing contrast and colour and the higher refresh rate).
Your phone is also drastically higher PPI (which is partly, but not entirely, balanced out by it being viewed much closer), and crucially: it doesn't run Windows. Android has a ton of optimizations in text rendering and its graphics stack in general to make pentile AMOLEDs look good, thanks to Samsung contributing heavily to the Android kernel and driver stack for that specific purpose (so that they can sell pentile display panels). Windows doesn't, and is rather notorious for handling non-RGB-stripe subpixel layouts poorly (which is also exacerbated by the low pixel densities of most PC monitors). ClearType helps, but is by no means a fix.
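To put the ClearType point in slightly more concrete terms, here's a toy sketch (my own simplification, not anything from Windows' actual rendering pipeline) of why subpixel antialiasing that assumes a horizontal R-G-B order ends up driving the wrong emitters on any other layout:

```python
# Toy model: a dark edge covers the left `coverage` fraction of one white pixel.
# ClearType-style subpixel AA splits that coverage across the pixel's three
# subpixels, assuming they are ordered R, G, B from left to right.

def subpixel_darkness(coverage):
    """Per-subpixel darkness (0..1), assuming left-to-right R, G, B order."""
    values = []
    for i in range(3):                       # subpixel i spans [i/3, (i+1)/3) of the pixel
        left, right = i / 3, (i + 1) / 3
        overlap = max(0.0, min(coverage, right) - left)
        values.append(round(overlap * 3, 3)) # normalize to the subpixel's own width
    return values

r, g, b = subpixel_darkness(0.5)             # edge halfway across the pixel
print("intended R/G/B darkness:", r, g, b)   # 1.0, 0.5, 0.0 - a smooth ramp across the pixel

# If the physical order is anything other than R-G-B (BGR, or a pentile/triangular
# arrangement), those same channel values land on the wrong emitters: the ramp
# runs the wrong way across the pixel and reads as colour fringing, not extra sharpness.
print("same values on a B-G-R panel, left to right:", b, g, r)  # 0.0, 0.5, 1.0
```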
Feelings (like my own) don't matter though; is there concrete data on this that goes either way? (I'm not talking about pixel peeping - I mean something like a study with people comparing text, for example.)
This is a contradiction in terms - any such study would literally just be an aggregation of a group of people's "feelings" (i.e. sensory impressions and interpretations of those) about sharpness (likely defined in some more or less specific way, but still subjective), and a subsequent analysis trying to find patterns, trends, idiosyncrasies, etc. Sensory perception is fundamentally subjective, and there is nothing that can meaningfully be understood as "objective data" in relation to it. (Heck, one can question whether there is even such a thing as objective truth, though that's another debate entirely.) What is great for you might be unacceptable to your neighbour, and the factors affecting any such distinction are far too complex to map out across any significant number of study participants unless you really narrow the scope of the research. You can always find general trends, but those will never invalidate the experiences of those who see things differently - they will simply be descriptions of the more common experiences, and can't be meaningfully generalized. If you read (about) a study making a claim like this as if its findings were objective truth, then you are fundamentally misunderstanding how the science works.
Put it this way: the "concrete data" is the subpixel structure and how it affects rendering. Comparing RGB stripe to this triangular arrangement (assuming a conventional panel orientation), RGB stripe allows for regular, uninterrupted vertical stripes and regular, easily predictable interruptions in horizontal stripes - anything else gets spaced out depending on the specifics of the line's angle, etc. The triangular layout, on the other hand, can only build a straight line from two subpixels at a time, which is an inherent deficit of that particular structure compared to RGB stripe.
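To illustrate the geometry rather than just assert it, here's a deliberately crude sketch (treating the triangular layout as rows of subpixels offset by half a pitch - an approximation of mine, not an actual panel spec) of how often a perfectly vertical line actually lands on a subpixel in each arrangement:

```python
# Crude geometry check: over 12 rows, how many rows have a subpixel centre sitting
# exactly on a vertical line at x = 1.0 (positions in subpixel-pitch units)?

ROWS, TOL = 12, 1e-9

def stripe_row(_row):
    # RGB stripe: the same subpixel columns repeat identically in every row.
    return [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]

def triangle_row(row):
    # Triangular/delta-style layout: alternate rows shifted by half a pitch,
    # so subpixel centres don't stack vertically from one row to the next.
    shift = 0.5 if row % 2 else 0.0
    return [x + shift for x in (0.0, 1.0, 2.0, 3.0, 4.0, 5.0)]

def rows_on_line(layout, x=1.0):
    return sum(any(abs(sx - x) < TOL for sx in layout(r)) for r in range(ROWS))

print("RGB stripe rows on the line:", rows_on_line(stripe_row))    # 12 of 12 - an unbroken column
print("triangular rows on the line:", rows_on_line(triangle_row))  # 6 of 12 - every other row only
```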
The question is whether this is perceptible. And that is dependent on the use case (word processing or gaming, for example), viewing distance, PPI, the tuning of the display, GPU driver and OS, the user's visual acuity, habits and preferences, the brightness relative to ambient brightness, the display coating, whether the user's eyes are tired or rested, and a bunch of other factors that can't really be eliminated or factored in without also limiting the scope of the study. So: what you're asking is ultimately entirely subjective, and the best anyone can do is attempt to give advice consciously and explicitly situated in their own habits, preferences and experiences. No experience is universal, but with sufficient self-reflection one can make an attempt at extrapolating from experience - though this will of course always be speculative.
In this thread I've only made statements regarding my own use case and preferences, for which this seems poorly suited - my work requires hours every day of looking at text on a white background, and I am sensitive to sharpness issues (tired eyes, headaches). Thus this would seem like a particularly poor choice for me. I've never made any claim to this being applicable outside of these parameters.
LG's OLED emitters are white, Samsung's are blue. Blue still burns out faster. But it's not clear whether, with QDs in front of them, Samsung's blue emitters need to be driven as hard as LG's white ones.
Like I said: there are many unknowns and an award badge does little to clear up any of them.
There is no such thing as a white OLED emitter. LEDs, including organic ones, emit relatively narrow spectra of light, and are thus incapable of natively producing white light. LG's WOLED emitters are, AFAIK, blue with a phosphor coating that transforms the blue light into a broad spectrum, i.e. white.