You are right, but DCI refers to movie projection standards like DCI 2K and DCI 4K. The standards for monitors and gaming are different. 1080p, aka FHD, can be considered 2K, but under the gaming convention it is not: it is called FHD, and the "2K" label in the gaming arena belongs to 2560x1440, known as QHD or, as some people call it, 2K. 1920x1080 is a separate resolution from the DCI 2K standard; it has been addressed as such and is not recognized as DCI 2K under that standard.
Given that I actually linked to a description of the DCI standards, it would be reasonable to assume that I know the industry and use cases they stem from, no? And didn't my post make it quite clear that DCI 2K and 1080p are different, just like DCI 4K and UHD 4K are different? I never claimed the two were the same, just that the closest consumer resolution to the only resolution
actually named 2K is 1080p.
Beyond that: No. There is no consumer resolution that has ever been named 2K. In fact, there isn't actually any consumer resolution that is officially named 4K either, as the official term for 3840x2160 is UHD (or UHD 4K to differentiate it from DCI 4K). The term 4K has been "adopted" from the DCI terminology. 2K as a name for 1440p is marketing nonsense made up by some monitor brand wanting to sell its monitors as "similar to 4K, but less; still better than 1080p". Apparently "1440p" is too complicated for them. There is absolutely no consumer resolution officially named 2K in any capacity, and if there were to be one, we should follow the "standard" set by UHD 4K and apply the name to the consumer resolution closest to the DCI resolution of that name, which would be 1080p. 1440p has zero reason to be called "2K".
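For what it's worth, the "closest consumer resolution" point is easy to check by just comparing pixel counts. A minimal sketch (the resolution values are the standard pixel dimensions; the closest() helper is purely illustrative, not part of any standard):

```python
# Compare DCI container resolutions against common consumer resolutions
# by total pixel count, and report the nearest consumer match.

DCI = {"DCI 2K": (2048, 1080), "DCI 4K": (4096, 2160)}
CONSUMER = {
    "FHD 1080p": (1920, 1080),
    "QHD 1440p": (2560, 1440),
    "UHD 2160p": (3840, 2160),
}

def pixels(res):
    w, h = res
    return w * h

def closest(dci_res):
    # Consumer resolution whose pixel count is nearest the DCI one.
    return min(CONSUMER, key=lambda name: abs(pixels(CONSUMER[name]) - pixels(dci_res)))

for name, res in DCI.items():
    print(f"{name} ({res[0]}x{res[1]}) -> nearest consumer resolution: {closest(res)}")
```

Running this shows DCI 2K lands nearest 1080p and DCI 4K nearest UHD, which is exactly the pattern the UHD 4K naming already follows.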
Monitor resolutions are officially denominated by their pixel dimensions (1920x1080, 2560x1440, etc.) or by abbreviated names (EGA, VGA, XGA, FHD, QHD, WQHD, etc.). TV and broadcast resolutions used to be named for the number of horizontal lines in the image and whether they were interlaced or progressive-scan (480p, 480i, etc.). That scheme was used until HDTVs hit the market and "HD" naming entered the scene (HD, FHD, UHD), though there was a lot of overlap there (768i/p was rather uncommon, but 1080i/p were often used to communicate more clearly that a set was a real HDTV). "K" naming got adopted because the difference between "HD", "Full HD" and "Ultra HD" is pretty vague and fails to communicate that there are major differences, leading TV manufacturers to start calling UHD "UHD 4K" instead. Monitor and TV naming has somewhat blended together since the introduction of "K" naming, despite this being entirely unofficial: there are no actual standards for it beyond the DCI standard, and that can't be used to retroactively rename a resolution, especially not one that doesn't fit any reasonable criteria for the name.
Just because a bunch of gamers have adopted a misused marketing term to mean something it doesn't, that doesn't make the meaning correct.