# At 4K resolution, do we still need AA?



## silapakorn (Jun 26, 2014)

So I stumbled upon this review about gaming @4K: http://www.vmodtech.com/en/article/vmodtech-ultimate-gaming-project-4k-battlefield
It's not entirely in English, but you can look at pics and graphs.

They maxed out the settings in all games at 4K resolution on the highest-end graphics cards to date. The results are very interesting.

They got 40-50 fps at max settings (with AA on, of course). So I imagine that if we turn off AA, we should be able to get 60 fps with no sweat.

So here is my question: do we still need AA at 4K resolution? Since 4K is 4x 1080p, that would equal 4xMSAA, right? I mean, the image quality shouldn't be that much different.

Anyone with a 4K monitor, can you show me screenshots of 4xAA vs. no-AA in-game images?
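
The 4x arithmetic in the question checks out on pixel count alone. Here's a sketch (plain Python, nothing from the linked review), with the caveat that 4x the pixels behaves like 4x ordered-grid SSAA only when the 4K frame is downscaled to a 1080p display, and MSAA samples only geometry edges, so the comparison is approximate:

```python
# Pixel-count comparison between 4K UHD and 1080p.
# Caveat: 4x the pixels is equivalent to 4x ordered-grid supersampling
# (SSAA) only if the 4K frame is downscaled to a 1080p display; MSAA is
# cheaper because it takes extra samples only at geometry edges.

def pixels(width, height):
    return width * height

uhd = pixels(3840, 2160)   # 8,294,400 pixels
fhd = pixels(1920, 1080)   # 2,073,600 pixels

print(uhd // fhd)  # 4: a 4K frame renders exactly 4x the pixels of 1080p
```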


----------



## Durvelle27 (Jun 26, 2014)

No, we do not


----------



## Scrizz (Jun 26, 2014)

lol that's the same thing they said about 1080p


----------



## AsRock (Jun 26, 2014)

What you need and what you want are different things lol. So do you need it? No. Do you want it? Well, that's preference.


----------



## Mussels (Jun 26, 2014)

it entirely depends on the size of the screen.

a 4K 24"? you won't see it

a 46"? you'd see it.



it really comes down to the fact we get used to a certain quality - when i started gaming, 1024x768 was awesome. then 1280x1024, then WIDESCREEEEEEEN blew us away, then 720p did nothing, and 1080p blew us away again... and 1080p with AA held us over until 4k came out.


----------



## Solaris17 (Jun 26, 2014)

Scrizz said:


> lol that's the same thing they said about 1080p



This


----------



## AphexDreamer (Jun 26, 2014)

There's still no way to downsample on AMD cards with the latest drivers, is there?

There is http://blog.metaclassofnil.com/?tag=gedosato but it only works in DX9 games.

And even with some DX9 games it won't work or needs tweaking.


----------



## Durvelle27 (Jun 26, 2014)

AphexDreamer said:


> There still is no way to downsample on AMD cards with the latest drivers is there?
> 
> There is http://blog.metaclassofnil.com/?tag=gedosato but it only works in DX9 games.
> 
> And even with some DX9 games it will not work or needs tweaking


CRU works for me


----------



## GhostRyder (Jun 26, 2014)

Like already stated it's not so much a need as a want.

Does it make a difference?  Yes it does, but I would not call it night and day.

The reality is that it will make a difference, but it may be harder to notice.


----------



## AphexDreamer (Jun 26, 2014)

Durvelle27 said:


> CRU works for me



Says it won't scale down higher resolution? Does it? 

http://www.monitortests.com/forum/Thread-Custom-Resolution-Utility-CRU


----------



## natr0n (Jun 26, 2014)

Some things will always have aliasing eventually.

Also, OCD makes people look for reasons to use it.


----------



## Durvelle27 (Jun 26, 2014)

AphexDreamer said:


> Says it won't scale down higher resolution? Does it?
> 
> http://www.monitortests.com/forum/Thread-Custom-Resolution-Utility-CRU


It allows me to run a higher resolution than my monitor actually supports.


----------



## RCoon (Jun 26, 2014)

Mussels said:


> it entirely depends on the size of the screen.



This. AA is pointless on a 24" 4K screen. The bigger the screen, the lower the PPI, the easier jaggies are to see, and the more important AA is. That's why smartphone interfaces look so smooth: 1080p at such a high PPI makes everything look vectorised and super smooth.

Imagine taking an image from a 4.8 inch 1080p phone screen, then upscaling that image to a 40 inch 1080p TV while keeping the content the same. Things are going to get jaggy.
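
The screen-size point above is just geometry. A minimal sketch (the sizes are examples from this thread; the formula is the standard diagonal-PPI calculation):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count over diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 4.8)))   # ~459 PPI: 1080p phone, no visible jaggies
print(round(ppi(1920, 1080, 40)))    # ~55 PPI: 1080p 40" TV, jaggies obvious
print(round(ppi(3840, 2160, 24)))    # ~184 PPI: 24" 4K
print(round(ppi(3840, 2160, 46)))    # ~96 PPI: 46" 4K
```

Same 1080p resolution, nearly an order of magnitude apart in density, which is why the phone-to-TV upscale described above gets jaggy.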


----------



## Shambles1980 (Jun 26, 2014)

CRU kind of works...
But if you have a 1680x1050 max-res monitor, you will be lucky to get it to 1080p (1920x1080 @ 60Hz).
Chances are you will be able to get it to 1080i (1920x1080 @ 30Hz interlaced). That's all good and nice, but to me 60Hz 720p is nicer to use than 30Hz interlaced 1080i. The 30Hz interlaced makes things kind of flicker, and it takes a while for your eyes to get used to it (desktop, web pages, text documents and so on).
Then you play a game or a video, and that's just fine, no flickering. But the first thing when you close the game, you're back to the flickering and have to let your eyes adjust to it.

I'm forced to use 1080i due to my GPU having ridiculous tearing on the desktop with 2D clocks in use. Funnily enough there is no tearing using 30Hz interlaced, so that's what I have to do until the BIOS gets modified.

I don't use CRU now, but I was using it to try and find a non-interlaced resolution and refresh rate on multiple monitors that stopped the tearing. Never actually found one. But CRU does work, provided you keep your monitor within a reasonable range it can work at, which is usually more about the refresh rate at the resolution than the resolution itself.


----------



## vega22 (Jun 26, 2014)

Mussels said:


> it entirely depends on the size of the screen.
> 
> a 4K 24"? you wont see it
> 
> a 46"? you'd see it.



28 inch 1440/1600 you don't need it.
24 inch 1200 you don't need it.

40 inch 1080 you do.
30 inch 720 you do.

i mean come on guys, 1080 is a fucking phone res today...


----------



## a_ump (Jun 26, 2014)

marsey99 said:


> 28 inch 1440/1600 you dont need it.
> 24 inch 1200 you dont need it.
> 
> 40 inch 1080 you do.
> ...



Speaking of which, so is 2560x1600, or at least the next-gen high-end smartphones are going Quad HD (LG G3). Here's what I'm dying to know: if every high-mainstream to top-notch smartphone can have 1080p or even 1600p, why are PC monitors still stagnant? I mean, come on, a 4.8-5.5" smartphone gets 1600p/1080p yet we can't have 17" 1080p monitors? 22" 1600p monitors? I really don't understand it. And with it becoming so common, 1600p monitors ought to cost $200 tops.


----------



## Champ (Jun 26, 2014)

I say no. I went to Best Buy and saw a 4K 55" with 4K content, and it was the sharpest thing I've ever seen. It has to be seen in person to properly understand.


----------



## Shambles1980 (Jun 26, 2014)

640x480 for the win !


----------



## repman244 (Jun 26, 2014)

a_ump said:


> Speaking of which, so is 2560x1600, or at least the next-gen high-end smartphones are going Quad HD (LG G3). Here's what I'm dying to know: if every high-mainstream to top-notch smartphone can have 1080p or even 1600p, why are PC monitors still stagnant? I mean, come on, a 4.8-5.5" smartphone gets 1600p/1080p yet we can't have 17" 1080p monitors? 22" 1600p monitors? I really don't understand it. And with it becoming so common, 1600p monitors ought to cost $200 tops.



Because it's not that easy to make a big screen with a high resolution; it's not as simple as making the pixels bigger, as people often think.


----------



## lexluthermiester (Jun 26, 2014)

I haven't used AA since buying a 1200p [1920x1200] screen some years ago. The visual difference is not big, but the performance difference is [depending on the card]. And at 3840x2160 you will not see the pixel laddering at all, even with a larger screen [40"+]. But you will feel the performance hit at that high a res. Try it yourself with a benchmarking tool or a game: run it once with AA at 4x or 8x and then again with AA completely off. If you like it better on and your FPS are playable, leave it on. If you can't tell the difference, leave it off. AF, on the other hand, benefits greatly from high resolution, and if your card has the horsepower, max it out.

My current screens are a 1080p and a 1440p in extended desktop config. I NEVER use AA; it's just not needed at such high res and only drags the system down. AF is at 16x. All my games look and play smooth as silk [EVGA GTX770 4GB].

Test it out, figure out what works for you, then stick with it.


----------



## tokyoduong (Jun 26, 2014)

I've read a few research papers on this before. At 4K, AA is still relevant but not necessary. At 8K, it's not needed for the most part unless you have better than 20/20 eyes.

At the end of the day, I guess it comes down to how close you sit and the size of your screen. But seriously, who buys a bigger monitor/TV just to sit further away, unless they simply have too much room? Whether you need AA at 4K or not is going to totally depend on your eyes, system setup, distance to screen, and size of screen. Don't take anyone's word for it! Just try it when it's available to you and see what works best.


----------



## Peter1986C (Jun 26, 2014)

4K is not a resolution, it is a display mode. Yes, smartphones have a higher resolution because of the large display mode for such relatively small screens but they are not required to do anything meaningful (compared to e.g. workstations).
And things like wires still tend to look weird without AA on my screen (still I leave it off for obvious reasons).


----------



## AphexDreamer (Jun 27, 2014)

Durvelle27 said:


> It allows me to run a higher resolution than my monitor actually supports.



Well, my monitor just says "resolution not supported" when I try to force a higher resolution.


----------



## Durvelle27 (Jun 27, 2014)

AphexDreamer said:


> Well my monitor just says resolution not supported. When trying to force a higher resolution.


What connection?


----------



## AphexDreamer (Jun 27, 2014)

Durvelle27 said:


> What connections


DVI @ 2560x1440


----------



## Durvelle27 (Jun 27, 2014)

AphexDreamer said:


> DVI @ 2560x1440


The reason I ask is that DVI wouldn't work at all for me, but HDMI worked fine.


----------



## LAN_deRf_HA (Jun 27, 2014)

Since 1440p made pretty much zero difference in the AA level I need, and 4K is only about a 50% PPI bump, it's safe to say AA will still be quite necessary at desktop screen sizes. 8K is when AA might lose some value, but even then I don't think it'd be safe to say it won't be needed.



marsey99 said:


> 28 inch 1440/1600 you dont need it. 24 inch 1200 you dont need it.



Sure, if you have glaucoma.


----------



## AphexDreamer (Jun 27, 2014)

Durvelle27 said:


> The reason I ask is because DVI wouldn't work at all for me but HDMI was fine and worked.



Well, I managed to up my refresh rate by 5 Hz. I'd do HDMI, but I'm already using that port and I have to have two screens at the very least.
Also, I take it back: I'm using VGA to DVI.


----------



## Aquinus (Jun 27, 2014)

Durvelle27 said:


> The reason I ask is because DVI wouldn't work at all for me but HDMI was fine and worked.


Were both the cable and the port on the GPU dual-link DVI? Single-link can't handle that resolution. On my 6870s I have two DVI ports and I think only one of them supports dual-link.


----------



## Durvelle27 (Jun 27, 2014)

Aquinus said:


> Were both the cable and the port on the GPU dual-link DVI? Single-link can't handle that resolution. On my 6870s I have two DVI ports and I think only one of them supports dual-link.


My 290X had 2x DVI-D (Dual-Link), HDMI, and DP 1.2


----------



## 2big2fail (Jun 27, 2014)

Mussels said:


> it entirely depends on the size of the screen.
> 
> a 4K 24"? you wont see it
> 
> a 46"? you'd see it.


+1

Precisely. It comes down to whether the screen's pixel sampling rate at your viewing distance falls below the 2D Nyquist rate of the eye at distance x from the monitor; if it does, you can still see aliasing.
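
That condition can be made concrete with the usual acuity assumption that a 20/20 eye resolves about 1 arcminute (a textbook figure, not something from this thread): past the distance where one pixel subtends less than that, aliasing stops being visible.

```python
import math

ARCMIN = math.radians(1 / 60)  # 1 arcminute, approx. 20/20 acuity limit

def no_aa_distance_in(width_px, height_px, diagonal_in):
    """Viewing distance (inches) beyond which a 20/20 eye can no longer
    resolve individual pixels on the given screen."""
    ppi = math.hypot(width_px, height_px) / diagonal_in
    pixel_pitch = 1 / ppi                  # inches per pixel
    return pixel_pitch / math.tan(ARCMIN)

# 24" 4K: pixels blend beyond ~19 inches, i.e. closer than most desk
# viewing distances; a 46" 4K needs roughly 36 inches of distance
print(round(no_aa_distance_in(3840, 2160, 24), 1))
print(round(no_aa_distance_in(3840, 2160, 46), 1))
```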


----------



## gongster (Jun 27, 2014)

Goodness gracious, 4K is really becoming popular. Looks like my Tri-X 290s won't be able to handle the load.


----------



## Scrizz (Jun 27, 2014)

Chevalr1c said:


> 4K is not a resolution, it is a display mode. Yes, smartphones have a higher resolution because of the large display mode for such relatively small screens but they are not required to do anything meaningful (compared to e.g. workstations).
> *And things like wires still tend to look weird without AA* on my screen (still I leave it off for obvious reasons).


THIS!!!!!

There's a huge difference with and without AA. Lines such as power lines, telephone wires, etc. look like ****. It drives me crazy.
On other things, not so much.



gongster said:


> Goodness gracious,
> 
> 
> 
> ...


That's what she said..........


----------



## rooivalk (Jun 27, 2014)

I'm satisfied with low AA at 1080p; I probably won't use it at 4K.

To simulate the situation, just play a 3D game on an iPad 1/2 from your normal viewing distance. It's a 130+ PPI screen, almost equal to a 32" 4K.


----------



## AphexDreamer (Jun 27, 2014)

rooivalk said:


> I'm satisfied with low AA at 1080p; I probably won't use it at 4K.
> 
> To simulate the situation, just play a 3D game on an iPad 1/2 from your normal viewing distance. It's a 130+ PPI screen, almost equal to a 32" 4K.



And what do you do to simulate an iPad?


----------



## SaltyFish (Jun 27, 2014)

a_ump said:


> Speaking of, so is 2560x1600, or at least the next gen high end smartphones are going quad HD(LG G3).  Here's what i'm dying to know. if every high-mainstream to top noch smartphone can have 1080p and even 1600p. why are pc monitors till stagnant? i mean come on, 4.8-5.5" smartphone gets 1600p/1080p yet we can't have 17" 1080p monitors? 22" 1600p monitors?  I really don't understand it. and with it becoming so damn common 1600p monitors ought to cost $200 tops.


They have 3200x1800 resolutions on laptop screens as small as 13 inches. And for crying out loud, they can get 1440p on an 11.6 inch screen. It's strange that desktop monitors don't get such pixel density love, especially since it's more feasible for desktops to have the power to make use of such high resolutions. I don't think it has anything to do with the sizes of the screens. 1440p is available starting from 27 inches on desktops while laptops can get it at 11.6 inches; I don't think there's any real technological hurdle to making them at 17 to 24 inches for desktop monitors.


----------



## Peter1986C (Jun 28, 2014)

Windows font scaling is the reason.


----------



## SaltyFish (Jun 28, 2014)

That may be the official reason, but it's a crappy reason.

Don't laptops have to face the same things? Sure, pretty much all of those high resolution and/or pixel density laptops come with Windows 8, which does have better font scaling, but isn't Windows 8 also for desktops? ... oh, right. *sigh*

Even with it, desktop monitors will have lower pixel density simply due to physical size, so font scaling issues are somewhat mitigated there.


----------



## GhostRyder (Jun 28, 2014)

I think the consensus here is you're going to have to decide for yourself. It does make a difference, but it depends on the user and the eye looking at the image whether it helps them or makes for a better experience.


----------



## johnspack (Jun 28, 2014)

I want 4k at 64x msaa......


----------



## alwayssts (Jun 28, 2014)

Chevalr1c said:


> Windows font scaling is the reason.



I think money and practicality are the reasons.

4K (for the most part) started with high-margin 55"+ TVs, where the practical use was not only arguably the most tangible, but which are/(were) also more lucrative for the panel maker than a smaller monitor, not to mention it made 4K more exclusive (which keeps the margin higher longer). Obviously a monitor vendor wants the opposite; they can make their 30% markup per higher-selling commodity item, but they are reliant on the panel makers and specs to make that feasible. Also, just like a phone/tablet, a TV is likely going to be a more passive experience with less stringent guidelines (be that color, refresh rate, response time, input lag, input options, etc.), and that's where (until recently) we sat.

Monitors are getting there now, as <70" panel prices (across resolutions) are expected to take a substantial nosedive by the end of this year (if you've noticed a lot of September release announcements... that's why), and as (like I've mentioned before) the panel suppliers are beginning to get their panels to high-enough yield rates and specs (along with inputs like HDMI 2.0) to make them feasible to cut down and sell in bulk for monitors (obviously at least up to 32" from AUO). The question then becomes one of practicality/market for something like gaming... and I would argue on the whole the ecosystem is not there yet (meaning for the average consumer). No doubt this year and into the next it will drop from the 'elite' level to high-end, but things like true Adobe RGB color and the new DisplayPort will probably help a lot of people make the jump when conditions are more ideal. As such, that's likely when they will get the most attention from suppliers. That's not to say there won't be landmark products along the way.

As for the AA question, I agree it's case-by-case, but personally I use a metric of 20/15 vision, as many (most?) people fall roughly into that range. Arguably it doesn't matter whether you go by 20/15 or 20/20, though, as the result is similar; in many cases 4K will benefit from 2x AA (or essentially 8K), but it's certainly required in fewer people's scenarios than before 4K.

To make it simple using 20/20 vision:

28" at 2' would be the threshold for perhaps typically not really needing it. Might you sit closer or have a relatively larger screen/distance? Sure. Might your vision be slightly better? Probably. That is why I think 2xAA will remain a thing (for those using up to ~40-42"/2ft, ~32"/20", etc.)... just like 8xAA was preferred but 4xAA was the sweet spot for 1080p.
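
The 28" at 2' threshold is consistent with the 1-arcminute rule of thumb for 20/20 vision. The check below is an illustration under that assumption, not necessarily how the number above was derived:

```python
import math

def pixel_arcmin(width_px, height_px, diagonal_in, distance_in):
    """Angular size of one pixel, in arcminutes, at a viewing distance."""
    pitch = diagonal_in / math.hypot(width_px, height_px)  # inches per pixel
    return math.degrees(math.atan(pitch / distance_in)) * 60

# 28" 4K from 2 feet: ~0.91 arcmin per pixel, just under the ~1 arcmin a
# 20/20 eye resolves, i.e. right at the stated threshold
print(round(pixel_arcmin(3840, 2160, 28, 24), 2))
```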


----------



## dr0thegreatest (Jul 2, 2014)

2big2fail said:


> +1
> 
> Precisely, it depends on the [(pixel area) / (screen area)]^-1 =< (2D Nyquist sampling rate of eyeball) @ x distance from monitor


Exactly. 4K monitors are really, at this point, a waste of money. Not only that, I played on a 4K monitor with a quad-CrossFire setup; it was good, but it wasn't THAT good.


----------



## erocker (Jul 2, 2014)

marsey99 said:


> 28 inch 1440/1600 you dont need it.



Well, I can definitely tell the difference between 0, x2 and x4 MSAA with 1440p on a 27". Most of the time I use x2. With some games and no AA I still see jagginess.


----------



## Aquinus (Jul 2, 2014)

erocker said:


> Well, I can definitely tell the difference between 0, x2 and x4 MSAA with 1440p on a 27". Most of the time I use x2. With some games and no AA I still see jagginess.


Even at 1080p I seldom set AA to anything over 2x. I'll use 4x if it really needs it.


----------



## lilhasselhoffer (Jul 2, 2014)

LAN_deRf_HA said:


> Since 1440p made pretty much zero difference on the AA level I need and 4k is only about a 50% ppi bump it's safe to say AA will still be quite necessary at desktop screen sizes. 8k will be when AA might loose some value, but even then I don't think it'd be safe to say it won't need it.
> 
> 
> 
> Sure, if you have glaucoma.



I'd love to live in your world. It's a cheap dig, but it'd be 1k, 4k, 9k, 16k, 25k, etc... The term would relate to functionally just tacking a bunch of 1920x1080 monitors together, and 8k would not form a rectangle.


On a non-pedantic note, AA has never been particularly useful to me beyond 2x. I adopted 1920x1080 when 20-24" monitors were a reasonable price. Before that, you had 1024x768. At that low a resolution, the reasonably priced 18" monitors didn't look bad (read: video game graphics were polygons, so AA wasn't a high priority). By the time 4K is reasonably priced, the monitor options will likely start around 27". Considering that the pixel density will actually be higher than my current monitor's, AA will be less useful than the 2x that I occasionally use now.

All this aside, somebody out there always claims that at least 4x AA is a necessity on anything with a resolution less than 1920x1080 on a 15" monitor. Those people are generally unsatisfied by anything because they want higher benchmark values rather than operation that's functional enough. Consider it as such. UT3 was prettier than UT2k4, which by some people's standards makes it better than its predecessor. I'd say that if UT3 never existed, the Unreal Tournament franchise would have been better off. Sometimes better graphics and higher resolutions aren't what we need; we need content to make 4K worth viewing at all. As yet, I don't see much out there.


----------



## a_ump (Jul 3, 2014)

SaltyFish said:


> They have 3200x1800 resolutions on laptop screens as small as 13 inches. And for crying out loud, they can get 1440p on a 11.6 inch screen. It's strange that desktop monitors don't get such pixel density love. Especially since it's more feasible for desktops to have the power to make use of such high resolutions. I don't think it has anything to do with the sizes of screens. 1440p is available starting from 27 inches on desktops while laptops can get them at 11.6 inches; I don't think there's any real technological hurdle to make them at 17 to 24 inches for desktop monitors.



I don't think there really is a reason, just excuses. As a PC gamer I take it as a punch to the face that we're pushed to the back of the room, whereas TVs, smartphones, and laptops are getting the same resolutions if not higher on much smaller screens. Mind you, I still game at 1280x1024 (used to game on 1680x1050); I've just been biding my time expecting 2560x1600 screens to have reached a reasonable price and size (22-24" and under $200) by now.

EDIT: I got it! I'll get a smartphone that has "video in" instead of out, and hook up my desktop to it.


----------



## Enterprise24 (Jul 7, 2014)

1440p still needs AA, but 4K absolutely does not, from my experience.


----------



## yogurt_21 (Jul 11, 2014)

Late to the party, but screen size and viewing distance play roles here. 1080p can show no jaggies with no AA if you're on a 20" screen sat far from you. On a 27" screen right in front of you it can look like jaggy city. At a distance, your eyes will naturally smooth what the game does not. 18" from your face might be a nightmare, whereas 30" might not. Looking at my Mighty Mouse avatar from a normal sitting position reveals jaggies; as I scoot back it gets smoother and smaller.

Remember that a 20" 1080p and a 40" 4K have the same pixel pitch. So I'd say it depends on the screen size, game, viewing distance, and screen quality. An 80" 4K viewed from your couch just might benefit from a bit of AA. Just sayin'.
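
The pixel-pitch claim is exact, not approximate: 4K doubles 1080p along both axes, so doubling the diagonal from 20" to 40" leaves inches-per-pixel unchanged. A two-line check:

```python
import math

def pitch_in(width_px, height_px, diagonal_in):
    """Pixel pitch: inches per pixel along the diagonal."""
    return diagonal_in / math.hypot(width_px, height_px)

# 20" 1080p vs. 40" 4K: identical pitch (~0.0091" per pixel)
print(math.isclose(pitch_in(1920, 1080, 20), pitch_in(3840, 2160, 40)))  # True
```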


----------



## badhomaks (Jul 12, 2014)

At 24 and 27 inches, no, because that's like playing a game with 4x supersampling. In Sniper Elite 3, 2.25x supersampling already looks glorious on its own, so I can't imagine people not being pleased with the aliasing in games. Unless you want ABSOLUTELY NO SHIMMERING at all, in which case just put on TXAA 2x.


----------



## vega22 (Jul 12, 2014)

erocker said:


> Well, I can definitely tell the difference between 0, x2 and x4 MSAA with 1440p on a 27". Most of the time I use x2. With some games and no AA I still see jagginess.



Of course you can tell there is a difference; they make the whole image more blurry, not just the jaggies.

tbh I'm talking more about 1600 than 1440, but I guess it all depends on how you define "need".

Personally I don't think you need it on a 27" 1440p, but I too would prefer x2 if it was a viable option.


----------



## Scrizz (Jul 12, 2014)

I use AA at 2x. I don't go higher due to performance.
I can tell a difference with higher AA levels, but it's not enough of a difference to warrant the performance hit.


----------

