Thursday, January 5th 2017

This Video Clip of Radeon Vega Running DOOM at 4K is Glorious
AMD today posted this video clip showing an AMD Radeon "Vega" graphics card running "Doom" (2016) at 4K Ultra HD, with all details maxed out. Vega made short work of the AAA title with its Vulkan renderer enabled, clocking frame rates of around 70 frames per second. This kind of performance should put "Vega" firmly in the high-end segment of graphics cards when it hits the shelves a little later this year.
75 Comments on This Video Clip of Radeon Vega Running DOOM at 4K is Glorious
I still find that capping at 143 FPS and using G-Sync alone is the best solution; this is probably true for FreeSync as well.
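For what it's worth, the idea behind that cap is just to finish every frame slightly slower than the panel's refresh so the display never leaves its variable-refresh range. A minimal sketch in Python, with a hypothetical render_frame() standing in for the actual game work (not any engine's real API):

    import time

    CAP_FPS = 143               # just below a 144 Hz panel's refresh rate
    FRAME_TIME = 1.0 / CAP_FPS  # per-frame time budget, roughly 7 ms

    def render_frame():
        # hypothetical placeholder for the actual game/render work
        time.sleep(0.002)

    for _ in range(1000):       # bounded demo loop standing in for the game loop
        start = time.perf_counter()
        render_frame()
        # sleep off whatever is left of this frame's budget, so the frame
        # rate never exceeds the cap and the display stays in VRR range
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_TIME:
            time.sleep(FRAME_TIME - elapsed)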
AMD Vega And RYZEN Form Perfect Duo For 'Doom' Slaughterfest At 4K Ultra Setting
Second, your opinion has no value to me on this: you tried one type and know nothing of the others. And as for FreeSync 2 or G-Sync HDR, no one knows much of anything yet, so it could, I say again, be done in hardware as you wish, not as a resource that depends on CPU time.
That would be better.
Fast Sync is a feature that complements G-Sync, not a replacement for it.
1) Fast Sync works by decoupling rendering from displaying the frame. This means the game may render at 200 FPS while the screen displays the finished frames at its own pace (e.g. 60 Hz). The game continues uninterrupted and thinks there is no V-Sync.
2) The display logic will show the last finished frame, regardless of how many frames were rendered since the last sync.
3) The benefit of Fast Sync is lower latency: by allowing the GPU to work unsynchronized, the frames may be "fresher" when displayed. Fast Sync may lower latency if the frame rate is higher than the screen's refresh rate.
4) Fast Sync may lower the latency of high frame rates, but it doesn't affect when frames are generated. So while frames are shown at a steady pace, their "age" may vary, creating stutter (see the sketch below).
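To make points 1-3 concrete, here is a rough simulation of that behaviour in Python. This is my own sketch, not NVIDIA's actual implementation: the renderer runs uncapped and overwrites a single "last finished frame" slot, while the display thread scans it out at a fixed refresh rate and always shows the newest completed frame.

    import threading
    import time

    last_frame = 0                 # id of the most recently finished frame
    lock = threading.Lock()
    stop = threading.Event()

    def renderer():
        # The game renders uncapped (here ~200 FPS); each new frame simply
        # replaces the previous one -- points 1 and 2 above.
        global last_frame
        frame = 0
        while not stop.is_set():
            time.sleep(1 / 200)
            frame += 1
            with lock:
                last_frame = frame

    def display(refresh_hz=60):
        # The screen picks up whatever frame is newest at each refresh, so
        # the displayed frame is "fresher" than with plain V-Sync -- point 3.
        for _ in range(30):
            time.sleep(1 / refresh_hz)
            with lock:
                shown = last_frame
            print(f"display shows frame {shown}")
        stop.set()

    threading.Thread(target=renderer, daemon=True).start()
    display()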
But for DOOM, stable 90fps is the bottom line. And 120fps is the target for me.
But hey, this GPU is nice to have.
You possibly wanted to talk about DX12 vs. DX11; the drop there is nowhere near as big, and AMD doesn't care about OpenGL (R.I.P.), which NVIDIA loves sooo much it has killed it (by pushing proprietary shader crap). Fast Sync is just good old triple buffering with a "cooler" marketing name. LOL
en.wikipedia.org/wiki/List_of_games_with_DirectX_11_support
Also, using frame buffers != triple buffering...
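That is the crux of the disagreement. Here's a small illustration of the distinction, as my own simplified sketch rather than anything from either vendor's documentation: classic triple buffering queues finished frames and presents them in order, while Fast Sync keeps only the newest finished frame and discards the rest, which is why it cuts latency when the game renders far above the refresh rate.

    from collections import deque

    rendered = [1, 2, 3, 4, 5, 6]  # frames finished between two refreshes

    # Classic triple buffering: two back buffers form a small FIFO, and
    # frames are presented in the order they were rendered, so the frame
    # shown next can already be a few frames old.
    fifo = deque()
    for frame in rendered:
        if len(fifo) < 2:          # renderer stalls once both back buffers are full
            fifo.append(frame)
    next_shown_triple_buffered = fifo.popleft()   # -> frame 1 (oldest queued)

    # Fast Sync: the newest completed frame simply replaces the previous
    # one, so older finished frames are never presented.
    latest = None
    for frame in rendered:
        latest = frame
    next_shown_fast_sync = latest                 # -> frame 6 (freshest)

    print(next_shown_triple_buffered, next_shown_fast_sync)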